
Cognitive Action Laws: The Case of Visual Features

Published 28 Aug 2018 in cs.CV | (1808.09162v1)

Abstract: This paper proposes a theory for understanding perceptual learning processes within the general framework of laws of nature. Neural networks are regarded as systems whose connections are Lagrangian variables, namely functions depending on time. They are used to minimize the cognitive action, an appropriate functional index that measures the agent's interactions with the environment. The cognitive action contains a potential and a kinetic term that closely resemble the classic formulation of regularization in machine learning. A special choice of the functional index, which leads to fourth-order differential equations---Cognitive Action Laws (CAL)---exhibits a structure that mirrors the classic formulation of machine learning. In particular, unlike the action of mechanics, the stationarity condition corresponds to the global minimum. Moreover, it is proven that typical asymptotic learning conditions on the weights can coexist with the initialization, provided that the system dynamics are driven under a policy referred to as information overloading control. Finally, the theory is evaluated experimentally on the problem of feature extraction in computer vision.
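As a rough illustrative sketch (not taken from the paper itself), a cognitive action of the kind described in the abstract, with a kinetic term penalizing the time variation of the weights and a potential term measuring the loss over the agent-environment interaction, could be written as follows; the symbols $\alpha$, $\beta$, $\theta$, the potential $V$, and the input signal $x(t)$ are assumed placeholders, and the second-derivative kinetic term is included only because stationarity of a Lagrangian containing $\ddot w$ yields fourth-order Euler-Lagrange equations, consistent with the fourth-order laws mentioned above:

$$
\mathcal{A}(w) \;=\; \int_{0}^{T} e^{-\theta t}\!\left[\frac{\alpha}{2}\,\|\ddot w(t)\|^{2} \;+\; \frac{\beta}{2}\,\|\dot w(t)\|^{2} \;+\; V\bigl(w(t),\,x(t)\bigr)\right] dt,
$$

whose stationarity (Euler-Lagrange) condition is the fourth-order differential equation

$$
\frac{d^{2}}{dt^{2}}\!\left(e^{-\theta t}\,\alpha\,\ddot w\right) \;-\; \frac{d}{dt}\!\left(e^{-\theta t}\,\beta\,\dot w\right) \;+\; e^{-\theta t}\,\nabla_{w} V\bigl(w(t),\,x(t)\bigr) \;=\; 0 .
$$

This is only a schematic form under the stated assumptions; the paper's actual functional, boundary conditions, and the information overloading control policy should be taken from the paper itself.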

Citations (14)
