Physically optimizing inference (1805.07512v3)
Abstract: Data is scaling exponentially in fields ranging from genomics to neuroscience to economics. A central question is: can modern machine learning methods be applied to construct predictive models of natural systems like cells and brains based on large data sets? In this paper, we examine how inference is impacted when training data is generated by the statistical behavior of a physical system, and hence lies outside the direct control of the experimentalist. We develop an information-theoretic analysis for the canonical problem of spin-network inference. Our analysis reveals the essential role that the physical properties of the spin network and its environment play in determining the difficulty of the underlying machine learning problem. Specifically, stochastic fluctuations drive a system to explore a range of configurations, providing 'raw' information for a learning algorithm to construct an accurate model; yet they also blur energetic differences between network states and thereby degrade that information. This competition means spin networks generically have an intrinsic optimal temperature at which stochastic spin fluctuations provide maximal information for discriminating among competing models, maximizing inference efficiency. We demonstrate a simple active learning protocol that tunes network temperature and dramatically increases the efficiency of inference on a neural circuit reconstruction task. Our results reveal a fundamental link between physics and information and show how the physical environment can be tuned to optimize the efficiency of machine learning.
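The trade-off described in the abstract can be illustrated numerically. Below is a minimal sketch, not the authors' code or protocol: it draws equilibrium samples from a small Ising spin network with Metropolis dynamics at several temperatures, fits the couplings by a standard pseudo-likelihood (per-spin logistic) estimator, and reports reconstruction error. The network size, coupling scale, temperatures, and fitting hyperparameters are illustrative assumptions; at very low temperature the system barely explores configurations, while at very high temperature samples carry little information about the couplings, so intermediate temperatures tend to reconstruct best.

```python
# Sketch (assumed setup, not the paper's implementation): how sampling
# temperature affects coupling inference in a small Ising spin network.
import numpy as np

rng = np.random.default_rng(0)
N = 8                                   # number of spins (illustrative)
J_true = rng.normal(0.0, 1.0, (N, N))
J_true = (J_true + J_true.T) / 2        # symmetric couplings
np.fill_diagonal(J_true, 0.0)

def metropolis_samples(J, T, n_samples, burn_in=2000, thin=10):
    """Sample spin configurations s in {-1,+1}^N at temperature T."""
    s = rng.choice([-1, 1], size=N)
    samples = []
    for step in range(burn_in + n_samples * thin):
        i = rng.integers(N)
        dE = 2 * s[i] * (J[i] @ s)      # energy change from flipping spin i
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i] = -s[i]
        if step >= burn_in and (step - burn_in) % thin == 0:
            samples.append(s.copy())
    return np.array(samples)

def fit_couplings(samples, T, lr=0.05, epochs=300):
    """Pseudo-likelihood fit: regress each spin on the others by gradient ascent."""
    M = samples.shape[0]
    J_hat = np.zeros((N, N))
    for _ in range(epochs):
        for i in range(N):
            h = (samples @ J_hat[i]) / T           # local field on spin i
            p = np.tanh(h)                         # E[s_i | other spins]
            grad = (samples[:, i] - p) @ samples / (M * T)
            grad[i] = 0.0                          # keep diagonal at zero
            J_hat[i] += lr * grad
    return (J_hat + J_hat.T) / 2

# Reconstruction error as a function of the temperature at which data are generated.
for T in [0.2, 1.0, 2.0, 5.0]:
    samples = metropolis_samples(J_true, T, n_samples=2000)
    J_hat = fit_couplings(samples, T)
    err = np.linalg.norm(J_hat - J_true) / np.linalg.norm(J_true)
    print(f"T = {T:4.1f}  relative reconstruction error = {err:.3f}")
```

An active learning loop in the spirit of the abstract would wrap this comparison, adjusting the bath temperature toward the setting that yields the most informative fluctuations before collecting further samples.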