Origin of the computational hardness for learning with binary synapses (1408.1784v1)

Published 8 Aug 2014 in cond-mat.dis-nn, cond-mat.stat-mech, cs.LG, and q-bio.NC

Abstract: Supervised learning in a binary perceptron is able to classify an extensive number of random patterns by a proper assignment of binary synaptic weights. However, finding such assignments in practice is a highly nontrivial task. The relation between the weight-space structure and the algorithmic hardness has not yet been fully understood. To this end, we analytically derive the Franz-Parisi potential for the binary perceptron problem, starting from an equilibrium solution of weights and exploring the weight-space structure around it. Our result reveals the geometrical organization of the weight space: it is composed of isolated solutions, rather than clusters of exponentially many close-by solutions. These point-like clusters, far apart from each other in the weight space, explain the previously observed glassy behavior of stochastic local-search heuristics.
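For orientation, the Franz-Parisi construction constrains a second copy of the system at fixed overlap with an equilibrium reference configuration. A schematic zero-temperature form for the binary perceptron is sketched below; the notation (patterns xi, reference solution w~, overlap q) is assumed here for illustration and may differ from the paper's own conventions:

```latex
% Constrained free entropy of weight vectors w in {-1,+1}^N that classify
% all P = alpha*N random patterns xi^mu while keeping overlap q with a
% reference equilibrium solution \tilde{w} (Theta = Heaviside step function):
\[
  V(q) \;=\; \frac{1}{N}\,
  \mathbb{E}_{\xi}\Big\langle \ln \sum_{\{w_i=\pm 1\}}
  \prod_{\mu=1}^{P} \Theta\!\Big(\tfrac{1}{\sqrt{N}}\textstyle\sum_i w_i \xi_i^{\mu}\Big)\,
  \delta\Big(\textstyle\sum_i w_i \tilde{w}_i - Nq\Big) \Big\rangle_{\tilde{w}}
\]
% Isolated solutions show up as this constrained entropy being negative for
% overlaps q < 1 near the reference, vanishing only at q = 1.
```

A minimal, self-contained sketch of the kind of stochastic local search the abstract refers to is given next: single-weight-flip Metropolis dynamics on the number of misclassified patterns. All parameter choices (N, alpha, beta, step count) are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

N, alpha = 201, 0.3                      # N binary synapses, P = alpha*N patterns
P = int(alpha * N)
xi = rng.choice([-1, 1], size=(P, N))    # random +/-1 patterns
# gauge choice: require sign(xi_mu . w) = +1 for every pattern mu

def energy(w):
    """Number of misclassified patterns (0 means a perfect solution)."""
    return int(np.sum(xi @ w <= 0))

# stochastic local search: flip one weight at a time, accept with a
# Metropolis rule at inverse temperature beta
w = rng.choice([-1, 1], size=N)
beta, steps = 2.0, 200_000
E = energy(w)
for t in range(steps):
    i = rng.integers(N)
    w[i] = -w[i]                         # propose a single-weight flip
    E_new = energy(w)
    if E_new <= E or rng.random() < np.exp(-beta * (E_new - E)):
        E = E_new                        # accept
    else:
        w[i] = -w[i]                     # reject: flip back
    if E == 0:
        break

print(f"final energy {E} after {t + 1} steps (0 = all {P} patterns learned)")
```

At small alpha a run like this typically reaches zero energy; as alpha grows toward the capacity, the isolation of solutions described in the abstract makes the landscape glassy and the same dynamics tends to stall at nonzero energy.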

Authors (2)
  1. Haiping Huang (56 papers)
  2. Yoshiyuki Kabashima (83 papers)
Citations (48)
