
Sharpest contraction rate for Hopfield neural networks with diagonally stable synaptic matrices

Determine the sharpest exponential contraction rate for continuous-time Hopfield neural networks whose synaptic matrix is diagonally stable; specifically, identify the largest rate, together with a norm or Riemannian metric under which the dynamics are strongly contracting at that rate, so that the distance between any two trajectories decays exponentially.
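
As a point of reference, a minimal formalization of the property in question, under a standard normalization (unit leak rates, activation slopes in [0, 1]) that is assumed here for illustration rather than taken from the source: the Hopfield dynamics

\dot{x} = -x + W\,\Phi(x) + u, \qquad \Phi(x) = (\phi_1(x_1), \ldots, \phi_n(x_n)), \qquad 0 \le \phi_i' \le 1,

are strongly contracting with rate c > 0 with respect to a norm \|\cdot\| (or a Riemannian metric) if

\|x(t) - y(t)\| \le e^{-ct}\,\|x(0) - y(0)\| \qquad \text{for all trajectories } x(\cdot),\, y(\cdot) \text{ and all } t \ge 0.

The question asks for the supremum of such rates c, and a norm or metric attaining it, when W is diagonally stable.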


Background

Hopfield networks are a foundational recurrent neural network model. When the synaptic matrix is diagonally stable, global stability guarantees are classical, but a tight (sharp) contraction rate, together with a norm or Riemannian metric that attains it, has not been characterized.
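
For reference, the standard definition being invoked (stated here for completeness; the notation is not quoted from the source): a matrix W \in \mathbb{R}^{n \times n} is diagonally stable if there exists a diagonal matrix P = \operatorname{diag}(p_1, \ldots, p_n) with every p_i > 0 such that

W^\top P + P W \prec 0,

i.e., the left-hand side is negative definite.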

Obtaining the sharp contraction rate would provide precise robustness margins and performance guarantees, complementing known sharp results for other models highlighted earlier in the paper.
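
As a rough illustration of the kind of quantity at stake, the sketch below numerically estimates a contraction rate in a diagonally weighted l2 norm by bounding the matrix measure (log norm) of the Hopfield Jacobian over sampled activation slopes. The model normalization (unit leak rate, slopes in [0, 1]), the example matrices W and P, and the function names are illustrative assumptions; random sampling gives only a heuristic estimate, not a certified sharp rate.

# Heuristic check (not a proof): estimate a contraction rate for a Hopfield
# network  x' = -x + W*phi(x) + u  with activation slopes phi' in [0, 1],
# by taking the worst weighted-l2 log norm of the Jacobian
#   J(s) = -I + W*diag(s),   s in [0, 1]^n,
# over sampled slope vectors s, for a chosen positive diagonal weight P.
import numpy as np

def lognorm2_weighted(J, P):
    # mu_{2,P}(J) = (1/2) * lambda_max(R J R^{-1} + (R J R^{-1})^T), with R = P^{1/2}
    R = np.diag(np.sqrt(np.diag(P)))
    Rinv = np.diag(1.0 / np.sqrt(np.diag(P)))
    A = R @ J @ Rinv
    return 0.5 * np.max(np.linalg.eigvalsh(A + A.T))

def estimate_rate(W, P, n_samples=10000, seed=0):
    # -max_s mu_{2,P}(J(s)) over sampled s: a heuristic contraction-rate estimate.
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    worst = -np.inf
    for _ in range(n_samples):
        s = rng.uniform(0.0, 1.0, size=n)          # random activation slopes
        J = -np.eye(n) + W @ np.diag(s)
        worst = max(worst, lognorm2_weighted(J, P))
    return -worst

# Illustrative diagonally stable synaptic matrix (W + W^T = -I < 0) and weight P = I.
W = np.array([[-0.5,  1.0],
              [-1.0, -0.5]])
P = np.eye(2)
print("estimated contraction rate in the P-weighted l2 norm:", estimate_rate(W, P))

A sharp answer to the open question would identify the best possible rate and the norm or metric achieving it analytically, rather than by sampling as above.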

References

For example, sharp characterizations of contractivity exist for some special dynamical systems (e.g., gradient flow, firing rate neural networks, and certain Lur’e models), yet there are other relatively simple dynamical systems whose sharpest rates of contraction are still unknown, e.g., primal dual dynamics for linear equality-constrained minimization and Hopfield neural networks with diagonally stable synaptic matrices.

Davydov et al., "Perspectives on Contractivity in Control, Optimization, and Learning," arXiv:2404.11707, 17 Apr 2024, Section 6 (Conjectures and Future Directions).