
A Framework for Population-Based Stochastic Optimization on Abstract Riemannian Manifolds (1908.06783v3)

Published 19 Aug 2019 in math.OC

Abstract: We present Extended Riemannian Stochastic Derivative-Free Optimization (Extended RSDFO), a novel population-based stochastic optimization algorithm on Riemannian manifolds that addresses the locality and implicit assumptions of manifold optimization in the literature. We begin by investigating the information-geometrical structure of statistical models over Riemannian manifolds. This establishes a geometrical framework for Extended RSDFO using both the statistical geometry of the decision space and the Riemannian geometry of the search space. We construct locally inherited probability distributions via an orientation-preserving diffeomorphic bundle morphism, and then extend the information-geometrical structure to mixture densities over totally bounded subsets of manifolds. The former relates the information geometry of the decision space to the local point estimations on the search space manifold. The latter overcomes the locality of parametric probability distributions on Riemannian manifolds. We then construct Extended RSDFO and study its structure and properties from a geometrical perspective. We show that Extended RSDFO's expected fitness improves monotonically, and that it eventually converges globally in finitely many steps on connected compact Riemannian manifolds. Extended RSDFO is compared to state-of-the-art manifold optimization algorithms on multi-modal optimization problems over a variety of manifolds. In particular, we perform a novel synthetic experiment on Jacob's ladder to motivate and demonstrate the necessity of manifold optimization. Jacob's ladder is a non-compact manifold of countably infinite genus, which cannot be expressed as polynomial constraints and does not have a global representation in an ambient Euclidean space. Optimization problems on Jacob's ladder thus cannot be addressed by traditional (constrained) optimization methods on Euclidean spaces.
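
To make the flavor of population-based, derivative-free search on a manifold concrete, the following is a minimal Python sketch on the unit sphere S^2. It is a hypothetical illustration under assumed ingredients (the sphere's exponential map, Gaussian sampling in the tangent space, a toy multi-modal objective, and elitist selection), not the paper's Extended RSDFO, which instead constructs locally inherited distributions via an orientation-preserving bundle morphism and mixes them over totally bounded subsets.

```python
# Minimal sketch: population-based stochastic derivative-free search on S^2.
# All names (exp_map, project_tangent, objective, step_size) are illustrative
# assumptions, not the paper's construction.
import numpy as np

def exp_map(x, v):
    """Exponential map on the unit sphere: move from x along tangent vector v."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def project_tangent(x, u):
    """Project an ambient vector u onto the tangent space at x."""
    return u - np.dot(u, x) * x

def objective(x):
    """Toy multi-modal objective on S^2 (assumed for illustration)."""
    return np.sum(x**2 * np.sin(5.0 * x))

rng = np.random.default_rng(0)
x = np.array([1.0, 0.0, 0.0])        # initial point on S^2
best_f = objective(x)
pop_size, step_size = 20, 0.3

for generation in range(200):
    # Sample a population in the tangent space at the current point,
    # then push each sample onto the manifold with the exponential map.
    candidates = [
        exp_map(x, project_tangent(x, step_size * rng.standard_normal(3)))
        for _ in range(pop_size)
    ]
    # Elitist selection: move only if a candidate improves the fitness,
    # so the best-so-far fitness is monotone by construction.
    f_vals = [objective(c) for c in candidates]
    i = int(np.argmin(f_vals))
    if f_vals[i] < best_f:
        x, best_f = candidates[i], f_vals[i]

print("best point:", x, "objective:", best_f)
```

The elitist selection step means the best fitness never worsens across generations; this mirrors, in a much simpler setting, the monotone expected-fitness property the abstract establishes for Extended RSDFO.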

Citations (1)
