Integrating adaptive optimization into least squares progressive iterative approximation (2501.10170v2)
Abstract: This paper introduces the Adaptive Gradient Least Squares Progressive Iterative Approximation (AdagradLSPIA) method, an accelerated version of Least Squares Progressive Iterative Approximation (LSPIA) enhanced with adaptive optimization techniques inspired by the adaptive gradient (Adagrad) algorithm. By using historical (accumulated) gradient information to dynamically adjust the weights, AdagradLSPIA converges faster than standard LSPIA. Its effectiveness is demonstrated on tensor product B-spline surface fitting, where it consistently outperforms LSPIA in accuracy, computational efficiency, and robustness to the choice of global weight.
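The abstract's core idea — replacing LSPIA's fixed global weight with per-coordinate Adagrad-style steps driven by accumulated gradients — can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the collocation matrix `A`, the data matrix `P`, and the parameters `mu` and `eps` are illustrative assumptions, and the update uses the standard LSPIA residual form `A.T @ (P - A @ C)`.

```python
import numpy as np

def adagrad_lspia(A, P, mu=1.0, eps=1e-8, iters=200):
    """Hedged sketch of an Adagrad-accelerated LSPIA-style iteration.

    A : (m, n) collocation matrix of B-spline basis values at the data sites.
    P : (m, d) data points to be fitted.
    mu, eps, iters : illustrative hyperparameters, not the paper's choices.
    """
    C = np.zeros((A.shape[1], P.shape[1]))   # control points, initialized at zero
    G = np.zeros_like(C)                     # accumulated squared gradients (Adagrad)
    for _ in range(iters):
        g = A.T @ (P - A @ C)                # LSPIA-style adjustment (negative LS gradient)
        G += g * g                           # accumulate history, as in Adagrad
        C += mu * g / np.sqrt(G + eps)       # per-coordinate adaptive step size
    return C
```

The key contrast with plain LSPIA, which would use a single fixed weight `C += mu * g`, is that each control-point coordinate here receives its own step size that shrinks as its gradient history grows, reducing sensitivity to the choice of `mu`.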