A Partially Derivative-Free Proximal Method for Composite Multiobjective Optimization in the Hölder Setting (2508.20071v1)
Abstract: This paper presents an algorithm for solving multiobjective optimization problems involving composite functions, in which each iteration minimizes a quadratic model approximating $F(x) - F(x_k)$ that can be built without derivative information. We establish theoretical assumptions on the component functions of the composition and provide a comprehensive convergence and complexity analysis. Specifically, we prove that the proposed method reaches a weakly $\varepsilon$-approximate Pareto point in at most $\mathcal{O}\left(\varepsilon^{-\frac{\beta+1}{\beta}}\right)$ iterations, where $\beta$ denotes the H\"{o}lder exponent of the gradient. The algorithm incorporates gradient approximations and a scaling matrix $B_k$ to balance computational accuracy and efficiency. Numerical experiments on robust biobjective instances with Lipschitz- and H\"{o}lder-gradient components illustrate the method's behavior. In these tests, the proposed approach approximated the Pareto front under different levels of uncertainty and consistently recovered distinct solutions, even in challenging cases where the objectives have only H\"{o}lder continuous gradients.
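To make the per-iteration structure concrete, here is a minimal sketch, not the paper's exact algorithm, of one step of a multiobjective proximal-type method: gradients are estimated by finite differences (the derivative-free ingredient) and a quadratic subproblem of the form $\min_d \max_i \langle g_i, d\rangle + \tfrac{1}{2} d^\top B_k d$ is solved via its standard slack-variable reformulation. All function names, parameters, and the test objectives below are illustrative assumptions.

```python
# Illustrative sketch only: one descent step of a multiobjective proximal-type
# method with finite-difference gradients and a quadratic scaling matrix B_k.
import numpy as np
from scipy.optimize import minimize

def fd_gradient(f, x, h=1e-6):
    """Forward-difference gradient estimate of a scalar function f at x."""
    g = np.zeros_like(x)
    fx = f(x)
    for j in range(x.size):
        e = np.zeros_like(x)
        e[j] = h
        g[j] = (f(x + e) - fx) / h
    return g

def multiobjective_step(fs, x, B):
    """Return d solving min_d max_i <g_i, d> + 0.5 * d^T B d (slack form)."""
    grads = [fd_gradient(f, x) for f in fs]
    n = x.size

    # Variables z = (d, t); minimize t s.t. <g_i, d> + 0.5 d^T B d <= t for all i.
    def obj(z):
        return z[-1]

    cons = [{"type": "ineq",
             "fun": (lambda z, g=g: z[-1] - g @ z[:n] - 0.5 * z[:n] @ B @ z[:n])}
            for g in grads]

    res = minimize(obj, np.zeros(n + 1), constraints=cons, method="SLSQP")
    return res.x[:n]

# Hypothetical biobjective example: f1 has only a Holder continuous gradient,
# f2 is a smooth quadratic.
f1 = lambda x: np.sum(np.abs(x - 1.0) ** 1.5)
f2 = lambda x: np.sum((x + 1.0) ** 2)
x = np.array([0.5, -0.5])
d = multiobjective_step([f1, f2], x, B=np.eye(2))
print("descent direction:", d)
```

The slack-variable reformulation is the usual way to handle the min-max subproblem with a smooth solver; in the paper's setting the model is built from gradient approximations and $B_k$, so the exact subproblem and step-acceptance rule may differ from this sketch.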