Global Optimization with A Power-Transformed Objective and Gaussian Smoothing
Abstract: We propose a novel method that solves global optimization problems in two steps: (1) apply an (exponential) power-$N$ transformation to the objective function $f$, which need not be differentiable, to obtain $f_N$, and (2) optimize the Gaussian-smoothed $f_N$ with stochastic approximation. Under mild conditions on $f$, for any $\delta>0$, we prove that with a sufficiently large power $N_\delta$, this method converges to a solution in the $\delta$-neighborhood of $f$'s global optimum point. The convergence rate is $O(d^2\sigma^4\varepsilon^{-2})$, which is faster than that of both the standard and the single-loop homotopy methods if $\sigma$ is pre-selected to be in $(0,1)$. In most of the experiments performed, our method produces better solutions than other algorithms that also apply smoothing techniques.
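To make the two steps concrete, below is a minimal sketch, not the paper's reference implementation. It assumes the transform is $f_N(x) = \exp(N f(x))$ for a maximization problem, uses the standard zeroth-order Gaussian-smoothing gradient estimator as the stochastic approximation, and for numerical stability ascends the log of the smoothed objective (a monotone change that preserves maximizers). The test function and all hyperparameters are illustrative.

```python
import numpy as np

def power_smoothed_ascent(f, x0, N=10.0, sigma=0.5, lr=0.05,
                          steps=3000, batch=64, seed=0):
    """Hypothetical sketch of the two-step scheme in the abstract.

    Step 1: exponential power-N transform, f_N(x) = exp(N * f(x)).
    Step 2: stochastic-approximation ascent on the Gaussian smoothing
            F(x) = E[f_N(x + sigma * u)],  u ~ N(0, I).
    We ascend log F instead of F (same maximizers); its zeroth-order
    gradient estimate is a softmax-weighted average of perturbations:
        grad log F(x) ~ sum_i w_i * u_i / sigma,
        w_i proportional to exp(N * f(x + sigma * u_i)).
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        u = rng.standard_normal((batch, x.size))           # Gaussian perturbations
        fvals = np.array([f(xi) for xi in x + sigma * u])  # raw objective values
        w = np.exp(N * (fvals - fvals.max()))              # stable exp-power weights
        w /= w.sum()
        g = (w[:, None] * u).sum(axis=0) / sigma           # smoothed-gradient estimate
        x = x + lr * g                                     # ascent step (maximization)
    return x

# Multimodal 2-D test: global maximum at the origin, many local maxima.
f = lambda x: -np.sum(x**2) + 0.5 * np.sum(np.cos(3 * np.pi * x))
print(power_smoothed_ascent(f, x0=[2.0, -1.5]))  # should approach [0, 0]
```

Intuitively, a large $N$ makes $f_N$ concentrate its mass near the global maximizer, so the Gaussian smoothing of $f_N$ is dominated by that region and the stochastic ascent is steered past local optima.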