On the Softplus Penalty for Constrained Convex Optimization (2305.12603v1)
Published 21 May 2023 in math.OC
Abstract: We study a new penalty reformulation of constrained convex optimization based on the softplus penalty function. By analyzing the solution path of the reformulation with respect to the smoothness parameter, we develop novel, tight upper bounds on the objective value gap and the constraint violation of solutions to the penalty reformulation. We then use these upper bounds to analyze the complexity of applying gradient methods to the reformulation, an approach that is advantageous when the number of constraints is large.
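
For concreteness, a minimal sketch of a softplus penalty reformulation is given below; the specific penalty weight $\rho$ and the exact scaling by the smoothness parameter $\mu$ are assumptions for illustration, not necessarily the paper's precise formulation.

```latex
% Constrained convex problem:
%   min_x f(x)   s.t.   g_i(x) <= 0,  i = 1, ..., m.
%
% Softplus penalty reformulation with smoothness parameter mu > 0 and
% penalty weight rho > 0 (both chosen here for illustration):
\[
  \min_{x} \; f(x) \;+\; \rho \sum_{i=1}^{m} \mu \log\!\bigl(1 + e^{\,g_i(x)/\mu}\bigr).
\]
% As mu -> 0, each term mu * log(1 + exp(g_i(x)/mu)) approaches
% max(0, g_i(x)), i.e., the exact hinge penalty; larger mu yields a
% smoother unconstrained objective to which gradient methods apply,
% with per-iteration cost that scales mildly in the number of
% constraints m.
```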