
Improved Quasi-Convex Properties

Updated 25 November 2025
  • Improved quasi-convex properties are refined generalizations of classical convexity that introduce strong moduli and endpoint inequalities to ensure quantitative growth.
  • They underpin advanced optimization techniques by enabling global linear convergence and accelerated rates even in nonconvex or high-dimensional scenarios.
  • These developments extend to generalized convexity frameworks, impacting uncertainty quantification, robust Bayesian analysis, and numerical approximation.

Improved quasi-convex properties refer to a body of theoretical, algorithmic, and geometric advances that extend the foundational role of quasi-convexity in analysis, optimization, and geometry. Recent research has produced sharp characterizations (including strong and weak notions), variational tools, algorithmic guarantees, and structural results, significantly generalizing classical convexity-based principles. These improvements have led to new minimax theorems, refined error bounds, optimization strategies in high-dimensional and nonconvex settings, and deeper understanding in both pure and applied contexts.

1. Structural Refinements: Strong Quasi-Convexity and Endpoint Inequalities

The introduction of strong quasiconvexity brings quantitative growth and variational structure to the classical (qualitative) quasi-convex paradigm. For a function $h:C\to\mathbb{R}$ on a convex set $C$, strong quasiconvexity with modulus $\gamma>0$ is defined by

$$h(x+t(y-x)) \leq \max\{h(x), h(y)\} - \frac{\gamma}{2}\, t(1-t)\,\|y-x\|^2 \quad \forall\, t\in[0,1].$$

New endpoint characterizations in the nonsmooth setting provide necessary and sufficient conditions solely in terms of two endpoints and an interior point along a segment, yielding inequalities such as

$$h(x)\leq h(z) \;\implies\; h(z)\leq h(y) - \frac{\gamma}{4}\,(1-t^2)\,\|y-x\|^2,$$

for $z = x + t(y-x)$, $0<t\leq 1$. These results, established in "Characterizations of Strongly Quasiconvex Functions" (Hadjisavvas et al., 25 Sep 2025), also connect to first-order conditions (via Dini derivatives), giving integral-free criteria for strong quasiconvexity even in the nonsmooth case.

A key consequence is an improved quadratic growth property at the global minimizer $\bar x$: strong quasiconvexity with modulus $\gamma$ implies

$$h(y)\geq h(\bar x) + \frac{\gamma}{4}\,\|y-\bar x\|^2$$

for all $y\in C$, a growth rate that sits sharply between classical quasiconvexity (no growth lower bound) and strong convexity ($\gamma/2$ prefactor). This supports global linear convergence rates of first-order algorithms and facilitates the analysis of acceleration schemes in nonconvex regimes (Hadjisavvas et al., 25 Sep 2025).
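As a quick numerical sanity check (an illustration, not taken from the cited paper), the three inequalities above can be verified for the model function $h(x)=\|x\|^2$, which is 2-strongly convex and hence strongly quasiconvex with any modulus $\gamma\le 2$:

```python
import numpy as np

rng = np.random.default_rng(0)
h = lambda x: float(x @ x)   # ||x||^2 is 2-strongly convex, hence strongly
gamma = 1.0                  # quasiconvex with any modulus gamma <= 2

def ok(x, y, t):
    z = x + t * (y - x)
    d2 = float((y - x) @ (y - x))
    # defining inequality of strong quasiconvexity
    ok_def = h(z) <= max(h(x), h(y)) - 0.5 * gamma * t * (1 - t) * d2 + 1e-10
    # endpoint characterization: h(x) <= h(z) => h(z) <= h(y) - (gamma/4)(1-t^2)||y-x||^2
    ok_end = h(x) > h(z) or h(z) <= h(y) - 0.25 * gamma * (1 - t ** 2) * d2 + 1e-10
    return ok_def and ok_end

all_ok = all(ok(rng.normal(size=3), rng.normal(size=3), t)
             for _ in range(200) for t in np.linspace(0.01, 1.0, 25))

# quadratic growth at the global minimizer x_bar = 0:
# h(y) >= h(x_bar) + (gamma/4) ||y - x_bar||^2
growth_ok = all(h(y) >= 0.25 * gamma * float(y @ y) - 1e-10
                for y in rng.normal(size=(200, 3)))
print(all_ok, growth_ok)
```

The small tolerances absorb floating-point round-off; with $\gamma$ strictly below the strong-convexity modulus, every check holds with slack.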

2. Improved Characterization Techniques: Linear Perturbations and Convexity Detection

A pivotal refinement at the interface between convexity and quasi-convexity is the detection of convexity by linear perturbations. The main theorem of "Linear Perturbations of Quasiconvex Functions and Convexity" (Duy et al., 2015) states that, for a function $f:C\to\mathbb{R}$ that is radially lower stable at flat boundary points,

$$f \text{ is convex} \iff \exists\, c^* \not\equiv \text{const}:\; f+\lambda c^* \text{ is quasiconvex } \forall\,\lambda\in\mathbb{R}.$$

This collapses the need to check all linear perturbations down to a single non-constant direction, provided a mild boundary-regularity condition (radial lower stability) holds. It sharpens the classical "all perturbations" results and exploits two-sided variation in the linear form to force the convex-combination property, thereby precisely bridging quasi-convexity under perturbation and true convexity.
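A small 1-D illustration of the theorem's content (using a grid-based unimodality test of my own devising, not from the paper): for convex $f(x)=x^2$, every perturbation $f+\lambda x$ remains quasiconvex, whereas for the quasiconvex but nonconvex $f(x)=\sqrt{|x|}$ a single linear perturbation already destroys quasiconvexity:

```python
import numpy as np

def is_quasiconvex_1d(vals, tol=1e-9):
    """Grid test: values must be nonincreasing up to their minimum and
    nondecreasing after it (i.e., sublevel sets are intervals)."""
    m = int(np.argmin(vals))
    left = np.all(np.diff(vals[: m + 1]) <= tol)
    right = np.all(np.diff(vals[m:]) >= -tol)
    return bool(left and right)

x = np.linspace(-2.0, 1.0, 4001)

# convex f: every linear perturbation f + lam*x stays quasiconvex
convex_ok = all(is_quasiconvex_1d(x**2 + lam * x)
                for lam in np.linspace(-3, 3, 13))

# quasiconvex but nonconvex f = sqrt(|x|): the perturbation f + x
# acquires a local max near x = -1/4, so quasiconvexity fails
broken = not is_quasiconvex_1d(np.sqrt(np.abs(x)) + 1.0 * x)
print(convex_ok, broken)
```

The failure for $\sqrt{|x|}+x$ shows why the theorem needs two-sided variation in $\lambda$: a nonconvex quasiconvex function cannot survive all perturbations along a fixed non-constant direction.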

3. Minimax and Optimization Principles: Product Measure Extensions and Robust Applications

A far-reaching generalization of the Bauer maximum principle has been developed for quasi-convex, lower semicontinuous functionals on high-dimensional products of measure classes, bypassing compactness via barycentric representations. For product sets $A=A_1\otimes\cdots\otimes A_d$ of convex measure classes, with each $A_i$ generated by extreme points $\Delta_i$ and integral representation properties, and under generalized moment constraints,

$$\sup_{\mu\in A^N} f(\mu) \;=\; \sup_{\substack{\mu_i\in \Delta_i^{(N_i+N)}\ \forall i \\ E_\mu[\phi^{(j)}]\leq 0\ \forall j}} f(\mu),$$

where $A^N$ is the class of measures satisfying the constraints and $\Delta_i^{(N_i+N)}$ denotes mixtures of a finite number ($N_i+N$) of extreme points. This result, established in "Optimization Of Quasi-convex Function Over Product Measure Sets" (Stenger et al., 2019), encompasses noncompact infinite-dimensional settings, strictly generalizing earlier minimax reductions for affine or convex functionals.

Consequences include:

  • Worst-case quantile bounds for computer experiments, reducing infinite-dimensional quantile optimization to finite combinatorial search over mixtures.
  • Robust Bayesian analysis, simplifying extremal posterior or predictive computations under moment constraints on priors to a finite mixture framework.
  • Sharp worst-case sensitivity bounds for uncertainty quantification indices (e.g., Sobol’).

The core property enabling these reductions is the improved quasi-convex optimization principle on barycentrically representable product spaces (Stenger et al., 2019).
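To make the reduction concrete, here is a toy instance (all support points and numbers are hypothetical, not from the paper): maximize a 0.9-quantile, which is quasiconvex in the underlying measure, over mixtures of three Dirac masses subject to a mean constraint. A dense search over the weight simplex agrees with evaluating only the extreme points of the constrained set:

```python
import numpy as np

support = np.array([0.0, 1.0, 2.0])   # Dirac locations (toy choice)
alpha, mean_cap = 0.9, 1.0            # quantile level; moment constraint E[X] <= 1

def quantile(w):
    """alpha-quantile of the mixture sum_i w[i]*delta_{support[i]} (quasiconvex in w)."""
    return float(support[np.searchsorted(np.cumsum(w), alpha)])

def feasible(w):
    return w.min() >= 0 and abs(w.sum() - 1) < 1e-9 and support @ w <= mean_cap + 1e-9

# brute-force search over a dense grid on the weight simplex
grid_best = 0.0
for a in np.arange(0, 1.0001, 0.02):
    for b in np.arange(0, 1.0001 - a, 0.02):
        w = np.array([a, b, 1 - a - b])
        if feasible(w):
            grid_best = max(grid_best, quantile(w))

# vertices of {w >= 0, sum w = 1, E[X] <= 1}, enumerated by hand for this toy polytope
vertices = [np.array(v, float) for v in [(1, 0, 0), (0, 1, 0), (0.5, 0, 0.5)]]
vertex_best = max(quantile(v) for v in vertices)
print(grid_best, vertex_best)
```

Both searches return the worst-case quantile 2, attained at the two-point mixture $(0.5,0,0.5)$, mirroring the finite-mixture reduction in the general theorem.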

4. Accelerated and Stochastic Optimization Beyond Classical Convexity

Generalized weak/strong quasi-convexity properties have underpinned algorithmic advances in both deterministic and stochastic optimization.

Accelerated rates matching convex minimization are achievable for $\alpha$-weakly-quasi-convex functions (with parameter $\alpha\in(0,1]$), a class generalizing star-convexity and requiring no convexity:

$$\alpha\,(f(x)-f^*) \leq \langle\nabla f(x),\, x-x^*\rangle.$$

Optimal $O(1/k^2)$ rates for first-order methods, up to a sharp dependence on $\alpha^{-2}$, are established for Sequential Subspace Optimization (SESOP); similarly, conjugate-gradient-type schemes attain $O(\sqrt{L/\mu})$ complexity with an $\alpha^{-1}$ penalty (Guminov et al., 2017). These benchmarks are proven tight via matching lower bounds.
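The parameter $\alpha$ can be estimated numerically as the infimum of $\langle\nabla f(x), x-x^*\rangle/(f(x)-f^*)$. A sketch on the nonconvex function $f(x)=x^2+3\sin^2 x$ (my choice of test function, with $x^*=0$, $f^*=0$):

```python
import numpy as np

# nonconvex test function with unique global minimizer x* = 0, f* = 0
f  = lambda x: x**2 + 3.0 * np.sin(x)**2
df = lambda x: 2.0 * x + 3.0 * np.sin(2.0 * x)

x = np.linspace(-10, 10, 200001)
x = x[np.abs(x) > 1e-3]          # avoid 0/0 at the minimizer

# alpha-weak quasi-convexity: alpha*(f(x) - f*) <= <grad f(x), x - x*>
ratio = (df(x) * x) / f(x)       # with x* = 0 and f* = 0
alpha_hat = float(ratio.min())
print(round(alpha_hat, 3))
```

The estimated $\alpha$ is strictly positive (roughly $0.5$ on this grid), so the accelerated guarantees above apply despite the nonconvexity.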

In the stochastic setting, Normalized Gradient Descent (NGD) and its mini-batch stochastic variant (SNGD) converge globally on strictly locally quasi-convex, locally Lipschitz functions, but require a batch size proportional to $1/\epsilon^2$ rather than the $1/\epsilon$ typical for convex SGD. Quasi-convexity guarantees global convergence in the absence of strict convexity or smoothness, provided the function landscape satisfies a local geometric condition quantifying directional descent in a neighborhood of the global minimizer (Hazan et al., 2015).
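A minimal deterministic NGD sketch (my own illustration of the normalized-gradient update, not the SNGD algorithm from the paper) on the quasiconvex, nonconvex function $f(x)=\sqrt{\|x\|}$, whose raw gradient blows up near the minimizer while the normalized step stays controlled:

```python
import numpy as np

def ngd(grad, x0, step, iters, f):
    """Normalized gradient descent: x <- x - step * g/||g||; returns best f seen."""
    x, best = np.asarray(x0, float), np.inf
    for _ in range(iters):
        g = grad(x)
        n = np.linalg.norm(g)
        if n < 1e-12:            # numerically stationary
            break
        x = x - step * g / n
        best = min(best, f(x))
    return best

# f(x) = sqrt(||x||): quasiconvex, nonconvex, non-Lipschitz gradient at 0
f = lambda x: np.sqrt(np.linalg.norm(x))
grad = lambda x: x / (2.0 * np.linalg.norm(x) ** 1.5)

best = ngd(grad, x0=[10.0, 7.0], step=0.01, iters=2000, f=f)
print(best)
```

Because the normalized step has fixed length, the iterates march straight to a ball of radius `step` around the minimizer and then oscillate inside it, so the best iterate satisfies $f \lesssim \sqrt{\text{step}}$ here.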

5. Extensions to Generalized Convexity Structures and Geometric Settings

Quasi-convex properties have been systematically generalized to broader contexts:

  • $X$-convexity and quasi-$X$-convexity: These notions replace linear convex combinations with mappings involving a "shift" $g$, allowing domains that are unions of convex sets, sets with discrete gaps, or non-linear retractions. Closure properties, level-set characterizations, and minimizer uniqueness/generalization results hold analogously to classical convex analysis (Ali et al., 2022).
  • Quasi-convex families and descent paths: Steepest descent and self-expanding path concepts for quasiconvex families lead to rectifiability and stability results for sublevel-set-ordered curves. The existence, uniqueness, length bounds, and regularity for such descent paths extend classical Morse-theoretic and variational results to the entirely nonsmooth context (Longinetti et al., 2013).
  • Quasi-convex subsets in non-Euclidean spaces: In Alexandrov spaces with curvature bounds, quasi-convex subsets (as defined via comparison angles at distance minimizers) incorporate both classical convex sets and extremal sets, and are preserved under tangentialization. They admit geometric rigidity, comparison, and dimension properties, bridging convex geometry in smooth manifolds and metric geometry in singular spaces (Su et al., 2020).

6. Analytical and Inequality Improvements: Quadrature, Approximation, and Beyond

Improved quasi-convexity has led to sharper analytic inequalities, notably for quadrature error estimates:

  • Simpson-type inequalities: For $f$ with $|f^{(4)}|$ quasi-convex, the classical Simpson constant $1/2880$ is improved to $1/5760$, applied to endpoint/midpoint maxima of $|f^{(4)}|$, giving strictly better remainder bounds whenever $|f^{(4)}|$ is smaller away from its global maxima. This demonstrates that quasi-convexity can yield provably tighter analytic estimates in applied numerical-analysis settings (Alomari, 2016).
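The comparison can be checked numerically for $f=\exp$ on $[0,1]$, where $|f^{(4)}|=e^x$ is monotone and hence quasi-convex. The form of the $1/5760$ bound below is one plausible reading of the improved estimate (pairing the constant with endpoint/midpoint maxima of $|f^{(4)}|$); the precise statement is in the cited paper:

```python
import math

a, b = 0.0, 1.0
m = 0.5 * (a + b)
f = math.exp                         # f'''' = exp as well

# basic Simpson rule on [a, b] and its true error
simpson = (b - a) / 6.0 * (f(a) + 4.0 * f(m) + f(b))
exact = math.e - 1.0                 # integral of exp over [0, 1]
err = abs(simpson - exact)

# classical bound: (b-a)^5 / 2880 * max |f''''|
classical = (b - a) ** 5 / 2880.0 * math.e
# quasi-convexity-based bound with the sharper 1/5760 constant (assumed form)
improved = (b - a) ** 5 / 5760.0 * (max(f(a), f(m)) + max(f(m), f(b)))
print(err, improved, classical)
```

On this instance the true error is below the quasi-convexity-based bound, which is in turn strictly below the classical $1/2880$ bound.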

Such advances suggest a broader applicability of quasi-convexity notions and their improvements to further variational inequalities, probabilistic bounds, and estimation theory.

7. Open Directions and Broader Impact

Current advances in improved quasi-convex properties motivate questions spanning analysis, optimization, and geometry:

  • The potential to weaken regularity/stability conditions (e.g., in boundary stability for linear perturbation detection).
  • Extension to operator-theoretic analogues, multidimensional quadrature, and more exotic convexity notions (such as semi-strict or non-linear perturbation classification).
  • Broadening algorithmic frameworks (e.g., acceleration, robust optimization) to encompass generalized convexity without dependence on strong smoothness or compactness.

These refined quasi-convex properties unify and generalize key threads in convexity theory, optimization, geometry, and analysis, establishing a powerful toolkit for both theoretical inquiry and practical optimization in high-dimensional, nonconvex, and uncertainty-quantified regimes.

Citations: (Hadjisavvas et al., 25 Sep 2025, Duy et al., 2015, Stenger et al., 2019, Guminov et al., 2017, Hazan et al., 2015, Longinetti et al., 2013, Ali et al., 2022, Su et al., 2020, Alomari, 2016)
