
Min-Max Optimization without Gradients: Convergence and Applications to Adversarial ML (1909.13806v3)

Published 30 Sep 2019 in cs.LG, math.OC, and stat.ML

Abstract: In this paper, we study the problem of constrained robust (min-max) optimization in a black-box setting, where the desired optimizer cannot access the gradients of the objective function but may query its values. We present a principled optimization framework, integrating a zeroth-order (ZO) gradient estimator with an alternating projected stochastic gradient descent-ascent method, where the former requires only a small number of function queries and the latter needs just a one-step descent/ascent update. We show that the proposed framework, referred to as ZO-Min-Max, has a sub-linear convergence rate under mild conditions and scales gracefully with problem size. On the application side, we explore a promising connection between black-box min-max optimization and black-box evasion and poisoning attacks in adversarial ML. Our empirical evaluations on these use cases demonstrate the effectiveness of our approach and its scalability to dimensions that prohibit using recent black-box solvers.
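To make the two ingredients in the abstract concrete, here is a minimal sketch of the overall idea: a random-direction zeroth-order gradient estimator plugged into alternating one-step projected descent (on x) and ascent (on y). All names, step sizes, box constraints, the query budget q, and the toy saddle objective below are illustrative assumptions, not the paper's exact estimator constants or tuned settings.

import numpy as np

def zo_gradient(f, x, mu=1e-3, q=10, rng=None):
    # Random-direction ZO estimate of grad f(x) from function values only,
    # using q unit directions and forward differences with smoothing mu.
    # (A standard estimator of this family; constants here are illustrative.)
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    g = np.zeros(d)
    fx = f(x)
    for _ in range(q):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)
        g += (f(x + mu * u) - fx) / mu * u
    return (d / q) * g

def project_box(z, lo, hi):
    # Euclidean projection onto the box [lo, hi]^d.
    return np.clip(z, lo, hi)

def zo_min_max(f, x0, y0, alpha=0.02, beta=0.05, iters=200,
               x_box=(-1.0, 1.0), y_box=(-1.0, 1.0)):
    # Alternating one-step projected descent/ascent driven by ZO gradient
    # estimates, in the spirit of ZO-Min-Max (hypothetical settings).
    x, y = x0.copy(), y0.copy()
    for _ in range(iters):
        gx = zo_gradient(lambda x_: f(x_, y), x)   # estimate grad_x f(x, y)
        x = project_box(x - alpha * gx, *x_box)    # one descent step on x
        gy = zo_gradient(lambda y_: f(x, y_), y)   # estimate grad_y f(x, y)
        y = project_box(y + beta * gy, *y_box)     # one ascent step on y
    return x, y

# Toy strongly-convex-strongly-concave saddle problem (assumed example):
#   min_x max_y  x^T A y + ||x||^2 / 2 - ||y||^2 / 2
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5)) / 5
    f = lambda x, y: x @ A @ y + 0.5 * x @ x - 0.5 * y @ y
    x, y = zo_min_max(f, rng.standard_normal(5), rng.standard_normal(5))
    print("x ~", np.round(x, 3))
    print("y ~", np.round(y, 3))

Each outer iteration costs roughly 2(q + 1) function evaluations and performs exactly one projected update per variable, which is the "small number of function queries" and "one-step descent/ascent" trade-off the abstract describes.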

Authors (8)
  1. Sijia Liu (204 papers)
  2. Songtao Lu (60 papers)
  3. Xiangyi Chen (16 papers)
  4. Yao Feng (26 papers)
  5. Kaidi Xu (85 papers)
  6. Abdullah Al-Dujaili (15 papers)
  7. Mingyi Hong (1 paper)
  8. Una-May O'Reilly (43 papers)
Citations (26)
