The paper studies finite-sum monotone inclusion problems, which arise frequently in machine learning and model broad classes of equilibrium problems. The authors propose variants of the classical Halpern iteration that employ variance reduction to improve complexity guarantees, covering the settings where the operators in the finite sum are either cocoercive or Lipschitz continuous and monotone. These are the first variance reduction-type results for general finite-sum monotone inclusions, as well as for more specific problems such as convex-concave optimization. The authors argue that the complexity of their methods is near-optimal.
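
For context, the sketch below shows the classical (non-variance-reduced) Halpern iteration applied to a single cocoercive operator, which is the starting point the paper builds on. The quadratic operator, step size, and anchoring weights here are illustrative assumptions, not the authors' variance-reduced method.

```python
import numpy as np

def halpern_iteration(F, x0, eta, num_iters):
    """Classical Halpern iteration for a cocoercive operator F, solving 0 in F(x):
    x_{k+1} = lam_k * x0 + (1 - lam_k) * (x_k - eta * F(x_k)),
    with the standard anchoring weights lam_k = 1 / (k + 2)."""
    x = x0.copy()
    for k in range(num_iters):
        lam = 1.0 / (k + 2)
        x = lam * x0 + (1 - lam) * (x - eta * F(x))
    return x

# Illustrative example (assumption): F(x) = A x - b with A symmetric PSD,
# so F is cocoercive and the forward step is nonexpansive for eta <= 1/L.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M.T @ M
b = rng.standard_normal(5)
F = lambda x: A @ x - b

x_out = halpern_iteration(F, x0=np.zeros(5), eta=1.0 / np.linalg.norm(A, 2), num_iters=2000)
print("operator residual norm:", np.linalg.norm(F(x_out)))
```

The paper's methods replace the full-operator evaluation in this template with variance-reduced stochastic estimates over the finite sum, which is what yields the improved complexity guarantees.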


Publication date: 4 Oct 2023
Project Page: https://arxiv.org/abs/2310.02987v1
Paper: https://arxiv.org/pdf/2310.02987