Title: Solving stochastic weak Minty variational inequalities without increasing batch size
Authors: Pethick, Thomas Michaelsen; Fercoq, Olivier; Latafat, Puya; Patrinos, Panagiotis; Cevher, Volkan
Date: 2023-02-14 (2023)
URL: https://infoscience.epfl.ch/handle/20.500.14299/194860
Type: text::conference output::conference poster not in proceedings
Keywords: ml-ai; Variational inequalities; Stochastic first-order methods; Nonconvex-nonconcave; Minimax

Abstract: This paper introduces a family of stochastic extragradient-type algorithms for a class of nonconvex-nonconcave problems characterized by the weak Minty variational inequality (MVI). Unlike existing results on extragradient methods in the monotone setting, employing diminishing stepsizes is no longer possible in the weak MVI setting. This has led to approaches such as increasing the batch size per iteration, which can, however, be prohibitively expensive. In contrast, our proposed methods involve two stepsizes and only require one additional oracle evaluation per iteration. We show that it is possible to keep one stepsize fixed while only the second stepsize is taken to be diminishing, making the schemes interesting even in the monotone setting. Almost sure convergence is established, and we provide a unified analysis for this family of schemes, which contains a nonlinear generalization of the celebrated primal-dual hybrid gradient algorithm.
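To illustrate the two-stepsize idea described in the abstract, the following is a minimal sketch of a generic stochastic extragradient-type loop with a fixed extrapolation stepsize and a diminishing update stepsize. The operator F, the noise model, the toy bilinear saddle problem, and the stepsize choices (gamma, alpha_k) are illustrative assumptions only, not the paper's exact scheme or constants.

```python
import numpy as np

# Toy stochastic operator for a bilinear saddle problem
#   min_x max_y  x^T A y,  with F(z) = (A y, -A^T x) for z = (x, y).
# Problem and noise model are assumptions for illustration.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))

def F(z, noise_scale=0.1):
    x, y = z[:5], z[5:]
    grad = np.concatenate([A @ y, -A.T @ x])
    return grad + noise_scale * rng.standard_normal(grad.shape)

# Generic two-stepsize stochastic extragradient-type loop:
# gamma stays fixed, alpha_k diminishes, and each iteration
# uses exactly one extra oracle call (the extrapolation step).
z = rng.standard_normal(10)
gamma = 0.1                        # fixed extrapolation stepsize
for k in range(1, 2001):
    alpha = 1.0 / np.sqrt(k)       # diminishing update stepsize
    z_bar = z - gamma * F(z)       # extrapolation (first oracle call)
    z = z - alpha * F(z_bar)       # update (second oracle call)

print("final noise-free residual ||F(z)||:", np.linalg.norm(F(z, noise_scale=0.0)))
```

The residual norm printed at the end is only a rough sanity check on the toy monotone instance; it does not reproduce the weak MVI setting or the convergence guarantees analyzed in the paper.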