BIRS Workshop Lecture Videos

Golden Ratio Algorithms for Variational Inequalities
Speaker: Malitsky, Yura

Description

We present several novel methods for solving general (pseudo-) monotone variational inequalities. The first method uses a fixed stepsize and is similar to the proximal reflected gradient method: it also requires only one evaluation of the operator and one prox-operator per iteration. Its extension, the dynamic version, has a notable distinction: in every iteration it chooses a stepsize based on local information about the operator, without running any linesearch procedure. Thus, the iteration cost of this method is almost the same as that of the fixed-stepsize method, yet it converges without the Lipschitz assumption on the operator. We further discuss possible generalizations of the methods, in particular for solving large-scale nonlinear saddle point problems. Some numerical experiments are reported.
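The abstract does not spell out the update rule, but a minimal Python sketch of a golden-ratio-style fixed-stepsize iteration of the kind described above (one operator evaluation and one projection per iteration) might look as follows. The averaging with the golden ratio phi and the stepsize bound lam <= phi/(2L) follow the published golden ratio algorithm; the affine operator F, the box constraint, and all constants are illustrative assumptions, not taken from the talk itself.

```python
import numpy as np

# Sketch for the monotone VI: find x* with <F(x*), x - x*> >= 0 for all x in C.
# Here C is a box and F is an affine monotone operator (assumed example).
phi = (1 + np.sqrt(5)) / 2          # golden ratio


def F(x, M, q):
    """Affine monotone operator F(x) = M x + q (example choice)."""
    return M @ x + q


def proj_box(x, lo, hi):
    """Prox step for the indicator of the box [lo, hi]^n: a projection."""
    return np.clip(x, lo, hi)


def graal_fixed(x0, M, q, lo, hi, L, iters=1000):
    """Fixed-stepsize golden-ratio iteration: one F-evaluation and one
    projection per iteration, with stepsize lam <= phi / (2 L)."""
    lam = phi / (2 * L)
    x, x_bar = x0.copy(), x0.copy()
    for _ in range(iters):
        x_bar = ((phi - 1) * x + x_bar) / phi            # averaging step
        x = proj_box(x_bar - lam * F(x, M, q), lo, hi)   # forward + prox step
    return x


# Usage example with a rotation-like (monotone, not strongly monotone) operator.
M = np.array([[0.0, 1.0], [-1.0, 0.0]])
q = np.array([0.5, -0.2])
L = np.linalg.norm(M, 2)             # Lipschitz constant of F
x_star = graal_fixed(np.zeros(2), M, q, lo=-1.0, hi=1.0, L=L)
print("approximate solution:", x_star)
```

The adaptive variant mentioned in the abstract would replace the fixed lam with a stepsize computed in each iteration from local differences of iterates and operator values, removing the need to know L in advance.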

Rights

Attribution-NonCommercial-NoDerivatives 4.0 International