BIRS Workshop Lecture Videos
Error bounds and convergence of proximal methods for composite minimization
Lewis, Adrian
Description
Minimizing a simple nonsmooth outer function composed with a smooth inner map offers a versatile framework for structured optimization. A unifying algorithmic idea solves easy subproblems involving the linearized inner map and a proximal penalty on the step. I sketch a typical such algorithm, ProxDescent, illustrating computational results and representative basic convergence theory. Although such simple methods may be slow without second-order acceleration, eventual linear convergence is common. An intuitive explanation is a generic quadratic growth property, a condition equivalent to an "error bound" involving the algorithm's stepsize. The stepsize is therefore a natural termination criterion, an idea that extends to more general Taylor-like optimization models.
Joint work with D. Drusvyatskiy (Washington), A. Ioffe (Technion), S.J. Wright (Wisconsin)
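The subproblem idea in the abstract can be sketched in code. The following is a minimal illustration, not the algorithm presented in the talk: to keep the subproblem in closed form it takes the outer function h = ½‖·‖² (the Levenberg-Marquardt special case of the prox-linear scheme), and, echoing the abstract, it uses the step norm ‖d‖ as the termination criterion. All names and parameters are illustrative.

```python
import numpy as np

def prox_descent(c, J, x0, t=1.0, tol=1e-8, max_iter=200):
    """Prox-linear sketch for min_x h(c(x)), with h = 0.5*||.||^2.

    Each iteration solves the linearized proximal subproblem
        min_d  h(c(x) + J(x) d) + ||d||^2 / (2 t),
    which for this h has the closed form
        d = -(J^T J + I/t)^{-1} J^T c(x).
    The step norm ||d|| serves as the stopping test.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        cx, Jx = c(x), J(x)
        n = Jx.shape[1]
        # Closed-form solution of the proximal subproblem.
        d = -np.linalg.solve(Jx.T @ Jx + np.eye(n) / t, Jx.T @ cx)
        if np.linalg.norm(d) < tol:        # small step: near-stationary, stop
            break
        if c(x + d) @ c(x + d) < cx @ cx:  # simple decrease check
            x = x + d
        else:
            t *= 0.5                       # shrink proximal parameter, retry
    return x

# Toy instance: Rosenbrock residuals c(x) = (1 - x0, 10*(x1 - x0^2));
# the composite objective 0.5*||c(x)||^2 is minimized at x = (1, 1).
c = lambda x: np.array([1.0 - x[0], 10.0 * (x[1] - x[0] ** 2)])
J = lambda x: np.array([[-1.0, 0.0], [-20.0 * x[0], 10.0]])
x_star = prox_descent(c, J, np.array([-1.2, 1.0]))
```

Swapping in a genuinely nonsmooth h (say ‖·‖₁ or a max of affine functions) keeps the outer loop identical but requires an inner solver for the subproblem, which is where the "easy subproblems" structure of the framework matters.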
Item Metadata

Title: Error bounds and convergence of proximal methods for composite minimization
Creator: Lewis, Adrian
Publisher: Banff International Research Station for Mathematical Innovation and Discovery
Date Issued: 2017-09-21T15:01
Extent: 37 minutes
File Format: video/mp4
Language: eng
Notes: Author affiliation: Cornell University
Date Available: 2018-03-26
Provider: Vancouver : University of British Columbia Library
Rights: Attribution-NonCommercial-NoDerivatives 4.0 International
DOI: 10.14288/1.0364467
Peer Review Status: Unreviewed
Scholarly Level: Faculty
Aggregated Source Repository: DSpace