BIRS Workshop Lecture Videos
A Unified Approach to Error Bounds for Structured Convex Optimization Problems
So, Anthony Man-Cho
Description
In this talk, we present a new framework for establishing error bounds for a class of structured convex optimization problems in which the objective function is the sum of a smooth convex function and a general closed proper convex function. This class encapsulates fairly general constrained minimization problems, as well as various regularized loss minimization formulations in machine learning, signal processing, and statistics. Our framework not only allows us to recover a number of existing error bound results in a unified and transparent manner but also leads to new error bounds for problems whose objective functions do not necessarily have a polyhedral epigraph. As an illustration, we show that a class of nuclear-norm regularized loss minimization problems admits an error bound under a strict complementarity-type regularity condition. We then complement this result by constructing an example showing that the error bound can fail to hold without the regularity condition. If time permits, we also discuss the algorithmic implications of our results, particularly for the convergence rate analysis of a family of successive quadratic approximation methods for solving the aforementioned class of problems.
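For context, the following is a minimal sketch of the problem class and the kind of error bound the abstract refers to. The notation (the residual map, the constant kappa) is illustrative, chosen in the style of the Luo-Tseng error bound literature, and is not taken from the talk itself.

```latex
% A sketch of the structured convex problem class and a
% Luo--Tseng-type error bound of the kind described above.
% All notation here is illustrative, not taken from the talk.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

The problems in question take the composite form
\[
  \min_{x \in \mathbb{R}^n} \; F(x) := f(x) + h(x),
\]
where $f$ is smooth and convex and $h$ is closed, proper, and convex;
for instance, $h$ may be the indicator function of a constraint set,
or a regularizer such as the nuclear norm $h(X) = \mu \|X\|_{*}$.

A (local) error bound asserts that the distance to the optimal
solution set $\mathcal{X}^{*}$ is controlled by a computable
residual: there is a constant $\kappa > 0$ such that
\[
  \operatorname{dist}(x, \mathcal{X}^{*})
  \le \kappa \left\| x - \operatorname{prox}_{h}\!\bigl(x - \nabla f(x)\bigr) \right\|
\]
for all $x$ in a suitable test set (e.g., with $F(x)$ bounded and the
residual sufficiently small). Bounds of this type are what yield
linear convergence rates for proximal-gradient-type (successive
quadratic approximation) methods.

\end{document}
```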
Item Metadata
Title | A Unified Approach to Error Bounds for Structured Convex Optimization Problems
Creator | So, Anthony Man-Cho
Publisher | Banff International Research Station for Mathematical Innovation and Discovery
Date Issued | 2017-09-18T16:31
Extent | 33 minutes
File Format | video/mp4
Language | eng
Notes | Author affiliation: The Chinese University of Hong Kong
Date Available | 2018-03-24
Provider | Vancouver : University of British Columbia Library
Rights | Attribution-NonCommercial-NoDerivatives 4.0 International
DOI | 10.14288/1.0364442
Peer Review Status | Unreviewed
Scholarly Level | Faculty
Aggregated Source Repository | DSpace