UBC Theses and Dissertations


First-order methods for structured optimization
Fang, Huang

Abstract

First-order methods have gained substantial interest over the past two decades because of their strong performance on today's large-scale problems. In this thesis, we study several widely used first-order methods for problems that satisfy certain structural assumptions.

In the first part, we contribute to coordinate optimization and show that greedy coordinate descent (GCD) has an implicit screening ability: it tends to select coordinates that are nonzero at the solution, which explains why GCD works exceptionally well on problems that admit sparse solutions. We also extend the safe-screening rule based on the duality gap to atomic-norm regularized problems.

In the second part, we study online mirror descent (OMD) with an unknown time horizon and an unbounded domain, a setting in which OMD is known to suffer linear regret. We provide a stabilization technique and show that stabilized OMD achieves sublinear regret. We also establish a connection between stabilized OMD and dual averaging.

In the third part, we derive an improved iteration complexity for the stochastic subgradient method on over-parameterized models that satisfy an interpolation condition. The obtained iteration complexity matches the rate of the stochastic gradient method applied to smooth problems under the same interpolation condition. Our analysis partially explains the empirical observation that nonsmoothness in modern machine learning models sometimes does not slow down training.
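To make the first part concrete, the following is a minimal sketch of greedy coordinate descent for a lasso problem, one standard setting with sparse solutions. It is an illustration, not the thesis's code: the function greedy_cd_lasso and all parameter choices are assumptions made here. The greedy (Gauss-Southwell-type) rule updates the coordinate whose proximal step would change the most.

    import numpy as np

    def soft_threshold(z, t):
        # Elementwise prox of t*|.| (soft-thresholding).
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def greedy_cd_lasso(A, b, lam, n_iters=500):
        # Greedy coordinate descent for 0.5*||Ax - b||^2 + lam*||x||_1.
        # Assumes A has no zero columns.
        m, n = A.shape
        L = np.sum(A * A, axis=0)        # coordinate-wise Lipschitz constants
        x = np.zeros(n)
        r = -b                           # residual Ax - b at x = 0
        for _ in range(n_iters):
            g = A.T @ r                  # gradient of the smooth part
            # Candidate proximal update for every coordinate.
            x_cand = soft_threshold(x - g / L, lam / L)
            i = int(np.argmax(np.abs(x_cand - x)))  # greedy rule: largest change
            delta = x_cand[i] - x[i]
            if delta == 0.0:
                break                    # no coordinate makes progress
            x[i] += delta
            r += delta * A[:, i]         # keep the residual in sync
        return x

    # Example usage on a sparse synthetic problem:
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 50))
    x_true = np.zeros(50)
    x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(200)
    x_hat = greedy_cd_lasso(A, b, lam=0.5)

On such a problem, tracking which coordinates the greedy rule selects illustrates the screening behavior summarized above: most updates land on coordinates that are nonzero at the solution.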
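The interpolation condition referenced in the third part is commonly formalized as follows; this is the standard finite-sum statement and is given here as an assumption about the thesis's setting, which may use a variant. In LaTeX:

    f(x) = \frac{1}{n}\sum_{i=1}^{n} f_i(x),
    \qquad
    \exists\, x^\star \in \operatorname*{arg\,min}_x f(x)
    \ \text{with}\ f_i(x^\star) = \min_x f_i(x)\ \text{for all } i.

Intuitively, an over-parameterized model that interpolates the data drives every individual loss to its minimum simultaneously, so the stochastic (sub)gradient noise vanishes at the solution; this is the mechanism that makes the faster smooth-case rates plausible for the subgradient method.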


Rights

Attribution-NonCommercial-NoDerivatives 4.0 International