UBC Theses and Dissertations

Set-restricted isometry for sub-Gaussian matrices and inversion of deep generative models

Li, Xiaowei

Abstract

Sub-Gaussian random mappings are widely used in modern signal processing, compressed sensing and machine learning. Their performance is often captured by how close they are to an isometry on given datasets. In the first topic of this thesis, we study when sub-Gaussian matrices can become near isometries on arbitrary sets. We show that a previous result by Liaw, Mehrabian, Plan and Vershynin in 2017 on this subject has a sub-optimal dependence on the sub-Gaussian norms, and we present the optimal dependence. We also generalize this result by relaxing the row-independence condition. Furthermore, as tools used in our proof, we develop a new Bernstein-type inequality and a new Hanson-Wright inequality, both with improved bounds in the sub-Gaussian regime under certain moment constraints. Finally, we illustrate how our new results can be applied to some popular applications. In particular, we obtain a significant improvement in the characterization of the null space property for 0-1 matrices.

Deep generative neural networks have recently become increasingly popular for modeling certain classes of signals or images. In the second topic of this thesis, we study a novel algorithm called Partially Linearized Update for Generative Inversion (PLUGIn) for solving inverse problems with deep generative models. We show convergence of PLUGIn (under certain assumptions) and validate it with numerical experiments. One distinction in our analysis is that it allows for networks with possibly contractive layers, whereas similar analyses usually assume strictly expansive layers. Our results suggest that PLUGIn is an effective algorithm for denoising and/or compressive sensing with generative priors.
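To give a concrete sense of the near-isometry property studied in the first topic, the following sketch draws a scaled Gaussian matrix (one simple instance of a sub-Gaussian matrix) and measures how much it distorts the norms of a small finite set of unit vectors. This is only an illustrative numerical check with arbitrarily chosen dimensions, not the thesis's result, which covers general sub-Gaussian distributions and arbitrary (possibly infinite) sets.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, num_points = 200, 1000, 50

# Gaussian matrix scaled so that E||Ax||^2 = ||x||^2 for any fixed x.
A = rng.standard_normal((m, n)) / np.sqrt(m)

# A small "set" T: random points on the unit sphere in R^n.
T = rng.standard_normal((num_points, n))
T /= np.linalg.norm(T, axis=1, keepdims=True)

# Distortion of A on T: how far ||Ax|| is from ||x|| = 1 for each x in T.
distortions = np.abs(np.linalg.norm(T @ A.T, axis=1) - 1.0)
print(f"max distortion on T: {distortions.max():.3f}")
```

With m = 200 rows the fluctuation of each ‖Ax‖ around 1 is on the order of 1/√m, so the reported maximum distortion over the 50 points is small; shrinking m makes the distortion grow, which is the trade-off the set-restricted isometry results quantify.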


Rights

Attribution-NonCommercial-ShareAlike 4.0 International