UBC Theses and Dissertations
A sparsity-free compressed sensing theory with applications in generative model recovery
Naderi, Alireza
We study the problem of reconstructing a high-dimensional signal x∈ℝⁿ from a low-dimensional noisy linear measurement y=Mx+e∈ℝˡ, assuming x admits a certain structure. We model the measurement matrix as M=BA, with arbitrary B∈ℝˡˣᵐ and sub-gaussian A∈ℝᵐˣⁿ, thereby allowing for a family of random measurement matrices which may have heavy tails, dependent rows and columns, and a large dynamic range for the singular values. The structure is either given as a non-convex cone T⊂ℝⁿ, or is induced by minimizing a given convex function f(·). We prove, in both cases, that an approximate empirical risk minimizer robustly recovers the signal if the effective number of measurements is sufficient, even in the presence of model mismatch. While in classical compressed sensing the number of independent (sub-)gaussian measurements regulates the possibility of robust reconstruction, in our setting the effective number of measurements depends on the properties of B. We show that, in this model, the stable rank of B indicates the effective number of measurements, and accurate recovery is guaranteed whenever it exceeds the effective dimension of the structure set. We apply our results to the special case of generative priors, i.e., when x is close to the range of a Generative Neural Network (GNN) with ReLU activation functions. Moreover, if the GNN has random weights in the last layer, our theory allows a partial Fourier measurement matrix, thus taking a first step toward a theoretical analysis of compressed sensing MRI with GNNs. Our work relies on a recent result in random matrix theory by Jeong, Li, Plan, and Yilmaz.
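The measurement model and the stable-rank quantity in the abstract can be sketched numerically. The following is a minimal illustration only: the dimensions, the Gaussian choice for the sub-gaussian matrix A, and the particular construction of B (scaled to have a large dynamic range of singular values) are assumptions made here for demonstration, not specifics from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: signal x in R^n, measurement y in R^l, M = B A
n, m, l = 200, 100, 50

# A sub-gaussian matrix A (Gaussian entries, a common special case)
A = rng.standard_normal((m, n)) / np.sqrt(m)

# An arbitrary B; the diagonal scaling gives its singular values
# a large dynamic range, which the theory permits
B = rng.standard_normal((l, m)) @ np.diag(np.linspace(0.1, 5.0, m))

M = B @ A

# Noisy linear measurement y = M x + e of an (unstructured, for demo) signal x
x = rng.standard_normal(n)
e = 0.01 * rng.standard_normal(l)
y = M @ x + e

# Stable rank of B: ||B||_F^2 / ||B||_2^2, the "effective number of
# measurements" in this framework; it is at most rank(B) <= l
stable_rank = np.linalg.norm(B, "fro") ** 2 / np.linalg.norm(B, 2) ** 2
print(f"stable rank of B: {stable_rank:.2f} (number of rows l = {l})")
```

Because the stable rank is bounded above by the rank but can be much smaller when the singular values decay, it refines the naive count of rows of B as a measure of how much information the measurements carry.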
Attribution-NonCommercial-NoDerivatives 4.0 International