UBC Theses and Dissertations


Structured amortized variational inference
Weilbach, Christian

Abstract

This thesis explores how structural knowledge about inference problems can be automatically mapped into solution mechanisms or inference artifacts, significantly enhancing efficiency and scalability. Initially, programming language theory is employed to implement a compiler that extracts structural knowledge from problem specifications and transforms it into a faithful inverse graph for inference. This graph constrains a neural network within a continuous normalizing flow (CNF) framework, providing efficient, high-quality inference for problems with a limited number of variables. However, the CNF framework faces inherent scalability and efficiency limitations. To address these, the approach is refined by structuring the attention mechanism in more powerful transformer neural networks within simulation-free denoising diffusion probabilistic models (DDPMs). The enhanced graphically structured diffusion model (GSDM) framework effectively handles various algorithmic problems, such as matrix factorization, Sudoku solving, and sorting. These tasks are integral subproblems in larger scientific inference settings. Furthermore, the framework is extended to scale beyond GPU memory limitations and integrated with the simulation-based inference community, facilitating its direct application in simulators.
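The abstract describes constraining a transformer's attention mechanism with the graph structure of the inference problem, so that information flows only along edges of the (inverse) graphical model. The following sketch illustrates that masking idea on a toy 4-variable chain-structured problem; the function name, adjacency matrix, and dimensions are hypothetical illustrations, not taken from the thesis.

```python
import numpy as np

def masked_attention(Q, K, V, mask):
    """Scaled dot-product attention in which mask[i, j] = 1 allows
    variable i to attend to variable j; disallowed positions are set
    to -inf before the softmax, so their weights become exactly zero."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores = np.where(mask.astype(bool), scores, -np.inf)
    # numerically stable softmax over each row
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy chain-structured problem over 4 variables with edges
# 0-1, 1-2, 2-3 plus self-edges, written as an adjacency mask.
adj = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
], dtype=float)

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))  # one 8-dim embedding per variable
out, weights = masked_attention(X, X, X, adj)
# each variable's update mixes only the values of its graph neighbours
```

Because the mask zeroes out attention between non-adjacent variables, the layer's computation respects the problem's conditional-independence structure, which is the property the abstract's graphically structured approach exploits.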


Rights

Attribution-ShareAlike 4.0 International