Structured amortized variational inference
Weilbach, Christian
Abstract
This thesis explores how structural knowledge about inference problems can be automatically mapped into solution mechanisms or inference artifacts, significantly enhancing efficiency and scalability. Initially, programming language theory is employed to implement a compiler that extracts structural knowledge from problem specifications and transforms it into a faithful inverse graph for inference. This graph constrains a neural network within a continuous normalizing flow (CNF) framework, providing efficient, high-quality inference for problems with a limited number of variables. However, the CNF framework faces inherent scalability and efficiency limitations. To address these, the approach is refined by structuring the attention mechanism in more powerful transformer neural networks within simulation-free denoising diffusion probabilistic models (DDPMs). The enhanced graphically structured diffusion model (GSDM) framework effectively handles various algorithmic problems, such as matrix factorization, Sudoku solving, and sorting. These tasks are integral subproblems in larger scientific inference settings. Furthermore, the framework is extended to scale beyond GPU memory limitations and integrated with tooling from the simulation-based inference community, facilitating its direct application within simulators.
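The core mechanism the abstract describes, restricting a transformer's attention to the edges of the problem's graphical structure, can be illustrated with a minimal sketch. This is a hypothetical toy in PyTorch, not the thesis's GSDM code: the function name `graph_masked_attention`, the chain-graph example, and the untrained identity projections are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def graph_masked_attention(x, adj):
    """Single-head self-attention restricted to graph edges (toy sketch).

    x:   (n, d) tensor, one row of features per problem variable.
    adj: (n, n) 0/1 adjacency of the problem's graph, self-edges included,
         so information only flows along edges of the inference problem.
    """
    d = x.shape[-1]
    q, k, v = x, x, x                          # untrained identity projections
    scores = (q @ k.T) / d ** 0.5              # (n, n) attention logits
    scores = scores.masked_fill(adj == 0, float("-inf"))  # forbid non-edges
    return F.softmax(scores, dim=-1) @ v       # mix only along allowed edges

# Hypothetical example: 4 variables whose dependency graph is the chain 0-1-2-3.
adj = torch.tensor([[1, 1, 0, 0],
                    [1, 1, 1, 0],
                    [0, 1, 1, 1],
                    [0, 0, 1, 1]])
x = torch.randn(4, 8)
print(graph_masked_attention(x, adj).shape)    # torch.Size([4, 8])
```

Setting the masked logits to -inf before the softmax zeroes the attention weights outside the graph, which is the standard way to impose this kind of sparsity on attention.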
Item Metadata
Title | Structured amortized variational inference
Creator | Weilbach, Christian
Publisher | University of British Columbia
Date Issued | 2025
Language | eng
Date Available | 2025-09-02
Provider | Vancouver : University of British Columbia Library
Rights | Attribution-ShareAlike 4.0 International
DOI | 10.14288/1.0449997
Degree Grantor | University of British Columbia
Graduation Date | 2025-11
Scholarly Level | Graduate
Aggregated Source Repository | DSpace