UBC Theses and Dissertations
Asymptotically exact variational inference via measure-preserving dynamical systems
Xu, Zuheng
Abstract
Variational inference (VI) approximates a target distribution with a surrogate distribution from a pre-specified variational family designed to allow i.i.d. sampling and density evaluation. The approximation is obtained by minimizing a divergence to the target, so the best attainable approximation quality is fundamentally determined by the expressiveness of the family. However, greater flexibility does not guarantee better approximations: the optimization problem is typically highly non-convex, so the theoretical optimum is rarely attained in practice. As a result, VI lacks the asymptotic exactness of Markov chain Monte Carlo (MCMC) methods: the guarantee of arbitrarily accurate inference given sufficient computation, regardless of tuning.
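For concreteness, the standard reverse-KL formulation makes this divergence-minimization view explicit (shown here as general background; the thesis may optimize this or a related objective):

```latex
% Target \pi(x) = \bar\pi(x) / Z with tractable unnormalized density \bar\pi
% and unknown normalizing constant Z; variational family \mathcal{Q}.
q^\star
  = \operatorname*{arg\,min}_{q \in \mathcal{Q}} \mathrm{KL}(q \,\|\, \pi)
  = \operatorname*{arg\,max}_{q \in \mathcal{Q}}
    \underbrace{\mathbb{E}_{x \sim q}\!\big[\log \bar\pi(x) - \log q(x)\big]}_{\mathrm{ELBO}(q)},
\qquad
\log Z = \mathrm{ELBO}(q) + \mathrm{KL}(q \,\|\, \pi) \;\ge\; \mathrm{ELBO}(q).
```

Estimating and optimizing the ELBO requires exactly the two operations named above: i.i.d. sampling from q and evaluation of its log-density.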
This thesis addresses this limitation by introducing mixed variational flows (MixFlows),
a framework for constructing practical, asymptotically exact variational families using measure-preserving dynamical systems. We develop both homogeneous MixFlows, derived from ergodic deterministic dynamics, and more general MixFlows, based on measure-preserving stochastic dynamics obtained from involutive MCMC kernels. We establish rigorous guarantees showing that every distribution in a MixFlow family converges in total variation to the target as the flow length increases, while retaining tractable density evaluation and i.i.d. sampling. A range of numerical examples illustrates these theoretical and methodological contributions.
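As a toy sketch of the two operations such a family must support (my own illustration under the simplest possible assumptions, not code from the thesis: the target is uniform on [0, 1) and the measure-preserving, ergodic map is an irrational rotation, so no Jacobian corrections arise), i.i.d. sampling pushes a reference draw through a uniformly chosen number of map applications, and density evaluation averages the pulled-back reference densities:

```python
import numpy as np

rng = np.random.default_rng(0)

# Irrational rotation on the circle [0, 1): ergodic and measure-preserving for
# the uniform target, so averaging its iterates flattens any reference density
# toward uniform.
ALPHA = (np.sqrt(5.0) - 1.0) / 2.0

def T(x, n=1):
    """Apply the rotation map n times (negative n applies the inverse map)."""
    return (x + n * ALPHA) % 1.0

def q0_density(x):
    """Reference density on [0, 1): triangle peaked at 0.5, integrates to 1."""
    return 4.0 * np.minimum(x, 1.0 - x)

def mixflow_sample(N, size):
    """i.i.d. draws from q_N = (1/(N+1)) * sum_{n=0}^{N} T^n_# q0:
    sample the reference, then push it through a uniformly chosen flow length."""
    x0 = rng.triangular(0.0, 0.5, 1.0, size=size)   # exact draws from q0
    n = rng.integers(0, N + 1, size=size)
    return T(x0, n)

def mixflow_density(x, N):
    """Density of q_N at x: average of pulled-back reference densities.
    The rotation is volume-preserving, so no Jacobian terms appear."""
    ns = np.arange(N + 1)
    return q0_density(T(x[..., None], -ns)).mean(axis=-1)

# As N grows, q_N approaches the uniform target on [0, 1).
xs = np.linspace(0.0, 1.0, 5, endpoint=False)
for N in (0, 10, 1000):
    print(N, np.round(mixflow_density(xs, N), 3))
```

The nontrivial ingredient in practice is a map or kernel that preserves the actual target of interest; the averaging structure above is what delivers both tractable i.i.d. sampling and density evaluation.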
Along the way, we develop tools to analyze how numerical error propagates through inexact flow dynamics. Such inexactness may arise from floating-point computation or from discretizing continuous-time flows. While the error of a single flow map may be negligible, composing many maps can cause severe accumulation. Surprisingly, we show empirically that flow-based inference often remains accurate despite this instability. Drawing on shadowing theory from chaotic dynamical systems, we prove that errors in sampling, density evaluation, and evidence lower bound estimation grow far more slowly than the trajectory errors themselves, which often grow exponentially with flow length.
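The following toy illustrates the gap between trajectory error and estimation error that this paragraph appeals to (my own example using the chaotic logistic map, not a flow from the thesis): a perturbation of size 1e-12 to the starting point is amplified roughly exponentially along the orbit, yet long-run averages of an observable computed from the exact and perturbed orbits remain close.

```python
import numpy as np

def logistic(x):
    """Fully chaotic logistic map on [0, 1] (Lyapunov exponent log 2)."""
    return 4.0 * x * (1.0 - x)

STEPS = 10_000
x_exact, x_pert = 0.2, 0.2 + 1e-12      # tiny perturbation of the starting point

errs, obs_exact, obs_pert = [], [], []
for _ in range(STEPS):
    x_exact, x_pert = logistic(x_exact), logistic(x_pert)
    errs.append(abs(x_exact - x_pert))
    obs_exact.append(x_exact)            # observable f(x) = x along each orbit
    obs_pert.append(x_pert)

# Pointwise trajectory error is amplified roughly exponentially ...
print("error after 10 steps:", errs[9])      # still tiny (around 1e-9)
print("max error over run  :", max(errs))    # order one: the orbits fully decorrelate
# ... but time averages of the observable stay close to each other (and to the
# mean 1/2 of the map's invariant arcsine density).
print("time average, exact orbit    :", np.mean(obs_exact))
print("time average, perturbed orbit:", np.mean(obs_pert))
```

Shadowing theory makes this robustness precise for suitably hyperbolic systems: a numerically perturbed pseudo-orbit stays uniformly close to some exact orbit with a slightly different starting point, so orbit-averaged quantities are only mildly affected.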
Item Metadata
| Field | Value |
| --- | --- |
| Title | Asymptotically exact variational inference via measure-preserving dynamical systems |
| Creator | Xu, Zuheng |
| Supervisor | |
| Publisher | University of British Columbia |
| Date Issued | 2025 |
| Genre | |
| Type | |
| Language | eng |
| Date Available | 2026-01-02 |
| Provider | Vancouver : University of British Columbia Library |
| Rights | Attribution-NonCommercial-NoDerivatives 4.0 International |
| DOI | 10.14288/1.0451101 |
| URI | |
| Degree (Theses) | |
| Program (Theses) | |
| Affiliation | |
| Degree Grantor | University of British Columbia |
| Graduation Date | 2026-05 |
| Campus | |
| Scholarly Level | Graduate |
| Rights URI | |
| Aggregated Source Repository | DSpace |