UBC Theses and Dissertations


From quantum computing to condensed matter: a machine learning perspective

Kairon, Pranav

Abstract

This thesis investigates the intersection of physics and machine learning, with a primary focus on applications in condensed matter and quantum computing. In the first half, we demonstrate Bayesian machine learning (ML) models that use quantum properties computed in an effectively lower-dimensional Hilbert space to make predictions for Hamiltonian parameters that require a larger basis set, applied to a well-known problem in quantum statistical mechanics: the polaron problem. We consider two models, the Su-Schrieffer-Heeger (SSH) model and the mixed SSH-Holstein model, and demonstrate ML models that can extrapolate polaron properties in phonon frequency. We examine the sharp transition in the ground-state momentum of the SSH polaron and trace the evolution of this transition from the anti-adiabatic regime to the adiabatic regime. We also demonstrate Bayesian models that use the posterior distributions of highly approximate quantum calculations as the prior distributions for models of more accurate quantum results. This drastically reduces the number of fully converged quantum calculations required to map out the polaron dispersion relations over the full range of Hamiltonian parameters of interest.

In the second half, we show that a parametrized quantum neural network can be used to build a parameter-free quantum kernel for machine learning that inherits the concentration properties of the neural network cost function. This establishes a rigorous connection between barren plateaus in variational quantum algorithms and exponential concentration of quantum kernels for machine learning. Our results imply that recently proposed algorithms for building barren plateau-free quantum circuits can be applied to construct useful quantum kernels for machine learning without inductive bias.
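The transfer-learning idea described above, using the posterior of a model trained on cheap, approximate calculations as the prior for a model of expensive, converged calculations, can be illustrated with a minimal Gaussian process sketch. This is not the thesis's actual implementation: the RBF kernel, the cosine stand-in for a polaron dispersion, and all function names here are hypothetical, chosen only to show the posterior-as-prior mechanism.

```python
import numpy as np

def rbf(a, b, ls=0.5, var=1.0):
    """Squared-exponential kernel between 1D input arrays (illustrative choice)."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-6, prior_mean=None):
    """GP regression mean; prior_mean lets a cheap model's posterior act as the prior."""
    m_tr = np.zeros_like(y_train) if prior_mean is None else prior_mean(x_train)
    m_te = np.zeros_like(x_test) if prior_mean is None else prior_mean(x_test)
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    k_star = rbf(x_test, x_train)
    return m_te + k_star @ np.linalg.solve(K, y_train - m_tr)

# Cheap (approximate) calculations: many points of a crude dispersion curve.
x_cheap = np.linspace(0, np.pi, 40)
y_cheap = np.cos(x_cheap)                  # synthetic stand-in, not real polaron data
cheap_mean = lambda x: gp_posterior_mean(x_cheap, y_cheap, x)

# Expensive (fully converged) calculations: only a handful of points.
x_exp = np.array([0.0, 1.0, 2.0, 3.0])
y_exp = np.cos(x_exp) - 0.1 * x_exp        # synthetic stand-in for accurate energies

# Transfer: the cheap model's posterior mean becomes the prior mean of the accurate model,
# so few expensive points suffice to correct the cheap curve across the whole grid.
x_grid = np.linspace(0, np.pi, 200)
y_pred = gp_posterior_mean(x_exp, y_exp, x_grid, prior_mean=cheap_mean)
```

The accurate model then only has to learn the (smooth, small) correction to the cheap prediction rather than the full dispersion, which is why far fewer converged calculations are needed.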
Additionally, we show how the group-theoretic structure of datasets can be leveraged to exploit the block-diagonal structure of the unitary representation of the symmetry group. We derive new bounds on the variance of the kernel matrix, enhancing the stability and performance of quantum kernel-based machine learning algorithms.
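For readers unfamiliar with quantum kernels, the object whose concentration and variance are analyzed above is a fidelity kernel K(x, x') = |⟨ψ(x)|ψ(x')⟩|², with |ψ(x)⟩ prepared by a data-dependent circuit. The thesis builds these from parametrized quantum neural networks; the single-qubit encoding below is purely an illustrative assumption, simulated exactly with numpy, to show the construction and its basic kernel-matrix properties.

```python
import numpy as np

# Exact single-qubit rotation matrices.
def ry(t):
    return np.array([[np.cos(t / 2), -np.sin(t / 2)],
                     [np.sin(t / 2),  np.cos(t / 2)]], dtype=complex)

def rz(t):
    return np.array([[np.exp(-1j * t / 2), 0],
                     [0, np.exp(1j * t / 2)]], dtype=complex)

def feature_state(x):
    """Encode a 2-feature datum as |psi(x)> = Rz(x1) Ry(x0) |0> (toy encoding)."""
    psi = np.array([1.0 + 0j, 0.0])
    return rz(x[1]) @ ry(x[0]) @ psi

def quantum_kernel(X):
    """Fidelity kernel matrix K_ij = |<psi(x_i)|psi(x_j)>|^2."""
    states = np.array([feature_state(x) for x in X])
    overlaps = states.conj() @ states.T   # Gram matrix of state overlaps
    return np.abs(overlaps) ** 2

X = np.random.default_rng(0).uniform(0, np.pi, size=(5, 2))
K = quantum_kernel(X)
```

The resulting K is symmetric and positive semidefinite with unit diagonal, so it can be passed directly to any kernel method. Exponential concentration, the failure mode connected to barren plateaus in the abstract, refers to the off-diagonal entries of such kernels clustering around a fixed value as the circuit grows, which the symmetry-based variance bounds above help control.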


Rights

Attribution-NonCommercial-NoDerivatives 4.0 International