UBC Theses and Dissertations
Using neural networks to accelerate molecular dynamics
Bement, Phillip
Abstract
When running atomistic molecular dynamics simulations, integration time-steps must typically be on the order of 2 fs to preserve numerical stability. This sharply limits our ability to generate trajectories for processes, such as protein folding, that can take on the order of milliseconds. A possible solution is training a neural net to predict the configuration of a given protein many time-steps in the future (conditional on its current configuration), while requiring less wall-clock time than a direct molecular dynamics simulation would need. Such a neural net cannot output a state deterministically but must sample from a probability distribution, because molecular dynamics is stochastic. A class of neural nets called Denoising Diffusion Probabilistic Models (DDPMs) provides a promising framework for performing this sampling. In this work, we construct a neural net architecture that can be trained as a conditional DDPM and is manifestly invariant under translations and rotations, which are physical symmetries of the dynamics. We train this architecture on molecular dynamics simulations of a test system, alanine tripeptide, and compare repeated application of the learned dynamics to the direct simulation.
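The conditional-sampling idea in the abstract can be sketched in a few lines. This is an illustrative sketch only, not code from the thesis: `eps_model` is a hypothetical placeholder for the trained noise-prediction network (which in the thesis would be conditioned on the current configuration and built to respect the translational and rotational symmetries), and the noise schedule values are common defaults, not the ones used in this work.

```python
import numpy as np

T = 50                               # number of diffusion steps (assumed)
betas = np.linspace(1e-4, 0.02, T)   # linear noise schedule (common default)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def eps_model(x_t, x_cond, t):
    """Placeholder for the trained noise-prediction network.
    A real model would be conditioned on x_cond, the current configuration."""
    return 0.1 * (x_t - x_cond)

def sample_next_configuration(x_cond, rng):
    """Draw one sample of the configuration many MD time-steps ahead
    by running the DDPM reverse (denoising) process, starting from noise."""
    x = rng.standard_normal(x_cond.shape)
    for t in reversed(range(T)):
        eps = eps_model(x, x_cond, t)
        # DDPM posterior mean update for step t
        x = (x - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps) / np.sqrt(alphas[t])
        if t > 0:  # inject fresh noise at every step except the final one
            x += np.sqrt(betas[t]) * rng.standard_normal(x_cond.shape)
    return x

rng = np.random.default_rng(0)
x_now = np.zeros((3, 3))             # toy "configuration": 3 atoms in 3D
x_next = sample_next_configuration(x_now, rng)
```

Because the output is a sample rather than a deterministic prediction, repeated application of `sample_next_configuration` yields a stochastic trajectory, which is what the thesis compares against direct simulation.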
Item Metadata
Title | Using neural networks to accelerate molecular dynamics
Creator | Bement, Phillip
Supervisor |
Publisher | University of British Columbia
Date Issued | 2025
Genre |
Type |
Language | eng
Date Available | 2025-09-02
Provider | Vancouver : University of British Columbia Library
Rights | Attribution-NonCommercial-ShareAlike 4.0 International
DOI | 10.14288/1.0449990
URI |
Degree (Theses) |
Program (Theses) |
Affiliation |
Degree Grantor | University of British Columbia
Graduation Date | 2025-11
Campus |
Scholarly Level | Graduate
Rights URI |
Aggregated Source Repository | DSpace