UBC Theses and Dissertations


Using neural networks to accelerate molecular dynamics
Bement, Phillip

Abstract

When running atomistic molecular dynamics simulations, integration time-steps must typically be on the order of 2 fs to preserve numerical stability. This sharply limits our ability to generate trajectories for processes such as protein folding, which can take on the order of milliseconds. A possible solution is to train a neural net to predict the configuration of a given protein many time-steps in the future (conditional on its current configuration), while requiring less wall-clock time than a direct molecular dynamics simulation would need. Such a neural net cannot output a state deterministically but must sample from a probability distribution, because molecular dynamics is stochastic. A class of neural nets called Denoising Diffusion Probabilistic Models (DDPMs) provides a promising framework for performing this sampling. In this work, we construct a neural net architecture that can be trained as a conditional DDPM and is manifestly invariant under translations and rotations, the physical symmetries of the dynamics. We train this architecture on molecular dynamics simulations of a test system, alanine tripeptide, and compare repeated application of the learned dynamics to the direct simulation.
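The scheme described above can be sketched in code. The following is a minimal illustration, not the thesis's actual implementation: `eps_model` stands in for the trained noise-prediction network, and the noise schedule, number of diffusion steps, and function names are all assumptions made for the sake of the example. It shows the standard DDPM reverse-sampling loop, conditioned on the current atomic configuration, applied repeatedly to roll out a coarse trajectory.

```python
import numpy as np

def ddpm_sample(eps_model, cond, n_atoms, T=50, rng=None):
    """Draw one future configuration given the current one via the
    DDPM reverse process. `eps_model(x, t, cond)` is a hypothetical
    trained network that predicts the noise added at step t,
    conditioned on the current configuration `cond`."""
    rng = np.random.default_rng(rng)
    betas = np.linspace(1e-4, 0.02, T)          # assumed linear noise schedule
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)
    x = rng.standard_normal((n_atoms, 3))       # start from pure Gaussian noise
    for t in range(T - 1, -1, -1):
        eps = eps_model(x, t, cond)             # predicted noise at this step
        coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
        mean = (x - coef * eps) / np.sqrt(alphas[t])
        noise = rng.standard_normal(x.shape) if t > 0 else 0.0
        x = mean + np.sqrt(betas[t]) * noise    # stochastic reverse step
    return x

def rollout(eps_model, x0, n_steps, rng=None):
    """Repeatedly apply the learned coarse dynamics: each sampled
    configuration becomes the conditioning input for the next."""
    traj = [x0]
    for _ in range(n_steps):
        traj.append(ddpm_sample(eps_model, traj[-1], x0.shape[0], rng=rng))
    return traj
```

Each call to `ddpm_sample` replaces many thousands of 2 fs integration steps with a single learned jump, which is where the potential wall-clock savings come from; comparing such a rollout against direct simulation is the evaluation the abstract describes.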

Item Citations and Data

Rights

Attribution-NonCommercial-ShareAlike 4.0 International