Graph Dynamical Networks for Learning Atomic Scale Dynamics

December 2, 2019

Tian Xie, Ph.D. Candidate, Materials Science and Engineering
 

Understanding the dynamics of atoms in complex materials is an important research problem in materials science, with applications in the design of better batteries, catalysts, and membranes. However, extracting the correct information from massive atomic-scale simulation data is challenging, owing to the amorphous (disordered) nature of many materials. In this project, we develop graph dynamical networks (GDyNets) to learn the most important dynamics from time-series molecular dynamics (MD) simulation data, allowing us to extract the dynamics of atoms directly from MD trajectories and gain new scientific insight into important materials.

The learning problem can be framed as finding a low-dimensional feature space in which the non-linear dynamics of atoms can be represented by a linear transition matrix. Assume that the atoms we are interested in follow some non-linear dynamics x_{t+τ} = F(x_t). We hope to find a mapping function χ that maps the Cartesian coordinates of the atoms into a low-dimensional feature space, such that there exists a low-dimensional linear transition matrix K approximating the non-linear dynamics F, i.e. E[χ(x_{t+τ})] ≈ Kᵀ E[χ(x_t)]. Effectively, we are mapping each atom into several "states", and we can understand their dynamics by analyzing the transition matrix K.
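To make this concrete, here is a minimal sketch (not code from the paper) of how a linear transition matrix K could be estimated by least squares once a trajectory has already been mapped into a feature space by some χ. The function name and the random two-state data are purely illustrative.

```python
import numpy as np

# Minimal sketch (not the actual GDyNets code): given a trajectory already
# mapped into an n_states feature space by chi(x), estimate the linear
# transition matrix K from time-lagged pairs by least squares.
# chi_traj: array of shape (n_frames, n_states); lag: lag time in frames.
def estimate_transition_matrix(chi_traj, lag):
    chi_t = chi_traj[:-lag]      # features at time t
    chi_tau = chi_traj[lag:]     # features at time t + lag
    # Solve chi_t @ K ~= chi_tau in the least-squares sense.
    K, *_ = np.linalg.lstsq(chi_t, chi_tau, rcond=None)
    return K

# Example with placeholder data: 1000 frames of soft two-state assignments.
rng = np.random.default_rng(0)
chi_traj = rng.random((1000, 2))
chi_traj /= chi_traj.sum(axis=1, keepdims=True)
K = estimate_transition_matrix(chi_traj, lag=10)
print(K.shape)  # (2, 2)
```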

Graph Dynamical Networks

 

The mapping function χ can be learned from time-series MD simulation data by minimizing a dynamical loss function based on the variational approach for Markov processes (VAMP), but learning χ directly is typically not possible in materials. This is because atoms can move between structurally similar yet distinct chemical environments, making the problem exponentially more complex. We need to encode this symmetry of the atoms into the neural networks. We use a type of graph convolutional neural network (CGCNN) from our earlier work to encode the atomic structures in a way that respects these symmetries. In GDyNets, the CGCNN is trained on time-series MD data to learn χ by minimizing a VAMP loss.
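For readers curious what such a dynamical loss looks like, below is a rough Python/PyTorch sketch of a VAMP-2-style loss on time-lagged feature pairs. It follows the standard VAMP formulation rather than the exact code in our repository; the function name and the regularization constant are assumptions.

```python
import torch

def vamp2_loss(chi_0, chi_1, eps=1e-6):
    """Sketch of a VAMP-2 loss (assumed form, not copied from the GDyNets repo).
    chi_0, chi_1: (batch, n_states) features at time t and t + tau."""
    # Mean-center the features.
    chi_0 = chi_0 - chi_0.mean(dim=0, keepdim=True)
    chi_1 = chi_1 - chi_1.mean(dim=0, keepdim=True)
    n = chi_0.shape[0]
    # Instantaneous and time-lagged covariance matrices.
    c00 = chi_0.t() @ chi_0 / (n - 1)
    c11 = chi_1.t() @ chi_1 / (n - 1)
    c01 = chi_0.t() @ chi_1 / (n - 1)

    # Inverse square root of a symmetric matrix, regularized for stability.
    def inv_sqrt(c):
        vals, vecs = torch.linalg.eigh(c)
        vals = torch.clamp(vals, min=eps)
        return vecs @ torch.diag(vals.rsqrt()) @ vecs.t()

    # The VAMP-2 score is the squared Frobenius norm of the half-weighted
    # correlation matrix; training minimizes its negative.
    k = inv_sqrt(c00) @ c01 @ inv_sqrt(c11)
    return -torch.norm(k, p='fro') ** 2
```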

We first tested GDyNets on a toy system with two symmetrically distinct states. We find that GDyNets successfully discovers the correct two-state dynamics, while a network that does not respect the symmetry learns the wrong dynamics. We then applied the method to a realistic system, PEO/LiTFSI, a well-known polymer electrolyte for Li-ion batteries whose transport mechanism is still under debate. We discovered the four most important solvation structures of the Li-ion and their dynamics from the MD simulation data, and we used the results to explain a "negative transference number" recently observed in experiments.
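As an illustration of how a learned transition matrix can be analyzed, the following sketch computes implied relaxation timescales from the eigenvalues of K. This follows standard Markov-model practice rather than the paper's own analysis scripts, and the example two-state matrix is made up.

```python
import numpy as np

# Sketch (assumed analysis): the eigenvalues of the learned transition
# matrix K give relaxation timescales for transitions between states.
def implied_timescales(K, lag_time):
    eigvals = np.linalg.eigvals(K)
    # Sort by magnitude; the largest (~1) corresponds to the stationary process.
    mags = np.sort(np.abs(eigvals))[::-1]
    # Remaining eigenvalues correspond to decaying relaxation processes.
    return -lag_time / np.log(mags[1:])

K = np.array([[0.9, 0.1],
              [0.2, 0.8]])                      # illustrative 2-state matrix
print(implied_timescales(K, lag_time=1.0))      # ~[2.80] time units
```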

We find that GDyNets is a general approach for understanding the dynamics of atoms or molecules from MD simulation data, with potential applications to other graph-based data. We have open-sourced our code on GitHub. In the future, we hope to extend GDyNets to learn non-equilibrium dynamics in addition to equilibrium dynamics.

This work was published in Nature Communications on 17 June 2019.

Tian Xie is a fifth-year Ph.D. candidate in the Department of Materials Science and Engineering at MIT, advised by Prof. Jeffrey Grossman. His research focuses on developing machine learning algorithms to accelerate the design of materials for energy applications, motivated by the urgent need for materials innovation in next-generation renewable energy technologies. He developed a general deep learning framework (CGCNN) to encode arbitrary materials and has solved multiple learning problems for different types of materials data. He has also applied these models to discover novel materials and new scientific mechanisms for batteries and solar cells, working closely with experimentalists to validate the materials and mechanisms identified by the learned models. Tian Xie received his B.S. in chemistry from Peking University in 2015, and he worked at Google X and DeepMind as a research intern during his Ph.D. He can be contacted here.