Allegro: Scaling Equivariant Machine-Learning Interatomic Potentials
Albert Musaelian, Harvard University
Molecular dynamics is a powerful way to study materials and chemical systems through simulations driven by a potential energy function. Its applicability, however, is limited by the length and time scales that can be reached with sufficiently accurate potentials. Machine-learning interatomic potentials, trained on expensive quantum mechanical data, promise to address this challenge by providing near-quantum accuracy at a fraction of the cost. In particular, recently developed equivariant neural networks such as our NequIP model, which directly process geometric input data like vectors in a symmetry-respecting way, have demonstrated remarkable accuracy and data efficiency. This poster presents Allegro, a novel architecture that, unlike other existing equivariant neural network potentials, is strictly spatially local. As a result, Allegro can be parallelized efficiently while preserving many of the advantages of previous equivariant approaches; we demonstrate this on a range of benchmark tasks and show scaling on a Department of Energy GPU cluster.
Abstract Author(s): Albert Musaelian (1), Simon Batzner (1), Anders Johansson (1), Boris Kozinsky (1,2); (1) Harvard University, Cambridge, MA, USA; (2) Robert Bosch LLC, Cambridge, MA, USA
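As a rough illustration of what "strictly spatially local" means in the abstract above (this is not the Allegro or NequIP implementation; the pair term, cutoff value, and function names are invented for illustration), the sketch below writes the total energy as a sum of per-atom contributions that each depend only on neighbors within a fixed cutoff. That locality is the property that allows such a model to be partitioned across processors by spatial domain.

```python
# Minimal sketch of strict spatial locality (illustrative only, not Allegro).
import numpy as np

def toy_pair_term(r: float, cutoff: float) -> float:
    """Placeholder pairwise term that goes smoothly to zero at the cutoff."""
    if r >= cutoff:
        return 0.0
    envelope = (1.0 - (r / cutoff) ** 2) ** 2  # smooth cutoff envelope
    return envelope / r                        # toy repulsive interaction

def local_energy(i: int, positions: np.ndarray, cutoff: float) -> float:
    """Energy contribution of atom i, depending only on neighbors within the cutoff.

    Because each contribution needs only a fixed local neighborhood, atoms can be
    assigned to different processes/GPUs and evaluated independently.
    """
    deltas = positions - positions[i]
    distances = np.linalg.norm(deltas, axis=1)
    mask = (distances > 0.0) & (distances < cutoff)
    return 0.5 * sum(toy_pair_term(r, cutoff) for r in distances[mask])

def total_energy(positions: np.ndarray, cutoff: float = 4.0) -> float:
    """Total energy as a sum of strictly local per-atom contributions."""
    return sum(local_energy(i, positions, cutoff) for i in range(len(positions)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pos = rng.uniform(0.0, 10.0, size=(32, 3))  # random toy configuration
    print(f"toy total energy: {total_energy(pos):.4f}")
```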