Leveraging Randomized Linear Algebra to Accelerate the Training of Neural Network Wavefunctions

Gil Goldshlager, University of California, Berkeley


Better algorithms for the electronic structure problem can unlock progress across chemistry, materials science, and condensed matter physics. One promising approach represents the electronic wavefunction with a neural network. Such neural network wavefunctions have been shown to produce highly accurate results for small but strongly correlated systems. Unfortunately, the high cost of training these wavefunctions prevents their application to larger systems. We propose the Subsampled Projected-Increment Natural Gradient Descent (SPRING) optimizer to reduce this bottleneck. SPRING combines ideas from the recently proposed minimum-step stochastic reconfiguration optimizer and the classical randomized Kaczmarz method for solving linear least-squares problems. Numerical results demonstrate that SPRING outperforms the previous state-of-the-art optimizers for neural network wavefunctions.
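To give a flavor of the classical ingredient mentioned above, the following is a minimal sketch of the randomized Kaczmarz method for a consistent linear system. This illustrates only the textbook algorithm, not SPRING itself; the function name and parameters are illustrative choices. At each step, a row is sampled with probability proportional to its squared norm, and the iterate is projected onto the hyperplane defined by that row's equation.

```python
import numpy as np

def randomized_kaczmarz(A, b, num_iters=5000, seed=0):
    """Solve a consistent linear system A x = b by randomized Kaczmarz.

    Each iteration samples row i with probability proportional to
    ||a_i||^2, then projects the current iterate onto the hyperplane
    {x : a_i . x = b_i}.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms_sq = np.einsum("ij,ij->i", A, A)  # squared row norms
    probs = row_norms_sq / row_norms_sq.sum()
    x = np.zeros(n)
    for _ in range(num_iters):
        i = rng.choice(m, p=probs)
        # Orthogonal projection onto the i-th hyperplane
        x += (b[i] - A[i] @ x) / row_norms_sq[i] * A[i]
    return x

# Demo on a small consistent overdetermined system (synthetic data).
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 10))
x_true = rng.standard_normal(10)
b = A @ x_true  # consistent by construction
x_est = randomized_kaczmarz(A, b)
```

For a consistent system, the expected squared error contracts geometrically at a rate governed by the ratio of the smallest singular value of A to its Frobenius norm, which is what makes subsampled row-wise updates an appealing building block for optimizers that solve large least-squares subproblems stochastically.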