Abstract
Deep neural network (DNN) potentials are an emerging tool for the simulation of dynamical atomistic systems, promising quantum mechanical accuracy at speedups of $10{,}000\times$. As with other DNN methods, the hyperparameters used during training can make a substantial difference in model accuracy, and the optimal settings vary with the dataset. To enable rapid hyperparameter tuning for DNN potential training, we developed a scalable multiobjective evolutionary optimization algorithm for supercomputers and tested it on the Summit system at the Oak Ridge Leadership Computing Facility (OLCF). A multiobjective approach is required because the potential is defined by two coupled learned quantities: the energy and the force. Using a large-scale implementation of the NSGA-II algorithm adapted for training DNN potentials, we discovered several optimal multiobjective hyperparameter combinations, including the best choices of activation function, learning rate scaling scheme, and pairing of the two radial cutoffs used in the three-dimensional descriptor function.