GrainNN: A neighbor-aware long short-term memory network for predicting microstructure evolution during polycrystalline grain formation

by Yigong Qin, Stephen DeWitt, Balasubramaniam Radhakrishnan, George Biros
Publication Type: Journal
Journal Name: Computational Materials Science
Publication Date:
Page Number: 111927
Volume: 218
Issue: 1

High fidelity simulations of grain formation in alloys are an indispensable tool for process-to-mechanical-properties characterization. Such simulations, however, can be computationally expensive because they require fine spatial and temporal discretizations. Their cost becomes an obstacle to parametric studies and ensemble runs, and ultimately makes downstream tasks like optimal control and uncertainty quantification challenging. To enable such downstream tasks, we introduce GrainNN, an efficient and accurate reduced-order model for epitaxial grain growth under additive manufacturing conditions. GrainNN is a sequence-to-sequence long short-term memory (LSTM) deep neural network that evolves the dynamics of manually crafted features. Its innovations are (1) an attention mechanism with a grain-microstructure-specific transformer architecture; and (2) an overlapping combination of several clones of the network to generalize to grain configurations that differ from those used for training. This design enables GrainNN to predict grain formation for unseen physical parameters, grain numbers, domain sizes, and geometries. Furthermore, GrainNN not only reconstructs the quantities of interest but can also be pointwise accurate. In our numerical experiments, we use a polycrystalline phase field method both to generate the training data and to assess GrainNN. For multiparametric, ensemble simulations with many grains, GrainNN can be orders of magnitude faster than phase field simulations while delivering 5%–15% pointwise error. This speedup includes the cost of the phase field simulations used to generate the training data.
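
For readers unfamiliar with the architecture class the abstract describes, the sketch below shows a generic sequence-to-sequence LSTM with an attention step that autoregressively rolls per-grain feature vectors forward in time. It is only an illustrative sketch of the general idea: the framework (PyTorch), layer sizes, feature dimension, and all names are assumptions, not the authors' GrainNN implementation or its neighbor-aware attention design.

# Illustrative sketch (assumed PyTorch); not the authors' GrainNN code.
import torch
import torch.nn as nn

class Seq2SeqLSTM(nn.Module):
    def __init__(self, feat_dim=8, hidden_dim=64, num_heads=4):
        super().__init__()
        # Encoder LSTM over the observed history of grain features.
        self.encoder = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        # Self-attention over the encoded states (a generic stand-in for a
        # grain-microstructure-specific attention mechanism).
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        # Decoder cell that rolls the features forward step by step.
        self.decoder = nn.LSTMCell(feat_dim, hidden_dim)
        self.readout = nn.Linear(hidden_dim, feat_dim)

    def forward(self, past_feats, n_future):
        # past_feats: (batch, t_past, feat_dim) hand-crafted grain features.
        enc_out, (h, c) = self.encoder(past_feats)
        ctx, _ = self.attn(enc_out, enc_out, enc_out)
        h, c = ctx[:, -1], c[0]           # initialize decoder state
        x = past_feats[:, -1]             # last observed feature vector
        preds = []
        for _ in range(n_future):         # autoregressive rollout
            h, c = self.decoder(x, (h, c))
            x = x + self.readout(h)       # predict a feature increment
            preds.append(x)
        return torch.stack(preds, dim=1)  # (batch, n_future, feat_dim)

# Example: roll a batch of 16 feature sequences forward 5 time steps.
model = Seq2SeqLSTM()
future = model(torch.randn(16, 10, 8), n_future=5)
print(future.shape)  # torch.Size([16, 5, 8])

In this kind of reduced-order model, the network advances low-dimensional feature vectors rather than the full phase field, which is the source of the speedup reported in the abstract; the overlapping-clones strategy for generalizing to different grain counts and domain geometries is specific to the paper and is not reproduced here.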