Hyper-differential Sensitivity Analysis with Respect To Model Discrepancy

Joseph Hart, Sandia National Laboratories

Abstract:

Computational models have made great progress toward simulating complex physical phenomena. With this progress has come a significant increase in computational cost and software complexity, which makes high-fidelity models impractical for many-query analyses such as optimization. To circumvent this challenge, a low-fidelity model may be constructed that is more amenable to optimization. Examples of low-fidelity modeling include averaging over small length scales, omitting physical processes, projecting onto lower-dimensional spaces, and data-driven model approximation. However, modeling errors propagate through the optimization, resulting in suboptimal solutions. This work considers the use of hyper-differential sensitivity analysis to learn improved optimization solutions using limited offline evaluations of the high-fidelity model. We introduce a representation of model discrepancy, the difference between the high- and low-fidelity models, and calibrate it in a Bayesian framework. The model discrepancy posterior is propagated through the optimization problem using post-optimality sensitivity analysis to produce a posterior distribution for the optimization solution. By leveraging tools from partial differential equation constrained optimization and randomized linear algebra in a Bayesian formulation, we present an interpretable and efficient approach to learn higher-fidelity optimization solutions using limited evaluations of the high-fidelity model. We demonstrate that as few as one high-fidelity solve can provide a significant improvement in the low-fidelity optimization solution. Thanks to the Bayesian framing, we also enable optimal experimental design to guide the high-fidelity model runs and uncertainty quantification to characterize the uncertainty arising from limits on how many runs may be executed.
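
To make the propagation step concrete, the sketch below illustrates the basic post-optimality sensitivity idea on a toy quadratic objective standing in for a PDE-constrained problem: solve the low-fidelity optimization at nominal discrepancy parameters, approximate the sensitivity operator dz*/dtheta = -H^{-1}B at that optimum, and push posterior samples of the discrepancy through the linearized map to obtain a posterior over the optimization solution. This is only a minimal illustration under assumed names (lowfi objective, theta_map, Sigma_post) and a finite-difference approximation, not the speaker's implementation.

# Illustrative sketch of post-optimality sensitivity propagation (assumed toy problem).
import numpy as np

rng = np.random.default_rng(0)
A = np.diag([2.0, 5.0])          # Hessian of the toy low-fidelity objective

def grad_z(z, theta):
    """Gradient of J(z, theta) = 0.5 z'Az + theta'z with respect to z."""
    return A @ z + theta

def solve_lowfi(theta):
    """Optimality condition A z + theta = 0, solved in closed form for this toy."""
    return np.linalg.solve(A, -theta)

def sensitivity(z_star, theta, eps=1e-6):
    """Approximate dz*/dtheta = -H^{-1} B via finite differences of grad_z,
    where H = d(grad_z)/dz and B = d(grad_z)/dtheta at the optimum."""
    n, m = z_star.size, theta.size
    H = np.zeros((n, n))
    B = np.zeros((n, m))
    g0 = grad_z(z_star, theta)
    for j in range(n):
        dz = np.zeros(n); dz[j] = eps
        H[:, j] = (grad_z(z_star + dz, theta) - g0) / eps
    for j in range(m):
        dth = np.zeros(m); dth[j] = eps
        B[:, j] = (grad_z(z_star, theta + dth) - g0) / eps
    return -np.linalg.solve(H, B)

# Nominal (e.g. MAP) discrepancy parameters and the corresponding low-fidelity optimum.
theta_map = np.array([1.0, -0.5])
z_star = solve_lowfi(theta_map)
S = sensitivity(z_star, theta_map)

# Push posterior samples of the discrepancy through the linearized map
# z(theta) ~ z_star + S (theta - theta_map) to get a posterior over the solution.
Sigma_post = 0.05 * np.eye(2)    # assumed discrepancy posterior covariance
thetas = rng.multivariate_normal(theta_map, Sigma_post, size=1000)
z_samples = z_star + (thetas - theta_map) @ S.T
print("posterior mean of z*:", z_samples.mean(axis=0))
print("posterior std of z*: ", z_samples.std(axis=0))

In the large-scale setting described in the abstract, the Hessian solves implicit in S would be handled with PDE-constrained optimization tooling and randomized linear algebra rather than the dense finite differences used here.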


Speaker's Bio:

Joseph Hart is a Principal Member of the Technical Staff in the Scientific Machine Learning Department at Sandia National Laboratories. He joined Sandia as a student intern in 2017, earned his PhD in Applied Mathematics from North Carolina State University in 2018, and has been a staff member at Sandia since 2019. Joseph is interested in quantifying, prioritizing, and mitigating uncertainty in large-scale optimization problems constrained by differential equations. His research spans numerical optimization, sensitivity analysis, inverse problems, optimal experimental design, and scientific machine learning in the service of outer-loop analysis. Joseph focuses on finding and exploiting low-dimensional structure that arises from taking a holistic perspective on the scientific computing pipeline, from model development to decision-making.

June 26
3:15pm - 4:15pm
H308 5600