Event Information

March 26, 2025

12:00 PM – 1:00 PM

Evans Conference Room, Warnock Engineering Building (WEB) Room 3780

Zoom Access

Meeting ID: 958 8067 9001

Passcode: 255836


Structure-Preserving, Low-Parameter, Interpretable Operator Learning for Surrogate Modeling with Varun Shankar (Assistant Professor, Kahlert School of Computing)

Scientific machine learning (SciML) is a relatively new discipline that weds scientific computing and high-performance computing with carefully designed machine learning (ML) techniques. In the context of astrophysics, SciML has been applied to galaxy classification and identification, outlier detection, and uncertainty quantification.

Within SciML, operator learning is a rapidly emerging and powerful paradigm for surrogate modeling across engineering and the sciences, with recent successes in climate modeling, material design, and carbon sequestration, to name a few. In this talk, I will present a unified framework that encompasses many operator learning paradigms and use it to present three advancements: (1) the Kernel Neural Operator (KNO), a generalization of the Fourier neural operator that allows greater flexibility in kernel choices and local spatial adaptivity while using far fewer trainable parameters; (2) the ensemble DeepONet, a generalization of Deep Operator Networks that incorporates spatial adaptivity directly into a set of basis functions; and (3) a new operator learning paradigm based on kernel approximation that analytically preserves the divergence-free property and requires minimal training, all while achieving state-of-the-art performance on incompressible flow problems.
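As background for point (3), one standard construction of analytically divergence-free approximants from kernels (a sketch of the general technique; the talk's specific method may differ) builds a matrix-valued kernel from a scalar radial basis function $\phi$:

```latex
\Phi_{\mathrm{div}}(\mathbf{x}) \;=\; \left(\nabla\nabla^{\!\top} - \Delta I\right)\phi(\|\mathbf{x}\|),
\qquad
\mathbf{u}(\mathbf{x}) \;=\; \sum_{j} \Phi_{\mathrm{div}}(\mathbf{x}-\mathbf{x}_j)\,\mathbf{c}_j .
```

Each column of $\Phi_{\mathrm{div}}$ is divergence-free by construction, since the $i$-th component of column $j$ is $\partial_i\partial_j\phi - \delta_{ij}\Delta\phi$, and summing $\partial_i$ over $i$ gives $\partial_j\Delta\phi - \partial_j\Delta\phi = 0$. Hence $\nabla\cdot\mathbf{u} = 0$ holds exactly for any coefficients $\mathbf{c}_j$, rather than being enforced only approximately through a training penalty.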

We argue that operator learning has the potential to positively impact astrophysics through trustworthy, rapid, and interpretable surrogate models for multiscale simulations of magnetohydrodynamics (MHD) and numerical general relativity (GR), and for inverse problems such as physical parameter estimation.