News

Sebastian Kaltenbach defends his Ph.D. thesis on "Physics-aware, probabilistic machine learning in the Small Data regime"


Abstract:

Solving high-dimensional, nonlinear systems is a key challenge in engineering and computational physics. We propose novel physics-aware machine learning models that rely on both physical knowledge and a small amount of data and, after an initial training phase, can solve such high-dimensional systems efficiently. The key characteristic of this approach is the incorporation of inductive bias, in contrast to purely statistical frameworks, which generally lack interpretability and rely on large amounts of expensive simulation data.


We propose the concept of virtual observables to incorporate physical constraints and show that their addition enables extrapolative predictions. Virtual observables can be seamlessly integrated into any probabilistic state space model, and we are able to obtain very accurate reduced order models for different dynamical systems.
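
As a rough, hypothetical illustration of the idea (not the implementation used in the thesis): a virtual observable treats a physical constraint, written as a residual R(x) = 0, as a pseudo-observation of zero with a small noise level, so that its Gaussian log-likelihood can be added to the data likelihood of a probabilistic state space model. The residual function and noise level below are made up for the example.

    import numpy as np

    def virtual_observable_loglik(x, residual_fn, sigma_v=1e-3):
        """Gaussian log-likelihood of 'observing' the physics residual
        R(x) = 0 with small noise sigma_v. Adding this term to the data
        log-likelihood lets physical constraints inform the posterior of
        a probabilistic state space model."""
        r = np.atleast_1d(residual_fn(x))
        return -0.5 * np.sum((r / sigma_v) ** 2) \
               - r.size * np.log(sigma_v * np.sqrt(2.0 * np.pi))

    # Hypothetical constraint: the state should conserve an energy of e_target.
    def energy_residual(x, e_target=0.5):
        return 0.5 * np.dot(x, x) - e_target

    x = np.array([0.6, 0.8])                 # satisfies the constraint exactly
    print(virtual_observable_loglik(x, energy_residual))
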
Moreover, we show that for model order reduction problems in computational physics, it can be beneficial to restrict the reduced order variables to inherently stable dynamics. This is motivated by the observation that most physical systems, although highly non-stationary, reach an equilibrium in the long term. The proposed framework uses a flexible prior on the complex plane that facilitates the discovery of latent slow processes and ensures the long-term stability of the learned dynamics. Thanks to this added inductive bias, the fully probabilistic model can be trained in the small data regime, and we demonstrate its accuracy on multiscale particle-dynamics systems, producing probabilistic, long-term predictions of phenomena not contained in the training data.
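
To make the role of the prior on the complex plane concrete, the following toy sketch (with a hypothetical parameterization, not the one proposed in the thesis) constrains each latent mode's complex eigenvalue to have a negative real part, so the learned linear latent dynamics decay toward equilibrium regardless of the parameter values found during training.

    import numpy as np

    def stable_eigenvalue(a, b):
        """Map unconstrained parameters (a, b) to a complex eigenvalue
        lambda = -exp(a) + i*b whose real part is negative by construction,
        so latent dynamics dz/dt = lambda * z are always stable."""
        return -np.exp(a) + 1j * b

    def rollout(z0, lam, dt=0.01, steps=1000):
        """Propagate the latent mode z_{t+1} = exp(lambda * dt) * z_t."""
        z = np.empty(steps + 1, dtype=complex)
        z[0] = z0
        step = np.exp(lam * dt)
        for t in range(steps):
            z[t + 1] = step * z[t]
        return z

    lam = stable_eigenvalue(a=-2.0, b=3.0)   # slow, oscillatory, stable mode
    traj = rollout(1.0 + 0.0j, lam)
    print(abs(traj[-1]) < abs(traj[0]))      # True: the mode decays over time
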

Finally, we extend the Deep Operator Network (DeepONet) architecture by making it invertible, so that a single neural network solves both forward and inverse problems. The resulting machine learning algorithm is well suited to physical problems, where we are, in most cases, simultaneously interested in the forward and inverse operators. The obtained invertible DeepONet can also be used for Bayesian inverse problems, for which we can construct an approximate posterior without the need for costly Markov Chain Monte Carlo (MCMC) sampling.
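
As a self-contained sketch of the main ingredient (not the architecture developed in the thesis): a DeepONet combines a branch net acting on sensor values of the input function with a trunk net acting on the query location via an inner product; if the branch net is built from invertible blocks, such as the toy affine coupling layer below, it can be evaluated in both directions. All sizes and weights here are purely illustrative.

    import numpy as np

    rng = np.random.default_rng(0)
    D = 4                                    # number of input-function sensors (toy size)

    # Toy affine coupling layer used as an invertible branch net.
    W = rng.normal(size=(D // 2, D // 2))
    b = rng.normal(size=D // 2)

    def branch_forward(u):
        u1, u2 = u[:D // 2], u[D // 2:]
        return np.concatenate([u1, u2 * np.exp(W @ u1 + b)])

    def branch_inverse(v):
        v1, v2 = v[:D // 2], v[D // 2:]
        return np.concatenate([v1, v2 * np.exp(-(W @ v1 + b))])

    # Trunk net: random tanh features of the query location y.
    Wt = rng.normal(size=(D, 1))

    def trunk(y):
        return np.tanh(Wt @ np.atleast_1d(y))

    def deeponet(u, y):
        """Forward operator evaluation G(u)(y) = <branch(u), trunk(y)>."""
        return branch_forward(u) @ trunk(y)

    u = rng.normal(size=D)
    assert np.allclose(branch_inverse(branch_forward(u)), u)   # invertibility check
    print(deeponet(u, y=0.3))
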