Learning Partial Differential Equations in Reproducing Kernel Hilbert Spaces

Wednesday, November 17, 2021 - 4:00pm to 4:30pm

Event Calendar Category

LIDS & Stats Tea

Speaker Name

George Stepaniants

Affiliation

Mathematics

Building and Room Number

LIDS Lounge

Abstract

The rapid development of data-driven scientific discovery holds the promise of new and faster methods to analyze, understand, and predict various complex phenomena whose physical laws are still beyond our grasp. Of central interest to this development is the ability to efficiently solve a broad range of differential equations, and more precisely partial differential equations (PDEs), which still largely require advanced numerical techniques tailored to specific problems.

In this talk, we study what is certainly one of the most inspiring outcomes of this program: solving PDEs from input/output data. Namely, we aim to predict the solution u(x, t) of a PDE under new initial conditions u(x, 0) or external forcings f(x, t) that represent the ambient conditions of an evolving system. In other words, we propose to learn, from input/output data, an operator that maps these inputs u(x, 0) and f(x, t) to the output solution u(x, t). While such operators can be nonlinear, we focus on linear operators, which are a natural approximation to nonlinear phenomena and are, in general, more robust to model misspecification.

For many linear PDEs, the solution u(x, t) is given in closed form by an integral equation in which a kernel specific to the PDE, known as its Green's function, is integrated against the input initial condition u(x, 0) or forcing function f(x, t). We propose a new data-driven approach for learning the Green's functions of various linear PDEs by estimating the best-fit Green's function in a reproducing kernel Hilbert space (RKHS), which allows us to regularize its smoothness and impose various structural constraints.
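For intuition, in the stationary case driven by a forcing f the solution takes the form u(x) = ∫ G(x, y) f(y) dy, and the estimation task is to recover G from pairs (f_j, u_j). The sketch below is not code from the talk or the preprint; it is a minimal illustration, with assumed grid sizes, kernel, and hyperparameters, of fitting G by kernel ridge regression in a Gaussian RKHS for the 1D Poisson equation, whose Green's function is known exactly.

```python
# Illustrative sketch only: estimate the Green's function of the 1D Poisson
# problem -u'' = f on [0, 1] with u(0) = u(1) = 0 from forcing/solution pairs,
# using kernel ridge regression in a Gaussian RKHS over (x, y).
# All names, grid sizes, and hyperparameters are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
m = 30                                   # training grid size for both x and y
grid = np.linspace(0.0, 1.0, m)
dy = grid[1] - grid[0]                   # quadrature weight for the integral

def greens_poisson(x, y):
    """Exact Green's function of -u'' = f with zero Dirichlet boundary data."""
    return np.minimum(x, y) - x * y

# Synthetic training data: random smooth forcings f_j and their solutions
# u_j(x) = \int G(x, y) f_j(y) dy, computed by quadrature on the grid.
n_train, n_modes = 40, 10
coeffs = rng.normal(size=(n_train, n_modes))
modes = np.sin(np.pi * np.outer(np.arange(1, n_modes + 1), grid))   # (n_modes, m)
F = coeffs @ modes                                                  # (n_train, m)
G_true = greens_poisson(grid[:, None], grid[None, :])               # (m, m)
U = F @ G_true.T * dy                                               # (n_train, m)

# RKHS model: G(x, y) = sum_k c_k K((x, y), z_k), with centers z_k on the grid.
XX, YY = np.meshgrid(grid, grid, indexing="ij")
centers = np.column_stack([XX.ravel(), YY.ravel()])                 # (m*m, 2)

def gauss_kernel(P, Q, bandwidth=0.1):
    d2 = ((P[:, None, :] - Q[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

K = gauss_kernel(centers, centers)                                  # Gram matrix
Phi = K.reshape(m, m, -1)                                           # Phi[i, l, k] = K((x_i, y_l), z_k)

# Each observation u_j(x_i) is linear in the coefficients c:
#   u_j(x_i) ~ sum_k c_k * sum_l K((x_i, y_l), z_k) f_j(y_l) dy
A = np.einsum("jl,ilk->jik", F * dy, Phi).reshape(n_train * m, -1)  # design matrix
b = U.ravel()

lam = 1e-6                                                          # RKHS-norm penalty
c = np.linalg.lstsq(A.T @ A + lam * K, A.T @ b, rcond=None)[0]

G_hat = (K @ c).reshape(m, m)
print("relative Frobenius error:",
      np.linalg.norm(G_hat - G_true) / np.linalg.norm(G_true))
```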

Finally, we derive nonasymptotic rates for the prediction error of our Green's function estimator and apply our method to several linear PDEs, including the Poisson, Helmholtz, Schrödinger, Fokker-Planck, and heat equations. We highlight the method's ability to extrapolate to more finely sampled meshes without any additional training.
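Continuing the illustrative sketch above (and reusing its assumed names), the kernel representation of the estimated Green's function can be evaluated at arbitrary points, so predicting on a mesh finer than the training grid needs no retraining; this is the spirit of the extrapolation property mentioned here.

```python
# Continuing the sketch above: evaluate the estimated Green's function on a
# finer mesh than the training grid, with no additional training.
fine = np.linspace(0.0, 1.0, 60)
FX, FY = np.meshgrid(fine, fine, indexing="ij")
fine_pts = np.column_stack([FX.ravel(), FY.ravel()])
G_fine = (gauss_kernel(fine_pts, centers) @ c).reshape(fine.size, fine.size)
G_fine_true = greens_poisson(FX, FY)
print("relative error on the fine mesh:",
      np.linalg.norm(G_fine - G_fine_true) / np.linalg.norm(G_fine_true))
```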

The relevant preprint can be found here: https://arxiv.org/abs/2108.11580

Biography

George Stepaniants is a third-year PhD student at the Massachusetts Institute of Technology (MIT) in the Department of Mathematics, advised by Prof. Philippe Rigollet and Prof. Jörn Dunkel. He is a member of the Interdisciplinary Doctoral Program in Statistics through the Institute for Data, Systems, and Society (IDSS). He obtained his Bachelor of Science in Mathematics and Computer Science at the University of Washington (UW) in 2019, where he performed research in the Department of Applied Mathematics under Prof. Nathan Kutz. George's research lies at the intersection of statistics and physical applied mathematics, where he studies how statistical and machine learning algorithms can be used to infer and predict systems governed by ordinary and partial differential equations, such as biological processes and trade networks.