TNet: A Model-Constrained Tikhonov Network Approach for Inverse Problems

Abstract

Deep learning (DL), in particular deep neural networks (DNNs), is by default purely data-driven and in general does not require physics. This is a strength of DL but also one of its key limitations when it is applied to science and engineering problems in which the underlying physics must be respected and a desired accuracy achieved. DL methods in their original forms are incapable of respecting the underlying mathematical models or achieving desired accuracy even in big-data regimes. Moreover, many data-driven science and engineering problems, such as inverse problems, typically have limited experimental or observational data, in which case DL would overfit the data. Leveraging information encoded in the underlying mathematical models, we argue, not only compensates for missing information in low-data regimes but also provides opportunities to equip DL methods with the underlying physics, hence promoting better generalization. This paper develops a model-constrained deep learning approach and its variant, TNet, that are capable of learning information hidden in both the training data and the underlying mathematical models to solve inverse problems governed by partial differential equations. We provide the constructions of, and some theoretical results for, the proposed approaches. We show that data randomization can enhance the smoothness of the networks and their generalization. Comprehensive numerical results not only confirm the theoretical findings but also show that, with as few as 20 training samples for the 1D deconvolution problem, 50 for the inverse 2D heat conductivity problem, and 100 and 50 for the inverse initial-condition problems for the time-dependent 2D Burgers' and 2D Navier-Stokes equations, respectively, TNet solutions can be as accurate as Tikhonov solutions while being several orders of magnitude faster. This is possible owing to the model-constrained term, replications, and randomization.
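
For orientation, the following sketch (in our own notation, not taken verbatim from the paper) illustrates the kind of formulation the abstract refers to. Given noisy observations y of a PDE-governed forward map G acting on a parameter u, the classical Tikhonov approach solves an optimization problem for each new observation,

\[
\min_{u} \; \tfrac{1}{2}\,\| G(u) - y \|^{2} + \tfrac{\alpha}{2}\,\| u - u_{0} \|^{2},
\]

whereas a model-constrained network \( \Psi(\,\cdot\,;\theta) \) is trained once on pairs \( \{(y_i, u_i)\} \) with a loss of the schematic form

\[
\min_{\theta} \; \sum_{i} \| u_i - \Psi(y_i;\theta) \|^{2} + \lambda\,\| y_i - G(\Psi(y_i;\theta)) \|^{2},
\]

so that the learned inverse map is penalized whenever its output violates the forward model; the exact weightings, norms, and regularization terms are specified in the paper.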

Publication
SIAM Journal on Scientific Computing, in production, 2023