A conventional and efficient way to solve inference problems in dynamical systems—broadly, learning some "unknowns" in the dynamical system given a set of data—is to adopt the adjoint state method. The adjoint states carry the gradients of the cost function with respect to the unknowns (the "sensitivities"), and so can be used in local optimization procedures. In complex models, however, computing adjoint states can be numerically inefficient. Modern machine-learning (ML) procedures, on the other hand, compute the sensitivities by backpropagation, which entails generating computational graphs (also known as expression trees) and applying automatic differentiation. Can backpropagation help us in these classical inference problems?

In this talk, I aim to share the preliminary results of our effort to generalize a modern ML platform (TensorFlow) for classical inference problems. By revising the transfer functions of Recurrent Neural Networks (RNNs), we enable the architecture to carry out the temporal integration of dynamical systems. In the models we tested, backpropagation delivered accurate estimates of the sensitivities for the subsequent optimization procedures, bypassing the need to compute the adjoint states. Because the transfer function encodes the "physics" of the dynamical system, the end product of our analysis is physically interpretable. I will present our results through three proof-of-concept inference problems for deterministic dynamical systems, which encapsulate three classes of inference problems: (1) given the terminal configuration and the evolution of the system, inferring the initial condition; (2) given the configuration of the system measured at (possibly sparse) discrete times, inferring the model parameters that minimize the mismatch between the model prediction and the data; and (3) optimal control problems.
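The idea can be illustrated with a minimal sketch (not the speaker's code; the toy system dx/dt = -a x, the step count, and all variable names are assumptions for illustration). Each forward-Euler step plays the role of an RNN cell whose transfer function encodes the physics; backpropagating a terminal loss J = (x_N - x_target)^2 through the unrolled steps yields dJ/dx_0, the same sensitivity an adjoint solve would provide (class (1) above, inferring the initial condition):

```python
# Hedged sketch: backpropagation through temporal integration, by hand.
# System: dx/dt = -a*x, integrated with forward Euler; each step acts as
# an "RNN cell" x_{n+1} = (1 - a*dt) * x_n whose transfer function is the
# physics. The reverse sweep accumulates dJ/dx0 via the chain rule.

def forward(x0, a, dt, n_steps):
    """Unrolled temporal integration; keep all states for the reverse sweep."""
    xs = [x0]
    for _ in range(n_steps):
        xs.append(xs[-1] * (1.0 - a * dt))  # Euler step = transfer function
    return xs

def backprop_dJ_dx0(xs, a, dt, x_target):
    """Reverse-mode sweep: chain rule backwards through every step."""
    grad = 2.0 * (xs[-1] - x_target)        # dJ/dx_N for J = (x_N - x_target)^2
    for _ in range(len(xs) - 1):
        grad *= (1.0 - a * dt)              # multiply by dx_{n+1}/dx_n
    return grad

a, dt, n, x_target = 0.5, 0.01, 100, 0.3
x0 = 1.0
xs = forward(x0, a, dt, n)
g = backprop_dJ_dx0(xs, a, dt, x_target)

# Sanity check against a central finite difference in x0.
eps = 1e-6
Jp = (forward(x0 + eps, a, dt, n)[-1] - x_target) ** 2
Jm = (forward(x0 - eps, a, dt, n)[-1] - x_target) ** 2
fd = (Jp - Jm) / (2 * eps)
print(abs(g - fd) < 1e-8)  # → True
```

In a framework such as TensorFlow, the reverse sweep would be generated automatically from the computational graph of the forward integration; the point of the sketch is that the resulting gradient coincides with the discrete adjoint computation.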
We aim to share these preliminary results with the LANL physics-informed ML community as soon as possible and ask for feedback. As such, the talk will be delivered less formally, and we welcome discussion during the talk.

Host: David Métivier