Center for Nonlinear Studies
Thursday, May 16, 2019
2:00 PM - 3:00 PM
CNLS Conference Room (TA-3, Bldg 1690)

Postdoc Seminar

Solving classical inference problems on modern machine-learning platforms

Yen Ting Lin, collaborative work with Marian Anghel (CCS-3)
T-6/CNLS

A conventional and efficient way to solve inference problems in dynamical systems (broadly defined as learning some “unknowns” of a dynamical system given a set of data) is to adopt adjoint state methods. The adjoint states contain the gradients of the cost function with respect to the unknowns (referred to as the “sensitivities”) and thus can be used in local optimization procedures. However, for complex models, computing the adjoint states can be numerically inefficient. Modern machine-learning (ML) procedures, on the other hand, compute the sensitivities by back-propagation, which entails generating computational graphs (also known as expression trees) and performing automatic differentiation. Can back-propagation help us solve these classical inference problems?
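
As a minimal illustration of the contrast drawn above (a sketch, not the speaker's code), the snippet below back-propagates through an unrolled forward-Euler integration of a toy ODE to obtain the sensitivity of a terminal cost with respect to an unknown parameter; the dynamics, step count, and all names are illustrative assumptions:

```python
# A minimal sketch of computing sensitivities by back-propagation
# rather than by solving adjoint equations: the gradient of a terminal
# cost with respect to an unknown parameter is obtained by
# differentiating through an unrolled integrator. The ODE
# dx/dt = -theta * x and all names here are illustrative assumptions.
import tensorflow as tf

theta = tf.Variable(0.5)           # unknown model parameter
dt, n_steps = 0.01, 100
x_target = tf.constant(0.6)        # hypothetical terminal "data"

with tf.GradientTape() as tape:
    x = tf.constant(1.0)           # initial condition
    for _ in range(n_steps):       # forward Euler, recorded on the tape
        x = x + dt * (-theta * x)
    cost = (x - x_target) ** 2     # mismatch with the data

# d(cost)/d(theta): the "sensitivity", via automatic differentiation
sensitivity = tape.gradient(cost, theta)
print(float(sensitivity))
```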

In this talk, I aim to share the preliminary results of our effort to generalize a modern ML platform (TensorFlow) to classical inference problems. By revising the transfer functions of recurrent neural networks (RNNs), we enable the architecture to carry out the temporal integration of dynamical systems. In the models we tested, back-propagation delivered accurate estimates of the sensitivities for the subsequent optimization procedures, bypassing the need to compute the adjoint states. Because the transfer function encodes the “physics” of the dynamical system, the end product of our analysis is physically interpretable.
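
A hedged sketch of this architectural idea, assuming TensorFlow's Keras RNN API: a custom cell whose transfer function is one explicit Euler step of a damped oscillator, so that unrolling the RNN performs the temporal integration. The system, the step size, and every name below are illustrative assumptions, not the speaker's implementation:

```python
# RNN cell whose transfer function is one Euler step of an ODE, so
# unrolling the RNN integrates the dynamics in time. The damped
# oscillator and all names are illustrative assumptions.
import tensorflow as tf

class EulerODECell(tf.keras.layers.Layer):
    """RNN cell whose transfer function is one Euler step of an ODE."""
    def __init__(self, dt, **kwargs):
        super().__init__(**kwargs)
        self.dt = dt
        self.state_size = 2                   # state = (position, velocity)
        self.output_size = 2
        self.gamma = tf.Variable(0.1)         # unknown damping, to be inferred

    def call(self, inputs, states):
        x = states[0]
        pos, vel = x[:, 0:1], x[:, 1:2]
        # the "physics" is encoded directly in the update rule
        x_next = tf.concat([pos + self.dt * vel,
                            vel + self.dt * (-pos - self.gamma * vel)], axis=1)
        return x_next, [x_next]

# Unrolling the RNN over 200 dummy inputs integrates the oscillator;
# gradients of any loss on `trajectory` w.r.t. `gamma` then come from
# back-propagation, with no adjoint equations derived by hand.
integrator = tf.keras.layers.RNN(EulerODECell(dt=0.05), return_sequences=True)
trajectory = integrator(tf.zeros([1, 200, 1]),
                        initial_state=[tf.constant([[1.0, 0.0]])])
```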

I will present our results on three proof-of-concept inference problems for deterministic dynamical systems, which encapsulate three classes of inference problems: (1) given the terminal configuration and the evolution of the system, infer the initial condition; (2) given the configuration of the system measured at (possibly sparse) discrete times, infer the model parameters that minimize the error between the model prediction and the data; and (3) optimal control problems. We aim to share these preliminary results with the LANL physics-informed ML community as soon as possible and to ask for feedback. As such, the talk will be delivered less formally, and we welcome discussion during the talk.
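
As a toy instance of problem class (2) above (illustrative only, not one of the talk's actual benchmark problems), one can fit a model parameter to sparse synthetic observations by gradient descent on the back-propagated sensitivities:

```python
# A toy instance of problem class (2): infer a model parameter from
# sparse observations by gradient descent on back-propagated
# sensitivities. The dynamics dx/dt = -theta * x, the observation
# times, and the optimizer settings are illustrative assumptions.
import tensorflow as tf

dt, n_steps = 0.01, 200
obs_times = [50, 100, 150]                    # sparse measurement indices

def integrate(theta):
    """Forward-Euler trajectory of dx/dt = -theta * x from x(0) = 1."""
    x, xs = tf.constant(1.0), []
    for _ in range(n_steps):
        x = x + dt * (-theta * x)
        xs.append(x)
    return tf.stack(xs)

data = tf.gather(integrate(tf.constant(0.8)), obs_times)  # synthetic "data"

theta = tf.Variable(0.3)                      # initial guess for the unknown
opt = tf.keras.optimizers.Adam(learning_rate=0.05)
for _ in range(200):
    with tf.GradientTape() as tape:
        misfit = tf.gather(integrate(theta), obs_times) - data
        loss = tf.reduce_sum(misfit ** 2)
    opt.apply_gradients([(tape.gradient(loss, theta), theta)])

print(float(theta))                           # converges toward 0.8
```

Classes (1) and (3) follow the same pattern, with the tape differentiating with respect to the initial condition or the control inputs instead of the model parameter.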

Host: David Métivier