Center for Nonlinear Studies
Thursday, August 25, 2022
1:00 PM - 2:00 PM
Webex

Seminar

PIML Seminar Series 5: Learning operators using deep neural networks for multiphysics, multiscale, & multifidelity problems

Lu Lu
University of Pennsylvania

It is widely known that neural networks (NNs) are universal approximators of continuous functions. A lesser-known but equally powerful result is that a NN can also accurately approximate any nonlinear continuous operator. This universal approximation theorem for operators suggests the structure and potential of deep neural networks (DNNs) for learning continuous operators or complex systems from streams of scattered data. In this talk, I will present the deep operator network (DeepONet) for learning various explicit operators, such as integrals and fractional Laplacians, as well as implicit operators that represent deterministic and stochastic differential equations. I will also present several extensions of DeepONet, such as DeepM&Mnet for multiphysics problems, DeepONet with proper orthogonal decomposition (POD-DeepONet), MIONet for multiple-input operators, and multifidelity DeepONet. More generally, DeepONet can learn multiscale operators spanning many scales and can be trained simultaneously on diverse sources of data. I will demonstrate the effectiveness of DeepONet and its extensions on a range of multiphysics and multiscale problems, such as nanoscale heat transport, bubble growth dynamics, high-speed boundary layers, electroconvection, and hypersonics.
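To make the architecture behind the abstract concrete, here is a minimal sketch of a DeepONet forward pass in plain numpy. This is an illustration of the published idea only, with hypothetical layer sizes and untrained random weights, not the speaker's implementation (his group's software is the DeepXDE library): a branch net encodes the input function u sampled at m fixed sensor locations, a trunk net encodes a query location y, and the operator output G(u)(y) is their inner product plus a bias.

```python
import numpy as np

rng = np.random.default_rng(0)

m, p = 100, 40  # number of input-function sensors; width of the latent basis

def mlp(x, weights):
    """Tiny fully connected network with tanh hidden activations."""
    for W, b in weights[:-1]:
        x = np.tanh(x @ W + b)
    W, b = weights[-1]
    return x @ W + b

def init(sizes):
    """Random (hypothetical) weights for a stack of dense layers."""
    return [(rng.standard_normal((a, b)) / np.sqrt(a), np.zeros(b))
            for a, b in zip(sizes[:-1], sizes[1:])]

branch = init([m, 64, p])   # encodes u(x_1), ..., u(x_m)
trunk = init([1, 64, p])    # encodes the query location y
bias = 0.0

def deeponet(u_sensors, ys):
    """G(u)(y) ~ sum_k branch_k(u) * trunk_k(y) + bias."""
    b = mlp(u_sensors, branch)       # shape (p,)
    t = mlp(np.atleast_2d(ys), trunk)  # shape (n, p) for n query points
    return t @ b + bias              # shape (n,)

# Example: evaluate the (untrained) operator on u(x) = sin(pi x)
xs = np.linspace(0.0, 1.0, m)
u = np.sin(np.pi * xs)
ys = np.array([[0.25], [0.5], [0.75]])
out = deeponet(u, ys)
print(out.shape)  # one prediction per query location
```

Training would fit the branch and trunk weights to pairs (u, G(u)(y)); the extensions mentioned in the talk (POD-DeepONet, MIONet, multifidelity DeepONet) modify how the two nets are built or combined while keeping this branch-trunk inner-product structure.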

Bio: Lu Lu is an Assistant Professor in the Department of Chemical and Biomolecular Engineering at the University of Pennsylvania. He is also a faculty member of the Penn Institute for Computational Science and of the Graduate Group in Applied Mathematics and Computational Science. Prior to joining Penn, he was an Applied Mathematics Instructor in the Department of Mathematics at the Massachusetts Institute of Technology from 2020 to 2021. He obtained his Ph.D. in Applied Mathematics from Brown University in 2020, along with master's degrees in Engineering, Applied Mathematics, and Computer Science from Brown University, and bachelor's degrees in Mechanical Engineering, Economics, and Computer Science from Tsinghua University in 2013. Lu has a multidisciplinary research background, with experience at the interface of applied mathematics, physics, computational biology, and computer science. The goal of his research is to model and simulate physical and biological systems at different scales by integrating modeling, simulation, and machine learning, and to provide strategies for system learning, prediction, optimization, and decision making in real time. His current research interest lies in scientific machine learning, including theory, algorithms, and software, and its applications to engineering, physical, and biological problems, with a broad focus on multiscale modeling and high performance computing for physical and biological systems. Lu received the 2022 U.S. Department of Energy Early Career Award and the 2020 Joukowsky Family Foundation Outstanding Dissertation Award from Brown University.



Hosts: Wenting Li, Arvind Mohan