Center for Nonlinear Studies
Thursday, March 19, 2026
2:00 PM - 3:00 PM
CNLS Conference Room (TA-3, Bldg 1690)

Seminar

Nonlinear splitting for gradient-based optimization

Brian Tran
CU Boulder

High-dimensional and non-convex optimization remains a challenging and important problem across a wide range of scientific disciplines, such as machine learning, data assimilation, and partial differential equation (PDE) constrained optimization. After discussing motivation and background on optimization, I will introduce nonlinear splitting for gradient-based optimization, which is designed to improve the stability and efficiency of optimization by allowing a gradient descent iteration to be semi-implicit in the optimization parameters, while providing the flexibility to account for nonlinear coupling between the parameters. Furthermore, this framework is compatible with acceleration techniques and gradient-based optimizers such as NAG, Adam, L-BFGS, and Anderson acceleration.

I will focus the talk on the setting of unconstrained optimization (although the framework also applies to constrained optimization) and discuss various theoretical and numerical results, concluding with preliminary work on stochastic nonlinear splitting with application to training neural networks.
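As a generic illustration of the semi-implicit idea (a minimal sketch, not the speaker's specific scheme), one common splitting treats the stiff quadratic part of an objective implicitly while keeping the nonlinear part explicit, so each step reduces to a linear solve. The objective, matrix, and step size below are hypothetical choices for demonstration only.

```python
import numpy as np

# Hypothetical objective: f(x) = 0.5 x^T A x + 0.25 ||x||^4,
# with gradient  A x + ||x||^2 x.
# Semi-implicit splitting: evaluate the quadratic term at x_{k+1},
# the nonlinear term at x_k:
#   x_{k+1} = x_k - h (A x_{k+1} + ||x_k||^2 x_k)
# which rearranges to the linear solve
#   (I + h A) x_{k+1} = (1 - h ||x_k||^2) x_k

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])          # illustrative SPD matrix
h = 0.5                              # step size (illustrative)
M = np.eye(2) + h * A                # implicit system matrix

x = np.array([1.0, -0.5])            # arbitrary starting point
for _ in range(100):
    x = np.linalg.solve(M, (1.0 - h * np.dot(x, x)) * x)

print(np.linalg.norm(x))             # approaches the minimizer x = 0
```

Because the quadratic part is handled implicitly, the iteration tolerates step sizes that would destabilize fully explicit gradient descent on a stiff quadratic term; here each iterate contracts toward the minimizer at the origin.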

Host: Jeremy Lilly