We introduce Featurized Koopman Mode Decomposition (FKMD), an integrated method for choosing KMD features using a learned Mahalanobis distance on a delay-embedded space. The method is inspired by the recent observation that the outer product of the weights of a fully trained neural network agrees with the average gradient outer product (AGOP) of the underlying interpolant ("Mechanism for feature learning in neural networks and backpropagation-free machine learning models," Radhakrishnan et al., 2024). The Mahalanobis distance aids in featurizing KMD in cases where good features are not known a priori. We show that FKMD improves predictions for a high-dimensional linear oscillator, a partially observed high-dimensional Lorenz attractor, and a cell signaling problem from cancer research.

Bio: David Aristoff is an associate professor of mathematics at Colorado State University, having joined the faculty in 2014. Prior to that, he was a Dunham Jackson Assistant Professor at the University of Minnesota, and he earned his Ph.D. in mathematics from UT Austin in 2011. His research is in applied mathematics and scientific computing.

Join the meeting in Teams
Meeting ID: 258 219 958 587
Passcode: 24cq2oZ7

Host: Danny Perez, T-1
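For readers unfamiliar with the ingredients in the abstract, below is a minimal, illustrative sketch of how an AGOP-learned Mahalanobis metric can featurize a delay-embedded trajectory. It assumes a Gaussian kernel feature map and NumPy; the function names, kernel choice, and bandwidth are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the FKMD implementation): AGOP-based Mahalanobis
# featurization of a delay-embedded trajectory, assuming Gaussian features.
import numpy as np

def delay_embed(x, delays):
    """Stack `delays` consecutive observations into one embedded state.

    x: array of shape (T, d), an observed trajectory.
    returns: array of shape (T - delays + 1, delays * d).
    """
    T = x.shape[0]
    return np.hstack([x[i:T - delays + 1 + i] for i in range(delays)])

def agop(jacobians):
    """Average gradient outer product over sample points.

    jacobians: array of shape (n, p, q), gradients of a fitted predictor
               (p outputs, q inputs) evaluated at n sample points.
    returns: (q, q) positive semidefinite matrix, mean of J^T J.
    """
    return np.einsum('npi,npj->ij', jacobians, jacobians) / jacobians.shape[0]

def mahalanobis_features(X, centers, M, bandwidth=1.0):
    """Gaussian features with Mahalanobis metric M on embedded states X.

    X: (n, q) embedded states; centers: (m, q) feature centers;
    M: (q, q) metric, e.g. the AGOP (or a power of it).
    returns: (n, m) feature matrix for use in a KMD regression.
    """
    diff = X[:, None, :] - centers[None, :, :]           # (n, m, q)
    d2 = np.einsum('nmi,ij,nmj->nm', diff, M, diff)      # squared Mahalanobis distances
    return np.exp(-d2 / (2.0 * bandwidth ** 2))
```

In this sketch one would delay-embed the data, fit any differentiable predictor of the next embedded state, form the AGOP from its gradients, and then build the Gaussian feature matrix with that metric; the details of how FKMD iterates and applies these features are in the talk and paper.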