Abstract: Reductions in computational expense are often achieved through low-fidelity surrogates learned from data produced by fine-scale models. This reliance on summarized information presents challenges, however, since summarization generally fails to commute with the fundamental model properties that guarantee stability in the fine-scale system. This talk explores the consequences of imbuing mathematical structure preservation directly into low-fidelity architectures meant to accelerate downstream analyses. Remarkably, it is shown that the advantageous dynamical properties obtained through this procedure have utility both as a general mechanism for information propagation and as a surrogate modeling tool.

Microsoft Teams: Join the meeting now
Meeting ID: 210 033 373 136
Passcode: HePfZF
Host: Yen Ting Lin (CCS-3)