Computer models have become popular scientific tools for understanding complex phenomena that arise in many areas of science and engineering. Their usefulness arises in situations where the real-world process is too costly or difficult to observe. In such cases, an investigator would instead like to use the computer model to understand the real-world process and answer scientific questions. However, there are some difficulties with this approach in practice. For instance, the model may be computationally demanding, so that only a few runs can be made; it may be "wrong" to some degree (e.g., missing or simplified physics); and it may depend on unknown inputs that need to be estimated from observational data (i.e., calibration parameters). Statistical uncertainty quantification aims to account for these various sources of uncertainty in a probabilistic manner. The goal is to provide answers to scientific questions that reflect these uncertainties. In the first part of this talk, I will outline some common statistical approaches to quantifying the uncertainty of complex computer models, and describe some commonly encountered problems that are actively researched in the statistics community. In the second part of the talk, I will outline a new statistical approach for calibrating computer model ensembles. Notably, the method can be seen as a Bayesian generalization of the popular Ensemble Kalman Filter. We will demonstrate this method by estimating two parameters of the Community Ice Sheet Model (CISM), which describes the flow evolution of ice sheets in the Arctic and Antarctic.

Host: Humberto C Godinez Vazquez, Mathematical Modeling and Analysis, Theoretical Division, 5-9188
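
For readers unfamiliar with the Ensemble Kalman Filter referenced in the abstract, the following is a minimal sketch of the standard stochastic EnKF analysis step (updating a forecast ensemble given an observation). It is background only, not the Bayesian generalization presented in the talk; the function name, array shapes, and toy setup are illustrative assumptions.

```python
import numpy as np

def enkf_update(X, y, H, R, rng=None):
    """Standard stochastic EnKF analysis step (illustrative sketch only).

    X : (n_state, n_ens) forecast ensemble, one column per member
    y : (n_obs,) observed data
    H : (n_obs, n_state) linear observation operator
    R : (n_obs, n_obs) observation-error covariance
    """
    if rng is None:
        rng = np.random.default_rng(0)
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)     # state anomalies about the ensemble mean
    HA = H @ A                                # anomalies mapped to observation space
    PHt = A @ HA.T / (n_ens - 1)              # sample estimate of P_f H^T
    HPHt = HA @ HA.T / (n_ens - 1)            # sample estimate of H P_f H^T
    K = np.linalg.solve(HPHt + R, PHt.T).T    # Kalman gain K = P_f H^T (H P_f H^T + R)^-1
    # Perturb the observation once per member, then shift each member toward the data
    Y = y[:, None] + rng.multivariate_normal(np.zeros_like(y), R, size=n_ens).T
    return X + K @ (Y - H @ X)                # analysis ensemble

# Toy usage: 3-dimensional state, 2 observed components, 50 ensemble members
rng = np.random.default_rng(1)
X = rng.normal(size=(3, 50))
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
R = 0.1 * np.eye(2)
y = np.array([0.5, -0.2])
Xa = enkf_update(X, y, H, R, rng)
print(Xa.mean(axis=1))  # analysis mean is pulled toward the observed components
```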