Uncertainties are generally classified according to their nature. There are several sources of error: errors related to simplifications of the equations (dimensional reduction, empirical parameterization), errors related to the numerical schemes, and errors related to the spatio-temporal discretization. The mathematical model must be supplemented by a set of data and parameters that describe the geometry of the system, its initial conditions and its boundary conditions. The physical parameters govern the constitutive laws of the system. These data and parameters are only partially and approximately observed and known. The resulting errors are generally classified into two groups:
- epistemic errors, which stem from a lack of knowledge about the system,
- aleatory errors, which stem from the stochastic nature of the system.
The aim is to understand and quantify the extent to which epistemic and aleatory uncertainties affect the quantities of interest (QOIs) computed by the numerical model. Statistical sensitivity methods are used to study how the response of the numerical solver reacts to changes in uncertain input variables, treated as random variables. Uncertainty propagation makes it possible to determine, from hypotheses on the input variables (for example, on their probability density functions), the range of variation and/or the probabilistic distribution of the QOIs. This topic is developing rapidly in the field of scientific computing for geosciences and beyond. It mobilizes the available computing resources to perform a large number of simulations in parallel.
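As a concrete illustration of variance-based sensitivity analysis, first-order Sobol indices can be estimated with a pick-freeze Monte Carlo scheme. This is a minimal sketch, not CERFACS code: the toy `model`, the uniform input distributions and the sample size are all illustrative assumptions standing in for an expensive solver.

```python
import random

def model(x1, x2):
    # Hypothetical toy solver; any expensive simulation could stand here.
    return x1 + 0.5 * x2 ** 2

def sobol_first_order(n=100_000, seed=0):
    """First-order Sobol indices via the pick-freeze Monte Carlo estimator:
    S_i = (E[f(X) f(X with only X_i frozen)] - E[f]^2) / Var(f)."""
    rng = random.Random(seed)
    ya, yb1, yb2 = [], [], []
    for _ in range(n):
        x1, x2 = rng.uniform(-1, 1), rng.uniform(-1, 1)
        x1p, x2p = rng.uniform(-1, 1), rng.uniform(-1, 1)
        ya.append(model(x1, x2))
        yb1.append(model(x1, x2p))   # freeze x1, resample x2 -> sensitivity to x1
        yb2.append(model(x1p, x2))   # freeze x2, resample x1 -> sensitivity to x2
    mean = sum(ya) / n
    var = sum((y - mean) ** 2 for y in ya) / n
    s1 = (sum(a * b for a, b in zip(ya, yb1)) / n - mean ** 2) / var
    s2 = (sum(a * b for a, b in zip(ya, yb2)) / n - mean ** 2) / var
    return s1, s2
```

For this additive toy model the analytical indices are S1 = 15/16 and S2 = 1/16, which the estimator recovers to within Monte Carlo noise; each index requires one extra ensemble of model runs per input variable.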
The most widespread method is the Monte Carlo method, which estimates QOI statistics from a random sample of inputs propagated through the physical model. Its convergence depends on the sample size and has the advantage of being independent of the dimension of the uncertain input space. It is a non-intrusive approach, which is of great interest in an industrial or operational context insofar as it is applied to a simulation without modifying the numerical code. Although robust and well suited to HPC parallelism, this approach is numerically costly, especially for complex simulations, and other approaches are being developed by the scientific community. High-fidelity computing codes certainly provide a more satisfactory description of reality on a fine mesh, with a good representation of the equations of physics, solved with small time steps; but their computational cost can be prohibitive, especially in an ensemble context such as uncertainty propagation or inverse problems. In this context, it becomes necessary to replace these accurate but expensive models with more affordable surrogates.
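A minimal non-intrusive Monte Carlo propagation might look as follows. The `solver` function and the input distributions are hypothetical stand-ins for the expensive physical model and its calibrated uncertainties; the point is that the solver is called as a black box, without modifying its code.

```python
import random
import statistics

def solver(params):
    # Stand-in for an expensive physical model (hypothetical toy response).
    k, f = params
    return f / k

def monte_carlo(n=20_000, seed=1):
    """Non-intrusive Monte Carlo propagation: sample uncertain inputs,
    run the solver on each sample, collect statistics of the QOI."""
    rng = random.Random(seed)
    qois = []
    for _ in range(n):
        k = rng.gauss(2.0, 0.1)      # uncertain physical parameter
        f = rng.uniform(0.9, 1.1)    # uncertain forcing
        qois.append(solver((k, f)))
    mean = statistics.fmean(qois)
    # Standard error decays as 1/sqrt(n), regardless of input dimension.
    stderr = statistics.stdev(qois) / n ** 0.5
    return mean, stderr
```

Since every sample is independent, the loop parallelizes trivially across HPC resources; the cost issue is simply that each iteration is a full solver run.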
At CERFACS, this activity is transversal: it connects the transversal axis Assimilation de données with the research axes Environnement, Climat, Aérodynamique and Combustion. The applications at CERFACS relate to flood forecasting, wildland fire propagation, climate variability and combustion chamber ignition.
The uncertainty quantification activity at CERFACS aims to estimate uncertainties for numerical solvers that require large computational resources, in a high-performance computing context, for environmental risk, data assimilation, optimization (in particular under uncertainty) and coupling.
The principal actions are:
- Ensemble generation and Design of Experiments for functional variables, with dimension-reduction solutions
- Development and evaluation of objective-oriented, cost-effective surrogate model methodologies for high-dimensional problems
- Efficient use of surrogate models for sensitivity analysis, optimization and data assimilation
- Development of HPC-efficient algorithms for stochastic estimation with solvers of increasing complexity (multi-fidelity, Multilevel Monte Carlo)
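The multi-fidelity idea behind Multilevel Monte Carlo can be sketched with a two-level estimator: a cheap low-fidelity model absorbs most of the variance with many samples, and a small number of expensive correction samples removes its bias. Both models below (`fine`, `coarse`) are toy stand-ins chosen only so that they are strongly correlated, which is the property MLMC exploits.

```python
import math
import random

def fine(x):
    return math.sin(x)           # "high-fidelity" model (hypothetical stand-in)

def coarse(x):
    return x - x ** 3 / 6        # cheap low-fidelity approximation of the same QOI

def two_level_mc(n_coarse=200_000, n_corr=2_000, seed=2):
    """Two-level Monte Carlo estimator of E[fine(X)], X ~ U(0, 1):
    E[fine] = E[coarse] + E[fine - coarse].
    The cheap level uses many samples; the costly correction needs far
    fewer because fine and coarse are strongly correlated, so the
    difference has small variance."""
    rng = random.Random(seed)
    level0 = sum(coarse(rng.random()) for _ in range(n_coarse)) / n_coarse
    corr = sum(fine(x) - coarse(x)
               for x in (rng.random() for _ in range(n_corr))) / n_corr
    return level0 + corr
```

A full MLMC algorithm generalizes this to a hierarchy of levels (e.g. mesh refinements) and chooses the number of samples per level to minimize total cost at a prescribed accuracy.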
These actions are implemented for the following applications:
- Development of surrogate models for sensitivity analysis and ensemble-based data assimilation, with applications in hydraulics and aerodynamics for high-dimensional uncertain variables.
- Development of UQ algorithms for exascale computing with efficient, scalable and resilient domain decomposition algorithms
- Surrogate models for large-eddy simulations of the atmospheric boundary layer to emulate micro-scale meteorology and assess structural and parametric uncertainty, developed for pollutant dispersion applications.
- Application of multi-fidelity/Multilevel Monte Carlo (MLMC) algorithms for industrial CFD, multidisciplinary systems and geosciences.
- Building a test bed of global climate emulators/simple climate models to test constraints on future climate projections, improve existing scenario projections, assess uncertainties in long-term warming trajectories, and improve consistency between physical system models and socioeconomic models.
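To illustrate the surrogate idea underlying several of these applications, here is a minimal sketch under stated assumptions: `expensive_model` is a hypothetical stand-in for a costly simulation, and the surrogate is a simple quadratic least-squares fit (real studies use richer emulators such as polynomial chaos or Gaussian processes). The surrogate is trained on a handful of "expensive" runs and can then be evaluated cheaply inside ensemble studies.

```python
import math

def expensive_model(x):
    # Hypothetical stand-in for a costly high-fidelity simulation.
    return math.exp(x)

def fit_quadratic(xs, ys):
    """Least-squares quadratic surrogate y ~ a + b*x + c*x^2 (normal equations)."""
    # Normal system A @ [a, b, c] = rhs, with A[i][j] = sum(x^(i+j)).
    s = [sum(x ** k for x in xs) for k in range(5)]
    A = [[s[0], s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
    rhs = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    # Gaussian elimination with partial pivoting on the small 3x3 system.
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        rhs[i], rhs[p] = rhs[p], rhs[i]
        for r in range(i + 1, 3):
            m = A[r][i] / A[i][i]
            for c in range(i, 3):
                A[r][c] -= m * A[i][c]
            rhs[r] -= m * rhs[i]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):  # back substitution
        coef[i] = (rhs[i] - sum(A[i][j] * coef[j] for j in range(i + 1, 3))) / A[i][i]
    return lambda x: coef[0] + coef[1] * x + coef[2] * x * x

# Train on a handful of expensive runs, then evaluate cheaply everywhere.
train_x = [i / 8 for i in range(9)]
surrogate = fit_quadratic(train_x, [expensive_model(x) for x in train_x])
```

Once trained, the surrogate replaces the expensive code inside Monte Carlo loops, sensitivity analyses or optimization, at a cost per evaluation that is negligible compared with a solver run; the open question addressed by the actions above is how to control the surrogate's own approximation error.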