Credible computational modeling & Uncertainty Quantification. Development of numerical tools to assess and control the credibility of simulations. This embraces four underlying ideas: controlling the numerical accuracy (Verification), enhancing the quality of the approximation (Adaptivity), monitoring the pertinence of the model (Validation) and accounting for the aleatoric nature of the systems analyzed (Uncertainty Quantification).
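As an illustration of the Uncertainty Quantification pillar, the following is a minimal Monte Carlo propagation sketch: an assumed input distribution is pushed through a hypothetical forward model, and the sampling error of the estimated quantity of interest is reported alongside its mean. The model, the input distribution, and the sample size are illustrative assumptions, not the line's actual tools.

```python
# Minimal Monte Carlo uncertainty-propagation sketch (illustrative only).
import numpy as np

rng = np.random.default_rng(seed=0)

def model(k):
    """Hypothetical forward model: stands in for a full simulation."""
    return 1.0 / (1.0 + k**2)

# Aleatoric input: parameter k ~ N(1.0, 0.1^2) (assumed distribution).
samples = rng.normal(loc=1.0, scale=0.1, size=10_000)
qoi = model(samples)

mean = qoi.mean()
stderr = qoi.std(ddof=1) / np.sqrt(qoi.size)  # Monte Carlo sampling error
print(f"E[QoI] = {mean:.4f} +/- {1.96 * stderr:.4f} (95% CI)")
```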
Data-driven model updating. Data assimilation strategies incorporate data from sensors, from observations, and also from other models into the models themselves. This is complementary to Validation (via parameter identification) and strongly related to Uncertainty Quantification. This line includes developing novel Bayesian Markov chain Monte Carlo approaches.
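A minimal sketch of the Bayesian route mentioned above: a random-walk Metropolis sampler (one member of the Markov chain Monte Carlo family) identifying a single parameter of a hypothetical exponential-decay model from synthetic noisy observations. The forward model, prior, noise level, and proposal width are all assumptions chosen for illustration.

```python
# Random-walk Metropolis sketch for Bayesian model updating (illustrative).
import numpy as np

rng = np.random.default_rng(seed=1)

def forward(theta, t):
    """Hypothetical forward model evaluated at observation times t."""
    return np.exp(-theta * t)

t_obs = np.linspace(0.0, 2.0, 20)
theta_true, sigma = 1.5, 0.05
y_obs = forward(theta_true, t_obs) + rng.normal(0.0, sigma, t_obs.size)

def log_post(theta):
    if theta <= 0.0:                    # prior support: theta > 0
        return -np.inf
    misfit = y_obs - forward(theta, t_obs)
    log_prior = -0.5 * (theta - 1.0) ** 2        # assumed N(1, 1) prior
    return log_prior - 0.5 * np.sum(misfit**2) / sigma**2

theta, lp = 1.0, log_post(1.0)
chain = []
for _ in range(20_000):
    prop = theta + 0.1 * rng.normal()            # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:     # Metropolis accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta)

burned = np.array(chain[5_000:])                 # discard burn-in
print(f"posterior mean = {burned.mean():.3f}, std = {burned.std():.3f}")
```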
Reduced order models. Development of intrusive and nonintrusive Reduced Order Models, based on different numerical strategies and accompanied by error control.
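By way of illustration, a common nonintrusive starting point is Proper Orthogonal Decomposition: a reduced basis is extracted from solution snapshots by SVD, and the basis size is chosen from the singular-value decay, which gives a simple handle on the projection error. The snapshot data below is synthetic; this is a sketch of the generic technique, not a specific method of this line.

```python
# POD sketch: reduced basis from snapshots via SVD, with basis size chosen
# from an energy criterion (a simple form of error control). Illustrative.
import numpy as np

rng = np.random.default_rng(seed=2)

# Synthetic snapshots: n_dof full-order solutions at n_mu parameter values.
n_dof, n_mu = 500, 40
x = np.linspace(0.0, 1.0, n_dof)
mus = np.linspace(0.5, 2.0, n_mu)
S = np.column_stack([np.sin(mu * np.pi * x) + 0.01 * rng.normal(size=n_dof)
                     for mu in mus])

U, s, _ = np.linalg.svd(S, full_matrices=False)

# Keep the smallest basis capturing 99.99% of the snapshot energy.
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.9999)) + 1
basis = U[:, :r]

# Relative projection error of the snapshots onto the reduced basis.
err = np.linalg.norm(S - basis @ (basis.T @ S)) / np.linalg.norm(S)
print(f"reduced dimension r = {r}, relative projection error = {err:.2e}")
```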
Robust solvers for computational science and engineering. Development of simulation tools that are insensitive to mesh quality and tailored to specific physical problems of industrial interest.
Multi-fidelity surrogates for parametric studies. Detailed simulations of complex phenomena are often unaffordable due to their computational cost, while simplified models are usually not accurate enough to reach the precision required by physicists and engineers. To make the real-time solution of parametric problems affordable, this line constructs and blends a hierarchy of simulations of different fidelities, bridging robust solvers with high-fidelity discretizations and reduced order models.
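A minimal sketch of the blending idea: a cheap low-fidelity model is corrected with a discrepancy term fitted to a handful of high-fidelity samples, yielding a surrogate that is inexpensive to evaluate across the parameter range. Both models and the polynomial correction are illustrative assumptions, not the line's actual hierarchy.

```python
# Multi-fidelity sketch: additive discrepancy correction (illustrative).
import numpy as np

def hi_fi(p):   # expensive high-fidelity model (assumed)
    return np.sin(2.0 * np.pi * p) + 0.2 * p**2

def lo_fi(p):   # cheap, biased low-fidelity approximation (assumed)
    return np.sin(2.0 * np.pi * p)

# Few high-fidelity samples; low-fidelity evaluations are nearly free.
p_hf = np.linspace(0.0, 1.0, 6)
delta = hi_fi(p_hf) - lo_fi(p_hf)            # observed discrepancy
coeffs = np.polyfit(p_hf, delta, deg=2)      # fit the discrepancy

def surrogate(p):
    """Blend: low-fidelity prediction plus fitted correction."""
    return lo_fi(p) + np.polyval(coeffs, p)

p_test = np.linspace(0.0, 1.0, 101)
err = np.max(np.abs(surrogate(p_test) - hi_fi(p_test)))
print(f"max surrogate error on test grid: {err:.2e}")
```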