List of innovations by the CoEs, spotted by the EU innovation radar

The EU Innovation Radar aims to identify high-potential innovations and innovators. It is an important source of actionable intelligence on innovations emerging from research and innovation projects funded through European Union programmes. 
 
These are the innovations from the HPC Centres of Excellence as spotted by the EU innovation radar:
 

Title: GROMACS, a versatile package to perform molecular dynamics
Market maturity: Exploring
Project: BioExcel
Innovation Topic: Excellent Science
KUNGLIGA TEKNISKA HOEGSKOLAN - SWEDEN


Title: Urgent Computing services for the impact assessment in the immediate aftermath of an earthquake
Market maturity: Tech Ready
Market creation potential: High
Project: ChEESE
Innovation Topic: Excellent Science
EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH - SWITZERLAND
BULL SAS - FRANCE


Title: New coupled earth system model
Market maturity: Tech Ready
Project: ESiWACE
Innovation Topic: Excellent Science
BULL SAS - FRANCE
MET OFFICE - UNITED KINGDOM
EUROPEAN CENTRE FOR MEDIUM-RANGE WEATHER FORECASTS - UNITED KINGDOM
 


Title: In-Situ Analysis of CFD Simulations
Market maturity: Tech Ready
Market creation potential: High
Project: Excellerat
Innovation Topic: Excellent Science
KUNGLIGA TEKNISKA HOEGSKOLAN - SWEDEN
FRAUNHOFER GESELLSCHAFT ZUR FOERDERUNG DER ANGEWANDTEN FORSCHUNG E.V. - GERMANY

Title: Interactive in situ visualization in VR
Market maturity: Tech Ready
Market creation potential: High
Project: Excellerat
Innovation Topic: Excellent Science
UNIVERSITY OF STUTTGART - GERMANY

Title: Machine Learning Methods for Computational Fluid Dynamics (CFD) Data
Market maturity: Tech Ready
Market creation potential: Noteworthy
Project: Excellerat
Innovation Topic: Excellent Science
KUNGLIGA TEKNISKA HOEGSKOLAN - SWEDEN
FRAUNHOFER GESELLSCHAFT ZUR FOERDERUNG DER ANGEWANDTEN FORSCHUNG E.V. - GERMANY


Title: Quantum Simulation as a Service
Market maturity: Exploring
Market creation potential: Noteworthy
Project: MaX
Innovation Topic: Excellent Science
EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH - SWITZERLAND
CINECA CONSORZIO INTERUNIVERSITARIO - ITALY


Watch The Presentations Of The First CoE Joint Technical Workshop

22. February 2021

Watch the recordings of the presentations from the first technical CoE workshop. The virtual event was organized by the three HPC Centres of Excellence ChEESE, EXCELLERAT and HiDALGO. The agenda for the workshop was structured into these four sessions:

Session 1: Load balancing
Session 2: In situ and remote visualisation
Session 3: Co-Design
Session 4: GPU Porting

You can also download a PDF version of each recorded presentation. The workshop took place on January 27–29, 2021.

Session 1: Load balancing

Title: Introduction by chairperson
Speaker: Ricard Borell (BSC)

Title: Intra and inter-node load balancing in Alya
Speaker: Marta Garcia and Ricard Borell (BSC)

Title: Load balancing strategies used in AVBP
Speaker: Gabriel Staffelbach (CERFACS)

Title: Addressing load balancing challenges due to fluctuating performance and non-uniform workload in SeisSol and ExaHyPE
Speaker: Michael Bader (TUM)

Title: On Discrete Load Balancing with Diffusion Type Algorithms
Speaker: Robert Elsäßer (PLUS)

Session 2: In situ and remote visualisation

Title: Introduction by chairperson
Speaker: Lorenzo Zanon & Anna Mack (HLRS)

 

Title: An introduction to the use of in-situ analysis in HPC
Speaker: Miguel Zavala (KTH)

Title: In situ visualisation service in Prace6IP
Speaker: Simone Bnà (CINECA)

Title: Web-based Visualisation of air pollution simulation with COVISE
Speaker: Anna Mack (HLRS)

Title: Virtual Twins, Smart Cities and Smart Citizens
Speaker: Leyla Kern, Uwe Wössner, Fabian Dembski (HLRS)

Title: In-situ simulation visualisation with Vistle
Speaker: Dennis Grieger (HLRS)

Session 3: Co-Design

Title: Introduction by chairperson, and Excellerat’s Co-Design Methodology
Speaker: Gavin Pringle (EPCC)

Title: Accelerating codes on reconfigurable architectures
Speaker: Nick Brown (EPCC)

Title: Benchmarking of Current Architectures for Improvements
Speaker: Nikela Papadopoulou (ICCS)

Title: Example Co-design Approach with the Seissol and Specfem3D Practical cases
Speaker: Georges-Emmanuel Moulard (ATOS)

Title: Exploitation of Exascale Systems for Open-Source Computational Fluid Dynamics by Mainstream Industry
Speaker: Ivan Spisso (CINECA)

Session 4: GPU Porting

Title: Introduction
Speaker: Giorgio Amati (CINECA)

Title: GPU Porting and strategies by Excellerat
Speaker: Ivan Spisso (CINECA)

Title: GPU Porting and strategies by ChEESE
Speaker: Piero Lanucara (CINECA)

Title: GPU porting by third party library
Speaker: Simone Bnà (CINECA)

Title: The HySEA GPGPU development and its role in ChEESE project
Speaker: Marc de la Asunción (UMA)

Video of the Week: ChEESE Women in Science

11. February 2021
ChEESE celebrates the International Day of Women and Girls in Science 2021 by interviewing several of its women researchers. This video acknowledges their contributions and recognises their importance to earth sciences and to science in general.

Geomagnetic forecasts

A Use Case by ChEESE

Short description

The Earth’s magnetic field is sustained by a fluid dynamo operating in the Earth’s fluid outer core. Its geometry and strength define the equivalent of the climatological mean over which the interaction of the Earth with its magnetic environment takes place. It is consequently important to make physics-based predictions of the evolution of the dynamo field over the next few decades. In addition, the geomagnetic field has the remarkable ability to reverse its polarity every now and then (the last reversal occurred some 780,000 years ago). Observations of the properties of the field during polarity transition are sparse, and ultra-high resolution simulations should help better define these properties.

Objectives

To simulate and analyse the consequences of geomagnetic reversals with an unprecedented level of accuracy. These events are extremely rare in the history of our planet, hence the need to resort to numerical simulations to better understand the properties of reversals and their possible consequences for society.

Technologies

Workflow

XSHELLS produces simulated reversals which are subsequently analysed and assessed using the parallel python processing chain. Through ChEESE we are working to orchestrate this workflow using the WMS_light software developed within the ChEESE consortium.
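As a rough illustration of the analysis step, identifying reversals in a simulated dynamo record amounts to locating sign changes of the axial dipole coefficient in the time series. The sketch below is illustrative only (the function name is hypothetical; the actual ChEESE processing chain is a parallel Python pipeline operating on full XSHELLS output):

```python
def detect_reversals(times, g10):
    """Locate candidate polarity reversals in a simulated dynamo time series
    as the zero crossings of the axial dipole coefficient g10, linearly
    interpolated between consecutive samples. Illustrative sketch only."""
    reversals = []
    for t0, t1, a, b in zip(times, times[1:], g10, g10[1:]):
        if a * b < 0:  # sign change between consecutive samples
            # linear interpolation of the zero crossing between t0 and t1
            reversals.append(t0 + (t1 - t0) * a / (a - b))
    return reversals
```

For example, a dipole series crossing zero between two samples yields one interpolated reversal time; a series of constant sign yields none.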

Software involved

XSHELLS code 

Post-processing: Python 3

External library: SHTns

Use Case Owner

Alexandre Fournier
Institut de Physique du Globe de Paris (IPGP)

Collaborating Institutions

IPGP, CNRS

Physics-Based Probabilistic Seismic Hazard Assessment (PSHA)

A Use Case by ChEESE

Short description

Physics-Based Probabilistic Seismic Hazard Assessment (PSHA) is widely established for deciding safety criteria, making official national hazard maps, developing building code requirements, ensuring the safety of critical infrastructure (e.g. nuclear power plants) and determining earthquake insurance rates by governments and industry. However, PSHA currently rests on empirical, time-independent assumptions known to be too simplistic and to conflict with earthquake physics. The resulting deficits become apparent as many damaging earthquakes occur in regions rated as low-risk by PSHA hazard maps, and near-fault effects from rupture on extended faults are not taken into account. Combined simulations of dynamic fault rupture and seismic wave propagation are crucial tools to shed light on the poorly constrained processes of earthquake faulting. Realistic model setups should acknowledge topography, 3D geological structures, rheology, and fault geometries with appropriate stress and frictional parameters, all of which contribute to complex ground motion patterns. A fundamental challenge here is to model the high-frequency content of the three-dimensional wave field, since the frequency range of 0–10 Hz is of pivotal importance for engineering purposes. Multiple executions of such multi-physics simulations need to be performed to provide a probabilistic hazard estimation.

Results & Achievements

Fault models built for both north and south Iceland

Fully non-linear dynamic simulations accounting for 3D velocity structures, topography, off-fault plasticity, and model parameter uncertainties, achieving the target resolution

CyberShake implemented successfully, with a demo run for south Iceland

Rupture probabilities generated using SHERIFS

GMPE-based hazard curves and maps produced with OpenQuake

About the SeisSol code:

Extended the YATeTo DSL to generate GPU GEMM kernels

Developed a Python library as a GEMM backend for YATeTo

Adapted both SeisSol and YATeTo for batched computations

Implemented the elastic solver: time, local, and neighbour integrals

Both GTS and LTS schemes are working

Enabled a distributed multi-GPU setup

Implemented the plasticity kernel (still to be updated)

Tested performance on a multi-GPU distributed cluster: M100

Merged the first stage from the experimental into the production code

As a result, we obtained a 23% time reduction with respect to the GPU-only execution. In practice, this represents a performance boost equivalent to attaching an additional GPU per node and thus a much more efficient exploitation of the resources.

Objectives

The objective of this use case is to develop general concepts for enabling physics-based seismic hazard assessment with state-of-the-art multi-physics earthquake simulation software (SeisSol, SpecFEM3D, ExaHyPE, AWP-ODC) and to conduct 3D physics-based seismic simulations to improve PSHA for validation scenarios provided by IMO (Iceland) and beyond. This use case is expected to supplement established methods used by stakeholders, for different target regions and varying degrees of complexity.

Technologies

Workflow

The workflow of this pilot is shown in Figure 1.

The SeisSol code is used to run fully non-linear dynamic rupture simulations, accounting for various fault geometries, 3D velocity structures, off-fault plasticity, and model parameter uncertainties, in order to build a fully physics-based dynamic rupture database of mechanically plausible scenarios.

Linked post-processing Python codes are then used to extract ground shaking measures (PGD, PGV, PGA and SA at different periods) from the surface output of the SeisSol simulations to build a ground shaking database.
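The extraction step can be illustrated with a toy version: given a velocity time series at one surface receiver, the peak ground metrics follow from taking the peak of the signal, its derivative, and its integral. This is a simplified sketch under those assumptions, not the actual SeisSol post-processing toolchain:

```python
import numpy as np

def peak_ground_motions(t, vel):
    """Toy extraction of peak ground motions from one velocity time series:
    PGV is the peak of |vel|, PGA the peak of the numerically differentiated
    signal, PGD the peak of the trapezoidally integrated signal."""
    t, vel = np.asarray(t, float), np.asarray(vel, float)
    acc = np.gradient(vel, t)  # velocity -> acceleration
    # cumulative trapezoidal integration: velocity -> displacement
    disp = np.concatenate(([0.0], np.cumsum(0.5 * (vel[1:] + vel[:-1]) * np.diff(t))))
    return {"PGV": np.abs(vel).max(), "PGA": np.abs(acc).max(), "PGD": np.abs(disp).max()}
```

Spectral acceleration (SA) at different periods would additionally require passing the acceleration through single-degree-of-freedom oscillators, which is omitted here.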

SHERIFS uses a logic-tree method, taking the fault-to-fault ruptures from the dynamic rupture database as input and converting slip rates into annual seismic rates given the geometry of the fault system.

With the rupture probability estimation from SHERIFS, and ground shakings from the SeisSol simulations, we can generate the hazard curves for selected site locations and hazard maps for the study region. 

In addition, OpenQuake can use physics-based ground motion models/prediction equations established with the ground shaking database from the fully dynamic rupture simulations, and CyberShake, which is based on kinematic simulations, can perform PSHA to complement the fully physics-based PSHA.
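The final combination step can be sketched as a small calculation: each scenario contributes its annual rate to the exceedance rate of every intensity level below its simulated shaking at a site, and a Poisson assumption converts rates into probabilities over a time window. A minimal illustration of that idea (hypothetical function, not the OpenQuake engine):

```python
import math

def hazard_curve(scenario_rates, scenario_shaking, im_levels, t_years=50.0):
    """Combine annual rupture rates (e.g. from SHERIFS) with simulated ground
    shaking values at one site (e.g. PGA from the dynamic rupture database)
    into a hazard curve: the probability of exceeding each intensity level
    within t_years, under a Poisson occurrence assumption."""
    curve = []
    for im in im_levels:
        # total annual rate of scenarios whose shaking exceeds this level
        lam = sum(r for r, s in zip(scenario_rates, scenario_shaking) if s > im)
        curve.append(1.0 - math.exp(-lam * t_years))
    return curve
```

By construction the curve is non-increasing in the intensity level, which is the usual shape of a site hazard curve.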

Software involved

SeisSol (LMU)

ExaHyPE (TUM)

AWP-ODC (SCEC)

SHERIFS (Fault2SHA and GEM)

sam(oa)² (TUM) 

OpenQuake (GEM): https://github.com/gem/oq-engine 

Pre-processing:

Mesh generation tools: Gmsh (open source), Simmetrix/SimModeler (free for academic institution), PUMGen

Post-processing & Visualization: Paraview, python tools 

Use Case Owner

Alice-Agnes Gabriel
Ludwig Maximilian University of Munich (LMU)

Collaborating Institutions

IMO, BSC, TUM, INGV, SCEC, GEM, FAULT2SHA, Icelandic Civil Protection, Italian Civil Protection

Faster Than Real-Time Tsunami Simulations

A Use Case by ChEESE

Short description

Faster-than-real-time (FTRT) tsunami computations are crucial in the context of Tsunami Early Warning Systems (TEWS). Greatly improved and highly efficient computational methods are the first raw ingredient for extremely fast and effective calculations. High-performance computing facilities have the role of pushing this efficiency as far as possible while drastically reducing computational times. This use case comprises both earthquake and landslide sources. Earthquake tsunami generation is to an extent simpler than landslide tsunami generation, as landslide-generated tsunamis depend on the landslide dynamics, which necessitates coupling dynamic landslide simulation models to the tsunami propagation. In both cases, FTRT simulations in several contexts and configurations are the final aim of this use case.

Results & Achievements

Improving the HySEA codes in ChEESE:

We have improved the load balancing algorithm. In particular we have added support for heterogeneous GPUs in the load balancing algorithm by assigning a numerical weight to each GPU.

We have developed a new algorithm for the nested meshes processing based on the current state values and we have implemented the activation of the nested meshes processing when a movement of the water is detected in their area.

Implemented asynchronous file writing by creating an additional thread for each MPI process using C++11 threads (see Table 1 at the end of this document).

Added the possibility of resuming a stored simulation.

Added sponge layers for better treatment of the boundary conditions, in order to avoid possible numerical instabilities at the borders of the domain.

Implemented asynchronous CPU-GPU memory transfers.

We have dramatically reduced the size of the output files by compressing the data using the algorithm described in Tolkova (2008) and saving most of the data in single-precision files. A new version of Tsunami-HySEA has been developed to run simultaneous simulations on the same domain, meeting the requirements of PD7 and PD8, by executing one simulation on each GPU. This new version is able to use up to 1024 GPUs simultaneously with very good weak scaling (losing only around 3% efficiency).

With these improvements, we obtain around a 30% reduction in computational time with respect to the previous version of the codes.

The codes have been tested on CTE-POWER (BSC), DAVIDE and Marconi100 (CINECA) and Piz Daint (CSCS) supercomputers.
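The asynchronous output improvement listed above (one extra writer thread per MPI process, implemented with C++11 threads in Tsunami-HySEA) follows a standard pattern: the compute loop pushes output to a queue and returns immediately, while a dedicated thread drains the queue to disk. A Python analogue of that pattern, for illustration only:

```python
import queue
import threading

class AsyncWriter:
    """Queue-fed writer thread: write() never blocks the compute loop on
    disk I/O; a background thread drains the queue to the file. Sketch of
    the asynchronous-output pattern, not the Tsunami-HySEA implementation."""

    def __init__(self, path):
        self._q = queue.Queue()
        self._f = open(path, "w")
        self._t = threading.Thread(target=self._drain, daemon=True)
        self._t.start()

    def _drain(self):
        # Runs in the writer thread; None is the shutdown sentinel.
        while True:
            item = self._q.get()
            if item is None:
                break
            self._f.write(item)

    def write(self, text):
        """Called from the compute loop; enqueues and returns immediately."""
        self._q.put(text)

    def close(self):
        """Flush all pending output, stop the writer thread, close the file."""
        self._q.put(None)
        self._t.join()
        self._f.close()
```

In the real code the payload would be a simulation snapshot written via (P)netCDF rather than text, but the decoupling of compute and I/O is the same.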

Objectives

The aim of this use case is to provide robust and very efficient numerical codes for FTRT Tsunami simulations that can be run in massively parallel multi-GPU architectures.

Technologies

Workflow

The Faster-Than-Real-Time (FTRT) prototype for extremely fast and robust tsunami simulations is based upon GPU/multi-GPU (NVIDIA) architectures and is able to use earthquake information from different locations and with heterogeneous content (full Okada parameter set, hypocenter and magnitude plus Wells and Coppersmith (1994)). Using these inhomogeneous inputs, and according to the FTRT workflow (see Fig. 1), tsunami computations are launched for a single scenario or a set of related scenarios for the same event. Basically, the automated retrieval of earthquake information is sent to the system and on-the-fly simulations are automatically launched, so several scenarios are computed at the same time. As updated information about the source is provided, new simulations are launched. As output, several options are available tailored to the end-user needs, selecting among: sea surface height and its maximum, simulated isochrones and arrival times to the coastal areas, estimated tsunami coastal wave height, and time series at Points of Interest (POIs) and oceanographic sensors.

A first successful implementation has been carried out for the Emergency Response Coordination Centre (ERCC), a service provided by the ARISTOTLE-ENHSP Project. The system implemented for ARISTOTLE follows the general workflow presented in Figure 1. Currently, in this system, a single source is used to assess the hazard and the computational grids are predefined. The computed wall-clock time is provided for each experiment, and the outputs of the simulation are the maximum water height and arrival times on the whole domain, and water height time series at a set of selected POIs, predefined for each domain.

A library of Python codes is used to generate the input data required to run the HySEA codes, and to extract the topo-bathymetric data and construct the grids used by the HySEA codes.

Software involved

Tsunami-HySEA has been successfully tested with the following tools and versions:

Compilers: GNU C++ compiler 7.3.0 or 8.4.0, OpenMPI 4.0.1, Spectrum MPI 10.3.1, CUDA 10.1 or 10.2

Management tools: CMake 3.9.6 or 3.11.4

External/third party libraries: NetCDF 4.6.1 or 4.7.3, PnetCDF 1.11.2 or 1.12.0

Pre-processing:

Nesting mesh generation tools.

In-house developed python tools for pre-processing purposes. 

Visualization tools:

In-house developed python tools.

Use Case Owner

Jorge Macías Sanchez
Universidad de Málaga

Collaborating Institutions

UMA, INGV, NGI, IGN, PMEL/NOAA (with a role in pilot’s development).

Other institutions benefiting from use case results with which we collaborate:
IEO, IHC, IGME, IHM, CSIC, CCS, Junta de Andalucía (all Spain); Italian Civil Protection, Seismic network of Puerto Rico (US), SINAMOT (Costa Rica), SHOA and UTFSM (Chile), GEUS (Denmark), JRC (EC), University of Malta, INCOIS (India), SGN (Dominican Republic), UNESCO, NCEI/NOAA (US), ICG/NEAMTWS, ICG/CARIBE-EWS, among others.

Probabilistic Volcanic Hazard Assessment (PVHA)

A Use Case by ChEESE

Short description

PVHA methodologies provide a framework for assessing the likelihood that a given measure of intensity of different volcanic phenomena, such as tephra loading on the ground, airborne ash concentration or pyroclastic flows, will be exceeded at a particular location within a given time period. This pilot deals with regional long- and short-term PVHA. Regional assessments are crucial for better land-use planning and for risk-mitigation actions by civil protection authorities. Because of the computational costs required to adequately simulate volcanic phenomena, PVHA is mostly based on single or very few selected reference scenarios. Independently of the degree of approximation of the numerical model used, PVHA for tephra loading and/or airborne ash concentration requires a high number of tephra dispersion simulations (typically several thousand, in order to capture the variability in meteorological and volcanological conditions), each of which is moderately intensive. This pilot comprises both long- and short-term probabilistic hazard assessment for volcanic tephra fallout, adopting and improving a recently proposed methodology (Sandri et al., 2016) able to capture aleatory and epistemic uncertainties. Long-term probabilistic hazard assessment for PDCs is also envisaged, focusing on aleatory and epistemic uncertainties in the eruptive source parameters. Since tephra fallout models also allow a consistent treatment of spatially and temporally variable wind fields and can describe phenomena like ash aggregation, exascale capacity will also allow, for the first time, spatially extending PVHA to evaluate the potential impact from all active volcanoes in Italy on the entire national territory.

Results & Achievements

The award of PRACE resources, in association with PD3 (High-Resolution Volcanic Plume Simulation) and PD12 (High-Resolution Volcanic Ash Dispersal Forecast), to run FALL3D simulations at the required target resolution and spatial domain.

A prototype version of PVHA_WF to process the simulations and produce hazard maps.

The application of PVHA_WF to the case of Campi Flegrei volcano, in Southern Italy, in an illustrative example for 5, 6 and 7 December 2019, to show the proof of concept and feasibility.

Objectives

The objective of this use case is to provide innovative hazard maps with uncertainty, overcoming the current limits of PVHA imposed so far by the high computational cost required to adequately simulate complex volcanic phenomena (such as tephra dispersal) while fully exploring the natural variability associated with such phenomena, on a country-size domain (thousands of km) at high resolution (one to a few km).

Technologies

Workflow

PVHA_WF_st fetches the monitoring data (seismic and deformation) and, together with the configuration file of the volcano, calculates the eruption forecast (probability curves and vent-opening positions) and uses the output file from alphabeta_MPI.py to create the volcanic hazard probabilities and maps.

PVHA_WF_lt uses the configuration file of the volcano to calculate the eruption forecast and, together with the output file from alphabeta_MPI.py, creates the volcanic hazard probabilities and maps.

The meteorological data download process is fully automated. PVHA_WF_st and PVHA_WF_lt connect to the Climate Data Store (the Copernicus data server) and download the meteorological data associated with a specified analysis grid. These data are later used by FALL3D to compute tephra deposition.
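Programmatic downloads from the Climate Data Store go through the `cdsapi` Python package. The sketch below is illustrative only: the dataset name, variables and levels are assumptions, since the text does not specify the actual PVHA_WF request, and a real download additionally requires CDS credentials in `~/.cdsapirc`.

```python
def build_cds_request(area, dates, times=("00:00", "12:00")):
    """Build a CDS request for wind fields over the analysis grid.
    `area` is [north, west, south, east] in degrees. The variables and
    pressure levels here are illustrative, not the PVHA_WF configuration."""
    return {
        "product_type": "reanalysis",
        "variable": ["u_component_of_wind", "v_component_of_wind"],
        "pressure_level": ["500", "700", "850"],
        "date": list(dates),
        "time": list(times),
        "area": list(area),
        "format": "netcdf",
    }

def download_meteo(area, dates, target="meteo.nc"):
    """Fetch the data from the Copernicus Climate Data Store (needs
    credentials in ~/.cdsapirc; performs a network request)."""
    import cdsapi
    client = cdsapi.Client()
    client.retrieve("reanalysis-era5-pressure-levels",
                    build_cds_request(area, dates), target)
```

The resulting NetCDF file would then be converted into the meteorological input format expected by FALL3D.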

Software involved

FALL3D

Use Case Owner

Laura Sandri
INGV Bologna

Collaborating Institutions

INGV
BSC
IMO

High-Resolution Volcanic Ash Dispersal Forecast

A Use Case by ChEESE

Short description

Operational volcanic ash dispersal forecasts are routinely used to prevent aircraft encounters with volcanic ash clouds and to perform re-routings avoiding contaminated airspace areas. However, a gap exists between current operational forecast products (e.g. those issued by the Volcanic Ash Advisory Centers) and the requirements of the aviation sector and related stakeholders. Two aspects are particularly critical: 1) the time and space scales of current forecasts are coarse (for example, the current operational setup of the London VAAC at the UK Met Office outputs on a 40 km horizontal resolution grid with 6-hour time averages); and 2) forecasts are not quantitative. Several studies (e.g. Kristiansen et al., 2012) have concluded that the main source of epistemic/aleatory uncertainty in ash dispersal forecasts comes from the quantification of the source term (eruption column height and strength), which very often is not fully constrained in real time. This limitation can be circumvented in part by integrating into models ash cloud observations away from the source, typically satellite retrievals of fine ash column mass load (i.e. the vertical integration of concentration). Model data assimilation has the potential to improve ash dispersal forecasts by an efficient joint estimation of the (uncertain) volcanic source parameters and the state of the ash cloud.

Results & Achievements

Implementation of ensemble forecasts in FALL3D to run different ensemble members (realizations) as a single model run.

A new workflow component has been developed to retrieve ash (and SO2) cloud column mass from last-generation satellite instrumentation.

A new satellite data assimilation module based on the Parallel Data Assimilation Framework (PDAF) has been implemented.

Objectives

Volcanic ash cloud forecasts are performed shortly before or during an eruption in order to predict expected fallout rates in the next hours or days and/or to prevent aircraft encounters with volcanic clouds. These forecasts constitute the main decision tool for flight cancellations and airplane re-routings avoiding contaminated airspace areas. However, an important gap exists between current operational products and the actual requirements of the aviation industry and related stakeholders in terms of model resolution, frequency of forecasts, and quantification of airborne ash concentration. This pilot demonstrator is implementing an ensemble-based data assimilation system (workflow) combining the FALL3D dispersal model with high-resolution geostationary satellite retrievals in order to furnish high-resolution forecasts.
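The ensemble-based assimilation idea can be illustrated with a single stochastic ensemble Kalman analysis step: each ensemble member of the model state is nudged toward the observation in proportion to the sample covariance between state and predicted observation. This is a textbook sketch standing in for the PDAF-based module, whose actual interfaces are not described in the text:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_operator, obs_var, seed=0):
    """One stochastic EnKF analysis step for a single scalar observation
    (e.g. a satellite retrieval of ash column mass at one pixel).
    ensemble: (n_members, n_state) model states; obs_operator: (n_state,)
    linear map from state to observation. Illustrative sketch only."""
    ens = np.asarray(ensemble, dtype=float)
    h = np.asarray(obs_operator, dtype=float)
    hx = ens @ h                                    # predicted observations
    anom_x = ens - ens.mean(axis=0)                 # state anomalies
    anom_h = hx - hx.mean()                         # predicted-obs anomalies
    p_xh = anom_x.T @ anom_h / (len(ens) - 1)       # state/obs covariance
    p_hh = anom_h @ anom_h / (len(ens) - 1) + obs_var
    gain = p_xh / p_hh                              # Kalman gain, (n_state,)
    # perturb the observation once per member (stochastic EnKF)
    obs_pert = obs + np.random.default_rng(seed).normal(0.0, np.sqrt(obs_var), len(ens))
    return ens + np.outer(obs_pert - hx, gain)
```

State components uncorrelated with the observation are left untouched, while observed components move toward the retrieval, which is how the joint estimation of source parameters and cloud state works in spirit.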

Technologies

Workflow

Use case workflow includes the following components:

The download and pre-process of required meteorological data.

The download of raw satellite data and the cloud mass quantitative retrievals (SEVIRI retrievals at 0.1º resolution, 1-hour frequency).

The ensemble forecast execution using the FALL3D model (i.e. the HPC component of the workflow).

No WMS is available yet (work in progress).

Software involved

 

FALL3D code

Use Case Owner

Arnau Folch
Barcelona Supercomputing Center-Centro Nacional de Supercomputación (BSC-CNS)

Collaborating Institutions

BSC
INGV
IMO

ChEESE: New open access publication on Probabilistic Tsunami Hazard Analysis

5. January 2021
Check out the new open access publication by the ChEESE CoE on Probabilistic Tsunami Hazard Analysis.
(c) ChEESE

New POP CoE blog post: Speedups of a Volcanic Hazard Assessment Code

11. November 2020

Latest blog post by the POP CoE: discover how their work on the Probabilistic Volcanic Hazard Assessment Workflow package (PVHA_WF) led to speedups of around 500x over the total execution time.

The package is a workflow created for the ChEESE CoE Pilot Demonstrator 6 (PD6).

>> POP CoE Blog Post
>> ChEESE Pilot Demonstrators

(c) POP CoE