Optimization of Earth System Models on the path to the new generation of Exascale high-performance computing systems

 A Use Case by

Short description

In recent years, our understanding of climate prediction has grown and deepened significantly. This progress is facilitated by improvements to our global Earth System Models (ESMs). These models aim to represent our future climate and weather ever more realistically, reducing uncertainties in these chaotic systems and explicitly calculating and representing features that coarser-resolution models could not resolve.

A new generation of exascale supercomputers and massive parallelization are needed to calculate small-scale processes and features with high-resolution climate and weather models.

However, the overhead produced by this massive parallelization will be dramatic, and new high-performance computing (HPC) techniques will be required to rise to the challenge. These techniques will enable scientists to make efficient use of upcoming exascale machines, to set up ultra-high-resolution experiment configurations of ESMs, and to run the respective simulations. Such configurations will be used to predict climate change over the coming decades and to study extreme events such as hurricanes.

Results & Achievements

The main components (OpenIFS and NEMO) of the new EC-Earth version under development are being tested on MareNostrum IV at the new ultra-high horizontal resolution of 10 km, using a significant number of cores: up to 2048 nodes (98,304 cores) for the NEMO component and up to 1024 nodes (49,152 cores) for the OpenIFS component.

Different optimizations included in these components (developed in the framework of the ESiWACE and ESiWACE2 projects) have been tested to evaluate the computational efficiency achieved. For example, the OpenIFS version with the new integrated parallel I/O can write hundreds of gigabytes of output while increasing execution time by only 2% compared to a run without I/O, a large improvement over the previous version, whose I/O overhead was close to 50%. Moreover, this approach will allow the same I/O server to be used for both components, facilitating more complex online computations and a common file format (netCDF).
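Why an I/O server shrinks the overhead so drastically can be illustrated with a toy sketch (this is an analogy, not the actual OpenIFS implementation): instead of blocking the model on every write, the fields are handed off to a separate writer that drains them in the background while the time loop keeps running.

```python
import queue
import threading
import time

COMPUTE_COST = 0.001  # stand-in for one model time step (seconds)

def synchronous_run(steps, io_cost):
    """Old scheme: the model blocks on every write."""
    t0 = time.perf_counter()
    for _ in range(steps):
        time.sleep(COMPUTE_COST)   # compute one step
        time.sleep(io_cost)        # blocking write of the output fields
    return time.perf_counter() - t0

def asynchronous_run(steps, io_cost):
    """I/O-server scheme: writes drain on a separate thread while the model runs."""
    q = queue.Queue()

    def writer():
        while q.get() is not None:
            time.sleep(io_cost)    # the "server" writes in the background

    t = threading.Thread(target=writer)
    t.start()
    t0 = time.perf_counter()
    for step in range(steps):
        time.sleep(COMPUTE_COST)   # compute one step
        q.put(step)                # hand the fields off without waiting
    elapsed = time.perf_counter() - t0  # time as seen by the model itself
    q.put(None)                    # tell the writer to finish
    t.join()
    return elapsed
```

With the writes overlapped, the model's wall-clock time approaches the compute-only time, which is the effect behind dropping from ~50% to ~2% overhead.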

Preliminary results with the new mixed-precision version integrated in NEMO have shown an improvement of almost 40% in execution time, without any loss of accuracy in the simulation results.
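The idea behind the mixed-precision approach can be sketched with a toy stencil kernel (a stand-in only; NEMO's actual implementation selects which Fortran variables can safely use single precision): running the same update in float32 halves memory traffic while, for well-conditioned kernels, the result stays very close to the float64 reference.

```python
import numpy as np

def timestep(field, dt):
    # Toy 1-D diffusion update, a stand-in for an ocean-model kernel.
    lap = np.roll(field, 1) - 2.0 * field + np.roll(field, -1)
    return field + dt * lap

def run(n_steps, dtype):
    x = np.linspace(0.0, 2.0 * np.pi, 256)
    field = np.sin(x).astype(dtype)       # initial condition in the chosen precision
    for _ in range(n_steps):
        field = timestep(field, dtype(0.1))
    return field

ref = run(200, np.float64)                # double-precision reference
low = run(200, np.float32)                # reduced precision: half the memory traffic
max_err = float(np.max(np.abs(ref - low.astype(np.float64))))
```

Here `low` occupies half the bytes of `ref`, and `max_err` stays orders of magnitude below the physical signal, which is the kind of check used to confirm that accuracy is preserved.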

Objectives

EC-Earth is one such model system; it is used in 11 different countries by 24 meteorological and academic institutions to produce reliable climate predictions and climate projections. It is composed of several components, the most important being the atmospheric model OpenIFS and the ocean model NEMO.

EC-Earth is one of the ESMs that suffer from a lack of scalability at higher resolutions, with an urgent need for improvements in capability and capacity on the path to exascale. Our main goal is to achieve good scalability of EC-Earth at horizontal resolutions of up to 10 km under extreme parallelization. To this end, several objectives are being pursued:

(1) Computational profiling of EC-Earth: analysing the most severe bottlenecks of the main components when extreme parallelization is used.

(2) Exploiting high-end architectures efficiently and reducing the energy consumption of the model to achieve a minimum efficiency, in order to be ready for the new hardware. For this purpose, different High Performance Computing techniques are being applied, for example the integration of fully parallel input/output (I/O), or reducing the precision of some model variables while maintaining the same accuracy in the results and improving the model's execution time.

(3) Evaluating whether massively parallel execution and the newly implemented methods could affect the quality of the simulations or impair reproducibility.
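Objective (1) boils down to attributing wall-clock time to model regions. A minimal, generic region timer (illustrative only; profiling a full ESM relies on dedicated HPC tracing and profiling tools) could look like:

```python
import time
from collections import defaultdict
from contextlib import contextmanager

timings = defaultdict(float)

@contextmanager
def region(name):
    """Accumulate wall-clock time spent inside a named code region."""
    t0 = time.perf_counter()
    try:
        yield
    finally:
        timings[name] += time.perf_counter() - t0

def hotspots():
    """Regions sorted by total time, with their share of the measured total."""
    total = sum(timings.values()) or 1.0
    return [(name, t, 100.0 * t / total)
            for name, t in sorted(timings.items(), key=lambda kv: -kv[1])]

# Toy use: two fake model regions with different costs.
for _ in range(5):
    with region("dynamics"):
        time.sleep(0.004)   # expensive part of the step
    with region("io"):
        time.sleep(0.001)   # cheaper part of the step
```

Sorting regions by accumulated time immediately surfaces the most severe bottleneck, which is the starting point for any of the optimizations described above.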

ESiWACE Newsletter 01/2021 published

13 January 2021
The Centre of Excellence in Simulation of Weather and Climate in Europe (ESiWACE) has published a new issue of its newsletter: learn more about upcoming virtual trainings and workshops, as well as further news.

ETP4HPC handbook 2020 released

6 November 2020

The 2020 edition of the ETP4HPC Handbook of HPC projects is available. It offers a comprehensive overview of the European HPC landscape, which currently consists of around 50 active projects and initiatives, among them the 14 Centres of Excellence and FocusCoE, which are also represented in this edition of the handbook.

>> Read here

ESiWACE success story: Improving weather and climate forecasting with a new NEMO configuration

Highlighted Centre of Excellence

ESiWACE, the “Centre of Excellence in Simulation of Weather and Climate in Europe”, has been funded by the European Commission to substantially improve the efficiency and productivity of numerical weather and climate simulation. Overall, the Centre of Excellence prepares the European weather and climate community to make use of future exascale systems in a co-design effort involving modelling groups, computer scientists and the HPC industry.

Organisations & Codes Involved:

  • The Atos Center for Excellence in Performance Programming (CEPP) provides expertise and services in High Performance Computing, Artificial Intelligence and Quantum Computing.
  • The LOCEAN laboratory, part of CNRS-IPSL, studies the physical and biogeochemical processes of the ocean and their role in climate, in interaction with marine ecosystems.
  • The CERFACS research centre is specialized in modelling and numerical simulation, through its facilities and expertise in high-performance computing.

CHALLENGE:​

A key model for weather and climate comprehension is NEMO (Nucleus for European Modelling of the Ocean), a modelling framework for research activities and forecasting services in ocean and climate sciences. NEMO is used by a wide variety of applications with global or regional focus, at different resolutions, with different numerical schemes and parameterizations, and therefore with different performance constraints. The technical challenge was to find a way to ease the profiling and benchmarking of NEMO across these versatile uses in order to increase the performance of the framework.

SOLUTION:​

In response to this challenge, and thanks to a close collaboration between the Atos CEPP, LOCEAN and CERFACS, ESiWACE has developed a new configuration adapted to HPC benchmarking: it is polymorphic and can very simply reproduce any use of NEMO, easing the profiling and benchmarking of the framework.
Many tests of this configuration have been performed, including large-scale experiments on the Atos Bull supercomputers at Météo-France and at CEA's Very Large Computing Centre (TGCC). This resulted in several optimisations that improve the performance of the NEMO code in this configuration by up to 38%.

The open-source NEMO 4.0, released in 2019, benefited from this work and included the following improvements: an automatic MPI sub-domain decomposition, and a rewrite of the communication routines with an optimised treatment of the North Pole singularity.
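The idea of an automatic sub-domain decomposition can be sketched as a search over process-grid factorizations that minimizes the halo, i.e. the cells each subdomain must exchange with its neighbours every time step. This is an illustrative simplification, not NEMO's actual algorithm (which, among other things, can also discard land-only subdomains):

```python
def best_decomposition(nproc, nx, ny):
    """Choose an i x j process grid (i * j == nproc) for an nx x ny ocean grid,
    minimizing the halo perimeter each subdomain exchanges per time step."""
    best = None
    for i in range(1, nproc + 1):
        if nproc % i != 0:
            continue                  # i must divide the process count
        j = nproc // i
        sx = -(-nx // i)              # subdomain width  (ceiling division)
        sy = -(-ny // j)              # subdomain height (ceiling division)
        halo = 2 * (sx + sy)          # cells exchanged per subdomain (unit-wide halo)
        if best is None or halo < best[0]:
            best = (halo, i, j)
    return best[1], best[2]
```

On a square grid the search recovers the intuitive near-square layout: for example, `best_decomposition(16, 1000, 1000)` returns a 4 x 4 process grid, since elongated subdomains would exchange more halo cells for the same area.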

Business impact:

Some uses of NEMO, such as weather and climate forecasting, address key challenges our society faces. Improvements in NEMO's numerical performance make it possible to refine model results, to reduce forecast uncertainties and to better predict high-impact extreme events, thus saving lives. The ocean has a huge impact on the atmosphere, so NEMO is widely used coupled with atmospheric models: for example, to simulate the ocean eddies, temperature and salinity that play a key role in forecasting cyclone intensity and trajectory. The business impacts of improving NEMO may therefore be indirect, but they are significant, as they concern everyone and all kinds of companies and entities.

This work benefits society by improving the efficiency and productivity of numerical weather and climate simulation and by preparing these simulations for future exascale systems. It fosters exchanges of expertise through direct and close collaborations, enabling researchers and industry to be more productive and leading to scientific excellence in Europe.

Benefits for further research:

  • Simplified profiling and benchmarking of NEMO at different scales to find the most relevant optimisation paths.
  • Up to 38% efficiency and scalability increase of NEMO with the optimized configuration on HPC systems.
  • Reduction in the time and cost of ocean simulations for both research and production purposes, improving weather and climate predictions and helping to protect property, interests and lives.
Image: Example of a NEMO high-resolution simulation (ORCA 1/12°) showing sea surface salinity in the Atlantic and Pacific Oceans at the equator. Source: https://www.nemo-ocean.eu/

ESiWACE newsletter October edition published

October 12th, 2020

The October edition of the ESiWACE newsletter was recently published. It features online training courses, events, news and latest publications from the ESiWACE CoE.

>> Read on Zenodo

>> Subscribe to newsletter here