Forschungszentrum Jülich has expanded its modular supercomputer »Jureca«. It now performs 23.5 quadrillion computing operations per second – 23.5 petaflops for short. One focus of the system is on processing large amounts of data.
The new computing power comes from the newly installed »Jureca-DC« module – DC stands for »data-centric«. The module was supplied by the French company Atos and runs with system software from the German company ParTec.
With its 18.5 petaflops, the new module alone currently ranks among the 30 fastest supercomputers in the world. In total, Jureca consists of two modules: a universally usable cluster module and a booster module for special codes and compute-intensive program parts that can be computed very efficiently in parallel on many compute cores. In recent months, the cluster module has been replaced by the new Jureca-DC module. The booster module was installed in 2017 and remains in operation.
The Jureca-DC module became fully operational in May. It is based on Atos' Sequana XH2000 series and comprises a total of 768 compute nodes, each equipped with two 64-core AMD EPYC Rome CPUs and 512 GB to 1 TB of main memory. 192 of the 768 nodes are additionally fitted with four Nvidia A100 GPUs each. Data-intensive applications can be accelerated with Atos' flash memory system. In a joint project, Forschungszentrum Jülich, Atos and ParTec are working to further optimize the system step by step during its early production phase.
In addition, the »Just-IME« flash memory system from Hewlett Packard Enterprise and DDN was installed as storage from which several Jülich supercomputers benefit. It is specially designed for fast data read-in and read-out, which also accelerates data analysis and machine learning. The system is networked with the Juwels, Jureca and Jusuf supercomputers. Together, they achieve a bandwidth of more than 2 TB/s – roughly double that of the TAT-14 transatlantic cable, which has linked the U.S. and Germany since 2001.
The new capabilities are being used in climate research, for example, where the volume of data to be processed is constantly growing. With the help of data assimilation, increasingly accurate predictions of climate change can be derived; this involves linking data from computer simulations with real measurement data. Another example is weather forecasting with deep learning: such models use patterns extracted from very large data sets to improve the prediction of locally occurring thunderstorms and heavy rainfall.
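The core idea behind data assimilation – weighting a simulation forecast against a real measurement according to how uncertain each is – can be illustrated with a minimal sketch. This is not the method used at Jülich, just a textbook scalar Kalman-style update with made-up illustrative numbers:

```python
# Minimal sketch of one data-assimilation step (scalar Kalman-style update):
# a model forecast is combined with a real measurement, weighted by the
# uncertainty (variance) of each. All values below are illustrative only.

def assimilate(forecast, forecast_var, measurement, measurement_var):
    """Return the updated estimate ("analysis") and its variance."""
    # Kalman gain: how much to trust the measurement relative to the forecast
    gain = forecast_var / (forecast_var + measurement_var)
    analysis = forecast + gain * (measurement - forecast)
    analysis_var = (1.0 - gain) * forecast_var
    return analysis, analysis_var

# Example: the model predicts 21.0 °C (variance 4.0), while a weather
# station measures 19.0 °C (variance 1.0). The analysis ends up closer
# to the more certain measurement, and its variance shrinks.
analysis, analysis_var = assimilate(21.0, 4.0, 19.0, 1.0)
print(analysis, analysis_var)  # → 19.4 0.8
```

Repeating such updates as new observations arrive is what lets simulations track the real climate system more and more closely – at the cost of the large data volumes the article describes.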