Prometheus for Polish Science
On April 27, 2015, the most powerful supercomputer in the history of Poland – Prometheus – was ceremonially launched at ACC Cyfronet AGH. The new supercomputer, built by Hewlett-Packard, is one of the largest installations of its type in the world and the first in Europe based on the latest direct water cooling technology. It occupies 15 racks and covers an area of just 13 sq. m.
Prometheus is the world's third implementation of the HP Apollo 8000 platform. The first took place in the United States, the second – in Norway. In both cases, the energy efficiency of the solution was crucial to its selection. The Polish installation is not only the most recent, containing so-called ninth-generation components, but also the most efficient in the world.
Prometheus adds the following to the existing IT infrastructure:
- 1.659 PetaFlops of theoretical peak performance (1,658,880,000,000,000 floating-point operations per second),
- 1,728 servers of the HP Apollo 8000 platform, connected by a high-speed InfiniBand network with 56 Gbit/s bandwidth,
- 41,472 cores (latest-generation Intel Haswell processors),
- 216 TeraBytes (216,000,000,000,000 B) of DDR4 RAM,
- two file systems with a total capacity of 10 PB and an access speed of 180 GB/s.
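The theoretical peak figure above can be cross-checked against the core count. A minimal sketch of the arithmetic, assuming a 2.5 GHz clock and 16 double-precision FLOPs per cycle per core (typical for Haswell's two AVX2 FMA units; the article states neither the processor model nor the clock, so both are assumptions):

```python
# Sanity check of Prometheus' theoretical peak performance.
# Assumed (not stated in the article): 2.5 GHz clock, and 16
# double-precision FLOPs per cycle per core on Haswell
# (2 AVX2 FMA units x 4 doubles x 2 ops per FMA).
cores = 41_472
clock_hz = 2.5e9
flops_per_cycle = 16

peak_flops = cores * clock_hz * flops_per_cycle
print(f"{peak_flops / 1e15:.4f} PFLOPS")  # prints 1.6589 PFLOPS
```

Under these assumptions the product reproduces the quoted 1,658,880,000,000,000 operations per second exactly.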
To illustrate Prometheus' speed: matching its performance would require more than 40,000 PCs in their most powerful configuration.
Thanks to the innovative technology of direct water cooling of the processors and memory modules, Prometheus is one of the most energy-efficient computers of its class in the world. To maintain the proper coolant temperature in the Polish climate, it is enough to use so-called dry coolers, which are inexpensive to operate, instead of chilled-water generators, which consume relatively large amounts of electricity. This solution not only improves reliability but also provides much higher performance than similar systems based on classic air cooling. For Prometheus, the Power Usage Effectiveness (PUE) factor, which is 1.6 even for the best air-cooled data centers, will be 1.06: only an additional 6% of energy is needed for cooling instead of an additional 60%. These are very significant savings; in practice, the computer will use less electrical power per TeraFlops. Furthermore, liquid cooling enables an extremely high installation density of 144 compute servers in a single rack. As a result, the computational part, weighing more than thirty tons, fits in just fifteen cabinets.
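The PUE comparison above is easy to make concrete. PUE is defined as total facility power divided by IT equipment power, so the cooling overhead implied by a given PUE is (PUE - 1) times the IT load. A short sketch, using a hypothetical 1 MW IT load (the article gives no absolute power figures):

```python
# PUE = total facility power / IT equipment power.
# The 1 MW IT load below is purely illustrative; the article
# does not state Prometheus' actual power draw.
it_power_kw = 1000.0

def overhead_kw(pue: float, it_kw: float) -> float:
    """Non-IT (mostly cooling) power implied by a given PUE."""
    return (pue - 1.0) * it_kw

air_cooled = overhead_kw(1.6, it_power_kw)     # 60% overhead
water_cooled = overhead_kw(1.06, it_power_kw)  # 6% overhead
print(f"air cooling: {air_cooled:.0f} kW, direct water cooling: {water_cooled:.0f} kW")
```

At this assumed load, direct water cooling would save roughly 540 kW of continuous overhead relative to a PUE-1.6 air-cooled facility.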
To illustrate the efficiency of the adopted solution, one can compare the rack density achievable with traditional air cooling: Prometheus would then occupy much more space. Built in the technology of the 374-TeraFlops Zeus (Prometheus' predecessor), it would already require 120 racks, and in the technology of the 1.5-TeraFlops Baribal (Zeus' predecessor, recently withdrawn from use), as many as 8,000 racks.
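The rack counts quoted above can be restated as performance density per rack. A small sketch of that arithmetic (our derivation from the article's figures, not numbers the article states directly):

```python
# Implied performance density: Prometheus' theoretical peak
# divided by the number of racks each technology would need.
# Derived from the article's figures, not stated in it.
target_tflops = 1658.88  # Prometheus' theoretical peak

racks_needed = {
    "Apollo 8000 (Prometheus)": 15,
    "Zeus-era technology": 120,
    "Baribal-era technology": 8000,
}
for name, racks in racks_needed.items():
    print(f"{name}: {target_tflops / racks:.1f} TFLOPS per rack")
```

The comparison shows roughly a factor-of-eight density gain over Zeus-era racks and several hundredfold over Baribal-era ones.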
The infrastructure of the new supercomputer requires special conditions for proper operation. The entire computing system, along with the necessary accompanying elements, including a guaranteed power supply system with an additional emergency generator, was installed in the new Cyfronet building, in a computer hall fully adapted to Prometheus' operation.
Prometheus will serve scientists from various disciplines, including chemistry, biology, physics, astrophysics, the energy sector, and nanotechnology. Much more efficient processors, a greater amount of RAM, and an InfiniBand network almost 30% faster than Zeus' will allow scientists to carry out calculations on a scale impossible to achieve with the center's current resources. Calculations currently run on Zeus can be performed several times faster on Prometheus.
The new Polish supercomputer is distinguished by more than its technical aspects: eye-catching graphics adorn the front of Prometheus. The winning design was chosen through a competition organized by ACC Cyfronet AGH and is presented below.