“The Other Side of Words”: Exascale

Denis Trystram, University of Grenoble Alpes (UGA)

Exascale is the current culmination of a long process of computing development. A computer’s performance is measured by the number of “floating-point operations” (multiplications and/or additions) it can perform per second, more precisely on real numbers encoded in 64 bits. The unit is written Flop/s.
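To make the unit concrete, here is a minimal sketch (in Python with NumPy, chosen for this illustration and not mentioned in the article) that times a dense matrix multiplication on 64-bit floats and estimates the Flop/s a machine actually achieves:

```python
import time
import numpy as np

n = 1024
a = np.random.rand(n, n)  # 64-bit floating-point numbers, as in the article
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

# A dense n x n matrix multiplication performs about 2 * n**3 floating-point
# operations (n multiplications and n - 1 additions per output element).
flops = 2 * n**3 / elapsed
print(f"~{flops / 1e9:.1f} GFlop/s achieved on this machine")
```

On a laptop this typically lands in the tens of GFlop/s, which gives a sense of the nine orders of magnitude separating a personal computer from an exascale system.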

On our mobile phones and personal computers, we count billions of operations per second (giga), enough to browse the Internet quickly or watch videos interactively. On the largest machines, the count reaches a billion billion. Giga is the third power of a thousand (1000³, or 10⁹); then come, in order, tera, peta and exa. These International System of Units prefixes are deformations of the Greek tetra (four), penta (five) and hexa (six), so they are quite easy to remember. Exascale therefore corresponds to the scale of supercomputers whose computing capacity exceeds 10¹⁸ Flop/s.

High-performance computing (HPC) aims to run scientific applications, such as nuclear simulation or weather forecasting, as quickly as possible. The need for computing power beyond human capability emerged during the Second World War and has grown ever since with the massive digitization of society and with globalization.

The history of high-performance computing has tracked the major technological milestones, from the invention of integrated circuits in the early 1960s to recent 5-nanometre process technologies. Vector processors emerged in the 1970s; they were designed around a reduced instruction set that operated efficiently on large collections of homogeneous data. Since the 1980s, we have witnessed the development of parallelism, especially with massively parallel multiprocessors: large numbers of computing units connected by sophisticated interconnection networks.

In the mid-1990s, another type of supercomputer appeared, built by assembling general-purpose components. Taken to an extreme, this idea gave rise to peer-to-peer computing, where anyone could volunteer their own Internet-connected computer to contribute to an ambitious scientific experiment, such as protein folding. The term accelerator (a specialized co-processor that is more efficient for a certain type of operation) appeared around 2000, and with the rise of deep learning in 2013, accelerators took on a fundamental role, led by GPUs (graphics processing units).

Today, all the latest consumer processors contain several dedicated units. GPUs have evolved, and their massive parallelism makes them very efficient at matrix and tensor computations; other chips, such as Google’s TPUs, are specialized for machine learning.

The exascale milestone was reached in 2022, when the Frontier supercomputer topped the Top500, a ranking of the 500 most powerful systems in the world. The evaluation is carried out after carefully tuning each supercomputer for classical linear-algebra benchmarks. Not all of the most powerful HPC systems appear in the Top500: large companies and the military operate machines that are not listed.
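The linear-algebra benchmark behind the Top500 ranking solves a large dense system of equations. A toy version of that measurement (a sketch in Python with NumPy; the real benchmark, HPL, runs at vastly larger sizes across thousands of nodes) looks like this:

```python
import time
import numpy as np

# Solve a dense linear system A x = b, the operation at the heart of
# the benchmark used to rank supercomputers.
n = 2000
a = np.random.rand(n, n)
b = np.random.rand(n)

start = time.perf_counter()
x = np.linalg.solve(a, b)
elapsed = time.perf_counter() - start

# An LU-factorization-based solve costs roughly (2/3) * n**3
# floating-point operations.
flops = (2 / 3) * n**3 / elapsed
print(f"~{flops / 1e9:.2f} GFlop/s")
```

The ranking score is simply this achieved rate, measured after the machine and problem size have been tuned as carefully as possible.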

Like the conquest of space, ranking first in the Top500 is clearly a geopolitical issue, spearheaded by the United States to assert technological supremacy and sovereignty. France and Europe have programs to build their own exascale systems.

The energy consumption of large HPC platforms has always been an issue, but today it is becoming crucial in the fight against the carbon emissions driving the climate crisis. Some believe that the race for computing power can help solve the current environmental challenges; more and more voices warn against a headlong rush that contributes to uncontrolled acceleration, and advocate slowing down, especially since very few applications really require so much power. While these two opposing positions are debated, politicians and technophiles are already considering the next step: zettascale (the seventh power of 1000, the unit just after exascale, i.e. 10²¹, or 1,000,000,000,000,000,000,000 Flop/s).


This article is part of the series “The Other Side of Words”, dedicated to how our vocabulary expands and adapts as social issues emerge and new challenges confront science and technology. Concepts that we thought we already knew well are enriched with new meanings, and recently created words enter the dictionary. Where do they come from? How do they enable us to fully understand the nuances of a changing world?

From ‘validism’ to ‘silence’, from ‘bifurcation’ to ‘degender’, our researchers focus on these neologisms to help us better understand them and therefore better engage in public debate.


Denis Trystram, university professor of computer science, University of Grenoble Alpes (UGA)

This article is republished from The Conversation under a Creative Commons license. Read the original article.

