What is Exascale Computing?

We’ve covered what supercomputers are in this video, and they are indeed super. The fastest machines today are in the realm of petascale computing: more than a quadrillion computing operations per second. The most powerful computer in the world today is Summit, the supercomputer at Oak Ridge National Lab. It’s followed closely by Lawrence Livermore’s Sierra supercomputer. But even today’s most powerful high-performance machines can’t keep up with the demands of the increasingly complex simulations required for national security, precision medicine, climate research, and a host of other areas of basic science.

So, what’s next? Some computing experts have called improving computing performance on this huge scale the ‘space race of this century,’ because after petascale computing comes exascale computing. If petascale is a quadrillion calculations per second, exascale is a quintillion operations per second. That’s a 1 followed by 18 zeroes: 10 to the 18th. And by the way, that’s why Exascale Day is on October 18th! Get it? It’s a small word for an almost unimaginably big number. If we went back in time a quintillion seconds? That’s about 32 billion years ago, 18 billion years before the Big Bang.
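That “32 billion years” figure is easy to sanity-check. A quick back-of-the-envelope sketch in Python, using an average year of about 31.56 million seconds:

```python
# How long is a quintillion (10**18) seconds, expressed in years?
QUINTILLION = 10**18
SECONDS_PER_YEAR = 365.25 * 24 * 60 * 60  # ~31.56 million seconds

years = QUINTILLION / SECONDS_PER_YEAR
print(f"{years / 1e9:.1f} billion years")  # roughly 31.7 billion years
```

Subtract the roughly 13.8-billion-year age of the universe and you land about 18 billion years before the Big Bang, just as the text says.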
A quintillion is about the number of neurons in ten million human brains. And if you somehow linked the top 100 fastest and most powerful supercomputers in the world right now and had them all work together on a single problem, you still wouldn’t reach the computing power of one exascale system.

If we look back at the progress we’ve made in improving computing power, we’ve made these strides because we’ve been able to pack more and more computing power into smaller and smaller spaces. That improvement held steady for many years, but we’re starting to see it slow down. We can’t get the transistors any smaller, and moving data around within a computing system that large takes a great deal of energy. Huge leaps in computing also require leaps in the accompanying infrastructure. Interdisciplinary innovation has resulted in improvements to the architecture, to data storage and management, and to the kinds of software that can run on these machines. This and much more is what has made exascale computing a reality.

The nation’s first exascale system, Aurora, is slated for arrival at Argonne National Lab in 2021, followed later that year by Frontier at Oak Ridge National Lab, and El Capitan at Lawrence Livermore in 2022, projected to perform at more than 1.5 exaflops. That’s one-and-a-half quintillion operations per second.

But why do we need this huge leap in computing power? At a national lab like Lawrence Livermore, scientists rely on computer modeling and simulations to solve problems that are far beyond the scope of any human. And the universe is in 3D: the ability to incorporate additional physics into 3D simulations and perform them regularly means that the information scientists get from these models will be closer than ever before to what happens in the real world. El Capitan will allow scientists and physicists to perform larger and higher-resolution 3D simulations than ever before, and fast. Simulations that might take a week to perform on Sierra, Livermore’s existing petascale computer, could be completed in a single workday on El Capitan.

El Capitan’s primary mission will be supporting America’s national security, ensuring the safety, security, and effectiveness of the nation’s nuclear stockpile in the absence of underground testing. Across the Department of Energy, exascale computing will help scientists answer some of the biggest and most fundamental questions humanity has. It will allow for faster, more detailed models in areas of basic science such as predicting the effects of climate change, simulating how drug molecules interact with the body in order to create more effective medications, optimizing designs for 3D printing, studying earthquakes, or mapping how the 100 billion neurons in the human brain connect and function.

Throughout history we’ve pushed the boundaries of humanity’s ability to understand and ask questions of our universe, and exascale computing is simply the next step in better, faster science. And what’s next, beyond exascale computing, beyond the boundary of Moore’s Law, into the realm of fields like quantum computing? Well, we can only imagine.
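The week-to-workday comparison above is consistent with the raw numbers. A rough sketch, assuming Sierra’s widely reported peak of about 125 petaflops and El Capitan’s projected 1.5 exaflops (real speedups depend on memory, interconnect, and software, not peak flops alone):

```python
# Hedged back-of-the-envelope comparison, not an official benchmark.
SIERRA_PEAK_FLOPS = 125e15   # ~125 petaflops, Sierra's reported peak (assumption)
EL_CAPITAN_FLOPS = 1.5e18    # >1.5 exaflops, projected (from the article)

# Ratio of peak computing rates:
raw_ratio = EL_CAPITAN_FLOPS / SIERRA_PEAK_FLOPS
print(f"peak flops ratio: {raw_ratio:.0f}x")  # 12x

# Squeezing a week of compute into one 8-hour workday:
week_hours, workday_hours = 7 * 24, 8
print(f"week -> workday: {week_hours / workday_hours:.0f}x")  # 21x
```

The two ratios are the same order of magnitude, which is about as much as a peak-flops estimate can tell you.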

Daniel Ostrander

5 thoughts on “What is Exascale Computing?”

  1. uNstructured U. ndersTanding says:

    All so we can make better nuclear weapons, more nuclear weapons.
    Ignore the waste problem and mining pollution and ungodly profits. All good

  2. Tony Danis says:

    All these video, publications from LLN and other government labs are just pure glossy Madison Avenue ads, they tell you virtually nothing but are slick, colorful and have a sexy woman's voice in the background.
    This isn't what science is about.

  3. Merle Patterson says:

    "What is Exascale Computing?" It's a system that can find the Wall St crooks in a millisecond but isn't tasked to do so for some reason?

  4. jmalmsten says:

    So… Can it compute the ultimate question about Life, the Universe and Everything?

    – Question courtesy of Amalgamated Union of Philosophers, Sages, Luminaries and Other Thinking Persons ( aka AUPSLOTP for short)
