Rob Johnson
6 min read · Jan 21, 2020

--

Aurora Supercomputer (Image courtesy of Argonne National Laboratory)

Aurora Supercomputer Will Open New Frontiers for Science and Industry

Believe it or not, supercomputers are changing your life, today

Most people don’t realize it, but supercomputing, more commonly known in science as high-performance computing or HPC, is changing how engineers innovate and researchers make breakthroughs. Using HPC, scientists explore the universe, improve healthcare, design safer vehicles, advance clean energy technologies, and so much more.

The term “supercomputer” conjures mental images of a basketball-court-sized, big-iron machine. Indeed, the largest HPC systems in the world do occupy a lot of space. However, an HPC system is not just one giant machine. It is assembled by connecting many smaller computers known as nodes.

The combined capability of all the nodes gives an HPC system the needed power to accomplish more complex tasks than any single machine could. It’s a bit like putting together hundreds or thousands of cardboard puzzle pieces to see the big picture.

Aurora

The U.S. Department of Energy’s (DOE) Argonne National Laboratory, based in Illinois, is currently planning for the deployment of a new supercomputer, named Aurora. When the system comes online in 2021, Aurora is expected to be among the fastest computing systems in the world.

An HPC system’s performance is often defined by the number of calculations-per-second it can perform. Computer experts refer to “calculations-per-second” with a more specific term: Floating Point Operations Per Second, or FLOPS.

Even a billion FLOPS is tough to comprehend. If someone could stack a billion pennies in a second, the tower would extend about 900 miles. Laid on its side, that same stack would reach roughly from New York to Atlanta, Georgia.

With Aurora, though, the FLOPS target is much higher. Argonne is working with developers Intel and Cray to create a system capable of a billion-billion (a quintillion) FLOPS. Think of it this way: A quintillion pennies laid flat would blanket the Earth — twice!

A supercomputer capable of performing a quintillion FLOPS earns a unique moniker: an exascale system.
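For the curious, the penny comparisons above are easy to sanity-check yourself. A quick back-of-the-envelope sketch, assuming a US penny is about 1.52 mm thick:

```python
# Back-of-the-envelope check of the penny-stack comparison.
PENNY_THICKNESS_MM = 1.52      # assumed thickness of a US penny
MM_PER_MILE = 1.609_344e6      # millimeters in one mile

giga = 10**9    # a billion FLOPS  = gigascale
exa = 10**18    # a quintillion FLOPS = exascale

# Height of a stack of one billion pennies, in miles.
stack_miles = giga * PENNY_THICKNESS_MM / MM_PER_MILE
print(f"A billion pennies stack to about {stack_miles:.0f} miles")  # ~944 miles

# An exascale system does a billion times the work of a gigascale one.
print(f"Exascale is {exa // giga:,}x gigascale")
```

In other words, Aurora’s target is not a little faster than a billion-FLOPS machine; it is a full factor of a billion faster.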

Reflecting on the Aurora exascale project, Rick Stevens, Associate Laboratory Director for Argonne’s Computing, Environment and Life Sciences directorate, shared his insights. “We chose the name Aurora to represent our goal to create something which in some sense can illuminate the world. Argonne’s resources make it possible for us to tackle extremely complicated, long-term scientific problems. As a result, we can support breakthrough science every day.”

Why do scientists need that much computing power?

Without the aid of HPC, some scientific problems are so vast and involve so much data that a young researcher could reach retirement age before obtaining the answers they need. Modern HPC is changing all that.

Even though Aurora’s deployment is over a year away, many researchers and engineers are already preparing to use the system. To ready the supercomputer for science, the Argonne Leadership Computing Facility formed the Aurora Early Science Program (ESP) to help researchers prepare the software they wish to use on the machine. The ESP provides research teams with early access to the new architecture to optimize their codes for Aurora, while also paving the way for other researchers to run on the system as soon as it’s available.

After a rigorous evaluation process of ESP applications, 15 projects that span a wide range of scientific areas and computing methods were awarded pre-production computing time on Aurora. Most of those projects are so complicated they have outgrown the power of HPC systems available today. Therefore, Aurora will help lead the charge into a new era of science where “impossible” compute-intensive scientific endeavors become a reality.

Developing safe, clean fusion reactors

Aurora will enable more realistic simulations and visualizations of previously intractable problems, such as modeling the universe. It will also enable new approaches to energy production.

“Reducing or eliminating carbon emissions is not only urgent; it is critical. The energy of the future comes from clean and safe fusion. Today, it is an achievable goal thanks to exascale computing, the emergence of AI, and deep learning,” said Dr. William Tang, Principal Research Physicist at the Princeton Plasma Physics Laboratory and principal investigator of an Aurora ESP project.

Fusion is the type of power our sun produces. Because temperatures created by fusion reach into the tens of millions of degrees, physical materials cannot take the heat — literally. Dr. Tang’s research centers on a fusion reaction chamber that uses magnetic confinement to hold the superheated plasma away from the chamber walls, rather than the fission methods in use today.

Magnetic confinement model for nuclear fusion (Image courtesy Dr. William Tang)

The reactor would use deuterium, a hydrogen isotope extracted from seawater, together with tritium. Because tritium’s mild radioactivity is short-lived compared with that of uranium, the reactor design avoids the long-term environmental contamination risks associated with uranium-based reactors. Additionally, the magnetically confined reaction chamber would feed on only a minute’s worth of fuel at a time. Without the addition of more fuel, the reactor shuts itself down.

When Aurora comes online, Dr. Tang will leverage artificial intelligence techniques to advance fusion research efforts aimed at achieving a cleaner, safer approach to nuclear energy generation. If successful, the approach could provide a virtually unlimited energy source.

Seeking more effective treatments for cancer

Dr. Amanda Randles, the Alfred Winborne Mordecai and Victoria Stover Mordecai Assistant Professor of Biomedical Engineering at Duke University, and her colleagues developed a way to simulate blood flow throughout the human body.

The code, named HARVEY, predicts the travel of all the blood cells moving through the body simultaneously. The task requires enormous computing power, since each person’s circulatory system includes about 10,000 miles of plumbing — roughly the distance of a round-trip flight between Seattle and Tokyo!

Today, HARVEY’s capability could help surgeons identify the best placement for an artery shunt or visualize how plaque buildup within arteries impacts a patient’s circulatory health.

HARVEY simulation of blood cells and vessels (Image courtesy of Dr. Amanda Randles)

Dr. Randles and her colleagues now seek to repurpose HARVEY to understand the process of metastasis in cancer. By predicting where wayward cancer cells might lodge in the body, HARVEY can help doctors anticipate where to look for secondary tumors.

The system must track cancer cells flowing through the bloodstream and predict how they behave when they collide with red blood cells or vessel walls. Aurora will offer the computing speed necessary to do it.

Designing more fuel-efficient aircraft

Kenneth Jansen, Professor of Aerospace Engineering at the University of Colorado Boulder, seeks to use Aurora’s capabilities to help develop safer and more fuel-efficient airplanes.

“As any airline passenger knows, air turbulence can vary greatly throughout a flight. Sometimes you barely notice it, and other times, well, it’s quite bumpy,” Jansen chuckled. “The variability of turbulence makes it difficult to simulate an entire aircraft’s interaction with it. Each second, different parts of a plane experience different impacts from the airflow. We need to evaluate data in real time as the simulation progresses.”

Jansen’s computational fluid dynamics (CFD) research models and predicts fluid flow around aerospace vehicles to allow engineers to design more fuel-efficient planes. (Image courtesy Ken Jansen, University of Colorado Boulder, and Argonne National Laboratory)

Argonne’s current supercomputer, Theta, has contributed to Jansen’s work, but it can only simulate a plane that is one-nineteenth its actual size, flying at a quarter of its real-world speed. Noted Jansen, “Aurora will help us learn more about the fundamental physics involved with a full-sized, full-speed aircraft simulation. We can then identify where design improvements can make an important difference for in-flight characteristics.”

Technologies under the hood

Built by Cray, a Hewlett Packard Enterprise company, Aurora will draw its performance from future generations of Intel HPC- and AI-ready technologies, including Intel Xeon processors, Intel Optane DC persistent memory, and Intel Xe architecture.

Commented Jansen, “Aurora would not be possible without the support of companies like Cray and Intel. Aurora will advance many scientific projects, including my own. With a tool that powerful, my team has new opportunities to make meaningful contributions to aircraft manufacturing and the environment too.”

A bright future ahead

2021 will be an exciting year for science. “When exascale machines first turn on, we anticipate it’s going to be a crowded dance floor,” added Argonne’s Stevens. “Exascale computing is an incredibly powerful tool that will help us in almost unimaginable ways. Our team is proud to help lead the way into next-generation computing.”

Rob Johnson

Marketing & public relations consultant, freelance writer, technology & gadget geek, TONEAudio gear reviewer, hopeful artist, foodie, and fan of the outdoors.