Exascale Computing Paves the Way to New Scientific Discoveries

Rob Johnson
6 min read · Apr 22, 2019

Argonne’s Aurora system will enable a paradigm shift in research

High performance computing (HPC) systems have existed for many years. However, the push to solve bigger problems faster, across both strong- and weak-scaling workloads, drives the development of ever more advanced systems. Today, we stand on the cusp of exascale computing, the key that unlocks the door to elusive and increasingly complex scientific challenges. Argonne National Laboratory’s Aurora system will become the first exascale system in the United States. Upon its deployment in 2021, Aurora will have the prowess to handle a stunning billion-billion (10¹⁸) calculations per second.

Source: Argonne National Laboratory
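
To put that billion-billion figure in perspective, here is a quick, illustrative scale check; the world-population number is an assumption used only for the arithmetic, not a figure from the article.

```python
# Illustrative scale check only; ~7.7 billion people is an assumed figure.
# If every person on Earth performed one calculation per second, how long
# would it take to match what an exascale machine does in a single second?

exa_calculations = 1e18                  # 10^18 calculations
people = 7.7e9                           # approximate world population
seconds_per_year = 365 * 24 * 3600       # 31,536,000

years = exa_calculations / (people * seconds_per_year)
print(f"{years:.1f} years")              # prints roughly 4.1 years
```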

Trish Damkroger, vice president and general manager for the Technical Computing Initiative (TCI) at Intel, is among the world’s top advocates for exascale-level computing. Before her time at Intel, Damkroger served as acting associate director at Lawrence Livermore National Laboratory (LLNL). In that role, she oversaw the organization’s 1000-person Computation Directorate, including the High Performance Computing Center, application development, research, and information technology sectors.

Aurora represents a significant milestone for several reasons. According to Damkroger, “Aurora will advance science at an unprecedented level. With an exascale system, we will have the compute capacity to realistically simulate precision medicine, weather, climate, materials science, and so much more.”

Aurora also represents a new pinnacle for convergence. The system will provide converged infrastructure to support new workloads, and it will run converged workloads — like HPC and AI workflows — on the same hardware. In addition, Aurora offers the ability to combine first-principles HPC models with AI data models, helping accelerate discovery and transformation.

Rick Stevens, associate laboratory director for the computing, environmental and life sciences directorate at Argonne National Laboratory, describes how converged approaches address novel problems. “What excites me most about exascale systems like Aurora is the fact that we now have, in one platform and one environment, the ability to mix simulation and artificial intelligence,” he said. “This idea of mixing simulation and data-intensive science will give us an unprecedented capability, and open doors in research which were inaccessible before.” For Stevens, a familiar example resides in his cancer research endeavors. He added, “We need the capability to predict what a complex cancer cell is going to do when exposed to a drug. To do that, we must acquire more high-quality data to gain a greater understanding of the biology behind the process. Therefore, the machine learning methods must integrate many, many sources of data to overcome that hurdle. Of course, we also need more testing to hone our predictive models and move that capability from the laboratory environment to a place where it can be used to help patients. It’s a big challenge, but we are making good progress.”
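
As an illustration of what mixing simulation and AI can look like in practice, the minimal Python sketch below shows one common pattern: train a cheap learned surrogate on a handful of expensive simulation runs, use it to screen many candidates, and confirm only the most promising one with the full simulation. The names here (run_simulation, Surrogate) are hypothetical stand-ins, not part of Aurora’s software stack or Stevens’ cancer models.

```python
# Minimal, illustrative sketch of a "converged" HPC + AI workflow:
# a learned surrogate screens candidates, the expensive simulation confirms.
import numpy as np

def run_simulation(x):
    """Stand-in for an expensive first-principles simulation."""
    return np.sin(3 * x) + 0.5 * x**2   # toy physics

class Surrogate:
    """Tiny polynomial surrogate fit to past simulation results."""
    def __init__(self, degree=4):
        self.degree = degree
        self.coeffs = None

    def fit(self, xs, ys):
        self.coeffs = np.polyfit(xs, ys, self.degree)

    def predict(self, x):
        return np.polyval(self.coeffs, x)

# 1. Run a handful of real simulations to generate training data.
xs = np.linspace(-2, 2, 20)
ys = np.array([run_simulation(x) for x in xs])

# 2. Train the AI/data model on those results.
model = Surrogate()
model.fit(xs, ys)

# 3. Cheaply screen many candidates with the surrogate, then confirm
#    only the best one with the full simulation.
candidates = np.random.uniform(-2, 2, 10_000)
best = candidates[np.argmin(model.predict(candidates))]
print("surrogate pick:", best, "confirmed value:", run_simulation(best))
```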

A third major driver for Aurora, according to Damkroger, is the race to exascale taking place around the world. “In our work, we embrace the slogan, to out-compute is to out-compete,” said Damkroger. “In other words, we recognize the power of exascale computing, and its ability to advance national economic competitiveness as well as scientific discovery. All of these factors make projects like Aurora critical to our future.”

Exascale in action

Already, researchers around the nation are preparing for exascale computing through the Argonne Leadership Computing Facility’s Aurora Early Science Program. Each year, Argonne National Laboratory’s billion-dollar budget supports about 3,500 researchers. Stevens described the unique capability his organization enables: “Unlike many university research environments, which face budget and staffing constraints, Argonne’s resources make it possible for us to dedicate larger teams and technologies to tackle extremely difficult, long-term scientific problems. As a result, we can support breakthrough science every day,” he said.

Dr. William Tang, principal research physicist at the Princeton Plasma Physics Laboratory, is among those scientists vying for exascale compute cycles to further his groundbreaking work on magnetic confinement systems for fusion reactors. “Great minds like Stephen Hawking saw fusion as important for the future energy needs of humanity. Of course, fusion happens in nature. However, creating it in an earthly environment is a grand challenge,” noted Tang. “Climate change represents a major challenge for our planet. Reducing or eliminating carbon emissions is not only urgent; it is critical. The energy of the future comes from clean and safe fusion. We face major challenges in making that transition. However, today, it is an achievable goal thanks to exascale computing, the emergence of AI, and deep learning.”

Dr. Nicola Ferrier, a senior computer scientist at Argonne National Laboratory, is another researcher seeking to tackle previously impossible endeavors. She and her colleagues around the country intend to model the human brain’s neural structure. To date, a significant hurdle for their research lies in the massive data sets involved in modeling something as complex as the brain. Ferrier characterized the scope of the problem: “The big challenge we face is not just obtaining data but managing the sheer volume of it. For example, one cubic centimeter of brain tissue may sound tiny, but analysis of the imagery from that small sample can generate petabytes of data. If we want to understand the scope of a human brain, exascale-capable computing is mandatory.”
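
To give a sense of that scale, here is a rough back-of-envelope calculation; the imaging resolution and bytes-per-voxel are assumed, illustrative values rather than figures from Ferrier’s project.

```python
# Back-of-envelope estimate only: why imaging one cubic centimeter of brain
# tissue reaches petabyte scale. Voxel size and bytes-per-voxel are assumed,
# illustrative values, not parameters from the Argonne project.

NM_PER_CM = 1e7                          # 1 cm = 10,000,000 nm
voxel_nm = (10, 10, 50)                  # assumed imaging resolution (x, y, z) in nm
bytes_per_voxel = 1                      # assumed 8-bit grayscale

sample_volume_nm3 = NM_PER_CM ** 3       # one cubic centimeter in cubic nanometers
voxel_volume_nm3 = voxel_nm[0] * voxel_nm[1] * voxel_nm[2]

n_voxels = sample_volume_nm3 / voxel_volume_nm3
raw_bytes = n_voxels * bytes_per_voxel

print(f"voxels: {n_voxels:.1e}")                         # ~2.0e+17
print(f"raw imagery: {raw_bytes / 1e15:.0f} petabytes")  # ~200 PB
```

Even under these assumptions, the raw imagery alone runs to hundreds of petabytes before any analysis or derived data products are added.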

A team effort

Given the multi-layered importance of exascale computing, the U.S. government awarded the team at Argonne a $500 million contract to build Aurora and bring it online. “Building and deploying an exascale system requires an extensive partnership of HPC technologists. The herculean effort of making Aurora a reality involves the Department of Energy in addition to the scientists at Argonne,” noted Damkroger. “Cray’s expertise is central to Aurora’s success, too. Cray’s flagship system platform, code-named Shasta, will serve as Aurora’s foundation, including the company’s newest interconnect technology, code-named Slingshot. Cray will also offer some onsite support as Aurora goes live.”

Damkroger also outlined Intel’s role in the endeavor. “Our team at Intel assists the Argonne team with Aurora’s design, development, and forthcoming deployment. Intel engineers and architects have a chance to push the boundaries of innovation thanks to new technologies like future Intel Xeon Scalable processors, Intel Optane DC Persistent Memory, and the Intel Xe architecture.”

Addressing grand challenges

Of course, a system of this magnitude requires specialized applications optimized specifically for exascale-level workloads. According to Damkroger, hundreds of developers are already using existing, pre-exascale compute systems to design new applications that make the most of Aurora’s architecture. Upon Aurora’s rollout in 2021, the team behind it anticipates many users lining up to use the technology.

“Aurora will have a broad impact on the full HPC ecosystem,” Damkroger noted. “The Aurora system is breaking down the boundaries between simulation-based computational science, data-intensive computing, and both deep learning and machine learning. This type of convergence will empower us to tackle enormous tasks like seeking cures for cancer, reducing world hunger, and ensuring future generations have clean drinking water.”

Expanding on this sentiment, Tang added, “The advanced exascale systems of tomorrow and the new insights derived from them will empower us to do even more amazing things in the years ahead. AI has been around for a while, but the accelerated development of neural nets and other methodologies enabled by exascale computers empower us to make more impactful use of it. Our work is both intellectually stimulating and exciting because we have an opportunity to do something which can benefit the world.”

“Steady investments in this area will also encourage the best and brightest young people to join and dedicate their intellect to AI and deep learning methodologies to solve grand challenges,” Tang concluded.

With great optimism, Damkroger echoed Tang’s thought, adding, “My daughter has a keen interest in computer engineering. Exascale computing systems like Aurora give her generation an opportunity — and new tools — to solve the truly grand challenges facing the world today. We live in an inspiring time in computing history.”

###

This article was produced as part of Intel’s HPC editorial program, with the goal of highlighting cutting-edge science, research and innovation driven by the HPC community through advanced technology. The publisher of the content has final editing rights and determines what articles are published.


Rob Johnson

Marketing & public relations consultant, freelance writer, technology & gadget geek, TONEAudio gear reviewer, hopeful artist, foodie, and fan of the outdoors.