Lawrence Livermore’s El Capitan supercomputer is officially fastest in the world

Garry McLeod/LLNL

El Capitan’s data processing abilities represent a major advancement in scientific research, particularly for managing the nuclear stockpile.

The El Capitan supercomputer, housed at Lawrence Livermore National Laboratory in California, was officially named the fastest supercomputer in the world. The system has a theoretical peak of 2.7 exaflops and performed 1.742 quintillion calculations per second (1.742 exaflops) on its benchmark run, a roughly 20-fold increase over the lab's previous flagship system, Sierra.

Announced in a press call on Sunday, El Capitan's verification was the result of a collaboration among the National Nuclear Security Administration, Lawrence Livermore National Laboratory, Hewlett Packard Enterprise and chipmaker AMD.

“This level of computational power allows us to solve ensembles of problems on El Capitan in hours or days that could take weeks or even months to execute on current systems,” Rob Neely, the director of Lawrence Livermore's Advanced Simulation and Computing Program, said on Sunday.

Given its computing performance, which is equivalent to conducting one calculation every second for 54 billion years, El Capitan will serve as the NNSA's first exascale computer and help manage the country's nuclear stockpile. The hardware combines CPU and GPU cores and is well suited to running artificial intelligence applications for modeling and simulation in both scientific and national security domains.
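The "54 billion years" comparison can be sanity-checked with a quick back-of-the-envelope calculation (a sketch; the published figure depends on how the exaflops number is rounded):

```python
# Back-of-the-envelope check of the "one calculation every second" comparison.
# 1.742 exaflops = 1.742 quintillion calculations per second.
calcs_per_second = 1.742e18

seconds_per_year = 365.25 * 24 * 3600  # about 31.6 million seconds

# At one calculation per second, working through what El Capitan does
# in a single second would take this many years:
years = calcs_per_second / seconds_per_year
print(f"{years / 1e9:.0f} billion years")  # roughly 55 billion years
```

The result lands in the mid-50-billions of years, consistent in scale with the article's 54 billion figure, which presumably reflects a slightly different rounding of the machine's speed.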

“El Capitan is more than just a machine, as NNSA's first exascale computer,” Corey Hinderstein, the acting principal deputy administrator of NNSA, said in the press call. “It represents a pivotal next step in our commitment to ensuring the safety, security and reliability of our nation's nuclear stockpile without the need to resume underground nuclear testing. With El Capitan, we're entering a new era of predictive simulation and analysis.”

Hinderstein said that, historically, NNSA has relied on other high-performance computing systems to simulate nuclear weapons tests, and that El Capitan's advanced computational capability “significantly improves” the agency's ability to model and gauge exactly how these weapons will perform and to anticipate potential modifications.

“These advancements are vital for supporting stockpile modernization and our concurrent life extension programs, which are critical to ensuring our nation's nuclear deterrent remains strong as the stockpile ages and as we introduce new systems,” she said. 

Neely added that the precise, high-resolution 3D modeling and simulation that El Capitan offers wasn't possible before, giving the lab more accurate predictive capabilities to inform decision-making.

“We expect El Capitan to make those hero runs of yesterday more commonplace, allowing us to analyze components of the stockpile in great detail and with more precision than ever before,” Neely said. “This machine's power will enable us to incorporate various real-world factors, such as materials, manufacturing imperfections, environmental conditions, abnormal and hostile environments.”

Beyond working with the U.S. stockpile, Neely said that El Capitan will be applied to other disciplines that national labs oversee, like materials science, climate modeling, computational biology and more.