Scientists have high hopes for the world’s fastest computer, which is now up and running at Oak Ridge National Laboratory in Tennessee after its June 8 launch.
A $200-million, water-cooled monster that covers an area the size of two tennis courts, the computer, dubbed “Summit,” has been clocked at handling 200 quadrillion calculations a second (or 200 petaflops). That's more than twice as fast as the previous record-holder, China’s 93-petaflop Sunway TaihuLight, and so fast that it would take every person on Earth doing one calculation a second for 305 days to do what Summit can do in a single second.
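The "305 days" comparison is easy to verify with back-of-envelope arithmetic. The sketch below assumes a 2018 world population of roughly 7.6 billion, which is not stated in the article:

```python
# Back-of-envelope check of the comparison above.
# The world population figure is an assumption (~7.6 billion in 2018).
SUMMIT_CALCS_PER_SEC = 200e15   # 200 petaflops = 200 quadrillion calculations/second
WORLD_POPULATION = 7.6e9        # assumed 2018 population
SECONDS_PER_DAY = 86_400

# Seconds the whole population would need, at one calculation per
# second each, to match one second of Summit's work:
seconds_needed = SUMMIT_CALCS_PER_SEC / WORLD_POPULATION
days_needed = seconds_needed / SECONDS_PER_DAY

print(f"{days_needed:.0f} days")  # roughly 305 days
```

With a slightly different population estimate the answer shifts by a few days, which is why such comparisons are usually rounded.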
Summit gives the U.S. bragging rights. More important, it gives scientists a new tool to conduct research that is all but impossible to do with other supercomputers.
"My hopes are that we are able to attract the world's best scientists to work on their dream problems,” says Jack Wells, director of science at the Oak Ridge Leadership Computing Facility. “To open up venues people thought were not possible in this time frame, to solve problems that people thought were 20 years away in the next five years."
Roughly 10 percent of people who take an opioid painkiller slip into addiction, says Dan Jacobson, a computational biologist at the laboratory. “That screams genetics to us — otherwise everybody would get addicted,” he says. He and his team plan to use Summit to compare the genetic profiles of 600,000 individuals — in this case U.S. military veterans — against clinical records showing whether they were prescribed opioids and if they became addicted.
If there is a genetic basis for opioid addiction, Jacobson says, it’s likely the result of a complex interplay among multiple genes. So Jacobson created computer code that maps genes into a giant network to help study those interactions.
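The article doesn't describe Jacobson's code, but the basic idea of mapping genes into an interaction network can be sketched in a few lines. In this illustration (all gene names, scores, and the threshold are made up for the example), genes become nodes and an edge connects any pair whose interaction score clears a cutoff:

```python
# A minimal sketch, NOT Jacobson's actual code: build a gene-interaction
# network as an adjacency list. The pairwise scores below are invented
# purely for illustration.
interaction_scores = {
    ("OPRM1", "COMT"): 0.82,  # hypothetical interaction strengths
    ("OPRM1", "BDNF"): 0.35,
    ("COMT", "BDNF"): 0.67,
}
THRESHOLD = 0.5  # keep only strong interactions (arbitrary cutoff)

network = {}
for (gene_a, gene_b), score in interaction_scores.items():
    if score >= THRESHOLD:
        # Edges are undirected: record the link in both directions.
        network.setdefault(gene_a, set()).add(gene_b)
        network.setdefault(gene_b, set()).add(gene_a)

print(network)
```

At the scale Jacobson describes — genetic profiles of 600,000 people — building and analyzing such networks is exactly the kind of combinatorial workload a machine like Summit is meant for.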
Scientists have long been working to develop fusion power as a limitless source of clean energy. But several technical hurdles remain before fusion power becomes a reality, including finding a way to contain the superheated form of matter known as plasma in which nuclear fusion occurs.
The world’s largest experimental fusion reactor, the ITER facility near Cadarache in southern France, is designed to contain plasma using powerful magnetic fields. But the plasma is highly turbulent at its edges, and that makes it hard for scientists to tell just how much heat the plasma will deliver to the reactor walls.
Now that Summit is available, Wells says, scientists from Princeton University plan to use it to simulate the behavior of plasma in unprecedented detail. That should help make sure that ITER will work as designed — and speed the development of larger fusion reactors.
Another group of scientists, led by Princeton University geoscientist Jeroen Tromp, plans to use Summit to map Earth’s interior in unprecedented detail — from the outer crust straight down to the core. They’ll create the map by crunching seismic data from 4,000 earthquakes.
Previously, scientists used another supercomputer to process seismic data from 1,500 earthquakes. That allowed them to map Earth’s interior down to the core's outer edge, but Tromp says Summit should make it possible to map all the way down to the inner core. And that should improve scientists’ understanding of the effects of earthquakes and volcanic eruptions as well as help prospectors pinpoint underground oil and gas deposits.
Scientists have long known that most of the scores of elements that exist in the universe — everything other than hydrogen, helium and lithium — were created inside exploding stars known as supernovas. But the precise nature of the nuclear processes that create these elements is poorly understood.
Using other supercomputers, scientists simulated the processes — but only for a dozen or so different types of atoms at once. Wells says Summit should make it possible to simulate the processes for hundreds of elements simultaneously. That could lead to fundamental discoveries about the relative abundance of elements across the universe.
Astronomers also use light emitted by supernovas as a sort of cosmic yardstick to understand how quickly the universe is expanding. Better knowledge of the mechanisms driving these explosions will make these measurements even more precise, Wells says.
Climate scientists have developed sophisticated models that can be used to predict broad changes in Earth’s climate. But it’s hard to make accurate predictions about how climate change will affect specific areas, in part because it is hard to model clouds precisely. Clouds can cool the planet by blocking sunlight but also can trap heat in the atmosphere.
Using Summit, researchers from the U.S. Department of Energy expect to model the impact of clouds on the climate with greater precision than ever before. This should boost the accuracy of global climate simulations and make predictions of regional impacts of climate change more reliable, says Oak Ridge computational climate scientist Matt Norman.
“Is this specific area going to dry up, or is it going to get wetter?” he asks. “What about extreme events like flooding or storms? Those kinds of things are a lot more difficult to really nail down, so that’s exactly what we’re aiming at.”