Anvil helps researchers simulate and predict gravitational waves

Scientists from the collaborative Simulating eXtreme Spacetimes (SXS) research group are using Purdue’s Anvil supercomputer to explore the physics of cataclysmic space-time events and help shed light on the nature of one of the Universe’s fundamental forces: gravity.

Vijay Varma, Assistant Professor in the Department of Mathematics at the University of Massachusetts Dartmouth, and Nils Deppe, Assistant Professor of Physics at Cornell University, are both computational astrophysicists. The two love space. Their work, however, lies not in looking through telescopes to study the stars, but in developing and using state-of-the-art numerical relativity codes that make high-accuracy gravitational wave predictions for Laser Interferometer Gravitational-Wave Observatory (LIGO) signals. Through their computational work, Varma, Deppe, and their SXS collaborators create high-accuracy simulations of Einstein's equations that are matched with experimentally observed gravitational waves, providing the insight required to extract useful astrophysical data from those waves.

Ears not eyes: LIGO and Gravitational Waves

Gravitational waves are ripples in space-time caused by highly energetic events in space, such as the inspiral and merger of massive, compact objects. One specific example is the merger of two black holes. These collisions are among the most violent processes in our universe, and the gravitational waves produced by a black hole merger can travel billions of light-years through space before reaching Earth. Einstein predicted the existence of gravitational waves in 1916 with his theory of general relativity, but it would be another century before hard evidence proved him right.

On September 14th, 2015, LIGO made history by detecting gravitational wave signals for the first time. These signals came in the form of vibrations—physical disturbances in the fabric of space-time. Up until this point, astrophysical observations relied wholly on “sight.” Scientists only had electromagnetic radiation data to work with, like the kind you get from looking through a telescope. But LIGO’s method of detection is more akin to hearing than seeing, with gravitational wave antennae picking up undulations in space-time. This opened up an entirely new sector of astrophysics.

“The analogy that I’ve heard regarding LIGO is this,” says Deppe. “The way you’re searching for data is much more like a microphone or ear. You’re searching for signals coming from everywhere, unlike with telescopes that need to be pointed to a specific location. Once you ‘hear’ the signal, you can pinpoint the general location that it came from. But knowing the precise location or the specific details of the event that caused the gravitational wave is very difficult, and you need computer simulations to do that.”

LIGO and its European companion Virgo have allowed scientists to detect these cataclysmic space-time events. However, to extract useful astrophysical data from the observations, researchers need to rely on high-accuracy simulations.

Supercomputing and Space-time

Varma and Deppe have both contributed to the development of the Spectral Einstein Code (SpEC), currently the most accurate numerical relativity code for black hole mergers. SpEC is used to run thousands of simulations that predict the expected gravitational wave signal given the underlying characteristics of the black holes involved in a merger, such as how massive they are or how fast they are spinning. The predictions are then passed along to LIGO scientists (and others) for analysis.

“These numerical models,” says Deppe, “especially for the black holes, are pivotal in understanding what LIGO is observing. You need a very accurate and trustworthy model. At the collision, traditional models completely break down, so you have to do these simulations. You can then map these results back to the parameters that best fit the data and determine where these black holes came from.”

Varma elaborates on their research: “The main reason we have all of this infrastructure is to make predictions for LIGO for what happens when black holes or neutron stars collide. To extract valuable astrophysics such as the properties of black holes and neutron stars from LIGO observations, you need to compare with the prediction, which requires solving Einstein equations, and the only way to do that near the merger is using supercomputers.”
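
To make that comparison concrete, the toy Python sketch below shows why extracting source properties means evaluating model predictions for many candidate parameters and keeping the one that best matches the data. It is an illustration only, not the SXS or LIGO analysis code: the damped-sinusoid "waveform" and every number in it are invented for the example.

```python
# Toy illustration only: why extracting source properties from a detected
# signal requires evaluating model predictions for many candidate parameters.
# The "waveform" here is an invented damped sinusoid, not a solution of
# Einstein's equations, and all numbers are made up for the example.
import numpy as np

def toy_waveform(total_mass, t):
    """Stand-in model: frequency and damping time scale with the total mass."""
    freq = 2500.0 / total_mass      # heavier systems "ring" at lower frequency
    tau = 0.0005 * total_mass       # ...and damp more slowly
    return np.exp(-t / tau) * np.sin(2 * np.pi * freq * t)

t = np.linspace(0.0, 0.1, 4096)

# Pretend we observed a signal from a 65-solar-mass system, buried in noise.
rng = np.random.default_rng(0)
observed = toy_waveform(65.0, t) + 0.05 * rng.standard_normal(t.size)

# Compare predictions for many candidate masses against the data; the
# best-matching prediction tells us which source parameters explain it.
candidate_masses = np.linspace(20.0, 100.0, 2000)
mismatch = [np.sum((observed - toy_waveform(m, t)) ** 2) for m in candidate_masses]
best = candidate_masses[int(np.argmin(mismatch))]
print(f"Best-fit total mass: {best:.1f} solar masses")
```

A real analysis searches over many parameters at once (both masses, the spins, the orientation of the binary), which is why a single event can require millions of waveform evaluations.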

One supercomputer that Varma, Deppe, and the SXS collaboration have relied heavily upon is Anvil. Since the system came online, the team has been using Anvil almost constantly. But even with such a powerful supercomputer, the simulations take time. According to Deppe, if they are really lucky, one simulation will take about a week, but usually, it will take a few months. However, analyzing the LIGO data for a single gravitational wave event requires millions of waveform evaluations. While SpEC is very accurate, it is much too slow for direct use in LIGO applications involving black hole astrophysics and testing Einstein's relativity near black hole mergers. To expedite the process, the SXS group has turned to surrogate modeling.

Taking inspiration from machine learning, the SXS group developed data-driven gravitational wave models that efficiently interpolate between thousands of simulations and present results almost instantly, without losing the accuracy of the original simulations.

“The SpEC simulations are computationally expensive,” says Varma. “They can take weeks to months, and what LIGO really needs is something that can be evaluated in a fraction of a second so that you can generate millions of evaluations and figure out which properties of the black holes best explain the data. To accomplish this, we've developed the so-called surrogate models. The idea is that we take all of our existing simulations, which are of the order of a thousand, and we efficiently build an accurate interpolant through all of them. And once we build this interpolant, the final model is extremely fast. It can be evaluated in 100 milliseconds on a laptop instead of a few months on a supercomputer.”

Varma continues, “The big advantage of our method is that when we build this interpolant, we effectively capture the accuracy and the full physics of the simulation without losing any of the underlying complexity, but we still get something very fast. With this, we can pass the model to LIGO analysis, and reliably extract astrophysical information like the black hole masses and spins.”
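
A minimal sketch of that interpolation idea follows. It uses an invented stand-in for the expensive simulations and a single parameter (the mass ratio), whereas the SXS surrogate models cover masses and spins; the function names and numbers are assumptions made for illustration, not the group's actual code.

```python
# Minimal sketch of the surrogate-model idea, using an invented stand-in for
# the expensive simulations and a single parameter (the mass ratio q). The
# SXS models cover masses and spins; everything here is illustrative only.
import numpy as np
from scipy.interpolate import CubicSpline

t = np.linspace(0.0, 0.1, 2048)

def expensive_simulation(q):
    """Stand-in for a weeks-long SpEC run: a chirp whose shape depends on q."""
    freq = 40.0 + 3.0 * q                       # made-up dependence on mass ratio
    return (1.0 / q) * np.sin(2 * np.pi * freq * t * (1.0 + 5.0 * t))

# Offline and slow: a sparse set of training simulations across the parameter.
train_q = np.linspace(1.0, 8.0, 40)
train_waveforms = np.array([expensive_simulation(q) for q in train_q])

# Build the surrogate: one smooth interpolant through the training waveforms.
surrogate = CubicSpline(train_q, train_waveforms, axis=0)

# Online and fast: predict the waveform for a mass ratio we never simulated.
q_new = 3.7
predicted = surrogate(q_new)
error = np.max(np.abs(predicted - expensive_simulation(q_new)))
print(f"Maximum surrogate error at q = {q_new}: {error:.1e}")
```

Production surrogate models are more elaborate (they typically decompose the training waveforms into a compact basis and fit the coefficients across the full parameter space), but the payoff is the one Varma describes: millisecond evaluations instead of months of supercomputer time.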

By developing these surrogate models, the SXS group has brought the power of supercomputers like Anvil directly to LIGO data analysis.

Looking to the Future

Another aspect of the SXS collaboration’s work involves preparing for the future—anticipating and writing code for technology that we do not yet have.

Much like sound waves diminish over distance, the ripples of gravitational waves grow ever smaller throughout their journey. By the time these waves reach Earth, they have often traveled billions of light-years, and the stretching of space-time they produce can be 10,000 times smaller than the nucleus of an atom. LIGO was specifically designed to capture such small vibrations, but even in its current state, there are limitations.

“LIGO is extremely precise,” says Varma. “It is the most precise instrument ever built by humans, but the field of gravitational wave astronomy is still in its infancy. In the 2030s, we'll get even better detectors that will be 10x more sensitive. At that point, all of the current methods to solve Einstein's equations will fail, as none of them will be accurate enough for those detectors.”

The SXS collaboration decided to address this problem proactively and began developing a new numerical relativity code, named SpECTRE. SpECTRE is being designed not only with more sensitive detectors in mind but also with exascale supercomputers. The SXS group has seen the upward trend in computing power and has no doubt that it will continue.

“The current codes in computational astrophysics have a hard time using exascale computing,” says Deppe. “Even if we're not talking about exascale, but sort of these large, very fat nodes where you have 100+ CPUs [Anvil has 128 cores per compute node], using these machines efficiently is a big challenge, both algorithmically and from the computer science perspective.”

SpECTRE (jokingly named, according to Deppe, because a team member “watches too much James Bond”) is a complete redesign of the SpEC code. The goal is to develop accurate algorithms and implement them efficiently. The SXS team uses a task-based parallelism library that lets them move data and jobs between cores so that nothing ever sits idle. With SpEC, specific job types are assigned to specific cores, which creates a load imbalance that makes the code very difficult to scale up to big computing systems.

“SpECTRE is designed specifically to take advantage of machines like Anvil. We do a lot of our scaling and testing runs on Anvil, and we're able to get 95% utilization across several hundred cores. Currently [with SpEC], our limit is, depending on the exact machine, usually around 60 to maybe 100 cores, and at that point, we're only at about 50% efficiency. So, with SpECTRE, we're able to get 90-95% parallel scaling at 90+ percent utilization and up to several hundred cores, which is perfect for the current problems we’ve been looking at. Going beyond that [regarding the number of cores], you simply run out of work.”
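
The scheduling idea can be seen in the toy Python sketch below. SpECTRE itself is a C++ code built on the task-based approach described above; the function names and task costs here are invented purely for illustration. When chunks of work take unequal amounts of time, letting a scheduler hand each finished core the next available task keeps utilization high, whereas a fixed assignment leaves the fast cores waiting.

```python
# Toy illustration of the scheduling idea, not SpECTRE itself (which is C++):
# with uneven task costs, handing the next task to whichever worker frees up
# first keeps cores busy, while a fixed split leaves some of them idle.
# All task costs below are invented for the example.
import time
from concurrent.futures import ProcessPoolExecutor

def evolve_grid_chunk(cost):
    """Stand-in for evolving one chunk of the simulation grid."""
    time.sleep(cost)
    return cost

# Unequal work, as happens when some regions of the grid need more resolution.
task_costs = [0.4, 0.4, 0.4, 0.4, 0.05, 0.05, 0.05, 0.05]

def static_split_time(costs):
    # Each of two workers gets a fixed half of the list up front; the finish
    # time is set by whichever worker drew the expensive half.
    halves = [costs[: len(costs) // 2], costs[len(costs) // 2 :]]
    return max(sum(half) for half in halves)

def dynamic_schedule_time(costs, workers=2):
    # A pool hands out tasks one at a time as workers become free.
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        list(pool.map(evolve_grid_chunk, costs))
    return time.perf_counter() - start

if __name__ == "__main__":
    print(f"Fixed split of the work:  ~{static_split_time(task_costs):.2f} s")
    print(f"Dynamic task scheduling:  ~{dynamic_schedule_time(task_costs):.2f} s")
```

In this toy case, the fixed split finishes only when the worker holding all the expensive chunks does (about 1.6 seconds), while the dynamic schedule finishes in roughly the ideal time of total work divided by the number of workers (about 0.9 seconds).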

Varma, Deppe and the SXS team have also been using Anvil to push the boundaries of their models to improve performance. They’ve extended the validity of their models by incorporating more extreme conditions, such as a black hole merger containing two black holes of very different masses or very high spin rates. The goal is to run enough high-quality simulations to cover any observation that LIGO could make and then incorporate these outlier conditions into the surrogate models.

“In the last few years,” says Varma, “we've gone from two black holes that are at most a factor of four different in masses, to a factor of eight. So that's a factor of two improvement in coverage of the region. With this data set, the model will cover nearly all of the LIGO-Virgo binary black hole observations that have been made so far, which will ensure that the most accurate models are being used for the analysis and that we are getting the most astrophysics out of the observations.”

According to the two, Anvil has performed exceedingly well in their work. And aside from Anvil's outright performance, Varma and Deppe were especially pleased with the support they received from the Anvil team.

“The team at Anvil has been absolutely terrific in working with us to resolve any issues we had when we started,” says Deppe. “At this point, everything is just smooth sailing. So a huge shoutout to the team for working really hard and being flexible with our needs. We run huge simulations that take several months, so we need to keep the data around for a long time. Most supercomputers have a frequent purge policy, so I’m extremely grateful for the team figuring out how to let us store data for longer.”

Varma added, “Indeed, we definitely couldn't have done so many simulations without the special allocation of scratch storage for the group; that was extremely important.”

For more information about the work conducted by the SXS group, please visit: https://www.black-holes.org/

Additionally, to learn more about Varma and his work, please visit here. To learn more about Deppe and his work, please visit here.

To learn more about High-Performance Computing and how it can help you, please visit our “Why HPC?” page.

Anvil is Purdue University’s most powerful supercomputer, providing researchers from diverse backgrounds with advanced computing capabilities. Built through a $10 million system acquisition grant from the National Science Foundation (NSF), Anvil supports scientific discovery by providing resources through the NSF’s Advanced Cyberinfrastructure Coordination Ecosystem: Services & Support (ACCESS), a program that serves tens of thousands of researchers across the United States.

Researchers may request access to Anvil via the ACCESS allocations process. More information about Anvil is available on Purdue’s Anvil website. Anyone with questions should contact anvil@purdue.edu. Anvil is funded under NSF award No. 2005632.

Written by: Jonathan Poole, poole43@purdue.edu

Originally posted: