
Particle accelerator, supercomputing are revealing the universe’s original recipe

  • Science Highlights

Quark-gluon plasma, formed at temperatures 100,000 times hotter than the center of the sun, is something you won’t find cooking in any kitchen on Earth. But scientists believe it was the soup of the day over the entire universe for a few microseconds after the Big Bang, which took place about 13.7 billion years ago.

Researchers, including Purdue physics Professor Fuqiang Wang and his lab, can now study this by smashing atomic nuclei together inside particle accelerators like the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory in New York. The inferno created by these collisions briefly recreates conditions like those during the universe’s formation.

In effect, scientists using the accelerator, the data it generates, data mining and computer simulation can run time backwards and reverse engineer the recipe of the early universe. They also can examine the inner workings of astrophysical phenomena such as neutron stars, whose cores may contain quark-gluon plasma, says Wang, a member of the Purdue Physics Department’s High Energy Nuclear Physics Group.

Wang is part of an international team working on the STAR experiment (short for Solenoidal Tracker at RHIC), housed inside the ring of the 2.4-mile-long accelerator, where atomic nuclei are accelerated to 99.99 percent of the speed of light and then set on a collision course.

STAR is designed to search for signatures of the quark-gluon plasma that the RHIC was built to generate. It includes several different types of detectors, each specializing in detecting certain types of particles and characterizing their motion. It measures in three dimensions the trajectories of the thousands of particles jolted loose by each collision.

"The data volume is huge," Wang says. "This is really kind of a massive data mining process."

A supercomputing cluster for the project at Brookhaven does much of the work on the raw data. But Wang further refines it using Purdue’s new Hansen supercomputer and other community cluster supercomputers.

Wang's lab also runs demanding computer simulations to explore what physicists think is happening in the accelerator. The models are compared with the real-world data to check for consistency and to validate results and conclusions.
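
To make that comparison concrete, one common pattern is to test whether a simulated particle spectrum is statistically compatible with the measured one. The sketch below is a minimal, hypothetical example using ROOT, the CERN-developed analysis platform the lab relies on (described below); the file and histogram names are placeholders, not STAR's actual data products.

    // compare_model.C -- a hypothetical ROOT macro; run with: root -l compare_model.C
    // Compares a simulated particle spectrum against measured data using
    // ROOT's built-in histogram compatibility tests.
    #include "TFile.h"
    #include "TH1.h"
    #include <iostream>

    void compare_model() {
        // File and histogram names are illustrative placeholders.
        TFile dataFile("measured_spectra.root");
        TFile simFile("simulated_spectra.root");

        TH1 *data  = (TH1*)dataFile.Get("pt_spectrum");
        TH1 *model = (TH1*)simFile.Get("pt_spectrum");
        if (!data || !model) { std::cerr << "histogram not found\n"; return; }

        // Normalize the simulation to the data so only the shapes are compared.
        model->Scale(data->Integral() / model->Integral());

        // Both tests return p-values; values near 1 mean the simulated and
        // measured distributions are statistically consistent.
        double pChi2 = data->Chi2Test(model, "UW");  // unweighted data vs. weighted simulation
        double pKS   = data->KolmogorovTest(model);
        std::cout << "chi2 p-value: " << pChi2 << ", KS p-value: " << pKS << "\n";
    }

In practice, checks of this kind would be repeated across many observables before any physics conclusion is drawn.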

"It's a data-driven approach," Wang says. "One year's data, you have to spend another year to analyze it, at least, sometimes several years."

The lab runs its own codes as well as ROOT, a data analysis platform developed at CERN, the European Organization for Nuclear Research. In addition to a generous number of processors, fast input/output and storage for many large data files are important to the work, Wang says. Wang and his team also participate in experiments at CERN’s Large Hadron Collider, where nuclei are smashed together at energies roughly 50 times those at Brookhaven.
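
As a rough illustration of what this data mining looks like in ROOT, the hypothetical macro below loops over a tree of collision events and histograms the transverse momentum of every track passing a simple cut. The file, tree and branch names are invented for the example and do not reflect STAR's real data schema.

    // track_loop.C -- a minimal, hypothetical ROOT event loop
    #include "TFile.h"
    #include "TTree.h"
    #include "TH1F.h"
    #include <cmath>
    #include <vector>

    void track_loop() {
        TFile *f = TFile::Open("collision_events.root");  // placeholder file name
        TTree *tree = (TTree*)f->Get("events");           // placeholder tree name

        // Each tree entry is one collision; each collision yields thousands of tracks.
        std::vector<float> *pt  = nullptr;  // transverse momentum of each track
        std::vector<float> *eta = nullptr;  // pseudorapidity of each track
        tree->SetBranchAddress("track_pt",  &pt);
        tree->SetBranchAddress("track_eta", &eta);

        TH1F hpt("hpt", "Track p_{T};p_{T} [GeV/c];tracks", 100, 0., 10.);

        const Long64_t n = tree->GetEntries();
        for (Long64_t i = 0; i < n; ++i) {
            tree->GetEntry(i);  // read one collision's worth of tracks
            for (size_t j = 0; j < pt->size(); ++j) {
                if (std::abs((*eta)[j]) < 1.0)  // keep mid-rapidity tracks only
                    hpt.Fill((*pt)[j]);
            }
        }
        hpt.SaveAs("pt_spectrum.root");  // save the spectrum for later comparison
        f->Close();
    }

At RHIC scales, a loop like this runs over billions of collision events, which is why processor count, I/O speed and storage matter so much to the work.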

When Wang came to Purdue 12 years ago, he used a small physics department cluster, which served him well. But Purdue's Community Cluster Program, begun in 2008, was too good a deal to pass up. The cooperative system makes idle nodes available to faculty researchers and their labs, while their nodes are likewise shared with peers when not in use.

"You could in principle use all the CPUs if they’re available," Wang says. "That's very attractive. You have nothing to lose, you gain. If you need to run things, your nodes are available and if other people’s nodes are not running, you can use those."
