
Partnership between Purdue physicists and ITaP will help make science of the data deluge from a more powerful Large Hadron Collider


Dark matter is worse than Bigfoot. It’s big, in theory making up about a quarter of the universe versus just 5 percent for “normal” matter. But unlike the fabled Sasquatch, nobody has ever claimed to see dark matter. We don’t even have an out-of-focus video supposedly showing it.

But our view of dark matter could be about to change, in a landmark development for the field of physics aided by Purdue faculty researchers and by Purdue’s ambitious program for building new research supercomputers every year.

Purdue is a key partner in the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider, the giant particle accelerator straddling the French-Swiss border. In 2012, the LHC revealed the Higgs boson, long theorized but never seen, confirming much of the Standard Model of how the physical universe works at its most basic level.

Now, the accelerator has begun crashing particles together again at an even higher energy following an equipment upgrade. The Purdue researchers and the thousands of scientists internationally involved in the project are looking for the next big thing, with dark matter in their sights, along with questions such as whether extra dimensions exist.

“When the Large Hadron Collider was designed, of course, one of the motivations was to find the Higgs particle,” says Purdue physics Professor Norbert Neumeister. “But it was just one item on a long list of things to be done with the machine. Historically, whenever we’ve reached higher energies, we’ve gained additional insights.”

Before producing science, the LHC and its Compact Muon Solenoid detector, which is something like a five-story digital camera that takes pictures of the proton-proton collisions, produce something else: data. Lots and lots of data. This is where Purdue’s nation-leading campus research computing infrastructure and its Community Cluster Program, operated by ITaP, come into play.

“We approach the Community Cluster Program, and our research computing services generally, as a partnership with faculty to advance their research and Purdue’s goal of having worldwide impact,” says Donna Cumberland, ITaP’s executive director for research computing. “Professor Neumeister and the CMS Tier 2 Center at Purdue were among our first partners. In many ways it’s a model example of the program’s success.”

The LHC produces about 15 petabytes of data annually, roughly equivalent to 3 or 4 million DVDs. The CMS experiment alone records about 100 megabytes per second. (Think 100 novels every second, as long as the author wasn’t too wordy.)

As a Tier 2 center for the CMS project, Purdue stores data, makes it quickly accessible to the physicists worldwide who are analyzing it, and provides high-performance computing power for data processing and analysis. The University currently has about 4 petabytes of CMS data on hand, approximately a million DVDs’ worth, which Neumeister believes is the largest data set stored by any single Purdue research group.
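Those DVD and novel comparisons are easy to sanity-check. The short Python sketch below redoes the arithmetic; the 4.7 GB single-layer DVD capacity and the roughly 1 MB plain-text novel are our own assumptions, not figures from the article.

```python
# Back-of-envelope check of the data-volume comparisons above.
# ASSUMPTIONS (not from the article): a single-layer DVD holds
# about 4.7 GB, and a plain-text novel runs roughly 1 MB.

DVD_GB = 4.7          # assumed single-layer DVD capacity, gigabytes
NOVEL_MB = 1.0        # assumed size of one plain-text novel, megabytes

LHC_ANNUAL_PB = 15    # LHC output per year, petabytes (from the article)
CMS_RATE_MB_S = 100   # CMS recording rate, megabytes/second (from the article)
PURDUE_STORE_PB = 4   # CMS data held at Purdue, petabytes (from the article)

PB_TO_GB = 1_000_000  # 1 petabyte = 1,000,000 gigabytes (decimal units)

print(f"15 PB/year  ~ {LHC_ANNUAL_PB * PB_TO_GB / DVD_GB / 1e6:.1f} million DVDs")
print(f"100 MB/s    ~ {CMS_RATE_MB_S / NOVEL_MB:.0f} novels per second")
print(f"4 PB stored ~ {PURDUE_STORE_PB * PB_TO_GB / DVD_GB:,.0f} DVDs")
```

Under these assumptions, 15 petabytes comes to about 3.2 million DVDs and the 4 petabytes at Purdue to roughly 850,000, in line with the article’s “3 or 4 million” and “approximately a million.”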

“Coming with the data are two important factors,” says Neumeister, who leads the CMS Tier 2 Center at Purdue with Associate Professor Thomas Hacker. “One is network connectivity, because we constantly load data. The second is processing power, because we constantly run over the data to look for interesting things. The more data, the more demands there are on computing and networking.”

Neumeister and the CMS Tier 2 Center began working with ITaP in 2005 and bought into the first Community Cluster Program machine, Steele, when it was built in 2008. They’ve been partners in the program ever since, taking advantage not only of the computing power offered by the community clusters but also of the favorable prices yielded by centralized hardware purchases each time a new cluster is built.

In addition to the community clusters, the CMS center uses a specialized multi-petabyte data storage cluster for holding detector data for processing, distribution and analysis, which ITaP Research Computing built and operates for the center. In a review, leaders of the international project praised the computational capability Purdue makes available as well as its cost effectiveness.

“From the very beginning, we started the Purdue CMS Tier 2 project in a partnership with ITaP and I think that makes a difference,” Neumeister says. “We are not just a client for ITaP. There is a mutual interest in operating a successful Tier 2 center. It’s a good team.”

Starting with Steele, Purdue has built a research cluster supercomputer annually, with seven of them ranked among the 500 fastest in the world and another planned this fall. The partnership between faculty and ITaP gives Purdue’s researchers the best collection of high-performance computing resources on any single campus in the country, including fast research networking to move data on and off campus and options for active and long-term research data storage.

For information about the Community Cluster Program and other research computing systems and services, email rcac-help@purdue.edu or contact Preston Smith, ITaP’s director of research services and support, at psmith@purdue.edu or 49-49729.
