
Purdue again leads the nation in supercomputing systems for campus research


Severe thunderstorms and the potential for tornadoes are a fact of life in Indiana, at Purdue no less than in the rest of the state. Dan Dawson wants to better understand the complex behavior of these storms and, eventually, help give people more warning when they arise.

The new faculty member in Earth, Atmospheric and Planetary Sciences focuses on numerical modeling of the atmosphere to study the dynamics of severe storms and tornadoes.

“I’m interested in understanding the variety of physical processes that are at play within storms, as well as developing better methods to predict their behavior in advance,” says Dawson, who is joining the Purdue faculty this fall.

To do his research, Dawson requires high-performance computing. He says the availability of supercomputing systems was one of the first things he asked about in considering a faculty position. Purdue’s new Rice community cluster supercomputer answered the question.

“I was pleased to see that Purdue had extensive resources along these lines, along with what appeared to be a robust and well-documented support system for its use,” Dawson says.

Rice, built by ITaP in May 2015, made the latest TOP500 list of the world’s most powerful supercomputers, released this month. Purdue has three machines on the list, more than any other U.S. school, giving the University the best collection of high-performance computing systems available to faculty researchers on any single campus in the country.

Like the other machines in Purdue’s award-winning Community Cluster Program, Rice is designed for tightly coupled science and engineering applications and parallel computation, the largest portion of the high-performance computing work done on the West Lafayette campus.

At the same time, ITaP added two smaller community clusters: Snyder, designed for memory-intensive applications, particularly in the life sciences, and Hammer, designed for high-throughput serial work.

ITaP Research Computing, working with faculty partners, has built seven TOP500-class high-performance computing clusters at Purdue since 2008, along with a major research data storage cluster, the Research Data Depot, in 2014.

For more information on Rice, the Community Cluster Program, the Research Data Depot and other research computing services, email rcac-help@purdue.edu or contact Preston Smith, ITaP’s director of research services and support, at psmith@purdue.edu or 49-49729.

Rice joins the Conte cluster, built in 2014, and the Carter cluster, built in 2012, on the TOP500 list. Conte remains the most powerful supercomputer for use by researchers on a single campus nationwide. The TOP500 Supercomputer Sites project has been ranking the 500 most powerful known systems twice a year since 1993 to track trends in high-performance computing.

There are now 165 faculty partners from all of Purdue’s primary colleges and schools using the community clusters for research spanning more than 30 science, engineering and social science disciplines. Faculty and their students use these supercomputers to develop new treatments for cancer, improve crop yields to better feed the planet, engineer quieter aircraft, study global climate change and probe the origins of the universe, among many other topics.

System CIO Gerry McCartney says ITaP plans to continue building a top-flight research computing system annually because demand from faculty and their students engaged in discovery will only continue to grow. He says the high-performance systems also are vital for accelerating “time to science” as research in more and more disciplines is driven by big data and complex modeling and simulation.

“You can’t do that effectively on a laptop,” says McCartney, who also is Purdue’s vice president for information technology and Olga Oesterle England Professor of Information Technology. “New faculty coming to Purdue for President Daniels’ Purdue Moves and other initiatives, and faculty here now breaking new ground in their fields, absolutely require computational resources like the community clusters and the data depot. They simply can’t do their work without them.”
