Retiring Steele cluster kicked off Purdue's current national prominence in research supercomputing
November 27, 2013
Sustainable energy supplies and energy efficiency are among the biggest issues facing society, and heat transfer is central to the question of how to keep producing energy to meet demand now and in the future, and to do it as efficiently as possible.
Lost heat — whether from the sun, a power plant, or the exhaust pipe of an automobile — still contains useful energy. Converting even 5 to 10 percent of waste heat to power “is a huge amount when you consider the amount of waste heat,” says Xiulin Ruan, a Purdue associate professor of mechanical engineering.
Ruan’s research focuses on heat transfer processes at the nanoscale. The idea is that by leveraging size effects, new materials and nanostructures with improved heat conversion properties, such as those used in solar cells, can be designed from the atoms up. His work involves computational simulations that demand high-performance computing.
“A simulation can easily take a month,” Ruan says.
So it is no surprise that when Ruan was establishing his lab at Purdue in 2008, ITaP’s proposal to build a new top-flight cluster supercomputer that faculty researchers would share interested him. He was one of the first partners in the Steele community cluster.
“I hadn’t used my startup fund yet for computing,” says Ruan, who came to Purdue in 2007. “Then ITaP launched this program and I thought, this is excellent, I will just join the Community Cluster Program because they provide a better price and more service. We knew that if we built our own machine the maintenance was going to be a headache.”
Steele, retired as of November 30, kicked off Purdue’s Community Cluster Program. In partnership with faculty, ITaP has now built six TOP500 supercomputers in as many years, including Conte in 2013, the most powerful supercomputer for use by researchers on a single U.S. campus. The program gives Purdue the best supercomputing infrastructure for research on any campus in the country.
Not everybody was as ready as Ruan to jump in when the Community Cluster Program began, however. Many on campus balked at the idea of having ITaP centrally manage their research computing systems. Gerry McCartney, Purdue’s vice president for information technology, who heads ITaP, likes to joke that a prevailing opinion was: “I’d rather remove my appendix with a Spork than let you people run my research computers.”
But the price ITaP negotiated with vendors for a group purchase was good enough to get a dozen faculty members to sign on initially, says McCartney, who also is the chief information officer for the Purdue system and the Olga Osterle England Professor of Information Technology. Eventually, more than 60 researchers bought into Steele.
More than 500 researchers in fields ranging from biology, chemistry, physics and multiple engineering disciplines to agronomy, communications, economics and mathematics have joined the Community Cluster Program since. Despite a money-back guarantee from ITaP, no one has ever withdrawn from the program.
“I love ITaP, you can quote me on that,” says Ashlie Martini, associate professor of mechanical engineering, who also was one of the first Steele partners. “They’re very service oriented. It was just painless. It just worked.”
Martini even continued to use Steele remotely after moving to the University of California, Merced, in 2011, working with doctoral students she advises who remained at Purdue.
ITaP made an audacious decision to build Steele in a day, in part to show how committed it was, says Donna Cumberland, director of research services and support for ITaP Research Computing (RCAC). A small army of ITaP and Purdue IT staff, researchers, students and even a few participants from in-state rival Indiana University showed up for the high-tech barn raising. In the end, Steele was actually ready to run by noon — and running research jobs after lunch.
“That day was just pure excitement,” says Cumberland, the event’s chief organizer. “I think everybody knew it was something special. You still hear people talk about it.”
The Steele build garnered national media attention, but more important was the message sent to potential Community Cluster Program partners.
“We know what we’re doing, we’re prepared, we can do this,” Cumberland says. “I think that changed a lot of attitudes on campus.”
A supercomputer’s retirement isn’t something people generally get misty-eyed about, but Ruan, for one, has a soft spot for Steele.
“Steele is the most critical part for my career here in terms of equipment,” says Ruan, who has moved his research to 2012’s Carter cluster. “I can certainly claim that without Steele our research wouldn’t have been done.”