Purdue Cluster Challenge team aiming to take technology to limits
December 2, 2008
Two groups of Purdue University students and staff will compete in events aimed at pushing the limits of technology to meet the needs of the scientific community.
A team of five students enrolled in a high performance computing class at Purdue is gearing up for the Cluster Challenge, scheduled for Nov. 17-19 at the SuperComputing '08 Conference in Austin, Texas. Another Purdue team of four professionals from Purdue's Rosen Center for Advanced Computing will compete in an event called the Bandwidth Challenge.
The premier international conference for high performance computing, networking, storage and analysis runs Nov. 15-21, and 11 universities will compete in the challenges during the event.
"Our challenge entries are all about pushing the technology to meet the demands of the scientific community," said John Campbell, associate vice president for information technology, who heads the Rosen Center.
Cluster Challenge teams must run a battery of benchmarking programs and working applications on their machines as efficiently as possible. Purdue's team has spent most of the semester preparing to run a wide range of scientific software within the constraints of the contest. Rules limit the amount of power the entries can draw from two 120-volt, 13-amp circuits, which limits the number of processors that can be employed.
Purdue's competition machine has been in Boston, where partner SiCortex is located, and the group has been working remotely - as if the computer were in the same room. Most of the team's work has involved retooling software to run on its entry in a way that takes maximum advantage of the machine's large array of processors.
SiCortex builds unusual supercomputers designed to deliver high performance using thousands of slower processors that are energy efficient in both power consumption and cooling. Among other things, the company uses a unique, very fast "interconnect fabric" - the wiring that links its processors so they can work in concert - to offset the raw speed disadvantage.
"It's not your standard architecture, so you can't just install it and it works," said team member Ryan Weinschenk, a senior in electrical and computer engineering technology from Noblesville. "The hardware performance is there. It's just actually getting the software to run."
Purdue's team consists of Andy Howard, a senior in electrical and computer engineering technology from West Lafayette; Alex Younts, a sophomore in computer science from West Lafayette; David King, a senior in electrical and computer engineering technology from Lafayette; Paul Willmann, a senior in computer technology from Carmel, Ind.; and Weinschenk.
The students are enrolled in a class taught by Jeffrey Evans, an assistant professor in the Department of Electrical and Computer Engineering Technology.
The Bandwidth Challenge pushes the limits of moving data over a long-distance computer network, such as the Internet or the TeraGrid, the world's largest network for open science computing. Purdue is a partner in and resource provider on the TeraGrid, which is funded by the National Science Foundation.
Participants in the challenge are judged on the peak amount of data they can move from one place to another and on how high a sustained rate they can maintain. This year, the competition also is placing a premium on "real-world applications and data movement issues" in addition to merely filling the pipeline. Purdue will try to meet the challenge using free software and commodity hardware.
Purdue's Bandwidth Challenge team includes Ramon Williamson, the senior storage engineer at the Rosen Center for Advanced Computing, who is serving as team manager; Michael Shuey, high-performance computing technical architect; Greg Veldman, storage administrator; and Patrick Finnegan, UNIX system administrator. The team is partnering with BlueArc; Texas Memory Systems, maker of fast solid-state storage units akin to a USB flash drive; and Foundry Networks, which is providing the high-speed network connections between the storage appliances and the SC08 network in Austin for the competition.
What the Rosen Center is learning from the exercise, besides being of value to other high-performance computing centers, should prove useful at home, Williamson said.
A cluster to be built at Purdue Calumet from parts of West Lafayette's decommissioned Lear supercomputer is likely to access storage on the West Lafayette campus and could employ something like the system developed by the Bandwidth Challenge team.
In addition, at some point the Rosen Center, pressed for space in its machine room in the Mathematical Sciences Building, might create a central, remote storage facility in the West Lafayette area or elsewhere, which could function, using the techniques, as if it were on site, Williamson said.
Writer: Greg Kline, (765) 494-8167, firstname.lastname@example.org
Sources: John Campbell, (765) 494-1289, email@example.com
Jeffrey Evans, (765) 494-8167, firstname.lastname@example.org
Ryan Weinschenk, (317) 690-1906, email@example.com
Purdue News Service: (765) 494-2096; firstname.lastname@example.org