
Students take top honors at TeraGrid conference


Sanjiv Kumar is something of a fortune teller, but he and his colleagues don’t use a crystal ball to peer into the future of water systems like the St. Joseph River Watershed in northern Indiana and southern Michigan.

Instead, they use supercomputers at Purdue University and others spread across the country on a high-speed network dedicated to cutting-edge science.

The researchers project how climate change and other factors may affect, for instance, flow and water quality. They’re also identifying management practices that may help keep a system like the St. Joseph, which supplies water for more than 1.5 million people in a 15-county region around South Bend, viable in coming years.

“This tool can be very well applied … to other watersheds,” said Kumar, whose research poster on the work at the TeraGrid ’08 conference, held June 9-13 in Las Vegas, turned out to be an award winner.

Purdue representatives took three top honors in two competitions at the annual conference focusing on the world’s largest high performance computing network for open scientific research. Purdue, in an effort led by the Rosen Center for Advanced Computing’s Carol Song, is a TeraGrid partner, contributing computing resources and developing technology for use on the system. That includes largely Web-based portals, called “science gateways,” designed to make doing research easier.

Kumar, a doctoral student in civil engineering, said the water system modeling might take weeks to a year on even a powerful desktop computer, depending on the amount of data fed into it. The supercomputers accessible through the TeraGrid and on campus at Purdue can reduce that to hours, or less.
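Much of that speedup comes from the fact that climate and management scenarios can be simulated independently and farmed out to separate processors. The sketch below is a minimal, hypothetical illustration of that pattern, not Kumar’s actual model: it runs a toy linear-reservoir streamflow simulation over a few assumed rainfall scenarios in parallel using Python’s multiprocessing module.

    # Minimal sketch of scenario-ensemble watershed modeling (hypothetical;
    # the real study's model and data are far more complex).
    from multiprocessing import Pool

    def simulate_streamflow(scenario):
        """Toy linear-reservoir model: each day, a fraction of stored
        water drains to the stream and rainfall refills the store."""
        name, daily_rainfall_mm = scenario
        storage, recession = 100.0, 0.05   # assumed initial storage and drain rate
        flows = []
        for rain in daily_rainfall_mm:
            storage += rain
            flow = recession * storage     # outflow proportional to storage
            storage -= flow
            flows.append(flow)
        return name, sum(flows) / len(flows)

    if __name__ == "__main__":
        # Hypothetical climate scenarios: wetter, baseline, drier.
        scenarios = [
            ("wet",      [6.0] * 365),
            ("baseline", [4.0] * 365),
            ("dry",      [2.0] * 365),
        ]
        # Each scenario is independent, so the runs map cleanly onto
        # separate processors -- or separate TeraGrid nodes.
        with Pool() as pool:
            for name, mean_flow in pool.map(simulate_streamflow, scenarios):
                print(f"{name}: mean daily flow {mean_flow:.1f} mm")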

Kumar won first place in the student research competition. Vinaitheerthan Sundaram, a doctoral student in electrical and computer engineering who works for the Rosen Center, took second place in the same competition with a system for visualizing data from the latest generation of Doppler radar, interactively and in 3-D. He developed the system with Purdue Professor Bedrich Benes and his graduate student Yi Ru in the Computer Graphics Technology Department of the College of Technology, and with Rosen Center researchers Song and Lan Zhao.

It’s the same data from radar stations nationwide that you use to decide whether to take along an umbrella in the morning. But the flat 2-D pictures, and even the 3-D graphics on the Weather Channel, don’t make use of all the information hiding in the radar data, Sundaram said.

Purdue researchers created tools to automatically collect, process and render the data in near real time, resulting in an interactive 3-D picture that allows observers not only to see a storm coming but also to spin it around, look at it from various angles, peer inside and more. When the weather is really active, as with the recent rains that flooded parts of Indiana and the rest of the Midwest, the volume of data the system has to handle is huge. That’s where the high performance computing resources available on the TeraGrid come into play, especially Purdue’s Condor pool, Sundaram said.
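The basic geometric step behind such a picture is resampling the radar’s native spherical coordinates, range, azimuth and elevation, into Cartesian space for rendering. The NumPy sketch below shows that conversion in its simplest form; the article does not detail the Purdue pipeline, so treat the function and numbers as assumptions for illustration.

    # Sketch: converting radar gates from spherical (range, azimuth,
    # elevation) to Cartesian (x, y, z) for 3-D rendering. A real Doppler
    # pipeline also handles beam refraction, gridding and far more data.
    import numpy as np

    def radar_to_cartesian(ranges_m, azimuth_deg, elevation_deg):
        """Return (x, y, z) in meters for gates along one radar ray."""
        az = np.radians(azimuth_deg)    # clockwise from north
        el = np.radians(elevation_deg)  # upward tilt of the beam
        horiz = ranges_m * np.cos(el)   # ground-relative distance
        x = horiz * np.sin(az)          # east
        y = horiz * np.cos(az)          # north
        z = ranges_m * np.sin(el)       # height above the radar
        return x, y, z

    # One ray of gates every 250 m out to 10 km, at a 0.5-degree tilt.
    gates = np.arange(250.0, 10_000.0, 250.0)
    x, y, z = radar_to_cartesian(gates, azimuth_deg=90.0, elevation_deg=0.5)
    print(f"farthest gate: x={x[-1]:.0f} m, z={z[-1]:.0f} m")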

Condor puts the unused compute cycles of networked machines, from desktops and lab computers to servers, to work when those machines would otherwise sit idle. Purdue has one of the largest Condor pools for distributed computing in the world, employing more than 15,000 processors on campus. In 2007 alone, BoilerGrid, as the system is called, provided more than 10 million hours of computation time for researchers and students, at Purdue and elsewhere.
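In practice, a Condor job is described in a short submit file that tells the pool what to run and how many copies to queue; idle machines then pick the jobs up. The following is a generic sketch of such a file, with a placeholder executable and file names, not anything taken from BoilerGrid itself.

    # Hypothetical Condor submit file: queue 100 independent runs,
    # each picked up by an otherwise-idle machine in the pool.
    universe   = vanilla
    # Placeholder job script; $(Process) expands to 0..99.
    executable = simulate.sh
    arguments  = $(Process)
    output     = run.$(Process).out
    error      = run.$(Process).err
    log        = run.log
    queue 100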

Speaking at the conference, the National Science Foundation’s Steve Meacham, senior science and technology advisor in the NSF Office of Cyberinfrastructure, which oversees the TeraGrid, said the agency will be looking at high throughput computing resources like Condor in funding the next round of TeraGrid expansion.

“The fact that NSF specifically lists high throughput computing in (its funding proposals request) shows the value our Condor pool brings to the TeraGrid and the need for such resources from the user community,” Song said.

As student competition finalists, Kumar and Sundaram received travel, housing and registration expenses related to the conference. A panel of judges evaluated the entries on scientific merit and potential for impact. There were divisions for high school, undergraduate and graduate students, with the Purdue students taking the top two spots in the graduate division.

In addition, Rosen Center Customer Service Manager Kay Hunt won first place in the general poster competition for her poster on the TeraGrid Campus Champions program. The new program provides training and support for local representatives who then serve as ambassadors for high performance computing and the TeraGrid on their campuses. The aim is to spread the use of the tools.

The conference gave the initial group, drawn from campuses stretching from Hawaii to New Jersey, a chance to meet, and it served as a kickoff for the program, which began enlisting participants in March.

But Hunt, who organized Campus Champions from scratch in December, said the meetings drew a crowd beyond the program’s initial 11 participants. More than a dozen additional schools are now considering participation. “There was a lot of interest from other institutions that had heard about it and wanted to check it out,” Hunt said.

Song also highlighted a field programmable gate array tutorial by Rosen Center staff members David Braun and David Matthews. They outlined how the technology, which lets chips be programmed, or reprogrammed, more or less on the fly to suit specific tasks, can accelerate scientific computation, sometimes radically, while lowering power consumption, making for “greener” high performance computing, a recent emphasis at ITaP.
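How much such an accelerator helps overall depends on what fraction of a job it can actually take over, a relationship captured by Amdahl’s law. The snippet below works through that arithmetic with assumed numbers; the figures are illustrative, not results from the tutorial.

    # Amdahl's law: overall speedup when a fraction f of the work runs
    # on an accelerator that is s times faster. The numbers here are
    # assumptions for illustration, not figures from the tutorial.
    def amdahl_speedup(f, s):
        return 1.0 / ((1.0 - f) + f / s)

    # If 90% of the runtime is FPGA-friendly and the FPGA does that
    # part 50x faster, the whole job speeds up about 8.5x...
    print(f"{amdahl_speedup(0.90, 50):.1f}x")   # -> 8.5x
    # ...but offloading only half the work caps the gain at about 2x.
    print(f"{amdahl_speedup(0.50, 50):.1f}x")   # -> 2.0x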

Braun was able to recruit new users to the Rosen Center’s Brutus resource, which Purdue makes available for developing field programmable gate array code as part of its TeraGrid commitment. NSF’s Meacham said such experimental resources also will be among the criteria for funding the network’s expansion.
