Visitors at supercomputing conference impressed with Purdue-developed technology
A Purdue-developed technology called HUBzero sounds chilly, but to Chris McPhee it looks like a hot way for researchers to do science online easily, in a graphical Web-based fashion, rather than by typing cryptic commands DOS-style.
“We’re tired of people seeing black screens,” said McPhee, a system engineer from Queen’s University in Ontario, Canada, who stopped by the Purdue booth at the SuperComputing ’08 conference in Austin, Texas, for a presentation on HUBzero.
The presentation by Michael McLennan, senior research scientist and hub technology architect at Purdue, caught McPhee’s eye as he walked by the Purdue booth Tuesday at the conference, known as SC08.
“I thought, ‘That’s something we might be interested in,’” said McPhee, representing a consortium of Canadian universities at the premier international gathering of high performance computing experts, which runs through Friday.
Presentations by Purdue researchers at the booth and in other conference events have covered a variety of topics on the cutting edges of high performance computing, from an easy way to do advanced science in a Web-based environment today to getting ready to take advantage of tomorrow’s next-generation high performance computers.
The booth at SC08 is designed to promote Purdue, Information Technology at Purdue (ITaP), the University’s central information technology organization, and the Rosen Center for Advanced Computing, ITaP’s research and discovery arm. The conference has more than 10,000 attendees this year.
The theme of the Purdue booth is: “No cycle left behind, no byte left unexplored,” highlighting creative ways Purdue, ITaP and the Rosen Center are finding to improve scientific productivity, said John Campbell, associate vice president for information technology, who heads the Rosen Center.
One example is Purdue’s more than 20,000-processor Condor pool, a high-throughput distributed computing system that puts otherwise idle machines in offices, labs and elsewhere to work on research, not only on campus but internationally through the Open Science Grid and the TeraGrid, the National Science Foundation-funded network that is the world’s largest for open science computing.
Purdue this week announced that it is adding to what’s already the world's largest science-focused distributed computing system with the addition of computers from athletics rival, but longtime computing and research collaborator, Indiana University.
IU is linking nearly 5,000 computers from its research pool to Purdue’s national computing grid, known as Diagrid. Notre Dame, Indiana State University and Purdue’s Calumet regional campus are already contributing to the effort in the Hoosier state. Diagrid also will soon include computers from Indiana University-Purdue University Fort Wayne and Purdue North Central.
Like HUBzero, Purdue’s use of Condor, the system underlying Diagrid, attracted a lot of interest at SC08. The University of Central Florida recently established its own high performance computing center, but, as at Purdue and other schools, many compute cycles elsewhere on campus go to waste, said Ravi Palaniappan and Joe Sottilare of Central Florida, which may look to Purdue’s model to capture a good number of them.
“One of the things we were told to look at was using Condor to harvest cycles,” said Palaniappan, who added that he was “given explicit instructions” to check out the SC08 talk on using Condor by Preston Smith, the Rosen Center’s senior UNIX administrator.
HUBzero was developed for nanoHUB, an online community and toolbox for nanotechnology development and education. But Purdue realized the underlying technology could be extracted for use by all sorts of scientific, technical and research communities, said Gerry McCartney, Purdue’s vice president for information technology and chief information officer. McCartney likened HUBzero to a great design for a library, just waiting for books to fill the shelves.
In HUBzero’s case, those books are scientific simulation and modeling tools in particular, but also things like interactive online classes and tutorials, question and answer forums, podcasts and more, McLennan said. More than a half dozen hubs, in fields ranging from cancer care and accelerating health care research in general to advanced manufacturing techniques and global engineering education, have now been developed using the technology.
In a packed presentation on nanoHUB at Purdue’s SC08 booth, Purdue electrical and computer engineering Professor Gerhard Klimeck said the resource now has 85,000 users in 172 countries, and is growing. The more than 120 simulation tools on nanoHUB, which users can employ immediately without installing any software, are being used for real science, Klimeck said. He noted that nanoHUB simulations have been cited more than 256 times in the scientific literature, mostly in peer-reviewed journals and proceedings. Klimeck also is technical director of the Network for Computational Nanotechnology at Purdue.
Writer: Greg Kline, (765) 494-8167, email@example.com
Sources: Gerry McCartney, (765) 496-2270, firstname.lastname@example.org
John Campbell, (765) 494-1289, email@example.com
Michael McLennan, (765) 494-6495, firstname.lastname@example.org
Gerhard Klimeck, (765) 494-9212, email@example.com