
ThinLinc lets Purdue researchers access community clusters from almost anywhere

  • Science Highlights

When Purdue Professor Michael Grant found himself in Ohio with nothing but a 4G LTE cellular connection to get back to campus, his research employing Purdue’s community cluster supercomputers did not grind to a halt.

Grant fired up a piece of software on his laptop called ThinLinc and kept working in the same graphical interface he prefers when using MATLAB on Purdue’s research supercomputers.

ThinLinc makes it possible to run graphical applications and environments remotely on the community clusters from a Mac, PC or Linux computer. Almost any software that runs on a cluster can also be used through ThinLinc, which Grant says makes it possible for him and his students to “log in all over the world and do our work.”

Community cluster users need only install the ThinLinc client on a remote computer, whether a desktop in their office or a laptop they’re taking on a trip, and then log in to their cluster account to get started.

“It’s easy; we basically had to do nothing,” says Grant, an assistant professor in the School of Aeronautics and Astronautics whose lab develops fast algorithms for flight control systems in advanced hypersonic vehicles that may fly through the atmosphere at more than 4,000 miles per hour.

More than 50 Purdue researchers are already using ThinLinc to run graphical applications such as MATLAB and RStudio, says Preston Smith, manager of research support for ITaP Research Computing (RCAC).

Purdue animal sciences Professor William Muir first prompted ITaP Research Computing (RCAC) to road-test ThinLinc. Muir makes heavy use of CLC bio bioinformatics software to analyze biological data, such as DNA and RNA sequences, in his research on animal genetics and genetically modified organisms. With the ThinLinc client, he says, he can access the community clusters graphically not only from his campus lab but just as easily when traveling or at home.

“It worked well enough for Professor Muir that lots of people have now been able to take advantage of it,” Smith says.

Grant and his students use MATLAB as the primary tool for developing algorithms they then convert to C code, compile and run on the clusters. The algorithms are designed to execute in a fraction of a second and don’t necessarily require a high-performance computing system, although Grant’s lab is beginning to work on harder problems that may.

However, employing a cluster in his research gives Grant and his students a common, group environment in which to work, including a Git repository for storing their code as it is revised. With ThinLinc, it is like his lab’s own cloud computing resource, Grant says.

ITaP Research Computing (RCAC) provides the infrastructure for the community clusters and installs, administers and maintains them, including software installation and user support, so researchers can concentrate on research rather than on running a high-performance computing system. The Community Cluster Program also maximizes use by sharing computing power among faculty partners whenever it is idle. Researchers always have ready access to the capacity they purchase, and potentially to much more if needed.

The community clusters are integrated with the Research Data Depot, a high-capacity, fast, reliable and secure data storage system designed, configured and operated for the needs of Purdue researchers and campus units in any field, and shareable with both on-campus and off-campus collaborators. A central solution for storing large, active research data sets at a competitive price, the Research Data Depot is available to Purdue researchers whether or not they are cluster users.

For more information about the community clusters and the Research Data Depot, visit www.rcac.purdue.edu, email rcac-help@purdue.edu or contact Preston Smith, psmith@purdue.edu or 49-49729.

Originally posted: