Revealing the secrets of Zika virus only the latest breakthrough enabled by Purdue’s community clusters
When the Zika virus became a global health crisis in early 2016, Purdue researchers were well prepared to make a breakthrough discovery in efforts to fight the virus. They had impressive virus expertise and plenty of high-performance computing power — thanks to an ambitious supercomputer building program Purdue began in 2008.
Purdue’s award-winning Community Cluster Program played a significant role in enabling the research team to create the first detailed, 3-D structural map of the Zika virus, a key step toward developing treatments.
The technique the team used to map the virus structure combined cryo-electron microscopy and high-performance computing. With the microscope, the Purdue researchers capture images of many Zika virus particles, or individual instances of the virus. They then use Purdue’s community cluster supercomputers to turn those images into a unified, finely detailed picture of the overall virus structure, opening a window on how it works.
The Zika structure is the latest in a continuing series of notable research enabled by the Community Cluster Program. Each year since 2008 Purdue has built a new research cluster, seven of which have ranked among the world's top 500 supercomputers, with another planned for this fall. The program, managed by ITaP, gives Purdue's faculty the best collection of high-performance computing resources on any single campus in the country.
"Purdue’s goal is to have global impact and the Zika research wasn’t the first time our high-performance computing capability has been an integral part of doing that, nor will it be the last," says Gerry McCartney, Purdue’s vice president for information technology and chief information officer. "We planned to be a leader in enabling research driven by high-performance computing when we started the Community Cluster Program, and we plan to continue to be."
Little is known about Zika virus — yet. With its structure mapped, researchers can begin to identify potential targets for drugs to treat the virus or for vaccines against it. The virus structure could not have been mapped, not in a timely fashion anyway, without a tool like the community clusters.
"We have tens of thousands of pictures of the Zika virus and each one is in a different, random orientation," says Michael Rossmann, Purdue’s Hanley Distinguished Professor of Biological Sciences. "If we can find out the relative orientations of all those thousands of particles, then we can put those together to make a three-dimensional image."
High-performance computing — Purdue's Snyder community cluster in this instance — is employed first to whittle the collection down to the best images, from 60,000 to 10,000 in the case of Zika. Next, the computer links similarly oriented views of the virus to each other, much like sorting pieces of a very large puzzle. The cluster then assembles those puzzle pieces into a 3-D image of the virus structure at near-atomic resolution.
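The three steps described above — filtering out poor images, grouping similarly oriented views, and combining them to boost detail — can be sketched in miniature. The sketch below is purely illustrative: it uses 1-D toy "images" with made-up quality scores and known orientation angles, not real cryo-EM micrographs or the actual reconstruction software the Purdue team ran, and all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for cryo-EM particle images: a shared underlying signal
# buried in heavy noise, each "image" tagged with a hypothetical quality
# score and orientation angle. (Real pipelines work on 2-D micrographs
# and must estimate the orientations themselves.)
n_particles = 600
signal = np.sin(np.linspace(0, 2 * np.pi, 64))
quality = rng.uniform(0, 1, n_particles)
angles = rng.uniform(0, 180, n_particles)
images = signal[None, :] + rng.normal(0, 1.0, (n_particles, 64))

# Step 1: whittle the collection down to the better-scoring particles
# (analogous to going from 60,000 images to 10,000 for Zika).
keep = quality > np.quantile(quality, 0.5)
images, angles = images[keep], angles[keep]

# Step 2: group similarly oriented views into angular bins,
# like sorting puzzle pieces by shape.
bins = np.digitize(angles, np.linspace(0, 180, 10))

# Step 3: average the views within each bin; averaging many aligned
# noisy views suppresses the random noise, the same principle that
# lets the reconstruction reach near-atomic resolution.
class_averages = np.array([images[bins == b].mean(axis=0)
                           for b in np.unique(bins)])

# The class averages track the underlying signal far better than any
# single raw image does.
err_raw = np.abs(images[0] - signal).mean()
err_avg = np.abs(class_averages[0] - signal).mean()
print(f"raw error {err_raw:.2f}, class-average error {err_avg:.2f}")
```

The payoff is in the last two lines: a single noisy image deviates substantially from the true signal, while the average over a few dozen similarly oriented views lands much closer, which is why collecting tens of thousands of particle images matters.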
"Then, if we’re interested in how the virus can be inhibited, neutralized, we have to do many structures, each one of which is like that, of the virus complex interacting with antibodies," says Rossmann, who led the research team with Richard Kuhn, director of Purdue’s new Institute for Inflammation, Immunology and Infectious Diseases.
There are now more than 180 faculty partners and hundreds of researchers from their labs in all of Purdue’s primary colleges and schools using the community clusters for research spanning more than 30 science, engineering and social science disciplines.
"When we built the first community cluster in 2008, we knew the demand was there among our faculty researchers in a number of fields, for example in engineering and the physical sciences," says Donna Cumberland, ITaP’s executive director for research computing. "While Zika virus wasn’t on the radar then, we could foresee the breadth and variety of research being done on the clusters today."
Besides mapping viruses, Purdue’s community clusters are being used, among many other things, to:
- Find out why our eyes deteriorate as we age, a step toward developing ways of forestalling the decline.
- Model how interactions with marine life can affect the health of the oceans and the organisms swimming in them.
- Create new drugs to fight diseases like cancer and diabetes, as well as find new uses for existing drugs, which can move new treatments from lab to clinic faster.
- Determine how climate change affects severe weather like thunderstorms and tornadoes, making them more frequent and more intense.
- Continue to engineer new generations of more capable electronics, including aiding development of the world’s first single-atom transistor and nanowires just four atoms wide and one tall.
- Help probe one of physics’ most historic discoveries — the Higgs boson — by processing data from the Compact Muon Solenoid detector in the Large Hadron Collider. In a recent review, leaders of the international project praised the computational capability Purdue makes available as well as its cost effectiveness.
Built in 2015 and expanded twice already, Snyder is designed for memory-intensive life science research like the Zika virus effort. It aligns with Purdue’s new Pillars of Excellence in the Life Sciences Initiative, which aims to produce preeminent contributions in infectious diseases as well as integrative neurosciences and inflammation and immunology.
The Zika team also uses the Research Data Depot, Purdue’s storage cluster for moving and sharing active research data sets, and the Fortress archive for long-term storage, nearly 20 terabytes worth of data on Zika so far. Movement of data from the microscope to the clusters and storage is eased by a high-speed research network pipeline connecting the instrument and computational resources.
"We’re talking about terabytes per day," says Thomas Klose, a postdoctoral researcher in Rossmann’s lab. "We generate a lot of raw data, we generate a lot of processed data, and all of that needs to be available for future use."
Built in collaboration with Purdue faculty, the community clusters are operated by ITaP Research Computing. Centralizing purchasing negotiation through the Office of the CIO continues to yield excellent hardware prices through economies of scale, and ITaP provides the infrastructure, power, cooling and expert staff under the “condo computing” model.
Researchers always have ready access to the cluster capacity they purchase, but they also can share capacity that fellow researchers aren’t using at the moment. This can give them access to substantially more computational power when needed, and it keeps the machines busy. No one who’s become a partner in the program has ever left it, despite a standing offer from ITaP to buy them out if they want.
For information about the Community Cluster Program, the Research Data Depot and other research computing systems and services, email firstname.lastname@example.org or contact Preston Smith, ITaP’s director of research services and support, at email@example.com or 49-49729.