ITaP-led team aims to get big data from scientific instruments to computational, storage resources faster, more reliably

A project led by ITaP Research Computing staff has been awarded a two-year, $323,000 grant from the National Science Foundation (NSF award #1827184) to build a high-speed network infrastructure that will quickly and reliably get big data from scientific instruments to storage and computational resources.

ITaP Research Computing Senior Research Scientist Carol Song is the project’s principal investigator, and Preston Smith, director of research services and support for ITaP Research Computing, is a co-principal investigator. Purdue faculty members Robin Tanamachi, assistant professor of earth, atmospheric and planetary sciences, and Wen Jiang, professor of biological sciences, are also co-PIs.

Researchers using scientific instruments increasingly generate very large amounts of data, and moving that data to resources such as the Data Depot storage array or Purdue’s community cluster research supercomputers for storage or computation requires tapping into high-speed research networks. To make that easier, this ITaP-led project is upgrading the network connections of select campus facilities, allowing researchers to move data at much higher speeds.

The upgrades will not only provide a faster network connection but will also follow the “Science DMZ” model, providing a more predictable connection that is optimized for scientific data flows.
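To put the bandwidth gains in perspective, here is a minimal back-of-the-envelope sketch in Python; the 3 TB example dataset and the 80 percent effective link utilization are illustrative assumptions, not figures from the project.

```python
# Rough transfer-time estimate for moving an instrument's daily output
# to storage or compute over links of different speeds.

def transfer_hours(data_tb: float, link_gbps: float, efficiency: float = 0.8) -> float:
    """Hours to move `data_tb` terabytes over a `link_gbps` link,
    assuming the link sustains `efficiency` of its nominal rate."""
    data_bits = data_tb * 1e12 * 8            # terabytes -> bits (decimal TB)
    effective_bps = link_gbps * 1e9 * efficiency
    return data_bits / effective_bps / 3600

for gbps in (1, 10, 100):
    print(f"3 TB over {gbps:>3} Gb/s: {transfer_hours(3, gbps):5.2f} h")
```

Under these assumptions, a 3 TB dataset takes roughly a full workday to move over a 1 Gb/s link but only about five minutes over a 100 Gb/s link.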

“We’re getting data from your instrument to where you need to work on it quicker and more reliably, and decreasing your time to science,” says Smith.

The campus instruments and facilities that are being upgraded with the NSF grant include:

  • The cryo-electron microscope that Jiang’s lab uses to study the molecular structure of viruses, which collects several terabytes of data per day.
  • The X-band weather radar system used by Tanamachi’s group for monitoring winds and precipitation in the lower atmosphere, which requires storing an archive of 360 TB of data.
  • ITaP’s Envision Center, which generates virtual reality simulations and visualizations that can take up to 12 gigabytes of space for a single 10-minute video.
