Snyder Scratch Move & Maintenance

November 7, 2017  8:00am – 5:00pm

UPDATE: November 7, 2017  6:42pm

As of 6:42pm, engineers have completed maintenance and returned the Snyder cluster to normal service. All queues have been enabled and jobs have resumed scheduling. Please report any issues.

ORIGINAL: October 10, 2017  10:53am

The Snyder scratch storage will move to a new storage system on Tuesday, November 7, 2017. Files will not be transferred automatically from your old scratch space to your new scratch space. Two new environment variables are available to help you make a smooth transition to the new space:

  • $CLUSTER_SCRATCH_NEW (new scratch space)
  • $CLUSTER_SCRATCH_OLD (old/existing scratch space)

The new scratch space ($CLUSTER_SCRATCH_NEW) is already available on all Snyder submission front-ends and compute nodes. During the maintenance on Tuesday, November 7, 2017, the $CLUSTER_SCRATCH variable will switch from matching $CLUSTER_SCRATCH_OLD to matching $CLUSTER_SCRATCH_NEW.
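The switch is transparent to scripts that reference $CLUSTER_SCRATCH rather than a hard-coded path. A minimal sketch of the cutover, using made-up paths (the real values on Snyder will differ):

```shell
# Stand-in values; the actual paths on Snyder will differ.
CLUSTER_SCRATCH_OLD=/scratch/old/myuser
CLUSTER_SCRATCH_NEW=/scratch/new/myuser

# Before the November 7 maintenance, $CLUSTER_SCRATCH matches the old space:
CLUSTER_SCRATCH=$CLUSTER_SCRATCH_OLD
echo "before maintenance: $CLUSTER_SCRATCH"

# After the maintenance, the same variable matches the new space, so job
# scripts that use $CLUSTER_SCRATCH keep working without edits:
CLUSTER_SCRATCH=$CLUSTER_SCRATCH_NEW
echo "after maintenance:  $CLUSTER_SCRATCH"
```

Referencing $CLUSTER_SCRATCH in job scripts, rather than a literal path, means they follow the switch automatically on November 7.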

The old scratch space will remain available through the $CLUSTER_SCRATCH_OLD variable until December 12, at which point it will be removed from Snyder. The $CLUSTER_SCRATCH_NEW and $CLUSTER_SCRATCH_OLD variables will also be removed on December 12.

Please be sure to copy any files you need from the old path to the new path before December 12.

During the maintenance on Tuesday, November 7, 2017 from 8:00am – 5:00pm, Snyder will also receive operating system patches, minor software upgrades, and improvements to ThinLinc support. Any PBS job that requests a walltime extending past 8:00am on Tuesday, November 7, 2017 will not start and will remain in the queue until after the maintenance is completed.
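The scheduler's rule here amounts to: submission time plus requested walltime must not cross 8:00am on November 7. A rough sketch of that check, with an illustrative submission time and walltime request (not a real scheduler query):

```shell
# Maintenance start, as announced.
maint_start=$(date -d "2017-11-07 08:00" +%s)

# Illustrative values: submitted the evening before, asking for 24 hours.
submit_time=$(date -d "2017-11-06 20:00" +%s)
walltime_sec=$((24 * 3600))

# A job whose requested end time falls after the maintenance start
# stays queued until the maintenance is over.
if [ $((submit_time + walltime_sec)) -gt "$maint_start" ]; then
  echo "job would run past the maintenance start; it will stay queued"
else
  echo "job fits before the maintenance window and can start"
fi
```

Shortening the request (for this example, anything up to 12 hours) would let the job start before the window instead of waiting until after the maintenance.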



© 2017 Purdue University | An equal access/equal opportunity university | Copyright Complaints | Maintained by ITaP Research Computing
