Overview of Coates

Coates was a compute cluster operated by ITaP and a member of Purdue's Community Cluster Program. ITaP installed Coates on July 21, 2009; at the time, it was the largest entirely 10 Gigabit Ethernet (10GigE) academic cluster in the world. Coates consisted of 982 64-bit, 8-core Hewlett-Packard ProLiant systems and 11 64-bit, 16-core Hewlett-Packard ProLiant DL585 G5 systems, with between 16 GB and 128 GB of memory per node. All nodes had 10GigE interconnects and a 5-year warranty. Coates was decommissioned on September 30, 2014.

[Video: Coates Installation Day]

Coates is named in honor of Ben Coates, former head of Electrical Engineering and founder of both the Computer Engineering degree program and the Engineering Computer Network (ECN) at Purdue. More information about his life and impact on Purdue is available in an ITaP Biography of Ben Coates.

Coates consisted of five logical sub-clusters, each with a different memory/storage configuration as described in the following table. All nodes in the cluster featured 10 Gigabit Ethernet (10GigE).

Detailed Hardware Specification

Sub-Cluster   Nodes   Processors per Node                        Cores per Node   Memory per Node   Interconnect   Disk
Coates-A      640     Two 2.5 GHz quad-core AMD Opteron 2380     8                32 GB             10 GigE        500 GB
Coates-B      45      Two 2.5 GHz quad-core AMD Opteron 2380     8                32 GB             10 GigE        2 TB
Coates-C      264     Two 2.5 GHz quad-core AMD Opteron 2380     8                16 GB             10 GigE        500 GB
Coates-D      33      Two 2.5 GHz quad-core AMD Opteron 2380     8                16 GB             10 GigE        2 TB
Coates-E      11      Four 2.5 GHz quad-core AMD Opteron 8380    16               128 GB            10 GigE        2 TB

Coates nodes ran Red Hat Enterprise Linux 5 (RHEL5) and used Moab Workload Manager 7 with TORQUE Resource Manager 4 as the portable batch system (PBS) for resource and job management. Coates also ran jobs for BoilerGrid whenever its processor cores would otherwise have sat idle.
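A batch job on a TORQUE/Moab system like Coates would have been submitted with a PBS script along these lines. This is a minimal sketch: the queue name, job name, and application binary are illustrative assumptions, not recorded Coates settings.

```shell
#!/bin/bash
#PBS -N example_job          # job name (illustrative)
#PBS -l nodes=2:ppn=8        # request two 8-core nodes, 16 cores total
#PBS -l walltime=01:00:00    # one-hour wall-clock limit
#PBS -q standby              # queue name is an assumption, not a documented Coates queue

# TORQUE starts the job in the user's home directory;
# change back to the directory the job was submitted from.
cd "$PBS_O_WORKDIR"

# Launch one MPI rank per requested core (my_mpi_app is a placeholder binary).
mpiexec -n 16 ./my_mpi_app
```

The script would be submitted with `qsub job.sh`, and its status checked with `qstat`; the `#PBS` lines are directives read by TORQUE, not executed by the shell.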
