What is the Blueshark cluster?

Blueshark will be permanently shut down on July 29th, 2024.

The Blueshark Cluster is composed of:

  • 50 compute nodes
  • 13 big-memory nodes
  • 11 GPU nodes
  • 1 head node

totalling 1,720 cores and 4,397 GB of RAM. It was funded by the National Science Foundation (NSF) through the Major Research Instrumentation Grant.

The configuration of each of the 50 compute nodes is:

  • IBM System x iDataPlex dx360 M3
  • 2 x Hexa-Core Intel Xeon X5650 @ 2.67GHz CPUs
  • 24GB of RAM
  • 250GB SATA HDD
  • 1 GbE Interconnect

The head node configuration is:

  • IBM System x iDataPlex dx360 M3
  • 2 x Quad-Core Intel Xeon X5550 @ 2.67GHz CPUs
  • 24GB of RAM
  • LSI MegaRAID SAS Controller
  • Storage Expansion Unit
  • 8 x 1 TB 7200RPM SAS Hot-Swap HDD
  • 10 GbE link to compute nodes via Chelsio T310 10GbE Adapter
  • Redundant Hot-swap Power Supplies

There are 11 GPU compute nodes with this configuration:

  • Dell PowerEdge C4130
  • 2 x 10 core Intel Xeon E5-2650 @ 2.30GHz
  • 131GB of RAM
  • 1TB SATA HDD
  • 4 x Nvidia Tesla K40m
  • 1 GbE Interconnect
  • Mellanox InfiniBand Interconnect
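Taken together, the GPU partition's totals follow from simple multiplication. A quick sketch (the 12 GB per-card figure is the Tesla K40m's published memory size, not stated in this article):

```python
# GPU partition inventory: 11 nodes x 4 Tesla K40m each.
gpu_nodes = 11
gpus_per_node = 4
mem_per_gpu_gb = 12  # K40m card memory (from the card spec, assumed here)

total_gpus = gpu_nodes * gpus_per_node
total_gpu_mem_gb = total_gpus * mem_per_gpu_gb

print(total_gpus)         # 44 GPUs across the partition
print(total_gpu_mem_gb)   # 528 GB of GPU memory cluster-wide
```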

The 13 big-memory compute nodes have this configuration:

  • SuperMicro 1u Servers
  • 2 x 10 core Intel Xeon E5-2650 @ 2.30GHz
  • 131GB of RAM
  • 120GB SATA HDD
  • 1 GbE Interconnect

The storage node configuration is:

  • Dell PowerEdge R630
  • Dell PowerVault MD3060
  • 60 x 4TB HDD
  • 240TB raw capacity
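The raw-capacity figure above follows directly from the drive count. A minimal sketch, assuming decimal terabytes as drive vendors use, which also shows the corresponding binary (TiB) size an operating system would report:

```python
# Storage node raw capacity: 60 drives x 4 TB each.
drives = 60
tb_per_drive = 4

raw_tb = drives * tb_per_drive   # vendor (decimal) terabytes
raw_bytes = raw_tb * 10**12
raw_tib = raw_bytes / 2**40      # binary tebibytes

print(raw_tb)                # 240 TB raw, matching the spec above
print(round(raw_tib, 1))     # ~218.3 TiB as reported by an OS
```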

Other hardware resources:

  • 2 x BNT 48-port 1GbE switches with dual 10GbE
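The per-node-type CPU socket and core counts listed above can be tallied with a short script (the dictionary keys are illustrative labels, not hostnames). This counts only the CPU sockets itemized in this article, so it need not match the headline core figure, which may reflect other resources:

```python
# Tally CPU cores from the per-node specs listed above:
# (node count, sockets per node, cores per socket)
node_types = {
    "compute":    (50, 2, 6),   # 2 x hexa-core Xeon X5650
    "big_memory": (13, 2, 10),  # 2 x 10-core Xeon E5-2650
    "gpu":        (11, 2, 10),  # 2 x 10-core Xeon E5-2650
    "head":       (1,  2, 4),   # 2 x quad-core Xeon X5550
}

for name, (nodes, sockets, cores) in node_types.items():
    print(f"{name:>10}: {nodes * sockets * cores} cores")

total = sum(n * s * c for n, s, c in node_types.values())
print(f"     total: {total} CPU cores itemized above")
```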

Details

Article ID: 2110
Created
Sun 11/27/22 10:32 PM
Modified
Mon 3/4/24 1:23 PM