
Balena HPC cluster



Factsheet


Balena HPC shutdown

IMPORTANT: Action required

14 November 2022: We will soon begin decommissioning the Balena HPC cluster. Find out when core services will be cut off and what you need to do to secure your data.

Hardware specifications

Processor and memory

                              Intel Ivybridge         Intel Skylake
Total number of nodes         196                     17
Number of CPU cores           3,136                   408
Memory per core               4GB, 8GB or 32GB        8GB
Memory per node               64GB, 128GB or 512GB    192GB
Total memory                  15.8TB                  3.2TB
Performance (Rpeak)           63.2 TFLOPS             16.0 TFLOPS (AVX2) or 23.3 TFLOPS (AVX512)
Linpack performance (HPL)     87.3% of Rpeak          -
  • Intel Ivybridge nodes are dual-socket nodes with Intel Xeon E5-2650 v2 processors (2.6GHz, 8 cores each)
  • Intel Skylake nodes are dual-socket nodes with Intel Xeon Gold 6126 processors (2.6GHz, 12 cores each)
  • Two high-capacity nodes with 512GB of memory
  • Two high-end visualisation nodes

Network

  • Intel TrueScale InfiniBand (40Gbit/s)
  • 10GbE for data and 1GbE for management

Multi-core accelerators

  • NVIDIA Tesla K20X (Kepler)
  • NVIDIA Tesla P100 (Pascal)
  • AMD FirePro S10000
  • Intel Xeon Phi 5110P

Storage

  • 5GB home area per user
  • 0.6PB of non-archival parallel filesystem based on BeeGFS
  • Research data storage available on login nodes

Software environment

  • Scientific Linux operating system (based on Red Hat)
  • SLURM batch scheduler
  • Module-based software environments (a usage sketch follows this list)
  • Intel and GNU C and Fortran compilers
  • Intel and open-source maths and MPI libraries
  • Intel and Allinea debugging and profiling tools
  • Allinea Performance Reports tool
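
As a rough sketch of the module-based workflow, the commands below show how a user might list and load software modules and then build an MPI program. The module names (intel, openmpi) and the source file name are placeholders for illustration, not Balena's actual module list; run module avail on the cluster to see what is really installed.

    # List the software modules available on the cluster
    module avail

    # Load a compiler and an MPI stack (module names are illustrative)
    module load intel openmpi

    # Build an MPI code with the MPI compiler wrapper
    mpicc -O2 -o my_program my_program.c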

Access

  • Connect via Secure Shell (SSH) or visualisation service
  • Interactive test and development sessions
  • Workloads submitted to the batch scheduler (a sample submission script follows this list)
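
Because production work runs through the SLURM batch scheduler, a job is described in a short submission script. The sketch below is a minimal, generic example: the partition name, resource requests and module names are assumptions for illustration and do not reflect Balena's actual queue configuration; the per-node task count simply matches the 16 cores of an Ivybridge node.

    #!/bin/bash
    #SBATCH --job-name=my_job          # name shown in the queue
    #SBATCH --nodes=2                  # compute nodes requested
    #SBATCH --ntasks-per-node=16       # MPI ranks per node (16 cores per Ivybridge node)
    #SBATCH --time=00:10:00            # wall-clock limit (hh:mm:ss)
    #SBATCH --partition=batch          # placeholder partition name

    # Recreate the build environment (module names are illustrative)
    module load intel openmpi

    # Launch the MPI program across the allocated cores
    mpirun ./my_program

The script would then be submitted with sbatch and monitored with squeue -u $USER.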

Enquiries

If you have any questions, please contact us.

