Access to the HPC cluster

  • Accounts
    • Access to the HPC cluster at TUHH-RZ must be applied for with the following (PDF) form ("Benutzerantrag"). Please tick "HPC" on the rear side and return the form to the USC.
  • Access hosts
    • The login nodes are accessed via the SSH protocol (Linux: ssh username@hpc5.rz.tu-harburg.de, Windows: PuTTY). Data transfer is done with scp or sftp (Windows: WinSCP).
    • SSH host keys may vary over time and between servers, but they are signed by a certificate authority.
      Please add the content of this file to your known_hosts file (/etc/ssh/ssh_known_hosts or ~/.ssh/known_hosts) to ensure you are really connecting to our servers.
    • The login nodes are usually accessed from within the TUHH network.
      hpc1.rz.tuhh.de can be accessed worldwide, in particular for data transfer, with restrictions concerning availability and performance.
    • The login nodes are for interactive use (pre- and post-processing, building software, and the like); the compute nodes are accessible only via the batch system.
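A minimal sketch of trusting the cluster host keys before the first login. The key line below is a shortened, made-up example (including the host pattern), not the real TUHH key; in practice, append the published key file linked above instead.

```shell
# Stand-in for ~/.ssh/known_hosts so the sketch can be run anywhere:
known_hosts="./known_hosts.demo"

# A CA-signed host key is trusted via an @cert-authority line
# (example pattern and key material, not the real TUHH entries):
echo '@cert-authority *.rz.tu-harburg.de ssh-ed25519 AAAAC3...example' >> "$known_hosts"

# Afterwards you can connect and transfer data without host-key warnings:
#   ssh username@hpc5.rz.tu-harburg.de
#   scp input.tar.gz username@hpc1.rz.tuhh.de:/work/username/
grep -c '@cert-authority' "$known_hosts"
```

The `@cert-authority` marker tells OpenSSH to accept any host key signed by the listed authority for matching host names, which is why the individual fingerprints may differ between servers.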

Best practice for using the HPC clusters

The HPC cluster uses the free batch system SLURM for submitting and handling compute jobs.

Each user therefore has to formulate a compute job as a Bash script with special directives for SLURM. With these directives a user requests certain resources (number of CPU cores, time, memory) from the cluster. SLURM uses this information to start the compute job when the requested hardware becomes available.
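Such a batch script might look like the following sketch. The resource values and the solver name (my_solver) are placeholders, not site defaults; adjust them to your own job.

```shell
# Write a minimal SLURM batch script; all values are illustrative.
cat > job.sh <<'EOF'
#!/bin/bash
#SBATCH --job-name=my_simulation   # name shown in squeue output
#SBATCH --ntasks=8                 # number of CPU cores (tasks)
#SBATCH --time=02:00:00            # requested wall-clock time hh:mm:ss
#SBATCH --mem-per-cpu=2G           # requested memory per core
#SBATCH --output=slurm-%j.out      # log file, %j is replaced by the job id

# Everything below runs on the allocated compute node(s):
srun ./my_solver input.dat         # my_solver is a placeholder program
EOF
```

Submit the script with `sbatch job.sh`; SLURM queues the job until the requested resources are free.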

The typical steps of a scientific simulation are as follows:

  • Generate a model (e.g. a Matlab script or an Ansys case) and do the graphical preprocessing (if applicable) on your personal workstation.
  • Make yourself familiar with the command-line handling of your software, e.g. with a short run on a Linux computer.
  • Copy the input data with scp onto the HPC cluster, e.g. to your directory below /work.
  • SSH into a login node (e.g. hpc5.rz.tu-harburg.de) and generate a batch script.
  • Submit the batch script with the command sbatch <scriptname>. Helpful commands include squeue, scancel and sview.
  • Wait until the job has finished.
  • Copy the results back to your personal workstation.
  • Evaluate the results and do the graphical postprocessing (if applicable) on your local workstation.
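The submission and monitoring steps above can be sketched as follows. The sbatch output line is a canned example so the job-id extraction can be shown without a live cluster; the actual id will of course differ.

```shell
# sbatch acknowledges a submission with a line of this form:
submit_output="Submitted batch job 12345"   # canned example of sbatch output
jobid=${submit_output##* }                  # extract the trailing job id
echo "$jobid"

# On the cluster you would then use:
#   squeue -u <username>    # show your queued and running jobs
#   scancel "$jobid"        # cancel the job if necessary
```

Keeping the job id around makes it easy to cancel or inspect a specific job among several queued ones.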