Instructions and hints on how to run the exercises for the MPI course
Where to run
The exercises will be run on PDC's Cray XC40 system Beskow:
beskow.pdc.kth.se
How to login
To access PDC's cluster during the course, use your laptop connected to the Eduroam or KTH Open wireless network.
See PDC's support documentation for instructions on how to connect from various operating systems.
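As a rough sketch, a typical login sequence first obtains a Kerberos ticket and then connects with ssh. The placeholder <username> and the Kerberos realm NADA.KTH.SE are assumptions here; check your PDC account details for the exact values:
# Get a forwardable Kerberos ticket (Heimdal)
kinit -f <username>@NADA.KTH.SE
# Log in to Beskow using the ticket
ssh <username>@beskow.pdc.kth.se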
More about the environment on Beskow
The Cray system automatically loads several modules at login:
- Heimdal - Kerberos commands
- OpenAFS - AFS commands
- SLURM - batch jobs and interactive jobs
- Programming environment - Compilers for software development
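You can inspect the module environment with the standard module commands:
# List the modules currently loaded in your session
module list
# List all modules available on the system
module avail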
Compiling MPI programs on Beskow
By default the Cray compiler (PrgEnv-cray) is loaded into your environment. To use another compiler, swap the programming environment modules:
module swap PrgEnv-cray PrgEnv-gnu
or
module swap PrgEnv-cray PrgEnv-intel
On Beskow one should always use the compiler wrappers cc, CC, or ftn (for C, C++, and Fortran codes, respectively), which automatically link to MPI libraries and to linear algebra libraries such as BLAS and LAPACK.
Examples:
# Fortran
ftn [flags] source.f90
# C
cc [flags] source.c
# C++
CC [flags] source.cpp
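As a quick check that the wrappers work, you can compile a minimal MPI test program. The file name example.c and the program below are an illustrative sketch, not part of the course material:
/* example.c: print one line per MPI task */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, size;
    MPI_Init(&argc, &argv);                 /* initialize the MPI runtime */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this task's id */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total number of tasks */
    printf("Hello from rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}
Compile it with the wrapper, naming the binary to match the srun example below:
cc example.c -o example.x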
Running MPI programs on Beskow
First it is necessary to book a node for interactive use:
salloc -A <allocation-name> -N 1 -t 1:0:0
You might also need to specify a reservation by adding the flag --reservation=<name-of-reservation>.
Then the srun command is used to launch an MPI application:
srun -n 32 ./example.x
In this example we start 32 MPI tasks (there are 32 cores per node on Beskow).
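For non-interactive runs you can instead submit a batch job to SLURM. Below is a minimal job script sketch, assuming the same allocation name as above and the example.x binary; adjust names and time limits to your case:
#!/bin/bash
#SBATCH -A <allocation-name>
#SBATCH -N 1
#SBATCH -t 1:00:00
# Launch 32 MPI tasks on the allocated node
srun -n 32 ./example.x
Submit the script with sbatch <script-name>.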
If you do not use srun and instead try to start your program on the login node, you will get an error similar to:
Fatal error in MPI_Init: Other MPI error, error stack:
MPIR_Init_thread(408): Initialization failed
MPID_Init(123).......: channel initialization failed
MPID_Init(461).......: PMI2 init failed: 1
MPI Exercises
- MPI Lab 1: Program Structure and Point-to-Point Communication in MPI
- MPI Lab 2: Collective and Non-Blocking Communication
- MPI Lab 3: Advanced Topics