# Instructions and hints on how to run the exercises for the MPI course

## Where to run

The exercises will be run on Beskow, PDC's Cray XC40 system:

```
beskow.pdc.kth.se
```

## How to log in

To access PDC's cluster you should use your laptop and the Eduroam or KTH Open wireless networks.

Instructions on how to connect from various operating systems are available in PDC's documentation.
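
As a minimal sketch, assuming a standard SSH client (the username below is a placeholder, and depending on PDC's authentication setup additional steps such as obtaining a Kerberos ticket may be required first):

```
# Log in to the Beskow login node over SSH
# (<username> is a placeholder for your PDC username)
ssh <username>@beskow.pdc.kth.se
```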

## More about the environment on Beskow

The Cray automatically loads several modules at login.
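
To see which modules are currently loaded in your environment, you can list them with the standard module command:

```
# List the modules loaded in the current environment
module list
```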

## Compiling MPI programs on Beskow

By default the Cray compiler environment (PrgEnv-cray) is loaded. In order to use another compiler you have to swap compiler modules:

```
module swap PrgEnv-cray PrgEnv-gnu
```

or

```
module swap PrgEnv-cray PrgEnv-intel
```
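
After swapping, you can check which compiler the wrappers now invoke. For example, with PrgEnv-gnu loaded the cc wrapper forwards to gcc, so the GNU-style version flag works:

```
# Show the underlying compiler behind the cc wrapper
# (assumes PrgEnv-gnu is loaded)
cc --version
```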

On Beskow one should always use the compiler wrappers `cc`, `CC` and `ftn` (for C, C++ and Fortran codes, respectively), which automatically link in the MPI libraries as well as linear algebra libraries such as BLAS and LAPACK.

Examples:

```
# Fortran
ftn [flags] source.f90
# C
cc [flags] source.c
# C++
CC [flags] source.cpp
```
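
As a concrete sketch, compiling a C source file into the executable name used in the run commands below (the file name example.c is just a placeholder for your own source):

```
# Compile an MPI C program with the Cray wrapper and name the executable example.x
cc -O2 example.c -o example.x
```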

## Running MPI programs on Beskow

First it is necessary to book a node for interactive use:

```
salloc -A <allocation-name> -N 1 -t 1:0:0
```

You might also need to specify a reservation by adding the flag `--reservation=<name-of-reservation>`.
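
Putting the pieces together, a complete interactive request could look like the following (the allocation and reservation names here are purely hypothetical placeholders; use the ones given for your course):

```
# Book one node for one hour under a hypothetical allocation and reservation
salloc -A edu-mpi-course -N 1 -t 1:0:0 --reservation=mpi-lab
```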

Then the srun command is used to launch an MPI application:

```
srun -n 32 ./example.x
```

In this example we start 32 MPI tasks (there are 32 cores per node on Beskow).
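
To quickly verify that the allocation works and that tasks really run on a compute node rather than the login node, you can launch a simple command such as hostname through srun:

```
# Start 32 tasks that each print the node they run on,
# then count how many landed on each node
srun -n 32 hostname | sort | uniq -c
```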

If you do not use srun but instead try to start your program directly on the login node, you will get an error similar to:

```
Fatal error in MPI_Init: Other MPI error, error stack:
MPIR_Init_thread(408): Initialization failed
MPID_Init(123).......: channel initialization failed
MPID_Init(461).......:  PMI2 init failed: 1
```

## MPI Exercises