PDC Summer School 2021: General Instructions for the MPI Labs

Where to run

The exercises will be run on PDC's cluster Tegner:

tegner.pdc.kth.se

How to login

To access PDC's systems you need an account at PDC. Check the instructions for obtaining an account.

Once you have an account, you can follow the instructions on how to connect from various operating systems.

For details on the Kerberos-based authentication environment, see the Kerberos commands documentation.
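
A typical login session, as a minimal sketch (assuming your PDC username is <username> and that the Kerberos realm is NADA.KTH.SE; check the PDC documentation for the current realm), looks like:

kinit --forwardable <username>@NADA.KTH.SE
ssh <username>@tegner.pdc.kth.se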

More about the environment on Tegner

Software that is not available by default needs to be loaded as a module after login. Use module avail to get a list of available modules. The following modules are of interest for these lab exercises:

  • Different versions of OpenMPI based on different versions of the GNU compiler suite (openmpi/*)

For more information see the software development documentation page.
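
For example, to narrow the module listing down to the OpenMPI builds mentioned above (a sketch; the exact module names depend on the installed software stack):

module avail            # list all available modules
module avail openmpi    # list only modules matching "openmpi"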

Home directories are provided through the OpenAFS service. See the AFS data management page for more information.

To use the Tegner compute nodes you have to submit SLURM batch jobs or run SLURM interactive jobs.
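
For the batch route, a minimal job script might look like the sketch below; the allocation name is a placeholder, and the module versions are the ones used later in this README:

#!/bin/bash
#SBATCH -A <allocation-name>   # compute allocation to charge
#SBATCH -N 1                   # number of nodes
#SBATCH -t 0:10:00             # wall-clock time limit

module load gcc/8.2.0
module load openmpi/4.0-gcc-8.2

mpirun -n 24 ./example.x

The script would then be submitted with sbatch <script-name>.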

Compiling MPI programs on Tegner

The following shell commands show a simple example of how to compile an MPI program written in Fortran, C or C++:

module load gcc/8.2.0
module load openmpi/4.0-gcc-8.2
mpif90 my_prog.f90
mpicc my_prog.c
mpicxx my_prog.cc

The first two lines load a specific version of the GNU compiler suite and a matching build of OpenMPI; mpif90, mpicc and mpicxx then invoke the MPI compiler wrappers for Fortran, C and C++, respectively.
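
As a test program for these wrappers, here is a minimal MPI "hello world" in C (a sketch; save it as my_prog.c to match the mpicc command above):

#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[])
{
    int rank, size;

    MPI_Init(&argc, &argv);                 /* start the MPI runtime */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* id of this task */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total number of tasks */

    printf("Hello from rank %d of %d\n", rank, size);

    MPI_Finalize();                         /* shut down MPI */
    return 0;
}

Compiling it with mpicc my_prog.c -o my_prog.x produces an executable that can be launched as described in the next section.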

Running MPI programs

First it is necessary to book a node for interactive use:

salloc -A <allocation-name> -N 1 -t 1:0:0

You might also need to specify a reservation by adding the flag --reservation=<name-of-reservation>.

Then the mpirun command is used to launch an MPI application:

mpirun -n 24 ./example.x

This starts 24 MPI tasks (there are 24 cores per node on the Tegner thin nodes).
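
Running the hello-world sketch from the compilation section with four tasks would, for instance, print one line per task (the ordering of the lines is not deterministic):

mpirun -n 4 ./my_prog.x
Hello from rank 1 of 4
Hello from rank 0 of 4
Hello from rank 3 of 4
Hello from rank 2 of 4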

MPI Exercises