# Instructions and hints on how to run for the MPI course

## Where to run

The exercises will be run on PDC's Cray XC40 system Beskow:

```
beskow.pdc.kth.se
```

## How to login

To access PDC's clusters, use your laptop on either the Eduroam or the KTH Open wireless network.

PDC's support pages have instructions on how to connect from various operating systems.
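
Once you are on one of these networks, login is over SSH. A minimal sketch, assuming a plain SSH login is permitted (`<username>` is a placeholder for your PDC account name; PDC normally uses Kerberos-based authentication, so additional setup may be needed first):

```
# Log in to Beskow; <username> stands for your PDC account name.
ssh <username>@beskow.pdc.kth.se
```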

## More about the environment on Beskow

    The Cray automatically loads several modules at login.
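
To inspect the environment, the standard module commands work as usual, for example:

```
# List the modules currently loaded in your session:
module list

# List all modules available on the system:
module avail
```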

## Running MPI programs on Beskow

    First it is necessary to book a node for interactive use:

```
salloc -A <allocation-name> -N 1 -t 1:0:0
```
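
When the allocation is granted you get a new shell prompt with the job environment set up; the node is yours until the time limit expires or you type `exit`.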

Then the `srun` command is used to launch an MPI application:

```
srun -n 32 ./example.x
```

This example starts 32 MPI tasks (each Beskow node has 32 cores).
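
The program itself must first be compiled. A minimal sketch using the Cray compiler wrappers, which add the MPI headers and libraries automatically (`hello.c` is a hypothetical source file used here for illustration):

```
# Compile an MPI program with the Cray C compiler wrapper:
cc hello.c -o example.x

# Then launch it on all 32 cores of the booked node:
srun -n 32 ./example.x
```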

If you do not use `srun` and instead try to start your program on the login node, you will get an error similar to:

```
Fatal error in MPI_Init: Other MPI error, error stack:
MPIR_Init_thread(408): Initialization failed
MPID_Init(123).......: channel initialization failed
MPID_Init(461).......:  PMI2 init failed: 1
```
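
If you run into this error, first check that you actually have an active allocation; assuming the standard Slurm tools are available:

```
# List your queued and running jobs; the interactive job from
# salloc should show up here while the allocation is active.
squeue -u $USER
```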

## MPI Exercises