
Instructions and hints on how to run the exercises for the MPI course

Where to run

The exercises will be run on PDC's Cray XC40 system Beskow:

beskow.pdc.kth.se

How to log in

To access PDC's cluster, use your laptop on the Eduroam or KTH Open wireless networks.

See PDC's documentation for instructions on how to connect from various operating systems.

More about the environment on Beskow

The Cray automatically loads several modules at login.

Running MPI programs on Beskow

First it is necessary to book a node for interactive use:

salloc -A <allocation-name> -N 1 -t 1:0:0

Then the aprun command is used to launch an MPI application:

aprun -n 32 ./example.x

In this example, 32 MPI tasks are started (there are 32 cores per node on Beskow).
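
For reference, a minimal MPI program of the kind launched above could look like the sketch below. This hello-world source is purely illustrative and not one of the exercise files; on the Cray, the compiler wrappers cc (C) and ftn (Fortran) typically link the MPI library automatically.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, size;

    /* Initialize MPI; this is the call that fails if the program
       is started without aprun (see the error below) */
    MPI_Init(&argc, &argv);

    /* Each task finds out its rank and the total number of tasks */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    printf("Hello from rank %d of %d\n", rank, size);

    MPI_Finalize();
    return 0;
}

Launched with aprun -n 32, this prints one line per task, 32 lines in total.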

If you do not use aprun and instead try to start your program on the login node, you will get an error similar to:

Fatal error in MPI_Init: Other MPI error, error stack:
MPIR_Init_thread(408): Initialization failed
MPID_Init(123).......: channel initialization failed
MPID_Init(461).......:  PMI2 init failed: 1
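
If you want to check where your tasks actually run, a small variation of the sketch above (again illustrative, not part of the exercises) uses MPI_Get_processor_name to print the host name seen by each rank; when launched correctly with aprun it should report a compute node rather than the login node.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    char name[MPI_MAX_PROCESSOR_NAME];
    int rank, len;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Returns the name of the node this task is running on */
    MPI_Get_processor_name(name, &len);
    printf("Rank %d runs on %s\n", rank, name);

    MPI_Finalize();
    return 0;
}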

MPI Exercises
