Commit b6974b0a authored by Jan Siwiec

Update running-mpich2.md

parent fe08afe1
Pipeline #11121 passed with stages in 23 minutes and 35 seconds
@@ -2,7 +2,7 @@
## MPICH2 Program Execution
The MPICH2 programs use mpd daemon or ssh connection to spawn processes, no PBS support is needed. However the PBS allocation is required to access compute nodes. On Anselm, the **Intel MPI** and **mpich2 1.9** are MPICH2 based MPI implementations.
The MPICH2 programs use the MPD daemon or an SSH connection to spawn processes; no PBS support is needed. However, a PBS allocation is required to access compute nodes. On Anselm, **Intel MPI** and **mpich2 1.9** are MPICH2-based MPI implementations.
### Basic Usage
@@ -26,7 +26,7 @@ $ mpirun -ppn 1 -hostfile $PBS_NODEFILE ./helloworld_mpi.x
In this example, we allocate 4 nodes via the express queue interactively. We set up the Intel MPI environment and interactively run the helloworld_mpi.x program, requesting MPI to spawn 1 process per node.
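A minimal sketch of these steps might look as follows; the qexp queue name, the `select=4:ncpus=16` chunk specification, and the `impi` module name are assumptions for illustration, not taken from this commit:

```console
$ qsub -q qexp -l select=4:ncpus=16 -I   # interactive allocation of 4 nodes (assumed queue and resources)
$ module load impi                       # set up the Intel MPI environment (assumed module name)
$ mpirun -ppn 1 -hostfile $PBS_NODEFILE ./helloworld_mpi.x
```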
Note that the executable helloworld_mpi.x must be available at the same path on all nodes. This is automatically fulfilled on the /home and /scratch filesystems.
You need to preload the executable, if running on the local scratch /lscratch filesystem
You need to preload the executable if running on the local scratch /lscratch filesystem:
```console
$ pwd
@@ -39,7 +39,7 @@ $ mpirun -ppn 1 -hostfile $PBS_NODEFILE ./helloworld_mpi.x
Hello world! from rank 3 of 4 on host cn110
```
In this example, we assume the executable helloworld_mpi.x is present on shared home directory. We run the cp command via mpirun, copying the executable from shared home to local scratch . Second mpirun will execute the binary in the /lscratch/15210.srv11 directory on nodes cn17, cn108, cn109 and cn110, one process per node.
In this example, we assume the executable helloworld_mpi.x is present in the shared home directory. We run the cp command via mpirun, copying the executable from the shared home to the local scratch. The second mpirun then executes the binary in the /lscratch/15210.srv11 directory on nodes cn17, cn108, cn109, and cn110, one process per node.
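A minimal sketch of this preload-and-run sequence, assuming the executable lives at /home/username/helloworld_mpi.x (the path is illustrative):

```console
$ pwd
/lscratch/15210.srv11
$ mpirun -ppn 1 -hostfile $PBS_NODEFILE cp /home/username/helloworld_mpi.x .   # copy the binary to local scratch on every node
$ mpirun -ppn 1 -hostfile $PBS_NODEFILE ./helloworld_mpi.x                     # run the local copy, one process per node
```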
!!! note
MPI process mapping may be controlled by PBS parameters.
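For instance, the `mpiprocs` chunk resource in the PBS selection statement determines how many times each node is listed in $PBS_NODEFILE, and therefore how many processes mpirun places on it; the values below are illustrative only:

```console
$ qsub -q qexp -l select=4:ncpus=16:mpiprocs=16 -I   # 16 MPI processes per node (assumed queue and resources)
```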