Commit 1eec6847 authored by Jan Siwiec

Merge branch 'hol0598-master-patch-83551' into 'master'

Update orca.md

See merge request !482
Parents: 1d1a597d 8b11a657
@@ -31,12 +31,12 @@ Next, create a Slurm submission file for Karolina cluster (interactive job can b
 #!/bin/bash
 #SBATCH --job-name=ORCA_SERIAL
 #SBATCH --nodes=1
 #SBATCH --ntasks-per-node=128
-#SBATCH --partition=qexp
+#SBATCH --partition=qcpu_exp
 #SBATCH --time=1:00:00
 #SBATCH --account=OPEN-0-0

-ml ORCA/5.0.1-OpenMPI-4.1.1
-srun orca orca_serial.inp
+ml ORCA/6.0.0-gompi-2023a-avx2
+orca orca_serial.inp
 ```

 Submit the job to the queue.
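For reference, the `orca_serial.inp` file the script runs is outside the displayed hunk. A minimal single-point input in the spirit of the parallel example further down this diff might look as follows; the HF/SVP method and the CO geometry are illustrative assumptions, not taken from the commit:

```bash
# Hypothetical orca_serial.inp: a serial single-point calculation.
! HF SVP

* xyz 0 1
  C 0 0 0
  O 0 0 1.13
*
```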
@@ -56,34 +56,39 @@ $ cat ORCA_SERIAL.o1417552
                  *****************
                  * O R C A *
                  *****************

 --- An Ab Initio, DFT and Semiempirical electronic structure package ---

-          #######################################################
-          #                        -***-                        #
-          #   Department of molecular theory and spectroscopy   #
-          #              Directorship: Frank Neese              #
-          # Max Planck Institute for Chemical Energy Conversion #
-          #                D-45470 Muelheim/Ruhr                #
-          #                       Germany                       #
-          #                                                     #
-          #                  All rights reserved                #
-          #                        -***-                        #
-          #######################################################
-
-          Program Version 5.0.1 - RELEASE -
+         #########################################################
+         #                          -***-                         #
+         #          Department of theory and spectroscopy         #
+         #                                                        #
+         #                      Frank Neese                       #
+         #                                                        #
+         #       Directorship, Architecture, Infrastructure       #
+         #                     SHARK, DRIVERS                     #
+         #          Core code/Algorithms in most modules          #
+         #                                                        #
+         #       Max Planck Institute fuer Kohlenforschung        #
+         #                 Kaiser Wilhelm Platz 1                 #
+         #                  D-45470 Muelheim/Ruhr                 #
+         #                         Germany                        #
+         #                                                        #
+         #                   All rights reserved                  #
+         #                          -***-                         #
+         #########################################################
+
+         Program Version 6.0.0 - RELEASE -
 ...

 ****ORCA TERMINATED NORMALLY****
-TOTAL RUN TIME: 0 days 0 hours 0 minutes 1 seconds 47 msec
+TOTAL RUN TIME: 0 days 0 hours 0 minutes 0 seconds 980 msec
 ```

 ## Running ORCA in Parallel

 Your serial computation can be easily converted to parallel.
 Simply specify the number of parallel processes by the `%pal` directive.
-In this example, 4 nodes, 128 cores each are used.
+In this example, 1 node, 16 cores are used.

 !!! warning

     Do not use the `! PAL` directive as only PAL2 to PAL8 is recognized.
@@ -91,7 +96,7 @@ In this example, 4 nodes, 128 cores each are used.
 ```bash
 ! HF SVP
 %pal
-nprocs 512 # 4 nodes, 128 cores each
+nprocs 16
 end
 * xyz 0 1
   C 0 0 0
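On the warning above: ORCA's keyword-line shorthand `! PALn` exists only for n = 2 through 8, so any larger process count has to go through the `%pal` block. A hedged sketch of the two spellings (method and process counts are illustrative):

```bash
# Keyword form: valid only from PAL2 up to PAL8 (8 processes here).
! HF SVP PAL8

# For any other count, such as the 16 processes used in this example,
# the %pal block is required instead:
# %pal
#   nprocs 16
# end
```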
@@ -106,13 +111,14 @@ You have to specify number of nodes, cores, and MPI-processes to run:
 #!/bin/bash
 #SBATCH --job-name=ORCA_PARALLEL
-#SBATCH --nodes=4
-#SBATCH --ntasks-per-node=128
-#SBATCH --partition=qexp
+#SBATCH --nodes=1
+#SBATCH --ntasks-per-node=16
+#SBATCH --partition=qcpu_exp
 #SBATCH --account=OPEN-0-0
 #SBATCH --time=1:00:00

-ml ORCA/5.0.1-OpenMPI-4.1.1
-srun orca orca_parallel.inp > output.out
+ml ORCA/6.0.0-gompi-2023a-avx2
+$(which orca) orca_parallel.inp > output.out
 ```

 !!! note
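The switch from `srun orca` to `$(which orca)` matches ORCA's usual requirement that parallel runs invoke the binary by its absolute path, so ORCA can launch its own MPI ranks; starting it under `srun` would instead spawn independent copies. A sketch that makes the path handling explicit (the variable name is illustrative):

```bash
# Resolve the absolute path of the orca binary from the loaded module;
# ORCA itself then spawns the MPI processes requested in %pal.
ORCA_BIN=$(which orca)
"${ORCA_BIN}" orca_parallel.inp > output.out
```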
@@ -122,67 +128,79 @@ srun orca orca_parallel.inp > output.out
 Submit this job to the queue and see the output file.

 ```console
-$ srun submit_parallel.slurm
-1417598
-
-$ ll ORCA_PARALLEL.*
--rw------- 1 user user     0 Aug 21 13:12 ORCA_PARALLEL.e1417598
--rw------- 1 user user 23561 Aug 21 13:13 ORCA_PARALLEL.o1417598
-
-$ cat ORCA_PARALLEL.o1417598
+$ sbatch submit_parallel.slurm
+Submitted batch job 2127305
+
+$ cat output.out

                  *****************
                  * O R C A *
                  *****************

 --- An Ab Initio, DFT and Semiempirical electronic structure package ---

-          #######################################################
-          #                        -***-                        #
-          #   Department of molecular theory and spectroscopy   #
-          #              Directorship: Frank Neese              #
-          # Max Planck Institute for Chemical Energy Conversion #
-          #                D-45470 Muelheim/Ruhr                #
-          #                       Germany                       #
-          #                                                     #
-          #                  All rights reserved                #
-          #                        -***-                        #
-          #######################################################
-
-          Program Version 5.0.1 - RELEASE -
+         #########################################################
+         #                          -***-                         #
+         #          Department of theory and spectroscopy         #
+         #                                                        #
+         #                      Frank Neese                       #
+         #                                                        #
+         #       Directorship, Architecture, Infrastructure       #
+         #                     SHARK, DRIVERS                     #
+         #          Core code/Algorithms in most modules          #
+         #                                                        #
+         #       Max Planck Institute fuer Kohlenforschung        #
+         #                 Kaiser Wilhelm Platz 1                 #
+         #                  D-45470 Muelheim/Ruhr                 #
+         #                         Germany                        #
+         #                                                        #
+         #                   All rights reserved                  #
+         #                          -***-                         #
+         #########################################################
+
+         Program Version 6.0.0 - RELEASE -
 ...

     ************************************************************
-    * Program running with 64 parallel MPI-processes           *
+    * Program running with 16 parallel MPI-processes           *
     *              working on a common directory               *
     ************************************************************
 ...

 ****ORCA TERMINATED NORMALLY****
-TOTAL RUN TIME: 0 days 0 hours 0 minutes 11 seconds 859 msec
+TOTAL RUN TIME: 0 days 0 hours 0 minutes 17 seconds 62 msec
 ```

-You can see that the program was running with 512 parallel MPI-processes.
-In version 5.0.1, only the following modules are parallelized:
+You can see that the program was running with 16 parallel MPI-processes.
+In version 6.0.0, only the following modules are parallelized:

 * ANOINT
-* CASSCF / NEVPT2
+* AUTOCI
+* CASSCF / NEVPT2 / CASSCFRESP
+* CIPSI
 * CIS/TDDFT
 * CPSCF
 * EPRNMR
-* GTOINT
-* MDCI (Canonical-, PNO-, DLPNO-Methods)
-* MP2 and RI-MP2 (including Gradient and Hessian)
+* GRAD (general Gradient program)
+* GUESS
+* LEANSCF (memory conserving SCF solver)
+* MCRPA
+* MDCI (Canonical- and DLPNO-Methods)
+* MM
+* MP2 and RI-MP2 (including Gradients)
 * MRCI
 * PC
+* PLOT
+* PNMR
+* POP
+* PROP
+* PROPINT
+* REL
 * ROCIS
 * SCF
-* SCFGRAD
-* SCFHESS
-* SOC
-* Numerical Gradients and Frequencies
+* SCFRESP (with SCFHessian)
+* STARTUP
+* VPOT
+* Numerical Gradients, Frequencies, Overtones-and-Combination-Bands
+* VPT2
+* NEB (Nudged Elastic Band)
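Because the `nprocs` value in the input has to match the number of MPI tasks granted by Slurm, one robust pattern is to generate the input from `SLURM_NTASKS` inside the submission script. A minimal sketch under that assumption (the HF/SVP method and the CO geometry are placeholders, not from the commit):

```bash
#!/bin/bash
#SBATCH --job-name=ORCA_PARALLEL
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=16
#SBATCH --partition=qcpu_exp
#SBATCH --account=OPEN-0-0
#SBATCH --time=1:00:00

ml ORCA/6.0.0-gompi-2023a-avx2

# Generate the input so that nprocs always matches the Slurm allocation.
cat > orca_parallel.inp <<EOF
! HF SVP
%pal
  nprocs ${SLURM_NTASKS}
end
* xyz 0 1
  C 0 0 0
  O 0 0 1.13
*
EOF

# Invoke ORCA by absolute path; it launches the MPI processes itself.
$(which orca) orca_parallel.inp > output.out
```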
## Example Submission Script
@@ -253,4 +271,5 @@ in order to take advantage of updates, announcements, and the users forum.
 A comprehensive [manual][b] is available online for registered users.

 [a]: https://orcaforum.kofo.mpg.de/app.php/portal
-[b]: https://orcaforum.kofo.mpg.de/app.php/dlext/?cat=1
+[b]: https://www.faccts.de/docs