ORCA is a flexible, efficient and easy-to-use general purpose tool for quantum chemistry with specific emphasis on spectroscopic properties of open-shell molecules. It features a wide variety of standard quantum chemical methods ranging from semiempirical methods to DFT to single- and multireference correlated ab initio methods. It can also treat environmental and relativistic effects.
## Making ORCA Available

Many versions of ORCA are available on our clusters. You can list all of them with the `ml av orca` command. The following command loads the latest version of ORCA into your session:

```console
$ ml ORCA
```
## Example Single Core Job

Create a file called orca_serial.inp that contains the following ORCA commands:

```
# My first ORCA calculation :-)
#
# Taken from the Orca manual
# https://orcaforum.cec.mpg.de/OrcaManual.pdf
! HF SVP
* xyz 0 1
C 0 0 0
O 0 0 1.13
*
```
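If you need many similar input files (for example, a bond-length scan), it can be convenient to generate them from a script rather than by hand. A minimal Python sketch that reproduces the CO input above; the `write_orca_input` helper is illustrative, not an ORCA tool:

```python
# Generate a simple ORCA input file from a list of atoms.
# write_orca_input is a hypothetical helper, not part of ORCA.

def write_orca_input(keywords, atoms, charge=0, multiplicity=1):
    """Return the text of an ORCA input file.

    keywords -- the "simple input" line, e.g. "HF SVP"
    atoms    -- list of (element, x, y, z) tuples in Angstrom
    """
    lines = [f"! {keywords}", f"* xyz {charge} {multiplicity}"]
    for element, x, y, z in atoms:
        lines.append(f"{element} {x} {y} {z}")
    lines.append("*")
    return "\n".join(lines) + "\n"

text = write_orca_input("HF SVP", [("C", 0, 0, 0), ("O", 0, 0, 1.13)])
print(text)
```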
Next, create a PBS job script called submit_serial.sh that loads the ORCA module and runs the calculation:

```bash
#!/bin/bash
ml ORCA/4.0.1.2
orca orca_serial.inp
```

Run the script in an interactive job:

```console
$ qsub -q qexp -I -l select=1
qsub: waiting for job 196821.isrv5 to start
qsub: job 196821.isrv5 ready

[username@r37u04n944 ~]$ ./submit_serial.sh

TOTAL RUN TIME: 0 days 0 hours 0 minutes 11 seconds 859 msec
```
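ORCA reports its wall time in the `TOTAL RUN TIME` line at the end of the output. If you collect timings from many runs, a short script can convert that line into seconds; a minimal sketch, assuming the exact line format shown in these examples:

```python
import re

# Parse ORCA's final timing line, e.g.
# "TOTAL RUN TIME: 0 days 0 hours 0 minutes 11 seconds 859 msec",
# into a float number of seconds.
_RUNTIME_RE = re.compile(
    r"TOTAL RUN TIME:\s*(\d+) days (\d+) hours (\d+) minutes "
    r"(\d+) seconds (\d+) msec"
)

def total_runtime_seconds(line):
    m = _RUNTIME_RE.search(line)
    if m is None:
        raise ValueError("not a TOTAL RUN TIME line: %r" % line)
    days, hours, minutes, seconds, msec = map(int, m.groups())
    return ((days * 24 + hours) * 60 + minutes) * 60 + seconds + msec / 1000.0

print(total_runtime_seconds(
    "TOTAL RUN TIME: 0 days 0 hours 0 minutes 11 seconds 859 msec"
))
```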
## Running ORCA in Parallel

Your serial computation can be converted to a parallel one very simply: specify the number of parallel processes with the **%pal** directive. In this example, 4 nodes with 16 cores each are used.

!!! warning
    Do not use the **! PAL** directive, as only PAL2 to PAL8 are recognized.

Create a file called orca_parallel.inp:

```
# Taken from the Orca manual
# https://orcaforum.cec.mpg.de/OrcaManual.pdf
! HF SVP
%pal
   nprocs 64 # 4 nodes, 16 cores each
end
* xyz 0 1
C 0 0 0
O 0 0 1.13
*
```

You also need to edit the previously used PBS job script: specify the number of nodes, cores, and MPI processes to run, and save it as submit_parallel.pbs:

```bash
#!/bin/bash
#PBS -S /bin/bash
#PBS -N ORCA_PARALLEL
#PBS -l select=4:ncpus=16:mpiprocs=16
#PBS -q qexp
ml ORCA/4.0.1.2
${EBROOTORCA}/orca orca_parallel.inp
```

!!! note
    When running ORCA in parallel, ORCA should **NOT** be started with mpirun (e.g. `mpirun -np 4 orca`) like many MPI programs; it has to be called with its full pathname instead.

Submit the job to the queue and inspect the output files:

```console
$ qsub submit_parallel.pbs
1417598.dm2

$ ll ORCA_PARALLEL.*
-rw------- 1 hra0031 hra0031     0 Aug 21 13:12 ORCA_PARALLEL.e1417598
-rw------- 1 hra0031 hra0031 23561 Aug 21 13:13 ORCA_PARALLEL.o1417598
```
You can see that the program ran with 64 parallel MPI processes. In version 4.0.1.2, only the following modules are parallelized:
* ANOINT
* CASSCF / NEVPT2
* CIPSI
* CIS/TDDFT
* CPSCF
* EPRNMR
* GTOINT
* MDCI (Canonical-, PNO-, DLPNO-Methods)
* MP2 and RI-MP2 (including Gradient and Hessian)
* MRCI
* PC
* ROCIS
* SCF
* SCFGRAD
* SCFHESS
* SOC
* Numerical Gradients and Frequencies
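The value given to `%pal nprocs` has to match what the `#PBS -l select` line actually allocates (nodes × mpiprocs), otherwise some MPI ranks will have no slot to run in. A small sketch of that bookkeeping, assuming the `select=N:ncpus=...:mpiprocs=M` format used above (the `mpi_slots` helper name is made up):

```python
import re

def mpi_slots(select_spec):
    """Return the total number of MPI slots implied by a PBS select
    specification such as "select=4:ncpus=16:mpiprocs=16".
    Only handles the single-chunk format used in this example."""
    m = re.match(r"select=(\d+):ncpus=(\d+):mpiprocs=(\d+)$", select_spec)
    if m is None:
        raise ValueError("unexpected select format: %r" % select_spec)
    nodes, _ncpus, mpiprocs = map(int, m.groups())
    return nodes * mpiprocs

# %pal nprocs 64 matches select=4:ncpus=16:mpiprocs=16 (4 * 16 = 64)
print(mpi_slots("select=4:ncpus=16:mpiprocs=16"))  # 64
```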
## Register as a User

You are encouraged to register as a user of ORCA at the [ORCA forum](https://orcaforum.cec.mpg.de/) in order to take advantage of updates, announcements, and the users forum.
## Documentation

A comprehensive [PDF](https://orcaforum.cec.mpg.de/OrcaManual.pdf) manual is available online.