!!! warning
    This page has not been updated yet. The page does not reflect the transition from PBS to Slurm.
# Workbench
## Workbench Batch Mode
It is possible to run Workbench scripts in a batch mode.
You need to configure solvers of individual components to run in parallel mode.
Open your project in Workbench.
Then, for example, in *Mechanical*, go to *Tools - Solve Process Settings...*.
![](../../../img/AMsetPar1.png)
Enable the *Distribute Solution* checkbox and enter the number of cores (e.g. 72 to run on two Barbora nodes).
If you want the job to run on more than one node, you must also provide a so-called MPI appfile.
In the *Additional Command Line Arguments* input field, enter:
```console
-mpifile /path/to/my/job/mpifile.txt
```
Where `/path/to/my/job` is the directory where your project is saved.
We will create the file `mpifile.txt` programmatically later in the batch script.
For more information, refer to the *ANSYS Mechanical APDL Parallel Processing Guide*.
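The appfile tells the MPI launcher how to place the distributed solver: one line per host, giving the host name, the number of processes to start there, and the solver executable to launch. As a sketch, the batch script below would generate something like this for two hypothetical nodes `cn101` and `cn102` (the host names and solver path are placeholders):

```console
-h cn101 -np 24 /path/to/ansys/bin/ansysdis161 -dis
-h cn102 -np 24 /path/to/ansys/bin/ansysdis161 -dis
```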
Now, save the project and close Workbench.
We will use this script to launch the job:
```bash
#!/bin/bash
#SBATCH --nodes=2
#SBATCH --ntasks-per-node=128
#SBATCH --job-name=test9_mpi_2
#SBATCH --partition=qcpu
#SBATCH --account=ACCOUNT_ID

# change the working directory
DIR=/scratch/project/PROJECT_ID/$SLURM_JOB_ID
mkdir -p "$DIR"
cd "$DIR" || exit
echo Time is `date`
echo Directory is `pwd`
echo This job runs on the following nodes:
echo $SLURM_NODELIST
ml ANSYS/2023R2-intel-2022.12
#### Set number of processors per host listing
procs_per_host=24

#### Create MPI appfile
echo -n "" > mpifile.txt
for host in $(scontrol show hostnames $SLURM_NODELIST)
do
  echo "-h $host -np $procs_per_host $ANSYS160_DIR/bin/ansysdis161 -dis" >> mpifile.txt
done
runwb2 -R jou6.wbjn -B -F test9.wbpj
```
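The journal file `jou6.wbjn` drives Workbench once it starts in batch mode (`-B`). Since `-F test9.wbpj` already opens the project, a minimal journal only needs to trigger the solve. The actual contents of `jou6.wbjn` are not shown on this page; the following is an illustrative sketch using the standard Workbench journal commands `Update()` and `Save()`:

```python
# jou6.wbjn - illustrative minimal Workbench journal (IronPython)
Update()               # solve all out-of-date components of the opened project
Save(Overwrite=True)   # write the results back to the project file
```

Assuming the batch script above is saved as, for example, `test9.sh`, submit it with:

```console
$ sbatch test9.sh
```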
The solver settings are saved in the `solvehandlers.xml` file,
which is not located in the project directory.
Verify your solver settings when uploading a project from your local computer.