Commit 835de294 authored by Roman Sliva

Update slurm-job-submission-and-execution.md

@@ -40,13 +40,19 @@ $ squeue --me
Show job details for specific job
```console
$ scontrol show job JOBID
```
Show job details for executing job from job session
```console
$ scontrol show job $SLURM_JOBID
```
Show my jobs with long output format (includes time limit)
```console
$ squeue --me -l
```
Show all jobs
@@ -79,7 +85,7 @@ Run interactive job - queue qcpu_exp, one node by default, one task by default
$ salloc -A PROJECT-ID -p qcpu_exp
```
Run interactive job on four nodes, 36 tasks per node (Barbora cluster, cpu partition recommended value based on node core count), two hours time limit
```console
$ salloc -A PROJECT-ID -p qcpu -N 4 --ntasks-per-node 36 -t 2:00:00
@@ -106,7 +112,7 @@ File script content:
```shell
#!/usr/bin/bash
#SBATCH -J MyJobName
#SBATCH -A PROJECT-ID
#SBATCH -N 4
#SBATCH --ntasks-per-node 36
#SBATCH -p qcpu
@@ -158,7 +164,7 @@ cn[101-102]
Expand nodelist to list of nodes.
```
$ scontrol show hostnames
cn101
cn102
```
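The expanded list is handy for per-node actions inside a job script. A runnable sketch of the looping pattern, using the sample hostnames above as stand-in data for live `scontrol show hostnames` output:

```shell
# Stand-in for the output of: scontrol show hostnames
# (cn101, cn102 are the sample node names from the example above)
nodes="cn101
cn102"
# Iterate over the expanded node list, one hostname per line
for host in $nodes; do
  echo "starting task on ${host}"
done
```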
@@ -169,7 +175,13 @@ cn102
$ scontrol update JobId=JOBID ATTR=VALUE
```
Modify job's time limit
```
scontrol update job JOBID timelimit=4:00:00
```
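The `timelimit` value (like `-t`/`--time` at submission) accepts the standard Slurm time formats. A few examples, printed as plain strings:

```shell
# Slurm time formats: MM, MM:SS, HH:MM:SS, D-HH, D-HH:MM, D-HH:MM:SS
echo "4:00:00"     # four hours, as in the example above
echo "1-00:00:00"  # one day
echo "30"          # thirty minutes
```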
Set/modify job's comment
```
$ scontrol update JobId=JOBID Comment='The best job ever'
@@ -177,9 +189,42 @@ $ scontrol update JobId=JOBID Comment='The best job ever'
## Deleting Jobs
Delete job by job id.
```
$ scancel JOBID
```
Delete all my jobs
```
$ scancel --me
```
Delete all my jobs in interactive mode
```
$ scancel --me -i
```
Delete all my running jobs
```
$ scancel --me -t running
```
Delete all my pending jobs
```
$ scancel --me -t pending
```
Delete all my pending jobs for project PROJECT-ID
```
$ scancel --me -t pending -A PROJECT-ID
```
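`scancel` can also filter on job name (`-n`/`--name`). A sketch, printed as a dry run here, assuming a job submitted with `-J MyJobName` as in the batch script above; drop the `echo` to execute it on a cluster:

```shell
# Dry run: show the command that would cancel my jobs named MyJobName
echo scancel --me -n MyJobName
```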
[1]: https://slurm.schedmd.com/
[2]: https://slurm.schedmd.com/srun.html#SECTION_OUTPUT-ENVIRONMENT-VARIABLES