Commit 213c5435, authored 3 years ago by Jan Siwiec
Update capacity-computing.md
Parent: 77f95047
No related branches, tags, or merge requests found.
Pipeline #22245 passed with warnings (3 years ago)
Stages: test, build, deploy, after_test
Showing 1 changed file with 20 additions and 8 deletions:

docs.it4i/general/capacity-computing.md (+20 −8)
@@ -191,7 +191,9 @@ You thus do not have to manually aggregate your tasks into PBS jobs. See the [pr

 * On Barbora and Karolina, you can simply load the HyperQueue module:

-  `$ ml HyperQueue`
+  ```console
+  $ ml HyperQueue
+  ```

 * If you want to install/compile HyperQueue manually, follow the steps on the [official webpage][b].
@@ -202,7 +204,9 @@ You thus do not have to manually aggregate your tasks into PBS jobs. See the [pr

 To use HyperQueue, you first have to start the HyperQueue server. It is a long-lived process that
 is supposed to be running on a login node. You can start it with the following command:

-  `$ hq server start`
+  ```console
+  $ hq server start
+  ```

 #### Submitting Computation
@@ -211,15 +215,19 @@ You can find more information in the [documentation][2].

 * Submit a simple job (command `echo 'Hello world'` in this case)

-  `$ hq submit echo 'Hello world'`
+  ```console
+  $ hq submit echo 'Hello world'
+  ```

 * Submit a job with 10000 tasks

-  `$ hq submit --array 1-10000 my-script.sh`
+  ```console
+  $ hq submit --array 1-10000 my-script.sh
+  ```

 Once you start some jobs, you can observe their status using the following commands:

 ```console
 # Display status of a single job
 $ hq job <job-id>
...
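For the `--array` example above, each of the 10000 tasks runs the same script with a different task index. Below is a minimal sketch of what such a `my-script.sh` could look like, assuming HyperQueue exposes the index via the `HQ_TASK_ID` environment variable; the solver binary and data paths are made up for illustration.

```bash
#!/usr/bin/env bash
# Hypothetical task script for: hq submit --array 1-10000 my-script.sh
# Assumes HyperQueue exports the task index as HQ_TASK_ID;
# the solver binary and data paths are placeholders.
set -euo pipefail

TASK_ID="${HQ_TASK_ID:?HQ_TASK_ID is not set}"

INPUT="input/task-${TASK_ID}.dat"     # per-task input file (placeholder path)
OUTPUT="output/task-${TASK_ID}.out"   # per-task output file (placeholder path)

mkdir -p output
./my-solver "${INPUT}" > "${OUTPUT}"  # my-solver stands in for your application
```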
@@ -238,17 +246,21 @@ The workers should run on computing nodes, so you can start them using PBS.
...
@@ -238,17 +246,21 @@ The workers should run on computing nodes, so you can start them using PBS.
*
Start a worker on a single PBS node:
*
Start a worker on a single PBS node:
``$ qsub <qsub-params> -- `which hq` worker start``
```console
$ qsub <qsub-params> -- `which hq` worker start
```
*
Start a worker on all allocated PBS nodes:
*
Start a worker on all allocated PBS nodes:
``$ qsub <qsub-params> -- `which pbsdsh` `which hq` worker start``
```console
$ qsub <qsub-params> -- `which pbsdsh` `which hq` worker start
```
In an upcoming version, HyperQueue will be able to automatically submit PBS jobs with workers
In an upcoming version, HyperQueue will be able to automatically submit PBS jobs with workers
on your behalf.
on your behalf.
 !!! tip
-    For debugging purposes, you can also start the worker, e.g. on a login using simply by running
+    For debugging purposes, you can also start the worker, e.g. on a login node, simply by running
     `$ hq worker start`. Do not use such worker for any long-running computations.

 ### Architecture

...
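Read together, the updated section amounts to a workflow roughly like the sketch below. The qsub parameters (queue, project ID, node count, walltime) are placeholder values for illustration, not part of the commit.

```console
$ ml HyperQueue                              # load the module (Barbora/Karolina)
$ hq server start                            # long-lived process; keep it running on a login node
$ qsub -q qprod -A PROJECT-ID -l select=4 -l walltime=04:00:00 \
    -- `which pbsdsh` `which hq` worker start    # workers on all allocated nodes
$ hq submit --array 1-10000 my-script.sh     # submit a job with 10000 tasks
$ hq job <job-id>                            # display status of a single job
```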