Commit 8a1eb760 authored by David Hrbáč

Spell check

parent 227f5082
@@ -344,10 +344,10 @@ The procedure to obtain the CESNET access is quick and trouble-free.
CESNET storage access
---------------------
-### Understanding Cesnet storage
+### Understanding CESNET storage
!!! Note "Note"
-It is very important to understand the Cesnet storage before uploading data. Please read <https://du.cesnet.cz/en/navody/home-migrace-plzen/start> first.
+It is very important to understand the CESNET storage before uploading data. Please read <https://du.cesnet.cz/en/navody/home-migrace-plzen/start> first.
Once registered for CESNET Storage, you may [access the storage](https://du.cesnet.cz/en/navody/faq/start) in a number of ways. We recommend the SSHFS and RSYNC methods.
@@ -356,9 +356,9 @@ Once registered for CESNET Storage, you may [access the storage](https://du.cesn
!!! Note "Note"
SSHFS: The storage will be mounted like a local hard drive
-The SSHFS provides a very convenient way to access the CESNET Storage. The storage will be mounted onto a local directory, exposing the vast CESNET Storage as if it was a local removable harddrive. Files can be than copied in and out in a usual fashion.
+The SSHFS provides a very convenient way to access the CESNET Storage. The storage will be mounted onto a local directory, exposing the vast CESNET Storage as if it was a local removable hard disk drive. Files can then be copied in and out in the usual fashion.
-First, create the mountpoint
+First, create the mount point
```bash
$ mkdir cesnet
```
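With the mount point in place, the storage can be mounted over SSHFS. The following is a minimal sketch, not part of the original text: the host ssh.du1.cesnet.cz, the username placeholder, and the VO_storage-cache_tape path are assumptions borrowed from the rsync examples below, so substitute the values valid for your CESNET account.

```bash
# Sketch only: mount the CESNET storage onto the local "cesnet" directory.
# Host, username and remote path are assumed from the rsync examples below.
$ sshfs username@ssh.du1.cesnet.cz:VO_storage-cache_tape cesnet

# Files can now be copied in and out as on a local drive.
$ cp datafile cesnet/

# Unmount when finished (Linux).
$ fusermount -u cesnet
```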
@@ -407,18 +407,18 @@ Rsync finds files that need to be transferred using a "quick check" algorithm (b
More about Rsync at <https://du.cesnet.cz/en/navody/rsync/start#pro_bezne_uzivatele>
-Transfer large files to/from Cesnet storage, assuming membership in the Storage VO
+Transfer large files to/from CESNET storage, assuming membership in the Storage VO
```bash
$ rsync --progress datafile username@ssh.du1.cesnet.cz:VO_storage-cache_tape/.
$ rsync --progress username@ssh.du1.cesnet.cz:VO_storage-cache_tape/datafile .
```
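If a large transfer is interrupted, it does not have to start from scratch. Below is a hedged sketch using rsync's standard --partial behaviour (the -P shorthand), which is not mentioned in the original text; the file and host names are the same assumed placeholders as above.

```bash
# -P is shorthand for --partial --progress: a partially transferred file is
# kept on the receiving side, so rerunning the same command after an
# interruption reuses the data that already arrived instead of resending it all.
$ rsync -P datafile username@ssh.du1.cesnet.cz:VO_storage-cache_tape/.
```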
-Transfer large directories to/from Cesnet storage, assuming membership in the Storage VO
+Transfer large directories to/from CESNET storage, assuming membership in the Storage VO
```bash
$ rsync --progress -av datafolder username@ssh.du1.cesnet.cz:VO_storage-cache_tape/.
$ rsync --progress -av username@ssh.du1.cesnet.cz:VO_storage-cache_tape/datafolder .
```
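To preview what the "quick check" mentioned above would actually transfer, rsync can first be run in dry-run mode. A minimal sketch follows; the --dry-run and --itemize-changes options are standard rsync flags rather than part of the original text, and the paths are the same assumed placeholders.

```bash
# -n (--dry-run) performs the quick check without copying anything, and
# -i (--itemize-changes) prints, per file, why rsync considers it changed
# (size, modification time, missing on the receiver, ...).
$ rsync -avni datafolder username@ssh.du1.cesnet.cz:VO_storage-cache_tape/.
```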
-Transfer rates of about 28MB/s can be expected.
+Transfer rates of about 28 MB/s can be expected.