The Salomon cluster is accessed by the SSH protocol via login nodes login1, login2, login3 and login4 at the address salomon.it4i.cz. A specific login node may be addressed by prepending its name to the address.
The alias salomon.it4i.cz is currently not available through VPN connection. Please use loginX.salomon.it4i.cz when connected to VPN.
| Login address          | Port | Protocol | Login node                            |
| ---------------------- | ---- | -------- | ------------------------------------- |
| salomon.it4i.cz        | 22   | ssh      | round-robin DNS record for login[1-4] |
| login1.salomon.it4i.cz | 22   | ssh      | login1                                |
| login2.salomon.it4i.cz | 22   | ssh      | login2                                |
| login3.salomon.it4i.cz | 22   | ssh      | login3                                |
| login4.salomon.it4i.cz | 22   | ssh      | login4                                |
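For example, to reach one particular login node (e.g. over VPN, where the round-robin alias is unavailable), connect to it by name; the username below is only a placeholder:

```bash
local $ ssh username@login2.salomon.it4i.cz
```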
The authentication is by the [private key](../get-started-with-it4innovations/accessing-the-clusters/shell-access-and-data-transfer/ssh-keys/).
Please verify SSH fingerprints during the first logon. They are identical on all login nodes:
f6:28:98:e4:f9:b2:a6:8f:f2:f4:2d:0a:09:67:69:80 (DSA)
70:01:c9:9a:5d:88:91:c7:1b:c0:84:d1:fa:4e:83:5c (RSA)
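If you want to compare these fingerprints against what your client sees, one possible check is the following; it assumes a reasonably recent OpenSSH client on your workstation (the `-E md5` option), and prints the MD5 fingerprints of the keys the server offers:

```bash
# fetch the host keys and show their MD5 fingerprints for comparison
local $ ssh-keyscan salomon.it4i.cz 2>/dev/null | ssh-keygen -E md5 -lf -
```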
Private key authentication:
On **Linux** or **Mac**, use
```bash
local $ ssh -i /path/to/id_rsa username@salomon.it4i.cz
```
If you see the warning message "UNPROTECTED PRIVATE KEY FILE!", use this command to set more restrictive permissions on the private key file:
```bash
local $ chmod 600 /path/to/id_rsa
```
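If you connect often, you may find it convenient to store the key path and username in your local SSH configuration. The host alias and username below are placeholders, not anything provided by IT4Innovations:

```bash
# ~/.ssh/config on your local workstation
Host salomon
    HostName salomon.it4i.cz
    User username
    IdentityFile /path/to/id_rsa
```

With such an entry, `local $ ssh salomon` is equivalent to the full command above.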
On **Windows**, use [PuTTY ssh client](../get-started-with-it4innovations/accessing-the-clusters/shell-access-and-data-transfer/putty.md).
After logging in, you will see the command prompt:

```bash
  _____       _
 / ____|     | |
| (___   __ _| | ___  _ __ ___   ___  _ __
 \___ \ / _` | |/ _ \| '_ ` _ \ / _ \| '_ \
 ____) | (_| | | (_) | | | | | | (_) | | | |
|_____/ \__,_|_|\___/|_| |_| |_|\___/|_| |_|

[username@login2.salomon ~]$
```
The environment is **not** shared between login nodes, except for [shared filesystems](storage/).
Data in and out of the system may be transferred by the [scp](http://en.wikipedia.org/wiki/Secure_copy) and sftp protocols.
| Address                | Port | Protocol  |
| ---------------------- | ---- | --------- |
| salomon.it4i.cz        | 22   | scp, sftp |
| login1.salomon.it4i.cz | 22   | scp, sftp |
| login2.salomon.it4i.cz | 22   | scp, sftp |
| login3.salomon.it4i.cz | 22   | scp, sftp |
| login4.salomon.it4i.cz | 22   | scp, sftp |
The authentication is by the [private key](../get-started-with-it4innovations/accessing-the-clusters/shell-access-and-data-transfer/ssh-keys/).
On **Linux** or **Mac**, use an scp or sftp client to transfer the data to Salomon:
```bash
local $ scp -i /path/to/id_rsa my-local-file username@salomon.it4i.cz:directory/file
```
```bash
local $ scp -i /path/to/id_rsa -r my-local-dir username@salomon.it4i.cz:directory
```
or
```bash
local $ sftp -o IdentityFile=/path/to/id_rsa username@salomon.it4i.cz
```
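Transfers work in the other direction as well. To copy data from Salomon back to your workstation with scp, swap the source and destination arguments; the paths below are placeholders:

```bash
local $ scp -i /path/to/id_rsa username@salomon.it4i.cz:directory/file my-local-dir
```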
A very convenient way to transfer files in and out of the Salomon computer is via the FUSE filesystem [sshfs](http://linux.die.net/man/1/sshfs):
```bash
local $ sshfs -o IdentityFile=/path/to/id_rsa username@salomon.it4i.cz:. mountpoint
```
Using sshfs, the user's Salomon home directory will be mounted on your local computer, just like an external disk.
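When you are done, the directory can be detached again. On Linux this is typically done with fusermount (on a Mac, a plain `umount mountpoint` should work):

```bash
# unmount the sshfs-mounted directory
local $ fusermount -u mountpoint
```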
Learn more about ssh, scp and sshfs by reading the manpages:
```bash
$ man ssh
$ man scp
$ man sshfs
```
On **Windows**, use the [WinSCP client](http://winscp.net/eng/download.php) to transfer the data. The [win-sshfs client](http://code.google.com/p/win-sshfs/) provides a way to mount the Salomon filesystems directly as an external disk.
More information about the shared file systems is available [here](storage/).
Outgoing connections from Salomon Cluster login nodes to the outside world are restricted to the following ports:
| Port | Protocol |
| ---- | -------- |
| 22   | ssh      |
| 80   | http     |
| 443  | https    |
| 9418 | git      |
Please use **ssh port forwarding** and proxy servers to connect from Salomon to all other remote ports.
Outgoing connections from Salomon Cluster compute nodes are restricted to the internal network. Direct connections from compute nodes to the outside world are blocked.
Port forwarding allows an application running on Salomon to connect to arbitrary remote host and port.
It works by tunneling the connection from Salomon back to the user's workstation and forwarding from the workstation to the remote host.
Pick some unused port on Salomon login node (for example 6000) and establish the port forwarding:
```bash
local $ ssh -R 6000:remote.host.com:1234 salomon.it4i.cz
```
In this example, we establish port forwarding between port 6000 on Salomon and port 1234 on remote.host.com. By accessing localhost:6000 on Salomon, an application will see the response of remote.host.com:1234. The traffic will run via the user's local workstation.
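A quick way to check the tunnel from the Salomon side, assuming for illustration that the service on remote.host.com:1234 speaks HTTP, might be:

```bash
# run on a Salomon login node; should return the response of remote.host.com:1234
$ curl http://localhost:6000
```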
Port forwarding may be done **using PuTTY** as well. On the PuTTY Configuration screen, load your Salomon configuration first. Then go to Connection->SSH->Tunnels to set up the port forwarding. Click the Remote radio button. Insert 6000 into the Source port textbox and remote.host.com:1234 into the Destination textbox. Click the Add button, then Open.
Port forwarding may be established directly to the remote host. However, this requires that the user has SSH access to remote.host.com:
```bash
$ ssh -L 6000:localhost:1234 remote.host.com
```
Note: Port number 6000 is chosen as an example only. Pick any free port.
Remote port forwarding from compute nodes allows applications running on the compute nodes to access hosts outside Salomon Cluster.
First, establish the remote port forwarding from the login node, as [described above](#port-forwarding-from-login-nodes).
Second, invoke port forwarding from the compute node to the login node. Insert the following line into your jobscript or interactive shell:

```bash
$ ssh -TN -f -L 6000:localhost:6000 login1
```
In this example, we assume that port forwarding from login1:6000 to remote.host.com:1234 has been established beforehand. By accessing localhost:6000, an application running on a compute node will see the response of remote.host.com:1234.
Port forwarding is static: each single port is mapped to a particular port on a remote host. Connecting to another remote host requires a new forward.
Applications with built-in proxy support get unrestricted access to remote hosts via a single proxy server.
To establish a local proxy server on your workstation, install and run SOCKS proxy server software. On Linux, the sshd daemon provides the functionality. To establish a SOCKS proxy server listening on port 1080, run:
```bash
local $ ssh -D 1080 localhost
```
On Windows, install and run the free, open source [Sock Puppet](http://sockspuppet.com/) server.
Once the proxy server is running, establish ssh port forwarding from Salomon to the proxy server, port 1080, exactly as [described above](#port-forwarding-from-login-nodes).
```bash
local $ ssh -R 6000:localhost:1080 salomon.it4i.cz
```
Now, configure the application's proxy settings to **localhost:6000**. Use port forwarding to access the [proxy server from compute nodes](#port-forwarding-from-compute-nodes) as well.
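For instance, a command-line application with SOCKS support, such as curl, could then be pointed at the forwarded proxy port; the target URL here is only an example:

```bash
# run on Salomon; traffic goes through the SOCKS proxy forwarded to localhost:6000
$ curl --socks5-hostname localhost:6000 http://example.com
```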
* The [X Window system](../get-started-with-it4innovations/accessing-the-clusters/graphical-user-interface/x-window-system/) is a principal way to get GUI access to the clusters.
* The [Virtual Network Computing](../get-started-with-it4innovations/accessing-the-clusters/graphical-user-interface/vnc/) is a graphical [desktop sharing](http://en.wikipedia.org/wiki/Desktop_sharing) system that uses the [Remote Frame Buffer protocol](http://en.wikipedia.org/wiki/RFB_protocol) to remotely control another [computer](http://en.wikipedia.org/wiki/Computer).
* Access to IT4Innovations internal resources via [VPN](../get-started-with-it4innovations/accessing-the-clusters/vpn-access/).