**Virtual Network Computing** (**VNC**) is a graphical [desktop sharing][a] system that uses the [Remote Frame Buffer (RFB) protocol][b] to remotely control another [computer][c]. It transmits [keyboard][d] and [mouse][e] events from one computer to another, relaying the graphical [screen][f] updates back in the other direction, over a [network][g].
VNC-based connections are usually faster (they require less network bandwidth) than [X11][1] applications forwarded directly through SSH.
The recommended clients are [TightVNC][h] or [TigerVNC][i] (free, open source, available for almost any platform).
In this chapter, we show how to create an underlying SSH tunnel from your client machine to one of our login nodes, how to start your own VNC server on the login node, and finally how to connect to your VNC server via the encrypted SSH tunnel.
The local VNC password should be set before the first login. Use a strong password.
To access VNC, a local VNC Server must be started first and a tunnel using SSH port forwarding must be established.
Start by **choosing your display number**.
To choose a free one, check the currently occupied display numbers by listing them with:

```console
[username@login2 ~]$ ps aux | grep Xvnc | sed -rn 's/(\s) .*Xvnc (\:[0-9]+) .*/\1 \2/p'
username :79
username :60
```

As you can see above, displays ":79" and ":60" are already occupied.
Generally, you can choose your display number freely, *except for the occupied ones*.
Also remember that the display number should be lower than or equal to 99.
Based on this, **we have chosen the display number 61** for us, as seen in the examples below.
Your situation may be different so the choice of your number may differ, as well. **Choose and use your own display number accordingly!**
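The manual check above can also be scripted. A minimal sketch (the helper name `find_free_display` is ours, not a cluster tool) that prints the lowest free display number in the 1-99 range, given the list of occupied ones:

```shell
#!/bin/sh
# find_free_display OCCUPIED: print the lowest free display number in 1-99.
# OCCUPIED is a whitespace-separated list of display numbers already in use.
find_free_display() {
    occupied=" $1 "
    for d in $(seq 1 99); do
        case "$occupied" in
            *" $d "*) ;;               # display occupied, keep looking
            *) echo "$d"; return 0 ;;  # first free display found
        esac
    done
    return 1                           # all 99 displays taken
}

# Feed it the occupied displays parsed from running Xvnc processes,
# similar to the ps | sed pipeline above:
find_free_display "$(ps aux | grep '[X]vnc' | sed -rn 's/.*Xvnc :([0-9]+).*/\1/p' | tr '\n' ' ')"
```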
```console
[username@login2 ~]$ vncserver :61 -geometry 1600x900 -depth 16

New 'login2:61 (username)' desktop is login2:61

Starting applications specified in /home/username/.vnc/xstartup
Log file is /home/username/.vnc/login2:61.log
```
Check whether the VNC server is running on the chosen display number (61):
```console
[username@login2 .vnc]$ vncserver -list

TigerVNC server sessions:

X DISPLAY #     PROCESS ID
:61             7074

[username@login2 .vnc]$ ps aux | grep Xvnc | sed -rn 's/(\s) .*Xvnc (\:[0-9]+) .*/\1 \2/p'
username :61
```
The VNC server runs on port 59xx, where xx is the display number. To get your port number, simply add the display number to 5900; in our example, 5900 + 61 = 5961. For display number 102, the TCP port would be 5900 + 102 = 6002, but note that TCP ports above 6000 are often used by X11. **Calculate your own port number and use it instead of 5961 in the examples below**.
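The port arithmetic can be verified directly in the shell:

```shell
# VNC TCP port = 5900 + display number
display=61
echo $((5900 + display))    # 5961 for display :61
echo $((5900 + 102))        # 6002 for display :102
```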
To access the VNC server, you have to create a tunnel between the login node (using TCP port 5961) and your machine (using a free TCP port; for simplicity, the very same one) in the next step. See the examples for [Linux/Mac OS][2] and [Windows][3].
The tunnel must point to the same login node where you launched the VNC server, e.g. login2. If you use just cluster-name.it4i.cz, the tunnel might point to a different node due to DNS round robin.
```console
local $ ssh -TN -f username@login2.cluster-name.it4i.cz -L 5961:localhost:5961
```
Issue the following command to check that the tunnel is established (note the PID 2022 in the last column; it is required for closing the tunnel):
```console
local $ netstat -natp | grep 5961

(Not all processes could be identified, non-owned process info
 will not be shown, you would have to be root to see it all.)
tcp        0      0 127.0.0.1:5961          0.0.0.0:*               LISTEN      2022/ssh
tcp6       0      0 ::1:5961                :::*                    LISTEN      2022/ssh
```
```console
local-mac $ lsof -n -i4TCP:5961 | grep LISTEN
ssh 75890 sta545 7u IPv4 0xfb062b5c15a56a3b 0t0 TCP 127.0.0.1:5961 (LISTEN)
```
In this example, we connect to the VNC server on port 5961 via the SSH tunnel. The connection is encrypted and secured. The VNC server listening on port 5961 provides a screen of 1600x900 pixels.
After you finish your work, you have to close the SSH tunnel, which is still running in the background. Use the `kill` command with the PID from the netstat output above (`kill 2022` in this example).
You can watch the instruction video on how to make a VNC connection between a local Ubuntu desktop and the IT4I Salomon cluster [here][k].
Start the VNC server using the vncserver command described above.
Search for the localhost and port number (in this case 127.0.0.1:5961):
```console
[username@login2 .vnc]$ netstat -tanp | grep Xvnc

(Not all processes could be identified, non-owned process info
 will not be shown, you would have to be root to see it all.)
tcp        0      0 127.0.0.1:5961          0.0.0.0:*               LISTEN      24031/Xvnc
```
On the PuTTY Configuration screen, go to Connection->SSH->Tunnels to set up the tunnel.
Fill the Source port and Destination fields. **Do not forget to click the Add button**.
[Windows Subsystem for Linux][j] is another way to run Linux software in a Windows environment.
At your machine, create the tunnel:
```console
local $ ssh username@login2.cluster-name.it4i.cz -L 5961:localhost:5961
```
Run the VNC client of your choice, select the VNC server 127.0.0.1, port 5961 and connect using the VNC password.
In this example, we connect to the VNC server on port 5961, via the SSH tunnel, using the TigerVNC viewer. The connection is encrypted and secured. The VNC server listening on port 5961 provides a screen of 1600x900 pixels.
Use your VNC password to log in using the TightVNC Viewer and start a Gnome session on the login node.
After the successful login, you should see the following screen:
If the screen gets locked, you have to kill the screensaver. Find its PID and kill the process (do not forget to disable the screensaver afterwards):

```console
[username@login2 .vnc]$ ps aux | grep screen

username  1503  0.0  0.0 103244   892 pts/4    S+   14:37   0:00 grep screen
username 24316  0.0  0.0 270564  3528 ?        Ss   14:12   0:00 gnome-screensaver

[username@login2 .vnc]$ kill 24316
```
When you finish your work, shut down the VNC server on the chosen display:

```console
[username@login2 .vnc]$ vncserver -kill :61

Killing Xvnc process ID 7074
Xvnc process ID 7074 already killed
```
Also, do not forget to terminate the SSH tunnel, if it was used. For details, see the end of [this section][2].
The very same methods described above may be used to run GUI applications on compute nodes. However, for maximum performance, follow these steps:
Open a Terminal (Applications -> System Tools -> Terminal). Run all the following commands in the terminal.
Allow incoming X11 graphics from the compute nodes at the login node with the `xhost +` command:
Get an interactive session on a compute node (for more detailed info [look here][4]). Use the **-v DISPLAY** option to propagate the DISPLAY on the compute node. In this example, we want a complete node (16 cores in this example) from the production queue:
```console
$ qsub -I -v DISPLAY=$(uname -n):$(echo $DISPLAY | cut -d ':' -f 2) -A PROJECT_ID -q qprod -l select=1:ncpus=16
```
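To see what value the `-v DISPLAY=...` expression actually passes, it can be evaluated on its own; the `DISPLAY` value below is an assumed example of what a VNC session would set:

```shell
# Inside the VNC session, DISPLAY typically looks like localhost:61.0 (assumed example)
DISPLAY=localhost:61.0
# uname -n gives the login node's hostname; cut keeps the part after the first ':'
echo "$(uname -n):$(echo $DISPLAY | cut -d ':' -f 2)"
```

The compute node then opens X11 windows back on your VNC session via this combined `hostname:display` value.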
Test that the DISPLAY redirection into your VNC session works by running an X11 application (e.g. `xterm`) on the assigned compute node:
[a]: http://en.wikipedia.org/wiki/Desktop_sharing
[b]: http://en.wikipedia.org/wiki/RFB_protocol
[c]: http://en.wikipedia.org/wiki/Computer
[d]: http://en.wikipedia.org/wiki/Computer_keyboard
[e]: http://en.wikipedia.org/wiki/Computer_mouse
[f]: http://en.wikipedia.org/wiki/Computer_screen
[g]: http://en.wikipedia.org/wiki/Computer_network
[h]: http://www.tightvnc.com
[i]: http://sourceforge.net/apps/mediawiki/tigervnc/index.php?title=Main_Page
[j]: http://docs.microsoft.com/en-us/windows/wsl
[k]: https://www.youtube.com/watch?v=b9Ez9UN2uL0
[1]: x-window-system.md
[2]: #linuxmac-os-example-of-creating-a-tunnel
[3]: #windows-example-of-creating-a-tunnel