VNC server configuration for multi-monitor support

I use VNC at the office for the vast majority of my work. We have a cluster of Linux servers that we log into via SSH, and VNC is enabled so that we can keep a persistent desktop environment for ease of workflow. I'm connecting from a dual-monitor Windows system on my desk, and I can configure the TightVNC server for my session however I see fit.

What I would like to do is set up the server so that it creates a session with two separate monitors. They can be stitched together into a single viewport, but I don't want my Fluxbox taskbar or maximized windows spanning the whole area, as happens when I simply double the horizontal resolution. I want X to see two screens, but to the TightVNC Viewer it should appear as a single wide display.

I've looked around online and have seen mentions of people being able to do this, but no real tutorials or lists of switches to pass to make it happen. I can't use xorg.conf to create multiple adapters, since I don't have root (and we don't have system-wide xorg.conf files anyway). The servers sit on a rack and are headless, so there are no unused physical adapters that I can repurpose as virtual monitors for VNC. I have tried specifying multiple screens on the vncserver command line with -screen and then using xrandr to place "VNC-1" next to "VNC-0" (roughly as shown below), but it keeps reporting that the output named "VNC-1" is not found, even though querying xrandr with --screen 1 shows a display connected to it. If I VNC into a session set up this way, I still only see screen 0 (on output "VNC-0").
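
For reference, the placement attempt looked roughly like this (paraphrased from memory; the output names are the ones my session reports):

xrandr --output VNC-1 --right-of VNC-0    # reports that output "VNC-1" is not found
xrandr --screen 1                         # yet this lists a connected display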

I figure there must be something I'm missing. Configuring with xrandr seems to be key but I can't seem to figure out the prerequisite steps when invoking the VNC server.

Also to note:

  • We are using Xvnc TightVNC 1.3.0 as the server. As such, it does not use x11vnc as a back-end.
  • My Windows machine is connecting with the latest version of the TightVNC viewer (2.7.10 as of this post).
  • I've tried this setup with both Fluxbox and Gnome, and there was no difference in behavior; only the desktop/WM that was running differed.
  • I don't want to set up two separate DISPLAYs in this session. I want windows created on one monitor to be able to be dragged over to the other.

What am I missing? Or is what I want to do even possible?

EDIT (6/16/2016): To emphasize, there is no physical display (used or unused) available that I can see. I don't know what the physical machine looks like, but it is headless, so there may well be no physical display adapter in it at all.

When I run xrandr after connecting via SSH with X forwarding, this is what I see:

xrandr: Failed to get size of gamma for output default
Screen 0: minimum 0 x 0, current 3840 x 1200, maximum 32768 x 32768
default connected 3840x1200+0+0 0mm x 0mm
   3840x1200       0.0* 

From a VNC session:

Screen 0: minimum 32 x 32, current 1920 x 1200, maximum 32768 x 32768
VNC-0 connected 1920x1200+0+0 0mm x 0mm
   1920x1200      60.0* 
   1920x1080      60.0  
   1600x1200      60.0  
   1680x1050      60.0  
   1400x1050      60.0  
   1360x768       60.0  
   1280x1024      60.0  
   1280x960       60.0  
   1280x800       60.0  
   1280x720       60.0  
   1024x768       60.0  
   800x600        60.0  
   640x480        60.0  

If I try to add a mode to an output that isn't listed, it shows this:

> xrandr --addmode VIRTUAL2 1920x1200_60.00
xrandr: cannot find output "VIRTUAL2"

No matter what output name I use (I've tried several), this error is all I see.


The trick here is to create a virtual monitor on the server, place it wherever you like relative to the real monitor, and then instruct VNC to export whichever portion of the combined area (real + virtual) you want. All of this is well documented on the ever-helpful Arch Linux forum. A consolidated sketch of the commands follows the steps below.

  1. To create a virtual monitor:

    $ gtf 1920 1080 60
    # 1920x1080 @ 60.00 Hz (GTF) hsync: 67.08 kHz; pclk: 172.80 MHz
    Modeline "1920x1080_60.00"  172.80  1920 2040 2248 2576  1080 1081 1084 1118  -HSync +Vsync
    

    allows you to find the modeline needed. This assumes a virtual monitor of 1920x1080 at a 60 Hz refresh rate; adjust these to your needs.

    Now you may generate the new modeline by means of

    xrandr --newmode "1920x1080_60.00"  172.80  1920 2040 2248 2576  1080 1081 1084 1118  -HSync +Vsync
    
  2. You may now add this mode to the VIRTUAL1 output by means of:

    xrandr --addmode VIRTUAL1 1920x1080_60.00
    
  3. Now you may place the virtual monitor to the left of your existing monitor (HDMI1 in my case; change this to whatever suits you):

    xrandr --output VIRTUAL1 --mode 1920x1080_60.00 --left-of HDMI1
    
  4. Lastly, you may decide to export only the virtual part of the display, for instance by means of

    x11vnc -clip 1920x1080+0+0
    x11vnc -clip xinerama1
    

    (whichever works for you). If you want to see the whole monitor space (virtual + real), just run x11vnc without the -clip option.

  5. The original post referenced above suggests passing the following encodings option when you start the VNC session from the remote computer:

    vncviewer -encodings "tight copyrect"
    

    I never found this necessary, but I will pass it along to you since YMMV.
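
Putting it together, here are steps 1-4 as a single sketch, assuming the same 1920x1080 mode and an HDMI1 physical output as above (substitute your own output names and geometry):

    # Register the GTF modeline, attach it to the VIRTUAL1 output,
    # and place the virtual monitor to the left of the real one.
    xrandr --newmode "1920x1080_60.00"  172.80  1920 2040 2248 2576  1080 1081 1084 1118  -HSync +Vsync
    xrandr --addmode VIRTUAL1 1920x1080_60.00
    xrandr --output VIRTUAL1 --mode 1920x1080_60.00 --left-of HDMI1
    # Export only the virtual 1920x1080 region (it sits at offset 0,0 since it
    # is left-most); drop -clip to export the whole combined screen instead.
    x11vnc -clip 1920x1080+0+0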

P.S.: Your description of how X11 should treat the two distinct monitors is confusing. What X11 does easily, without any intervention on our part, is create a single workspace (called a screen) out of the existing monitors, such that windows can be dragged from one monitor to another and the mouse moves without barriers over the whole monitor space (the screen). Also, maximizing a window fills only the monitor it is on, not the whole screen. This is what I assumed you wanted, and it is what the above achieves without any extra work. Anything else requires work (if it can be done at all).
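
In concrete terms, after step 3 xrandr should report a single Screen 0 containing both monitors placed side by side, roughly like this (the numbers are illustrative, for the 1920x1080 example above):

    $ xrandr
    Screen 0: minimum 8 x 8, current 3840 x 1080, maximum 32767 x 32767
    VIRTUAL1 connected 1920x1080+0+0 0mm x 0mm
    HDMI1 connected 1920x1080+1920+0 ...
    ...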


Using Fedora 26, I faced the same issue of VIRTUAL1 not being shown by the xrandr command.

I then followed other instructions to reuse an empty (disconnected) output, such as HDMI-2 (a sketch of that approach follows the list below), and it worked quite well, but:

  • Gnome (and xrandr) does not recognize the empty output as connected, so it does not show the virtual monitor when arranging positions for an extended desktop or for cloning. Every time a new physical monitor is connected to another output, the setup is very likely to break.
  • The worse part is that, since gnome-shell (and Mutter underneath it) does not consider the new virtual monitor part of the viewable area of the composite framebuffer, it is not properly repainted: windows tear when moved and leave a permanent animated trail behind them, which remains even after closing the window and even after restarting x11vnc.
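
In case it helps, the "reuse an empty output" approach I had followed was along these lines (a sketch only; the modeline is the 1920x1080 GTF one from the answer above, and HDMI-2/eDP-1 are simply the disconnected and built-in outputs on my machine, so use whatever xrandr lists for you):

    # Force a mode onto a disconnected output and place it next to the real panel.
    xrandr --newmode "1920x1080_60.00"  172.80  1920 2040 2248 2576  1080 1081 1084 1118  -HSync +Vsync
    xrandr --addmode HDMI-2 1920x1080_60.00
    xrandr --output HDMI-2 --mode 1920x1080_60.00 --right-of eDP-1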

Then, after a short trial and error with the dummy driver, I found that a full xorg.conf file is not required; it is enough to add the "intel" driver information (in my case; use your own driver's name) to a file under the /etc/X11/xorg.conf.d directory to activate the VIRTUAL1 and VIRTUAL2 outputs and make gnome-shell recognize them as valid outputs. (I also noticed that output names changed slightly, e.g. from "eDP-1" to "eDP1".) Additionally, I added the "TearFree" option set to true, so that the driver repaints when the composite manager does not.

~# vi /etc/X11/xorg.conf.d/01-dummy-monitor.conf
Section "Device"
        Identifier      "Configured Video Device"
        Driver          "intel"         # CHANGE THIS to match your driver
        Option          "TearFree"      "1"
EndSection

Section "Monitor"
        Identifier      "Configured Monitor"
EndSection

Section "Screen"
        Identifier      "Default Screen"
        Monitor         "Configured Monitor"
        Device          "Configured Video Device"
EndSection
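
After restarting the X session with this file in place, the virtual outputs should be listed by xrandr (disconnected until you assign them a mode), roughly like this (illustrative output):

    $ xrandr | grep VIRTUAL
    VIRTUAL1 disconnected (normal left inverted right x axis y axis)
    VIRTUAL2 disconnected (normal left inverted right x axis y axis)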

After that, the instructions in the answer above work well, and the new virtual monitor can be managed in the Gnome screen settings. If you have VNC client devices with different resolutions, you can add the corresponding modes using xrandr as explained (see the sketch below) and assign them in the Gnome screen settings.
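
For example, to size a second virtual monitor for a hypothetical 1280x800 client, you can take the modeline from gtf at run time and attach it to VIRTUAL2 (the resolution, VIRTUAL2, and eDP1 here are only an illustration; adapt them to your own devices and outputs):

    # Grab the GTF modeline for 1280x800@60 and strip the quotes around its name.
    MODE=$(gtf 1280 800 60 | sed -n 's/^ *Modeline //p' | tr -d '"')
    xrandr --newmode $MODE                     # unquoted on purpose: splits into name + timing values
    xrandr --addmode VIRTUAL2 1280x800_60.00   # gtf names the mode <width>x<height>_<refresh>
    xrandr --output VIRTUAL2 --mode 1280x800_60.00 --right-of eDP1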