How do I set up a second display via DVI/DisplayPort with a Lenovo T420s?

My Lenovo T420s has a discrete graphics card with NVIDIA Optimus technology. What I'm basically trying to achieve is the following:

  • use a second display, connected via DVI/DisplayPort, at work (VGA is blurry)
  • achieve moderate to good battery life while travelling (about 2 hours)

As far as I figured out, I have the following options:

  1. disable Optimus, use the internal graphics exclusively
  2. disable Optimus, use the NVIDIA card exclusively
  3. enable Optimus, use Bumblebee (homepage) / Ironhide

(1) is disqualified, as I have read (and experienced) that DVI/DisplayPort is technically not usable via the internal graphics

(2) I haven't really tried this so far, just a quick test install that booted into a black screen after I added the NVIDIA drivers :-(

(3) I followed this blog, but used Bumblebee instead of Ironhide. Bumblebee worked (I can see impressive FPS in glxgears), but the second display was not recognized. I also felt lost in NVIDIA driver hell and had no chance to run nvidia-xconfig, simply because it was not installed. Is a second display supposed to be recognized out of the box? Do I need to install more? Do I need to mess with my xorg.conf? Many questions, few answers.

So, what can I do to achieve my goals? Which path should I follow, and what are the next steps?

Any hint is welcome :-)

Update: Thanks to everyone who answered. I will migrate my work environment to a "discrete" installation and use "Optimus/Bumblebee" as a parallel play project to see how far I get... I will post future questions in new threads.


Solution 1:

Ubuntu 14.10 and later: It's much, much simpler there. Please see this answer and my comment below.

Note: This only works in Ubuntu 13.04. There are some differences in 13.10.

I have managed to connect two external monitors (in addition to the built-in panel) to my ThinkPad T430 on Ubuntu 13.04, with Optimus ("switchable graphics") enabled (option 3 in your list). The monitors are connected via the DVI interface, and one of them is rotated. In contrast to other solutions, all monitors are attached to the same window manager, so windows can be moved freely between them. This achieves both of your goals: better battery life when the external monitors are disconnected, and a working multi-monitor setup when they are connected.

The key idea here is:

  • The internal graphics adapter is responsible for managing the image (bitmap) that is actually displayed
  • By default, everything is rendered on the internal graphics adapter
  • GPU-accelerated applications use the discrete graphics adapter; their output is copied to the internal graphics adapter
  • For each external monitor, the internal graphics adapter provides a "virtual" display
  • Output to the external monitors happens on a second X server; the contents of the "virtual" displays are constantly copied to it

The major benefit over other solutions is that all displays are (seemingly) part of the same X session, so you can freely move windows between the displays.

So far I have noticed no performance penalty.
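A quick way to see this split in practice, once Bumblebee is installed (see below): compare the OpenGL renderer string reported with and without optirun. This is just a sanity check; glxinfo is part of the mesa-utils package, and the exact renderer strings depend on your hardware and driver versions.

glxinfo | grep "OpenGL renderer"          # internal Intel adapter (the default)
optirun glxinfo | grep "OpenGL renderer"  # discrete NVIDIA adapter via Bumblebee/VirtualGL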

Instructions

You need to do the following:

  • Install Bumblebee from a PPA
  • Build and install a custom Intel video driver
  • Download, compile and finally install a small program
  • Edit two configuration files
  • Reboot several times

For most actions you will need a terminal, a text editor, and root access (sudo). Detailed instructions are given below.

Install Bumblebee

Follow the "basic setup" section of the instructions. Execute as root, the last command actually initiates the reboot:

add-apt-repository ppa:bumblebee/stable
apt-get update
apt-get install bumblebee virtualgl linux-headers-generic
reboot

Don't try to run Bumblebee with the nouveau driver only (see the question "Run bumblebee with nouveau driver only?"). From my experience it doesn't work, at least not in this setup.

Validation

You should be able to run optirun glxgears.
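For example (a minimal check; glxgears comes from mesa-utils, and /proc/acpi/bbswitch is provided by the bbswitch module that Bumblebee uses to power the card on and off):

optirun glxgears          # the discrete card is powered on while this runs
cat /proc/acpi/bbswitch   # should report OFF again shortly after glxgears exits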

Install a patched version of xserver-xorg-video-intel

Option 1: Install from my PPA (currently only Ubuntu 13.04)

Execute the following as root:

add-apt-repository ppa:krlmlr/ppa
apt-get update
apt-get install xserver-xorg-video-intel

Option 2: Build and install your own package

Choose the most recent patch for xserver-xorg-video-intel. Click the file, click the "Raw" button, copy the URL in the browser. At the time of writing, this was https://raw.github.com/liskin/patches/master/hacks/xserver-xorg-video-intel-2.20.14_virtual_crtc.patch.

# Install the build dependencies for the driver package
sudo apt-get build-dep xserver-xorg-video-intel
cd ~
# Fetch and unpack the Ubuntu source package (this creates a versioned directory)
apt-get source xserver-xorg-video-intel
cd xserver-xorg-video-intel-*/
# replace the URL below with the one you have noted, if necessary
wget https://raw.github.com/liskin/patches/master/hacks/xserver-xorg-video-intel-2.20.14_virtual_crtc.patch
patch -p1 < *.patch
# The next command will ask for a change log message. Supply something meaningful,
# this will later allow you to distinguish your patched package from the distribution's.
dch -l+virtual
# Build the binary package
dpkg-buildpackage -b
cd ..
# The freshly built package ends up one directory up, in your home directory
sudo dpkg --install xserver-xorg-video-intel_*.deb

Validation (1), for both options

The command

apt-cache policy xserver-xorg-video-intel

should show the patched version (with the +virtual suffix) as installed, alongside the original Ubuntu version.
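If you prefer a single-line check, the installed version string can also be queried directly; the exact version numbers will differ, what matters is the +virtual suffix added by dch above:

dpkg-query -W -f='${Version}\n' xserver-xorg-video-intel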

Necessary for 13.04, for both options

Add the following to your /etc/X11/xorg.conf (create the file if it does not exist):

Section "Device"
    Identifier "intel"
    Driver "intel"
    Option "AccelMethod" "uxa"
    Option "Virtuals" "2"
EndSection

Validation (2), for both options

After a reboot, run xrandr in a terminal. The output should list two additional virtual outputs, VIRTUAL1 and VIRTUAL2.
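To narrow the output down, something like the following should do; the two outputs will typically show up as "disconnected" until they are explicitly enabled later:

xrandr | grep VIRTUAL    # should print one line each for VIRTUAL1 and VIRTUAL2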

Download and build screenclone

Get puetzk's fork of screenclone and its dependencies, and compile it.

sudo apt-get install libxcursor-dev libxdamage-dev libxinerama-dev libxtst-dev git build-essential
cd ~
git clone git://github.com/puetzk/hybrid-screenclone.git
cd hybrid-screenclone
make

Validation

The file screenclone exists and is executable. (It won't run yet, though.)
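A quick way to confirm this from inside the hybrid-screenclone directory:

test -x ./screenclone && echo "screenclone built"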

Edit xorg.conf.nvidia

  • Open the file /etc/bumblebee/xorg.conf.nvidia in a text editor, as root
  • Comment out or remove the lines that read UseEDID or UseDisplayDevice
  • In the Section "ServerLayout", add an entry Screen "Screen0" (see the sketch after this list)
  • At the bottom of the file, add the following:

    Section "Screen"
        Identifier     "Screen0"
        Device         "Device0"
        DefaultDepth    24
        SubSection     "Display"
            Depth       24
        EndSubSection
    EndSection
    
  • Reboot
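For orientation, the relevant parts of /etc/bumblebee/xorg.conf.nvidia should end up looking roughly like this. This is a sketch, not a complete file: the identifier "Layout0" is an assumption (use whatever identifiers your file already contains), and all other existing options stay as they are.

Section "ServerLayout"
    Identifier  "Layout0"
    # the entry added above:
    Screen      "Screen0"
    # ... existing options of this section stay unchanged ...
EndSection

# ... the existing Device section stays, minus the UseEDID / UseDisplayDevice lines ...

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    DefaultDepth    24
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection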

Testing

My setup assumes a landscape monitor connected to the first DVI port of the docking station, and a portrait one connected to the second DVI port. Run the following commands in a terminal from the directory where screenclone is located, adapt as necessary.

# Enable the two virtual outputs on the primary X server (Intel adapter, display :0)
xrandr --output LVDS1 --output VIRTUAL1 --mode 1920x1200 --right-of LVDS1 --output VIRTUAL2 --mode 1920x1200 --right-of VIRTUAL1 --rotate left
# Clone the virtual displays to the NVIDIA-driven outputs; runs in the background
./screenclone -b -x 1:0 -x 2:1 &
sleep 1
# Apply the rotation on the Bumblebee X server (display :8) as well
xrandr -d :8 --output DP-2 --right-of DP-1 --rotate left
# Bring screenclone back to the foreground so it can later be stopped with Ctrl+C
fg

Note how the display rotation has to be defined twice. You can omit the second invocation of xrandr if no rotation is desired (and, of course, the --rotate left in the first invocation).

Terminating screenclone with Ctrl+C (it was brought back to the foreground using fg) shuts off the discrete graphics adapter; you can verify this with cat /proc/acpi/bbswitch. Screen space is still reserved for the two now-disconnected monitors, though. To switch back to the laptop display only, use

xrandr --output LVDS1 --output VIRTUAL1 --off --output VIRTUAL2 --off

Cleanup

  • Copy screenclone to a directory that is in the PATH (e.g., /usr/local/bin)

  • Create a bash script to automate startup and shutdown of the external displays. This script will set up the external displays on start and switch back to the laptop display only on exit (e.g., by hitting Ctrl+C). A short usage note follows this list.

    #!/bin/bash
    # Enable job control so that fg works in a non-interactive script
    set -m
    # Enable the virtual outputs on the primary X server
    xrandr --output LVDS1 --output VIRTUAL1 --mode 1920x1200 --right-of LVDS1 --output VIRTUAL2 --mode 1920x1200 --right-of VIRTUAL1 --rotate left
    # Switch back to the laptop display when the script exits (e.g., on Ctrl+C)
    trap "xrandr --output LVDS1 --output VIRTUAL1 --off --output VIRTUAL2 --off" EXIT
    # Clone the virtual displays to the NVIDIA outputs, then apply the rotation on display :8
    screenclone -b -x 1:0 -x 2:1 &
    sleep 1
    xrandr -d :8 --output DP-2 --right-of DP-1 --rotate left
    fg
    
  • Alternative option: My collection of scriptlets contains two scripts, extmon-start and extmon-stop, that enable and disable the second and third monitor. Edit the extmon-start script to suit your configuration.
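Assuming the script from the second bullet has been saved as extmon.sh (the name is just an example), using it boils down to:

chmod +x extmon.sh
./extmon.sh    # sets up the external displays; press Ctrl+C to switch back to the laptop display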

References

My answer largely draws from the following resources:

  • http://zachstechnotes.blogspot.ch/2012/04/post-title.html (the original description)
  • http://sagark.org/optimal-ubuntu-graphics-setup-for-thinkpads/ (a simplified version of the above)
  • https://github.com/liskin/patches/issues/4 (instructions on how to enable the patched version of the Intel driver on 13.04)
  • https://github.com/puetzk/hybrid-screenclone (an enhanced fork of screenclone with integrated Bumblebee support)
  • https://github.com/liskin/hybrid-screenclone/issues/2 (for a list of dependencies)
  • https://github.com/liskin/hybrid-screenclone/issues/7 (with crucial hints on how to enable screen rotation)
  • https://github.com/Bumblebee-Project/Bumblebee/issues/77#issuecomment-18899607 (an earlier comment on GitHub)

Solution 2:

I have a ThinkPad W520 and have messed around with this extensively. I am not sure how much the W520 and T420s have in common, but I have written a blog post here outlining the big-picture situation and giving some suggestions.

You should be able to get 2 hours of battery life using the NVIDIA card with the proprietary drivers full time. That will also probably be the option that involves the least hassle when adding an external display (the program disper is very helpful for this). If the NVIDIA proprietary drivers for the card in the T420s behave the same as for the W520, they will automatically underclock the card when it is not in full use (NVIDIA calls this "PowerMizer"), so your battery life will not be too horrible. Honestly, I only get a 25 to 30% increase in battery life by turning off the NVIDIA card.

Also, I believe that with Bumblebee installed you may not be able to use your external monitor, because the NVIDIA card is already running an X server "under the hood". In summary, I would recommend pursuing option (2) further. Hopefully, once you get the proprietary drivers installed and working, X will autodetect everything and you will not have to mess with your xorg.conf.
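If you go down that road, disper is a small command-line tool for switching displays with the NVIDIA proprietary driver. A rough sketch of typical usage (assuming the package is available for your release, e.g., from its PPA):

sudo apt-get install disper
disper -l    # list the displays the driver can see
disper -e    # extend the desktop onto the external display
disper -s    # switch back to a single (primary) display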

When I installed and uninstalled Bumblebee, I had a little bit of trouble getting the NVIDIA proprietary drivers to work again. Here are a few things to look into: (a) The W520 has BIOS options related to which graphics scheme is in use. I'm not sure what the T420s options are, but if you want to use the NVIDIA graphics on your ThinkPad display, you probably have to be in "discrete" mode. (b) You may have to mess around with the "jockey" program (Ubuntu's "Additional Drivers" tool) to get it to use the proprietary drivers.

Good luck! I hope this was of some help.