Calibrate two monitors to the same video settings

Background: Just started a new job as a software developer for an amazing company. I have at my workstation two beautiful 22” LCD monitors. It just happens, however, that the laptop port replicators we use have one DVI and one VGA connector each; so one monitor is connected via DVI and the other via VGA.

It appears that having these two identical monitors connected over different interfaces screws with the color/brightness/contrast enough to make a noticeable difference, even after I have reset both back to their default settings. I have tried manually adjusting away the difference, but just can’t seem to reconcile it (“okay, now the whites are the same, but the colors are off… okay, now the desktop colors match, but the whites are off…”). Exhaustive experimentation would entail something like 256^3 combinations (for the RGB) multiplied by 100^2 (for brightness and contrast). And that is a lot of monitor button clicking.
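Just to put a number on that search space (rough arithmetic only, nothing monitor-specific):

```
# Rough size of the brute-force search space described above
rgb_combinations = 256 ** 3        # every R/G/B combination
brightness_contrast = 100 ** 2     # every brightness/contrast pairing
print(f"{rgb_combinations * brightness_contrast:,}")  # 167,772,160,000 per monitor
```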

Without resorting to some expensive monitor calibrator, does anyone have a strategy they use to calibrate two monitors to look the same (e.g., “match the reds first, then the contrast…”)?


It's likely not the interfaces--just like high-end microphones, the same assembly line turns out monitors with slightly different color temperatures, highs, and lows. And like those Trinitron lines, if you're sensitive enough to notice the difference, it's going to bug the heck out of you no matter what you do.

One option: You can use one monitor for visually demanding tasks and the other for more monotonous work--or have your tools on one screen and your main work area on the other. Then you don't have to have them matched.

Otherwise:

  1. Get a test card up on both screens (a sketch for generating a simple one follows this list).

  2. Keep the brightness and contrast near the middle when you start calibrating. It's a little muddier, but you'll be able to get it equally muddy at least.
  3. Start by calibrating the temperature, if your monitors have that setting.
  4. Then it's 100^2 × 256^3.
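If you don't have a test card image handy, something like this will generate a simple one (a grayscale ramp plus color bars) that you can open full-screen on each monitor. It's only a rough sketch using Pillow; the pattern and the 1680x1050 size are arbitrary choices, not any standard test card.

```
# Minimal test-card generator: grayscale ramp on top, color bars below.
# Requires Pillow (pip install pillow); the resolution is arbitrary.
from PIL import Image, ImageDraw

WIDTH, HEIGHT = 1680, 1050
img = Image.new("RGB", (WIDTH, HEIGHT))
draw = ImageDraw.Draw(img)

# Top half: 0-255 grayscale ramp for judging brightness, contrast and gamma.
for x in range(WIDTH):
    level = int(255 * x / (WIDTH - 1))
    draw.line([(x, 0), (x, HEIGHT // 2)], fill=(level, level, level))

# Bottom half: white/primary/secondary bars for judging temperature and gain.
bars = [(255, 255, 255), (255, 255, 0), (0, 255, 255), (0, 255, 0),
        (255, 0, 255), (255, 0, 0), (0, 0, 255), (0, 0, 0)]
bar_width = WIDTH // len(bars)
for i, color in enumerate(bars):
    draw.rectangle([i * bar_width, HEIGHT // 2, (i + 1) * bar_width, HEIGHT],
                   fill=color)

img.save("testcard.png")  # display it full-screen on both monitors
```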

I just matched two Eizo L768 monitors. One is connected via VGA, the other via DVI. Following @bert's answer, I used the DVI monitor as the reference and tried to change the VGA monitor's settings to match it.

This was my strategy:

  1. First, I went to http://tft.vanity.dk/MonitorTest_pureHTML.html and resized my browser window so it spanned both screens (a small local alternative is sketched at the end of this answer).
  2. I opened a white screen and noticed that the DVI monitor looked more red, but the brightness was roughly the same.
  3. I opened a red screen. No noticeable difference.
  4. Blue screen: yes, there was a difference. On the VGA monitor I opened the gain settings, which were red 100 %, blue 100 % and green 100 % by default, and lowered blue to 89 %.
  5. Green screen: Difference! Lowered green in the gain settings to 89 %.
  6. Now I opened a video from German news television. The speaker's face was still a bit pale on the VGA monitor, so I opened the monitor's saturation setting and raised it to 15.
  7. Now some other parts of the picture (the blue background of the studio) were too bright. I opened the brightness setting of the monitor and changed it to 78 %.
  8. Watched the whole newscast with this setting, with the video window resized across both monitors. Ok.
  9. White screen again. Ok.

Finished. Might be useful to others.
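In case the site above is unreachable, here is a rough local substitute for the solid-color screens used in steps 2-5: a plain tkinter window you can stretch across both monitors, clicking to step through white, red, green and blue (the color list and window size are just my choices).

```
# Cycle full-window solid colors for side-by-side comparison.
# Standard-library tkinter only; click to advance, Esc to quit.
import tkinter as tk

COLORS = ["white", "red", "green", "blue", "gray50", "black"]

root = tk.Tk()
root.title("Monitor match test")
root.geometry("1200x400")          # drag/resize it so it spans both screens

frame = tk.Frame(root, bg=COLORS[0])
frame.pack(fill="both", expand=True)

index = 0
def next_color(event=None):
    """Switch to the next test color on mouse click."""
    global index
    index = (index + 1) % len(COLORS)
    frame.configure(bg=COLORS[index])

root.bind("<Button-1>", next_color)
root.bind("<Escape>", lambda event: root.destroy())
root.mainloop()
```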


With DVI, the signal is just digital 0-255 values for each subpixel. The monitor is set up to render a decent sRGB image from it with a 2.2 gamma by default.

With VGA, however, the output can be controlled via software. The D/A converter on the video card translates every 0-255 value into a voltage on the VGA cable, and the monitor then converts that voltage back into a 0-255 subpixel value. This "output voltage curve" can be shaped by programs like Adobe Gamma or the NVIDIA control panel. Try fiddling with that instead of the options on the monitor itself.
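To make that a bit more concrete, the adjustment those tools expose is essentially a per-channel lookup table (a gamma ramp) that remaps each 0-255 value before it reaches the D/A converter. Here is a sketch of the math only -- actually installing the ramp is OS/driver specific, which is the part Adobe Gamma or the NVIDIA control panel handles for you.

```
# Illustration of the per-channel "output curve": a gamma/gain lookup table
# that remaps each 0-255 level before it is converted to a voltage.
# (Math only -- applying the ramp to the hardware is OS/driver specific.)

def build_ramp(gamma=1.0, gain=1.0):
    """Map every 8-bit input level to an 8-bit output level.

    gamma=1.0, gain=1.0 gives the identity ramp (the usual default);
    other values bend or scale the curve the way gamma tools do.
    """
    ramp = []
    for level in range(256):
        normalized = level / 255.0
        adjusted = gain * (normalized ** (1.0 / gamma))
        ramp.append(min(255, round(255 * adjusted)))
    return ramp

# Identity curve vs. a curve with blue pulled down about 10%,
# similar in spirit to lowering the blue gain on the monitor itself.
identity = build_ramp()
blue_reduced = build_ramp(gain=0.9)

for level in (0, 64, 128, 192, 255):
    print(level, identity[level], blue_reduced[level])
```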