Why is HDMI->DVI image sharper than VGA?

VGA is the only analog signal among the ones mentioned above, so that alone already explains the difference. Using the adapter can worsen the situation further.

Some further reading: http://www.digitaltrends.com/computing/hdmi-vs-dvi-vs-displayport-vs-vga/


Assuming brightness, contrast and sharpness are the same in both cases, there could be two other reasons why text is sharper with DVI/HDMI:

The first has already been stated: VGA is analog, so it has to go through an analog-to-digital conversion inside the monitor, which will theoretically degrade image quality.
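
To get a feel for how much a round trip through the analog domain can cost, here's a small Python sketch (illustrative numbers only, assuming an 8-bit DAC in the graphics card, an 8-bit ADC in the monitor, and a few millivolts of noise picked up in between):

```python
import random

def vga_round_trip(pixels, noise_mv=3.0, vpp=0.7):
    """Simulate an 8-bit value -> analog voltage -> 8-bit value round trip.

    noise_mv is the RMS analog noise in millivolts added on the "cable";
    vpp is the VGA full-scale swing (0.7 V peak-to-peak).
    """
    recovered = []
    for value in pixels:
        volts = value / 255 * vpp                  # DAC in the graphics card
        volts += random.gauss(0, noise_mv / 1000)  # noise picked up on the way
        level = round(volts / vpp * 255)           # ADC in the monitor
        recovered.append(min(255, max(0, level)))
    return recovered

original = [random.randrange(256) for _ in range(100_000)]
received = vga_round_trip(original)
errors = sum(1 for a, b in zip(original, received) if a != b)
print(f"{errors / len(original):.1%} of pixel values changed in transit")
```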

Secondly, assuming you are using Windows, there is a technique called ClearType (developed by Microsoft) which improves the appearance of text by manipulating the sub-pixels of an LCD monitor. VGA was developed with CRT monitors in mind, where the notion of a sub-pixel is not the same. Because ClearType requires an LCD screen, and because the VGA standard doesn't tell the host the specifications of the display, ClearType would be disabled with a VGA connection.

Source: I remember hearing about ClearType from one of its creators on a podcast, This().Developers().Life(), IIRC, but http://en.wikipedia.org/wiki/ClearType also supports my theory. Also, HDMI is backward compatible with DVI, and DVI supports Extended Display Identification Data (EDID).
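
Purely for intuition, here is a toy Python sketch of the sub-pixel idea (not Microsoft's actual ClearType algorithm, which also filters and gamma-corrects): the glyph is sampled at three times the horizontal resolution, and each triple of samples drives the R, G and B stripes of one LCD pixel.

```python
def subpixel_render(coverage_3x):
    """Map a glyph sampled at 3x horizontal resolution onto RGB sub-pixels.

    coverage_3x: ink coverage values in [0.0, 1.0], three per output pixel
    (length assumed to be a multiple of 3). Returns one (r, g, b) tuple per
    pixel, black text on a white background.
    """
    pixels = []
    for i in range(0, len(coverage_3x), 3):
        r, g, b = (int(255 * (1 - c)) for c in coverage_3x[i:i + 3])
        pixels.append((r, g, b))
    return pixels

# A vertical stem whose edges fall between whole pixels: the partially covered
# sub-pixels shade only one colour stripe, positioning the edge at 1/3-pixel
# precision instead of snapping it to a full pixel.
print(subpixel_render([0.0, 0.0, 0.5, 1.0, 1.0, 1.0, 0.5, 0.0, 0.0]))
```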


The others make some good points, but the main reason is an obvious clock and phase mismatch. VGA is analog and is subject to interference and to mismatch between the analog sending and receiving sides. Normally one would use a test pattern like this:

http://www.lagom.nl/lcd-test/clock_phase.php

And adjust the clock and phase of the monitor to get the best match and the sharpest picture. However, since it is analog, these adjustments may drift over time, so ideally you should just use a digital signal.
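
If you want to see what that mismatch does numerically, here's a small Python sketch (a deliberately simplified model: the alternating one-pixel pattern on the analog line is treated as a raised cosine, real hardware is messier) showing how a half-pixel phase error turns a crisp black/white pattern into uniform grey:

```python
import math

def sample_checkerboard(pixels=8, phase_error=0.0):
    """Sample an alternating black/white pixel pattern from an 'analog' line.

    The waveform swings between 0 (black) and 255 (white) once per pixel pair;
    phase_error is the sampling offset as a fraction of one pixel period
    (0.0 = perfectly aligned, 0.5 = worst case).
    """
    samples = []
    for n in range(pixels):
        t = n + phase_error
        level = 127.5 * (1 - math.cos(math.pi * t))  # ideal: 0, 255, 0, 255, ...
        samples.append(round(level))
    return samples

print(sample_checkerboard(phase_error=0.0))  # crisp: [0, 255, 0, 255, ...]
print(sample_checkerboard(phase_error=0.5))  # every sample lands on mid grey
```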


There are a few answers indicating a digital signal vs. an analog one, which is correct, but that does not answer the why. A few mentioned translation layers, and this is sorta true too: a mainstream A/D conversion can cause a loss in fidelity, but you'd have to measure it, as it is hard to see the difference with the naked eye. With a cheap conversion, all bets are off.

So why is digital better than analog?

An analog RGB signal (such as VGA) encodes brightness in the amplitude of the signal (0.7 volts peak-to-peak in the case of VGA). This signal, like all signals, has noise which, if large enough, will cause the levels to be read incorrectly.
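
To put a number on that (assuming, for illustration, that the monitor's ADC resolves the swing into 256 levels per colour channel):

```python
# Back-of-the-envelope numbers for the VGA voltage levels (8-bit per channel
# is an assumption for illustration; the standard itself is purely analog).
vpp = 0.7                   # volts, full-scale VGA video swing
levels = 256
step = vpp / (levels - 1)   # voltage difference between adjacent levels
print(f"one level is only ~{step * 1000:.2f} mV wide")
print(f"noise beyond ~{step * 1000 / 2:.2f} mV can push a pixel to the wrong level")
```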

Reflections in the cable (impedance mismatches) are actually the biggest downfall of an analog video signal. They introduce additional noise and get worse with longer (or cheaper) cables, and higher video resolutions make the problem worse because the pixel clock is faster. Interestingly, you should not be able to see any difference in an 800x600 signal unless the VGA cable is too long.
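
To see why resolution matters, compare how much time the analog signal gets per pixel (rough numbers, active pixels only; real VESA timings add blanking on top, so actual pixel clocks are somewhat higher):

```python
# The less time per pixel, the less time reflections and ringing have to
# settle before the next pixel is sampled.
for width, height in [(800, 600), (1920, 1080)]:
    rate_hz = width * height * 60      # pixels per second at 60 Hz
    period_ns = 1e9 / rate_hz          # time available per pixel
    print(f"{width}x{height}@60: ~{rate_hz / 1e6:.0f} MHz pixel rate, "
          f"~{period_ns:.0f} ns per pixel")
```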

How does a digital signal avoid those pitfalls? Well, for one, the exact voltage level is no longer what carries the picture. Also, DVI-D/HDMI use differential signalling (TMDS) as well as error correction to ensure the ones and zeros arrive intact. There's also additional conditioning applied to a digital signal that would not be practical for an analog video signal.
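
Here's a toy Python model of why the differential part helps (it only models common-mode noise, i.e. noise picked up equally by both wires of a pair, and ignores the TMDS encoding itself): the receiver looks at the difference between the two wires, so noise that hits both of them cancels out, while a single-ended line with the same noise misreads a chunk of the bits.

```python
import random

def transmit(bits, noise):
    """Compare single-ended vs. differential reception of the same noisy bits."""
    single_errors = diff_errors = 0
    for bit in bits:
        level = 1.0 if bit else 0.0
        common = random.gauss(0, noise)          # noise hitting both wires equally
        # Single-ended: one wire, threshold at half the swing.
        single = level + common
        single_errors += (single > 0.5) != bit
        # Differential: D+ carries the level, D- its complement; the receiver
        # only looks at the difference, so the common-mode noise cancels out.
        d_plus = level + common
        d_minus = (1.0 - level) + common
        diff_errors += ((d_plus - d_minus) > 0.0) != bit
    return single_errors, diff_errors

bits = [random.random() < 0.5 for _ in range(10_000)]
print(transmit(bits, noise=0.4))  # many single-ended errors, zero differential
```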

Sorry for the soapbox, guys, but them's the facts.