Maximum resolution through VGA/DVI/HDMI(/etc)?

Solution 1:

See How Many Dots Has It Got for an overview of what is still called VGA today, even though it has evolved far beyond 640x480.

The resolution champion today seems to be WHUXGA (Wide Hexadecatuple Ultra Extended Graphics Array), with a resolution of 7680x4800 (36,864,000 pixels in total).

With VGA, the digital signal from the graphics card is converted to an analogue signal by the adapter's DAC and sent to a monitor with a VGA input. This digital-to-analogue conversion (and the re-digitisation an LCD monitor must then perform) causes some loss of quality.

With DVI the signal is not converted; it stays digital all the way to the monitor's DVI input.

DVI and HDMI are identical in image quality. The principal differences are that HDMI carries audio as well as video and uses a different type of connector, but both use the same TMDS encoding scheme, which is why a DVI source can be connected to an HDMI monitor, or vice versa, through a passive DVI/HDMI cable with no converter box in between.

source

(Note: Remember that this answer is from the year 2010.)

Solution 2:

DVI's pixel clock determines its maximum bandwidth, which is essentially resolution times refresh rate (plus some overhead for blanking intervals). You can therefore reach a higher resolution by lowering the refresh rate: some LCD monitors will let you run at, say, 50Hz instead of 60Hz, and while the screen is a little slower to update, it doesn't flicker the way old CRTs did. Single-link DVI has a specified maximum pixel clock of 165MHz, though various unofficial 'overclocking' hacks exist. Dual-link DVI has at least twice the bandwidth of single-link, and according to Wikipedia has no upper limit on clock speed, so it 'is constrained only by hardware'; for example, 3840x2400@31Hz is practical with the right hardware. Short, good-quality cables help.
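As a back-of-envelope check, here is a small Python sketch that estimates the pixel clock a mode needs and compares it against the single-link limit. The ~12% blanking overhead is an assumption (roughly what CVT reduced-blanking timings add); real figures vary by mode and monitor:

    # Rough pixel-clock estimate: resolution x refresh rate, plus an
    # assumed 12% overhead for blanking intervals.

    SINGLE_LINK_MHZ = 165.0   # specified single-link DVI maximum
    BLANKING = 1.12           # assumed reduced-blanking overhead

    def pixel_clock_mhz(width, height, refresh_hz):
        """Approximate pixel clock the mode needs, in MHz."""
        return width * height * refresh_hz * BLANKING / 1e6

    for w, h, hz in [(1920, 1200, 60), (2560, 1600, 60), (3840, 2400, 31)]:
        clk = pixel_clock_mhz(w, h, hz)
        link = "single-link" if clk <= SINGLE_LINK_MHZ else "dual-link"
        print(f"{w}x{h}@{hz}Hz needs ~{clk:.0f} MHz -> {link}")

With these assumed timings, 1920x1200@60Hz just fits inside a single link at about 155MHz, while the 3840x2400@31Hz mode mentioned above comes out around 320MHz, just inside twice the single-link clock.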

VGA, being analogue rather than digital, tends to degrade gradually as bandwidth increases. Higher resolutions just aren't as crisp as over DVI, even with a high-quality cable and a good monitor. (This isn't so much the fault of the VGA cable as of the electronics at either end; it may be that these days anyone who cares uses DVI, so even high-end monitors use cheap electronics for the analogue-to-digital conversion.) I run 1920x1080 over VGA and the display ends up a bit smudged compared to DVI; fiddling with the monitor's sharpness setting helps. There were CRTs that went up to 2048x1536 or 2304x1440 and used a VGA connector (or five separate BNC connectors, which in turn plugged into a VGA output).
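For a sense of scale, the arithmetic below shows what those top-end CRT modes demanded of the analogue side. Both the ~30% blanking overhead (CRTs needed generous blanking for beam retrace) and the refresh rates are assumptions of mine, typical for such monitors rather than quoted from anywhere:

    # Analogue pixel-clock arithmetic for the high-end CRT modes
    # mentioned above; overhead and refresh rates are assumed figures.

    BLANKING = 1.30

    for w, h, hz in [(2048, 1536, 85), (2304, 1440, 80)]:
        mhz = w * h * hz * BLANKING / 1e6
        print(f"{w}x{h}@{hz}Hz over VGA: ~{mhz:.0f} MHz pixel clock")

Either way that's roughly 350MHz, more than twice single-link DVI's 165MHz, which is why the practical limit over VGA was the quality of the analogue electronics rather than raw bandwidth.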

Indeed, the Matrox DualHead2Go family of products will accept up to 3840x1200 over VGA and then split it across two or three monitors. The refresh rate is slightly reduced, from 60Hz to 57Hz, but it shows that VGA has a fairly high maximum bandwidth: more than single-link DVI. (The output from the Matrox product may be digital even when its input is analogue, so this can be a way to improve picture quality when using a VGA output from the computer; the analogue-to-digital converter in the *Head2Go may be a bit better than the one in your monitor.)
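A quick sanity check on that figure, ignoring blanking (so the real analogue clock is somewhat higher still):

    # Raw pixel rate of the DualHead2Go's maximum mode vs. the
    # single-link DVI ceiling (blanking intervals ignored).

    w, h, hz = 3840, 1200, 57
    print(f"{w}x{h}@{hz}Hz = ~{w * h * hz / 1e6:.0f} Mpx/s "
          f"(single-link DVI tops out at 165 Mpx/s)")

That's roughly 263 megapixels per second, comfortably beyond what a single DVI link can carry.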