My computer doesn't recognize the monitor's resolution with some VGA cables. What's going on?

As @Wyzard says, computers talk to monitors via DDC, which uses four pins on the VGA connector. See the pinout on Wikipedia.

A working VGA cable needs every pin wired straight through, independently of the other pins and of the cable shielding; the shielding itself connects the two chassis, i.e. the metal shells of the plugs and sockets. Only pins 4 and 11 may be missing or left unwired.

Inside the monitor there is a small EEPROM holding information about the monitor's supported resolutions (the EDID). This EEPROM is readable even when the monitor is powered off, because it is powered by the computer itself: 5 volts on pin 9, with ground on pins 10 and 5. It is read over an I²C bus, with the data signal on pin 12 and the clock on pin 15. These days, under DDC, that I²C bus can do other things too.
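
If you're curious, you can watch this happen from the computer side. Here's a minimal Python sketch that reads the EDID EEPROM directly over the DDC I²C bus, assuming Linux with the i2c-dev module loaded and the smbus2 package installed. The bus number is an assumption; check `i2cdetect -l` to find the one wired to your VGA port, and run as root:

```python
from smbus2 import SMBus

I2C_BUS = 2          # assumption: varies per machine, check `i2cdetect -l`
EDID_ADDR = 0x50     # standard DDC2B address of the EDID EEPROM

with SMBus(I2C_BUS) as bus:
    # SMBus block reads are capped at 32 bytes, so fetch the 128-byte
    # base EDID block in four chunks (offsets 0, 32, 64, 96).
    edid = bytearray()
    for offset in range(0, 128, 32):
        edid += bytes(bus.read_i2c_block_data(EDID_ADDR, offset, 32))

# A valid base block starts with a fixed 8-byte header and sums to 0 mod 256.
if edid[:8] == bytes.fromhex("00ffffffffffff00") and sum(edid) % 256 == 0:
    print("Got a valid 128-byte EDID block over DDC")
else:
    print("EDID read failed; suspect pins 9, 12, 15 or their ground")
```

If this read fails while the video itself works fine, the DDC wires are the prime suspects.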

The analogue video signals (red, green, and blue) on pins 1, 2, and 3 have independent analogue grounds on pins 6, 7, and 8 respectively.

Some bad VGA cables omit these analogue grounds and the I²C ground, assuming the cable shielding will do instead. This often does not work. The only way to check a VGA cable is with a continuity tester (a multimeter): verify that every pin is wired through, and that none are connected in common with each other or with the shield.
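
To make that check systematic, here's the full DE-15 pinout from above as a printable checklist. Pin roles follow the standard VGA pinout; pins 4 and 11 are the legacy ID lines that may legitimately be absent:

```python
# DE-15 (VGA) pinout, usable as a continuity-test checklist.
VGA_PINS = {
    1: "Red video",         2: "Green video",      3: "Blue video",
    4: "ID2 (optional)",    5: "Ground",           6: "Red ground",
    7: "Green ground",      8: "Blue ground",      9: "+5 V (DDC power)",
    10: "Ground (sync)",    11: "ID0 (optional)",  12: "DDC data (SDA)",
    13: "Horizontal sync",  14: "Vertical sync",   15: "DDC clock (SCL)",
}

for pin, role in sorted(VGA_PINS.items()):
    need = "may be absent" if pin in (4, 11) else "must have continuity"
    print(f"pin {pin:>2} to pin {pin:>2}: {role:<18} -- {need}")
print("shell to shell: cable shield between connector shells -- must have continuity")
print("Then verify there is NO continuity between any two different pins,")
print("nor between any pin and the shield.")
```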


Detecting a monitor's resolution depends on information provided by the monitor itself via DDC. In a VGA cable, this uses several pins and wires that are separate from the ones used for the actual video signal.

A cheap VGA cable might be missing some of these wires (to save on costs), or a wire might be present but broken (due to wear and tear).
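
One quick way to see whether the DDC data is getting through at all: on Linux, the kernel exposes whatever EDID it managed to read under sysfs. This Python sketch checks the block and decodes the monitor's preferred resolution; the connector path is an assumption and varies per machine (list /sys/class/drm/ to find yours):

```python
import sys

# Hypothetical connector path: the card number and connector name (VGA-1)
# vary per machine; list /sys/class/drm/ to find yours.
EDID_PATH = "/sys/class/drm/card0-VGA-1/edid"

with open(EDID_PATH, "rb") as f:
    edid = f.read(128)            # base EDID block; empty if DDC failed

if len(edid) < 128:
    sys.exit("Kernel has no EDID for this connector; check the DDC wires")
if sum(edid) % 256 != 0:
    sys.exit("EDID checksum failed; data corrupted in transit")

# Detailed Timing Descriptor 1 (bytes 54-71) describes the preferred mode.
d = edid[54:72]
h_active = d[2] | ((d[4] & 0xF0) << 4)
v_active = d[5] | ((d[7] & 0xF0) << 4)
print(f"Monitor's preferred resolution: {h_active}x{v_active}")
```

An empty or invalid file with the monitor attached points at the DDC wires rather than the video ones, which matches the broken-cheap-cable scenario above.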