What exactly is VGA, and what is the difference between it and a video card?

Operating system development tutorials describe putting data on screen by writing directly to VGA, EGA, or Super VGA, but what I do not get is this: what is the real difference between writing to a fixed address for display and writing directly to a video card, either onboard or removable? I just want basic clarification of my confusion on this issue.

And since it is not such a simple case, with all the variables of cards, connecting interfaces, buses, architectures, systems-on-a-chip, embedded systems, etc., I find it hard to understand the idea behind this 100%. Would the fixed addresses differ between a high-end GPU and a low-end onboard one? Why or why not?

One of my goals in programming is to write a kernel and build an operating system (a far-fetched dream, indeed). Failing to understand the terminology not only hinders me in some areas, but also makes me seem foolish on the subject of hardware.

EXTRA: Some of the current answers speak of the processor's maximum addressable memory, specifically in 16-bit mode. That raises a few further issues:

1. What about the card's own memory? That would not need system RAM for the screen data itself.

2. What about higher-bit modes? And can't you bypass the BIOS in real mode (x86) and still address memory through AL?

3. How does the concept of writing to a fixed address hold up on a GPU with a multitude of registers and performance at or above that of the actual microprocessor?


Solution 1:

Technically, VGA stands for Video Graphics Array, a 640×480 video standard introduced in 1987. At the time that was a relatively high resolution, especially for a colour display.

Before VGA was introduced we had a few other graphics standards, such as Hercules, which displayed either text (80×25 characters) or relatively high-resolution monochrome graphics (720×348 pixels).

Another standard at the time was CGA (Colour Graphics Adapter), which offered up to 16 colours (in text mode) and graphics resolutions up to 640×200 pixels. The result would look like this:

[Image: example of CGA graphics output]

Finally, a noteworthy PC standard was the Enhanced Graphics Adapter (EGA), which allowed resolutions up to 640×350 with 16 colours from a palette of 64.

(I am ignoring non-PC standards to keep this relatively short. If I start adding Atari or Amiga standards (up to 4096 colours at the time!) then this will get quite long.)

Then, in 1987, IBM introduced the PS/2 computer. It had several noteworthy differences from its predecessors, including new ports for mice and keyboards (previously mice used 25-pin or 9-pin serial ports, if you had a mouse at all), standard 3½-inch floppy drives, and a new graphics adapter with both a high resolution and many colours.

This graphics standard was called the Video Graphics Array. It used a three-row, 15-pin connector to carry analog signals to a monitor. This connector lasted until a few years ago, when it was replaced by superior digital standards such as DVI and DisplayPort.

After VGA

Progress did not stop with the VGA standard. Shortly after the introduction of VGA, new standards arose, such as the 800×600 Super VGA (SVGA), which used the same connector. (Hercules, CGA, EGA, etc. all had their own connectors; you could not connect a CGA monitor to a VGA card, not even if you tried to display a low enough resolution.)

Since then we have moved on to much higher-resolution displays, but the most often used name remains VGA, even though the correct names would be SVGA, XGA, UXGA, etc.

[Image: chart of common display standards and their resolutions]

(Graphic courtesy of Wikipedia)


Another thing which gets called 'VGA' is the DE-15 connector used with the original VGA card. This usually-blue connector is not the only way to transfer analog 'VGA signals' to a monitor, but it is the most common.

[Images: left, the DE-15 connector; right, alternative VGA connectors, usually used for better quality]


A third way 'VGA' is used is to describe a graphics card, even though that card might produce entirely different resolutions than VGA. This use is technically wrong (or should at least be 'VGA-compatible card'), but common speech does not make that distinction.


That leaves writing to VGA

This comes from the way the memory on an IBM XT was divided. The CPU could address up to 1 MiB (1024 KiB) of memory. The bottom 640 KiB was reserved for RAM (conventional memory), the upper 384 KiB for add-in cards, ROM, etc.

This upper area is where the VGA card's memory was mapped. You could write to it directly and the result would show up on the display.
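As a minimal sketch of what such a direct write looks like in a freestanding kernel, assuming the standard 80×25 colour text mode with its buffer at physical address 0xB8000 and an environment where that address is directly accessible (the function names here are illustrative, not from any particular tutorial):

    #include <stdint.h>

    /* The colour text-mode buffer lives at physical 0xB8000 (assumes
     * identity-mapped or real-mode-style flat access). */
    #define VGA_TEXT_BUF ((volatile uint16_t *)0xB8000)

    /* Each cell is 16 bits: low byte = character, high byte = attribute
     * (low nibble foreground colour, high nibble background colour). */
    static void vga_putc_at(int row, int col, char ch, uint8_t attr)
    {
        VGA_TEXT_BUF[row * 80 + col] = ((uint16_t)attr << 8) | (uint8_t)ch;
    }

    void demo(void)
    {
        vga_putc_at(0, 0, 'A', 0x1F); /* white 'A' on blue, top-left corner */
    }

No driver, no BIOS call: the store instruction itself is the whole "display API", which is exactly the behaviour the question asks about.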

This was not done just for VGA, but also for its same-generation alternatives:

  G = Graphics Mode Video RAM
  M = Monochrome Text Mode Video RAM
  C = Color Text Mode Video RAM
  V = Video ROM BIOS (would be "a" in PS/2)
  a = Adapter board ROM and special-purpose RAM (free UMA space)
  r = Additional PS/2 Motherboard ROM BIOS (free UMA in non-PS/2 systems)
  R = Motherboard ROM BIOS
  b = IBM Cassette BASIC ROM (would be "R" in IBM compatibles)
  h = High Memory Area (HMA), if HIMEM.SYS is loaded.

Conventional (Base) Memory:
First 640 KiB (ten 64 KiB chunks, 000000 to 09FFFF).

Upper Memory Area (UMA):

0A0000: GGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGG
0B0000: MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCC
0C0000: VVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
0D0000: aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
0E0000: rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr
0F0000: RRRRRRRRRRRRRRRRRRRRRRRRbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbRRRRRRRR

(Source of the ASCII map).
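To make the addressing concrete, here is a small sketch of how a real-mode segment:offset pair lands in the map above; the linear address is simply segment × 16 + offset (the program below is just for illustration):

    #include <stdio.h>
    #include <stdint.h>
    #include <inttypes.h>

    /* Real-mode address translation: linear = segment * 16 + offset. */
    static uint32_t linear(uint16_t seg, uint16_t off)
    {
        return ((uint32_t)seg << 4) + off;
    }

    int main(void)
    {
        /* B800:0000 -> 0xB8000, the colour text RAM ("C" region above) */
        printf("B800:0000 -> %05" PRIX32 "\n", linear(0xB800, 0x0000));
        /* A000:0000 -> 0xA0000, graphics-mode video RAM ("G" region above) */
        printf("A000:0000 -> %05" PRIX32 "\n", linear(0xA000, 0x0000));
        return 0;
    }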

Solution 2:

Writing to a "fixed address" was essentially writing to the video card directly. All those ISA video cards (CGA, EGA, VGA) had some RAM (and registers) mapped directly into the CPU's memory and I/O space.

So when you wrote a byte to a certain memory location, that character (in text mode) appeared on screen immediately, since you had in fact written into memory located on the video card, and the video card simply displayed from that memory.
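The "registers in I/O space" half works the same way. As one sketch of it: the VGA CRT controller exposes an index/data port pair at 0x3D4/0x3D5, and writing its cursor-location registers moves the text-mode hardware cursor. Here the outb() helper is assumed to be supplied by your kernel (typically a small wrapper around the x86 OUT instruction):

    #include <stdint.h>

    void outb(uint16_t port, uint8_t val); /* assumed kernel port-I/O helper */

    /* Move the hardware cursor by programming the CRT controller's
     * cursor-location registers (index 0x0E = high byte, 0x0F = low byte). */
    static void vga_move_cursor(int row, int col)
    {
        uint16_t pos = (uint16_t)(row * 80 + col);

        outb(0x3D4, 0x0E);                 /* select cursor location high */
        outb(0x3D5, (uint8_t)(pos >> 8));
        outb(0x3D4, 0x0F);                 /* select cursor location low */
        outb(0x3D5, (uint8_t)(pos & 0xFF));
    }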

This all looks very confusing today, especially considering that today's video cards are sometimes still called VGA (though they bear little resemblance to "true" VGA cards from the 1990s). However, even modern cards emulate some of the functionality of these older designs (you can boot DOS on most modern PCs and use DOS programs that write to video memory directly). Of course, nowadays it is all emulated in the video card's firmware.