What is up with screen resolutions? [closed]
Solution 1:
In the "old days", TVs were analog devices - a CRT scans an electron beam across the front of the display, from left to right. In this sense, people have tried to argue that analog TV has "infinite" horizontal resolution, but they have an exact vertical resolution - the picture is formed of multiple horizontal lines.
Depending on where you are, this would have been either NTSC (525 lines) or PAL (625 lines).
Modern resolutions are still referred to by their "line count" - hence 1080 is the vertical resolution.
With such displays, the image was transmitted interlaced - i.e: the first field contains lines 1, 3, 5, etc... and the second field contains lines 2, 4, 6, etc...
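To make the field ordering concrete, here's a minimal Python sketch that weaves two fields back into one progressive frame (the list-of-rows representation is purely illustrative):

```python
# Weave two interlaced fields back into a single progressive frame.
# Field 1 carries the odd-numbered lines (1, 3, 5, ...),
# field 2 carries the even-numbered lines (2, 4, 6, ...).

def weave(field1, field2):
    frame = []
    for odd_line, even_line in zip(field1, field2):
        frame.append(odd_line)   # lines 1, 3, 5, ...
        frame.append(even_line)  # lines 2, 4, 6, ...
    return frame

# Two tiny 3-line fields standing in for the halves of a 6-line frame.
print(weave(["line 1", "line 3", "line 5"],
            ["line 2", "line 4", "line 6"]))
# ['line 1', 'line 2', 'line 3', 'line 4', 'line 5', 'line 6']
```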
With the advent of digital TV, the display technology changed - we now have discrete pixels instead of lines. A 2D image is made from an array of pixels - the array having specific dimensions in the horizontal and vertical axes.
At this point, interlacing remains around only to reduce the bandwidth required for a video stream - in my opinion it's a horrible idea that is fundamentally incompatible with digital display systems.
As mentioned earlier, modern resolutions are still referred to by their "line count". But as shown above, we also have to identify whether we are referring to "interlaced" video (indicated by an `i`), or "progressive" video (indicated by a `p`).
The frame rate can also be specified, for example:
- `480i60` - 480 rows, interlaced, 60 Hz field rate (i.e: 30 Hz frame rate)
- `1080p30` - 1080 rows, progressive, 30 Hz frame rate
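To make that notation concrete, here's a rough Python sketch that splits a mode string into its parts (the `parse_video_mode` helper is hypothetical, purely for illustration):

```python
import re

def parse_video_mode(mode):
    """Split a mode string like '480i60' or '1080p30' into its parts."""
    rows, scan, rate = re.fullmatch(r"(\d+)([ip])(\d+)?", mode).groups()
    info = {"rows": int(rows),
            "scan": "interlaced" if scan == "i" else "progressive"}
    if rate:
        if scan == "i":
            info["field_rate_hz"] = int(rate)
            info["frame_rate_hz"] = int(rate) / 2  # two fields make one frame
        else:
            info["frame_rate_hz"] = int(rate)
    return info

print(parse_video_mode("480i60"))
# {'rows': 480, 'scan': 'interlaced', 'field_rate_hz': 60, 'frame_rate_hz': 30.0}
print(parse_video_mode("1080p30"))
# {'rows': 1080, 'scan': 'progressive', 'frame_rate_hz': 30}
```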
"... okay, but where did 480 come from?"
The analog electronics involved in CRTs are imprecise, and a particular feature of early models was that as the set warmed up, or the capacitors and electronics aged, the image started to change shape. In addition to this, the electron beam has to be turned off and then redirected to the left of the screen for each line, and to the top for each new field/frame - this takes time, and is the reason for "blanking".
To account for this somewhat, not all of the scanned lines were intended / expected to be displayed. NTSC scans 525 lines, but only 483 of these are intended for display - each field shows ~241.5 lines to a viewer (with an additional ~21 lines of "blanking" per field). ~240 lines per field (remember, this is interlaced) equates to 480 lines per frame (in the progressive world). Thus 480. Yay.
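Written out as arithmetic (a quick Python sketch, using the figures from the paragraph above):

```python
total_lines   = 525                      # lines scanned per NTSC frame
blanking      = 21 * 2                   # ~21 lines of blanking in each of the 2 fields
visible_lines = total_lines - blanking   # ~483 lines intended for display
per_field     = visible_lines / 2        # ~241.5 visible lines per field
per_frame     = 240 * 2                  # rounding down per field, two fields per frame
print(visible_lines, per_field, per_frame)  # 483 241.5 480
```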
For the digital resolutions we follow a pattern... ish:
- `480 * 1.5 = 720` - "HD Ready"
- `720 * 1.5 = 1080` - "Full HD"
- `1080 * 2 = 2160` - "4k" or "Ultra HD"
So actually "4k" isn't following the * 1.5
pattern from before, and it's not really 4000 pixels in either dimension - it's 3840 × 2160.
"4k" is actually "Four times the number of pixels as Full HD". 1920 * 2 = 3840
and 1080 * 2 = 2160
. Or lay out four 1080p displays in a 2 × 2 grid - you get 3840 × 2160.
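A quick check of the pixel counts (sketch):

```python
full_hd_pixels  = 1920 * 1080              # 2,073,600 pixels
ultra_hd_pixels = 3840 * 2160              # 8,294,400 pixels
print(ultra_hd_pixels // full_hd_pixels)   # 4 - exactly four Full HD frames
```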
Additionally, if we're using `1080p` as a resolution description, then really "4k" should be called `2160p` (which it is in the technical world).
In summary, in the consumer / broadcast space:
- `480` is because that's approximately the number of visible lines that NTSC would display
- `720` is because that's 1.5 × 480
- `1080` is because that's 1.5 × 720
- `2160` is because that's 2 × 1080
- "4k" is because it's a marketing thing - it isn't a technical specification, and I don't believe that there are any stipulations around the use of it...
Note: I've been talking about consumer / broadcast...
Within Cinema there is DCI 2K (capital K, 2048 × 1080) and DCI 4K (capital K, 4096 × 2160), where the 'K' presumably refers to kibi and the horizontal resolution. "DCI 4K" predates consumer "4k".
The aspect ratio of DCI 4K is not 16:9, but a slightly wider 256∶135... to bring the resolution back in line with 16:9, you can increase the vertical resolution, or decrease the horizontal... But I'm not entirely convinced that the cinema and broadcast standards evolved this closely.
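To see how far DCI 4K is from 16:9, and what the two adjustments look like, here's a small sketch (the 4096 × 2304 figure is just the arithmetic result of increasing the vertical resolution; it isn't a named standard as far as I know):

```python
from fractions import Fraction

print(Fraction(4096, 2160))   # 256/135 - DCI 4K, ~1.90:1
print(Fraction(3840, 2160))   # 16/9    - consumer "4k" / Ultra HD, ~1.78:1

# Two ways to get a 16:9 frame from the DCI 4K dimensions:
print(4096 * 9 // 16)         # 2304 -> 4096 x 2304 (increase the vertical resolution)
print(2160 * 16 // 9)         # 3840 -> 3840 x 2160 (decrease the horizontal resolution)
```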
Cinema evolved from whole-frame positives (aka film) directly to digital, whereas TV evolved from a scanning electron beam (line by line) to digital. This is evident both in broadcast and in the way that VHS operates.
As a little extra, I've included the graphic below to expand on the "image changes shape" statement from above.
The graphic (from here) indicates the various "television safe" areas... note the rounded corners...
Importantly:
- 5 is the "television scanned area"
- 6 is the "television safe area for action"
  - Faces and important plot information should not fall outside this area
- 7 is the "television safe area for titles"
  - No text should be outside this area to ensure that subtitles / news tickers / etc... don't overhang the edge of an old display
Solution 2:
Attie has explained how vertical pixel counts were derived for traditional resolutions such as SD (480p), HD (720p) and Full HD (1080p). I would like to discuss "K" measurements, which originate from the cinema industry. A "K" is about 1000 horizontal pixels, and 2K and 4K were casual terms referring to resolutions of approximately 2000 and 4000 horizontal pixels.
Different K measurements were later standardised for digital cinema (2005) and consumer television (2007).
Digital Cinema Initiatives has specified the DCI 4K and 2K resolution standards (4096 × 2160 and 2048 × 1080). These are the full frame resolutions for digital cinema projectors, with many films displayed in a crop format such as flat crop (3996 × 2160, 1.85∶1 aspect ratio) or CinemaScope crop (4096 × 1716, ≈2.39∶1 aspect ratio).
The 4K standard for consumer television (UHDTV1) can be more accurately labelled as Ultra HD or Quad Full HD. It is equivalent to 4 Full HD frames placed together. It is 3840 × 2160 with a ≈1.78∶1 (16∶9) aspect ratio.
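For reference, the aspect ratios of the frame sizes mentioned above work out like this (a quick sketch):

```python
formats = {
    "DCI 4K full frame": (4096, 2160),
    "Flat crop": (3996, 2160),
    "CinemaScope crop": (4096, 1716),
    "UHDTV1 / Ultra HD": (3840, 2160),
}
for name, (w, h) in formats.items():
    print(f"{name}: {w} x {h} -> {w / h:.2f}:1")
# DCI 4K full frame: 4096 x 2160 -> 1.90:1
# Flat crop: 3996 x 2160 -> 1.85:1
# CinemaScope crop: 4096 x 1716 -> 2.39:1
# UHDTV1 / Ultra HD: 3840 x 2160 -> 1.78:1
```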
The Wikipedia article on 4K resolution provides a helpful visualisation which compares different frame sizes.
Solution 3:
Your assessment that resolution is pixel pitch is kind of correct. In fact, for everything except digital displays, that's the correct definition of resolution (well, more precisely, it's the smallest detail that can be resolved accurately, which functionally translates to the size of the dots making up an image, and from that to the pixel pitch). If you ever talk to somebody in the printing industry, this is usually what they mean when they say 'resolution'.
The reason that the term has a different meaning for computer displays is largely historical, but is actually pretty easily explained without getting into much history.
On a raster display¹, you have what's known as a 'native' resolution. This is the highest number of pixels you can have on each side of the screen, and is usually what is used to describe the screen. For example, a Full HD display has a native resolution of 1920 pixels horizontally, and 1080 vertically, so a Full HD display with a roughly 18 inch diagonal has a resolution (in the classical sense) of about 120 dots per inch (DPI) (which is actually pretty low compared to print media).
However, you can run most displays at lower than their native resolution, which is functionally equivalent to having a larger pixel pitch on the same display. Taking that same 18 inch Full HD display and running it at 1280 × 720 gives you the equivalent of 80 DPI on the same screen.
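Those DPI figures fall out of a simple diagonal calculation; here's a minimal sketch (the 18-inch diagonal is the assumption from the example above):

```python
from math import hypot

def dpi(width_px, height_px, diagonal_inches):
    # Pixels along the diagonal divided by the diagonal length in inches.
    return hypot(width_px, height_px) / diagonal_inches

print(round(dpi(1920, 1080, 18)))  # 122 - the "about 120 DPI" figure
print(round(dpi(1280, 720, 18)))   # 82  - the "about 80 DPI" figure
```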
Now, the OS and application software (usually) don't care about the exact physical dimensions of the screen, because that information is not really all that useful for displaying information unless you need to display something 'real-size'. Given this, the OS just assumes the size of the display never changes, which means that the different pixel counts are functionally equivalent to the resolution.
¹ A display that uses rows and columns of individual points that can be turned on and off to produce an image. Almost all modern computer displays are raster displays, because they're far easier to make. Compare to a vector display, which just draws lines directly (with one of the most recognizable examples being the displays used in the original Asteroids arcade cabinets).