Why does 1366x768 resolution exist? [duplicate]

I know there's a previous question about this, but it doesn't have any real answers despite having been viewed 12,400 times; on top of that, it's been closed. With that in mind...

Why in the world is 1366x768 resolution a real thing? It has an aspect ratio of 683:384, which is the weirdest thing I've ever heard of while living in a 16:9 world.

All the screens and resolutions I'm familiar with are 16:9. My screen, 1920x1080, is 16:9. The 720p I'm familiar with is 1280x720, also 16:9. The 4K I'm familiar with, 3840x2160, is also 16:9. Yet 1366x768 is 683:384, a seemingly wild break from the standard.
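
For reference, here's how each of those reduces to its simplest ratio (a quick Python check):

```python
from math import gcd

# Reduce each resolution to its simplest integer aspect ratio
for w, h in [(1920, 1080), (1280, 720), (3840, 2160), (1366, 768)]:
    d = gcd(w, h)
    print(f"{w}x{h} -> {w // d}:{h // d}")

# 1920x1080 -> 16:9
# 1280x720  -> 16:9
# 3840x2160 -> 16:9
# 1366x768  -> 683:384  (the gcd is only 2, since 683 is prime)
```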

I know there are plenty of other resolutions all over the place, but 1366x768 seems to dominate most of the mid-priced laptop world, and it also seems unique to laptops. Why don't laptops use 1280x720 or something else as a standard?


Solution 1:

According to Wikipedia:

The basis for this otherwise odd seeming resolution is similar to that of other "wide" standards – the line scan (refresh) rate of the well-established "XGA" standard (1024x768 pixels, 4:3 aspect) extended to give square pixels on the increasingly popular 16:9 widescreen display ratio without having to effect major signalling changes other than a faster pixel clock, or manufacturing changes other than extending panel width by 1/3rd. As 768 does not divide exactly into 9, the aspect ratio is not quite 16:9 – this would require a horizontal width of 1365.33 pixels. However, at only 0.05%, the resulting error is insignificant.

No citations are provided, but it's a reasonable explanation: 1366x768 is the closest you can get to 16:9 while keeping the 768-line vertical resolution of 1024x768, which had been widely used for manufacturing early 4:3 LCD panels. Reusing that line count presumably helped reduce costs.
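
A quick sketch of that arithmetic, just to check the quote's numbers:

```python
import math

# Width needed for an exact 16:9 ratio at 768 lines
exact = 768 * 16 / 9        # 1365.333...
# Equivalently, the quote's "extending panel width by 1/3rd":
assert 1024 * 4 / 3 == exact

width = math.ceil(exact)    # 1366: round up to the next whole pixel
error = (width - exact) / exact
print(width, f"{error:.2%}")  # 1366 0.05%
```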

Solution 2:

At the time the first widescreen computer displays became popular, the usual resolution on 4:3 panels was 1024x768 (the XGA display standard). So, for simplicity and backward compatibility, the XGA resolution was kept as the basis for the WXGA resolution, so that XGA graphics could be displayed easily on WXGA screens. Extending only the width and keeping the same height was also simpler technically, because you would only have to tweak the horizontal refresh rate timing. However, the standard aspect ratio for widescreen displays was 16:9, which isn't possible with a height of 768 pixels, so the nearest value was chosen: 1366x768.

WXGA can also refer to a 1360x768 resolution (and some other, less common variants), which was introduced to reduce costs in integrated circuits. At 8 bits per pixel, a 1366x768 frame takes just over 1 MiB to store (1024.5 KiB), so it wouldn't fit in an 8 Mbit memory chip; you would have to use a 16 Mbit one just to hold a few extra pixels. That's why something slightly below 1366 was chosen. Why 1360? Because it's divisible by 8 (and even by 16), which is much simpler to handle when processing graphics and lends itself to optimized algorithms.
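
A rough check of those numbers (a Python sketch assuming 8 bits per pixel and a single framebuffer):

```python
CHIP_8MBIT = 8 * 1024 * 1024 // 8   # 8 Mbit chip = 1,048,576 bytes (1 MiB)

for w, h in [(1366, 768), (1360, 768)]:
    size = w * h                    # framebuffer size in bytes at 8 bpp
    verdict = "fits" if size <= CHIP_8MBIT else "does NOT fit"
    print(f"{w}x{h}: {size} bytes = {size / 1024} KiB -> {verdict}")

# 1366x768: 1049088 bytes = 1024.5 KiB -> does NOT fit
# 1360x768: 1044480 bytes = 1020.0 KiB -> fits
print(1360 % 8, 1360 % 16)          # 0 0 -> width divides cleanly by 8 and 16
```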

Solution 3:

I had the same question back in 2007, because my computer didn't support my TV's native resolution of 1366x768, and I found this:

WHY does 1366 x 768 exist?

This has to do with a 1-megapixel processing boundary of readily available chipsets for VRAM (video memory) and video display drivers. It's a standard memory size of importance to chip makers, and it makes for cost-effective configurations where the input/output systems are built from already-available OEM devices, so the manufacturer is basically in the business of making flat-panel glass and handling the bezel and speakers on a large display. The basic math:

1 megapixel:

1024 x 1024 = 1048576 pixels

1366 x 768 = 1049088 pixels (16:9 image)

720p = 1280 x 720 = 921600 pixels (the 16:9 HD standard)

720p is just under 1 megapixel of data per screen.

If they had really wanted to make a 720p-specific display, it would be 1280 x 720 pixels, but they decided to squeeze every last bit they could into the viewable pixel space, and that is how the 16:9 numbers become 1366 across and 768 vertically. In fact, 768 is a common vertical-resolution memory boundary. Why push more pixels into the glass and use 1366 x 768? Because more pixels means better image resolution.
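
Those pixel counts are easy to verify; note that 1366x768 actually lands just over the 2^20 boundary, not under it:

```python
BOUNDARY = 2**20                    # 1,048,576 pixels

for w, h in [(1024, 1024), (1366, 768), (1280, 720)]:
    px = w * h
    print(f"{w}x{h}: {px} pixels ({px - BOUNDARY:+} vs 2^20)")

# 1024x1024: 1048576 pixels (+0 vs 2^20)
# 1366x768:  1049088 pixels (+512 vs 2^20)
# 1280x720:  921600 pixels (-126976 vs 2^20)
```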

I recommend reading the full article here:

http://hd1080i.blogspot.com.ar/2006/12/1080i-on-1366x768-resolution-problems.html

Solution 4:

768 is the sum of two powers of 2 (512 + 256), and 1366 is 16/9 times 768, rounded up to the next integer. But only the person who selected that resolution can answer the "why". Some people just like powers of 2.

Also, 768 times 1366 is just over one mebipixel (2^20 pixels), which is about 1.05 megapixels (1.05 * 10^6). 768 times 1365 is just under that, so marketing probably came into play, too.
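
Both claims check out:

```python
import math

# 768 is the sum of two powers of two
assert 768 == 2**9 + 2**8 == 512 + 256

# 1366 is 16/9 times 768, rounded up to the next integer
assert 1366 == math.ceil(768 * 16 / 9)

# One column fewer (1365 wide) drops the frame just under 2^20 pixels
print(1366 * 768 - 2**20)   # 512  -> just over one mebipixel
print(1365 * 768 - 2**20)   # -256 -> just under
```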