How does display scaling work on a Mac?

What happens is that the system renders the image offscreen at a much higher resolution, one where you never end up with "a third of a pixel" and similar fractional geometry. That image is then downsampled to the actual resolution of your monitor and displayed.

This process is of course not "lossless" and will give a slightly worse image than if you had a monitor of a high enough resolution that this problem does not occur. However, given the relatively high resolutions we work with today, it doesn't look as bad as you might think.
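To make the fractional-pixel problem concrete, here is a minimal sketch in Swift (plain arithmetic, not any Apple API) of where logical point boundaries land on the physical pixel grid when a 3840x2160 panel is set to "looks like 2560x1440":

```swift
// At "looks like 2560x1440" on a 3840x2160 panel, each logical point
// spans 1.5 physical pixels, so every other point boundary falls
// mid-pixel -- the fractional-pixel problem described above.

let panelWidth = 3840.0
let looksLikeWidth = 2560.0
let pixelsPerPoint = panelWidth / looksLikeWidth   // 1.5

for point in 0...4 {
    let physicalEdge = Double(point) * pixelsPerPoint
    let onGrid = physicalEdge == physicalEdge.rounded()
    print("boundary of point \(point) lands at physical pixel \(physicalEdge)"
          + (onGrid ? " (on the grid)" : " (mid-pixel)"))
}
// boundary of point 1 lands at physical pixel 1.5 (mid-pixel)
// boundary of point 2 lands at physical pixel 3.0 (on the grid)
```

Rendering at 2x offscreen first means those mid-pixel edges are resolved by one high-quality downsample rather than by each application drawing fractional pixels itself.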

It is always a tradeoff: whether you prefer the perfect quality of 4K and 1920x1080 matching each other exactly, or would rather have more screen real estate by running the 4K panel at a "looks like 1440p" setting.


When you choose 'looks like 2560x1440', and that is not an integer scale factor of 4K (3840x2160), macOS creates a virtual display at twice that size (i.e. 5120x2880). All software (the OS and applications) draws into that double-size display.

The display driver then downsizes from the 5120x2880 virtual display to the 3840x2160 physical display. Because the downsampling happens once, system-wide, the result is consistent across all applications.

As you would expect, this creates a slight fuzziness, but it does mean that text looks the intended 'right' size.

When you choose 1920x1080, the scaling factor to 3840x2160 is exactly 2, so the virtual display (twice 1920x1080) is the same size as the physical display. As a consequence there is no fuzziness, but everything looks far too large.
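The arithmetic of the last few paragraphs can be sketched in a few lines of Swift. The type and function names here are hypothetical, purely for illustration; in reality macOS does all of this inside the WindowServer and the display driver:

```swift
// Sketch of the backing-store arithmetic described above.

struct DisplayMode {
    let looksLike: (w: Int, h: Int)   // the resolution you pick in System Settings
    let panel: (w: Int, h: Int)       // the physical panel resolution
}

func describe(_ mode: DisplayMode) {
    // macOS renders everything at 2x the "looks like" size...
    let virtualW = mode.looksLike.w * 2
    let virtualH = mode.looksLike.h * 2
    // ...then scales that virtual image onto the physical panel.
    let factor = Double(mode.panel.w) / Double(virtualW)
    let isInteger = factor == factor.rounded()
    print("looks like \(mode.looksLike.w)x\(mode.looksLike.h): "
          + "renders at \(virtualW)x\(virtualH), scaled by \(factor)"
          + (isInteger ? " (pixel-exact, no fuzziness)" : " (fractional, slight fuzziness)"))
}

describe(DisplayMode(looksLike: (w: 2560, h: 1440), panel: (w: 3840, h: 2160)))
// looks like 2560x1440: renders at 5120x2880, scaled by 0.75 (fractional, slight fuzziness)
describe(DisplayMode(looksLike: (w: 1920, h: 1080), panel: (w: 3840, h: 2160)))
// looks like 1920x1080: renders at 3840x2160, scaled by 1.0 (pixel-exact, no fuzziness)
```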

Macs always look best (for both text size and crispness) with screens of about 220 pixels per inch (ppi); older screens were all about 110 ppi. Anything else is a compromise between best crispness and best text size. It is no accident that the new 24" (actually 23.5") iMac has a 4.5K screen: 4480x2520 at 218 ppi.
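Those ppi figures are easy to verify: ppi is the diagonal pixel count divided by the diagonal size in inches. A quick Swift sketch, using the iMac resolution mentioned above plus a few common 27" panels for comparison (the helper function is mine, not an API):

```swift
// ppi = diagonal resolution in pixels / diagonal size in inches
func ppi(width: Double, height: Double, diagonalInches: Double) -> Double {
    (width * width + height * height).squareRoot() / diagonalInches
}

print(ppi(width: 4480, height: 2520, diagonalInches: 23.5)) // ~218.7 (24" iMac)
print(ppi(width: 5120, height: 2880, diagonalInches: 27))   // ~217.6 (27" 5K: the ~220 ppi sweet spot)
print(ppi(width: 3840, height: 2160, diagonalInches: 27))   // ~163.2 (27" 4K: between the sweet spots)
print(ppi(width: 2560, height: 1440, diagonalInches: 27))   // ~108.8 (27" 1440p: the old ~110 ppi class)
```

This is why a 27" 4K display, at roughly 163 ppi, always involves some compromise on a Mac: it sits between the ~110 ppi and ~220 ppi targets the system is designed around.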