Principle of resolution scaling via external monitor (GPU performance)

It is not clear to me how your ranking is supposed to function.

You need to distinguish between running with and without HiDPI mode ("Retina mode") - this distinction is very important.

You can run a 4k monitor at 4k resolution (as seen by the monitor) in two modes - either HiDPI Retina or native 4k. HiDPI mode is the default chosen by macOS; in that case the setting is labelled "Looks like 1080p".

When running in "native 4k", everything is rendered 1:1 - each pixel drawn by the system corresponds to one pixel on the monitor. This means that user interface elements typically look very small on a normal-sized monitor.

When running in "HiDPI" mode, the setting is labelled "Looks like 1080p". That is, text and user interface elements have the same physical size on the monitor as if you had chosen a native 1080p resolution. However, text, video, pictures, etc. are actually rendered at 4k (3840x2160) resolution, giving you crisper text and full-resolution video.
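If you want to check which mode a screen is actually in, a minimal AppKit sketch along these lines (assuming you can run a short Swift script on the Mac in question) compares the logical "looks like" size with the backing store it is rendered into:

```swift
import AppKit

// Minimal sketch: report, for each attached screen, whether it is running in
// HiDPI ("Retina") mode by comparing the logical size with the backing store.
for screen in NSScreen.screens {
    let points = screen.frame.size                                // logical "looks like" size
    let pixels = screen.convertRectToBacking(screen.frame).size   // size actually rendered
    let scale  = screen.backingScaleFactor                        // 2.0 in HiDPI mode, 1.0 in native mode

    let mode = scale > 1.0 ? "HiDPI" : "native"
    print("\(Int(points.width))x\(Int(points.height)) points -> " +
          "\(Int(pixels.width))x\(Int(pixels.height)) pixels (\(mode), scale \(scale))")
}
```

On a 4k monitor in the default HiDPI mode this prints something like `1920x1080 points -> 3840x2160 pixels (HiDPI, scale 2.0)`.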

Whether "native 4k" mode or "HiDPI" mode is more taxing on the GPU depends on what applications you're running. If you're just looking at a blank desktop, "native 4k" mode should use the fewest resources on the system as a whole.

If you choose a different resolution, for example 1440p in HiDPI mode, the system will actually render at double that resolution in each dimension - i.e. 5120x2880 - and then scale it down to the 4k output for the monitor. In this case the system has to perform more work than it did for the "Looks like 1080p" HiDPI mode.
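To make that extra work concrete, here is a rough back-of-the-envelope sketch of the pixel arithmetic (just the numbers implied above; actual GPU load also depends on the applications):

```swift
// Rough sketch of the pixel arithmetic behind scaled HiDPI modes on a 4k panel.
let panelPixels = 3840 * 2160                      // 4k output: 8,294,400 px per frame

// "Looks like" resolutions and the backing store macOS renders for them
// (twice the resolution in each dimension):
let hiDPIModes = [(name: "Looks like 1080p (default)", w: 1920, h: 1080),
                  (name: "Looks like 1440p (scaled)",  w: 2560, h: 1440)]

for mode in hiDPIModes {
    let renderW = mode.w * 2
    let renderH = mode.h * 2
    let rendered = renderW * renderH
    let note = (rendered == panelPixels) ? "matches the panel, no downscale"
                                         : "then downscaled to 3840x2160"
    print("\(mode.name): renders \(renderW)x\(renderH) = \(rendered) px - \(note)")
}
// "Looks like 1080p" renders exactly 3840x2160.
// "Looks like 1440p" renders 5120x2880 - roughly 1.78x as many pixels, plus a scaling pass.
```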

However, if you run 1440p in native mode - i.e. the output signal to the monitor is 1440p - that is usually less taxing for the system than native 4k.
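For the native modes the comparison is simply the number of output pixels per frame (again just a sketch of the arithmetic, not a benchmark):

```swift
// Sketch: output pixels per frame for the two native modes.
let native1440p = 2560 * 1440      // 3,686,400 px
let native4k    = 3840 * 2160      // 8,294,400 px
print("native 1440p uses \(Double(native1440p) / Double(native4k)) of the pixels of native 4k")
// prints roughly 0.44 - less than half the pixels to render and drive,
// and no HiDPI backing store to scale
```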