Resolution scaling and performance
Solution 1:
Question 1:
In both cases the display is running at the same actual resolution (3840x2160). However, the graphics are first rendered virtually at 6400x3600 (twice 3200x1800 in each dimension) and then downsampled to the actual resolution.
This is done because 3840x2160 is not an integer multiple of 3200x1800 (the scale factor would be 1.2), so pixels of the requested resolution cannot map cleanly onto whole physical pixels, and some compromise in image quality is unavoidable. To make the best compromise, the display is rendered virtually at a much higher resolution and then downsampled to the actual resolution. The extra image data lets the computer choose a better value for each physical pixel than if it only had a 3840x2160 virtual rendering to work with.
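The arithmetic behind this can be sketched in a few lines. This is a simplified illustration of the idea, not any operating system's actual implementation; the resolutions are the ones from the question above:

```python
# Illustration of scaled-mode (HiDPI) rendering arithmetic.
PANEL = (3840, 2160)       # actual (physical) resolution
LOOKS_LIKE = (3200, 1800)  # requested "looks like" resolution

# Scaling directly from the requested size to the panel would need a
# factor of 3840/3200 = 1.2 -- not an integer, so requested pixels
# cannot map cleanly onto whole physical pixels.
direct_scale = PANEL[0] / LOOKS_LIKE[0]

# Instead, render at 2x the requested size...
backing = (LOOKS_LIKE[0] * 2, LOOKS_LIKE[1] * 2)  # (6400, 3600)

# ...then downsample to the panel. Each physical pixel is now derived
# from roughly (6400/3840)^2 ~= 2.8 rendered pixels' worth of data,
# which gives the scaler more information per output pixel.
downsample_factor = backing[0] / PANEL[0]

print(f"direct scale factor:  {direct_scale}")
print(f"backing resolution:   {backing}")
print(f"downsample factor:    {downsample_factor:.3f}")
```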
Question 2:
Yes, I think that is the case here.