What are the differences in scaling on the GPU vs on the Display?

I'm currently using a ViewSonic XG-2700-4K with a GTX 1080. However, in some games where I'd rather have High settings than Medium, I have to sacrifice resolution.

I use CRU (Custom Resolution Utility) to have a custom resolution of 3200x1800. However, some people recommend GPU scaling and others Display scaling to improve the image.

What are some of the differences in scaling on the GPU vs on the Display? Would one look better in-game over the other? Are there any other differences besides image quality?


It typically boils down to the individual hardware combination, I'd say. I think these are the most important points here:

  • No matter where you do the scaling, it takes some time to compute (see the rough sketch after this list).

  • On the GPU this might reduce your frame rate if it's already running at its limits.

  • On the screen this might cause delays, i.e. frames being shown slightly later than they were rendered. This can be perceived as input lag (especially at high resolutions).
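
To put a rough number on that "some time", here's a quick back-of-the-envelope calculation (plain Python) for your 3200x1800 to 3840x2160 case. The 4-tap bilinear figure is just an assumption for illustration; real scalers use different filters:

    # Rough numbers for upscaling 3200x1800 to the panel's native 3840x2160.
    render_res = (3200, 1800)   # what the game actually renders
    native_res = (3840, 2160)   # what the panel has to display

    rendered_pixels = render_res[0] * render_res[1]
    output_pixels = native_res[0] * native_res[1]

    print(f"Rendered pixels per frame:      {rendered_pixels:,}")
    print(f"Pixels the scaler must produce: {output_pixels:,}")
    print(f"Render load vs. native:         {rendered_pixels / output_pixels:.0%}")

    # Assumption: a bilinear upscale reads 4 source pixels per output pixel,
    # so the scaling pass touches all ~8.3M output pixels every frame;
    # that's the extra work, no matter whether the GPU or the monitor does it.
    taps_per_output_pixel = 4
    print(f"Texture reads per frame (bilinear): {output_pixels * taps_per_output_pixel:,}")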

As such I'd say the question boils down to the following picks (summed up as a small sketch after the list):

  • If your GPU isn't running maxed out, let the GPU do the scaling.

  • If your GPU is maxed out and you prefer no input lag/delays over a maxed frame rate, let the GPU do the work.

  • If you prefer maxed frame rate no matter what and your GPU is maxed, let your screen do the scaling.

  • If we're talking about desktop work only, pick the screen for (usually?) lower power consumption.
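
To sum the list up, here's a tiny sketch of the same decision as a Python function. The flags are hypothetical, things you judge about your own setup by hand; nothing here queries real hardware:

    def pick_scaler(gpu_maxed_out: bool,
                    prefer_low_latency: bool,
                    desktop_only: bool = False) -> str:
        """Return 'gpu' or 'display', following the bullet points above."""
        if desktop_only:
            # Desktop work only: the screen usually draws less power for it.
            return "display"
        if not gpu_maxed_out:
            # GPU has headroom, so the scaling pass costs nothing noticeable.
            return "gpu"
        # GPU already at its limit: trade frame rate for latency or vice versa.
        return "gpu" if prefer_low_latency else "display"

    # Example: GPU maxed out in-game, but input lag bothers me more than FPS.
    print(pick_scaler(gpu_maxed_out=True, prefer_low_latency=True))  # -> gpu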

Also keep in mind that your screen might not accept just any resolution, since there might be memory limitations in place.
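
As a rough illustration of why memory matters at all, a single buffered frame already takes a fair bit of RAM in the scaler; the 4 bytes per pixel below are an assumption (8-bit RGBA), and real buffering schemes vary:

    def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
        # Memory needed to hold one full frame at the given resolution.
        return width * height * bytes_per_pixel / (1024 ** 2)

    for w, h in [(1920, 1080), (3200, 1800), (3840, 2160)]:
        print(f"{w}x{h}: ~{framebuffer_mib(w, h):.1f} MiB per buffered frame")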