Why is windowed mode always slower in games?

When I run games on Windows or Linux in windowed mode, my frame rate always seems to drop and start stuttering.

Is this a general problem, is it the machines I'm using not being awesome enough, or is it just the way modern game engines are designed?


I am going to assume you're running at the same resolution windowed as you would fullscreen (i.e. if you run at 800x600 windowed, you're comparing against 800x600 fullscreen), since that means the game has to render the same image in either case. If you run at a higher resolution fullscreen, then something else is likely going on.

When in windowed mode, your computer has to draw more than the game itself: your desktop and other windows are also visible, and time has to be spent drawing those as well.

However, there is another, more subtle difference, which your games may or may not make use of - it was somewhat common in previous years, but I'm not sure if game companies are still using it.

You see, the image you see on the screen is nothing more than a contiguous collection of rows of pixels. That is, nothing is stored between each row of pixels: it's just one large chunk of memory.

In fullscreen mode, the game can just place its image in that chunk of memory; when one row ends, the next begins, so you can copy your entire rendering buffer in one go. In windowed mode, that doesn't work any more: you now have to copy each row individually, because you have stuff to the left and to the right of your window which you aren't allowed to overwrite.
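To make the difference concrete, here is a minimal C++ sketch of the two copy patterns. Everything here is illustrative - the buffer pointers, the pitch value, and the function names are assumptions, since real code would get them from the graphics API:

```cpp
#include <cstdint>
#include <cstring>

// Hypothetical sketch of the two blit patterns described above.

// Fullscreen: the frame owns the whole surface, so one memcpy suffices.
void BlitFullscreen(const uint32_t* srcBuffer, uint32_t* screenBase,
                    int width, int height)
{
    std::memcpy(screenBase, srcBuffer,
                static_cast<size_t>(width) * height * sizeof(uint32_t));
}

// Windowed: the window occupies only part of each screen row, so every
// row must be copied separately, offset by the screen's pitch (the number
// of bytes per full screen row, which is wider than one window row).
void BlitWindowed(const uint32_t* srcBuffer, uint8_t* screenBase,
                  int screenPitchBytes, int winX, int winY,
                  int winWidth, int winHeight)
{
    for (int row = 0; row < winHeight; ++row) {
        uint8_t* dst = screenBase
                     + static_cast<size_t>(winY + row) * screenPitchBytes
                     + static_cast<size_t>(winX) * sizeof(uint32_t);
        std::memcpy(dst, srcBuffer + static_cast<size_t>(row) * winWidth,
                    static_cast<size_t>(winWidth) * sizeof(uint32_t));
    }
}

int main()
{
    // Tiny demo: a 4x4 "screen" with a 2x2 window placed at (1,1).
    uint32_t screen[4 * 4] = {};
    uint32_t frame[2 * 2]  = {1, 2, 3, 4};
    BlitWindowed(frame, reinterpret_cast<uint8_t*>(screen),
                 4 * 4 /* bytes per screen row */, 1, 1, 2, 2);
    return 0;
}
```

Note that the windowed path isn't just more calls: each row's destination has to be computed from the screen's pitch, because you must skip over the pixels belonging to whatever sits left and right of the window.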

If your hardware is sufficiently fast, the difference should not be noticeable: although it is more work, it is not really that much more work, as long as you aren't keeping a ton of other things with complex rendering visible at the same time. And if you're capable of getting the full X FPS in fullscreen (where X is your monitor's refresh rate), it's quite likely you could have squeezed out a few more FPS; in windowed mode, the computing power behind those extra frames can go to all of the other stuff happening on screen instead.


It is a general concern.
When in windowed mode, the video card is busy rendering the complex graphics of the game while the desktop also needs GPU time to stay responsive. When the two share the card, the frame rate drops, because the delay between one frame and the next increases.

As a result, some games detect when they are running in windowed mode and purposely reduce their frame rate, to give the desktop more processing power. They usually offer this as an option you can turn off -- World of Warcraft is an example.
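As a rough illustration of what such an option might do under the hood, here is a hedged C++ sketch of a frame limiter that caps the rate more aggressively when the window loses focus. `IsWindowFocused` and `RenderFrame` are stand-ins, not real engine or OS calls:

```cpp
#include <chrono>
#include <thread>

// Stubs standing in for the engine's real hooks (assumptions, not an API).
bool IsWindowFocused() { return true; } // real code would ask the window system
void RenderFrame()     {}               // the game's normal update + render work

void RunFrameLoop()
{
    using clock = std::chrono::steady_clock;
    const auto foregroundBudget = std::chrono::microseconds(16667); // ~60 FPS
    const auto backgroundBudget = std::chrono::microseconds(33333); // ~30 FPS

    while (true) {
        const auto frameStart = clock::now();
        RenderFrame();

        // Pick the cheaper cap when the game isn't the focused window,
        // leaving CPU/GPU time for the desktop.
        const auto budget  = IsWindowFocused() ? foregroundBudget
                                               : backgroundBudget;
        const auto elapsed = clock::now() - frameStart;
        if (elapsed < budget)
            std::this_thread::sleep_for(budget - elapsed);
    }
}

int main() { RunFrameLoop(); } // runs forever, like a game's main loop
```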

If your processor and graphics card can already reach a high frame rate, you may not notice the drop, because they have enough headroom to absorb the extra delay from processes sharing the graphics card.

This is a general answer, unless someone else has more in-depth information.


I run World of Warcraft at 1080p with every driver option and in-game option set to maximum, except shadows in game and ambient occlusion in the driver. The amount of processing is phenomenal, and it runs at 60 FPS a lot of the time - the maximum the monitor will allow. Changing from full screen mode to full screen (windowed) mode displays the same thing, except that I can then move my mouse pointer onto my other monitor and drag windows on top of my game.

Having just done a test in a city with some scenery in the background, the frame rate dropped from 43 FPS (full screen) to 28 FPS (windowed, full screen). Full windowed mode was also at 28 FPS. I cannot believe that it takes that much processing to put the rendered output from an off-screen buffer into a window. You can do this in hardware in a number of ways; hell, the graphics card will even do it with an overlay for no cost. I also don't buy the priority explanation, since an idle desktop is generally not using any resources.

To crunch the maths: a 1080p frame at 32 bits per pixel is 1920 × 1080 × 4 bytes, roughly 7.9 MiB. Reading and writing that 60 times a second takes about 950 MB/sec of bandwidth. The graphics card I run has 115 GB/sec of bandwidth; that's roughly 0.8%, or in our 43 FPS scene, about 0.35 of an FPS. Processor usage drops from 50% in full screen to about 33% in the windowed modes.
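For anyone who wants to check the arithmetic, this small C++ program reproduces the back-of-envelope numbers above (it lands within rounding of the figures in the text):

```cpp
#include <cstdio>

// Reproduces the bandwidth estimate: 115 GB/s is the poster's card,
// 43 FPS is the measured fullscreen rate.
int main()
{
    const double bytesPerFrame = 1920.0 * 1080.0 * 4.0;          // 32-bit pixels
    const double frameMiB      = bytesPerFrame / (1024 * 1024);  // ~7.9 MiB

    // One read plus one write of the frame, 60 times per second.
    const double copyBytesPerSec = 2.0 * bytesPerFrame * 60.0;
    const double copyMBPerSec    = copyBytesPerSec / 1e6;        // ~995 MB/s

    const double cardBandwidth = 115e9;                           // 115 GB/s
    const double fraction      = copyBytesPerSec / cardBandwidth; // ~0.9%

    const double fullscreenFPS = 43.0;
    const double fpsCost       = fullscreenFPS * fraction;        // ~0.37 FPS

    std::printf("frame: %.1f MiB, copy: %.0f MB/s, %.2f%% of bandwidth, "
                "~%.2f FPS lost at 43 FPS\n",
                frameMiB, copyMBPerSec, fraction * 100.0, fpsCost);
    return 0;
}
```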

It takes a huge amount of optimisation to get games to run this fast; you have to cut out everything you can. I think that running in windowed mode likely pulls in an unoptimised part of Windows that is beyond the reach of game developers. DirectX is called that for a reason: it bypasses the slow, clunky, high-level graphics junk that made our old 386s go from lightning fast to taking two minutes to draw a window.

I think trying to involve this totally different part of Windows, one that really has nothing to do with real-time games, is what causes the issues. It's another layer shoved in between the game and the graphics driver, and one that was written without optimisation in mind. Of course, this is just a hypothetical explanation. It's a real shame, because it's nice to have a desktop to play with while playing games. But I guess you can't have everything.