Does Lowering The Monitor Refresh Rate Save Battery Life?

Solution 1:

LED: Refresh rate should have a minimal effect on energy efficiency. The LEDs only draw power when they are energized, and the overhead from the control circuitry should be more or less constant. (Note: this refers to 'true' LED displays, not LED-backlit LCD displays, which may or may not actually be commercially available at the time one is reading this.)

LCD: Refresh rate should have a minimal effect here, too. The backlight, which is on whenever the screen is active, is by far the biggest power draw. Since the liquid crystals only need to adjust their position/orientation when the color or brightness changes (which is a function of what is being displayed, not how quickly it is refreshed), I would think the differences there are negligible.

CRT: Here I would think reducing the refresh rate might have some appreciable benefit to energy efficiency. The picture is constructed by firing electrons at phosphor spots on the front glass. If you reduce the refresh rate, you reduce the number of electrons fired per second and thus the energy required to (a) generate the electrons and (b) slew the magnetic field generators that aim the electron beam. Of course, if the tube is tuned for optimal efficiency at a particular refresh rate, running it slower could reduce that efficiency enough to offset, or even outweigh, the savings from generating fewer electrons.
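To make that scaling argument concrete, here is a back-of-envelope sketch in Python. The numbers and the function are entirely made up (not measurements); the only point is that one term scales with the refresh rate while the overhead does not.

```python
# Toy model only: real CRT power depends on brightness, resolution and tube
# design. The coefficients below are invented to show the scaling, not measured.

def crt_power_estimate(refresh_hz, fixed_overhead_w=40.0, beam_w_per_hz=0.4):
    """Total power = constant control/heater overhead + cost per screen repaint."""
    return fixed_overhead_w + beam_w_per_hz * refresh_hz

for hz in (60, 75, 85):
    print(f"{hz} Hz -> ~{crt_power_estimate(hz):.0f} W")
# In this toy model the beam-related term shrinks as the refresh rate drops,
# while the fixed overhead (and any efficiency penalty) does not.
```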

Some related questions on energy efficiency can be found here and here; and one on monitors and eyestrain can be found here.

Solution 2:

It is true in certain circumstances - specifically, when the display driver (GPU) accounts for a significant share of the power consumption and the display itself is not capable of self-refresh (which is still a new and not yet widespread feature).

As other answers note, the power consumed by a display (possibly excluding CRTs) is largely unaffected by how often it refreshes, since the power consumption is dominated by producing a certain amount of light (and, as an unfortunate side-effect, heat) rather than by the activity required to perform the refresh. This only considers the display itself, though, and the display is only one part of the full pipeline that leads to an image being produced.

Looking at the bigger picture, a non-trivial amount of power is consumed by the GPU, and in fact even refreshing a static display can consume a meaningful amount of power - nowhere near the power needed to render a complex scene, to be sure, but still enough that it can be worth trying to save it. Panel self-refresh is a way to save this energy by allowing the GPU to skip frames which would be unchanged, and letting the display handle redrawing the static content where necessary. This is more likely to be a decent saving on a small and low-power display (such as on a phone) than on an enormous power-hungry monitor.
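As a rough illustration of the idea - not a real driver API; the class and method names below are hypothetical - the logic amounts to "only send a frame over the link when something actually changed":

```python
# Hypothetical sketch of panel self-refresh; real implementations live in the
# display driver and panel firmware, not in application code.

class SelfRefreshLink:
    def __init__(self):
        self.last_frame = None
        self.link_transfers = 0    # frames the GPU actually scanned out
        self.self_refreshes = 0    # refreshes served from the panel's own buffer

    def present(self, frame):
        if frame == self.last_frame:
            # Nothing changed: the panel redraws from its local buffer and the
            # GPU/link can drop into a low-power state for this refresh.
            self.self_refreshes += 1
        else:
            # Content changed: the GPU must send a fresh frame over the link.
            self.link_transfers += 1
            self.last_frame = frame

link = SelfRefreshLink()
for frame in ["desktop", "desktop", "desktop", "desktop + cursor moved"]:
    link.present(frame)
print(link.link_transfers, "transfers,", link.self_refreshes, "self-refreshes")
```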

There is a quick introduction to panel self-refresh at http://www.anandtech.com/show/7208/understanding-panel-self-refresh; as part of its rationale it covers the topic of this question.

Solution 3:

If your graphics card is drawing frames 40 times a second rather than 60, it will save some power because the card is less busy - it does roughly a third less rendering work (33% fewer frames, which is not the same as a 33% cut in total power). This is probably the source of any power savings.

If your monitor refresh rate is set to 40 Hz and the game you are playing is set to wait for vsync, then this certainly applies; otherwise your graphics card just draws frames as fast as it can.
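To show why waiting for vsync matters, here is a toy render loop in Python. The timings are invented and time.sleep stands in for both the GPU work and the vsync wait; the point is that the capped loop spends most of each refresh interval idle, which is where the power saving comes from.

```python
import time

REFRESH_HZ = 40
FRAME_BUDGET = 1.0 / REFRESH_HZ        # 25 ms per refresh at 40 Hz

def render():
    time.sleep(0.005)                  # pretend one frame takes 5 ms to draw

def run(seconds, wait_for_vsync):
    frames, deadline = 0, time.monotonic() + seconds
    while time.monotonic() < deadline:
        start = time.monotonic()
        render()
        frames += 1
        if wait_for_vsync:
            # Idle until the next refresh instead of immediately starting
            # another frame; this idle time is the power saving.
            time.sleep(max(0.0, FRAME_BUDGET - (time.monotonic() - start)))
    return frames

print("vsync-capped:", run(1.0, True), "frames")    # roughly 40
print("uncapped:    ", run(1.0, False), "frames")   # roughly 200, GPU never idles
```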

If you have Aero/desktop composition enabled, Windows is using the graphics card to render the desktop. I would imagine that the compositor draws windows according to the refresh rate, but I'm not sure; if it does, you'll save power there as well. Turning off Aero/composition will offload rendering to the CPU and may actually increase power consumption.

As for the signal sent from whatever generates the video to the LCD - the technique used is Low-Voltage Differential Signaling (LVDS), which works by driving two complementary signals and having the receiver compare their voltages. You might want to ask on electronics.stackexchange.com to be sure, but I think this means a roughly constant amount of power flows through the cable irrespective of what is transmitted, since the data is represented not by the amount of power but by the difference between the two signals. So the number of frames sent through the wire shouldn't affect the power used.
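Here is a small sketch of that encoding, assuming nominal LVDS-style figures (roughly a 3.5 mA loop current and a 350 mV swing; treat the numbers as assumptions rather than a datasheet). The dissipated power works out the same whichever bit value is on the pair.

```python
# Simplified differential-signaling illustration; the voltages and current are
# nominal values used only to show that the power on the pair does not depend
# on the bit pattern being transmitted.

V_HIGH, V_LOW = 1.375, 1.025       # volts on the + and - lines (nominal)
LOOP_CURRENT_A = 0.0035            # roughly constant driver current

def encode(bit):
    """One bit becomes a pair of complementary voltages."""
    return (V_HIGH, V_LOW) if bit else (V_LOW, V_HIGH)

def decode(v_plus, v_minus):
    """The receiver only looks at the sign of the difference."""
    return 1 if (v_plus - v_minus) > 0 else 0

for bit in (0, 1, 1, 0):
    vp, vm = encode(bit)
    power_w = abs(vp - vm) * LOOP_CURRENT_A   # ~1.2 mW either way
    assert decode(vp, vm) == bit
    print(bit, "->", (vp, vm), f"~{power_w * 1000:.2f} mW")
```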