Where does power consumption go in a computer?

Today we had a weird discussion over lunch: What exactly causes power consumption in a computer, particularly in the CPU? (ETA: For obvious reasons I don't need an explanation of why a hard drive, display or fan consumes power – the effect there is pretty obvious.)

Figures you usually see indicate that only a percentage (albeit a large one) of the power consumption ends up in heat. However, what exactly does happen with the rest? A CPU isn't (anymore) a device that mechanically moves parts, emits light or transforms energy in other obvious ways. Conservation of energy dictates that all the energy going in has to come out somewhere, and for something like a CPU I seriously can't imagine that output being anything but heat.

Being computer science rather than electrical engineering students certainly didn't help us answer the question accurately.


Solution 1:

Electrons are being pushed around, and that requires work. The electrons also experience "friction" (electrical resistance) as they move, which costs additional energy.

If you want to push electrons into a p-n junction in order to turn the transistor on, that requires energy. The electrons don't want to move, and they don't want to move closer together; you have to overcome their mutual repulsion.

Take the simplest CPU, a single, lone transistor:

[diagram of a single transistor]

Electrons lose energy as they bump around, generating heat. And overcoming the electric fields of attraction and repulsion requires energy.
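
To put a rough number on this: in CMOS logic, "pushing electrons" to turn a transistor on means charging its gate capacitance, and a full charge/discharge cycle of a node draws an energy of C·V² from the supply, all of which ends up as heat. Here is a minimal back-of-the-envelope sketch, where every number is an illustrative assumption rather than a measurement of any real CPU:

```python
# Back-of-the-envelope estimate of CMOS dynamic (switching) power.
# Charging a node's capacitance C to voltage V and discharging it again
# draws E = C * V^2 from the supply; half is dissipated on each half-cycle.
# All numbers below are illustrative assumptions, not real measurements.

C = 1e-15      # capacitance per switched node, ~1 fF (assumption)
V = 1.0        # supply voltage in volts (assumption)
f = 3e9        # clock frequency, 3 GHz (assumption)
alpha = 0.1    # activity factor: fraction of nodes switching each cycle (assumption)
n_nodes = 3e8  # number of switchable nodes on the chip (assumption)

energy_per_cycle = C * V**2                       # joules per node per full cycle
dynamic_power = alpha * n_nodes * energy_per_cycle * f

print(f"Energy per node per cycle: {energy_per_cycle:.1e} J")
print(f"Estimated dynamic power:   {dynamic_power:.0f} W")
```

This is the standard dynamic-power relation P = αCV²f, which is also why lowering the supply voltage is such an effective way to reduce CPU power.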

Solution 2:

There's an interesting Wikipedia article about Landauer's principle, which states that (quote):

"any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information bearing degrees of freedom of the information processing apparatus or its environment"

This means that (quote):

Specifically, each bit of lost information will lead to the release of an amount kT ln 2 of heat, where k is the Boltzmann constant and T is the absolute temperature of the circuit.

Still quoting:

For, if the number of possible logical states of a computation were to decrease as the computation proceeded forward (logical irreversibility), this would constitute a forbidden decrease of entropy, unless the number of possible physical states corresponding to each logical state were to simultaneously increase by at least a compensating amount, so that the total number of possible physical states was no smaller than originally (total entropy has not decreased).

So, as a consequence of the Second Law of Thermodynamics (and Landauer's principle), some computations cannot be performed without generating a minimum amount of heat, and this heat is not a consequence of the CPU's internal electrical resistance.
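
To get a feel for how tiny this thermodynamic minimum is, here is a quick calculation of the Landauer limit at room temperature; the comparison figure for present-day logic is a rough order-of-magnitude assumption:

```python
import math

# Landauer's limit: minimum heat released when one bit is erased, E = k*T*ln(2).
k = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0          # roughly room temperature, K

landauer_energy = k * T * math.log(2)   # joules per erased bit
print(f"Landauer limit at {T:.0f} K: {landauer_energy:.2e} J per bit")
# Prints ~2.87e-21 J. For comparison (rough assumption), a real logic
# operation today dissipates on the order of 1e-15 J or more, so actual
# CPUs run millions of times above this theoretical floor.
```

So while the bound is real, it is a theoretical floor: in practice virtually all of a CPU's heat comes from resistive and switching losses, not from Landauer's limit.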

Cheers!

Solution 3:

To add to the other excellent answers:

Figures you usually see indicate that only a percentage (albeit a large one) of the power consumption ends up in heat. However, what exactly does happen with the rest?

Actually, almost everything ends up as heat. By the law of conservation of energy, all the energy (which is power multiplied by time) has to end up somewhere. Almost every process inside a computer turns its energy into heat, directly or indirectly. For example, a fan turns electrical energy into moving air (kinetic energy), but the moving air is then slowed by friction with the surrounding air, which turns its kinetic energy into heat.

The same goes for the radiation (light from the monitor, EM radiation from all electrical components) and sound (fan noise, audio from loudspeakers) a computer produces: they too are eventually absorbed and transformed into heat.

If you read about a "percentage" that ends up as heat, that figure may have referred to the power supply alone. A power supply should indeed turn a large percentage of its input into usable electrical power rather than heat (though it does produce some heat as well). That delivered energy is then turned into heat by the rest of the computer :-).
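
As a small illustration of that split, here is a sketch with assumed, illustrative numbers (the wall power and efficiency below are not from any specific system):

```python
# Where the "percentage" goes: a power supply with efficiency eta turns
# (1 - eta) of the wall power into heat inside the supply itself; the rest
# is delivered to the components, which also dissipate it as heat in the end.
# Both numbers below are illustrative assumptions.

wall_power = 400.0   # watts drawn from the outlet (assumption)
eta = 0.85           # PSU efficiency, e.g. an "80 Plus"-class unit (assumption)

psu_heat = wall_power * (1 - eta)   # heat released inside the power supply
delivered = wall_power * eta        # electrical power fed to the components

print(f"Heat in the PSU itself:  {psu_heat:.0f} W")
print(f"Delivered to components: {delivered:.0f} W (eventually heat as well)")
```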