Why does the heat production increase as the clockrate of a CPU increases?

The whole multi-core debate got me thinking.

It's much easier to produce two cores (in one package) than to speed up one core by a factor of two. Why exactly is this? I googled a bit, but found mostly very imprecise answers on overclocking boards which do not explain the underlying physics.

The voltage seems to have the most impact (quadratic), but do I need to run a CPU at a higher voltage if I want a faster clock rate? I'd also like to know exactly why (and how much) heat a semiconductor circuit produces when it runs at a certain clock speed.


Each time the clock ticks you're charging or discharging a bunch of capacitors. The energy for charging a capacitor is:

E = 1/2*C*V^2

Where C is the capacitance and V is the voltage to which it was charged.

If your frequency is f[Hz], then you have f cycles per second, and your power is:

P = f*E = 1/2*C*V^2*f

That is why the power goes up linearly with frequency.
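
As a rough illustration of that formula, here is a quick calculation of dynamic power. The capacitance, voltage and frequency are made-up ballpark values for the sketch, not figures for any real CPU:

    # Dynamic (switching) power: P = 1/2 * C * V^2 * f
    # All values below are invented, illustrative numbers.

    C = 1e-9      # total switched capacitance per cycle, ~1 nF (assumed)
    V = 1.2       # core voltage in volts (assumed)
    f = 3e9       # clock frequency in Hz, 3 GHz (assumed)

    P = 0.5 * C * V**2 * f
    print(f"Dynamic power: {P:.1f} W")   # ~2.2 W with these numbers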

You can see that it goes up quadratically with voltage. Because of that, you always want to run at the lowest voltage possible. However, if you want to raise the frequency you also have to raise the voltage, because the transistors have to switch faster, and faster switching needs a higher operating voltage (see below). To a first approximation, the required voltage rises linearly with the frequency.

For this reason, the power rises like f^3 (or like V^3).

Now, when you increase the number of cores, you're basically increasing the capacitance C. This is independent of the voltage and of the frequency, so the power rises linearly with C. That is why it is more power efficient to increase the number of cores than it is to increase the frequency.
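
To make the comparison concrete, here is a small sketch of doubling the core count versus doubling the frequency, using the same P = 1/2*C*V^2*f formula and the simplified assumption from above that V scales linearly with f. All numbers are illustrative, not measured values:

    # Doubling cores vs. doubling frequency under P = 1/2 * C * V^2 * f,
    # assuming (simplification from above) that V scales linearly with f.
    # All values are invented for illustration.

    def dynamic_power(C, V, f):
        return 0.5 * C * V**2 * f

    C, V, f = 1e-9, 1.2, 3e9                     # baseline single core (assumed)
    base = dynamic_power(C, V, f)

    two_cores   = dynamic_power(2 * C, V, f)     # capacitance doubles -> 2x power
    double_freq = dynamic_power(C, 2 * V, 2 * f) # V and f double      -> 8x power

    print(f"baseline:       {base:.1f} W")
    print(f"two cores @ f:  {two_cores:.1f} W")      # ~2x baseline
    print(f"one core @ 2f:  {double_freq:.1f} W")    # ~8x baseline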

Why do you need to increase the voltage to increase the frequency? Well, the voltage of a capacitor changes according to:

dV/dt = I/C

where I is the current. So the higher the current, the faster you can charge the transistor's gate capacitance up to its "on" (threshold) voltage, which doesn't depend on the operating voltage, and the faster you can switch the transistor on. The drive current rises roughly linearly with the operating voltage. That's why you need to increase the voltage to increase the frequency.
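
Here is a toy calculation of that charging argument: the time for a constant current to charge a gate capacitance up to its threshold voltage, from dV/dt = I/C. The capacitance, threshold voltage and drive currents are invented illustrative values, not data for any real transistor:

    # Time to charge a gate capacitance to its "on" (threshold) voltage
    # with a constant current: dV/dt = I/C  =>  t = C * V_th / I.
    # All component values are invented for illustration.

    C_gate = 1e-15    # gate capacitance, ~1 fF (assumed)
    V_th   = 0.4      # threshold ("on") voltage in volts (assumed)

    for I in (10e-6, 20e-6, 40e-6):        # drive current in amperes
        t = C_gate * V_th / I              # charging time in seconds
        print(f"I = {I*1e6:4.0f} uA -> t = {t*1e12:.1f} ps")

Doubling the current halves the charging time, which is why a higher operating voltage (and thus a higher drive current) allows a higher clock frequency.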


Very basically:

  • A transistor switches faster when you apply more voltage to it.
  • Modern ICs consume most of their power when switching from one state to the next (on the clock tick), but consume almost no power to stay in the same state (well, there is leakage, so not exactly no power). So the faster the clock, the more switches per second you have and the more power you consume (see the sketch below).
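
A small sketch of that second point: switching (dynamic) power scales with the clock, while leakage (static) power is paid regardless. The leakage figure and other values are invented for illustration:

    # Dynamic power scales with the clock; leakage is paid even when
    # nothing switches. All values are invented, illustrative numbers.

    C, V   = 1e-9, 1.2     # switched capacitance and core voltage (assumed)
    P_leak = 0.5           # static leakage power in watts (assumed)

    for f in (1e9, 2e9, 3e9):
        P_dyn = 0.5 * C * V**2 * f
        print(f"f = {f/1e9:.0f} GHz: dynamic {P_dyn:.2f} W "
              f"+ leakage {P_leak:.2f} W = {P_dyn + P_leak:.2f} W total")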

A very good book on all the details of processor architecture: Computer Organization and Design by David A. Patterson and John L. Hennessy.


Every time a transistor switches state, charge is moved and energy is spent. A higher frequency means more switching per second, so more current flows. The resistance of everything in the path converts that energy to heat: P = I^2*R, and equivalently P = V^2/R. In practice you'd want the average V and I over time to calculate the dissipation, and it is quadratic in voltage and in current.
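
As a sketch of averaging over time, here is the usual way to get average dissipated power from sampled voltage and current waveforms, P_avg = mean(v(t) * i(t)). The waveforms below are synthetic stand-ins, not measurements from real hardware:

    # Average dissipated power from sampled v(t) and i(t) waveforms.
    # The waveforms are synthetic examples, not real measurements.
    import math

    N, dt = 1000, 1e-9     # 1000 samples, 1 ns apart (one 1 MHz period)
    v = [1.2 * (0.5 + 0.5 * math.sin(2 * math.pi * 1e6 * k * dt)) for k in range(N)]
    i = [0.8 * (0.5 + 0.5 * math.sin(2 * math.pi * 1e6 * k * dt)) for k in range(N)]

    p_avg = sum(vk * ik for vk, ik in zip(v, i)) / N
    print(f"average power: {p_avg:.3f} W")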


1) two cores vs. speeding up one core
To speed up one core you need new technology that makes the transistors switch from one state to another faster. To add another core you just need more of the same transistors.

2) Heat
The power dissipation is in the form of heat. Power = Voltage * Current. Voltage = Resistance * Current, so Power = Voltage^2 / Resistance. So the heat dissipated is proportional to the voltage squared.
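
A quick numeric check of that algebra, with made-up values:

    # P = V * I = V^2 / R with invented example values.
    V = 1.2        # volts (assumed)
    R = 0.5        # ohms, effective resistance (assumed)
    I = V / R      # amperes, from Ohm's law

    print(V * I)       # 2.88 W
    print(V**2 / R)    # 2.88 W, same result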