How much electricity is required to power 20 'average' computers on a LAN?

I'm trying to work out how much electricity is required to power 'x' number of computers. I know it's a vague question because some computers draw more than others (e.g. different chipsets, HDDs, video cards, PSUs, etc.).

So, let's just assume it's a mum-and-dad Dell computer with average, run-of-the-mill parts. Nothing fancy. 20" LCDs.

This is to help calculate the generator power required to keep around 'x' computers running on a LAN. The real figure is in the hundreds, but I'm assuming I can just figure out the base power draw for one machine and then multiply it by the number of seats (see the rough sketch after the list below).

I understand this doesn't include

  • Switches
  • Servers
  • Cooling (fans), etc.
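
To make the plan concrete, this is the kind of back-of-the-envelope calculation I have in mind; it's just a sketch, and the per-seat wattage here is a placeholder I'd replace with a measured figure:

```python
# Back-of-the-envelope: per-seat power multiplied by the number of seats.
# WATTS_PER_SEAT is a placeholder guess (PC + 20" LCD), not a measurement.
SEATS = 20
WATTS_PER_SEAT = 250

total_load_w = SEATS * WATTS_PER_SEAT
print(f"Estimated steady load: {total_load_w} W")   # 5000 W for these numbers
```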

I did some stats on this a while ago, FWIW, using the handy-dandy Kill A Watt.

Typical Developer Dell PC
(2.13 GHz Core 2 Duo, 2 GB RAM, 10k RPM 74 GB main hard drive, 7200 RPM 500 GB data drive, Radeon X1550 video)

Sleep                            1 w
Idle                            80 w
One CPU core fully loaded      108 w
Both CPU cores fully loaded    122 w

Standard Developer ThinkPad T60 Laptop
(Core 2 Duo 2.0 GHz, 100 GB HDD, ATI X1400 video)

Sleep                            1 w
Idle                            66 w
One CPU core fully loaded       74 w
Both CPU cores fully loaded     82 w

LCDs

Old Dell 19"                    50 w
Ancient, giant 21" NEC          67 w
New Dell 19"                    28 w
New Samsung 19"                 28 w
Apple 23" LCD                   72 w
Samsung 24" LCD                 54 w

It turns out that with LCDs, the default brightness level has a lot to do with how much power they draw. I almost immediately turn any LCD I own down to 50% brightness, just because my eyes are overwhelmed if I don't...
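
If it helps with the per-seat math, summing my numbers for one worst-case seat (fully loaded desktop plus one of the newer 19" panels) looks like this; it's just arithmetic on the figures above, not an extra measurement:

```python
# One seat, worst case, from the Kill A Watt figures above:
# fully loaded Dell desktop plus a new 19" LCD.
pc_loaded_w = 122   # both CPU cores fully loaded
lcd_w = 28          # new Dell or Samsung 19"
per_seat_w = pc_loaded_w + lcd_w

print(per_seat_w)        # 150 W per seat
print(per_seat_w * 20)   # 3000 W for 20 such seats
```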


While I don't have exact numbers, I have run LAN parties with up to that many people in a conference room.

We had a power board, which had its own breaker. We had about 18 A of breaker capacity in total, for 6480 W, which works out to 324 watts per machine. That's not really a lot for gaming (we blew a breaker once, but I don't think we had 20 people, more like 17 or 18).

So if it's just office-type computers, 6000-6500 watts should be good.
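
If you want to redo that arithmetic for your own setup, it's just total circuit capacity divided by headcount; a quick sketch using our numbers:

```python
# Watts available per machine on a shared circuit.
# 6480 W total and 20 machines are our LAN party figures;
# substitute your own circuit capacity and headcount.
circuit_capacity_w = 6480
machines = 20

print(circuit_capacity_w / machines)   # 324 W per machine
```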


There are two parts to this:

  • The monitor; and
  • The PC.

The monitor is basically constant. Work out hours per day x days per week x power rating (in watts), divide by 1000, and you have a figure in kilowatt-hours per week.
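
For example, here's that calculation for a hypothetical 28 W LCD used 8 hours a day, 5 days a week (all example values):

```python
# Weekly energy for a monitor with a roughly constant draw while on.
# 28 W, 8 h/day and 5 days/week are example values only.
power_w = 28
hours_per_day = 8
days_per_week = 5

kwh_per_week = power_w * hours_per_day * days_per_week / 1000
print(kwh_per_week)   # 1.12 kWh per week
```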

The PC is a little harder because it uses a certain power level when it's idle and a higher power level when it's doing something. Certain peripherals like optical drives use basically no power when they're not in use and, say, 10 W or so when they are (the figure is for argument's sake).

Generally speaking though, a PC (excluding monitor) shouldn't be drawing more than about 150W under load so use that as a baseline figure. Dedicated graphics cards and other factors can take this to 600W or more.

Generally assume it's under load at least 80% of the time it'll be used. Also take into account people who don't turn their machines off.
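
As a rough sketch of how those figures combine (the 150 W load baseline is from above; the 80 W idle draw and the 8-hours-a-day, 5-days-a-week schedule are assumptions for illustration):

```python
# Weekly energy for the PC, splitting its time between load and idle.
# 150 W under load is the baseline above; the 80 W idle figure and the
# 8 h/day, 5 days/week schedule are assumptions for illustration only.
load_w, idle_w = 150, 80
hours_per_week = 8 * 5
load_fraction = 0.8   # under load at least 80% of the time it's in use

average_w = load_w * load_fraction + idle_w * (1 - load_fraction)
kwh_per_week = average_w * hours_per_week / 1000
print(round(kwh_per_week, 2))   # 5.44 kWh per week
```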


Although I haven't had real experience with figuring out how much power it takes to keep multiple machines running, one thing to keep in mind is to have enough power for the maximum load.

If tripping the circuit breaker or overloading the generator is unacceptable, then it would be a good idea to come up with a conservative estimate for power consumption: find the maximum power consumption of each component and round the values up.

A rough guesstimate I would come up with for an "average" computer would be something along the lines of 300 W for the machine and 100 W for the LCD, but your mileage may definitely vary.
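
Plugging those round numbers back into the per-seat-times-seats calculation, with a safety margin so the generator isn't run flat out (the 20-seat count and 25% margin are just examples):

```python
# Conservative generator sizing from the rough per-seat guesstimate above.
# Seat count and safety margin are example values.
pc_w, lcd_w = 300, 100
seats = 20
margin = 1.25   # leave roughly 25% headroom on the generator

load_w = (pc_w + lcd_w) * seats
print(load_w)                 # 8000 W steady-state estimate
print(int(load_w * margin))   # 10000 W generator with headroom
```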