How much will it cost me to run a Windows Home Server?

Does anybody have measurements of how much electricity a Windows Home Server (say, one of the HP models) uses while running 24/7?

I have an old PC running at home to store files that everyone can access from their laptops, but it uses too much power. I'm wondering if it would be worth buying a WHS to replace it.

kWh consumption per day or month is what I'm looking for, since rates vary from place to place.

Edit: Here are my conclusions; feel free to let me know if I've got something wrong.

Based on Stephen's measurements (which validate Joel's estimates), I've come to these conclusions:

An HP WHS with a 2 GHz Celeron (65 W TDP), only 2 HDDs and 1 GB of RAM should come in well under 2 kWh a day (the 2.4 GHz quad-core has a max TDP of 105 W and was measured at 2.5 kWh per day with 4 HDDs and 8 GB of RAM).

That's a saving of almost 3 kWh per day (over 1,000 kWh per year) compared to the old box, which averaged 4.6 kWh per day. At the 25 cents/kWh we pay here, that's about $275 a year in energy savings, meaning the ROI is under two years at current electricity rates (which are likely to go up).
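
For reference, here's the arithmetic as a quick Python sketch. The wattages and the 25-cent rate are the figures above; the ~$500 purchase price is just my assumption to show the ROI calculation:

    # Watts -> kWh/day -> yearly savings, using the figures above.
    RATE = 0.25  # $/kWh, what we pay here

    def kwh_per_day(watts):
        """Energy used per day at a constant draw of `watts`."""
        return watts * 24 / 1000

    celeron_whs = kwh_per_day(65)    # 1.56 kWh/day -- "well under 2"
    quad_core = kwh_per_day(105)     # 2.52 kWh/day -- matches the measured 2.5

    old_box = 4.6                              # kWh/day, the old server
    saving_per_day = old_box - celeron_whs     # ~3.0 kWh/day
    saving_per_year = saving_per_day * 365     # ~1,100 kWh/year
    dollars_per_year = saving_per_year * RATE  # ~$275/year

    # Assumed WHS price of ~$500 (not a quoted figure), just to show the ROI math.
    print(f"Payback: {500 / dollars_per_year:.1f} years")  # ~1.8 years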

Now, there is also a new HP WHS with a 1.6 GHz Atom processor. I'll have to check its performance since that would mean even greater savings.

Edit 2: The Atom-powered WHS models (I researched one from HP and one from Acer) claim a load power usage of 26 W, and they can go to sleep at 3 W and wake automatically when accessed. At 26 W that works out to about 0.6 kWh per day (which pushes the ROI to under a year).
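
If the sleep mode works as advertised, that always-on 0.6 kWh/day is the worst case. A rough duty-cycle sketch; the 8-hours-awake split is purely my own assumption, not a manufacturer spec:

    # Duty-cycle estimate for the Atom box: 26 W awake, 3 W asleep.
    AWAKE_W, SLEEP_W = 26, 3
    hours_awake = 8                 # assumption about how long it's actually in use
    hours_asleep = 24 - hours_awake

    always_on = AWAKE_W * 24 / 1000                                       # 0.62 kWh/day
    with_sleep = (AWAKE_W * hours_awake + SLEEP_W * hours_asleep) / 1000  # 0.26 kWh/day
    print(always_on, with_sleep)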

Edit 3 (Jul 28): I got my hands on a Fluke meter with data-logging capabilities and a current clamp, and took samples of my current PC's power usage over time. The average came out to 3.45 kWh per day.
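
In case anyone wants to repeat the exercise, reducing the logged samples to kWh per day is just an average. A sketch, assuming the logger exports timestamped watt readings to a CSV; the file and column names are placeholders:

    import csv

    # Average logged power samples into kWh/day. A plain average assumes the
    # samples are evenly spaced in time; "power_log.csv" and the "watts"
    # column are placeholders for whatever the meter actually exports.
    with open("power_log.csv", newline="") as f:
        watts = [float(row["watts"]) for row in csv.DictReader(f)]

    avg_watts = sum(watts) / len(watts)
    print(f"Average draw: {avg_watts:.0f} W -> {avg_watts * 24 / 1000:.2f} kWh/day")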


Solution 1:

I recently upgraded a server I had running at home. It was a Windows Server 2003 install, which I believe is what's under the covers of Home Server.

Originally it was a single 1.4 GHz Athlon with 1 GB of RAM and 5 HDDs, running headless.

Power consumption: 192 W (4.6 kWh a day)

I replaced it with a 2.4 GHz quad-core Intel with 8 GB of RAM and 4 HDDs, again headless.

Power consumption: 106 W (2.5 kWh a day)

I was very impressed by the power saving for a better machine! And since I run Hyper-V on it, I was able to consolidate the old server image, my build machine, a Home Server virtual machine and others onto the one box, so it was well worth the upgrade.

I believe the dedicated home servers are designed to be low power, but if you add external HDDs with their own power supplies, add about 15 W (0.36 kWh a day) for each one.
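
If anyone wants to check the conversion, it's just watts × 24 hours ÷ 1000:

    # Measured watts -> kWh per day.
    for label, watts in [("old Athlon box", 192),
                         ("new quad-core box", 106),
                         ("each external HDD", 15)]:
        print(f"{label}: {watts * 24 / 1000:.2f} kWh/day")
    # old Athlon box: 4.61, new quad-core box: 2.54, each external HDD: 0.36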

All the values were measured with a cheap plug-in meter, so they're probably not especially accurate.

Solution 2:

You may be interested in a Kill-A-Watt device; it allows you to directly measure the electricity usage of any device.

Solution 3:

An old rule of thumb I used to use was $200 per year per PC. However, that was several years back. Things have changed significantly since then, both in terms of $ per watt (higher now) and watts per PC (higher than in 1999, I think, though actually down over the last couple of years).

Taking that as a starting point, though: if you figure a PC lasts for 5 years, that's $1,000, or double the cost of a basic computer. If you can halve the power use of the PC, you'll shave $500 off the TCO, or roughly $9/mo. But again, things have changed.
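
Spelled out, the rule-of-thumb arithmetic looks like this (rough figures, not measurements):

    # Rule-of-thumb TCO arithmetic.
    per_year = 200                               # $/year/PC, the old rule of thumb
    lifetime_years = 5

    lifetime_energy = per_year * lifetime_years  # $1,000 over the PC's life
    saving_if_halved = lifetime_energy / 2       # $500 off the TCO
    per_month = saving_if_halved / (lifetime_years * 12)
    print(f"${per_month:.2f}/month")             # ~$8.33, the "roughly $9/mo" above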

I know another big difference is that my old estimates included display power. Early LCDs used significantly less power than CRTs, to the point where they'd pay for themselves fairly quickly in many business scenarios where the display was on for 8 hours or more per day (this is how LCDs achieved the economies of scale to get so cheap so quickly: businesses that were also watching power consumption bought them early, even at higher initial prices). More recent LCD displays are brighter and use more power again (update: the switch to LED backlights negates this). However, the machine the OP asked about will likely be headless, which throws my estimate off even further.

As a final note, the reason the $200-per-year number has stuck in my head for so long is that it held true for a very long time. Given the factors pushing it in both directions (increased energy costs and increased overall requirements, offset by improved efficiency and no longer needing to power a display), the number may not be far off after all. It's likely determined as much by the economics of what people are willing to pay and can afford as by technical matters, and if that's true, the estimate will hold for some time to come.