How can I calculate a server's amperage?

Watts = Volts * Amps

None of that should matter though -- you will typically be charged for how many watts you use. You can think of it this way: the volts are the water pressure, the amps are the size of the hose, and the watts are how much water has actually gone through the hose. That analogy breaks down in a bunch of ways, but as long as you don't build things based only on it, you'll be okay.

The only way to be sure is to measure your server's power draw under your expected load. A PSU that can draw 500 watts, if you've got 220 V power, can draw up to about 2.3 amps, but it may draw only 0.5 amps under your typical workloads.
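
If you want to sanity-check numbers like that, the arithmetic is just watts divided by volts. A minimal sketch in Python, using the example figures from this answer (not real measurements):

```python
# Current drawn for a given power and supply voltage: amps = watts / volts.
def amps(watts, volts):
    return watts / volts

print(amps(500, 220))  # 500 W PSU on 220 V -> ~2.3 A, the worst case
print(amps(110, 220))  # ~110 W typical draw -> 0.5 A
```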

Usually you only pay attention to the amperage to figure out how big a wire you need to connect your box to the mains; if it draws more current, it needs a larger wire (which, weirdly, is a smaller "gauge" wire, but don't worry, it'll be thicker and more expensive).
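
For reference, here is a rough lookup of the usual US copper branch-circuit pairings; the figures are assumed NEC-style ratings, so always defer to local code and a licensed electrician:

```python
# Typical US copper branch-circuit ratings (assumed NEC-style figures).
# Note the inversion: a smaller AWG number means a thicker wire.
AMPACITY_BY_AWG = {14: 15, 12: 20, 10: 30}

def smallest_wire_for(amps):
    """Thinnest common wire (largest AWG number) rated for the given current."""
    ok = [awg for awg, rating in AMPACITY_BY_AWG.items() if rating >= amps]
    if not ok:
        raise ValueError("beyond this table -- talk to an electrician")
    return max(ok)

print(smallest_wire_for(12))  # -> 14 AWG (a 15 A circuit covers it)
print(smallest_wire_for(18))  # -> 12 AWG (needs a 20 A circuit)
```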

Lastly, if you go beyond 15 or 20 amps (at least in the US, with 110 V circuits), you'll get into weird connectors (twist-lock connectors whose geometry varies with the current rating of the circuit; you can't plug a 20 amp twist-lock connector into a 30 amp receptacle, for instance). But again, most of the time you don't need to worry about these details unless you're looking at big iron.


Just a thought: if you already have your server and want to test the actual amps it draws, you could use a Kill-A-Watt or something similar to measure the actual draw before placing it in the datacenter.


Other answers correctly point to wattage as the important measure of how much power you really use. However, many data centers and colo providers (like the two that I use, one in Canada, one in the US) will bill you a flat rate per circuit, measured in amps.

So it is useful to know how many amps your equipment will draw. A very rough rule of thumb for ball-park estimation is about 2 A per "average" server (sketched below). But if you need precision, then measure it precisely. Don't rely on ball-park estimates. :)
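
Here is how I'd use that rule of thumb to size a circuit; the 2 A per server figure and the 80% continuous-load headroom are assumptions, so measure your real draw before committing:

```python
import math

def servers_per_circuit(circuit_amps, amps_per_server=2.0, headroom=0.8):
    """Ball-park count of 'average' servers per circuit, leaving 20% headroom."""
    return math.floor(circuit_amps * headroom / amps_per_server)

print(servers_per_circuit(20))  # 20 A circuit -> about 8 servers
print(servers_per_circuit(30))  # 30 A circuit -> about 12 servers
```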

You can buy power bars that show you the amps drawn by whatever you plug into them. Good ones will let you poll that data over SNMP, so you can graph it or whatever.
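
A minimal sketch of what that polling could look like, assuming net-snmp's snmpget is installed; the hostname is made up and the OID is a placeholder you'd replace with the one from your PDU vendor's MIB:

```python
import subprocess

PDU_HOST = "pdu1.example.com"            # hypothetical PDU hostname
COMMUNITY = "public"                     # read-only SNMP community string
CURRENT_OID = "REPLACE-WITH-VENDOR-OID"  # per-outlet or per-bank current OID

# Shell out to net-snmp's snmpget; parse the value and ship it to your grapher.
result = subprocess.run(
    ["snmpget", "-v2c", "-c", COMMUNITY, PDU_HOST, CURRENT_OID],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```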


An amp is a measure of current, not power. But if you know the voltage of your power source, it is trivial to convert current into power consumed: W = V * A, i.e. watts = volts * amps (disregarding the power factor, the phase difference between the voltage and the current).

Knowing this, you can look at your hardware's power consumption details: take the maximum wattage it lists, divide it by your mains voltage (110 or 220, depending on your location), and you get how many amps it would draw -- in the worst case, of course!
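
To make the power-factor caveat concrete, here is a small worked example with made-up numbers (modern server PSUs with active PFC usually sit close to 1 anyway):

```python
volts = 110
amps = 4.0
power_factor = 0.95   # assumed value; check your PSU's spec sheet

apparent_power_va = volts * amps            # 440 VA (volt-amperes)
real_power_w = volts * amps * power_factor  # 418 W actually consumed

print(apparent_power_va, real_power_w)
```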