Source of PDU De-Rating?
Solution 1:
Promoting my comment to an answer...
"Does anyone have anything that shows which agency this is (NEMA?) and the actual regulations?"
This requirement comes from the National Electrical Code (NEC), which is published by the National Fire Protection Association (NFPA).
The NEC requires that the continuous current (see Note 1) drawn from a branch circuit not exceed 80% of the circuit's maximum rating... 24A is 80% of 30A. This 80% derating scheme is sometimes referred to as the Maximum Current Rating; it refers to the measurable current load through the device in question.
See Sizing a circuit breaker, which does a much better job than I could at describing the NEC rules.
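To make the arithmetic concrete, here is a minimal sketch of the 80% rule applied to common breaker sizes; the breaker ratings are just the standard sizes, not tied to any particular installation:

```python
# NEC 80% rule: continuous load must not exceed 80% of the breaker rating.
for breaker_amps in (15, 20, 30):
    continuous_limit = breaker_amps * 0.8
    print(f"{breaker_amps}A circuit -> {continuous_limit:.0f}A continuous max")
# Prints: 15A -> 12A, 20A -> 16A, 30A -> 24A
```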
"Also, does anyone have any advice, based on actual experience, on how dangerous it is to push above these limits?"
I have no experience here, but I personally wouldn't recommend it. Since these guidelines come from the NEC, violating them could land you in hot water with your insurance provider, lawyer, or fire marshal if there were ever a claim.
Note 1: A continuous load is any load expected to be sustained for 3 hours or more.
Solution 2:
The 80% rule is there for a reason. In the best case, overloading a circuit pops breakers; in the worst case, it starts a fire.
That said, there's a certain art to estimating how much capacity you're actually using (short of a clamp probe or similar). At first blush, most folks add up the wattage of each piece of gear being plugged into a circuit and compare it to the 80% value. So 8 servers with 300W power supplies = 2400W = 20A @ 120V, which is well past the 16A continuous limit of a 20A circuit and, of course, a no-go unless you've wired in a 30A circuit. The sketch below walks through that first-blush math.
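A minimal sketch of the naive plate-rating check; the voltage, breaker size, and wattages are assumptions for illustration, not measurements:

```python
# Naive capacity check: sum the plate ratings and compare against the
# 80% continuous limit of the branch circuit.
VOLTAGE = 120          # line voltage (V); assumption for this example
BREAKER_AMPS = 20      # branch circuit breaker rating (A); assumption
DERATE = 0.8           # NEC continuous-load limit

plate_watts = [300] * 8            # eight servers with 300W plate ratings
total_amps = sum(plate_watts) / VOLTAGE

limit = BREAKER_AMPS * DERATE
print(f"Load: {total_amps:.1f}A, continuous limit: {limit:.1f}A")
if total_amps > limit:
    print("Over the 80% limit -- need a bigger circuit or fewer boxes.")
```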
The thing is, though, that the plate ratings on equipment are generally maximums. In practice most equipment never even approaches this number: those 300W servers might pull, say, 200W momentarily at startup while drives are spinning up, then drop back to 80W once fans slow down, SpeedStep (or the equivalent) kicks in, etc. For a modular network device, the number may assume a full complement of ports all running the highest-power optics available, which is also unrealistic.
Various vendors will publish so-called "typical" power draws, but these are often marketing numbers (a quick way to appear more efficient). Other vendors may be a lot more conservative and publish a more realistic estimate. The only way to truly know what's going on is to hook up some kind of meter and get an actual empirical view of the power being drawn. Options range from something as simple as a Kill-A-Watt, to clamp-on probes, to smart PDUs that provide graphed measurements on a per-outlet basis (sketched below).
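As an illustration of the smart-PDU option, here's a hypothetical sketch that polls one outlet's power reading over SNMP using the classic pysnmp hlapi. The OID is a placeholder, not a real one: per-outlet power OIDs are vendor-specific (APC's PowerNet MIB, Raritan's MIB, etc.), so check your PDU's documentation.

```python
# Hypothetical sketch: read a smart PDU's per-outlet wattage via SNMP.
from pysnmp.hlapi import (
    SnmpEngine, CommunityData, UdpTransportTarget, ContextData,
    ObjectType, ObjectIdentity, getCmd,
)

OUTLET_WATTS_OID = "1.3.6.1.4.1.99999.1.1"  # placeholder, NOT a real OID

def read_outlet_watts(host, outlet, community="public"):
    """Fetch the wattage reading for one outlet (None on error)."""
    error_indication, error_status, _, var_binds = next(getCmd(
        SnmpEngine(),
        CommunityData(community),
        UdpTransportTarget((host, 161)),
        ContextData(),
        ObjectType(ObjectIdentity(f"{OUTLET_WATTS_OID}.{outlet}")),
    ))
    if error_indication or error_status:
        return None
    return int(var_binds[0][1])
```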
So... if you want to be extremely conservative (not a terrible thing), add up the plate ratings until you get to 80%. Some electricians recommend knocking 30% off the plate rating of equipment, but I've always suspected they were working from experience with motors and the like rather than electronic equipment. That's probably fine and could save you some pole capacity, but ultimately you shouldn't take anyone's word for it: measure the consumption of actual gear under something like actual conditions and then plan accordingly. The toy comparison below shows how far apart these approaches can land.
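A toy comparison of the three sizing approaches; every number here is invented for illustration:

```python
# How many servers fit on one circuit under each sizing approach?
VOLTAGE, BREAKER_AMPS, DERATE = 120, 20, 0.8
limit_watts = BREAKER_AMPS * DERATE * VOLTAGE   # 1920W continuous budget

plate = 300          # plate rating per server (W); assumed
measured = 110       # observed steady-state draw per server (W); assumed

for label, per_box in [("plate rating", plate),
                       ("plate minus 30%", plate * 0.7),
                       ("measured draw", measured)]:
    print(f"{label:>16}: {int(limit_watts // per_box)} servers per circuit")
```

Even measured numbers deserve some headroom for spin-up and failover load, which is exactly why the measurement should happen under something like actual conditions.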
Remember: being careful will cost you an extra circuit breaker or two. Not being careful can cost a whole lot more.