Why are brownouts so harmful?

I was reading Is surge protection actually needed? and I'd like to know why brownouts are so harmful. The explanation given there is "the capacitors get above their rated voltage", but that makes no sense if the power coming into the PSU is less than the usual voltage. What happens to a PSU in a brownout that damages it?

Is there any protection built into modern PSUs to prevent such damage? Is there any way to protect the computer in brownout conditions other than using a UPS?


Solution 1:

A brownout is an undervoltage condition in which the AC supply drops below the nominal value by about 10% (nominal meaning 110-120V or 220-240V in most places). So in the US a brownout might be defined as the AC voltage dropping below 99V. The Intel specification for ATX power supplies states that input voltages between 90 and 135, or between 180 and 265, should allow correct power supply operation (section 3.1), so the power supply will still run normally even when a noticeable brownout occurs.
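To make those numbers concrete, here is a minimal Python sketch (my own illustration, not anything from the spec) that checks an input voltage against a roughly-10%-below-nominal brownout threshold and against the section 3.1 operating windows; the constants are just the figures quoted above.

```python
# Hypothetical helper using the figures quoted above: "about 10%" below a
# 110-120V nominal, and the 90-135 / 180-265 VAC windows from section 3.1.

NOMINAL_US_V = 110.0                                     # low end of the US nominal range
BROWNOUT_FRACTION = 0.10                                 # "about 10%" below nominal
ATX_OPERATING_RANGES = [(90.0, 135.0), (180.0, 265.0)]   # VAC, section 3.1

def is_brownout(v_in, nominal=NOMINAL_US_V):
    """True when the AC input has sagged more than ~10% below nominal."""
    return v_in < nominal * (1.0 - BROWNOUT_FRACTION)

def within_atx_range(v_in):
    """True when the input sits inside one of the ATX operating windows."""
    return any(lo <= v_in <= hi for lo, hi in ATX_OPERATING_RANGES)

# 95V is a brownout by the 10% definition, yet still a valid ATX input,
# so a compliant supply keeps running normally.
print(is_brownout(95.0), within_atx_range(95.0))   # True True
```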

Some people also include very brief power dropouts (under 30 ms, or about 2 AC cycles) as brownouts, since incandescent bulbs will briefly, but visibly, dim during that time, much like during a real undervoltage condition.

In either case, Intel treats them as undervoltage conditions, and section 3.1.3 of Intel's ATX12V Power Supply Design Guide spells out what an ATX power supply must do under such conditions:

The power supply shall contain protection circuitry such that the application of an input voltage below the minimum specified in Section 3.1, Table 1, shall not cause damage to the power supply.

Typically power supplies have an input section composed of a bunch of interesting circuitry that, at the end of the day, provides roughly 308 VDC to a transformer, which then feeds the regulation and conditioning circuitry. This input section does most of the heavy lifting for regulation, and if you are drawing less than the full wattage of the power supply, it may ride through significant undervoltage conditions without falling out of regulation on the output side.

When a brownout occurs, the power supply will attempt to deliver its rated output for as long as it can (given the incoming voltage and current), and if it cannot maintain regulation it will deassert the Power Good signal going to the motherboard. The motherboard is responsible for deasserting the power-on signal going to the supply, and if it does so in time, the supply will drop all its outputs and turn off.

If the motherboard fails to do this, the power supply should drop its rails once it falls too far out of regulation, but that is not guaranteed, and with low-quality power supplies you may find your motherboard and other components subjected to undervoltage as well.

What happens at that point depends on how robust those components are, but it's generally not a good thing as they attempt to operate at the lower voltage. Keep in mind that the power supply always delivers a brief undervoltage on power-down anyway (dropping the outputs to 0 is not instantaneous), so very brief undervoltage periods are fine. The problem only arises if the power supply remains in an undervoltage state for a long time, which can only happen if both the power supply and the motherboard fail to recognize the problem and keep trying to operate.
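As a rough illustration of that sequence (my own simplification, not an ATX reference implementation; the function and signal names are made up), the shutdown handshake can be thought of like this:

```python
# Illustrative only: a toy model of the Power Good / power-on handshake
# described above, not anything defined by the ATX design guide.

def power_good(in_regulation):
    """PSU side: Power Good stays asserted only while the rails are in regulation."""
    return in_regulation

def power_on(power_good_asserted):
    """Motherboard side: keep the power-on signal asserted only while Power Good is."""
    return power_good_asserted

def rails_enabled(power_on_asserted, in_regulation):
    """PSU side: outputs stay up only while power-on is asserted; a good supply
    also drops them on its own if it falls too far out of regulation."""
    return power_on_asserted and in_regulation

# A brownout deep enough that the supply loses regulation:
pg = power_good(in_regulation=False)   # supply deasserts Power Good
on = power_on(pg)                      # motherboard deasserts power-on in response
print(rails_enabled(on, False))        # False: the rails shut down cleanly
```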

Keep in mind that the Intel specification is little more than an industry guideline, and there is no certifying body; even good power supplies are not bound by any agreement to follow its recommendations. My favorite section is 3.1.5. I've seen many power supplies, both expensive and cheap, fail to meet those recommendations!

The specific effects differ depending on which component you look at, but that is really a separate discussion.

Solution 2:

PIE. P = IE. Power = Current times Voltage. So if the voltage is lower in a brownout, a power supply has to pull more current from the mains to maintain the same power. While the voltage stress is indeed lower during a brownout, the current stress on the power supply increases to compensate.

Here's the short answer: In a brownout, power supplies need to draw more current to compensate for the lower supply voltage, which is very stressful for transistors, wires, diodes, etc. They also become less efficient, which makes them draw even more current, aggravating the problem.
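Here is that relationship as a minimal sketch (my own illustration; it assumes an ideal supply unless you pass an efficiency figure, and ignores power factor):

```python
# P = I * E rearranged: the current a supply must pull from the wall for a
# given output power. Power factor and other details are ignored.

def input_current(output_power_w, input_voltage_v, efficiency=1.0):
    """Amps drawn from the mains to deliver output_power_w at the given efficiency."""
    wall_power_w = output_power_w / efficiency
    return wall_power_w / input_voltage_v

print(input_current(300, 120))   # 2.5 A for an ideal 300W load at 120V
print(input_current(300, 90))    # ~3.3 A for the same load during a 90V brownout
```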

Here's the long answer: Most PCs (if not all) use switching power supplies. If all the elements of the supply (the transistors, transformers, capacitors, diodes, etc.) were completely ideal, a supply could take any input voltage and produce the desired power at the desired voltage (as long as there was enough current at the input to maintain P=IE).

But those elements are all far from ideal, so all real-world power supplies are designed to operate inside a certain range, say 80 to 240V. Even inside the range they are designed for, the efficiency (the percentage of power at the output of the supply compared to the power needed at the input) tends to fall off as the input voltage gets lower. Anandtech has a good example graph. The X-axis is the power at the output of the supply (the load) and the Y-axis is the efficiency. So this supply is most efficient at around 300W.

For a 120V input, it's about 85% efficient, so it draws about 300W/0.85 = 353W from the wall to get you 300W at the output. The "missing" 53W is dissipated in the power supply circuitry (that's why your PCs have fans - it's like your power supply has a 50W bulb in a little box and it needs to get the heat out). Since P=IE, we can calculate the current it needs from the wall plug to produce 300W output from 120V: I = P/E = 353W/120V = 2.9A. (I'm ignoring power factor to keep this explanation simple.)

For a 230V input, the efficiency is 87%, so it only pulls about 345W from the wall, which is nice. Because the voltage is so much higher, the current draw is much lower: 345W/230V = 1.5A.

But in a 90V brownout condition, the efficiency is even worse than at 120V: 83.5%. So now the supply is pulling 300W/0.835 = 359W from the wall. And it's pulling even more current: 359W/90V = 4A!

Now that probably wouldn't stress this power supply much since it's rated at 650W. So let's have a quick look at what happens at 650W. For 120V, it's 82% efficient -> 793W and 6.6A from the wall. But the efficiency is even worse at high loads, so for 90V we see 78.5% efficiency, which means 828W and 9.2A! Even if the efficiency stayed at 78.5%, if the brownout went to 80V it would need to pull 10.3A. That's a lot of current; things start to melt if they aren't designed for that sort of current.
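Here is all of that arithmetic in one place (a sketch only; the efficiency figures are simply the approximate values read off the example graph above, not measured data):

```python
# Wall power and current for each scenario above; P = I * E, power factor ignored.

def from_the_wall(output_w, efficiency, input_v):
    wall_w = output_w / efficiency          # power drawn from the wall
    return wall_w, wall_w / input_v         # watts, amps

for label, out_w, eff, v in [
    ("300W @ 120V", 300, 0.85,  120),
    ("300W @ 230V", 300, 0.87,  230),
    ("300W @  90V", 300, 0.835,  90),
    ("650W @ 120V", 650, 0.82,  120),
    ("650W @  90V", 650, 0.785,  90),
    ("650W @  80V", 650, 0.785,  80),   # assuming efficiency stays at 78.5%
]:
    wall_w, amps = from_the_wall(out_w, eff, v)
    print(f"{label}: {wall_w:.0f}W from the wall, {amps:.1f}A")
```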

So that's why brownouts are bad for power supplies. They need to draw more current to compensate for the lower supply voltage, which is very stressful for transistors, wires, diodes, etc. They also become less efficient, which makes them draw even more current, aggravating the problem.

Bonus example: Here's a quick explanation of why power supplies get less efficient as the supply voltage decreases. All electronic components (transistors, transformers, even the traces on the printed circuit board) have some sort of equivalent resistance. When a power transistor is switched "on", it has an "on resistance", let's say 0.05 ohms. So when 3A of current flows through that transistor, it sees 3A * 0.05 ohms = 0.15V across its leads. That's 0.15V * 3A = 0.45W of power now being dissipated in that transistor. That's waste power - it's heat in the power supply, not power to the load. That's our 300W, 120V scenario.

In the 90V brownout 300W scenario, the transistor has the same 0.05 ohm on resistance, but now there's 4A of current going through it, so it drops 4A * 0.05 ohms = 0.2V across its leads. That's 0.2V * 4A = 0.8W of power now being dissipated in that transistor. So each device (and there are a lot of them) in the power supply that has an on resistance/voltage drop across it generates more heat (wasted power) when the supply voltage drops. In general, and within reason, higher voltages give you higher efficiencies.
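In code, that conduction-loss arithmetic looks like this (same hypothetical 0.05 ohm device as above; the point is that the wasted power grows with the square of the current):

```python
# Conduction loss in a switching element while it is "on": P = I^2 * R.

def conduction_loss_w(current_a, on_resistance_ohm=0.05):
    """Watts wasted as heat in a device with the given on resistance."""
    return current_a ** 2 * on_resistance_ohm

print(conduction_loss_w(3))   # 0.45 W in the 300W, 120V case
print(conduction_loss_w(4))   # 0.80 W in the 300W, 90V brownout case
```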