by erichayes » Sun Feb 04, 2007 8:45 pm
Hi All,
Anyone who's worked on antique radios knows that the input capacitor was usually 4 to 8 µF in the early ones, and 8 to 16 µF in the "later" ones (1930s). The reason was simple: anything of larger capacitance was physically too big to fit on the chassis. The result was that these radios all had power supply hum that was clearly audible.
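For a rough sense of the numbers: with full-wave rectification on 60 Hz mains the ripple frequency is 120 Hz, and the peak-to-peak ripple on a cap-input filter is roughly I/(fC). Here's a quick back-of-the-envelope sketch in Python; the 50 mA load is just an assumed figure for a typical receiver, not from any particular set:

[code]
# Rough cap-input filter ripple: Vripple ~= I / (f * C).
# Assumptions (mine, for illustration): 60 Hz mains, full-wave
# rectification (120 Hz ripple), 50 mA B+ load current.
F_RIPPLE = 120.0   # Hz
I_LOAD = 0.050     # amps

for c_uf in (4, 8, 16, 40, 200, 500):
    ripple = I_LOAD / (F_RIPPLE * c_uf * 1e-6)
    print(f"{c_uf:4d} uF -> ~{ripple:6.1f} V p-p on the input cap")
[/code]

At 8 µF that works out to about 50 V p-p before the rest of the filter does its work, so it's no surprise some of it made it through to the speaker.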
As electrolytic technology advanced during WWII, values of 40 and 50 µF became common. At that point, the problem of rectifier arcing started showing up in type 80s, 5Y3s, and other small directly heated rectifiers. That's when the spec sheets started including maximum input capacitance values, typically 10 to 40 µF. And the audible hum was still there.
When solid state rectifiers became affordable in the '70s, the arcing problem obviously went away (there were other problems, but that's another story), and input capacitance values started creeping up. When the sintering method of electrolytic construction, developed by (I believe) Sanyo and Matsushita, went public, capacitance density went through the roof.
Today, it's not uncommon to see input cap values of 200 or even 500 µF. The ideal power supply has an internal resistance of 0 Ω. Lithium batteries come close, but 500 volts' worth of lithium batteries would be a wee bit expensive. The realistic alternative is to use solid state rectification and high values of capacitance to get the internal resistance as low as practically possible.
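To put a number on "as low as practically possible": at the ripple frequency, the reservoir cap's reactance, 1/(2πfC), is a reasonable stand-in for the floor on the supply's AC source impedance. Again assuming 60 Hz mains and full-wave rectification:

[code]
import math

F_RIPPLE = 120.0   # Hz; assumed 60 Hz mains, full-wave rectification

for c_uf in (8, 40, 200, 500):
    z = 1.0 / (2 * math.pi * F_RIPPLE * c_uf * 1e-6)
    print(f"{c_uf:4d} uF -> {z:6.2f} ohms at {F_RIPPLE:.0f} Hz")
[/code]

So the jump from an 8 µF can (about 166 Ω) to a 500 µF one (under 3 Ω) really is a different world, ESR and transformer resistance permitting.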
This now makes the power transformer the weak link of the supply. If its secondary can't supply the rectifiers with enough current to feed the raw DC the caps demand, the internal resistance of the supply goes right back up until it equilibrates with the time constant of the filter network.
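Here's a crude time-step sketch of that equilibration. Every value is assumed for illustration (350 V peak secondary, 200 µF input cap, 100 mA load); the series resistance lumps together the transformer winding resistance and the rectifier drop. As it rises, the cap never gets topped up between peaks, the DC sags, and the ripple grows:

[code]
import math

V_PEAK = 350.0   # transformer secondary peak (V) -- assumed
F_MAINS = 60.0   # mains frequency (Hz)
C = 200e-6       # input capacitor (F)
I_LOAD = 0.100   # load on the cap, modeled as constant current (A)
DT = 1e-6        # simulation time step (s)

def simulate(r_series):
    """Full-wave cap-input filter fed through r_series ohms of
    lumped transformer winding + rectifier resistance."""
    v_cap, t, samples = 0.0, 0.0, []
    while t < 20 / F_MAINS:                        # 20 cycles to settle
        v_rect = abs(V_PEAK * math.sin(2 * math.pi * F_MAINS * t))
        i_charge = (v_rect - v_cap) / r_series if v_rect > v_cap else 0.0
        v_cap += (i_charge - I_LOAD) * DT / C
        if t > 15 / F_MAINS:                       # record the last 5
            samples.append(v_cap)
        t += DT
    return sum(samples) / len(samples), max(samples) - min(samples)

for r in (10.0, 50.0, 200.0):
    v_dc, ripple = simulate(r)
    print(f"R_series = {r:5.0f} ohm -> Vdc = {v_dc:5.1f} V, "
          f"ripple = {ripple:4.1f} V p-p")
[/code]

With a stiff 10 Ω source the DC sits near the peak of the secondary voltage; at 200 Ω it sags toward the average of the rectified sine, which is the "internal resistance going right back up" effect in action.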
Eric in the Jefferson State