It’s no secret that PC components are constantly getting hungrier for power. Despite periodic remissions brought about by transitions to new semiconductor manufacturing processes (AMD’s processors, for example, began to need considerably less power after the move from 130nm to 90nm) or by changes in the micro-architecture (compare the consumption of the Intel Core Duo with that of the Pentium D, which offers roughly the same performance), the overall tendency is clear: tough market competition forces manufacturers to seek more speed faster than they can develop a new tech process, not to mention a new micro-architecture. And within an established micro-architecture, there are only extensive ways to improve performance: increasing either the clock rate or the number of units working in parallel (pipelines, ALUs/FPUs, the amount of cache memory, etc.). Either way leads to an increase in power consumption.
As a result, there has never been a long period in the entire history of personal computers during which power consumption actually went down, and I don’t think this tendency is going to change in the near future. Perhaps a transition to molecular bio-computers will effect the change, but from today’s perspective that looks more like science fiction than anything coming anytime soon. A few years ago hardware reviewers were testing the then-new AMD Athlon 1.4GHz with a TDP of 72W and marveling at its fantastically high heat dissipation. Today such a number surprises nobody – the TDP of a modern CPU is well over 100 watts.
The same is true for the graphics card market. The 3dfx Voodoo2 got along without any additional cooling at all (only overclockers would put small heatsinks on its chips). Early GeForce 256 cards consumed 25-30W and were quite satisfied with small fans for cooling. Today’s graphics cards may draw over 100 watts of juice, and their coolers have long since transformed into monstrous contraptions two slots in height.
The requirements on the power supply have been growing, too. A 145W model was quite enough in the days of the Pentium M. Today, the declared wattage isn’t simply impressive but downright shocking – the maximum is 1.1kW, while 400W models are the mainstream. Although many users have a somewhat vague notion of how powerful a PSU they need (they are confused both by low-quality products whose real wattage is much lower than the declared one and by the marketing departments of PSU manufacturers, who use wattage as the most intuitively comprehensible characteristic to promote their products), you still can’t deny that the demands on the computer power supply are ever increasing.
PC enthusiasts have long been toying with the idea of using two relatively low-wattage power supplies. In the simplest implementation, one PSU is responsible for the graphics card, processor and mainboard, while the other powers the various peripherals (hard drives, optical drives, etc.). This is simple to do because you don’t have to modify the PSUs much: it’s only necessary to connect their ground wires and their PS_ON contacts (a signal on this contact turns the main regulator on). On the other hand, there’s not much sense in powering the peripherals from a separate PSU because their share of the overall consumption is too small compared with the combined consumption of the graphics card and processor.
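To see why this peripherals-on-a-separate-PSU scheme buys so little, a back-of-the-envelope budget helps. All the wattages below are rough illustrative assumptions for a gaming PC of this era, not measurements:

```python
# Illustrative power budget; every figure here is an assumption, not a measurement.
budget_w = {
    "CPU":           100,
    "graphics card": 110,
    "mainboard+RAM":  40,
    "hard drives":    20,
    "optical drive":  10,
    "fans":            5,
}
peripherals = {"hard drives", "optical drive", "fans"}

periph_w = sum(w for name, w in budget_w.items() if name in peripherals)
core_w = sum(w for name, w in budget_w.items() if name not in peripherals)
share = periph_w / (periph_w + core_w)

print(f"peripherals: {periph_w}W of {periph_w + core_w}W ({share:.0%})")
```

Even with generous allowances for the drives and fans, the peripherals come to only about a tenth of the total, so the second PSU would carry almost none of the real burden.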
Another approach is to split the different supply voltages between the PSUs: one PSU provides only +12V, and the other provides the remaining voltages. Both PSUs should then be modified so that regulation in the former is tied to the +12V rail and in the latter to the +5V rail (ATX units usually regulate both of these voltages together, based on some average of the two). The PSUs can be left as they are, but the modification is advisable – the output voltage may turn out to be not very stable when only one of a PSU’s rails is under load.
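The instability comes from this shared (“group”) regulation loop. A minimal sketch of the effect follows; the feedback weights and rail deviations are illustrative assumptions, not values from any real ATX controller:

```python
# Sketch of "group regulation": one feedback loop watches a weighted
# average of the +5V and +12V rails. Weights below are assumed, not real.
REF5, REF12 = 5.0, 12.0
W5, W12 = 0.5, 0.5  # assumed feedback weights of the shared loop

def combined_error(v5, v12):
    """Combined error signal: weighted sum of each rail's relative deviation."""
    return W5 * (v5 - REF5) / REF5 + W12 * (v12 - REF12) / REF12

# Scenario: only +12V is loaded, so it sags 3% while the unloaded +5V
# drifts 3% high. The deviations cancel in the combined error, so the
# controller sees "no error" and corrects neither rail.
err = combined_error(v5=5.15, v12=11.64)
print(f"combined error: {err:+.4f}")
```

Binding each PSU’s feedback to the one rail it actually serves removes this cancellation, which is exactly what the suggested modification achieves.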