Contemporary graphics cards surpass older products in a number of parameters: speed, functionality, complexity and, alas, power consumption and heat generation. For many users the words “fast” and “high-performing” are synonymous with “power-hungry”, “hot” and “needs a hell of a cooling system”.
That’s why many hardware enthusiasts and gamers who are going to purchase a top-end graphics card consider replacing their PSU with a more powerful one, or get ready for a serious test of the old unit. The well-known story about GeForce 6800 Ultra graphics cards, for which NVIDIA recommends a 480W PSU, or at least 350 “high-quality” watts, only adds pessimism to the calculation of the future expenses.
So, the power consumption and heat dissipation of modern graphics cards are matters of great concern for every overclocker and gamer who is preparing for an upgrade. To spare such people their doubts, I will describe a method of measuring the power consumption – and heat dissipation, too – of modern graphics cards, and will also share the results of my tests, including overclocked modes.
This is the first part of the investigation, and it covers graphics cards on ATI’s chips only – those on NVIDIA’s GPUs will be discussed later. To escape the righteous anger of overclockers who won’t find our traditional global comparison of a score of graphics cards here, I will examine the power consumption of the cards not only at overclocking but also at extreme overclocking (with Vcore adjustments!). No one has ever done this before. I promise it is going to be very interesting :).
So, let’s get started.
One Thought about Heat Dissipation
In linking the power consumption and heat dissipation of a graphics card I follow the law of conservation of energy. Evidently, the graphics card is not a power source for any other PC component, so all the energy it consumes is eventually dissipated as heat. Thus, all the power consumption numbers listed below can equally be read as heat dissipation figures.
Powering up from the AGP and the Additional Power Connector
Before the arrival of graphics cards based on the RADEON 9700/9500 series chips (and, earlier still, the Voodoo 5500/6000 from 3dfx), all gaming graphics cards received their power from the AGP slot alone. Quite a few of the AGP slot pins are designed to supply power to the graphics card on the 3.3V, 5V and 12V rails. The maximum currents on these lines are 6A, 2A and 1A, respectively, according to the latest specification, AGP 3.0. Knowing the voltages and the maximum currents, you can easily calculate the maximum power a graphics card can draw through the AGP slot: 3.3V × 6A + 5V × 2A + 12V × 1A = 41.8W.
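As a quick sanity check, the arithmetic above can be sketched in a few lines of Python; the rail voltages and current limits are simply the AGP 3.0 figures quoted in the text:

```python
# Maximum power budget of the AGP slot, per the AGP 3.0 figures cited
# above: rail voltage (V) -> maximum current (A).
AGP_RAILS = {3.3: 6.0, 5.0: 2.0, 12.0: 1.0}

# Power on each rail is V * I; the slot's budget is the sum over all rails.
agp_max_power = sum(volts * amps for volts, amps in AGP_RAILS.items())

print(f"AGP slot power budget: {agp_max_power:.1f} W")  # 41.8 W
```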
This is enough for modern mainstream graphics cards, which is why they don’t have an additional power connector. Faster cards are not satisfied with that. Even if the peak power consumption of a device doesn’t exceed 41.8W but merely approaches that limit, additional power is desirable – prolonged operation on the verge of maximum load never made any computer component live longer.
Today no one is surprised to see an extra power connector on top-end graphics cards. These connectors are analogous to the power connectors of hard disk drives and optical drives (RADEON 9700/9500-based cards use the power connector of the floppy drive) and they supply power on the 5V and 12V rails. Some graphics cards are even equipped with two such connectors. According to this whitepaper from Molex, the maximum current through these connectors is 6.5A or 10A, depending on the positioning of the connector on the PCB (“right angle” or “vertical”). Thus, the maximum power consumption of a graphics card with one additional connector, not counting the AGP slot, may range from (5V + 12V) × 6.5A = 110.5W to (5V + 12V) × 10A = 170W; with two additional connectors, from 221W to 340W. You must agree that this is enough for any modern graphics card as well as for a few generations to come.
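The same kind of sketch covers the auxiliary connectors; it assumes, as the Molex figures above state, that each connector carries one 5V and one 12V wire, each loaded to the same per-pin current limit:

```python
# Power available through Molex-type auxiliary connectors, using the
# per-pin current limits quoted above: 6.5 A ("right angle" mounting)
# or 10 A ("vertical" mounting). Each connector carries a 5 V and a
# 12 V wire, both loaded to the same limit.
RAIL_VOLTAGE_SUM = 5.0 + 12.0

for connectors in (1, 2):
    for amps, mounting in ((6.5, "right angle"), (10.0, "vertical")):
        watts = connectors * RAIL_VOLTAGE_SUM * amps
        print(f"{connectors} connector(s), {mounting}: {watts:.1f} W")
```

This reproduces the 110.5–170W range for one connector and 221–340W for two.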
By the way, the fact that the GeForce 6800 Ultra is the only top-end graphics card today with two connectors for extra power (I don’t want to talk about the stillborn Volari from XGI here) shouldn’t be taken to mean that the card is ready to consume – and radiate as heat – up to 340 watts. The two power connectors on the GeForce 6800 Ultra are not a consequence of a crazy appetite, but of the manufacturer’s desire to ensure stable power supply by splitting the current between two connectors. Well, I’ve deviated too much from the topic – the power consumption of graphics cards with NVIDIA’s chips is going to be the subject of a separate article.