Articles: Graphics


Pages: [ 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 ]

NVIDIA GeForce 6800 Ultra

This is the flagship product of NVIDIA's new GPU series. GeForce 6800 chips are manufactured on a 0.13-micron process and contain about 220 million transistors. The fastest chip, the GeForce 6800 Ultra, has all 16 pixel pipelines enabled and runs at 400MHz. I got an Apogee AA6800 Ultra card from Chaintech for my tests:

The card follows the reference design: a standard cooler that covers the GPU as well as the graphics memory chips, two auxiliary power connectors, and a heatsink that cools the power elements of the GPU and memory voltage regulators. The Apogee AA6800 Ultra features the NVIDIA GeForce 6800 Ultra processor and carries 256MB of GDDR3 memory. The stock GPU/memory frequencies are 400/1100MHz; overclocking raised them only slightly, to 440/1250MHz.

My power consumption measurements yielded the following results:

Here’s the first surprise! I said back in my first report that the two power connectors of the GeForce 6800 Ultra were no real reason for panic, and that is indeed so. The card does not consume much more power than its top competitor, the ATI RADEON X800 XT Platinum Edition. In the Burn mode, the non-overclocked GeForce 6800 Ultra consumes 72.09W against the RADEON X800 XT PE’s 63.23W. That is only 14% more.

In the Idle mode, the power consumption of the GeForce 6800 Ultra is 29.28W against 17.62W for the RADEON X800 XT PE. The percentage difference is bigger here, though: the GeForce consumes about 66% more than the RADEON in this mode.

Curiously, GeForce 6 series GPUs, like the RADEON X800, do not drop their frequency or voltage when idle. The GeForce 6800 Ultra is always clocked at 400MHz, and its voltage, according to my measurements on the Chaintech card, is always 1.45V.

So, 66% looks terrible, but it actually corresponds to roughly the same ten-watt difference as in the Burn mode. A dozen watts one way or the other is a trifle for modern power-supply units. But why, then, does a GeForce 6800 Ultra-based graphics card require two auxiliary power connectors?
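The point about absolute versus relative differences can be checked directly from the figures quoted above. The following sketch (variable names are my own, not from the article) recomputes the gap between the two cards in both modes:

```python
# Power figures quoted in the text (watts); names are illustrative only.
burn = {"GeForce 6800 Ultra": 72.09, "RADEON X800 XT PE": 63.23}
idle = {"GeForce 6800 Ultra": 29.28, "RADEON X800 XT PE": 17.62}

def compare(mode):
    """Return (absolute difference in W, relative difference in %)."""
    gf = mode["GeForce 6800 Ultra"]
    rad = mode["RADEON X800 XT PE"]
    return gf - rad, (gf / rad - 1) * 100

for name, mode in (("Burn", burn), ("Idle", idle)):
    diff_w, diff_pct = compare(mode)
    print(f"{name}: +{diff_w:.2f} W (+{diff_pct:.0f}%)")
```

The output confirms the text: about +9W (+14%) in Burn and about +12W (+66%) in Idle, so the scary-looking 66% hides an absolute gap of only around ten watts.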

As shown, the GeForce 6800 Ultra takes practically no power through the AGP slot: the total draw on the 3.3V, 5V and 12V lines that go through the AGP connector is below 5W. But the card consumes as much as 48.91W in the Burn mode through the 12V line of the auxiliary power connector. The current in this line is 4.22 amperes, which is quite a lot even for a Molex 8981-04 connector.
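The two measurements on the auxiliary line are consistent via P = V × I, which also reveals the effective rail voltage under load. A minimal check, assuming the article's figures are exact:

```python
# Measurements quoted above for the auxiliary 12 V line in Burn mode.
p_line = 48.91  # watts drawn through the connector's 12 V line
i_line = 4.22   # amperes measured in that line

# P = V * I, so the effective voltage on the rail under this load is:
v_effective = p_line / i_line
print(f"Implied rail voltage: {v_effective:.2f} V")  # ≈ 11.59 V
```

The implied ~11.6V (rather than a nominal 12.0V) is what one would expect from a 12V rail sagging slightly under a heavy load, which is consistent with the article's concern about noise and voltage drops.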

So, it seems the two power connectors on GeForce 6800 Ultra cards are there to ensure higher stability of the system at large, by reducing the noise and voltage sags the graphics card would otherwise cause on the power rails shared with other devices. The reliability of the card itself also improves, as there is a lower probability of overheating, burnt contacts and other consequences of poor contact in the connectors.

My tests suggest that this graphics card consumes only a little more power than the RADEON X800 XT PE in the Burn mode, so NVIDIA’s recommendation of a 480W PSU looks like an overstatement. It is quite possible that early revisions of the card did have power-related problems, but we are now dealing with an off-the-shelf product. Still, the card does load the +12V line heavily, so to build a system around it (and a powerful CPU, for that matter), you need a high-quality PSU with a strong +12V rail — note that CPU voltage regulators are steadily moving to the 12V line as well.

