

Discussion on Article:
Power Consumption of Contemporary Graphics Accelerators. Part II: NVIDIA vs. ATI

Started by: 33 | Date 08/23/04 10:02:24 PM
Comments: 188 | Last Comment:  07/20/16 12:19:51 PM



Cool article...From a power consumption point of view, seems like the plain GF 6800 has good performance to power consumption. No wonder some companies are starting to announce passively cooled versions! Thanks for confirming my suspicions! :)

Will we be able to see some old GF4 cards, just for comparison?
0 0 [Posted by:  | Date: 08/23/04 10:02:24 PM]

Since the GeForce4 is rather outdated today, it does not make much sense to measure its power consumption. Furthermore, to my knowledge no one has had power issues with such boards, which probably means their consumption is fairly low.
0 0 [Posted by:  | Date: 08/24/04 02:59:24 PM]

How is it that the GeForce 6800 GT draws less power than the Ultra when overclocked past Ultra speeds? They are based on the same core, they have the same number of pipelines, they have the same amount and type of RAM, and I'm almost sure the benchmarks of the OC'ed GT will surpass those of the stock Ultra! So how is the power consumption still lower? Am I missing something, perhaps a cut-down core, or maybe some additional power-saving features?
0 0 [Posted by:  | Date: 08/23/04 11:46:13 PM]

Guys, please look at core voltages ;-)

Two factors contribute to the lower power consumption of the device: 1) the GeForce 6800 GT has lower GPU and memory frequencies than the 6800 Ultra, and 2) its GPU voltage is lower, too. According to my measurements on the Leadtek card, the GPU voltage was 1.35 V against 1.45 V on the 6800 Ultra. The effect of the reduced voltage is especially noticeable when overclocking: although the GeForce 6800 GT overclocked above the standard frequencies of the GeForce 6800 Ultra, its power consumption remained lower.
0 0 [Posted by:  | Date: 08/24/04 01:50:25 AM]
Exactly. The basic power equation is:

Power = Voltage × Current

And since the current a chip draws itself rises with voltage, the dynamic power of CMOS logic effectively scales with the square of the voltage, which is why a modest voltage drop can outweigh a clock increase.
0 0 [Posted by:  | Date: 08/24/04 01:32:47 PM]
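The point made in this thread can be sketched numerically. The snippet below uses the standard first-order CMOS dynamic power model, P ∝ f × V², to compare an overclocked GT against a stock Ultra. The 1.35 V and 1.45 V figures come from the thread; the clock frequencies (425 MHz for the overclocked GT, 400 MHz for the stock Ultra) are illustrative assumptions, and switched capacitance is assumed equal since the cards share the same core.

```python
def dynamic_power_ratio(f1, v1, f2, v2):
    """Relative dynamic CMOS power of chip 1 vs. chip 2, assuming
    equal switched capacitance: P ~ f * V^2."""
    return (f1 * v1 ** 2) / (f2 * v2 ** 2)

# Voltages are from the thread; clock figures are hypothetical examples:
# an overclocked 6800 GT at 425 MHz / 1.35 V vs. a 6800 Ultra at 400 MHz / 1.45 V.
ratio = dynamic_power_ratio(425, 1.35, 400, 1.45)
print(f"OC'ed GT draws about {ratio:.0%} of the Ultra's dynamic power")
```

Even with the GT clocked above the Ultra, the ratio comes out below 1, matching the measurement that its consumption stayed lower.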

Are you a retard or just ignorant?

"The card eats no more power than the topmost competitor model, the ATI RADEON X800 XT Platinum Edition."

So now, in Tim Tscheblockov's world, 72.09 W is no more than 63.23 W? What kind of statement are you trying to make? Slanted toward nVidia, I'm sure.
0 0 [Posted by:  | Date: 08/24/04 11:08:17 AM]
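For the record, the gap the poster is pointing at is easy to quantify from the two figures quoted in the thread (72.09 W for the NVIDIA card, 63.23 W for the RADEON X800 XT PE):

```python
nv_watts = 72.09   # NVIDIA figure quoted in the thread
ati_watts = 63.23  # RADEON X800 XT Platinum Edition figure quoted in the thread

pct_more = (nv_watts - ati_watts) / ati_watts * 100
print(f"{pct_more:.1f}% more power")  # roughly 14% more
```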

We have corrected the statement to one that reflects the situation more clearly. However, you should probably read the whole issue before making claims like this: the fact that we presented the number "as is" speaks to our balanced and independent position.
0 0 [Posted by:  | Date: 08/24/04 02:55:15 PM]

