Information

Dear forum members,
We are delighted to inform you that our forums are back online. Every topic and post is back in its place, and everything works just as before, only better. Welcome back!



Discussion

Discussion on Article:
ATI RADEON X1000: Brand-New Graphics Architecture from ATI Explored

Started by: Curious | Date: 10/06/05 12:54:52 AM
Comments: 38 | Last Comment: 12/16/06 07:48:37 AM



1. 

I have to say that I really enjoyed reading the article. However, there is one mistake in the conclusions.

"The fastest product in the new family, the RADEON X1800 XT consumes a lot of power and sometimes still fails to outperform the competitor in complex pixel shaders 2.0, because GeForce 7800 GTX boasts more pixel processors onboard."

The GeForce 7800 GTX is not faster in PS 2.0 because it has more pipelines, since 24 × 430 is more or less equal to 16 × 625.
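
To put numbers on that: 24 pipelines × 430 MHz ≈ 10,320 Mops/s for the GeForce 7800 GTX versus 16 pipelines × 625 MHz = 10,000 Mops/s for the X1800 XT, a gap of only about 3% (assuming, as a rough approximation, one PS 2.0 operation per pipeline per clock).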

The GeForce 7800 GTX is simply more efficient at executing PS 2.0, in the same way that the X1800 XT is faster at executing dynamic branching.

I would like to see how they both perform in HDR and PS 3.0 without dynamic branching. That would allow a more decisive conclusion about the performance of 3.0 shaders.

Anyway, keep up the good work. :)

Martin.
[Posted by:  | Date: 10/06/05 12:54:53 AM]

2. 
Is there a mistake on the Peak Power Consumption graphs page (Pg. 14)? The two graphs contradict each other on the X1800 XL's power consumption, and they also seem to clash completely with Anandtech's findings here: http://www.anandtech.com/video/showdoc.aspx?i=2552&p=7 .
[Posted by:  | Date: 10/06/05 05:22:12 AM]

Thank you for pointing that out. The mistake in the labeling has been corrected. Of course, the second graph refers to the X800 XL.
[Posted by:  | Date: 10/06/05 07:13:49 AM]

3. 
Nice review, one of the longest I have read. I have two suggestions:

1. What about idle power usage?
2. There's quite a bit of talk, e.g. at 3dcenter.de, about which driver settings produce comparable image quality. Computerbase.de, for example, activated "high quality" in the Forceware vs. "A.I. low" in the CCC. The optimizations implemented by nVidia are more visible than the ATI ones. Have a look at this article: http://www.3dcenter.org/artikel/g70_flimmern/index_e.php.
The G70 "quality" videos have become obsolete with the availability of the "fixed" Forceware 78.03 (see http://www.hexus.net/content/item.php?item=1549&redirect=yes), but you can still judge from the 6800 "quality" setting.
Perhaps you might rethink your driver settings in order to represent the same image quality.
[Posted by:  | Date: 10/06/05 03:20:51 PM]

Sorry, I confused "quality" with "high quality" for the fixed drivers. See http://www.nvnews.net/vbulletin/showthread.php?t=55812
[Posted by:  | Date: 10/07/05 02:04:24 AM]

4. 
Just a few things I noticed...

1. It doesn't seem like you used comparable image quality settings on the ATI and NVidia cards. Doesn't the ATI card have it easier?

2. Maybe you should have used the newer beta NVidia driver with the beta ATI driver, or waited until ATI released a WHQL driver before comparing it to the NVidia one.

3. Your conclusion is... NO CONCLUSION? 39 pages and you couldn't definitively say anything?

I'm looking forward to the 100 page "detailed gaming session" review soon! W00t!!!
[Posted by:  | Date: 10/07/05 12:32:50 AM]

