

Discussion on Article:
Clash of the Titans: ATI RADEON X800 PRO and ATI RADEON X800 XT Against the NVIDIA GeForce 6800 Ultra

Started by: lanky | Date 05/04/04 06:16:02 PM
Comments: 45 | Last Comment:  08/25/06 12:53:49 PM



I'd like to request that min and/or avg FPS graphs be included. I enjoy your reviews, but some of us would like to make sure the card won't dip too low in some cases.

Also any overclocking info would be nice.
0 0 [Posted by:  | Date: 05/04/04 10:19:19 PM]

This is the most biased review ever. It's simply anti-Nvidia bullshit.

Whenever Nvidia wins by around 10 fps you claim it's a tie between the two cards, but when they come within 3 fps of each other you claim that the "RADEON X800 XT manages to strongly outperform the rivaling NVIDIA GeForce 6800 Ultra."

This is simply bullshit, as the two cards are within 3 fps of each other.
0 0 [Posted by:  | Date: 05/05/04 12:51:02 AM]

I'm going to kind of nod @ comment #4...

ATI is using 24-bit floating-point precision in place of full 32-bit mode. Coming from a user who does digital photography on the side, that's not a limitation I'd think is acceptable anymore.

And the BIG thing is, back when you were touting the 9700/9800, you claimed that PS 2.0 was great because you were ready for the next big thing. Now PS 3.0 is in hardware, making programmers' lives easier with even bigger effects, and "ATI works just great with games you can play right now"?

This article is unusual for Xbit, and it is an article of convenience, catering to the ATI side of the fence while highlighting none of the sacrifices you have to accept in going to that side of the fence.

I like both, and I happily applaud good performance numbers, but at least highlight the serious compromises that ATI made to make these performance figures happen. I think your readers ought to be aware that ATI is banking on selling you a $500 card now and then another $500 card in a year to make up for all the things the NV card can do right now.

It's a poor financial choice, and you are hiding that fact behind performance numbers, imo. That is NOT a service to your readers.
0 0 [Posted by:  | Date: 05/05/04 02:43:19 AM]
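The 24-bit vs. 32-bit point above is about pixel-shader precision: ATI's R420 computes shaders in FP24 (16-bit mantissa), while NVIDIA's NV40 supports full FP32 (23-bit mantissa). A rough sketch of what the mantissa width means for relative precision (illustrative only; `machine_epsilon` is a made-up helper, and real shader output also depends on intermediate rounding):

```python
# Rough comparison of relative precision for FP24 (ATI R420 pixel
# shaders, 16-bit mantissa) vs FP32 (NVIDIA NV40, 23-bit mantissa).
# Illustrative sketch only, not a model of actual shader hardware.

def machine_epsilon(mantissa_bits: int) -> float:
    """Smallest relative step between adjacent representable values."""
    return 2.0 ** -mantissa_bits

fp24_eps = machine_epsilon(16)   # ~1.53e-05
fp32_eps = machine_epsilon(23)   # ~1.19e-07

print(f"FP24 relative step: {fp24_eps:.2e}")
print(f"FP32 relative step: {fp32_eps:.2e}")
print(f"FP32 is {fp24_eps / fp32_eps:.0f}x finer")  # 128x
```

Whether that 128x gap matters on screen depends on the shader: long dependent-texture chains accumulate error, while short color math usually does not.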

I find it odd that a free game like America's Army is not in the benchmarks...I am curious how it performs with ATI's latest products.
0 0 [Posted by:  | Date: 05/05/04 03:53:06 AM]

If you know about ATI's questionable "optimisation" regarding anisotropic filtering (where only the first texture stage gets trilinear filtering and the rest get bilinear filtering), how come you don't know about RTool, which can be used to enable correct trilinear filtering for all texture stages on ATI chips? Even worse, you claim ATI's optimization cannot be disabled, and you perform all tests using this low-quality bilinear anisotropic filtering method. Sure, other reviewers did exactly the same, but they didn't seem to have a clue about ATI's texture-stage bilinear filtering optimizations, while you knew and did nothing to change that.
I'm disappointed because I expected a review which would show the performance of these high-performance cards in a way which suits them best: using the highest-quality filtering the chips provide, with all driver optimizations disabled (just like you did with the GeForce 6800). Now I'll never know what the X800 XT's performance with the correct trilinear anisotropic filtering method would be like...
0 0 [Posted by:  | Date: 05/05/04 04:51:24 PM]
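For readers unfamiliar with the filtering terms in the post above: bilinear filtering samples a single mip level, while trilinear filtering also blends between the two mip levels bracketing the computed level-of-detail, hiding visible mip transitions. A minimal 1-D sketch of the difference (the function names are hypothetical; real hardware filters 2-D texels):

```python
# Minimal 1-D sketch of bilinear vs trilinear mip filtering.
# Hypothetical helper names; real GPUs filter 2-D texel footprints.

def bilinear_sample(mips, level, u):
    """Sample one mip level with linear interpolation along u in [0, 1]."""
    tex = mips[level]
    x = u * (len(tex) - 1)
    i = int(x)
    frac = x - i
    j = min(i + 1, len(tex) - 1)
    return tex[i] * (1 - frac) + tex[j] * frac

def trilinear_sample(mips, lod, u):
    """Blend bilinear samples from the two mip levels bracketing lod."""
    lo = min(int(lod), len(mips) - 1)
    hi = min(lo + 1, len(mips) - 1)
    t = lod - lo
    return (bilinear_sample(mips, lo, u) * (1 - t)
            + bilinear_sample(mips, hi, u) * t)

# Two mip levels with different content: a striped base level and its
# averaged-down next level.
mips = [[0.0, 1.0, 0.0, 1.0], [0.5, 0.5]]

# Bilinear snaps to one mip level, so a visible jump appears where the
# hardware switches levels; trilinear moves smoothly between them.
print(bilinear_sample(mips, 0, 0.25))   # -> 0.75
print(bilinear_sample(mips, 1, 0.25))   # -> 0.5
print(trilinear_sample(mips, 0.5, 0.25))  # -> 0.625
```

Running bilinear on stages 2-7 (as the complained-about optimization does) means those stages show the mip-transition banding that trilinear exists to remove.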

"The top-of-the-line $499 RADEON X800 XT appeared to be faster compared to its main competitor – the GeForce 6800 Ultra – in a plethora of applications where it was pretty natural to expect – the games that broadly use complex geometry and loads of math-intensive pixel shaders"

Excuse me, but this is totally untrue.
As shown by the F1 Challenge benchmarks, ATI's cards use much more CPU power than Nvidia's cards. So when a game needs a lot of CPU power, e.g. for AI, ATI's cards are not good. You may check setups with lower CPU speeds to verify this. Also, I think it is very bad for ATI not to include Shader Model 3.0, no matter the present situation, and if they want to stay alive in the mainstream market, they should include it in the very near future.
0 0 [Posted by:  | Date: 05/05/04 05:15:27 PM]
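The CPU-overhead argument above can be sketched with a toy model: assuming the CPU and GPU work as a pipeline, the slower stage sets the frame rate, so extra driver CPU time per frame only costs fps when the game is already CPU-bound. All numbers below are made up for illustration:

```python
# Toy frame-rate model (assumed pipelined CPU/GPU): the slower of the
# two stages limits the frame rate. Extra per-frame driver overhead on
# the CPU hurts only when the game is CPU-bound, as the commenter
# suggests for F1 Challenge's AI load. Numbers are hypothetical.

def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Frames per second when the slower stage gates the pipeline."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

game_cpu_ms = 12.0   # game logic + AI time per frame
gpu_ms = 8.0         # GPU render time per frame

print(fps(game_cpu_ms + 2.0, gpu_ms))  # low-overhead driver: ~71 fps
print(fps(game_cpu_ms + 5.0, gpu_ms))  # high-overhead driver: ~59 fps
```

The same 3 ms of extra driver overhead would be invisible in a GPU-bound scene (e.g. gpu_ms = 25), which is why the gap only shows up in CPU-heavy games.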

To #9: that is entirely false...

In fact, the selection of AF has no "rules" for texture filtering at all... There's no reason point sampling can't be used... ;) Additionally, stage-1-trilinear/stages-2-7-bilinear filtering only occurs when forced through the control panel. If the app/game requests AF/tri, it is applied correctly to all stages 1-7, etc.

This contrasts nicely with Nvidia, where there is no way you can set trilinear on subsequent texture stages with AF enabled... Everything is now "brilinear" on every stage, regardless of control panel/app/game requests for trilinear... So the comparisons were entirely correct. Also be aware that with Nvidia's latest ForceWare drivers, there's a "bug" where if you disable trilinear optimizations in the control panel, they are still active... Convenient bug, no...?

Nvidia had no choice but to copy ATI's AF method, because that's where they perceived they lost out to ATI with the R3x0 series... (I do not like their selection of 25° angle dependence, though; 22.5° is more "optimal".) Unfortunately, the 6800 can no longer perform "correct" trilinear as per the 5900 series.

BTW, RTool only does what normal games/apps do when requesting AF/tri.
0 0 [Posted by:  | Date: 05/07/04 10:09:02 AM]
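"Brilinear" is commonly described as narrowing the trilinear blend band: most of the screen gets cheap bilinear filtering, with true mip blending confined to a thin transition zone around each level switch. A sketch of that idea (the `band` parameter and the remapping are an assumption about the general technique, not NVIDIA's actual driver logic):

```python
# Sketch of "brilinear" filtering, modeled (assumption) as remapping the
# trilinear mip-blend factor t so blending happens only within +/-band
# of the level crossover at t = 0.5. Outside that band, the result is
# plain bilinear from a single mip level.

def brilinear_t(t: float, band: float = 0.25) -> float:
    """Remap blend factor t in [0, 1]; blending only near t = 0.5."""
    lo = 0.5 - band
    hi = 0.5 + band
    if t <= lo:
        return 0.0          # pure bilinear from the lower mip level
    if t >= hi:
        return 1.0          # pure bilinear from the upper mip level
    return (t - lo) / (hi - lo)  # narrow band of real blending

for t in (0.1, 0.4, 0.5, 0.6, 0.9):
    print(t, brilinear_t(t))
# 0.1 and 0.9 collapse to pure bilinear (0.0 / 1.0); only the middle
# values still blend.
```

The smaller `band` gets, the closer the result is to plain bilinear, which is where the performance savings (and the visible banding) come from.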

Sorry, that was a mistake. It should read: "Unfortunately, the 6800 can no longer perform 'correct' AF as per the 5900 series."

The NV40 is an excellent chip with no need for crappy image-quality-reducing tricks. In fact, I am very impressed by its pixel-pipeline ALU power...
0 0 [Posted by:  | Date: 05/07/04 10:14:39 AM]

