

Discussion on Article:
The Fastest Graphics Cards of 2003

Started by: guru | Date: 12/12/03 02:10:15 AM
Comments: 47 | Last Comment: 08/05/16 11:30:31 AM



I think your final summary is quite unprofessional.

- Those who are not interested in the next generation games;
- NVIDIA products fans :)

You tested one game which is still in beta and won't come out until the middle of next year! That alone cannot predict how future DX9 games will run on the current NV3x cards.

You do know that AF on ATI has lower quality than NVIDIA's, right?

[Posted by:  | Date: 12/12/03 02:10:15 AM]


NVIDIA GeForce FX is behind ATI RADEON 9500-9800 in every single game or benchmark when pixel shaders are used. Moreover, NVIDIA GeForce FX somehow manages to be slower in games like Splinter Cell.

Our general expectation, based on performance in the highly anticipated DirectX 9 game Tomb Raider: The Angel of Darkness as well as some other games and numerous synthetic benchmarks such as 3DMark03, is that the NVIDIA GeForce FX series will be behind comparable models from the ATI RADEON 9500-9800 series. We maintain this expectation at this time.
[Posted by:  | Date: 12/12/03 04:26:33 AM]
Well, I don't disagree with the FX being slower than the Radeons in most cases; that's a pretty well-known fact. What I disliked was the indirect statement that FX cards would be totally useless for next-gen DX9 games, which is not the case.

They might not run as fast as the Radeons in pure benchmark tests, but DX9 games will of course be playable with good fps.

It's the statements below that I think are unprofessional. How about OpenGL games; heard of DOOM 3?

- Those who are not interested in the next generation games;
- NVIDIA products fans :)
[Posted by:  | Date: 12/12/03 05:26:11 AM]

The publication of the benchmarking results for the GeForce FX 5800 Ultra (NV38) is bad news for NVIDIA

FX 5800 Ultra = NV30, not NV38 ;)
[Posted by:  | Date: 12/12/03 06:14:25 AM]

[quote]Moreover, it is quite possible that NVIDIA’s software developers will fail to eliminate all bugs from the NV3x architecture with the help of the shader code compiler built into the ForceWare driver. As a result, the performance of GeForce FX based solutions in the next generation games such as Doom III, Half-Life 2 and some others will turn out absolutely unacceptable.[/quote]

I'm an ATI fan... hell, I've got a Radeon 9500 PRO (love it!), but I think this statement is a little silly insofar as it suggests that the R3xx series will surpass NV3x in Doom 3.

Doom 3 is an OpenGL title with a highly optimized NV30 codepath that the GeForce FX cards eat up. The relative Doom 3 performance between card manufacturers on release will probably look comparable to Serious Sam.
[Posted by:  | Date: 12/12/03 07:34:05 AM]

I have no clue what is up with everyone who benchmarks now. They dwell on the nVidia incident, and it seems like that is all anyone thinks about during benchmarking. There are NO references to past incidents by the other major companies, and all you guys seem to do is slam nVidia. What sickens me is that in the actual game benchmarking, if nVidia seems to be pulling ahead of ATI, you say things like "We should probably really think about the correctness of the results obtained in this gaming benchmark", but if ATI is completely pulling ahead of an nVidia card, it's "RADEON 9800 XT remains the winner in the high-end competition", as if we aren't supposed to think you're biased? Give me a break.

OK, so let me get this straight one more time. If nVidia seems to be "winning" against an ATI card, there is OBVIOUSLY something wrong with the image quality. I mean, they cheated once before, so THEY MUST STILL BE DOING IT!!! So we are going to slam them constantly whenever they simply perform better. Now take ATI: if they are ahead, it is because they are simply better. "To be expected" is how one takes your biased remarks.

Excuse me if I sound rude, but you benchmarkers disgust me.
[Posted by:  | Date: 12/12/03 12:31:46 PM]

I agree completely... this is not a professional review, and the results should be taken with a serious dose of salt. It's obvious that the guy who wrote this is yet another ATI fanATIc.

Look, if ATI slacks off, then the results are surely misleading and need looking at... yeah, yeah... It's fairer to say that ATI are better at some things and nVidia are better at others.

I notice that there are no comments like this when the 5700 scores far higher than the 5950... there was even one benchmark where the 5600 owned the 5950... ffs, is this a serious comparison??? I think not.

To absolutely rule out the FX range of cards for all DX9 applications is wrong, stupid and blind.

It is a fact that nVidia have the best driver developers in the world. In pure terms of stability you simply cannot compare ATI to nVidia; that's like comparing a 386 to an Athlon FX-51... nVidia have proven that they are improving their drivers in DX9, just compare the 52.xx and the 53.xx releases... and even a site as reputable as Tom's Hardware indicates that there is something 'dubious' going on in ATI's Catalyst driver.

How can you mention Doom 3 results when you HAVEN'T EVEN BLOODY played the game??? I mean, how skewed is that... do not believe everything you read on a website.

How legitimate is your comparison with your so-called secret DX9 game?

OK, firstly, you make no claims to legitimate benchmarks, so how do we, the discerning public, know that this isn't the version you downloaded from the newsgroups when it was hacked from Valve... and why would Valve have released such an exclusive benchmark to your website when they haven't released it to the major sites such as Tom's or Anand... or are you breaching an NDA?

Oh, and gee, whilst I'm at it... let's look at the real benchmark issues with HL2. ATI has 24-bit precision... nVidia has 32-bit precision... higher is better.
When the drivers are ready, and let's face it, they will be (I know you don't agree, but don't tar nVidia's programmers with the same brush that you tar ATI's), this alone means that HL2 will look better on nVidia than on ATI... I really don't care if a 9800 can do an average of 9 fps better if it looks worse... at 60 fps, who can tell the difference?

When the HL2 codepath for the NV cards is complete, I wouldn't be surprised if the high-end nVidias matched or beat the 9800's performance, as they certainly already beat them on visual quality...
Just a thought here: don't forget that more and more people are moving to TFT screens, and in the majority of cases the max resolution you can achieve is 1280x1024... ergo, why do a lot of people care how the performance is at 1600x1200? You have to level the playing field to the level that people will actually use...

I know it's hard to do any serious benchmarking; I understand that at the level of detail you guys went to, it can't be easy... I for one greatly appreciate how much time this must have taken... but the answers and your comments show for sure that you are biased towards ATI and that you haven't given 'The Fastest Graphics Cards of 2003' a fair assessment.

That reminds me, my main graphics card is a Radeon 9800 XT.
[Posted by:  | Date: 12/12/03 02:33:14 PM]
Guys, please be more constructive.

The only constructive thing I can find in your comment is the point about precision. Permit me to explain.

MS and ARB allow full precision to be 24 to 32 bits. There is no difference in image quality between them. NVIDIA wants everyone to use 16-bit precision for pixel shaders to get higher speed on its hardware. 16-bit, which they call partial precision, does not qualify as full precision under MS or ARB rules. All the mixed modes for NVIDIA I have heard about either used DX8.1, degraded precision to 16-bit, or degraded something else.

So, you do not seem to be right about precision.
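[Editor's note: to put rough numbers on the precision point above, here is a minimal sketch, not from the original discussion. Python's `struct` module can round-trip a value through IEEE 16-bit ("half") and 32-bit ("single") floats, which approximates what a shader register would hold at partial versus full precision.]

```python
import struct

def roundtrip(fmt, x):
    """Store x at the given float precision and read it back."""
    return struct.unpack(fmt, struct.pack(fmt, x))[0]

value = 0.1  # a typical fractional color / texture-coordinate value

err16 = abs(roundtrip('e', value) - value)  # 16-bit "partial" precision
err32 = abs(roundtrip('f', value) - value)  # 32-bit full precision

print(f"16-bit error: {err16:.3e}")  # around 2.4e-5
print(f"32-bit error: {err32:.3e}")  # around 1.5e-9
```

The 16-bit representation loses several orders of magnitude more accuracy per operation, and in a long shader those rounding errors accumulate, which is why partial precision can produce visible banding.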
[Posted by:  | Date: 12/12/03 02:52:09 PM]
I apologise, that was not meant to be totally constructive :). It was meant to point out that (according to you) in one benchmark where the results are TOO similar you pass it off as inconclusive, while you fail to point out that on average there is no noticeable difference between the high-end products. As I mentioned, how can you tell the difference between 45 and 50 fps when you are playing and CANNOT see the FPS counter? If you tell someone something is faster, they will believe you... if you don't tell them and ask them to judge for themselves, they cannot tell the difference; this is Psychology 101.
[Posted by:  | Date: 12/12/03 03:05:21 PM]

Can you point out any other cheating incidents by a major graphics company that happened recently, in addition to NVIDIA's ultimate cheating episodes?

If you read our article carefully, you probably noticed that the obvious problem with the F1 game is that it basically locks performance at two points.

I don't quite understand your next two claims. Sorry.
[Posted by:  | Date: 12/12/03 02:34:00 PM]
OK, forgetting the Quake/Quack incident with ATI, let's look at AquaMark... It is shown here at Tom's Hardware able_optimizations_in_atis_drivers

Understandably... Tom's refuse to say for certain that ATI are optimizing in a method unbecoming a major graphics retailer... however, a simple look at the screen grabs shows an obvious difference... In fact, read the entire review on Tom's from here

and another comparison here ...

You might get a possibly fairer assessment.

Again, kudos to Alexey Stepin for a very hard job, mostly well done.
[Posted by:  | Date: 12/12/03 02:50:51 PM]
The ATI and Quake issue happened only once, more than two years ago. Nobody is interested in it anymore.

We do not comment on the findings of other web sites, especially in cases when a company explains its position. In your case, ATI did that.
[Posted by:  | Date: 12/12/03 02:54:29 PM]
Ohh, but you comment on the findings for Doom 3... without having played it...
[Posted by:  | Date: 12/12/03 02:56:43 PM]
We are relaying John Carmack's comments, Troj.
[Posted by:  | Date: 12/12/03 02:59:49 PM]
And that is so much more valid than reading another website's comments?

If you read the entire draft, then you are only paraphrasing Carmack... Granted, he said the R300 path is faster using defaults, but if you optimize for nVidia, then (again paraphrasing) it's a damn sight faster again... I believe he used the word 'noticeable'.
[Posted by:  | Date: 12/12/03 03:12:06 PM]
Never mind the fact that at the time John Carmack said the NV was slower, the ForceWare drivers did not exist yet, and neither did the 5950.
[Posted by:  | Date: 12/12/03 03:34:41 PM]
That is my point exactly, though: it was years ago, and this one will eventually fall out of major news too. The point is that ATI fixed it, and nVidia have fixed most of their problems; my main concern is with the people who won't let it drop. It is in the past; don't hold it against a company that was there in the past and most certainly has a future.
[Posted by:  | Date: 12/12/03 04:54:39 PM]

Did you even take into account that ATI cannot even implement trilinear filtering while running anisotropic?... WTF, you bash nVidia and don't even have the specs correct...
[Posted by:  | Date: 12/21/03 01:10:49 AM]

Have you ever tried to READ something??? I mean, your quantity of BS is quite staggering. Where the hell have you been for the last year and a half? Please be so kind as to read some articles about the R200, and then some reviews of the R300. Maybe then you will make a comment worth something!
[Posted by:  | Date: 12/21/03 02:51:46 PM]
