Dear forum members,
We are delighted to inform you that our forums are back online. Every single topic and post is now in its place, and everything works just as before, only better. Welcome back!
Discussion on Article:
The Fastest Graphics Cards of Summer 2004
If you are unaware, "The Black Death" track is the most widely used benchmarking utility in the FB world because of its incredible load on both the CPU and the graphics card.
I can't understand your arrogance. Everyone else has posted completely different numbers, whether they be other online reviewers or other gamers using FB, and yet you refuse to even entertain the idea that something is wrong with how you tested. Incredible! I'll leave you to your ATI-worship - at least I now know where you stand. B'Bye now...
However, there is a flaw. Why must Xbitlabs ALWAYS neglect to mention that Nvidia's driver optimizations are turned OFF?
You cannot turn off ATI's optimizations, and so ATI gets the edge once you turn on AA+AF.
If you were to re-run the benches with Nvidia's optimizations ON, Nvidia would look a lot better with AA+AF.
At least edit your review and put the important information of whether Nvidia's optimizations are on or off in the test bed section.
Unfortunately though, this article was far too biased to be taken seriously.
Whenever NVidia wins, we get a comment like "NVIDIA’s solutions are evidently better in this game, but it’s hard to account for this fact." and "At least we can’t lay blame on the low geometry processing speed anymore – ATI’s new graphics processor is NVIDIA’s better in this respect as you know."
What we *know* is that NVidia won that benchmark. Trying to pass that off as some anomaly, while stating (without any proof) that ATI's cards are better in this respect, just doesn't sound very professional. If they are better, show us. If you can't show us, don't pretend that it's a fact.
Other comments are "With FSAA and AF enabled, NVIDIA’s graphics cards look unsure, as we’ve seen a number of times, since their algorithms of working with the memory are less efficient." about a benchmark where the 6800 U beats the X800XT by 10+ fps. If that is unsure, then what are ATI's cards? Paralyzed?
But when ATI cards are ahead, there's no need to look for an explanation.
In another benchmark, we're told that the 3-5 fps difference (in NVidia's favor) is a "small difference", but a few pages earlier, ATI is said to perform better because it has about the same lead in fps over the NV cards.
"The RADEON X800 XT equals the GeForce 6800 Ultra in this game" about a benchmark where the 6800 Ultra wins by 8/1/4 fps (one figure per resolution).
We're talking about something like a 10% lead to NVidia here.
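For reference, a percentage lead is just the fps delta divided by the slower card's score. A minimal sketch of that arithmetic (the absolute fps figures below are made up for illustration, since the post only quotes the deltas):

```python
def pct_lead(winner_fps: float, loser_fps: float) -> float:
    """Percentage lead of the faster card over the slower one."""
    return (winner_fps - loser_fps) / loser_fps * 100

# Hypothetical scores: an 8 fps win over a base of 80 fps works out to 10%.
print(round(pct_lead(88.0, 80.0), 1))
```

So whether an 8 fps delta is "about 10%" depends entirely on the base framerate, which is why quoting only the delta can be misleading.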
"The top-end members of the new GPU series from ATI and NVIDIA again give out the same number of frames per second" in Flight Simulator 2004, where the 6800 U wins by 3-4 fps in all resolutions, in pure speed as well as eye candy mode, but on the next page (X2), "The RADEON X800 XT seems to be winning in this test", despite only having a lead of 0.6, 1.5 and 0.6 fps in the different resolutions.
In Eye Candy, the 6800 U pulls ahead by 7, 3 and 1 fps, but we're told that "The situation hardly changes in the “eye candy” mode".
I'm not questioning the benchmarks, and I'm not saying ATI's cards suck; I'm asking for a fair description of the benchmarks. I'm still trying to decide which card to buy, and reading biased articles like this doesn't help. I want the best card, not the one from the reviewer's favorite company.
The reason for that is that Nvidia's AF optimizations were disabled, which isn't fair.
However, I find that weird, because the NV40 always beats ATI in CoD. If you look at some of your past reviews, you can see this clearly...