Dear forum members,
We are delighted to inform you that our forums are back online. Every single topic and post is now in its place, and everything works just as before, only better. Welcome back!


Discussion on Article:
Nvidia GeForce GTX 275: Worthy Rival to Radeon HD 4890

Started by: Vlad | Date 05/14/09 10:58:23 PM
Comments: 18 | Last Comment:  06/15/16 09:56:07 AM



This is a well-done review. Probably the most accurate and the cleanest of all the reviews of the GTX 275 I've seen so far. Still, the numbers in HAWX and Far Cry 2 are unjustified. I mean, it's well known that all Ubisoft games support ATI's Adaptive AA. And it's well known that ATI's Adaptive AA, especially in QUALITY mode, is not only superior to Nvidia's transparency AA but also much more demanding.

I mean 27 fps in HAWX...!!!
That's the game where ATI cards get significantly better results than Nvidia's.

I think that testing both Nvidia and ATI cards with plain MS-AA would make for a fairer performance comparison.
0 0 [Posted by: Vlad  | Date: 05/14/09 10:58:23 PM]

This may have been answered a long time ago, but why is there a different system (a P4 system, at that) for the power consumption tests? What's special about it? The benchmarks are then run on an i7 system.
0 0 [Posted by: lh3nry  | Date: 05/15/09 08:34:43 AM]

Because this system has been modded to allow power consumption measurements on the internal +12V and +3.3V rails of the PCIe x16 slot. Soon we'll replace it with a more advanced one.
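The measurement Vader describes boils down to simple arithmetic: read the current on each rail feeding the card and sum V × I. Here is a minimal Python sketch of that idea; the rail names and readings are hypothetical, not Xbit's actual tooling:

```python
# Illustrative sketch: card-only power draw is the sum of V * I over
# every supply rail feeding the card. All names/values are hypothetical.

RAIL_VOLTAGES = {
    "pcie_slot_12v": 12.0,   # +12V pins of the PCIe x16 slot
    "pcie_slot_3v3": 3.3,    # +3.3V pins of the PCIe x16 slot
    "ext_6pin_12v": 12.0,    # external 6-pin PCIe power connector
}

def card_power_watts(currents_amps):
    """Sum V*I over every rail for which a current reading exists."""
    return sum(RAIL_VOLTAGES[rail] * amps for rail, amps in currents_amps.items())

# Hypothetical current readings under 3D load:
readings = {"pcie_slot_12v": 3.1, "pcie_slot_3v3": 0.4, "ext_6pin_12v": 8.9}
print(round(card_power_watts(readings), 1))  # 145.3
```

This is why a modded riser/slot matters: the slot's +12V and +3.3V contributions are invisible to a wall-socket meter, which can only see total system draw.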
0 0 [Posted by: Vader@Xbit  | Date: 05/15/09 10:48:15 AM]
That's what I like about Xbit reviews: your measurements of the card's consumption alone. At least to me, that's a much more useful result than just measuring total power at the wall socket.
0 0 [Posted by: Joe Public  | Date: 05/19/09 04:49:28 AM]

Excellent... excellent... it seems that you're really punishing ATI for not improving their angle-dependent AF algorithm by using QUALITY Adaptive AA instead of Performance Adaptive AA against Nvidia's TR-MSAA. Pretty much every enthusiast in the world knows that ATI's Quality AAA is roughly equal to Nvidia's TR-SSAA, not TR-MSAA.

Well, it would probably take a few percentage points away from ATI's scores if ATI used the same angle-INdependent AF algorithm as Nvidia. Although it might not be as bad as using QUALITY Adaptive AA (since that performance hit can be more than just a few percentage points), I guess some of us are willing to pay more for better AF quality.
0 0 [Posted by: Bo_Fox  | Date: 05/15/09 02:07:45 PM]

Not improving their AF?

At Xbitlabs, they always test GPUs with texture filtering on high quality for both ATI and Nvidia cards.

Nvidia's default texturing is the Quality mode with optimizations enabled.
It's a combination of bi-linear and anisotropic filtering, instead of the classic mix of tri-linear and anisotropic filtering in ATI's default QUALITY texture filtering. This simplified combination in Nvidia's default texture filtering leaves no smoothness in the transitions between mip-levels, and causes the characteristic texture flickering in games and the "volatile" borders of levels that you can see in motion...
So yes, ATI's default texture filtering is better than Nvidia's.

Enabling the High Quality mode automatically disables the optimizations on Nvidia's cards and makes use of tri-linear texturing with higher-quality anisotropic filtering samples, so you get a noticeable increase in quality.
But even then, ATI's High Quality texture filtering is more effective: it provides higher sharpness of the textures and more effectively decreases the fogginess of objects positioned at long distances in the scene.
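The mip-transition point made above can be illustrated with toy code: per-mip bilinear filtering samples a single mip level, so detail snaps abruptly at mip boundaries (the visible banding/flickering described), while trilinear filtering blends the two nearest mip levels into a smooth transition. Everything here is illustrative, not any driver's actual math:

```python
# Toy illustration of mip-level transitions (values are made up).

def bilinear_only(mip_level_float):
    # Bilinear-per-mip snaps to a single mip level, so texel detail
    # jumps abruptly at mip boundaries (visible "bands" in motion).
    return round(mip_level_float)

def trilinear(mip_level_float, mip_detail):
    # Trilinear linearly blends the two nearest mips: no visible seam.
    lo = int(mip_level_float)
    hi = min(lo + 1, len(mip_detail) - 1)
    t = mip_level_float - lo
    return mip_detail[lo] * (1 - t) + mip_detail[hi] * t

mips = [1.0, 0.5, 0.25]      # per-mip "detail" values, purely illustrative
print(bilinear_only(1.4))    # 1 -> snaps to a single mip
print(trilinear(1.4, mips))  # blends smoothly toward the coarser mip (~0.4)
```

The snap in the first function is the source of the flicker: as a surface recedes, the chosen mip changes in one step instead of fading between levels.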

Now, talking about the AA used in the Xbitlabs test: the transparency Multi-sampling AA used for Nvidia's cards hardly does any job for the smoothness of edges compared to the default MS-AA, but it does allow full HDR rendering. Although in most current games there's no need to enable TR-AA for Nvidia cards or AD-AA for ATI cards to allow HDR rendering, since AA and HDR use independent rendering paths in most of those games.

Compared to Nvidia's TR-MS-AA, ATI's AD-AA on "PERFORMANCE" is far better, to say nothing of the "QUALITY" AD-AA that was used in the Xbitlabs test.

Nvidia's Super-sampling AA is "slightly" more accurate than ATI's "QUALITY" AD-AA but more blurry... and compared to ATI's Super AD-AA, it's nowhere near.

Here are the FS-AA modes used by both manufacturers positioned from the worst to the best:

CS-AA - used by Nvidia

MS-AA - Used by both

TR-MS-AA - Used by nvidia

AD-AA on "PERFORMANCE" - used by ATI

AD-AA on "QUALITY" - used by ATI

Super-Sampling-AA - used by nvidia

Super-Adaptive-AA used by ATI.

As for the Coverage Sampling AA that has been supported since Nvidia's 8000 series cards, this CS-AA is nothing more than a slight "downgrade" from the default MS-AA.
Nvidia's CS-AA is a response to MS-AA being too costly beyond 4x, towards 8x and 16x AA. Even with the increase in GPU performance, 16x MS-AA is still far too demanding. CS-AA does the same job as MS-AA until you enable 8 or 16 samples, and that's when CS-AA falls behind MS-AA...

With 16X CS-AA, you have the same amount of colour and Z samples taken as with 4x MS-AA....

The benefit, claims NVIDIA, is that CSAA offers similar image quality to MSAA but with much less of a performance penalty.
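The sample-budget point in the last few paragraphs can be put in a tiny table. The 16x CSAA relation (4 colour/Z samples stored, 16 coverage samples tested, i.e. the same colour/Z cost as 4x MSAA) comes from the post itself; the other rows are illustrative, not a hardware spec:

```python
# mode -> (color/Z samples stored, coverage samples tested).
# Only the 16x CSAA vs 4x MSAA relation is taken from the post above;
# treat the table as an illustration, not an Nvidia specification.
AA_MODES = {
    "4x MSAA":  (4, 4),
    "8x MSAA":  (8, 8),
    "16x CSAA": (4, 16),
}

def color_z_samples(mode):
    """Color/Z samples drive framebuffer storage and bandwidth cost."""
    return AA_MODES[mode][0]

# 16x CSAA stores no more color/Z data than 4x MSAA:
print(color_z_samples("16x CSAA") == color_z_samples("4x MSAA"))  # True
```

That shared colour/Z budget is exactly where the claimed performance saving comes from, and also why CSAA can fall behind true 8x/16x MSAA on thin geometry.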

So overall, ATI offers better texturing than Nvidia in both the Quality and High Quality modes, and offers better mid-class and super-class AA than Nvidia.

0 0 [Posted by: Vlad  | Date: 05/15/09 09:21:57 PM]
Are you aware that Nvidia's AF algorithm is fully angle-independent? At 45-degree angles, ATI's AF algorithm is only as good as Nvidia's 2x AF at best, even if you turn it all the way up to 16x on ATI cards. Remember the horrible AF algorithm on G70 cards and older ATI cards?

Nvidia's pure SSAA modes offer the absolute best in image quality. However, Nvidia does not officially support those modes. I use either Nhancer or Rivatuner to force those modes. My favorite mode is a combined 2x2 SSAA + 2x RGMSAA mode which is called 12xS. Remember when nothing rivalled Nvidia's old 8xS mode back in the FX5900/R300 days? The performance hit is acceptable for older games, like UT99, etc.. Another bonus is that it effectively doubles your AF to 32x. The only problem is that the smaller text on the GUI might be blurred a bit, since everything is super-sampled. I wish ATI would offer those modes also, if not support them. Not all of us have 2560x1600 monitors for the best resolution, and we still want the best out of older games.
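The combined-mode arithmetic described above (2x2 SSAA layered on 2x RGMSAA, which also doubles the effective AF level) can be sketched in a few lines. The formulas are my reading of the post, not an Nvidia specification:

```python
# Assumed model from the post: NxM ordered-grid supersampling renders the
# frame at N*M times the pixel count, then downsamples; MSAA applies on
# top, and supersampling multiplies effective AF by the linear factor.

def combined_mode(ss_x, ss_y, msaa, base_af):
    samples_per_pixel = ss_x * ss_y * msaa      # total AA samples per final pixel
    effective_af = base_af * max(ss_x, ss_y)    # SS raises texture sampling density
    fill_cost_factor = ss_x * ss_y              # raw rendering-cost multiplier
    return samples_per_pixel, effective_af, fill_cost_factor

# 2x2 SSAA + 2x RGMSAA with 16x AF, as described for the "12xS" mode:
print(combined_mode(2, 2, 2, 16))  # (8, 32, 4)
```

The last value is why these modes only suit older games: rendering 4x the pixels quarters your fill rate before MSAA costs are even counted.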

Yes, Nvidia's TR-SSAA does make alpha texture edges more blurry than ATI's Q-AAA. Sometime around the release of the G80 cards, Nvidia improved its TR-MSAA to include a little TR-SSAA, just like ATI's P-AAA, so that it could compare better against ATI. The performance hit was a bit greater than with Nvidia's old TR-MSAA, which hardly offered any benefits. The old TR-MSAA only made the textures shimmer a little less when it came to alpha textures that were being slightly AA'ed, but it often introduced errors in properly displaying the edges of alpha textures, distorting them past what was intended by the game designers. Usually there was no performance hit at all with the older "questionable" TR-MSAA method.

I do prefer ATI's Q-AAA over Nvidia's TR-SSAA, in the end. ATI does almost just as good a job at smoothing out the edges of transparent textures, WITHOUT distorting the areas near the edges (making them BLURRY like Nvidia does).

About the CSAA modes, I cannot accept them. They make the wires of power lines and thin objects way too thin, sometimes literally making them invisible. I call it "flawed" AA. When playing at high resolutions, it makes it more difficult to see distant characters, in UT2004, for example, when you're trying to spot enemies at a distance. Pure SSAA modes actually make the wires thicker, appealing to my own preference for a fuller, more robust image.

What you said about ATI having a better AF method could be true for a few games. Yes, a few objects do seem a little less foggy in some particular games, but I cannot remember where the articles on this were. Could you please provide a link on this matter? Vlad, did you work for Xbitlabs? I seem to recall an editor with the same name...
0 0 [Posted by: Bo_Fox  | Date: 05/16/09 11:34:34 AM]
Here's a quality comparison article from xbitlabs:

According to this article, ATI's high quality AF is better than Nvidia's by miles.
0 0 [Posted by: Vlad  | Date: 05/16/09 03:35:18 PM]
Vlad, Vlad, Vlad..

You're only looking at an ancient AF method that is no longer used on G80 cards and newer. It only applies to the 7950 and older cards. Yes, ATI's HQ-AF was better by miles, but now Nvidia's Quality AF is the best.

Here's a link to keep you updated as of late 2006:

Here's one that shows all of the comparison, including screenshots:

0 0 [Posted by: Bo_Fox  | Date: 05/17/09 11:57:26 PM]
Beyond3D is a well-known Nvidia-sponsored site. I'm not going to believe them over Xbitlabs...
As for the AF and AA, both Nvidia and ATI still use the same AF and AA modes (except for CS-AA, which is a downgraded MS-AA at the 8x and 16x levels) that they were using in their 7000 and 1000 series cards.

0 0 [Posted by: Vlad  | Date: 05/18/09 03:20:40 AM]
Look, Vlad:

Just scroll down to the bottom and see how it's different from the old link you provided above. It's NOT the same AF method as the 7950 card. The program D3DAFtester does not lie, and if you have the card, please test it yourself.
0 0 [Posted by: Bo_Fox  | Date: 05/18/09 12:07:46 PM]
Thank you, Bo Fox; the red rooster FUD is always thick wherever I go. It's just amazing what the red rooster rooters come up with. lol
They are more skilled than our smarmy politicians.
On this whole AA/AF thing: about 6 months ago I saw the two modern rivals put to a blind test, side by side on identical computers except for the card, and the polled screen viewers couldn't pick the winner.
Yeah, they were noobed, as far as I recall.
I'm going to look at HardOCP; they have some image stuff there.
0 0 [Posted by: SiliconDoc  | Date: 06/22/09 10:55:02 PM]

I am really disappointed because you still use the old driver for Nvidia cards: 182.06.

The new driver, 185.85, provides noticeable improvements in games like Crysis Warhead, Far Cry 2 and Fallout 3. You should have used it, considering that you are already late in reviewing the GTX 275; the new driver also improves the efficiency of GeForce cards at high resolutions.

At least you could have used the beta driver, like most other sites: HardOCP, AnandTech, Tom's Hardware, Guru3D and many more all use the new driver or have used the beta one.

0 0 [Posted by: MuhammedAbdo  | Date: 05/15/09 02:35:46 PM]

This is an excellent review.
0 0 [Posted by: konstantine  | Date: 05/17/09 06:15:55 AM]

A quick look at Nvidia's TR-MSAA compared to ATI's Quality AAA makes one wonder why Xbitlabs is treating those modes as equal:

In the middle row for the 8800GTX column, it shows TR-MSAA, and the bottom row shows the maximum quality, which is TR-SSAA and Quality AAA for the HD 2900XT, respectively. Take a close look at the tree leaves/branches, and the big letters on the sign.
0 0 [Posted by: Bo_Fox  | Date: 05/18/09 12:22:22 AM]

Ahha! Evidence the 4850 is not as good as even the 8800!
And one guy points out the NV control panel allows specialized forcing in each specific game, which, no matter what ATI fanbois say, is AWESOME!
Yes, the specialized game-specific default CP settings that come with the Nvidia driver, and the ability to megatweak in there, are really great. When new games are installed they appear, or alternately you can just browse to their executable to add them.
That is just one more Nvidia masterpiece of butt-kickage to add to the long list of Nvidia advantages (CUDA, PhysX, now ambient occlusion shadows for games with the new driver), and it is never crowed about enough.
Sorry, I just really hate the ATI bullhockey I've been subjected to for over a year now. I'm not anti-ATI-card, I have some of them; it's just the BS so many try to shove down everyone's throat.
If it's a few frames at high rez with AA and AF, I'll take the Nvidia card like LIGHTNING, with all the other advantages, so numerous: from the light driver CP, to stability, to driver upgrades that add features (that's been going on a long time, like with the Ti4 series). I just hate it when ATI fans say absolutely none of it matters because they have a huge hate chip on their shoulder for Nvidia.
Sorry, 3, 4, 5 or more added advantages and features and it "doesn't matter"? NO WAY.
I even saw the red fans shrieking over at AT that ambient occlusion sucks and they don't like shadows in games. LOL
They suddenly hate shadows in a game. Then they clucked that if it wasn't for Nvidia, ATI's tessellation would be a standard, and anything that doesn't run on ATI is garbage...
Now that takes some real fanboy illogic to spew that kind of stuff out, and they're dead serious. It's even Nvidia's fault that PhysX doesn't run on ATI cards, and it should be destroyed because it's proprietary, they say. LOL. Right after tessellation is good if it runs only on ATI.
Whatever - it's just amazing. ATI is losing billions, so their smaller core saved 'em nothing - and Nvidia has been posting profits all those 9 quarters ATI lost $$$$$$$$$$$$$$$.$$
Thanks to those who play fair and stop the BS'ing.

0 0 [Posted by: SiliconDoc  | Date: 06/22/09 11:11:39 PM]

