by Alexey Stepin, Tim Tscheblockov, Anton Shilov
09/07/2004 | 05:39 AM
This review is a continuation of our all-around tests of the GeForce 6600, the new mainstream graphics processor from NVIDIA Corporation. Earlier we examined its technical characteristics and capabilities, described the reference graphics card based on the GPU, and carried out theoretical tests. If you're interested in that review, follow this link. In brief, on that first approach we found the new GPU from NVIDIA an excellent solution: it left previous-generation mainstream GPUs no chance and sometimes even challenged that generation's top-end models.
The second part of the review is concerned with testing NVIDIA's new offering in a selection of modern gaming applications. It's no secret that games are the main application area of consumer graphics cards, so we think games must also be used to benchmark them. We use many games of various genres so that our tests are as representative as possible of a graphics card's performance. And since we're benchmarking modern graphics processors, we must certainly include the most visually advanced games that employ all of a GPU's features and capabilities.
In this testing session we tried our best to stick strictly to the rules above. Various game genres are all represented, with a handful of new popular titles, among which id Software's new creation, Doom 3, stands prominent – we even dedicated a separate review to this game. We used the new version of the popular multiplayer shooter Counter-Strike, called Counter-Strike: Source; it is still officially a beta, though. And we also employed a new synthetic test based on the engine of the upcoming Half-Life 2, which mostly showcases various pixel-shader-based effects.
We decided to pit the GeForce 6600 GT against the last season's stars, the GeForce FX 5950 Ultra and the RADEON 9800 XT, since they have eight pixel pipelines like the new NVIDIA GPU, work at comparable frequencies and, most importantly, sell for $200-300 today, thus belonging to the same price category. Users ready to spend that sum of money on a graphics card will surely weigh these offers against one another.
With the arrival of new solutions (RADEON X800 XT/PRO and GeForce 6800/Ultra/GT), older 8-pipeline GPUs, once the performance leaders, have moved a class down, to the sector of mainstream graphics cards. Regrettably, this is not a quick process, and such cards still cost quite a sum, since the new-generation products are rare guests in shops yet. The prices will certainly fall, though, and these ex-leaders will then have to face competition from the new breed of mainstream GPUs, the GeForce 6600 and RADEON X700. So we want to see how competitive the GeForce 6600 is against well-known combatants like the RADEON 9800 XT and the GeForce FX 5950 Ultra. The latter two GPUs have an added bonus: their 256-bit memory bus. Theoretically, it should give them an advantage in high resolutions and/or with full-screen antialiasing. We will see shortly how big this advantage really is.
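The theoretical gap that the 256-bit bus creates can be estimated with simple arithmetic. The sketch below uses the commonly cited reference memory clocks of these cards (an assumption for illustration, not values we measured):

```python
# A rough estimate of peak memory bandwidth for the cards compared here.
# Bandwidth (GB/s) = effective memory clock (MHz) * bus width (bits) / 8 / 1000.
# The clock and bus figures are the commonly cited reference specs
# (assumed for illustration, not measured by us).

def bandwidth_gb_s(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s (decimal gigabytes)."""
    return effective_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9

cards = {
    "GeForce 6600 GT":       (1000, 128),  # 500 MHz GDDR3, 128-bit bus
    "RADEON 9800 XT":        (730,  256),  # 365 MHz DDR, 256-bit bus
    "GeForce FX 5950 Ultra": (950,  256),  # 475 MHz DDR, 256-bit bus
}
for name, (clock, bus) in cards.items():
    print(f"{name}: {bandwidth_gb_s(clock, bus):.1f} GB/s")
```

Even at its much higher memory clock, the GeForce 6600 GT's 128-bit bus leaves it with roughly two thirds of the RADEON 9800 XT's theoretical bandwidth, which is exactly where the high-resolution and FSAA advantage of the older cards should come from.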
Besides the above-mentioned cards we also included a GeForce 6800 GT with the PCI Express interface and a RADEON X600 XT. The latter is in fact promoted by the ATI camp as a direct competitor to the GeForce 6600, which works at lower frequencies than the 6600 GT.
Unfortunately, we didn't have a mainboard with both AGP and PCI Express slots, so we had to run the tests of the GeForce FX 5950 Ultra and RADEON 9800 XT on our AMD64 platform. Their results can thus only be compared indirectly. Our test platforms were configured as follows:
The Intel Prescott platform:
The AMD Athlon 64 platform:
We used the following benchmarks and games:
First Person 3D Shooters:
Third Person 3D Shooters:
As usual, we selected the settings in each game that would produce the best-looking picture on the screen. The settings were identical for all the tested cards. We had disabled the anisotropic and tri-linear filtering optimizations for ForceWare 61.77 before running the tests. The other version of the driver from NVIDIA, ForceWare 65.76, needs one comment.
The developer renamed the Anisotropic optimization option to Anisotropic mip filter optimization and added an Anisotropic sample optimization option. The first setting has really just changed its name; its key point remains the same as in previous versions of ForceWare – if enabled, it replaces tri-linear with bi-linear filtering on all texture stages save for the first one. This brings a certain performance gain at the cost of a practically unnoticeable image quality loss. The new option, Anisotropic sample optimization, performs a series of optimizations of texture sampling, probably reducing the number of texture lookups where it can be done without compromising the image quality much; again, this option doesn't tamper with the first texture stage.
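The mip-filter behavior just described can be sketched in a few lines (a simplified illustration of what the option does as we understand it; the function and string values are ours, not NVIDIA's driver code):

```python
def select_mip_filter(stage: int, requested: str, mip_opt_on: bool) -> str:
    """Return the mip filter actually applied to a texture stage.

    With the optimization enabled, trilinear filtering is silently
    downgraded to bilinear on every texture stage except stage 0
    (the base texture), which keeps full trilinear quality.
    """
    if mip_opt_on and requested == "trilinear" and stage != 0:
        return "bilinear"
    return requested

# The base texture stage keeps trilinear; secondary stages drop to bilinear:
print(select_mip_filter(0, "trilinear", True))   # trilinear
print(select_mip_filter(1, "trilinear", True))   # bilinear
```

Since the base texture dominates what the eye actually sees, downgrading only the secondary stages is why the quality loss is so hard to spot in practice.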
We are glad to see that the control over all the optimizations is in the hands of the user. NVIDIA plays fair and trusts the user – we think the user community is going to value this.
Optimizations are always on in ATI’s Catalyst driver – you can only disable them by performing some complex manipulations over the system registry. So we left the ForceWare optimizations on to give the participants equal chances. According to NVIDIA’s recommendations, its products are the closest to the competitors in terms of optimizations if you enable Trilinear optimization and Anisotropic sample optimization, but turn off the Anisotropic mip filter optimization.
We followed this recommendation, but made a few screenshots before the tests to check out how the image quality depended on the enabled Anisotropic sample optimization. We took the screens in Far Cry, Doom 3, Painkiller, Max Payne 2 and Counter-Strike: Source Beta.
So, the screenshots made on the GeForce 6600 GT with disabled optimizations are on the left; the visual effect of the Trilinear optimization and Anisotropic sample optimization is shown in the middle; and the result of enabling all of the texture optimizations is displayed on the right:
Well, it's really hard to see any difference in the static screenshots with or without the optimizations. Sometimes the image quality degrades quite visibly, though. The images from Max Payne 2 suffered the most: the textures became sharper but noisier after we had enabled the first two optimizations, and the Anisotropic mip filter optimization also added noticeable lines between MIP levels.
Now we can run our tests to evaluate the strength of NVIDIA's new GPU in real-life gaming applications. Traditionally, first-person 3D shooters open up the testing session – this is really the most popular and widespread game category.
The new GPU took an excellent start, leaving all the competitors behind. The traditionally well-written driver and the high clock rate of the graphics processor are the GeForce 6600 GT’s winning factors. The GeForce 6600 looks somewhat less impressive, but anyway matches the performance of its slightly more expensive market competitor, the ATI RADEON X600 XT.
Antialiasing combined with anisotropic filtering adds to the load on the memory subsystem, so the GeForce 6600 GT is less brilliant here. Anyway, it delivers worthy results up to 1280x1024 – very good for a graphics card of its class.
Although the new GPU yields a lower fps rate than the RADEON 9800 XT or the GeForce FX 5950 Ultra, we can’t call its result disappointing, considering its market positioning and a huge potential for a price reduction.
This title was in the works for half a decade, and the development process came to an end only recently. The wait seems to have been worth it – the game is a real blockbuster. It is a classic horror shooter, not encumbered with brain work like Half-Life 2, for example: your fate depends mostly on reaction speed and a steady hand rather than on puzzle-solving skills. Unlike the majority of other 3D games, Doom 3 uses id Software's favorite OpenGL API and is extremely voracious about system resources at large and those of the graphics card in particular. This is exactly what makes it an excellent benchmark. The game allows recording demos and playing them back in the timedemo mode. So let's see what the GeForce 6600 GT has to show us in this glorious shooter:
The GeForce 6600 GT puts on an excellent performance in Doom 3 – it is much faster than the mainstream graphics cards, faster than the RADEON 9800 XT and just a little slower than the GeForce 6800 GT. This result is partially thanks to id Software, who seem to have tuned their new game (with its ubiquitous shadows) specifically for NVIDIA's GeForce 6600/6800 series cards (with their UltraShadow technology).
There’s only one difference in the “eye candy” mode – the gap between the GeForce 6600 GT and the 6800 GT is wider due to the latter’s having a 256-bit memory bus.
All the above is also true for the d3dm4 deathmatch level. The fps rates are overall higher here, since there are no monsters on deathmatch maps.
Again, the GeForce 6600 GT is only inferior to its elder brother, the GeForce 6800 GT.
It's more difficult to compare the GeForce 6600 GT with the "oldies" in this game since Unreal Tournament 2004 has a liking for the AMD platform we tested the GeForce FX 5950 Ultra and RADEON 9800 XT on. All the cards hit the ceiling set by the central processor of the system, but this ceiling is somewhat higher with the AMD Athlon 64 3400+ CPU than with the 3.6GHz Prescott-core Pentium 4.
In the eye candy mode the above-mentioned ceiling only presses down on the cards in 1024x768 resolution. In other display modes the GeForce 6600 GT is somewhat better than the old-generation cards, despite its 128-bit memory bus.
It's different on the Metallurgy level: the level itself is simpler, and the performance ceiling is higher, so the GeForce 6800 GT is the only graphics card to reach it (in 1024x768 – along with the RADEON 9800 XT and GeForce FX 5950 Ultra). The GeForce 6600 GT delivers good fps rates from 1280x1024 up.
The new GPU feels all right in the eye candy mode, too, outperforming all its competitors despite its narrow 128-bit memory bus.
Having a smaller memory bandwidth but a higher fill rate, the GeForce 6600 GT gains the upper hand over the RADEON 9800 XT and GeForce FX 5950 Ultra in higher resolutions. It provides playability in 1280x1024, which is a kind of basic requirement for a modern graphics card. The GeForce FX 5950 Ultra looks bad in this visually advanced game and even falls behind the GeForce 6600.
The performance of the GeForce 6600 GT is comparable to that of the RADEON 9800 XT; that’s an achievement for a graphics card with a 128-bit memory bus, especially considering the high pixel shader performance of the ATI chip.
We should explain that we always choose the most efficient rendering method for each graphics processor, so we select the Shader Model 3.0 path for all cards of the GeForce 6 architecture. It certainly gives them a small speed bonus.
The GeForce 6600 GT is slightly faster than the RADEON 9800 XT here. This is again an excellent result, considering the higher load (full-screen antialiasing and anisotropic filtering) on the memory bus in this mode.
The GeForce 6600 GT loses the lowest resolution to the RADEON 9800 XT, but then the two are racing neck and neck. The GeForce FX 5950 Ultra suffers a defeat, “thanks” to its inefficient architecture.
The eye candy mode on the Research level proved a hard nut for the GeForce 6600 GT to crack. The RADEON 9800 XT makes good use of its 256-bit memory bus and efficient memory bandwidth utilization techniques to achieve a better result.
Painkiller feels better on the AMD platform, which sets a higher ceiling for the graphics cards, as the RADEON 9800 XT confirms in 1024x768. But if we compare the numbers in higher resolutions, the GeForce 6600 GT seems preferable. Even the plain GeForce 6600 easily achieves a playable fps rate.
Brisk in the lowest resolution, the GeForce 6600 GT soon falls behind the GeForce FX 5950 Ultra and the RADEON 9800 XT. We blame the same 128-bit memory bus again – it is the real bottleneck that prevents the graphics card from performing better. Well anyway, the speed of the GeForce 6600 GT is enough for comfortable play.
Counter-Strike is probably the most popular multiplayer shooter in the world. Until recently, this game was based on the rather obsolete Half-Life engine, which couldn't use the capabilities of modern graphics cards, but now Valve has released the beta version of Counter-Strike: Source with the next-generation engine from the Half-Life 2 project, the sequel to the adventures of the scientist Gordon Freeman.
The new engine uses the newest advances in the graphics field to bring us to a new level of realism. Like Doom III, Counter-Strike: Source Beta allows recording and playing demos and offers the timedemo mode for benchmarking a graphics card. Here’re the results we got with it.
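For readers unfamiliar with the method: a timedemo run replays a recorded demo as fast as the card can render it and reports total frames divided by elapsed time, which is what makes the runs repeatable and comparable between cards. A minimal sketch of that computation (the numbers below are made-up examples, not our results):

```python
def timedemo_fps(frames_rendered: int, elapsed_seconds: float) -> float:
    """Average frame rate of a timedemo run: total frames / wall-clock time."""
    return frames_rendered / elapsed_seconds

# e.g. a 2,000-frame demo replayed in 25 seconds:
print(f"{timedemo_fps(2000, 25.0):.1f} fps")  # 80.0 fps
```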
Note: we used slightly different demo records with PCI Express graphics cards since the game engine had already updated itself through the Valve Steam system, making the older versions of the records unplayable.
The results are only indicative of the strength of the graphics cards from 1280x1024 resolution up, as the CPU seems to be the speed limiter in 1024x768. The GeForce 6600 GT did well in this test, performing similarly to the RADEON 9800 XT. The cheaper GeForce 6600 is ahead of the RADEON X600 XT.
The GeForce 6600 GT is excellent in the eye candy mode, slightly losing to the RADEON 9800 XT in 1600x1200. Somewhat to our surprise, the GeForce FX 5950 Ultra takes the lead, but we shouldn’t forget that this game uses DirectX 8.1 with this graphics card, so its results cannot be compared directly to those of the cards that use all the capabilities of the new engine.
NVIDIA's new GPU performs much like the RADEON 9800 XT in the second demo record.
We enable full-screen antialiasing and anisotropic filtering to see the GeForce 6600 far ahead of the RADEON 9800 XT. The gap diminishes in higher resolutions, though, as the GeForce is once again impeded by its slow memory subsystem.
The GeForce 6600 GT shows itself a highly competitive product in this upcoming game, notching a result close to the RADEON 9800 XT. The numbers in 1024x768 resolution shouldn’t be considered really adequate as we used different testbed configurations.
The GeForce 6600 GT looks most convincing on the second level of the game.
The new GPU from NVIDIA again pleases us with its high speed in the second upcoming game we use as a test. Sometimes it even leaves the GeForce 6800 GT behind, probably due to its higher core frequency.
The triumph doesn’t last long, though. With full-screen antialiasing and anisotropic filtering enabled, the newcomer finds itself behind the RADEON 9800 XT. The gap only grows wider in high resolutions, and that’s quite natural.
It’s all the same on the second level of this game as on the first one, only the GeForce 6600 GT cannot surpass the GeForce 6800 GT in any of the display modes.
The load becoming heavier, the GeForce 6600 GT again loses to the RADEON 9800 XT, and again due to the lack of the memory bandwidth.
Overall, the GeForce 6600 GT looks fine in Splinter Cell: Pandora Tomorrow. Its high core frequency and efficient NV40 architecture would do more if it were not for the narrow memory bus.
Overclocking is rewarding here as the game engine is most sensitive to the frequencies of the graphics card.
There’s nothing to comment on, really. The new GPU architecture is superior. In 1600x1200 the gap between the RADEON 9800 XT and the GeForce 6600 GT is small, though.
It's different in this game from what we have seen earlier. The RADEON 9800 XT is better than the GeForce 6600 GT here, but keep in mind that we test these cards on two different platforms. Moreover, Hitman: Contracts is a heavy burden even for the GeForce 6800 GT. Overclocking the GeForce 6600 GT brings a truly gargantuan performance gain!
This game used to be a catastrophe for the GeForce FX family of chips. Now it puts the GeForce 6 architecture on top: the GeForce 6600 GT with its narrower memory bus leaves behind the RADEON 9800 XT as well as the GeForce FX 5950 Ultra. Naturally, the RADEON X600 XT can hardly be considered a worthy rival to the GeForce 6600, not to mention the GeForce 6600 GT.
Incredible things happen in the eye candy mode: the GeForce 6600 GT is far ahead of the RADEON 9800 XT despite the latter's twice as wide memory bus! The GeForce 6600 defeats the RADEON X600 XT in their local fight.
Like Splinter Cell, this game scales well with the performance of the graphics card. However, the GeForce 6600 GT cannot reach the level of the RADEON 9800 XT, and the GeForce 6600 somewhat unexpectedly loses to the RADEON X600 XT.
The graphics cards of the older generation run this game faster than the GeForce 6600 GT in low resolutions; the positions are then reversed. Whoever the leader, all the tested cards did well in Max Payne 2.
The RADEON 9800 XT even beats the GeForce 6800 GT after we enable full-screen antialiasing and anisotropic filtering.
This flight sim becomes a most demanding game at its maximum graphics quality settings. It favors modern graphics cards with a high fill rate and capable of fast processing of pixel shaders. We’ve got two such cards in here, the GeForce 6600 GT and RADEON 9800 XT, and they race along together, the GeForce being somewhat faster.
Its narrow memory bus becomes a limiting factor in 1600x1200, though. The RADEON 9800 XT wins this resolution, although by a tiny margin of 1.7 frames per second. Dragged down by its slow memory, the junior GeForce 6600 model still saves face – it is a worthy competitor to the RADEON X600 XT.
We enable FSAA and AF to see what we expected: the GeForce 6600 GT lacks memory bandwidth and hands its first place over to the RADEON 9800 XT in 1280x1024. The GeForce 6600 looks surprisingly good with its eight pixel pipelines; its results are close to those of the GeForce FX 5950 Ultra. Still, we should note that even the GeForce 6800 GT can hardly provide a playable fps rate in IL-2 in the eye candy mode.
Lock On seems to be less advanced graphically than the latest version of IL-2 Sturmovik, but it is anyway a most demanding gaming application.
The GeForce 6600 GT is capable of keeping pace with the old-generation graphics cards as well as with the GeForce 6800 GT itself! The GeForce 6600 continues its local race with the RADEON X600 XT – it would certainly enjoy a few extra megahertz of GPU or memory frequency.
Despite its narrower memory bus, the GeForce 6600 GT keeps abreast of the RADEON 9800 XT in the eye candy mode. The results of the GeForce 6600 and the RADEON X600 XT are practically the same.
The GeForce 6600 GT proves that it can output a higher fps rate than the ex-leaders like the RADEON 9800 XT.
It’s the same in the eye candy mode, save for 1600x1200 resolution where the new GPU lacks the memory bandwidth. The GeForce 6600 makes short work of the RADEON X600 XT, notwithstanding the big difference in their frequencies.
It is the difference between the two test platforms that we see in the first two resolutions. In 1600x1200 the leading RADEON 9800 XT is closely followed by the GeForce 6x00 family.
The use of full-screen antialiasing and anisotropic filtering helps differentiate between the cards. The GeForce 6600 GT and the RADEON 9800 XT match each other, although with a certain advantage on the side of the former.
The 3D real-time strategy game Perimeter is probably the most demanding game in our tests. Even top-end graphics cards often falter here.
As you see, it only makes sense to comment on the 1024x768 display mode. The RADEON 9800 XT wins in the mainstream sector and is followed by the GeForce 6600 GT (overclocking even makes it the leader). The game seems to have a fancy for the RADEON architecture – the RADEON X600 XT surpasses the GeForce 6600.
It’s all the same here as in the pure speed mode, but none of the graphics cards provides the bare minimum of playability.
Free from complex pixel shaders, Aquamark 3 traditionally likes NVIDIA’s GPUs which are good at processing geometrical data and at rendering scenes with a high overdraw count.
The GeForce 6600 GT is the best here, leaving the RADEON 9800 XT behind. The GeForce 6600 is slightly better than the RADEON X600 XT.
With all the eye candy enabled, the GeForce 6600 GT only lets the RADEON 9800 XT overtake it in 1600x1200.
This is a synthetic test based on the Half-Life 2 engine. It is well suited for benchmarking pixel shader performance, as the engine's shader-based effects are shown throughout the entire test. Although short, the test fully loads even the most advanced graphics cards, which is why we have added it to our list of benchmarks.
Again, there are many complex pixel shaders in this test, and the GeForce 6600 GT, being a new-generation GPU, handles them well, outperforming the RADEON 9800 XT which was once considered the best executor of shaders. The GeForce 6600 wins the race in its class, leaving the RADEON X600 XT behind.
The new generation wins the test under the higher load, too, although the lack of the memory bandwidth is felt in 1600x1200.
This test focuses on pixel shader performance, so its results suggest that executing such shaders is among the strong points of the GeForce 6600 GT, which will give it an advantage over last year's high-end products in today's and tomorrow's games.
As you see, NVIDIA's claims about the GeForce 6600 GT scoring 8 thousand points turn out to be true: our card is just a little below that result, which may improve as the i925X platform itself matures. Overclocked to 580/1100MHz, the card jumped over the 8,500-point barrier – an impressive feat for a mainstream solution. The RADEON 9800 XT is left far behind. As for the cheaper GeForce 6600 card, it is at a lower level, although ahead of the RADEON X600 XT.
Let’s see what the triumph of the GeForce 6600 GT is made out of:
The GeForce 6600 GT gets a good score in the first game test – roughly the same as the GeForce FX 5950 Ultra which has always felt at its ease in this test, free of pixel shaders.
The newcomer feels all right with FSAA and AF, but the old-generation cards are helped by their wider memory bus in 1600x1200.
UltraShadow II technology makes the new GPU the unrivalled winner of this test.
The higher load cannot shatter the positions of the GeForce 6600 GT.
The second and third tests are technically much alike, so we have the same situation here.
The same is true for the eye candy mode – the GeForce 6600 GT wins it.
The fourth game test abounds in pixel shaders and has always been favorable towards ATI’s GPUs. This time, however, the GeForce 6600 GT confirms its superiority. The new architecture shows its best here – the graphics card with a slower memory subsystem beats a most serious enemy.
After we enabled full-screen antialiasing and anisotropic filtering, the new graphics card's advantage shrank, but it is never inferior to the RADEON 9800 XT. NVIDIA's new GPU was on the whole successful in 3DMark03 – notwithstanding its 128-bit memory bus, it scored well in almost all the subtests to get a nice overall sum.
As you have seen, the new family of mainstream graphics processors from NVIDIA has done well in real-life applications. In short, the GeForce 6600 GT definitely delivers more performance than the RADEON 9800 XT, and also provides more functionality.
Meanwhile, in the lower class, the RADEON X600 XT can still compete with the GeForce 6600 in terms of speed, but the latter seems to have a higher flexibility in terms of price.
We put the results of the gaming benchmarks all together into diagrams to get a notion of the performance level of the tested cards at a glance:
So, in its competition with the RADEON 9800 XT, the GeForce 6600 GT wins 22 tests out of 32 in 1280x1024 resolution and 25 out of 32 in 1600x1200. This is just excellent for a graphics card with a twice-narrower memory bus: its advanced architecture and higher clock rates proved to be of more weight, and the GeForce 6600 GT defeated the ex-leader RADEON 9800 XT in all the modern games like Doom 3, Unreal Tournament 2004, Far Cry, Counter-Strike: Source and others.
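For reference, those win counts translate into the following shares (plain arithmetic on the figures above):

```python
def win_rate(wins: int, total: int) -> float:
    """Share of benchmark tests won, as a percentage."""
    return 100.0 * wins / total

for resolution, wins in (("1280x1024", 22), ("1600x1200", 25)):
    print(f"{resolution}: {wins}/32 tests won ({win_rate(wins, 32):.0f}%)")
```

In other words, the GeForce 6600 GT takes roughly seven or eight of every ten pure-speed tests from the RADEON 9800 XT, and its lead actually grows in the higher resolution.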
The main accomplishment of the new GPU is its extremely high speed in games rich in complex geometry and shader-based effects. That is, the GeForce 6600 GT is perfectly suited for the kind of work it is supposed to perform. There will be ever more such “heavy” games in the future, so NVIDIA’s new mainstream GPU seems a better buy than the ex-top-end RADEON 9800 XT.
As for the junior, the GeForce 6600 model, it has pocketed a few victories, too. We can claim it is faster than the RADEON X600 XT, but don’t forget that it will have to face a much more dangerous enemy, the RADEON X700.
The results of the new products in the eye candy mode (4x full-screen antialiasing and the maximum possible level of anisotropic filtering) do not disappoint, either. They have 17 victories out of 25 possible in 1280x1024 resolution. However, the negative effect of the 128-bit memory bus is more perceptible here than in the pure speed mode. In the highest resolution, 1600x1200, the GeForce 6600 GT doesn’t boast any advantage over its closest competitor, which has a wider memory bus and uses efficient memory bandwidth utilization algorithms.
The new family of mainstream graphics processors from NVIDIA deserves our praise: for all their simplicity, these GPUs have proven capable of competing with time-tested combatants like the RADEON 9800 XT and the GeForce FX 5950 Ultra. In fact, the GeForce 6600 GT is an ideal GPU of today, since it gives you all the things you need:
Among the potential disadvantages we should probably note only the 128MB of memory on the GeForce 6600 GT and its 128-bit memory bus. However, since these two factors hardly hold the GeForce 6600 GT back at present, let us wait and see whether they will really hamper this terrific graphics processor in the future.
Unfortunately, the GeForce 6600 GT and the GeForce 6600 currently exist in PCI Express implementations only. AGP versions are not impossible, as the AGP-PCI Express bridge NVIDIA uses in its other products can work in either direction. Nothing prevents NVIDIA from installing such a bridge on a card with native PCI Express support so that the result works in an AGP slot. Demand for AGP versions of the GeForce 6600 GT is forecast to be high, so the probability of such versions appearing is high, too. Everything is in the hands of NVIDIA and its partners, as the GPU maker must first provide at least the reference design of such a card.
Market availability was a rather pressing problem for the GeForce 6800 family, but it shouldn't be for the new GPUs. The NV43 (GeForce 6600) is a simpler chip than the NV40 and is produced at TSMC's facilities rather than at IBM's, like the elder model. So we think no shortage should occur.
We want to specifically note the pioneering character of NVIDIA's release. The GeForce 6600 family chips are in fact the first mainstream graphics processors intended for the PCI Express platform. Until now, this market niche was practically empty: versions of the GeForce FX with the HSI bridge couldn't boast a high level of performance, while the RADEON X300 and X600 were nothing more than a reincarnation of the RADEON 9600, which was no speedy racer, either. Thus, NVIDIA has pioneered the development of mainstream PCI Express graphics adapters priced at $200 and performing on the level of the RADEON 9800 XT.
NVIDIA's breakthrough won't go unnoticed by its main rival, ATI Technologies. The RADEON X700 family is ready to join the fight, so it's too early yet to summarize the results of this round.