ASUS RADEON 9800 XT and LeadTek WinFast A380 TDH Ultra: The Battle for AquaMark3

There are just a few graphics benchmarks the industry widely recognizes. Earlier this year we tested graphics cards in 3DMark03 to find a champion worthy of the 3DMark03 cup. Today new titans fight each other in AquaMark3, yet another tool for measuring 3D performance. The GeForce FX and the new RADEON VPUs participate in the ultimate clash.

by Alexey Stepin
11/30/2003 | 08:30 PM

3DMark 2001SE and 3DMark03 used to be the basic benchmarking tools of any hardware tester. Once sharp and trustworthy, these two tools have somewhat lost their edge nowadays. The former is rather obsolete and of little use for testing modern graphics cards. The latter, although it uses DirectX 9.0 features, has produced too much controversy and criticism to become a universal and unanimously adopted standard. Well, no benchmark is. But some aspire to be. Last September, Massive Development introduced its new benchmarking suite to the public. It is called AquaMark3.

 

The new version of the well-known product is based on the krass engine. This engine powers games like AquaNox and AquaNox 2: Revelation and is going to power Spellforce, which is still under development. Thus, the first advantage of AquaMark3 is obvious: it uses a real engine that real games are built on. Moreover, the scenes involved in the benchmark are in fact gaming scenes from AquaNox 2, only more sophisticated.

The krass engine supports DirectX 9.0, particularly version 2.0 pixel shaders, and is thus well suited for testing the performance of modern gaming graphics cards, most of which support DirectX 9.0, too. The game scenes you see in AquaMark3 are simply beautiful: they are all underwater, showcasing a whole range of effects and features that the benchmark displays and tests.

Overall, there are nine scenes that put every aspect of the graphics card under close scrutiny. Unlike the relatively featureless tests from 3DMark, each of these scenes creates a definite type of workload, so we can examine the efficiency of each graphics card in more detail. Every graphics card we test today is going to produce ten numbers: nine scene-specific scores and one showing the overall performance of the product.

It is unfortunate that the developer of AquaMark3, unlike Futuremark Corporation, doesn't reveal the technical properties of the scenes, so it is rather hard to comment on particular scenes in much detail.

AquaMark3 in Detail

The suite offers two curious modes: AquaMark3 OVIST (Overdraw Visualization Technique) and AquaMark3 SVIST (Shader Visualization Technique). The first of them allows you to estimate visually the workload on the graphics card and the complexity of the scene.

This visualization looks funny enough, resembling the thermal vision of the Predator from the namesake movie: parts of the image with a high overdraw coefficient are rendered in “warmer” colors. The brightest object in the AquaMark3 OVIST mode is the refuse trickling out of the drain pipe in the High Particle Count scene.
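
To make the idea of an overdraw heat map more concrete, here is a minimal Python sketch of how such a visualization can be produced in principle. This is our own illustration, not Massive Development's code: every fragment that lands on a pixel increments a counter, and the counter is then mapped onto a cold-to-hot color ramp.

```python
import numpy as np

def overdraw_heatmap(fragments, width, height):
    """Count how many times each pixel is written and map the count
    to a simple cold-to-hot (blue -> red) color, in the spirit of OVIST."""
    counts = np.zeros((height, width), dtype=np.int32)
    for x, y in fragments:                 # every fragment that passes rasterization
        counts[y, x] += 1
    # normalize the counts and blend between blue (low) and red (high overdraw)
    t = counts / max(counts.max(), 1)
    image = np.zeros((height, width, 3), dtype=np.float32)
    image[..., 0] = t                      # red channel grows with overdraw
    image[..., 2] = 1.0 - t                # blue channel fades away
    return image

# Toy usage: three overlapping "particles" hitting the same pixel
pixels = [(10, 10), (10, 10), (10, 10), (11, 10)]
print(overdraw_heatmap(pixels, 32, 32)[10, 10])   # pure red: the highest overdraw
```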

AquaMark3 SVIST serves a similar purpose, but the "S" stands for "shader". Version 2.0 shaders are rendered in red, version 1.1 and 1.4 shaders in yellow, and ordinary, "shaderless" parts of the scene in blue.

Besides that, AquaMark3 can automatically take screenshots at user-defined intervals, for example once every 10, 20, 100 or 1000 frames.

The AquaMark3 PIXPM mode (Pixel Performance Measurement) is for an in-depth examination of the pixel shader performance. It may come in handy for a detailed study of a specific VPU.

By the way, the developer declares that the benchmark contains about 30 pixel shaders and 200 vertex shaders, each of them different. That is a good workload for any modern graphics card. AquaMark3 is designed for DirectX 9-compatible hardware, but can fall back to less complex shaders on previous-generation graphics cards, too. As a result, this benchmarking package can hardly serve to compare graphics cards of different generations directly, since they do not run the same shader code.

AquaMark3 carefully files the results away in the My Documents/AquaMark3 folder, with a separate subfolder for each test cycle. You can also find the settings and initialization files there, as well as the EULA file, which is clear evidence that the developer tried to take every trifle into account. You don't realize how useful and time-saving those trifles are until you have to benchmark a dozen different graphics cards.

Unfortunately, there is a minor defect in AquaMark3. If you check the "Use all" checkbox in the Resolutions panel, the program includes the 640x480 and 800x600 modes in the test cycle. These resolutions are used very rarely nowadays, and the two extra passes only waste your time. I think it would be better if the user could decide for himself which resolutions to use. This goes for other settings like full-screen anti-aliasing and anisotropic filtering, too. I could point to 3DMark 2001 as an example of a well-organized benchmarking suite in this respect.

Some users may also grumble about the graphics-heavy menu system of AquaMark3. All those animated buttons and other “bells and whistles” may be annoying, but after all it is a matter of taste. :)

Testbed, Methods and Settings

We gathered eight mainstream and high-end graphics cards to benchmark them in AquaMark3. Who's going to win the battle? The list of the testing participants includes:

And that’s the battlefield:

We used the new drivers from NVIDIA to show you how heavily the company's GPUs depend on the correct operation of the software shader compiler. The software solution employed in the ForceWare driver allows working around the numerous bottlenecks of the NV3x architecture (we covered this matter in our eVGA e-GeForce FX 5700 Ultra review, ATI RADEON 9600 XT vs. NVIDIA GeForce FX5700 Ultra, and in our NVIDIA GeForce FX 5950 Ultra review, NVIDIA GeForce FX 5950 Ultra against ATI RADEON 9800 XT: Shader Wars).

We will get to the benchmarks a little later. For now, let me introduce two fresh products that have recently entered the high-end league. The late kings have left their throne to hail the ASUS RADEON 9800 XT and the LeadTek A380 TDH Ultra.

ASUS RADEON 9800 XT: Masterpiece or a Graphics Card?

There was a time when ASUS Computer produced a wide range of graphics cards on different GPUs, including ones from 3dfx and ATI Technologies. In fact, the first graphics cards from ASUS were based on controllers from ATI. But by the end of the 1990s, the two graphics companies focused on their own graphics card businesses and lost their leadership in the 3D GPU field to NVIDIA. Times have changed: 3dfx is gone, while ATI Technologies is challenging NVIDIA's supremacy, and quite successfully! Every member of the "big four" (ASUS, ECS, MSI and Gigabyte), as well as a number of smaller companies, has released products on ATI chips over the past year.

Products from ASUS are known for their high production quality, which comes at a corresponding price. That applies exactly to the graphics card we are going to review in detail today.

The package of the ASUS RADEON 9800 XT/TVD surprises you with its huge size. You usually see boxes like that at computer exhibitions or in the storefronts of big PC shops, but this time it is the standard package of the new card from ASUS. It is about half a meter wide and 32cm high. Just take a look:

That's impressive, isn't it? The design of the box can be called classic: fantasy-style characters are fashionable nowadays. The front of the box depicts a buxom girl smothered in jewelry. There is also a sticker giving you some information about the product, particularly the amount of graphics memory. The cover with the girl folds back, revealing a transparent window through which you can see the card. Opening the box, we found the following items provided by ASUS:

The CD box deserves mentioning separately. It contained the following:

The video-editing software is most appropriate, since this graphics card offers VIVO functionality. For a description of the exclusive utilities from ASUS you can refer to our ASUS V9950 Graphics Card Review. I will only describe Smart Doctor II, the most important component of the hardware monitoring system.

Would-be video directors don't have to creep under their desks to reach the back panel of the PC to connect a camcorder or a VCR. The VIVO unit included in the package can be attached anywhere with the help of a special sticker. Unfortunately, very few manufacturers show the same tender loving care for the customer.

ASUS RADEON 9800 XT: Closer Look

So, I took the card out of the box and gasped: it looks more like a jewel than an ordinary graphics card. The card looks quite extraordinary with its figured silver plate mounted on the cooler and gold heatsinks that match the PCB color.

 

The cooling solution is called ASUS Smart Cooling. This well-designed system traces its origin to the cooler installed on the ASUS V9950, but goes a step further. A flat heat pipe takes the heat from the copper plate that contacts the VPU die and the memory chips to the heatsinks. The heatsinks are not part of the plate; they are screwed to it. Two fans blow at these heatsinks; one of them is equipped with a rotation sensor that Smart Doctor II uses. The engineers didn't forget about the memory chips on the back side of the PCB either: a nicely shaped copper plate serves as a heatsink for them. The whole cooling system is reliably fixed in place: the front and back parts of the cooler are fitted together with spring-loaded screws, which prevents any damage to the fragile VPU die. A white thermal paste with good thermal properties serves as the thermal interface. ASUS Smart Cooling is very efficient and deserves all our praise.

The manufacturer didn't just use the reference design from ATI, but changed it a little. The difference becomes clear when you look at the back side of the PCB. Instead of the square-shaped Rage Theater 200 chip, ASUS installed the older rectangular Rage Theater chip, which required some changes in the PCB wiring layout.

The memory installed on the card comes from Hynix. The chips have 2.5ns access time and work at 365MHz (730MHz DDR), although their rated frequency is 400MHz (800MHz DDR). The total amount of graphics memory is 256MB. The VPU works at 412MHz. There is one peculiarity, though: the dynamic overclocking function, OVERDRIVE, doesn't work with this card, just as ASUS Smart Doctor II refuses to run on graphics cards that follow ATI Technologies' reference design. Choosing between the two, I would say the owner of the ASUS card shouldn't be disappointed. OVERDRIVE is of little use, as it provides only a small performance gain, and the VPU frequency goes down once the critical temperature is reached. In my opinion, the ASUS Smart Doctor II hardware monitoring system is much more useful.
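
As a rule of thumb, the rated clock of a DRAM chip is roughly the reciprocal of its access time. The tiny sketch below only illustrates that approximation; it is not a figure taken from a Hynix datasheet.

```python
def rated_frequency_mhz(access_time_ns: float) -> float:
    """Rough rated clock of a memory chip, assuming one access per clock cycle."""
    return 1000.0 / access_time_ns

# 2.5ns chips on the ASUS RADEON 9800 XT/TVD
print(rated_frequency_mhz(2.5))        # 400.0 MHz real clock
print(rated_frequency_mhz(2.5) * 2)    # 800.0 MHz effective (DDR transfers twice per clock)
```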

ASUS Smart Doctor II – Keeping ASUS RADEON 9800 XT/TVD in Good Health

The new version of the exclusive hardware monitoring system from ASUS can do a lot: it keeps track of the fan rotation speed and monitors the voltages as well as the temperatures of the GPU and memory. This is what the interface of Smart Doctor II looks like:

The settings are flexible enough. You can enable the overheat protection system and set the threshold values at which the alarm goes off. You can also make the fan speeds depend on the graphics chip temperature. I think this is a better and more functional monitoring and protection system than the one offered by ATI Technologies. Besides that, Smart Doctor can overclock the card, although within a narrow frequency range: up to 440MHz for the GPU and 770MHz for the memory. Interestingly, the model of the graphics card is shown as "A9800XT"; such short names are typical of ASUS products.

ASUS RADEON 9800 XT: 2D Quality and Noise

A parameter like 2D image quality is losing its importance as LCD monitors with a digital (DVI) interface become ever more common. Anyway, the ASUS RADEON 9800 XT/TVD yielded a crystal-sharp picture in all resolutions up to 1800x1440@75Hz.

As for the noise produced by the cooling system of the card, you can hear it, but only under ideal testing conditions. In real situations, its 5,200rpm fans are completely lost against the roar of other system components. Moreover, you can adjust the fan speed in the Smart Cooling system or enable the mode in which the speed changes depending on the temperature. ASUS Smart Cooling is probably the best cooling and monitoring system of today. The systems from Tyan perhaps offer similar functionality, but they are a bit noisier.

ASUS RADEON 9800 XT: Overclocking

Overclockability is a major attraction for the enthusiast. So, we decided to check the ASUS RADEON 9800 XT/TVD at higher frequencies, having first installed an additional 120mm fan as a preventive measure. The memory worked stably at 385MHz (770MHz DDR), without even reaching its own rated specs (400MHz, or 800MHz DDR). A simple truth was confirmed once again: low access time alone is not enough for memory to run at high frequencies; a proper layout is also required. I think it was the overly simple wiring that slightly spoiled our overclocking experience. The VPU overclocking went even worse. At a 430MHz VPU frequency, the water surface in 3DMark03: Mother Nature showed the broken tessellation much like what the unlucky owners of the RADEON 9500 saw after trying to remake it into an eight-pipeline RADEON 9700. The artifacts only vanished after the core frequency was reduced to 415MHz, a mere 3MHz above the nominal. Thus, I have to confess this graphics card is not suitable for overclocking.

ASUS RADEON 9800 XT: Conclusion

The ASUS RADEON 9800 XT/TVD is a high-quality product with an exclusive exterior and excellent cooling and hardware monitoring systems. To cap it all, it offers VIVO functionality. If you want to be at the leading edge of technology, this is your card. There is only one factor that could make you reconsider: according to X-bit labs DealTime, you have to pay from $492 to $515 (at the time this review was written) for this product. Anyway, the excellent quality and nice accessory set earn it our quality mark, the X-bit labs Editor's Choice. It should be noted, though, that several other graphics cards from ASUS could aspire to this award, too, for example the V9950 we tested earlier this year, although it is a little slower (ASUS V9950 Graphics Card Review).

LeadTek WinFast A380 TDH Ultra: Tradition and Monumentalism

We already reviewed a graphics card from LeadTek that featured an original and powerful cooling solution. Now, we have got another product from this company, the LeadTek WinFast A380 TDH Ultra based on the NVIDIA GeForce FX 5950 Ultra (NV38) GPU. We will take a closer look at its cooling system shortly. Right now, let’s open its package and check what accessories are in there.

The package is easy on the eye with its soft tones. Unlike the package of the A350 TDH, the box of the A380 TDH Ultra follows the "fantasy" style, depicting some king or a distant relative of Gandalf the Gray. The rear side of the box lists the product features, system requirements and the bundled software. So, what do we have inside?

You can't call this a poor set. Moreover, it's the first time I have seen direct support for component video output. ATI's Rage Theater supports this function, too, but through a special adapter that is not included in the accessory set. LeadTek proved that it cares about its users by providing everything necessary with the product. Unfortunately, we couldn't test this function because we didn't have an HDTV set at our disposal.

Just as with the A350 TDH, the driver CD contains an overclocking and monitoring utility, WinFox 2.0, and a video-capture program, WinFast PVR, which supports Time Shifting and DirectBurn functions and can work in picture-in-picture mode. There are also a software player (WinFast DVD), Cult3D for publishing 3D content on the Web, and software for color correction (3Deep, Coloreal and Colorific).

LeadTek WinFast A380 TDH Ultra: Closer Look

Yes, the engineers from LeadTek evidently have a taste for the monumental. The cooling solution of the A380 TDH Ultra looks no less imposing than the one installed on the A350 TDH.

 

The GPU die and the memory chips are hidden under a big copper plate. Three folded copper strips are soldered to it and serve as cooling fins. The 60mm fan we see here is more typical of CPU coolers than of graphics cards. All this is covered with a bulletproof-looking casing carrying a protective grille. The casing is designed so that part of the air stream from the fan goes to the other side of the PCB and blows at the rear heatsink. That part of the cooling solution, responsible for the memory chips on the back side of the PCB, is no less massive: it is a huge aluminum heatsink with tall fins. Two plastic and two metal clips hold the entire construction firmly in place. This cooling system makes you forget about any overheating problems. A thick thermal paste with good thermal properties serves as the thermal interface between the cooler's base and the GPU die.

Bravo, LeadTek! We only have to check this cooling solution out in practice to see (or rather hear) whether it poses any danger to our ears.

The design of the PCB resembles the GeForce FX 5900 (Ultra) reference card with a few distinctions. They are most visible in the right part of the card, where the voltage regulators and other power circuitry reside: the new card works at higher frequencies and consumes more power. The left part of the PCB features a few changes too, but they are minor. A TMDS transmitter from Silicon Image (Sil164CT64) and a VIVO chip from Philips (SAA7108AE) are located there. The 2.0ns memory chips come from Hynix; sixteen of them make up the total of 256MB of graphics memory. The memory works at 475MHz (950MHz DDR). The GPU frequency is 475MHz, too, but only in 3D. When processing 2D graphics, it drops to 300MHz and the card slows the cooler down to reduce the noise.
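
For the record, here is the back-of-the-envelope arithmetic behind those memory figures. The per-chip capacity and the rated clock are inferred from the totals and the access time given above, not quoted from LeadTek.

```python
TOTAL_MEMORY_MB = 256     # total graphics memory on the A380 TDH Ultra
CHIP_COUNT = 16           # Hynix 2.0ns chips on the card
MEMORY_CLOCK_MHZ = 475    # actual memory clock

per_chip_mb = TOTAL_MEMORY_MB // CHIP_COUNT        # 16 MB, i.e. 128 Mbit per chip
effective_ddr_mhz = MEMORY_CLOCK_MHZ * 2           # DDR transfers data on both clock edges
rated_clock_mhz = 1000 / 2.0                       # 2.0ns access time suggests a 500 MHz rating

print(per_chip_mb, per_chip_mb * 8, effective_ddr_mhz, rated_clock_mhz)
# 16 128 950 500.0
```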

LeadTek WinFox 2.0 – Another Emergency System

We already described the WinFox 2.0 suite in our LeadTek WinFast A350 TDH Graphics Card Review, so there is no need to repeat it here. Please refer to that article for details.

LeadTek WinFast A380 TDH Ultra: 2D Quality and Noise

2D image quality was excellent, with the picture remaining crisp in all resolutions up to 1600x1200@85Hz. Noise was quite another matter. When the card is in 2D mode, it is practically noiseless, as the fan rotates slowly. But as soon as you launch any 3D application, the fan speeds up to its maximum. That would be all right if it were not for the quite indecent noise. Of course, this cooler cannot beat the notorious NVIDIA FlowFX, but the hiss of the air coming through the dense grille is not a pleasant experience for the ear. So, I have to say that the efficient-looking cooling solution of the LeadTek WinFast A380 TDH Ultra is too noisy to be considered comfortable.

LeadTek WinFast A380 TDH Ultra: Overclocking

We installed an additional 120mm fan onto this card for overclocking, too. The GPU worked stably at 515MHz and the memory at 505MHz (1010MHz DDR). I couldn't go above these numbers: with the CoolBits function the frequency reset back to the nominal value, while RivaTuner made the system hang. Anyway, the extra 40MHz for the GPU and 30MHz for the memory should be considered a good achievement, since top-end graphics cards are usually poorly suited for overclocking.

LeadTek WinFast A380 TDH Ultra: Conclusion

The LeadTek WinFast A380 TDH Ultra graphics card left me with mixed feelings. On the one hand, it is a quality product with a nice set of accessories. Its advantages include a very efficient cooling solution, excellent 2D image quality, VIVO functionality, component video output and the WinFox 2.0 hardware monitoring system. On the other hand, this very cooling solution roars like hell in 3D applications. It's too annoying to be waved aside.

Those who are interested in the capabilities of the product and are undaunted by its noise can purchase it for $435-446, according to PriceWatch.com. I think that's an interesting offer.

NVIDIA vs. ATI Technologies: Anisotropic Filtering Quality

Besides the ordinary tests, we compared the quality of anisotropic filtering as provided by graphics cards on chips from ATI and NVIDIA. I guess it’s better to show you the results rather than talk about them. Just take a look at the screenshots:

Click a thumbnail for the high-resolution version

ATI, Catalyst 3.8

NVIDIA, Detonator 45.23

As you can see, NVIDIA's chips are better at anisotropic filtering, although they have a lower maximum level of anisotropy: 8x against ATI's 16x. The difference shows in the distant textures. ATI's chip crippled the slope of the hill, probably because of its "inconvenient" angle. As you know, the AF algorithm implemented by ATI Technologies works best at certain angles but degrades in quality at others. This approach to AF has certain advantages, the main one being high performance.
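
The angle dependence is easy to illustrate. The following toy model is purely our simplification, not ATI's actual hardware algorithm: the effective degree of anisotropy stays at the requested level for surfaces close to 0, 45 and 90 degree orientations and is reduced in between, trading some quality on "inconvenient" slopes for a lot of fill rate.

```python
import math

def effective_anisotropy(surface_angle_deg: float, requested: int = 16) -> int:
    """Toy model of angle-dependent AF: full quality near 0/45/90-degree
    surface orientations, reduced quality in between."""
    # distance (in degrees) to the nearest "preferred" angle: 0, 45, 90, ...
    distance = min(surface_angle_deg % 45.0, 45.0 - surface_angle_deg % 45.0)
    # scale the requested anisotropy down as we move away from a preferred angle
    scale = math.cos(math.radians(distance * 2.0))     # 1.0 at 0 deg, ~0.7 at 22.5 deg
    return max(2, int(requested * scale))

for angle in (0, 10, 22.5, 45, 60, 90):
    print(angle, effective_anisotropy(angle))
```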

NVIDIA vs. ATI Technologies: Where Can Driver “Optimization” Get You?

NVIDIA has been caught red-handed numerous times with its dubious driver optimizations (and the quality of the drivers themselves has degraded, by the way). We discovered some visual artifacts in AquaMark3, too. First, we used the scandalous Detonator 45.23 and encountered some lighting problems. You can see them in the following screenshots.

Click a thumbnail for the high-resolution version

ATI, Catalyst 3.8

NVIDIA, Detonator 45.23

In the first screenshot we see the explosion of a submarine; the explosion illuminates the seabed. Unfortunately, the reflections of the explosion that should make the seabed so bright are somehow lost with Detonator 45.23.

The second scene shows a play of sunlight beams on the seabed. It also looks different on the graphics chips from ATI and NVIDIA: NVIDIA's chip doesn't eliminate the flares altogether, but renders them less bright and spectacular than ATI's chip does.

Then we noticed that game objects lose their shadows when Detonator 45.23 is used. Here is the evidence, in the screenshots:

Click a thumbnail for the high-resolution version

ATI, Catalyst 3.8

NVIDIA, Detonator 45.23

As you can see, the submarine in the left picture casts a shadow, as the laws of optics require, while there is no shadow at all in the right picture. Once again I have to point out problems with an official version of NVIDIA's Detonator driver that make it a poor match for modern 3D games and applications.

The only Detonator to produce a correct image was the old official 44.03 version. Alas, the performance of the GeForce FX 5900, to say nothing of the GeForce FX 5600 Ultra, dropped by half with this driver, down to the level of the ATI RADEON 9600 PRO and lower. As for the new driver, ForceWare 52.16, we discovered no problems with lighting or shadows. The image quality with this driver was in fact identical to what Detonator 44.03 produced, so we don't offer you the screenshots; they are the same. NVIDIA seems to have abandoned the bad habit of introducing "optimizations" for particular applications and turned to genuine optimization of shader code in general. Well, it has no other way. In order to ship competitive products, NVIDIA would have to completely redesign the architecture of its GPUs, but it has no time for this: ATI Technologies has strong products here and now!

So, I recommend that owners of GeForce FX based graphics cards install ForceWare 52.16 rather than Detonator 45.23, since the latter cannot provide appropriate image quality.

AquaMark3: Benchmarking Results

We tested each of the graphics cards twice: in the "pure speed" mode, to check their maximum possible performance, and in the "eye pleasing" mode, with full-screen anti-aliasing (FSAA) and anisotropic filtering (AF) enabled. The level of AF was set to 8x, as this is the maximum mode supported by NVIDIA GPUs. As for ATI's chips, we preferred 8x over 16x, since there is only a very slight difference between them in speed as well as in quality. So, it's time we looked at the actual test results.

We have got a new leader, the GeForce FX 5950 Ultra. Ultra-high frequencies, super-fast memory and much-improved drivers all contribute to the result: the NV38 is faster than the RADEON 9800 XT everywhere, by 2-3 frames per second. In the mainstream class, NVIDIA is also on the winning side with its GeForce FX 5700 Ultra, and this time NVIDIA's creation is substantially faster than ATI's.

After turning on FSAA and AF, we witness a well-known effect: the cards on ATI VPUs easily break away from the competitors, leaving them far behind. The architecture of ATI's chips, coupled with efficient AF and FSAA algorithms, helps them handle huge workloads.

Now, let’s get to the specific tests from the AquaMark3 package.

Dynamic Occlusion Culling

The GeForce FX 5950 Ultra wins the test in which invisible surfaces are discarded, while the GeForce FX 5700 Ultra stays close to the RADEON 9600 XT. The performance of the two top-end cards seems to be limited by the CPU in this test. Still, the GeForce FX 5700 Ultra is once again faster than the RADEON 9600 XT.

It should be noted that NVIDIA has implemented more efficient culling algorithms ever since the previous generations of its GPUs, so it's not much of a surprise to see the GeForce based cards winning this test. On the other hand, we shouldn't forget that NVIDIA's GPUs work at higher frequencies than ATI's.

Under higher workloads, the GeForce FX loses ground. High frequencies and software optimizations cannot make up for all the shortcomings of the architecture. As a result, the more efficient culling algorithms can't help the NV36 and NV38 out in this test.

High Particle Count

You see a manufacturing facility with two drain pipes that spew something like a dark cloud of separate particles into the water.

NVIDIA GPUs have always been good at processing particle systems, so they win this test. By the way, we saw a similar situation in our preliminary tests with the Detonator driver.

The diagram doesn't need much comment. I guess you realize by now that RADEON based cards are unrivalled in the "eye pleasing" mode.

Masked Environment Mapping

The GeForce FX 5950 Ultra is the favorite in the environment mapping test. It is always faster than the competitor, although just a little bit. The GeForce FX 5700 Ultra dominates the mainstream class, although it is closely followed by the RADEON 9600 XT. The latter only slows down in 1600x1200, "thanks" to its relatively slow memory.

FSAA plus AF change the situation, but the RADEON is not too far ahead. Both new graphics chips go neck and neck in 1600x1200. The RADEON 9600 XT looks good enough, but loses to the competitor in the highest resolution.

Large Scale Vegetation Rendering

The scene represents an underwater landscape covered with a fantastic carpet of flowering seaweed. The picture is beautiful and loads any graphics card to the full; you rarely see the fps-meter reach 25 or higher. ATI Technologies is at its best here: despite all the software tricks, the NVIDIA GPUs cannot even come close to their rivals.

The same is true in the "eye pleasing" mode.

Large Scale Terrain Rendering

We have got a marvelous landscape here: an open expanse with a gigantic natural or artificial structure and a whole underwater city! Among the cards on NVIDIA's GPUs, only the one with the GeForce FX 5950 Ultra can handle this scene fast enough, thanks to its sky-high frequencies.

The GeForce FX 5950 Ultra slows down under higher workloads, while the RADEON 9800 XT becomes the winner. We have parity in the mid-range sector: the RADEON 9600 XT and the GeForce FX 5700 Ultra show similar results. The latter looks somewhat better, though, especially in the highest resolution.

Vertex and Pixel Lighting

ATI chips used to do vertex and pixel lighting faster than NVIDIA's, but the new driver from NVIDIA changes the rankings dramatically. The GeForce FX 5950 Ultra is a little better than the RADEON 9800 XT in the lower resolutions and keeps up with it in the higher ones. The mainstream GPUs perform at the same level; the difference between their results is sometimes less than 1fps. The ForceWare driver does its job well, boosting the performance of the GeForce FX architecture in shader-heavy scenes. At least, it does so in the AquaMark3 tests.

When FSAA and AF are enabled, the high frequencies and the shader code optimizer in ForceWare 52.16 can do nothing to help the GeForce based cards catch up with the VPUs from ATI Technologies.

3D Volumetric Fog

The GeForce FX GPUs, including the mainstream models, are better at rendering volumetric fog. The programmers who wrote the ForceWare driver solved the problem: the latest chips from NVIDIA perform much faster with the new driver.

Again, the VPUs from ATI are faster under high workloads, although the GeForce FX 5950 Ultra is on the level of the RADEON 9800 XT in 1024x768.

Complex Multimaterial Shader

That's another example of the advantages of the ForceWare optimizer: it now helps the GeForce FX GPUs win a test where RADEONs used to dominate, namely the execution of complex shaders.

But enabling FSAA and AF drags the GeForce based cards down again.

Massive Overdraw

The final scene of the AquaMark3 test set is hard to handle. The frame comprises a lot of objects: submarines, smoke, fog, cannon blasts, torpedo trails, debris and explosions. All this accounts for the high overdraw coefficient. The test concludes with a huge explosion of a submarine. The NVIDIA GeForce FX 5950 is on top…

…until we turn on FSAA and AF. The VPUs from ATI Technologies win the “eye pleasing” mode.

Conclusion

I have said many times that modern graphics cards based on ATI Technologies' chips have higher potential than those based on GPUs from NVIDIA Corporation. In other words, if you've got a card with a RADEON in it, it is sure to be fast in a wider range of DirectX 9.0 games. Our testing in the AquaMark3 suite confirms this statement. Both expensive and mainstream RADEON based graphics cards ensure good performance combined with excellent image quality.

As for NVIDIA, the company has corrected its mistakes by releasing the new driver, ForceWare. In fact, releasing a special code compiler was the only choice, since NVIDIA has no time to redesign the bulky NV3x architecture and is unlikely to find the time in the future. Judging by the test results, the ForceWare project is a success: the performance of NVIDIA's GPUs has increased considerably, so much so that NVIDIA has regained the lead in a number of tests. On the other hand, GPUs from ATI Technologies go unrivaled in the tests that use full-screen anti-aliasing and anisotropic filtering.

It means that if you want more fps, you may want to choose a GeForce FX 5950 Ultra card. If you go for higher image quality instead, the RADEON 9800 XT and PRO may be your choice. If you cannot afford the top models, consider a GeForce FX 5700 Ultra or a RADEON 9600 XT; these two solutions show a similar level of performance. Other factors should also be considered, namely physical dimensions and heat dissipation. From this point of view the RADEON 9600 XT has an advantage over its competitor, which carries hot DDR-II memory chips and a hot GPU on a massive PCB.

As for image quality, we noticed no visual artifacts during our AquaMark3 tests. This doesn't mean the ForceWare driver needs no further improvement: in some modern games, like Splinter Cell, shadows are lost or distorted with this driver. Anyway, the new driver from NVIDIA does provide a performance gain in every application and game rather than in a selected few. This approach should be considered the right one, but it needs further work.

The situation with the GeForce FX reminds me of the Intel Itanium processor. This high-performance 64-bit processor, featuring the EPIC architecture, has to use a translator to execute 32-bit code. The result is obvious: the Itanium is very slow at running x86 programs, notwithstanding all the advantages of the EPIC architecture. Intel is constantly polishing the translator, and that's what NVIDIA is going to do with its ForceWare. Nevertheless, it is quite possible that the new generation of games, like Doom III, S.T.A.L.K.E.R.: Oblivion Lost and Half-Life 2, will have engines too complex for the software optimizer to digest. We will soon see whether that is really the case.

The AquaMark3 benchmarking suite itself proved to be a handy and precise tool for measuring the performance of graphics cards. Based on a real gaming engine, AquaMark3 offers a number of extras that make a tester's work easier. I can recommend it to any professional tester. Of course, the suite has minor drawbacks, but it is often more convenient than Futuremark's 3DMark03.