Evolution of Nvidia GeForce Driver: GeForce GTX 260 (216SP) under Investigation

In this review we will benchmark the GeForce GTX 260 with different versions of the GeForce driver and also look at image quality. Today’s hero is the Zotac GeForce GTX 260 AMP2! Edition 896MB graphics card.

by Sergey Lepilov
12/11/2008 | 11:50 AM

In their race to get ahead of each other, Nvidia and ATI often save time on optimizing the driver for their newest graphics processors and cards, although it is no secret that Nvidia’s GeForce and ATI’s Catalyst drivers are an important factor in the success of new graphics solutions. As an example I can recall that there was no official driver available at the launch of the G200 and RV770 processors and the cards based on them, and hardware reviewers had to use imperfect and unstable beta versions. The test results could not be called truly objective, and there could be a large spread in the numbers from different sources even within one game or benchmark.


An official certified driver does not always make a graphics card faster; more often it aims at correcting errors and supporting new games. However, both Nvidia and ATI keep claiming that driver updates bring higher performance in certain games or benchmarks, and these claims are not confirmed in practice as often as we might want. To check out the performance benefits provided by different versions of the Nvidia GeForce and ATI Catalyst drivers, I want to offer you two reviews comparing the speed of the same card under several versions of its driver. The next review will be published after the release of the promising Catalyst 8.12 and will cover the performance of the Radeon HD 4870 with different versions of Catalyst. In today’s review you will see how Nvidia’s driver has been evolving for the GeForce GTX 260 (216SP). Why did I choose this graphics card model? Because Zotac provided me with their new GeForce GTX 260 AMP2! Edition. Let’s check it out first.

Zotac GeForce GTX 260 AMP2! Edition 896MB

Package and Accessories

Zotac International Limited is quite a newcomer on the graphics card market, having entered it only last year. The Zotac GeForce GTX 260 AMP2! Edition 896MB card comes in a large upright cardboard box showing a ferocious dragon with outstretched wings.


You can also learn about the card’s characteristics and warranty terms here. A sticker on the box informs you that a copy of Race Driver GRID is included in the box. The back of the box describes the key features of Nvidia’s GPU but mentions 192 unified shader processors whereas this card actually has 216. This must be the box from the older AMP! series cards, with an “AMP2!” sticker simply added on the front.

The graphics card is securely fixed in a plastic case within the central compartment. Under and around it the following accessories can be found:

From left to right and from top to bottom:

It is good that there are two power cables (usually only one is included) and a modern, visually impressive game in the kit.

PCB Design

The only visible difference between the new card and the reference GeForce GTX 260 is the sticker on the cooler’s plastic casing, which shows the same dragon as on the box.

The dual-slot cooler envelops the card from three sides:

The card measures the standard 270 x 100 x 32 millimeters. It is equipped with a PCI Express x16 2.0 interface, two dual-link DVI-I ports and an S-Video output. The latter three reside on the card’s mounting bracket which has slits for exhausting the hot air from the components.

At the top part of the PCB there are two 6-pin power connectors, a connector for S/PDIF audio, and two MIO connectors for building SLI and 3-way SLI subsystems out of two or three identical cards based on Nvidia’s GPUs.


To remind you, the GeForce GTX 260 (192SP) has a specified peak power draw of 182 watts. A 500W power supply is recommended for a computer with one such card; for SLI configurations, a 600W or higher power supply should be used. The GTX 260 with 216 stream processors has the same requirements because its power draw is not much higher.

Here is the card with its cooler removed:

And here is its power section:

The GPU is marked as G200-103-A2. It was manufactured in Taiwan in the 27th week of 2008.

So, the chip is revision two (A2). It has 216 unified shader processors, 72 texture modules, and 28 raster back-ends. The GPU frequencies are increased from the GeForce GTX 260’s default 575/1242MHz to 648/1404MHz. This 12.7%/13% boost is quite nice even if not record-breaking as pre-overclocked cards go. The GPU voltage is 1.06V. The GPU frequencies drop to 300/600MHz in 2D mode to save power and reduce heat dissipation.

The card is equipped with 896 megabytes of GDDR3 memory in 14 chips placed on both sides of the PCB. The memory bus is 448 bits wide. Like every other GeForce GTX 260 I have ever seen, the Zotac card has 1.0ns chips manufactured by Hynix.

The H5RS5223CFR N0C chips have a rated frequency of 2000MHz. The memory frequency of the reference GeForce GTX 260 is 1998MHz, but the Zotac card’s memory is clocked at 2106MHz, i.e. 5.4% higher. That’s not much of an addition, as there are plenty of off-the-shelf GeForce GTX 260 versions with higher memory frequencies.
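As a quick sanity check of these percentages, here is a tiny Python sketch; the frequencies are simply the ones quoted above, nothing vendor-specific is assumed:

```python
# Sanity check of the factory overclock figures quoted above (MHz).
reference = {"core": 575, "shader": 1242, "memory": 1998}  # reference GTX 260
zotac = {"core": 648, "shader": 1404, "memory": 2106}      # AMP2! Edition

for clock, ref_mhz in reference.items():
    gain = (zotac[clock] / ref_mhz - 1) * 100
    print(f"{clock}: {ref_mhz} -> {zotac[clock]} MHz (+{gain:.1f}%)")

# core: 575 -> 648 MHz (+12.7%)
# shader: 1242 -> 1404 MHz (+13.0%)
# memory: 1998 -> 2106 MHz (+5.4%)
```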

Thus, the new card has the following specs:

The table provides even more details:

Cooling System Design and Performance

The cooling system is the same as on the reference GTX 280 and 260 cards.

The thick layer of thermal grease has always surprised me. It seems as if the workers at the factory are paid by the amount of thermal grease used up, and they spend it with the utmost generosity.

The card’s temperature was measured in a simple test: I loaded it by running the Firefly Forest test from 3DMark06 ten times at 1920x1200 with 16x anisotropic filtering. I didn’t enable FSAA because the GPU load and temperature would have been lower then. The test was performed in a closed ASUS Ascot 6AR2-B system case (its fan configuration is described in the Testbed and Methods section below). The ambient temperature was 23.5°C. The card’s frequencies and temperature were monitored with RivaTuner 2.20. As I had dismantled the card before testing it, I replaced the thermal interface on the GPU with a thin layer of high-efficiency Gelid GC1 thermal grease.

So, here are the results of the test with the cooler’s fan under automatic speed management.

The GPU was never hotter than 80°C despite the pre-overclocked frequencies and the low level of noise of the cooler’s fan. The latter was rotating at only 1860rpm during this test (the maximum speed of this fan is 3200rpm).

And here is how well the reference cooler can cool the card at full speed:

The temperatures are atypically low for a top-end graphics card. Interestingly, the current on the voltage regulator at peak load was 5A lower when the card’s temperatures were low than when they were high.

Considering the high efficiency of the reference cooler, I checked out the card’s overclockability without replacing the cooler, but set its fan speed manually at 2050rpm, which seemed subjectively comfortable to me. The card proved stable at a memory frequency of 2448MHz (+22.5% over the reference 1998MHz, or +16.2% over the card’s own pre-overclocked 2106MHz), but I could not increase its GPU frequency at all.

Alas, the core of this sample of the Zotac card already works at its limit. Moreover, the GPU’s overclockability didn’t improve even when I raised its core voltage from 1.06V to 1.15V in the card’s BIOS. It just means that this specific sample has poor overclocking potential.

Working with the overclocked memory, the card had the following temperature:

The Zotac GeForce GTX 260 AMP2! Edition 896MB cost about $320-350 at the time of writing, but a price cut is expected for Nvidia’s entire new series of cards, which will surely affect Zotac’s products, too.

Evolution of GeForce (ForceWare) Driver

First of all, I want to note that Nvidia has changed the name of the driver from ForceWare to GeForce. I guess this only adds more confusion to the names of the cards and drivers, but perhaps I just have to get used to it. By the time I began my tests in late November, Nvidia had released a number of beta versions and five official versions of drivers for the GeForce GTX 260 and 280 series. Four of the official versions are covered in this review; I’ll describe each of them now.

GeForce 177.41 (June 26, 2008) is the second WHQL-certified release for the GTX 280/260 series. Why didn’t I take the first official version (177.35)? Because version 177.41 came out just nine days later and brought no significant changes over it. Here is the official change list:

As you can see, the new (at that moment) driver was not declared to change anything in terms of performance. Besides the change list above, this driver corrected a few issues under Windows Vista x64, and that’s all.

GeForce 178.13 (September 25, 2008) is the third WHQL-certified driver for the GeForce GTX 280/260, released after an unusually long pause for Nvidia (almost three months since the previous version). The following interesting changes could be noted:

The release notes also said that some bugs in games had been corrected for single GPUs and SLI configurations. According to user reports, this version of the GeForce driver is the most stable and problem-free.

GeForce 178.24 (October 15, 2008) was released only 20 days after version 178.13 and had WHQL certification, too. There are a lot of changes in it despite the short period between the two releases. Here are the key game-related features:

You may note that the optimizations are identical to those in version 178.13. That’s not a typo: the official website says the performance growth is calculated relative to beta version 178.19, which was released after version 178.13. That’s an interesting thing: in effect, Nvidia claims to have increased the performance of its graphics cards in the mentioned games by the same amount in two consecutive driver releases.

Besides the optimizations, the new driver corrected the operation of DVI-HDMI devices and the Hybrid SLI mode, and solved the bug with the World in Conflict menu when using full-screen antialiasing on the GeForce 6600. As with version 178.13, PhysX libraries version 8.09.04 were bundled with this release.

GeForce 180.48 (November 19, 2008) is one of the latest certified GeForce drivers for Nvidia’s GPUs. Long anticipated and loudly touted by Nvidia, the new 180 series driver promises ambitious performance improvements besides the traditional error corrections. Here are the new capabilities:

The 80% performance growth in Lost Planet: Colonies is especially impressive: it is as if driver 180.48 accelerated the game from 100 to 180fps. There is a kind of disclaimer on the official website saying that the results may differ on specific configurations, so you are not guaranteed to see the specified performance growth on your particular system. Anyway, the new driver claims very interesting improvements, and the test section will show whether they are real. Right now, let’s take a look at the testbed and testing methodology.

Testbed and Methods

The graphics card was benchmarked in a system case with the following configuration:

To minimize the CPU’s influence on the graphics card’s performance, I overclocked the CPU to 4.00GHz at a voltage of 1.575V before the tests.

The system memory worked at a frequency of 1000MHz with 5-4-4-12 timings (Performance Level = 6) at 2.175V.

The tests were run under Windows Vista Ultimate Edition x86 SP1 (with all the critical updates available for November 10, 2008). I used the latest drivers available at the moment of my tests:

For each GeForce/ForceWare driver I installed the PhysX pack available at the moment of the driver’s release or included in the driver bundle. The drivers were tested in the order of their release. Each driver/PhysX pair was installed only after the previous pair had been uninstalled and the system had been cleaned with Driver Sweeper 1.5.5.

The drivers were set at High Quality and the Transparency Antialiasing (Multisampling) option was turned on. Vertical synchronization was forced off. Other settings were left at their defaults. I turned full-screen antialiasing and anisotropic filtering on from the menu of each game. If the game didn’t provide such options, I enabled FSAA and AF from the control panel of the GeForce driver.

The graphics cards were tested at two resolutions: 1280x1024 (or 1280x960) and widescreen 1920x1200. We used the following applications, including two synthetic benchmarks, one techno-demo and eleven games of various genres:

The last game is rather old, yet I added it to the list because the GeForce 180.48 driver claims to improve its speed by 80%! I just wanted to check that claim out.

I tested the card twice in each application (not to be confused with a double run of the demos). The final result shown in the diagrams is the best fps/score value of the two cycles.
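In other words, the scoring rule amounts to the following minimal Python sketch (the fps numbers in the example are made up):

```python
# Each application is run in two full cycles; the better fps/score
# of the two is the value that goes into the diagrams.
def final_result(cycle1: float, cycle2: float) -> float:
    return max(cycle1, cycle2)

print(final_result(61.3, 62.1))  # hypothetical cycles -> 62.1 is plotted
```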

Performance

The drivers are sorted by their release date in the diagrams. The earliest version (177.41) is colored yellow; the two drivers of the 178.xx series are colored teal; and the new 180.48 driver is blue. That’s what it looks like in the diagrams:

Before analyzing the results I want to note one important thing. The version 180.48 driver surprised me unpleasantly: it did not offer the very popular resolution of 1280x1024 pixels, suggesting 1280x960 instead. I could not add that resolution in the Custom resolutions section of the driver’s Control Panel, and every attempt to test the card at 1280x1024 failed. I did not have time to wait for Nvidia’s tech support to answer my request. Most annoyingly, the 180.48 driver was the last one to be tested, so I could not simply select 1280x960 for all the other drivers, as that would have meant redoing all the tests.

Still, 1280x960 contains only 6.25% fewer pixels than 1280x1024, and graphics cards do not show a linear dependence of performance on display resolution anyway. Besides, all the drivers share the less CPU-dependent resolution of 1920x1200, which I will focus on in my analysis.
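That 6.25% figure is simple pixel-count arithmetic, as this one-off Python check shows:

```python
# Total pixel counts of the two "low" test resolutions.
full = 1280 * 1024   # 1,310,720 pixels
cut = 1280 * 960     # 1,228,800 pixels
print(f"1280x960 has {(1 - cut / full) * 100:.2f}% fewer pixels")  # 6.25%
```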

Two synthetic benchmarks come first.

3DMark 2006

There is no performance growth in 3DMark06 on transitioning to newer drivers. The 178 series drivers are a little better than version 177.41, whereas version 180.48 looks better in the low-resolution test only because it ran at the lower 1280x960, and equals the other versions at 1920x1200.

3DMark Vantage

Version 180.48 is ahead of the other three in 3DMark Vantage at both resolutions. This benchmark doesn’t calculate the overall score at 1280x960, so the diagram only shows the GPU Score for that driver.

Unigine Tropics Demo

I used version 1.1 of this beautiful demo/benchmark. Its settings are shown below (I only changed the resolution, AF and AA):

And here are the results:

The 180.48 driver makes the card a little faster in the new Unigine Tropics demo, too.

Thus, the new driver provides a small performance gain in two out of the three synthetic benchmarks. And what about games?

World in Conflict

Every driver released since version 177.41 improves performance in this game when FSAA and AF are turned off. The GeForce 180.48 driver also provides a performance boost in the FSAA+AF mode, quite close to the promised +18%.

Enemy Territory: Quake Wars

Call of Duty 4: Modern Warfare MP

Unreal Tournament 3

The performance of the GeForce GTX 260 does not change in the previous three games (Enemy Territory: Quake Wars, Call of Duty 4: Modern Warfare MP and Unreal Tournament 3) with any of the drivers (to remind you, version 180.48 has a small head start as it uses 1280x960 instead of 1280x1024). Nvidia promises performance benefits in Enemy Territory: Quake Wars and Call of Duty 4: Modern Warfare MP, but apparently they can only be seen on certain “specific configurations.”

Devil May Cry 4

This game runs faster on the 180.48 driver in both display modes. So, you can have a higher average frame rate with this driver than with the previous versions.

S.T.A.L.K.E.R.: Clear Sky

Performance hardly changes from one driver version to another here; the differences are too small to be taken seriously.

Crysis Warhead

The game settings were as follows:


Here are the results:

The GeForce 180.48 driver is 1-2fps ahead in Crysis Warhead, but you can’t notice that with the naked eye.

Far Cry 2

GeForce 180.48 suddenly gets ahead of the other driver versions when FSAA and AF are enabled. The performance growth is high, even if not as high as the promised 38%. I didn’t find any difference in image quality between the driver versions.

X3: Terran Conflict

The new game and benchmark were run with the following settings (I changed the resolution, AF and AA settings):


The results are not surprising:

Left 4 Dead

Since it is the first time we are using this new game for testing purposes, let me show the detailed settings we used:

The screen resolution, anisotropic filtering and FSAA were changed depending on the test mode. Since the gaming engine is not too heavy for contemporary graphics accelerators, we used MSAA 8x instead of the frequently used FSAA 4x.

We performed the test in the first scene, “The Seven,” of the “No Mercy” level, with crowds of quickly dying zombies, explosions and other effects:

Let’s check out the results:

As you remember, the drivers’ release notes make no mention of any performance increase in Left 4 Dead. Moreover, the game was not even on the market when driver versions up to 178.24 were released. Nevertheless, the performance improvement is quite noticeable and grows step by step from the 177.xx driver series to 178.xx and then to 180.xx. The image quality didn’t change, either in motion or under careful examination of the screenshots.

Lost Planet: Colonies

I tested this game after the other games and could use the resolution of 1280x960 for all the driver versions. Here are the results:

Well, performance grows with every newer driver in both test scenes, but the growth never amounts to the 80% promised for the 180.48 driver.

Image Quality and Speed in 3DMark 2006

First of all I should tell you that this section is just a supplement to the main article and does not claim to be a thorough investigation. Here I will measure the performance hit the GeForce GTX 260 (216SP) takes in 3DMark06 depending on the image quality mode and will try to evaluate that quality. I ran 3DMark06 a few times, changing the image quality setting from High Performance to High Quality in the GeForce driver’s Control Panel. Then I enabled anisotropic filtering and three levels of full-screen antialiasing (2x, 4x, 8x) using 3DMark06’s own settings. To remind you, 3DMark06 is indifferent to the FSAA setting in the GeForce driver’s Control Panel, so I didn’t test the other types of multisampling.

By the way, I chose 3DMark06 for its virtually unique ability to capture a screenshot of exactly the frame I needed.

These tests were performed with the GeForce 180.48 driver. The next diagram shows how the performance changes depending on image quality mode:

You can see that the quality mode selected in the driver does not affect the speed at all: 3DMark produces about the same score in each mode. But when I turned on anisotropic filtering and FSAA, the speed dropped heavily.

Next, let’s check out the effective image quality depending on what quality mode you select in the GeForce driver. I selected Frame 1350 from the Canyon Flight scene at a resolution of 1920x1200 pixels.

High Performance

Performance

Quality

High Quality

To examine the details, it is easier to download all the screenshots and switch between them in an image viewer such as ACDSee. The first two screenshots, captured in High Performance and Performance modes, do not differ at all; at least, I can’t find any difference. The screenshots captured in Quality and High Quality modes do not differ from each other but do differ from the first pair: the picture is darker and the background doesn’t look as whitish.

Now let’s see if anisotropic filtering and full-screen antialiasing can affect the image quality much.

HQ + AF16x

HQ + AF16x + AA2x

HQ + AF16x + AA4x

HQ + AF16x + AA8x

Well, 16x anisotropic filtering has a tremendous effect on the image! Textures appear on the body of the sea monster and on the cliffs. You can now see the planks of the rudder and the vessel’s keel. The surface of the balloon and its empennage look more realistic and natural. So, you should never disable anisotropic filtering unless you don’t care at all about image quality.

In the other three screenshots, full-screen antialiasing raises the image quality to its maximum. The smoothed-out ropes, edges of the vessel, propeller screw, and wings of the balloon are all the result of multisampling. There is a very visible difference between 2x and 4x FSAA, but the transition from 4x to 8x doesn’t have such a huge effect.

I had wanted to perform a similar test in Crysis Warhead, but failed. The HardwareOC Crysis Warhead Bench tool can capture a screenshot of a specified frame, but the screenshots came out fuzzy at the edges and could hardly be compared in terms of image quality. I tried every one of the 13 integrated demos and got fuzzy screenshots in each of them. The screenshots were not sharp, either, when I tried to capture them after loading a saved game. That’s why I had to limit myself to checking the graphics card’s performance depending on the image quality mode.

In Crysis Warhead, anisotropic filtering doesn’t affect the frame rate on the GeForce GTX 260 (a rather low frame rate, by the way), and you get almost the same performance irrespective of the FSAA level.

Conclusion

I did not see the full performance growth promised by Nvidia when transitioning to newer versions of the GeForce driver on my configuration, yet the frame rate did grow in World in Conflict, Devil May Cry 4, Crysis Warhead (a very small gain here), Far Cry 2, Left 4 Dead and Lost Planet: Colonies, as well as in 3DMark Vantage and the Unigine Tropics demo. That alone is a good reason to update your graphics card driver regularly. Moreover, I didn’t test some older games mentioned in the release notes, while some newer games just don’t offer accurate tools for measuring the frame rate. The performance benefits may vary on other configurations and under other operating systems. You should also keep in mind that newer drivers not only improve performance but also correct errors occurring in new games.

In my next review I am going to check out the evolution of ATI’s Catalyst driver together with a Radeon HD 4870 1024MB graphics card.

Finally, I’d like to say a few words about the graphics card I used for this test session. The Zotac GeForce GTX 260 AMP2! Edition is an interesting product in attractive packaging. It comes with all the necessary accessories, pre-overclocked frequencies, and a very effective and quiet cooler. The card’s GPU refused to overclock any further, but the memory chips overclocked quite well. The price of this card is competitive, too. There is currently a wide choice of GeForce GTX 260 cards on the market, so you can easily choose the variant that suits you best.