by Alexey Stepin , Yaroslav Lyssenko
03/07/2008 | 10:33 AM
The ATI Radeon HD 3000 series based on the new 55nm RV670 core has shaken Nvidia’s position in the below-$200 sector. Nvidia had very good G92-based models, yet all of them, except for the GeForce 8800 GT 256MB, cost far more than $200, while this exception, having a recommended price of $199, could not actually be obtained in shops for that money. Moreover, the limited amount of graphics memory together with inefficient memory management had a negative effect on the gaming performance of the GeForce 8800 GT 256MB. The Radeon HD 3850 was far cheaper and, as our tests showed, far more efficient at managing its 256MB of local memory, whereas the high retail price of the GeForce 8800 GT 256MB made it an opponent to the ATI Radeon HD 3870, which left the Nvidia card no chance.
The GeForce 8600 GTS could not be viewed seriously as a gaming graphics card due to its very modest performance in modern 3D applications. The gap between the GeForce 8600 and GeForce 8800 series shrank somewhat with the release of the G92 core but was still too wide. A new graphics core was needed that would deliver higher performance than the G84 and would be simpler and cheaper than the G92. GPU developers usually create mainstream chips by cutting down the existing top-performance cores. The main thing is to strike the right balance between the manufacturing cost and the performance of the resulting chip.
For example, the G84 suffered from Nvidia’s desire to simplify the chip and cut its cost: with only 32 unified shader processors, 16 texture modules that were actually equivalent to eight TMUs only, and 8 ROPs, the chip just could not be fast. As a consequence, the flagship of the new series was slower than the ATI Radeon X1950 Pro in real applications. With such performance, the new chip’s DirectX 10 support sounded like a joke even then, and we know today how hungry DirectX 10 applications are for hardware resources. In fact, the GeForce 8600 GTS replayed the story of the GeForce FX 5600, which had to be replaced with the more successful GeForce FX 5700. And now the GeForce 8600 GTS is being substituted with the GeForce 9600 GT, the first member of Nvidia’s new GeForce 9 series.
The new card features a specially developed mainstream core codenamed G94. The GeForce 9600 GT was officially announced on February 21, 2008. The same day we offered you preliminary data about the performance of this new product targeted at the $169-189 price sector. Today we’ll discuss it in detail to see if it is competitive against the ATI Radeon HD 3870 and 3850 whose prices have been recently lowered to $189 and $169, respectively (see this news story for details). The battle for the mainstream segment is going to be fierce.
The new G94 graphics core and the GeForce 9600 GT graphics card should be viewed as a correction of errors and flaws the developer committed when creating the G84 and the GeForce 8600 card.
The positive trend is obvious: Nvidia has corrected all the flaws of the previous mainstream solution in the new chip. The number of execution subunits has been doubled, which is more than enough even in comparison with the ATI Radeon HD 3870/3850. If an application has no specific optimizations for the superscalar Radeon HD architecture, the RV670’s 320 ALUs, grouped in blocks of 5, may prove no more efficient than the 64 scalar processors of the Nvidia G94 chip. AMD’s chip supports DirectX 10.1 while Nvidia’s does not, but such support is not yet called for.
There are 32 TMUs, but the texture processors are architecturally identical to those of the G84 and G92. That is, there are two filter units for every two texture address units. Theoretically, this ensures a high speed at texture sampling and filtering, but only if anisotropic filtering is turned off. Today, however, anisotropic filtering is enabled by every gamer who cares about image quality. Under real-life conditions such a TMU architecture is only half as fast as the G80’s, which has two filter units per sampling unit. Thus, the G94 can be viewed as having only 16 TMUs, yet this is a step forward in comparison with the G84 and quite comparable to the texture-mapping performance of the ATI RV670 chip. The higher frequencies of the latter chip make up for its small number of texture filter units: one such unit for every two address units. So, the two GPUs are roughly equal in this respect.
The raster operators have not changed since the G92 and there are as many of them as before. Thanks to improved compression techniques, Nvidia says their performance is 15% higher than that of the G80’s ROPs. From our tests of Nvidia’s GeForce 8800 GT/GTS 512MB we know that this is really so: the G92’s 16 ROPs compete more than successfully with the G80’s 24 ROPs. There are four ROP sections in the chip, which means four 64-bit memory access channels combining into a 256-bit external memory bus. This is one of the most significant improvements over the G84, whose 128-bit bus often proved to be a bottleneck. At a specified frequency of 900 (1800) MHz, the memory bandwidth is 57.6GB/s, i.e. the same as that of the GeForce 8800 GT 512MB. In other words, the GeForce 9600 GT is very unlikely to feel a lack of memory bandwidth.
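The 57.6GB/s figure follows from simple arithmetic on the bus width and effective memory clock. Here is a quick sketch of that calculation (the function name is ours, purely for illustration; the clocks and bus width are the figures quoted above):

```python
# Back-of-the-envelope check of the memory bandwidth quoted above.
# GDDR3 is double-pumped: a 900 MHz clock yields 1800 MHz effective.

def memory_bandwidth_gbs(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak bandwidth in GB/s: bytes per transfer times transfers per second."""
    bytes_per_transfer = bus_width_bits / 8      # 256-bit bus -> 32 bytes
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

print(memory_bandwidth_gbs(256, 1800))  # 57.6 GB/s, same as the 8800 GT 512MB
```

Running the same formula with a 128-bit bus shows why the G84 was so starved: it gets only half the bandwidth at the same memory clock.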
New in the G94 chip is the DisplayPort interface controller integrated into the graphics core. Promoted by VESA, this interface is an alternative to HDMI and, as opposed to the latter, does not require licensing fees. DisplayPort supports the transfer of a 2560x1600 video signal over a cable up to 3 meters long and of a 1080p video signal over a cable up to 15 meters long. It can optionally carry an 8-channel 24-bit audio stream with a sampling rate of 192kHz. Starting with version 1.1, the interface also features DisplayPort Content Protection developed by AMD. Conceived as a unified, easily expandable, royalty-free standard, DisplayPort may acquire high popularity in the near future, especially as it is supported by leading hardware makers such as Intel, Dell, and Samsung.
The GeForce 8800 GT/GTS 512MB supported DisplayPort by means of an additional onboard controller, but the GeForce 9600 GT doesn’t need one, which simplifies the PCB design and makes the end-product cheaper. The integration of the DisplayPort and NVIO controllers into the chip prevented the developer from greatly reducing the transistor count, although the new chip is still less complex than the ATI RV670. For all its 505 million transistors, the G94 doesn’t contain an integrated audio core, unlike the RV670. The audio-over-HDMI feature is implemented in the GeForce 9600 GT in the same way as in some models of GeForce 8800 GT, i.e. by means of an S/PDIF cable that connects the audio card’s output with the graphics card’s input.
The developer estimates the peak power draw of the new card at 95W, but we think it is going to be considerably lower in real applications because the peak power draw of the Nvidia GeForce 8800 GT 512MB was measured by us to be only 78W. Overall, the Nvidia GeForce 9600 GT seems to be a well-balanced product and has a chance to become the new leader of the below-$200 segment – we’ll check it out in the gaming tests, of course.
Summing up, we have to say that although it seems to belong to the ninth GeForce generation, the 9600 model is no different from the eighth generation in terms of capabilities and architecture. In fact, the only advantage of the GeForce 9600 GT over the GeForce 8600 GT/GTS is performance. This naming system may sound confusing, but ATI has been guilty of it, too, with its Radeon X300, X600, X700 and X800 series as well as with the Radeon HD 3000: each time a series would get a new number after only minor improvements.
Contrary to a widespread opinion, the PureVideo HD processor has not changed on the hardware level. It is architecturally identical to the unit that was introduced in the Nvidia G84 core and then incorporated into the G92. Nvidia’s video processor provides full hardware support for decoding H.264 and partial hardware support for decoding VC-1. For the latter format, entropy decoding is performed on the CPU. Nvidia’s PureVideo HD is inferior to ATI’s UVD in this respect, but that’s not a big problem. Our tests of these video processors showed that the peak CPU load is not higher than 33-35% when processing VC-1 1080p content on either of them.
The improvements are available in the software, so all the new features in decoding and processing of video can also be used on Nvidia’s G92-, G84- and G86-based cards after you install ForceWare 174.x. First of all, the driver now supports simultaneous decoding of two streams to enable Picture-in-Picture mode.
Two more innovations are questionable. We mean the dynamic enhancement of contrast and saturation. The first feature corrects the contrast of the scene in a way similar to the Auto Contrast filter in Adobe Photoshop and ACDSee, but in real-time mode. The second feature adjusts the color saturation in skin, green, and blue tones. Are these features really valuable? In fact, they have something in common with the Creative 24-bit Crystalizer technology as well as with the Digital Vibrance Control option long available in Nvidia’s ForceWare driver. Processed with these technologies, the content does not resemble the original. If you enable these features, you won’t see what the movie director wanted you to see, for example. Fortunately, each feature can be turned off in the ForceWare control panel. Perhaps someone may like the more saturated colors, but most admirers of HD video will prefer the original colors, we guess.
The last change is the correction of a bug that would disable Windows Vista’s Aero interface when playing video in H.264 or VC-1 formats. We never encountered this bug ourselves, though.
Nvidia’s new graphics card is represented by a Gainward Bliss 9600 GT 512MB GS in this review. It differs from the reference card, though. The product comes in a classic glossy cardboard box. The picture on the front of the box is traditional, but the overall design has changed greatly compared with Gainward’s earlier products.
The text on the box tells all the technical information necessary for the buyer: the amount of graphics memory, the version of the PCI Express interface, the type and number of connectors, and the note that the card belongs to the Golden Sample series and has pre-overclocked GPU and memory frequencies. The included game Tomb Raider: Anniversary is mentioned as well.
The packaging quality is high. The graphics card is packed into a bubble wrap and lies in an individual box firmly fixed within the main one. It is protected well enough against possible damage during transportation or storage. The two compartments of the main box contain the following accessories:
There is everything here you can expect to find included with a modern graphics card except for a DVI-I → HDMI adapter, which is not necessary because the card offers an appropriate connector. As opposed to the Gainward Bliss 8800 series, the Bliss 9600 GT uses an internal S/PDIF connection because the card doesn’t offer a mini-DIN port. It’s the first time we have seen an optical S/PDIF cable included among a graphics card’s accessories.
The CyberLink DVD Solution disc might be helpful, especially as the Gainward Bliss 9600 GT 512MB GS would be a perfect choice for a multimedia center, if only it contained up-to-date versions of the software. Worse, this disc lacks the main and most demanded component, the PowerDVD player. The included game, Tomb Raider: Anniversary, is all right, even though not very new, either.
So, the packaging and accessories of the Gainward Bliss 9600 512MB GS are overall good and might be even better if the company supplied more modern software.
Using a unique PCB design, the card has very little in common with the reference GeForce 9600 GT.
This PCB carries a robust power circuit with three GPU power phases as opposed to two on Nvidia’s reference card. The fourth phase is responsible for the memory chips. The power circuit is based on a multi-phase Richtek RT8802A controller you can find on other Gainward products with unique PCB designs.
Each of the three GPU power phases contains two MOSFETs, but there are empty seats on the reverse side of the PCB to increase their number to three MOSFETs per phase. External power is attached by means of a standard 6-pin PCI Express 1.0 connector. Theoretically, the card could get along without it, as the PCI Express 2.0 specification implies the provision of up to 150W through an x16 slot, but there are too few such mainboards on the market yet. Most mainboards in use support PCI Express 1.0a/1.1, whose load capacity is limited to 75W, so the installation of the additional connector is a must.
The PCB design of the Gainward card is overall somewhat more complex than the reference GeForce 9600 GT design from Nvidia, yet simpler than that of the GeForce 8800 GT. Since the DisplayPort controller is integrated into the core, there is no seat for an external chip.
The card carries eight Samsung K4J52324QE-BJ1A chips of GDDR3 memory. Each chip has a capacity of 512Mb (16M x 32) and a rated frequency of 1000 (2000) MHz at a voltage of 1.9V. Belonging to the Golden Sample series, the card has a higher memory frequency than the reference card: 1000 (2000) MHz as opposed to 900 (1800) MHz. The card has a total of 512 megabytes of memory accessed across a 256-bit memory bus.
The graphics core looks somewhat odd as the GPU die is rotated 45 degrees relative to the package. There have been precedents, though: the ATI R600 and Nvidia NV30 looked the same way with the heat-spreader cap removed. The core is marked as “G94-300-A1”, which is somewhat surprising as it is usually revision A2 that goes to mass production. It means the developer had no problems working on the G94. Our sample of the GPU is dated the first week of the current year. The GPU package is equipped with a plastic protective frame to prevent the cooler from misaligning and damaging the die. Gainward took this precaution although the reference card doesn’t have such a frame. The GPU clock rates are set higher than on the reference card: the main domain is clocked at 700MHz rather than at 650MHz, while the shader domain runs at 1750MHz rather than 1625MHz. So we can expect a 5-7% performance growth from the Bliss 9600 GT 512MB GS in games. The core configuration is standard with 64 unified scalar ALUs, 16 (32) TMUs and 16 ROPs grouped into 4 sections.
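The 5-7% estimate can be sanity-checked against the raw clock ratios. A rough sketch (the helper name is ours; it simply computes the percentage uplift of the factory overclock over the reference clocks):

```python
# Factory-overclock uplift of the Gainward Bliss 9600 GT 512MB GS
# relative to Nvidia's reference clocks, in percent.

def uplift_percent(stock_mhz: float, oc_mhz: float) -> float:
    return (oc_mhz / stock_mhz - 1.0) * 100.0

print(round(uplift_percent(650, 700), 1))    # core domain: 7.7
print(round(uplift_percent(1625, 1750), 1))  # shader domain: 7.7
print(round(uplift_percent(900, 1000), 1))   # memory: 11.1
```

Since games rarely scale perfectly with GPU clock, the realistic in-game gain sits somewhat below these ceilings, hence the 5-7% figure.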
The card supports all the modern digital interfaces: dual-link DVI, HDMI, DisplayPort, and S/PDIF TOSLINK. All these connectors could not be placed at the same level due to the PCB height limitations, so the second DVI-I port is above the first one and the card has a dual-slot mounting bracket. This bracket would be required anyway due to the dual-slot form-factor of the cooler. Besides the mentioned ports, the card has an onboard MIO connector and a dual-pin header for connecting to the audio card’s onboard SPDIF output.
Nvidia’s reference card comes with a compact single-slot cooling system similar to the second version of the GeForce 8800 GT cooler, but Gainward installed an original dual-slot cooling device on its card.
Well, this design seems original only at first sight. In fact, it represents a simplified and smaller version of the reference cooler of the GeForce 7900 GTX. Two heat pipes connect the copper base with the heatsink consisting of thin aluminum plates. There is a depression in the center of the heatsink in which an ordinary axial fan is installed.
The airflow from the fan goes to both sections of the heatsink. Only part of the hot air is thus exhausted out of the system case. With the Bliss 9600 GT 512MB GS, this is aggravated by the fact that the second DVI connector occupies some space of the second storey of the mounting bracket, so there are fewer slits for exhausting the hot air. The fan uses a 4-pin connection with a PWM-based regulation of speed, which has become a de facto standard already.
The aluminum heatsink is not very large and not heavy. It is fastened to the PCB with four screws only. The plastic casing with a Gainward logo is fastened individually. There is light-gray thermal grease between the GPU die and the cooler’s copper sole. The heat from the memory chips is taken off by means of an aluminum plate with elastic thermal pads.
This cooler doesn’t seem to be as efficient as the one Gainward installed on some models of the Bliss 8800, but it must be capable of cooling the simpler G94 chip without much fuss. We’ll check this out in the next section.
We also emulated the reference Nvidia GeForce 9600 GT by lowering the frequencies of the Gainward card to the official values: 650/1625MHz for the core and 900 (1800) MHz for the memory. In the 3D mode the card was loaded with 3DMark06’s first SM3.0/HDR test running in a loop at 1600x1200 with 4x FSAA and 16x anisotropic filtering. The Peak 2D mode was emulated by means of the 2D Transparent Windows test from PCMark05.
This version of Nvidia GeForce 9600 GT has an original PCB design and its distribution of loads among the power channels may be different than that of the reference card.
Since the new core has a four-channel memory controller and 16 raster operators, it is not just a half of the G92, and the reduction in power consumption is accordingly smaller than it was with the G84. Still, the GeForce 9600 GT consumes 30% less power than the GeForce 8800 GT 512MB in 3D mode. The 60 watts we’ve measured in our test is much lower than the figure declared by Nvidia, so there is no need to purchase a high-wattage PSU, just as there is no need to buy one for RV670-based solutions.
We measured the level of noise produced by the non-standard cooler of the Gainward Bliss 9600 GT 512MB GS card with a digital sound-level meter Velleman DVM1326 using A-curve weighing. The level of ambient noise in our lab was 36dBA and the level of noise at a distance of 1 meter from the working testbed with a passively cooled graphics card inside was 43dBA. Here are the results:
The large heatsink allows the fan to rotate at a rather low speed, making the card almost silent even in 3D mode. The fan itself is quite noisy, though, as you can hear before the OS loads the ForceWare driver. The reference cooler is no different from the second, improved, version of the GeForce 8800 GT cooler and has identical noise characteristics. Since the GeForce 9600 GT generates less heat, this cooler should cope with its job well. There is no need to install more advanced cooling systems on G94-based devices.
The Bliss 9600 GT 512MB GS being pre-overclocked by the manufacturer, we couldn’t overclock it much further. The maximum frequencies the card was stable at were 740/1850MHz for the GPU and 1080 (2160) MHz for the memory. This overclocking cannot provide serious performance benefits and we won’t benchmark the card in the overclocked mode. We’ll only test it at its default frequencies and at the frequencies of the reference card from Nvidia. The GPU temperature was 42°C in idle mode and 58-60°C under load thanks to the advanced cooler installed by Gainward on this card.
The Bliss 9600 GT 512MB GS had no compatibility problems when working on mainboards supporting PCI Express 1.0a and 1.1.
To test the performance of Nvidia GeForce 9600 GT in games we assembled the following standard test platform:
According to our testing methodology, the drivers were set up to provide the highest possible quality of texture filtering and to minimize the effect of the software optimizations used by default by both AMD/ATI and Nvidia. Also, to ensure maximum image quality, we enabled transparent texture filtering: Adaptive Anti-Aliasing/Multi-sampling for ATI Catalyst and Antialiasing – Transparency: Multisampling for Nvidia ForceWare. As a result, our ATI and Nvidia driver settings looked as follows:
For our tests we used the following games and benchmarks:
First-Person 3D Shooters
Third-Person 3D Shooters
We selected the highest possible level of detail in each game using the standard tools provided by the game itself in its menu. The games’ configuration files weren’t modified in any way. The only exception was Enemy Territory: Quake Wars, where we disabled the built-in frame rate limitation locked at 30fps.
Games supporting DirectX 10 were tested in that mode. With a few exceptions, the tests were performed in the following most widespread resolutions: 1280x1024/960, 1600x1200 and 1920x1200. If the game didn’t support the 16:10 display format, we set the last resolution to 1920x1440. We used the “eye candy” mode everywhere it was possible without disabling HDR/Shader Model 3.0/Shader Model 4.0. Namely, we ran the tests with 16x anisotropic filtering as well as 4x MSAA enabled. We enabled them from the game’s menu. If this was not possible, we forced them using the appropriate settings of the ATI Catalyst and Nvidia ForceWare drivers.
Performance was measured with the games’ own tools and the original demos were recorded if possible. Otherwise, the performance was measured manually with Fraps utility version 2.9.1. We measured not only the average speed, but also the minimum speed of the cards where possible.
We have also included the results for the following graphics accelerators:
This game doesn’t support display resolutions of 16:10 format, so we use a resolution of 1920x1440 pixels (4:3 format) instead of 1920x1200 for it.
That’s an impressive start: the Nvidia GeForce 9600 GT is not only ahead of the ATI Radeon HD 3870 but also very close to the GeForce 8800 GT 512MB despite having 48 fewer shader processors and 12 (24) fewer TMUs! Let’s see what we have in more complex and demanding tests, though.
BioShock doesn’t support FSAA when running in Windows Vista’s DirectX 10 environment. That’s why we benchmarked the cards without FSAA.
The new card cannot repeat the triumph in BioShock as this game uses an advanced engine and DirectX 10 based special effects that call for serious computing capacities. Anyway, the Nvidia GeForce 9600 GT is no worse than the ATI Radeon HD 3870 at 1280x1024 and much better than it at 1600x1200/1680x1050. It is only at 1920x1200 that the lower memory bandwidth shows up, yet the 9600 GT is a mere 6-7% behind the AMD card in that case, too. The performance of the new card from Nvidia is comfortable enough except for the highest tested resolution.
The new GeForce 9600 GT keeps up with the ATI Radeon HD 3870 even where the latter is traditionally strong. Moreover, Nvidia’s card delivers a better minimum speed. At high resolutions Nvidia’s old problem, the inefficient memory management due to driver flaws or the specifics of the memory controller, shows up and the ATI Radeon HD 3870 easily goes ahead.
The overall performance level is very low. The results have no real practical value.
The Nvidia GeForce 9600 GT is as fast as the more expensive GeForce 8800 GT 512MB, showing a slightly lower minimum of speed. It delivers comfortable performance at resolutions up to 1600x1200/1680x1050 inclusive. The ATI Radeon HD 3870 cannot do the same as its minimum speed is below 25fps even at 1280x1024.
The game being too hard at its Very High level of detail, we benchmarked the cards without FSAA to get a more playable speed.
A new-generation game, Crysis puts a high load on the graphics card’s computing units. The Nvidia GeForce 9600 GT, with only 64 such units, cannot compete with the G92- and RV670-based products, including the ATI Radeon HD 3850. Anyway, a below-$200 graphics card is not actually meant for running Crysis at the highest graphics quality settings because even top-end cards cannot cope with that job.
The frame rate is fixed at 30fps in this game as this is the rate at which the physical model is being updated at the server. Thus, this 30fps speed is the required minimum for playing the game.
The new card has a good start, being only 14-15% inferior to the Nvidia GeForce 8800 GT 512MB and delivering comfortable performance, but at 1600x1200 it slows down to the level of the ATI Radeon HD 3850, although ATI’s solutions have never been fast in OpenGL applications. It’s hard to explain this behavior, but it must be due to a lack of execution units because the GeForce 9600 GT has enough graphics memory and memory bandwidth. Its texture-mapping performance shouldn’t be far inferior to that of the ATI Radeon HD 3850/3870, either.
The GeForce 9600 GT outperforms the GeForce 8800 GT 512MB thanks to the higher core frequency. Starting from the 1600x1200 mode these two cards match each other’s performance. It means that Episode Two doesn’t fully employ all the computing capacity of modern top-performance GPUs.
The game doesn’t support FSAA when you enable the dynamic lighting model, but loses much of its visual appeal with the static model. This is the reason why we benchmarked the cards in S.T.A.L.K.E.R. using anisotropic filtering only.
The new card is slower than the Nvidia GeForce 8800 GT 512MB but allows playing at the same resolutions, up to 1600x1200/1680x1050 inclusive. So this result is quite good especially in comparison with the ATI Radeon HD series. The latter is traditionally weak in this test, except the expensive dual-chip ATI Radeon HD 3870 X2.
Forcing FSAA from the graphics card’s driver doesn’t produce any effect as yet. That’s why the game is tested with anisotropic filtering only.
Notwithstanding the perfectly optimized game engine, the Nvidia GeForce 9600 GT doesn’t look brilliant even if compared with the ATI Radeon HD 3850 and Nvidia GeForce 8800 GT 256MB. Epic Games’ project seems to put all of the GPU’s computing capacities to use, and it is in this parameter that the GeForce 9600 GT is greatly inferior to every other graphics card in this test session.
The Nvidia GeForce 9600 GT is only ahead of the ATI Radeon HD 3870 at 1280x1024. At the higher resolutions they deliver the same performance, which is, however, insufficient for comfortable play.
The only resolution at which the GeForce 9600 GT is unable to keep the frame rate at a comfortable level is 1920x1200 pixels, but this card is not actually meant for such high resolutions. The ATI Radeon HD 3870 is far inferior to the new card in terms of average frame rate, but this doesn’t affect the gamer’s experience: the average speed of the AMD card is high enough for this game genre while the minimum speed is about as high as that of the Nvidia card.
The GeForce 9600 GT has fewer ALUs and TMUs in comparison with the GeForce 8800 GT 512MB, but this only shows up in its lower minimum speed. The cards provide similar average frame rates, with a minor advantage on the latter’s part at high resolutions. The ATI Radeon HD 3870 behaves in a similar way, too. None of these cards provide a comfortable speed with enabled FSAA. The Nvidia GeForce 8800 GTS 512MB is the only device capable of that at the highest graphics quality settings.
The current version of the game doesn’t support FSAA, so we performed the test with anisotropic filtering only.
There is almost no difference between the Nvidia GeForce 9600 GT and the GeForce 8800 GT 512MB due to the specifics of the game engine. These cards are also very close behind the GeForce 8800 GTS 512MB. All of them allow playing comfortably at 1920x1200 whereas the ATI Radeon HD 3870 is limited to 1600x1200. That’s quite an ordinary sight for a game developed under Nvidia’s The Way It’s Meant to Be Played program.
The game loses much of its visual appeal without HDR. Although some gamers argue that point, we think TES IV looks best with enabled FP HDR and test it in this mode.
The Nvidia GeForce 9600 GT is somewhat worse than the ATI Radeon HD 3870 as its speed may bottom out below 25fps in open scene at a resolution of 1920x1200. This is hardly noticeable, though, so the two cards are equals as concerns playing this game.
The new add-on to Company of Heroes is tested in DirectX 10 mode only since it provides the highest quality of the visuals.
Games that make use of the advanced DirectX 10 capabilities but do not have optimizations for the superscalar Radeon HD architecture illustrate our point about the identical computing power of the Nvidia GeForce 9600 GT and ATI Radeon HD 3870. We see the same in this add-on to Company of Heroes. Moreover, the AMD card has a lower minimum of speed, probably due to its texture processor architecture with fewer texture filter units.
The game having a frame rate limiter, you should consider the minimum speed of the cards in the first place.
The Nvidia GeForce 9600 GT looks preferable to the GeForce 8800 GT 256MB, like in most other games. It means that as soon as the G94-based card hits the market in mass quantities, the G92-based solution will become obsolete. This is also a result of the inefficient memory management of Nvidia’s cards.
As opposed to its performance in other DirectX 10 games, the ATI Radeon HD 3870 fails this test completely, which makes the Nvidia GeForce 9600 GT the leader in its class. The new card delivers the same performance as the GeForce 8800 GT 512MB at resolutions up to 1600x1200/1680x1050 pixels, being a very appealing buy. You’ll have to disable FSAA and, probably, lower the level of detail to play the game comfortably, though.
The Nvidia GeForce 9600 GT has a lower overall score than both single-core models of Radeon HD 3000 and can only outperform the Nvidia GeForce 8800 GT 256MB.
It’s different in the individual tests: the ATI Radeon HD 3870 is only as fast as the Nvidia GeForce 9600 GT in the second test and far slower in the other two. This is normal as 3DMark05 doesn’t utilize all the capabilities of modern unified-architecture GPUs and is poorly suited for benchmarking their performance even at high resolutions with enabled FSAA.
3DMark06 produces a more realistic measurement. The Nvidia GeForce 9600 GT is only ahead of the ATI Radeon HD 3850 here.
Nvidia’s card is somewhat faster than the ATI Radeon HD 3870 in the SM2.0 tests that do not need serious computing power, but can only compete with the Radeon HD 3850 in the SM3.0/HDR tests that abound in complex, mathematics-heavy shaders. ATI’s superscalar architecture doesn’t show its best here, though: the 64 ALUs of the GeForce 9600 GT are about as effective as the 320 ALUs of the ATI Radeon HD 3850. So, the ATI Radeon HD 3870 owes its victory only to the higher frequency of its core.
In the first test the Nvidia GeForce 9600 GT has the biggest lead over the ATI Radeon HD 3870 and 3850: over 50%. In the second test, which is not so demanding about the texture processor performance, the gap is only 16-17% in favor of Nvidia’s solution.
Both tests of the SM3.0/HDR group suggest a minor advantage of the ATI Radeon HD 3870 over the Nvidia GeForce 9600 GT. The difference with the overall score is due to the use of FSAA. The ATI Radeon HD performs FSAA by means of shader processors whereas the Nvidia GeForce 8/9 traditionally relies on the hardware capabilities of the ROPs. The former approach is more flexible, but results in a bigger performance hit, while the classic method is faster, especially as the most frequently used mode is the classic 4x multisampling.
Without a doubt the new G94 graphics processor is a good solution. It is a real breakthrough in the below-$200 segment if compared with the obviously weak G84. The new GPU is similar to the G92, which has become a sensation in the below-$349 category.
Formally belonging to the ninth generation, the GeForce 9600 GT doesn’t have any new capabilities and is not based on a new architecture as you might have guessed. It is still a DirectX 10.0-supporting chip that offers the same functionality as the GeForce 8800/G80 released a year and a half ago. We don’t say this functionality is not enough, yet users may ask for something fundamentally different from the GeForce 9 series. The main new feature is the new level of performance in every price segment as is demonstrated by the GeForce 9600 GT.
As we have seen in the gaming tests, 64 execution units are enough for most of modern games. We’ve only seen a serious performance hit in comparison with the Nvidia GeForce 8800 GT 512MB in such games as Bioshock, Crysis, Unreal Tournament 3 and Company of Heroes: Opposing Fronts. In a few tests the GeForce 9600 GT was even faster than the more expensive and advanced GeForce 8800 GT 512MB due to the higher frequency of the core. Thus, the new card is indeed a well-balanced solution without bottlenecks, unlike the GeForce 8600 series that was greatly limited by the small number of execution units. Nvidia has corrected the earlier mistakes. There is still a huge gap between the GeForce 9600 GT and GeForce 8600 GTS, so we are likely to see cheaper versions of G94-based cards as they can use rather simple PCBs. Besides everything else, the GeForce 9600 GT wiped out the GeForce 8800 GT 256MB, which suffered from the well-known problem of Nvidia’s cards – inefficient memory management.
The RV670 chip from the former ATI Technologies was unrivalled in the below-$200 sector for a while, but Nvidia has struck back with a heavy blow. Take a look at the table below that shows the current situation in the mentioned price category:
Well, notwithstanding the weaker support on the game developers’ part, the ATI Radeon HD 3870 feels quite confident, especially at high resolutions. But at 1280x1024, which is the main resolution for most gamers interested in below-$200 graphics cards, this card is only superior in four tests, including both versions of 3DMark and the highly demanding Crysis. In six more tests the Radeon HD 3870 delivers the same performance as its opponent. Among these games are Call of Juarez DX10 Enhancement Pack, Hellgate: London and World in Conflict, which are too demanding to run normally on mainstream cards. So, Nvidia’s card obviously suits users whose monitors support resolutions up to 1280x1024 better. If your monitor has a native resolution of 1600x1200 or 1680x1050, you may prefer the ATI Radeon HD 3870.
As for the ATI Radeon HD 3850, it is inferior to the Nvidia GeForce 9600 GT in most tests except Crysis and Unreal Tournament 3, and equal to it in Enemy Territory: Quake Wars, TES IV: Oblivion and Command & Conquer 3. Considering that the recommended price of the 3850 model is $169 while the Nvidia GeForce 9600 GT is going to cost about $190 or even more at first, we can’t say that the new card totally beats the Radeon HD 3850 on the market. As the GeForce 9600 GT gets cheaper, though, the Radeon HD 3850 will feel the pressure unless AMD cuts its price, too. Anyway, it is the buyer who wins because his shopping choice has become broader.
The Gainward Bliss 9600 GT 512MB GS doesn’t ensure a considerable performance gain over the reference card, the growth varying from a mere 3% to 8%, but it offers more important benefits instead. First, it has a quiet and effective cooler. Second, it’s got a wide selection of interfaces, including DisplayPort and S/PDIF TOSLINK. So this card is going to be a good choice for a modern multimedia center that can also be used for playing games. The only prerequisite is that you need a roomy system case that allows installing dual-slot graphics cards. This niche product cannot be cheap, so you are unlikely to find a Gainward Bliss 9600 GT 512MB GS selling at its recommended price. It is going to cost more than $200, we guess. Anyway, this card will surely find its buyer due to its unique characteristics.