by Alexey Stepin, Yaroslav Lyssenko, Anton Shilov
04/17/2007 | 11:50 AM
In November 2006 Nvidia announced its landmark graphics processor G80, which featured a unified architecture and support for Shader Model 4.0 and next-generation DirectX. The new chip turned out to be very complex, incorporating an unprecedented 681 million transistors; the analog section and TMDS transmitters were moved into a separate chip. The G80 processor became the foundation of the GeForce 8800 graphics card series led by the flagship GeForce 8800 GTX.
The G80 features a 384-bit memory bus and has a high power draw, so Nvidia’s new flagship is a large and heavy device, but our tests proved that these drawbacks were made up for by unprecedented performance in games (for details see our article 25 Signs of Perfection: Nvidia GeForce 8800 GTX in 25 Benchmarks).
In 14 of the 23 tests we used in that review the GeForce 8800 GTX was faster than dual-GPU SLI and CrossFire subsystems based on the fastest graphics cards of the older generation, and even faster than the GeForce 7950 Quad SLI system Nvidia had expected so much from. The new card was actually fast enough to make the new high-quality full-screen antialiasing methods, 8x MSAA and 16xQ CSAA, practical. With traditional 4x MSAA and at common display resolutions up to 1600x1200 the GeForce 8800 GTX would often be limited by the CPU. Coupled with anisotropic filtering of even higher quality than the HQ mode of ATI’s cards, this made the GeForce 8800 GTX the unrivalled king of gaming 3D graphics hardware. We also mentioned a few drawbacks then, faulty drivers being the most annoying one.
Anyway, the GeForce 8800 GTX remains the fastest gaming graphics card available and is offered by most major vendors. We must note, however, that this graphics card is made by contract manufacturers and then supplied to Nvidia’s partners, who have little room to personalize their GeForce 8800 GTX to make it unique and more attractive in the potential customer’s eyes. The deepest modification they can make is to replace the cooler, but the stock cooler from Nvidia is already both efficient and quiet. Graphics card suppliers have to look for other ways to distinguish their products, but usually limit themselves to putting their own stickers on the cooler casing.
Are there any exceptions to this rule? In this review we will discuss four GeForce 8800 GTX models offered by ASUS, Foxconn, OCZ Technology and XFX:
We’ll see how original each of these products is and what you should specifically look for when shopping for a GeForce 8800 GTX.
All GeForce 8800 cards are manufactured at Foxconn’s and Flextronics’ facilities and shipped ready-made to Nvidia’s partners. That is why it doesn't make much sense to describe the PCB and the cooler – we have already done this in our earlier review. Here, we will only dwell upon the specific features of the four graphics cards – they do have some peculiarities despite the unified design. GeForce 8800 GTX may also come with different memory, from Samsung or Hynix, that may have an access time of 1 or 1.1 nanoseconds.
Traditionally for this company’s products, the graphics card is packed into a huge box with a handle. The package design is the same as we described in our review of the ASUS EN7950GT and EN7900GS Top. The only difference is the color of the company logo: it is silver here instead of white. Black is still the main color and the picture on the face side of the box still shows a trooper from Tom Clancy’s Ghost Recon Advanced Warfighter. The game is included with the card, of course.
As usual, the contents of the box are neatly laid out into separate compartments and the graphics card itself is wrapped into an antistatic pack and lies on a polyurethane-foam tray. The box is so large that even a GeForce 8800 GTX, with all its length, fits into it crosswise. The ASUS card comes with the following accessories:
There is only one DVI adapter in the package, but we don’t think it is a problem any more because the user of such an expensive and powerful graphics card surely has an LCD monitor with a digital input. The same goes for power supplies. Most modern PSUs offer at least one 6-pin PCI Express power connector whereas advanced models for enthusiasts are equipped with two to four such connectors. The GeForce 8800 GTX uses two external power connectors, so if your PSU has only one, you’ll probably need a splitter. You’ll have a real problem only if your PSU does not have a single 6-pin power connector.
The accessories include a handy pouch for your discs and a software bundle that includes the popular benchmark 3DMark06 and two games. We use Tom Clancy’s Ghost Recon Advanced Warfighter and 3DMark06 in our own tests, while GTI Racing is a top-notch car simulator with modern enough visuals.
We don’t have any gripes about the accompanying documentation. The SpeedSetup guide will help an inexperienced user to install the card. For more details you can refer to the full version of the manual you’ll find on an enclosed CD. ASUS’ exclusive technologies need a special driver to work, but we wouldn’t advise you to install it because that driver is updated less frequently than the official ForceWare. This is especially important as the driver for the GeForce 8800 series is still not free from errors and bugs. Using an older version of ForceWare increases the risk of your having trouble with certain games (some may not even launch on an outdated driver).
All in all, the package and accessories of the ASUS EN8800GTX deserve a good word from us. The card comes with two good gaming titles, a Pro version of 3DMark06, and a pouch for carrying optical discs.
This graphics card uses the reference PCB with a black solder mask and a reference cooler.
This is an absolutely standard GeForce 8800 GTX that only differs from the reference model in having a picture of a trooper from Ghost Recon Advanced Warfighter on the cooler casing and a holographic ASUS logo on the fan. There is an Nvidia logo in the bottom right of the PCB – it is present on all GeForce 8800 GTX cards irrespective of the color of the PCB.
The EN8800GTX comes with Samsung K4J52324QC-BJ11 memory. Its twelve 512Mb GDDR3 chips have a voltage of 2.0V and a rating frequency of 900 (1800) MHz. This is actually the frequency the memory is clocked at by the card, in full compliance with Nvidia’s official specification. The total amount of memory is 768MB with a 384-bit memory bus. The GPU frequencies are 576MHz and 1350MHz for the main and shader domains, respectively. The GPU is configured just like it must be on a GeForce 8800 GTX: 32 TMUs, 24 ROPs, and 128 unified shader processors.
Just like any other GeForce 8800 GTX, the ASUS EN8800GTX requires that you attach two power plugs to its external power connectors. If one connector is not powered up, the card either emits a warning sound or starts up at greatly reduced frequencies, depending on what particular connector is unattached. The power requirements are high: your PSU must have a wattage of 450W and higher and provide a combined current of 30A and higher on the +12V power rail.
The card is equipped with two universal DVI-I connectors and a 7-pin mini-DIN connector that allows you to plug an S-Video cable in directly.
The box with the Foxconn product is large, too, yet it is smaller than the ASUS box and lacks a handle. It is designed in mild colors, mostly dark violet and blue, which make up an abstract, astronomy-themed picture. There is a sticker in the top left corner that informs the user about the advantages he is going to enjoy upon purchasing the card.
Foxconn protected its product against damage during transportation, too. ASUS used a polyurethane-foam tray, but Foxconn additionally put the card into a blister wrap, besides an antistatic pack. The card is firmly fixed in its compartment and is accompanied with the following accessories:
The Foxconn FV-N79GM2D2-HPOC we reviewed before had almost the same accessories, by the way.
A good thing about the accessories is that they include a Foxconn 3D Game Pad.
This gamepad resembles the famous Sony DualShock, which is considered one of the most ergonomically optimal devices in its class. The gamepad case is made from rubberized plastic for additional convenience. It is agreeable to touch and prevents your fingers from slipping off. Besides the main 8-position joystick, the 3D Game Pad offers 10 buttons and two analog controllers. The latter two can react to a press, too, thus increasing the number of available buttons to 12. The gamepad is connected to the PC via a USB port. It is recognized by the OS as a generic gamepad and does not require special drivers. To configure and calibrate it, you just go into the Game Controllers section of the Control Panel in Windows. Without a doubt, it makes a nice addition to the graphics card.
We guess it would be appropriate to have some popular third-person shooter or a fighting game included in the package together with the gamepad, but Foxconn decided differently. The software bundle consists of programs for creating and managing CD/DVD images (VirtualDrive Pro) and for system recovery (RestoreIT). This approach is somewhat more pragmatic and justifiable. You may want to have software that allows creating a virtual optical drive in your system to avoid the troubles associated with the notorious StarForce protection. Gamers also need to restore their systems more often than other users. So, these two programs are indeed helpful and they cost $39.99 each separately. Yet we are dealing with a gaming graphics card – the most advanced such card available today – and one or two games would indeed be appropriate.
The included manual is quite a normal thing. We don’t have any complaints about it. It can help an inexperienced user to install the new graphics card into the system.
The package and accessories of the Foxconn FV-N88XMAD2-ON are appealing due to the free-of-charge items, particularly a 3D Game Pad, with a total cost of $180 (according to the manufacturer). We would be pleased even more if the box contained at least one game. On the other hand, this would raise the retail price of the card and make it less attractive for those people who don’t need the particular games.
For the sake of readability we will be hereafter referring to the Foxconn card as Foxconn GeForce 8800 GTX.
The Foxconn card may come to the market in either a black or a green version. Our sample of the Foxconn GeForce 8800 GTX makes use of a black solder mask and does not differ from Nvidia’s reference card in anything.
The external differences go no further than the stickers on the cooler and fan. The picture of celestial bodies on the cooler casing resembles the picture on the graphics card’s box.
The Foxconn GeForce 8800 GTX employs Samsung K4J52324QC-BJ11 memory. Its twelve 512Mb GDDR3 chips have a voltage of 2.0V and a rated frequency of 900 (1800) MHz. This is the frequency the memory is clocked at by the card, in compliance with Nvidia’s specification. The total amount of memory is 768MB with a 384-bit memory bus. The GPU frequencies are 576MHz and 1350MHz for the main and shader domains, respectively. The GPU is configured according to the official GeForce 8800 GTX specification: 32 TMUs, 24 ROPs, and 128 unified shader processors.
The card is equipped with two universal DVI-I connectors and a 7-pin mini-DIN connector that allows you to plug an S-Video cable in directly.
OCZ Technology is a renowned supplier of high-performance memory modules. Few people remember that this company used to sell graphics cards, too, but gave up that business at the end of 2001. Today, OCZ is something more than just a memory supplier. It is a company that targets PC gamers and enthusiasts with a variety of advanced products and is going to introduce a number of new solutions later this year.
Besides a series of cooling devices that includes very advanced models intended for overclockers, a face-muscle-reading controller is expected – it will read your face to control the game. There will probably be new releases in the input device series, which includes only one mouse model as yet. Of course, modern games cannot be without high-speed graphics cards and OCZ is making ready to introduce a graphics card series, which would include pre-overclocked devices as well, for users with different requirements and capabilities. Now let’s check out the OCZ GeForce 8800 GTX, the first graphics card from the brand in years.
The box with the OCZ Technology GeForce 8800 GTX is smaller than the ASUS one, and it is narrower and closer to a square shape than the box of the Foxconn GeForce 8800 GTX. It is designed simply and laconically in black, white and green colors. This looks fine, and the green raises associations with Nvidia, whose corporate color it is. The picture on the face side of the box shows a sports car with “8800” on the number plate. This is a hint at the high speed the product from OCZ is going to give you.
The box from OCZ keeps the graphics card well protected. The entire volume of the box is filled with a block of polyurethane foam with two rectangular cut-outs for the card and for the accessories. This block is covered from above with a cardboard flap with soft padding. There is a hole in the box for you to have a look at the card. The following is shipped as the card’s accessories:
The accessories to the OCZ GeForce 8800 GTX do not include any special bonuses, yet it is the only graphics card in this review to come with a full set of cables. The brief installation manual is designed like a multilingual poster and contains just the basic facts you want to know to be able to install the card into your PC.
The package of the OCZ card looks fine and protects the product well, but we don’t find any games or useful software in the box. You do get everything necessary to use the card, but there are no nice trifles that would make it special. On the other hand, this may be just what you need if you do not want to pay extra for accessories you’ll never use anyway.
The OCZ product is a copy of the reference card with a black PCB and an Nvidia logo in the bottom right corner. You shouldn’t worry if you encounter a green version of the card, though. The green and black-colored cards are in fact absolutely identical from a technical standpoint.
OCZ Technology changed the sticker on the cooler with its own sticker that shows a sports car and an OCZ logo.
As opposed to the cards from ASUS and Foxconn, the OCZ GeForce 8800GTX makes use of more modern GDDR3 memory chips, Samsung K4J52324QE-BJ1A. QE series chips with the letters BJ in the suffix work at a lower voltage than QC series chips: 1.9V against 2.0V. This leads to a somewhat lower level of power consumption and heat dissipation, the frequency being the same. The suffix BJ1A denotes the fastest version with a rating frequency of 1000 (2000) MHz, but the chips are actually clocked at 900 (1800) MHz on this graphics card, leaving some good reserve for overclocking experiments. The total amount of memory is a standard 768 megabytes accessed across a 384-bit memory bus.
The OCZ GeForce 8800GTX has standard GPU frequencies: 576MHz and 1350MHz for the main domain and the shader domain, respectively. All the GPU subunits are active, namely 32 TMUs, 24 ROPs, and 128 shader processors.
The card has an ordinary configuration of connectors: two universal DVI-I outputs and a 7-pin mini-DIN that allows you to plug an S-Video cable in directly.
XFX approaches the problem of packaging just like ASUS does: the colorful external package contains another one, made of corrugated cardboard. This is the most compact box in this review, but it is thicker than the boxes from ASUS, Foxconn and OCZ. The design is even more laconic than with OCZ: the manufacturer logo and the model name on a white background are accompanied with a picture of a werewolf in the left part of the box.
The box is thick because there is a foam-rubber sheet at the bottom for additional protection during transportation. Just like in the ASUS package, the graphics card is fixed in a separate container. The following accessories are supplied with it:
This is the most frugal set of accessories among all we describe in this review. It doesn’t even include adapters for connecting additional power. XFX probably assumes that the owner of a GeForce 8800 GTX surely has a power supply with two 6-pin PCI Express power connectors. Like the OCZ product, the XFX GeForce 8800 GTX XXX comes without additional accessories.
The brief installation guide describes how to install the card into your system and contains recommendations on how to cool the system properly. It also offers information about compatibility of different video cables and adapters with 4-pin, 7-pin, and 9-pin connectors employed in modern graphics cards.
Like with the OCZ GeForce 8800 GTX, the packaging is good in terms of aesthetics and protection it provides to the product, but the accessories are insufficient due to the lack of PCI Express power adapters. We guess many users are going to agree with us on that point. Hopefully, the manufacturer will correct that drawback.
The XFX graphics card is the only one in this review to be made on a PCB with a green solder mask. It is thus easy to tell it from other three GeForce 8800 GTX included into this review.
Besides the color of the PCB, the XFX GeForce 8800 GTX XXX differs from the others in the picture on the cooler’s casing. The sticker shows a copy of the picture on the face side of the card’s box.
Just like the OCZ GeForce 8800GTX, the XFX GeForce 8800 GTX XXX Edition is equipped with Samsung K4J52324QE-BJ1A memory. The words “XXX Edition” mean increased clock rates, and the memory chips are clocked at 1000 (2000) MHz, which is in fact their rating frequency but higher than the reference card’s memory frequency. This increases the memory bandwidth from 86.4GB/s to 96GB/s for higher performance in high resolutions with high-quality FSAA modes supported by the GeForce 8800 architecture. The amount of memory and the memory bus width of the XFX card are standard, of course, i.e. 768 megabytes and 384 bits, respectively.
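The bandwidth figures quoted above follow directly from the bus width and the effective (double data rate) memory frequency. As a minimal sketch, assuming 1 GB/s = 10^9 bytes/s, which is the convention these figures use:

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
# GDDR3 is double data rate: a 900 MHz base clock means 1800 MHz effective.

def bandwidth_gbs(bus_width_bits: int, effective_mhz: int) -> float:
    """Peak memory bandwidth in GB/s (1 GB/s = 10^9 bytes/s)."""
    return bus_width_bits / 8 * effective_mhz * 1e6 / 1e9

print(bandwidth_gbs(384, 1800))  # reference GeForce 8800 GTX: 86.4
print(bandwidth_gbs(384, 2000))  # XFX XXX Edition: 96.0
```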
The GPU is pre-overclocked by the manufacturer as well. A frequency of 630MHz is specified for the main GPU domain, but the real frequency of this domain is 621MHz due to the peculiarities of the GeForce 8800 clock generator. This parameter influences the scene fill rate and performance of the ROPs and is going to provide a speed bonus in scenes that need a high fill rate as well as in high resolutions with enabled high-quality antialiasing.
Perhaps considering the frequency of 1350MHz already high for the shader processors, XFX did not increase it. As a result, the shader-to-core frequency ratio of the XFX GeForce 8800 GTX XXX Edition is about 2.1 rather than the usual ~2.3. You should keep this in mind if you are going to overclock the card further. The GPU has 32 TMUs, 24 ROPs, and 128 unified shader processors.
The XFX GeForce 8800 GTX XXX Edition has two universal DVI-I outputs and a 7-pin mini-DIN connector that allows you to plug an S-Video cable in directly.
Our earlier tests showed that the reference cooling system Nvidia installs on its new-generation graphics cards is not only efficient, but also quiet. It rivals the leaders of the past in that parameter. We mean the coolers of the GeForce 7900 GTX and Radeon X1950 XTX. Despite the increased frequency of the main domain, the cooler of the XFX GeForce 8800 GTX 768MB DDR3 XXX Edition behaves just like the coolers of the non-overclocked versions of GeForce 8800 GTX, i.e. very quietly even under 3D load.
With 681 million transistors in the graphics core and a shader processor frequency of over 1GHz, G80-based solutions can’t possibly have low power consumption. We performed our power consumption tests on a special testbed equipped with connectors for measuring instruments.
The measurements were performed with a Velleman DVM850BL multimeter (0.5% accuracy).
We loaded the GPU by launching the first SM3.0/HDR graphics test from 3DMark06 and running it in a loop at 1600x1200 resolution and with enabled 4x full-screen antialiasing and 16x anisotropic filtering. The Peak 2D load was created by means of the 2D Transparent Windows test from Futuremark’s PCMark05 benchmarking suite. We did not test the XFX GeForce 8800 GTX XXX Edition separately. Its shader processors, which are the biggest consumer in the GeForce 8800, are clocked at the same frequency as the reference card’s, while the growth of the frequencies of the main domain and memory is not as big as to influence the card’s power draw much.
As we wrote in our earlier review, the reference GeForce 8800 GTX from Nvidia refused to run on our Intel Desktop Board D925XCV for unclear reasons. Early samples of reference GeForce 8800 GTX must have had some BIOS-related problems since the newer samples with other BIOS versions work correctly on the mentioned mainboard.
Click to enlarge
We found out in our review of the MSI NX8800GTS-T2D640E-HD-OC that the GeForce 8800 GTX needed just a little more power than the AMD Radeon X1950 XTX did and its 12V lines, including the internal one, would each bear a load of 40-43W. The GeForce 8800 GTX is not economical in 2D mode because the card does not lower its clock rates then.
Judging by the results, the GeForce 8800 GTX does not really need a very special power supply. A high-quality 450W unit should suffice unless you want to build a SLI subsystem out of two such graphics cards. If you do, you’ll need a 600W or higher PSU with four 6-pin PCI Express connectors. If your PSU has only two such connectors, you’ll have to use power adapters included with the card and power the second card from Molex connectors of the PSU. Purchasing an extremely expensive 900-1000W PSU is not actually necessary even for a SLI configuration with two GeForce 8800 GTX.
The GeForce 8800 series has a sophisticated GPU frequency distribution. The main GPU domain can change its frequency with a variable step of 9, 18 or 27MHz, but the shader processor domain can change its frequency with a step of 54MHz only. The problem is that many overclocking programs can work with the main domain frequency only. In the standard GeForce 8800 GTX the frequency ratio is 1 to ~2.34, but pre-overclocked versions of the card may have a different ratio and their frequencies are going to change in a different way during overclocking. Fortunately, some overclocking tools, e.g. the latest version of RivaTuner, can report the real frequencies of the GPU domains to make the process of overclocking easier.
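This quantization explains why a specified clock and the real clock can differ, as with the XFX card’s 630MHz-specified, 621MHz-real main domain. A minimal sketch, assuming the clock simply snaps down to the nearest available step (the real clock generator may round differently):

```python
def snap_clock(target_mhz: int, step_mhz: int) -> int:
    """Round a requested clock down to the nearest multiple of the step.
    Step sizes (27 MHz for the main domain in this example, 54 MHz for
    the shader domain) are taken from the text; rounding-down is an
    assumption, not a documented behavior."""
    return target_mhz // step_mhz * step_mhz

print(snap_clock(630, 27))   # specified main clock -> real 621 MHz
print(snap_clock(1400, 54))  # requested shader clock -> real 1350 MHz
```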
There is also a simple formula to calculate the frequency of the GeForce 8800’s shader processors during overclocking with a high enough precision:
OC shader clk = Default shader clk / Default core clk * OC core clk,
where OC shader clk is the (approximate) resulting frequency, Default shader clk is the initial shader processor frequency, Default core clk is the initial core frequency, and OC core clk is the frequency of the overclocked core.
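The formula above can be sketched in a few lines of code; the 650MHz example core clock is just an illustration, not a measured result:

```python
def oc_shader_clk(default_shader: float, default_core: float,
                  oc_core: float) -> float:
    """Approximate shader clock after a core overclock, per the formula
    above: the shader clock scales proportionally with the core clock."""
    return default_shader / default_core * oc_core

# Reference GeForce 8800 GTX: 576 MHz core, 1350 MHz shader domain.
# Overclocking the core to a hypothetical 650 MHz:
print(oc_shader_clk(1350, 576, 650))  # ~1523 MHz, before 54 MHz snapping
```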
Based on this information and on the frequency monitoring data reported by RivaTuner, we attempted to overclock the graphics cards included in this review. Here are the results:
The ASUS has the best result in this test, which is remarkable as it does not have 1ns memory as the cards from OCZ and XFX have. The OCZ version has a lower memory frequency than the leader, probably due to the lower voltage of Samsung’s QE series chips. The memory chips of the XFX GeForce 8800 GTX XXX Edition are not much better at overclocking, either. Coming with a pre-overclocked main domain, this card has the smallest growth of the shader processor frequency as well. The XFX may be experimented with further by modifying it and updating its BIOS with a different value of the shader processor frequency – the card may show better overclockability then. The Foxconn FV-N88XMAD2-ON looks worse than the products from ASUS and OCZ, but its shader block frequency is higher than that of the XFX GeForce 8800 GTX XXX Edition, which may be more valuable in some cases than a higher main domain frequency.
These results should not be viewed as a direct recommendation to purchase the winner as having the highest overclocking potential. Your overclocking success depends largely on the abilities of the particular sample of the graphics card, besides a lot of other factors.
During our comparative testing of the four GeForce 8800 GTX graphics cards we used the following hardware platforms:
Since we believe that the use of tri-linear and anisotropic filtering optimizations is not justified in this case, the graphics card drivers were set up in the standard way to provide the highest possible quality of texture filtering.
We selected the highest possible graphics quality level in each game. We didn’t modify the games’ configuration files. Performance was measured with the games’ own tools or, if not available, manually with Fraps utility. We also measured the minimum speed of the cards where possible.
Since the GeForce 8800 GTX belongs to the most expensive high-end graphics solutions, the users buying these graphics cards will most likely have a monitor with an over-24” diagonal at their disposal. Therefore, we decided to include the 2560x1600 resolution in the list of our test modes. If a game didn’t support a resolution that high, we stuck to the standard set of test modes: 1280x1024/960, 1600x1200 and 1920x1200. We used the “eye candy” mode wherever it was possible without disabling HDR or Shader Model 3.0 effects that improve image quality. We enabled FSAA and anisotropic filtering from the game’s menu. If this was not possible, we forced them using the appropriate driver settings in ATI Catalyst and Nvidia ForceWare.
Besides the Asustek, Foxconn, OCZ and XFX solutions we have also included the following graphics cards into our test session:
Since the graphics cards from Asustek, Foxconn and OCZ are shipped with reference frequencies, we decided not to include all four graphics cards into the performance charts, as their results are identical. Moreover, we didn’t include the numbers for overclocked graphics cards as well, because the overclocking results vary from card to card and cannot be guaranteed.
For our tests we used the following games and benchmarks:
The game does not support resolutions with an aspect ratio of 16:9 or 16:10 without your adding special parameters to its launch shortcut, but our reviews are intended for a large audience of gamers and we try to use only those settings the game itself offers. Besides that, even with the additional parameters the support of widescreen resolutions is implemented shabbily in Battlefield 2142. Some 2D graphics, like the HUD, may not display, for example. That is why we tested this game in 4:3 resolutions only, i.e. 1600x1200, 1920x1440 and 2048x1536 pixels.
Even with enabled 4x FSAA each version of GeForce 8800 GTX provides enough performance for you to be able to play this game at a resolution of 2048x1536. The speed is never lower than 40fps, which means you won’t lose smoothness and accuracy of control even in the very heat of a battle. As expected, the pre-overclocked XFX card shows its superiority over the other three GeForce 8800 GTX beginning with 1920x1440 resolution. This advantage amounts to 9-10%, which is quite good considering that the XFX’s shader processors are not overclocked.
The game is very demanding at its maximum graphics quality settings that make use of SM3.0 and HDR and the GeForce 8800 GTX cannot yield even 50fps at 1600x1200. The speed bottoms out to 21fps in the most complex scenes, so there is not much playing comfort here.
On the other hand, Call of Juarez is not a typical shooter, but includes such third-person shooter elements as “slo-mo”/”Matrix mode”. Subjectively, the game is quite playable on the GeForce 8800 GTX with enabled HDR, although the numbers may seem to suggest otherwise. However, you’ll have to limit yourself to 1600x1200 or even to 1280x1024 resolution for more playing comfort.
Somehow the game does not like the AMD Radeon X1950 XTX, but the dual-GPU GeForce 7950 GX2 feels much more confident than the new-generation GeForce 8800 GTS. The difference is small, though, whereas the GeForce 8 provides a higher-quality anisotropic filtering than the GeForce 7.
Since the game limits the fps rate, we should consider minimal performance, because it determines the smoothness of the virtual troops control.
All GeForce 8800 GTX models are beyond any competition and provide a sufficient minimum speed. Even at 1920x1200 it equals 25fps, which is enough for maximum comfort in Command & Conquer 3. None of the less powerful graphics cards can ensure a minimum speed like that, although the AMD Radeon X1950 XTX, for instance, comes close to the required performance level.
As for those who own a GeForce 7950 GX2 graphics card, we can only sympathize with you, guys: the average performance of this graphics adapter drops below 25fps at 1600x1200 already. Moreover, the minimum frame rate is only 10fps, so forget about gaming comfort altogether. You should blame Nvidia’s aggressive policy aimed at ousting the GeForce 7 family from the market and replacing it with the GeForce 8. Although the new graphics card family demonstrates impressive results, a lot of gamers out there still own previous-generation Nvidia solutions, and unfortunately, the most vulnerable ones are those who purchased a GeForce 7950 GX2 hoping to get unprecedented performance. These dual-chip graphics cards depend a lot on game-specific support in the graphics card drivers. However, since the official ForceWare for the GeForce 7 family was last updated over half a year ago, there is hardly any hope for GeForce 7 SLI to function normally in new games.
The game can refuse to run at the eye candy settings on the Ultra level of detail. We benchmarked the graphics cards with anisotropic filtering only.
Nvidia’s flagship product shows a good enough speed at a resolution of 2560x1600 pixels. Considering the requirements of the genre, this is quite enough for normal play, although you have to be ready for slowdowns in certain scenes. As opposed to Battlefield 2142, the XFX GeForce 8800 GTX XXX Edition outperforms graphics cards with the reference frequencies by 11% at 1600x1200 and by 14% at 1920x1200. The gap is smaller, at 8%, in the resolution of 2560x1600. The pre-overclocked card from XFX does not open up new display resolutions for the gamer, though. This often happens with overclocked hardware.
The AMD Radeon X1950 XTX looks good even against the GeForce 8800 GTS, making the resolution of 1920x1200 playable.
The XFX benefits little from the factory overclocking in this game. Here, the architectural features of the G80 with its division into separate domains help us see that Gothic 3 depends more on the computing power of the shader processors than on the scene fill rate. Since the game does not support full-screen antialiasing, the influence of the memory subsystem on its speed is rather low, at least when the graphics card has a 384-bit memory bus and a memory frequency of 900 (1800) – 1000 (2000) MHz.
Anyway, we can say that the GeForce 8800 GTX allows playing with comfort not only at a resolution of 1920x1200, but also at 2560x1600. Although the game uses a first-person view, it is not a typical first-person shooter and you don’t need a frame rate of 60fps to enjoy it fully. Owners of monitors with a diagonal of 30” will be able to play this game in the monitor’s native resolution and the impression won’t be spoiled by defects of the scaling-down.
Starting with version 1.04 this game allows using FSAA, so we tested it in the eye candy mode. Neverwinter Nights 2 declares support for HDR, yet this mode is not fully implemented by the developer and is not operational in the current version of the game. Perhaps this situation will change with more patches.
The increased frequency of the main domain helps the XFX GeForce 8800 GTX XXX Edition remain ahead of the other three versions of the GeForce 8800 GTX. The gap amounts to 14-16% at 1600x1200 and 1920x1200, but diminishes to 8% at 2560x1600. Neverwinter Nights 2 needs a frame rate above 15fps to be playable, so we won’t recommend using the highest display resolution. Even if your monitor supports it, you’d better limit yourself to 1920x1200; otherwise the gameplay may not be quite as comfortable as you wish even on an overclocked GeForce 8800 GTX.
The GeForce 8800 GTS and GeForce 7950 GX2 look good among the slower solutions. These two allow playing the game normally at a resolution of 1600x1200 pixels.
Note that although the GeForce 7950 GX2 (and the GeForce 7950 Quad SLI system based on two such cards) was once positioned by Nvidia as a premium-class solution for playing at extremely high resolutions, it now suffers a terrible performance hit on switching from 1920x1200 to 2560x1600 pixels. We can’t explain this behavior because the amount of memory can’t be the cause – the Radeon X1950 XTX loses much less speed on making the same transition. Perhaps this is a problem of SLI technology the dual-chip graphics card is based upon.
Adding the “-ll” switch to the game’s launch shortcut, we successfully started it up on our GeForce 8800 cards. With this switch and ForceWare 97.92 the game could start up, but it did not allow using FSAA and HDR simultaneously. So, we enabled the latter option for our tests as it has a bigger effect on image quality.
Surprisingly, the XFX GeForce 8800 GTX XXX Edition fell behind its three opponents clocked at the reference frequencies. Fortunately, the difference is small, at only 5%. Each card coped with this game successfully, providing acceptable performance at 1600x1200, and the Radeon X1950 XTX, despite its much more modest technical characteristics, is a mere 10% behind them at that resolution. You shouldn’t replace your X1950 XTX with a GeForce 8800 GTX just to play Splinter Cell series games, because this wouldn’t bring a critical improvement in performance.
The GeForce 7950 GX2 cannot boast high results even at a resolution of 1280x1024 pixels. The lack of a SLI profile for Splinter Cell: Double Agent in the ForceWare driver results in the card’s using only one GPU of the two. Anyway, the GeForce 7 is an outdated solution. Hopefully, this problem won’t affect Nvidia’s upcoming dual-chip cards.
The average performance of all the GeForce 8800 GTX models hardly depends on the resolution, at least until we get to 2560x1600. The minimum performance doesn’t vary much either, staying within 32-34fps, so you can choose any resolution you want as long as your monitor supports it.
Among the previous-generation solutions, only the GeForce 7950 GX2 looks competitive enough, if we disregard its lower-quality anisotropic filtering. At 1600x1200 it is just a little behind the GeForce 8800 GTX and runs almost as fast as the GeForce 8800 GTS. The minimum performance of Nvidia’s former flagship solution doesn’t drop below 25fps.
Unfortunately, we can’t say the same about the AMD Radeon X1950 XTX, which cannot get beyond 40fps at 1600x1200; its minimum performance of less than 25fps cannot ensure comfortable gameplay. It looks like the AMD solution, featuring only 16 TMUs, simply lacks fill rate, because S.T.A.L.K.E.R. boasts a lot of wide open spaces with rich vegetation, numerous buildings, anomalies, etc.
The GeForce 8800 GTX suffers an almost twofold performance hit when switching from 1920x1200 to 2560x1600. Although the minimum speed does not drop below a critical level, we wouldn’t recommend playing the game in this mode: you should either use two such cards in an SLI subsystem or disable FSAA.
The XFX GeForce 8800 GTX XXX Edition adds a 10% bonus to the average frame rate, yet it cannot reach even 45fps at 2560x1600 with 4x FSAA enabled. At 1920x1200 the GeForce 8800 GTX can run the game normally even without the 10% speed bonus the XFX card provides. So, any version of the GeForce 8800 GTX suits this game equally well.
Having much more modest technical characteristics, the GeForce 8800 GTS is far slower than the senior model of the series. Still, you can play the game at 1600x1200 on a GeForce 8800 GTS.
As for the last-generation graphics cards, the Radeon X1950 XTX can run Supreme Commander at an acceptable speed in 1600x1200 with enabled 4x FSAA. You’ll have to limit yourself to 1280x1024 if you’ve got a GeForce 7950 GX2.
We have tested four GeForce 8800 GTX models offered by Nvidia’s partners. To make your shopping choice easier, we want to analyze which graphics card is better from different aspects such as performance, overclockability, originality, and accessories.
It’s all clear in this category. Unlike the other three versions, the XFX GeForce 8800 GTX 768MB DDR3 XXX Edition is pre-overclocked by the manufacturer (its main GPU domain and memory frequencies are higher than those of the reference card). This ensures a certain performance gain, from a few percent to 14-16% depending on the particular application and display mode. The gain is 8-10% on average.
We can’t say this is a critical advantage because it does not make a higher resolution playable in any of the tested games. On the other hand, having even this rather modest performance gain can’t be bad.
The products from ASUS, Foxconn and OCZ deliver identical performance as they are all clocked at the frequencies of the reference card. They are indistinguishable from this aspect.
If you have ever tried to overclock anything, you should be aware that your overclocking success is largely determined by a combination of random factors.
It is hard to make any recommendations in this area, but the XFX again stands out from the others due to its non-standard frequencies. It is quite possible that some samples of this graphics card will be able to work at a main domain frequency higher than 648MHz, but the shader processors will still overclock worse because their frequency delta is smaller than on a card that starts overclocking from the standard 576/1350MHz. We guess there will be few users, even among overclockers, who will attempt to correct the frequency delta and re-flash the card’s BIOS. That is why the XFX GeForce 8800 GTX 768MB DDR3 XXX Edition is not a good choice for an overclocker.
We had expected better overclockability from graphics cards with Samsung’s K4J52324QE-BJ1A memory, but were disappointed. It was the ASUS EN8800GTX with K4J52324QC-BJ11 chips that had the highest memory frequency while the two cards with QE series chips could hardly overcome the 1000 (2000) MHz mark. On the other hand, the Foxconn card, which has the same memory as the EN8800GTX, reached only 1000 (2000) MHz, so the memory type cannot be a guiding factor here.
Thus, if you want to overclock, you should consider the ASUS EN8800GTX. If you don’t care about overclocking, just choose any GeForce 8800 GTX you like. Overclocking the XFX GeForce 8800 GTX 768MB DDR3 XXX Edition involves some difficulties, and this card won’t suit you if you want to overclock, but do not want to update the card’s BIOS.
These graphics cards have nothing original about them because they are just ordinary GeForce 8800 GTX cards manufactured at Foxconn’s or Flextronics’ facilities for Nvidia and then shipped to Nvidia’s partners. Some personalization is achieved by the graphics card vendors through their own stickers, but whether you like these is a matter of personal taste. The XFX card differs in the green color of its PCB, but other companies’ GeForce 8800 GTX cards may be green, too (we guess it depends on which contract manufacturer made the particular card). All four described graphics cards have the reference cooler from Nvidia and identical noise characteristics, so we can’t prefer any one of them to the others in this respect, either.
What puts one top-end graphics card apart from another is accessories. Every graphics card vendor wants to make its product special by offering some exciting accessory.
The OCZ Technology GeForce 8800GTX and XFX GeForce 8800 GTX 768MB DDR3 XXX Edition both have a scanty set of accessories, and the latter even lacks 2xMolex → 6-pin PCI Express power adapters. So, it is the Asus EN8800GTX/HDTP/768M and the Foxconn FV-N88XMAD2-ON that compete here.
The ASUS comes with a better software bundle which includes two full games, Tom Clancy’s Ghost Recon Advanced Warfighter and GTI Racing, as well as a Pro version of the Futuremark 3DMark06 benchmarking suite. This is a strong argument in favor of the ASUS card for those who are interested in those two games.
The accessories of the Foxconn FV-N88XMAD2-ON may seem more interesting to some users, though. Besides two useful programs, the box contains a high-quality gamepad that makes a nice companion to the graphics card. This is a more pragmatic approach: some people may just not like the included free games, whereas the programs for managing disc images and for system recovery are going to be useful for every gamer. And the included gamepad will surely be appreciated by everyone who plays third-person shooters and fighting games. We like Foxconn’s approach better than ASUS’, but the final decision is yours.
Now we can choose the best buy in each of the mentioned categories.
Speaking in general, each of the discussed versions of Nvidia GeForce 8800 GTX will make a good buy. They do not differ much from each other and can ensure excellent performance in modern games, even making the colossal resolution of 2560x1600 pixels playable in some cases.