by Alexey Stepin , Yaroslav Lyssenko
04/03/2009 | 05:29 PM
As we wrote in an earlier article, the Nvidia GeForce GTS 250 series does not offer anything really new or innovative, yet it is a good combination of technical characteristics and price. The model with 1024 megabytes of graphics memory and a full-featured G92b core has a recommended price of $149. The higher-class GeForce GTX 285 comes at an official recommended price of $399, and its retail price now starts from as low as $340.
We know that Nvidia’s single-chip flagship can successfully oppose the ATI Radeon HD 4850 X2 as well as a CrossFireX tandem consisting of two ordinary Radeon HD 4850 cards. Considering that two GeForce GTS 250 1GB cards cost about $300, we wonder how good this pair would be in SLI mode. Theoretically, this configuration has a chance to beat the GeForce GTX 285 because it has more shader processors (256 against 240) and more texture processors (128 against 80). Additionally, the G92b chip can be clocked at higher frequencies than the G200b.
So, it is going to be interesting to compare a GeForce GTS 250 SLI tandem with a GeForce GTX 285. This comparison will remain relevant until the latter’s retail price declines to $300 or lower, which will not happen quickly due to the high manufacturing cost of the G200b chip. Of course, running two graphics cards in multi-GPU mode has its downsides, and your mainboard must support the appropriate technology. Such a configuration will also depend on software optimizations, but we guess that the opportunity to get more speed in modern games than from a GeForce GTX 285 will appeal to many gamers. We are going to make this comparison in this review and see if building such a configuration makes any sense.
We will also tell you about two new and interesting GeForce GTS 250 1GB models, each with its particular highs and lows. These are EVGA GeForce GTS 250 Superclocked Edition and Gigabyte GV-N250OC-1GI.
The EVGA card comes in unified packaging we have already described in our earlier reviews of the company’s products. The box is somewhat smaller than the ones with EVGA’s GeForce GTX 285/295 series cards, though. The distinguishing feature of the EVGA GeForce GTS 250 series is the green strip going along the top of the package.
The memory type is indicated incorrectly: DDR3 instead of GDDR3. Otherwise, we can find no fault with the packaging. It looks neat and austere, following EVGA’s traditional style. The words Superclocked Edition are applied by means of a transparent sticker. As usual, there is a window in the back panel of the box for you to see the part of the graphics card with the serial number. Below the window is a sticker with the same number. When buying the card you must make sure the two numbers coincide. This proves the authenticity of the product and allows you to take part in the EVGA Step-Up program.
Inside the box there is a plastic container in which the graphics card is fixed securely. It protects the card against damage, although we guess foam-rubber padding would be even better. Besides the card, the box contains the following accessories:
The accessories are frugal but you can hardly expect something else from a $149 product, especially as there is everything necessary to install and use the card. Still, we guess the manufacturer might have included a DVI-I → HDMI adapter.
As usual, the included disc contains drivers, an electronic version of the user manual, a full version of the Fraps utility which is highly popular among gamers, and the exclusive EVGA Precision tool that can be used to overclock the graphics card and monitor its temperatures.
So, the quality and design of the card’s packaging are good overall, but the accessories might have been better. We guess such things as a DVI-I → HDMI adapter and S/PDIF cable would not have affected the price of the product much. That’s not a serious problem, however. And EVGA should be given credit for taking care about owners of older PSUs that don’t offer 6-pin power connectors for graphics cards.
The EVGA GeForce GTS 250 Superclocked uses a reference PCB developed by Nvidia especially for the new graphics card series. We don’t know why they didn’t use the very good PCB of the GeForce 8800 GTS 512. Perhaps the reason is of a purely marketing nature. Notwithstanding the aged G80/G92 architecture, Nvidia wants to show that its development department has not been idle.
Well, the new PCB is quite a good one too, being compact and easy to use. It is equipped with one 6-pin PCI Express 1.0 connector. This PCB comes with either black or green coating. Our card is black as you can see.
The GPU voltage regulator is based on a four-phase PWM controller ON Semiconductor NCP5388. We have seen this controller in the new version of GeForce GTX 260 Core 216 and in some non-reference cards from Gainward and Palit. Each of the regulator’s phases incorporates three power MOSFETs. The card’s simple memory voltage regulator is based on an uP6161N chip manufactured by uPI Semiconductor. As we have mentioned above, the reference PCB of the GeForce GTS 250 has only one connector for external power supply, which is a logical solution considering the low power consumption of the 55nm G92 chip even when the latter is clocked at a high frequency.
The card is equipped with eight GDDR3 memory chips placed in a semicircle around the GPU. These Hynix H5RS1H23MFR-N2C chips have a capacity of 1Gbit (32Mb x 32), a voltage of 2.0V, and a rated frequency of 1200 (2400) MHz. This is the fastest memory in Hynix’s GDDR3 product range. According to Nvidia’s official specs, the GeForce GTS 250 has a memory frequency of 1100 (2200) MHz, but the described card from EVGA belongs to the Superclocked series and its memory chips are pre-overclocked to 1123 (2246) MHz. That’s not much of an overclock, though: with a 256-bit memory bus, the memory bandwidth is only increased from 70.4 to 71.9GBps.
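The bandwidth figures above follow directly from the memory clock and bus width. A minimal sketch (the function name is our own illustration):

```python
def gddr3_bandwidth_gbps(base_clock_mhz: float, bus_width_bits: int = 256) -> float:
    """Peak bandwidth in GB/s: base clock x 2 (DDR transfers) x bus width in bytes."""
    effective_mhz = base_clock_mhz * 2          # GDDR3 transfers data twice per clock
    return effective_mhz * 1e6 * bus_width_bits / 8 / 1e9

print(gddr3_bandwidth_gbps(1100))   # reference GTS 250: ~70.4 GB/s
print(gddr3_bandwidth_gbps(1123))   # EVGA Superclocked: ~71.9 GB/s
```

The same formula with the GeForce GTX 285’s 512-bit bus shows why the senior card keeps a large bandwidth advantage despite the SLI tandem’s higher shader count.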
The GPU-Z tool still cannot identify the G92 revision correctly and tell it from the G92b. It is a G92b chip that is at the heart of this graphics card, of course: revision B1, manufactured on 55nm process technology. Our sample is dated the 44th week of 2008, i.e. the end of October. Its main and shader domains are pre-overclocked from the standard 738/1836MHz to 771/1890MHz, respectively. This modest frequency growth will hardly add much performance in 3D games, but nothing prevents you from trying to overclock the card further. The GPU configuration is standard with 128 unified execution units, 64 texture processors, and 16 raster back-ends.
Otherwise, there is nothing special about the EVGA GeForce GTS 250. It is equipped with a standard set of interface connectors including two dual-link DVI-I ports, a 7-pin analog video output, two MIO connectors for Triple-SLI support, and an S/PDIF output.
The cooling system deployed on the card continues the well-known series of Nvidia’s reference coolers that traces its origin back to the GeForce 8800 GTX.
The main component of the system is a heatsink consisting of thin aluminum plates and connected to the copper heat-exchanger with two heat pipes. The heatsink is cooled by a blower with 4-pin connection. The hot air is exhausted out of the system case through the slits in the card’s mounting bracket. All of the system’s elements are fastened on an aluminum base that has juts opposite hot components such as the voltage regulator’s MOSFETs and memory chips. Here, dark-gray elastic pads are used as a thermal interface instead of traditional fiber pads soaked in thermal grease. This solution must be cheaper and easier to implement. Rather liquid light-gray thermal grease is applied between the heat-exchanger and GPU.
The cooling system is covered with a black plastic casing with EVGA sticker. Besides the color of the PCB, this is the only element that differentiates this product from other GeForce GTS 250 cards that use the reference PCB design. The cooling system is fastened to the PCB with a lot of spring-loaded and ordinary screws, forming nearly a single whole with the latter. There is no risk of the GPU die getting damaged through a misalignment of the cooler.
This cooler design has proved its worth before. It combines good cooling performance with low level of noise. We don’t have a reason to think that it won’t cope with the GeForce GTS 250.
Gigabyte’s box is slimmer but wider than the one with EVGA’s GeForce GTS 250 Superclocked. It is embellished with a picture of a war robot that is meant to symbolize the huge might of this graphics card.
The type and amount of graphics memory are indicated correctly. The text on the box also mentions that the GV-N250OC-1GI belongs to the Ultra Durable VGA series in which Gigabyte focuses on quality and reliability ensured through the use of top-class components such as polymer capacitors, low RDS(on) MOSFETs, select memory chips, etc. The 2oz Copper PCB technology needs a special mention, but we will discuss it in more detail later on, following the course of this review. The text “GDDR3 OverClock 1.1GHz” is somewhat incomprehensible because the memory frequency of 1100 (2200) MHz is actually standard for the GeForce GTS 250.
There is a white cardboard box inside the colorful wrapper. The graphics card is fixed within a foam-rubber tray. The following accessories are placed nearby:
Like with the EVGA GeForce GTS 250 Superclocked, the accessories are far from gorgeous but sufficient, especially considering the inclusion of the S/PDIF cable. The HDMI→DVI-I adapter complements the card’s three standard output interfaces: DVI-I, HDMI and D-Sub/VGA. This product seems to be positioned by Gigabyte as a universal solution suitable for both inexpensive gaming stations and advanced multimedia HTPCs.
Besides drivers, the included disc contains Gigabyte’s Gamer HUD Lite utility that is similar to EVGA Precision in functionality. The full version of that tool offers the option of fine-tuning the GPU voltage, while the Lite version only allows you to optimize the power consumption of the card in 2D mode by reducing the clock rates and voltage of the graphics core. Anyway, this is still a full-featured tool for overclocking and monitoring the GPU.
So, the packaging of the Gigabyte GV-N250OC-1GI does not look as good as the box with the EVGA GeForce GTS 250 Superclocked. The picture of the robot is just too trivial and boring. However, this packaging offers more protection and contains somewhat better accessories than what you get with the EVGA product. And as these properties are more important than the exterior design, Gigabyte wins this round.
As opposed to the EVGA GeForce GTS 250 Superclocked, Gigabyte’s card uses a unique PCB developed especially for the Ultra Durable series. The PCB is 2 centimeters shorter than the reference card’s, measuring only 21 centimeters.
This compactness is mainly achieved by compacting the GPU voltage regulator using small low RDS(on) MOSFETs. But like on the reference PCB, the regulator has four phases and is based on an ON Semiconductor NCP5388.
The memory voltage regulator is based on two STMicroelectronics D1703L MOSFETs and seems to be controlled by the miniature chip marked as CD-N2B and designated as U1 on the PCB. The card has only one power connector but it is placed at the shorter edge of the card. This should not be a problem even for compact system cases considering the shorter length of the PCB. The stricken-out letters Pb indicate that lead and its compounds are not used in the manufacturing process.
As we promised above, we will now tell you about the unique technology used in some Gigabyte products including the GV-N250OC-1GI. It is called 2oz Copper PCB. What do those ounces stand for? Marketing folks often bring confusion into descriptions of even simple things and technologies. Here, the point is that the PCB’s metallization layers are double the usual thickness: 70 rather than 35 micrometers. Copper weight on PCBs is traditionally measured per square foot, and doubling the thickness means the copper weighs 2 ounces per square foot rather than the standard 1 ounce. That’s why the name is 2oz Copper PCB. This technology is not just pure marketing, though. Hot components like the GPU or the power transistors of the voltage regulator heat up the PCB underneath them. And as copper has excellent thermal conductivity, the thicker metallization layers should improve the PCB’s ability to dissipate heat and thus help in cooling those components. Gigabyte claims that the GPU temperature of Ultra Durable VGA series cards with 70-micrometer copper interconnects is 5 to 10% lower than that of cards with ordinary 35-micrometer metallization. Gigabyte’s thermographic pictures look convincing enough. We cannot verify this ourselves, yet the improvement in cooling does agree with theory.
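The ounce-to-micrometer relation can be checked from copper’s density; the helper below is our own illustration, not anything supplied by Gigabyte:

```python
OZ_GRAMS = 28.35      # avoirdupois ounce, in grams
SQFT_CM2 = 929.03     # one square foot, in cm^2
CU_DENSITY = 8.96     # density of copper, g/cm^3

def copper_thickness_um(oz_per_sqft: float) -> float:
    """Copper foil thickness (micrometers) for a given weight per square foot."""
    thickness_cm = oz_per_sqft * OZ_GRAMS / (CU_DENSITY * SQFT_CM2)
    return thickness_cm * 1e4   # cm -> micrometers

print(copper_thickness_um(1))   # ~34 um, commonly rounded to 35 um
print(copper_thickness_um(2))   # ~68 um, the "2oz" layer, ~70 um nominal
```

The computed values come out slightly under the round 35/70-micrometer figures because the industry numbers are nominal rather than exact.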
Like the above-described EVGA GeForce GTS 250 Superclocked, the Gigabyte GV-N250OC-1GI is equipped with Hynix H5RS1H23MFR-N2C chips that have a capacity of 1 gigabit and a rated frequency of 1200 (2400) MHz. The card’s memory frequency agrees with Nvidia’s official specs and is 1100 (2200) MHz. With a 256-bit bus this ensures a peak memory bandwidth of 70.4GBps.
The graphics core is marked as G92-421-B1. It was manufactured on the 42nd week of 2008, i.e. in the middle of October. In full accordance with Nvidia’s official GeForce GTS 250 specs, the GPU has a frequency of 738MHz and 1836MHz for the main and shader domain, respectively. The GPU has the maximum configuration possible for the G92/G92b processor: 128 ALUs, 64 texture-mapping units, and 16 raster back-ends.
Gigabyte has tried to make its product modern and universal. Therefore the GV-N250OC-1GI does not support the obsolete analog video formats like S-Video and others. Instead, it offers a dedicated HDMI port with protective cap. Besides, the mounting bracket accommodates D-Sub VGA and DVI-I ports. This makes the card suitable for both gaming systems and HTPCs, especially as a cable for translating S/PDIF audio into HDMI is included into the box. Gigabyte does not think that anyone will build Triple-SLI configurations out of GeForce GTS 250 and offers only one MIO connector on the card. It has a protective cap, too.
The GV-N250OC-1GI is equipped with a somewhat simplified version of the Zalman VF1000 cooler called the Zalman VF1050. It is simplified because it lacks a universal fastening mechanism and uses a fan with 2-pin connection. Otherwise, this is a full-featured cooler following the best design traditions of Zalman. The cooler’s nickel-plated heatsink looks very beautiful in combination with the blue color of the PCB.
In fact, only the heat-exchanger and the heat pipes are made from copper while the heatsink’s plates are aluminum: we did not see the characteristic red color of copper when we tried to scratch them, and the assembled cooler weighs only 225 grams, whereas it would be much heavier with a copper heatsink. This should not be a problem, however. The Zalman VF1050 should be able to cope with the G92b chip easily and noiselessly even without a copper heatsink. The fan blows sideways as well as downward, thus lowering the temperature of the PCB and its components. Coupled with the thicker metallization layers, this should make the cooling of the GV-N250OC-1GI even more effective.
A layer of dry dark-gray material is used as a thermal interface between the heat-exchanger’s sole and the GPU die. We can’t even call it thermal grease because the stuff is more like glue, fastening the two into a single whole. The cooler is fastened to the PCB with four threaded posts and spring-loaded nuts. There is no misalignment, so the single drawback of this cooling system is that the hot air is not exhausted from the system case. The cooler looks highly promising overall.
The Palit GeForce GTS 250 1GB card we tested earlier had two power connectors, so we will retest the power consumption of the GeForce GTS 250 cards as they have only one PCIe 1.0 connector. We use the following testbed for that:
Following our standard procedure, the 3D load was created by the first SM3.0/HDR test from 3DMark06 running in a loop at 1600x1200 with forced 4x FSAA and 16x AF. The 2D load was emulated by the 2D Transparent Windows test from PCMark05.
The two versions of Nvidia’s card have similar results in this test, obviously because they use one and the same controller (NCP5388) in the GPU power circuits. There is indeed no need to install two power connectors on G92b-based graphics cards. They don’t even load the single 6-pin PCIe 1.0 connector fully (it is rated for a load up to 75 watts).
The cooling system installed on the Gigabyte GV-N250OC-1GI copes with its job just perfectly. The GPU temperature is never higher than 60 degrees centigrade, which is 11 degrees lower than the result of the reference card. We can’t say how many of those degrees are owed to the 2oz Copper PCB technology, but whatever the exact contribution, Gigabyte’s card deserves our praise for superb cooling efficiency. Let’s check out the noise now.
We got the following results when measuring the noise at a distance of 1 meter from the working testbed (the ambient noise was 43dBA):
The EVGA GeForce GTS 250 Superclocked Edition is not quiet because its fan speed increases under load, making the card audible amidst the noise from the other system components. However, like with all other reference coolers from Nvidia, the spectrum of the noise is not irritating: it is the hiss of air passing through the heatsink. The Gigabyte GV-N250OC-1GI has a much more impressive result: it is virtually silent both when idle and under continuous 3D load.
As we have mentioned previously, EVGA and Gigabyte both equipped their products with exclusive overclocking & monitoring tools, Precision and Gamer HUD Lite, respectively. This is what they look like:
These are functional tools that do their job well enough, but we guess most overclockers will prefer the popular, flexible and universal RivaTuner. That’s what we did, too.
In our overclocking experiment we increased the frequencies of the EVGA GeForce GTS 250 Superclocked Edition to 820/2010MHz for the graphics core and 1240 (2480) MHz for memory.
A shader domain frequency of over 2GHz is very high, especially as we did not have to resort to some special overclocking tricks or tools to achieve it.
The GPU of the Gigabyte GV-N250OC-1GI could be overclocked even higher to 820MHz for the main domain and 2040MHz for the shader domain, but its memory chips did not speed up well. Notwithstanding the manufacturer’s claim of using select memory chips, they only worked at 1180 (2360) MHz, not even reaching their rated frequency of 1200 (2400) MHz. The card’s PCB wiring may not be optimal for the chips, or we may have just had a not-very-overclockable sample of the product.
So, both graphics cards did well in this round of our test session, too. But we should acknowledge that the Gigabyte has better cooling and lower level of noise. On the other hand, the EVGA card may be the preferable option for a compact system case because its cooling system will exhaust the hot air out of the latter.
We are going to investigate the performance of GeForce GTS 250 SLI system using the following testbed:
The graphics card drivers were configured in the same way as before: to provide the highest possible quality of texture filtering and to minimize the effect of default software optimizations. As a result, our ATI and Nvidia driver settings looked as follows:
The list of benchmarks includes the following gaming titles and synthetic tests:
First-Person 3D Shooters
Third-Person 3D Shooters
We selected the highest possible level of detail in each game using the standard tools provided by the game itself in its menus. The games’ configuration files weren’t modified in any way, because an ordinary user cannot be expected to know how to do that. We made a few exceptions for selected games where necessary, and we will specifically dwell on each such exception later on in this article.
Since the main goal of today’s test session is to study the performance of a GeForce GTS 250 SLI configuration, we have also included the following graphics accelerators in our test session, besides the solutions from EVGA and Gigabyte described above:
We ran our tests at the following resolutions: 1280x1024, 1680x1050 and 1920x1200. The GeForce GTS 250 SLI tandem as well as the single GeForce GTX 285 and Radeon HD 4850 X2 cards were also tested at 2560x1600, as they belong to a higher price range. Wherever possible, we added 4x MSAA to the standard 16x anisotropic filtering. We enabled antialiasing from the game’s menu; if that was not possible, we forced it using the appropriate settings of the ATI Catalyst and Nvidia GeForce drivers.
Performance was measured with the games’ own tools and the original demos were recorded if possible. We measured not only the average speed, but also the minimum speed of the cards where possible. Otherwise, the performance was measured manually with Fraps utility version 2.9.8. In the latter case we ran the test three times and took the average of the three for the performance charts.
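For the manual Fraps runs, the averaging step can be sketched as follows. This is our own illustration of the procedure described above (the data structure and names are ours; we take the average of the mean frame rates and, where available, report the worst minimum):

```python
def summarize_runs(runs):
    """Average the mean fps over several runs; report the worst minimum fps."""
    avg_fps = sum(r["avg"] for r in runs) / len(runs)
    min_fps = min(r["min"] for r in runs)
    return avg_fps, min_fps

# Three hypothetical Fraps runs of the same benchmark scene
runs = [{"avg": 58.2, "min": 24}, {"avg": 60.1, "min": 26}, {"avg": 59.0, "min": 25}]
print(summarize_runs(runs))
```

Averaging three runs smooths out run-to-run variation, which is noticeable with Fraps because the measurement window is started by hand.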
The relatively inexpensive GeForce GTS 250 SLI tandem looks good in this test. At resolutions below 2560x1600 it is inferior to the Radeon HD 4850 X2 but faster than the GeForce GTX 285. And it even goes ahead of them both at 2560x1600. The performance growth relative to the single card varies from 50 to 70% depending on the resolution. That’s quite far from the theoretical maximum, yet impressive for a $300 solution. This solution is even more appealing for users who are going to upgrade their graphics subsystem in two steps.
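The scaling percentages quoted here and in the rest of this review are simple relative gains of the SLI pair over a single card; a minimal sketch (the function name and sample numbers are ours):

```python
def sli_scaling_percent(single_fps: float, sli_fps: float) -> float:
    """Performance growth of an SLI pair over one card, in percent.
    100% would be the theoretical maximum (perfect doubling)."""
    return (sli_fps / single_fps - 1) * 100

print(sli_scaling_percent(40.0, 68.0))  # 70% growth, near the upper bound we saw
print(sli_scaling_percent(40.0, 60.0))  # 50% growth, near the lower bound
```

Values near 0% (or below) indicate a game the driver has no SLI profile for, as we will see with Tom Clancy’s H.A.W.X. later on.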
On the other hand, the single GeForce GTX 285 is a good option, too. It is simpler to install and use, requires but one PCI Express x16 slot, and does not depend on software optimizations. And it ensures a comfortable frame rate at every resolution including 2560x1600.
The GeForce GTS 250 SLI tandem has the best average speed in three out of the four resolutions, being especially faster than the GeForce GTX 285 and Radeon HD 4850 X2 at 1280x1024. However, its bottom speed is no higher than 20fps. In other words, the SLI tandem cannot ensure smooth gameplay in every scene, even though it is close to doing that.
Take note that the gap between the GeForce GTS 250 SLI and Radeon HD 4850 X2 shrinks from 25% to 8% as the resolution increases, and the dual-chip card from ATI is even ahead at 2560x1600.
We disabled the integrated frame rate limiter in the game console for the sake of comparing the cards. The game’s built-in benchmarking options do not provide information about the bottom speed, so there is no such info in the diagrams.
We can see no clear winner in this test. The GeForce GTS 250 SLI and the single GeForce GTX 285 go ahead in different resolutions and are joined by the Radeon HD 4850 X2 at 1920x1200. The GeForce GTX 285 is better in such parameters as noise level and the ease of installation and use.
The graphics subsystem with two G92b cores has no rivals in this test until the resolution of 2560x1600 in which the single GeForce GTX 285 has a comfortable bottom speed of over 25fps. The GeForce GTS 250 SLI offers a bottom speed of only 20fps then, which will not be enough for a demanding gamer.
The GeForce GTX 285 also offers comfortable speed at the lower resolutions, so it is a better buy than the pair of GeForce GTS 250 1GB cards, provided of course that the two graphics subsystems come at comparable prices.
The three leaders of this review go neck and neck in this test. It is only from the resolution of 1920x1200 that the GeForce GTS 250 SLI tandem falls behind the GeForce GTX 285 and Radeon HD 4850 X2 in terms of bottom speed, although it still ensures comfortable conditions for play. Nvidia’s flagship single-processor card wins again on the total of its consumer properties.
The game runs on the Source engine and has an integrated benchmark, but the latter does not report the bottom speed information.
Again, the GeForce GTX 285 is somewhat worse than its dual-processor opponents in sheer speed but surpasses them in the other consumer properties.
SLI technology is 76% efficient in this test, which is quite a normal result for modern multi-GPU solutions developed by both ATI and Nvidia.
To achieve a playable speed in this game we disabled FSAA and such resource-consuming options as Sun rays, Wet surfaces and Volumetric Smoke. We used the Enhanced full dynamic lighting (DX10) mode for our tests and additionally enabled the DirectX 10.1 mode for the ATI cards.
Here, the GeForce GTS 250 SLI tandem is really victorious. Like the Radeon HD 4850 X2, it delivers comfortable performance at 1920x1200 and nearly does the same at 2560x1600. The single GeForce GTX 285 is unable to do that at its default GPU and memory frequencies.
Take note that the single GeForce GTS 250 1GB does not cope with the maximum graphics quality settings even at 1280x1024, and the factory overclocking does not help the EVGA card at all: its average speed increases while its bottom speed remains the same, below the required 25fps.
The pair of GeForce GTS 250 cards working in SLI mode delivers fantastic performance, leaving its opponents behind at resolutions up to 1920x1200 inclusive, yet we guess such a high speed is just redundant. Even the single GeForce GTS 250 can yield a playable frame rate at these resolutions. The pre-overclocked version from EVGA is especially good but the silent version from Gigabyte is quite good for this game, too.
The efficiency of Nvidia SLI technology nears its theoretical maximum here. You even get a performance growth of over 100% at 1920x1200 when switching from one GeForce GTS 250 to two such cards. Here, this speed has a practical value as it makes the resolution of 1920x1200 available for play. You can’t play the game comfortably in this mode on the single GeForce 9800 GTX+, GTS 250, Radeon HD 4850 and Radeon HD 4870. On the other hand, the GeForce GTX 285 copes with the job just as well and does not require a SLI-compatible mainboard. It is only 6-8% slower than the GeForce GTS 250 SLI, which makes no difference for gamers.
It is hard to compete with ATI’s solutions in Fallout 3. The GeForce GTS 250 SLI subsystem costing $300 is only able to match the $200 Radeon HD 4870 1GB card in this test. There is one exception, though. The SLI configuration offers a better bottom speed at 2560x1600. Well, there is still no good reason to prefer the SLI tandem to a single GeForce GTX 285 except when you increase your computer’s graphics performance steadily by adding a second GeForce GTS 250 to the one you already own.
The GeForce GTS 250 SLI beats the GeForce GTX 285 by a small margin but cannot show an acceptable bottom speed at 1920x1200. As for the individual cards from Gigabyte and EVGA, the latter enjoys but a small advantage due to its factory overclocking and does not allow comfortable play even at 1280x1024 with full-screen antialiasing forced on.
The overall picture is the same as we could see in Devil May Cry 4. The advanced graphics cards are just unnecessarily fast. A single GeForce GTS 250 is quite sufficient for this game at resolutions up to 1920x1200. The GeForce GTS 250 SLI tandem should be given credit for having the highest result among all the tested solutions at 1680x1050 and 1920x1200.
This game caters to two categories of gamers. First of all, it is intended for all who love the Tom Clancy’s series of gaming titles. And it also appeals to the rather limited audience of people who like to play car simulators. It features excellent visuals and has rather high graphics subsystem requirements. Therefore it is an ideal benchmarking tool, especially as it comes with integrated benchmarking options – without the option of recording bottom speed, though.
As Tom Clancy’s H.A.W.X. is a recently released title, Nvidia has not yet made it compatible with SLI technology. This explains the poor performance of the GeForce GTS 250 SLI tandem, which is simply routed by the GeForce GTX 285: the gap is over 25%. ATI seems to be ahead of Nvidia in optimizing new games for CrossFire: the Radeon HD 4850 X2 competes with the single-chip flagship of the green team quite successfully.
The game is quite a demanding application for the computer’s graphics subsystem. The gameplay was only smooth and comfortable when we used the two most advanced solutions included into this review and only at the two lowest resolutions.
The game has a frame rate limiter fixed at 30fps. We could not disable it.
Notwithstanding the release of the add-on, the standings have not changed. Nvidia’s solutions still have problems with performance when you turn on 4x full-screen antialiasing. Even such advanced solutions as GeForce GTX 285 or GeForce GTS 250 SLI cannot maintain a comfortable bottom speed at a resolution of 1280x1024.
The pair of GeForce GTS 250 cards working in SLI mode is in the lead at 1280x1024 but hardly differs from the GeForce GTX 285 at the higher resolutions. And the latter card is far easier to use and does not require a SLI-compatible mainboard. As this game winds up our set of gaming tests, we have to admit that the GeForce GTS 250 SLI has not shown any serious advantages over the single-processor G200b-based opponent throughout this test session.
The tested SLI configuration does not show anything exceptional in 3DMark06, taking second place behind the Radeon HD 4850 X2 in terms of overall score. Despite its 128 texture processors it is not very good in the SM2.0 tests, but shows its best in the SM3.0/HDR tests, even outperforming its dual-processor rival from the red camp.
We minimize the CPU’s influence by using the Extreme profile (1920x1200, 4x FSAA and anisotropic filtering). We also publish the results of the individual tests across all display resolutions to provide a full picture.
There are no oddities in the behavior of the GeForce GTS 250 SLI configuration in 3DMark Vantage. This tandem occupies a place between the GeForce GTX 285 and the Radeon HD 4850 X2, being a little behind the former and a little ahead of the latter. The efficiency of SLI technology approaches the theoretical maximum: the performance growth amounts to 90%.
The GeForce GTS 250 SLI tandem loses to the single GeForce GTX 285 in the first test but goes ahead of the latter in the second test. This must be the result of its having 256 shader processors clocked at over 1.8GHz as opposed to the opponent’s 240 shader processors clocked at about 1.5GHz. The GeForce GTS 250 SLI does win at 2560x1600 in both tests, however, which is something it did not often manage in the gaming tests.
Just as we had expected, the GeForce GTS 250 SLI configuration showed a very impressive performance in modern 3D games. It was considerably faster even than the GeForce GTX 285 across a number of applications thanks to its advantage in core clock rates as well as in the amount of ALUs (256 against 240) and texture processors (128 against 80). We are going to present the summary data to you now.
At a resolution of 1280x1024 pixels the average performance growth from switching from one GeForce GTS 250 to two such cards amounts to 64%. It varies from 18% to almost 100% depending on the specific application. In one new game (Tom Clancy's H.A.W.X.) there was a performance hit because the current version of the GeForce driver does not yet support SLI mode for this application.
The GeForce GTS 250 SLI configuration is also good in comparison with the GeForce GTX 285, enjoying an average 10.5% advantage over the latter. The single exception is Tom Clancy’s H.A.W.X. which again reminded us of the fundamental vulnerability of all modern homogeneous multi-GPU solutions: their dependence on software support.
It is not so easy to name the better one between the GeForce GTS 250 SLI and Radeon HD 4850 X2. If you calculate the average result, the former solution has a 5% advantage over the latter. However, Nvidia’s SLI configuration wins only seven out of the 15 tests whereas ATI’s solution wins the other eight. And we mean not only the extremely rare Radeon HD 4850 X2 which has only been released by Sapphire but also any Radeon HD 4850 CrossFireX tandem. Considering the recent price cut, the latter configuration is going to be a dangerous opponent to the GeForce GTS 250 SLI. Besides, CrossFireX-compatible mainboards are more widespread on the market than SLI-compatible ones.
When we switch to the resolution of 1680x1050, the efficiency of Nvidia’s SLI technology with the GeForce GTS 250 grows to 69%, but the average advantage of the tandem over the GeForce GTX 285 drops to 9%. Besides Tom Clancy’s H.A.W.X., the SLI configuration is now slower than Nvidia’s flagship in Enemy Territory: Quake Wars, although by 3% only. The competition with the Radeon HD 4850 X2 gets tougher: the average advantage of Nvidia’s tandem shrinks to 1.5%, but it now wins nine tests.
As the display resolution grows, so does the efficiency of the SLI configuration built out of two GeForce GTS 250 cards: it is 75% on average, rising to 100% and more in three out of the 15 tests. The tested multi-GPU configuration outperforms the GeForce GTX 285 in most of the tests at this resolution, too, the exceptions being Tom Clancy’s H.A.W.X. and World in Conflict. The GeForce GTS 250 SLI is roughly equal in speed to ATI’s similarly priced multi-GPU solution, winning eight tests and losing seven.
At the highest display resolution the GeForce GTS 250 SLI tandem is slower than the GeForce GTX 285 in four out of the 15 tests, namely Enemy Territory: Quake Wars, F.E.A.R. 2: Project Origin, World in Conflict, and Tom Clancy's H.A.W.X. The dual-processor configuration is on the winning side overall. We should note, however, that the single-chip card ensures more comfort in some games, e.g. in Far Cry 2, because it offers a higher bottom speed. The dual-chip solution from Nvidia loses its competition with the ATI Radeon HD 4850 X2, but not by much: the ATI solution wins eight tests and loses seven. Moreover, the GeForce GTS 250 SLI can occasionally ensure comfortable conditions for the gamer where its opponent fails, for example in Fallout 3.
Summing everything up, the idea of uniting two GeForce GTS 250 1GB cards, each costing $149, into an SLI tandem is viable because such a configuration is generally superior to the GeForce GTX 285 in performance. The latter is also more expensive as yet, its retail price starting from $340-350. On the other hand, it does not require an SLI-compatible mainboard as the GeForce GTS 250 SLI tandem does. And the choice of such mainboards is limited to ones with nForce chipsets, for both the Intel LGA775 and AMD AM2/AM2+ platforms.
The performance benefits provided by the GeForce GTS 250 SLI tandem are generally not crucial and do not open new opportunities for the gamer that are unavailable with the single GeForce GTX 285. Therefore we consider the latter a better product that does not depend on multi-GPU support in the driver. Building the SLI tandem will only make sense if you cannot spend more than $150 on your graphics card today but are going to increase your graphics subsystem performance in the near future. But don’t forget that you will also need an appropriate mainboard.
The two versions of the GeForce GTS 250 1GB described in this review are both good in their own ways. The EVGA boasts pre-overclocked frequencies for a certain performance benefit and has overclockable memory, and its cooling system exhausts the hot air out of the system case. The Gigabyte is compact, has an HDMI port, high-quality components, and a near-silent cooler with excellent cooling performance. The former will suit an ordinary gamer with a limited budget whereas the latter is more universal and can be used in a mainstream gaming system as well as in an HTPC.
EVGA GeForce GTS 250 Superclocked Edition: