GeForce GTX 560 from EVGA, Gigabyte and MSI

Today we are going to discuss three mainstream GeForce GTX 560 graphics cards boasting unique PCB designs, proprietary cooling systems and increased working frequencies. We will also test them in an SLI configuration. Read our review for more details!

by Sergey Lepilov
07/26/2011 | 01:51 PM

According to statistics from Steam, sub-$200 graphics cards from the value and mainstream segments are currently the most popular products among computer users. Of all graphics cards supporting the DirectX 11 API, the Nvidia GeForce GTX 560 demonstrates the best growth dynamics. Although this product was launched only in May, its sales volumes have already exceeded those of its direct competitors based on AMD GPUs and have reached almost twice the sales volume of the GeForce GTX 570, which came out about six months earlier and offers higher performance at a higher price.

 

GeForce GTX 560 owes its growing popularity not only to an appealing combination of price and performance, but also to the fact that Nvidia did not strictly regulate the specifications of GTX 560 graphics cards, only establishing an allowed frequency range for the GPU and memory. Moreover, it turned out that there were simply no reference GeForce GTX 560 cards at all, and the manufacturers were free to design and introduce their own PCB layouts with unique features. As a result, we saw a lot of proprietary solutions which stand out not only due to the number of active phases in their GPU and memory VRM circuitry or their clock frequencies, but also due to their unique cooling systems, which makes GeForce GTX 560 even more appealing to end users.

Today we are going to talk about three such graphics cards: EVGA GeForce GTX 560 1 GB Superclocked, Gigabyte GeForce GTX 560 1 GB Ultra Durable (GV-N56GOC-1GI) and MSI GeForce N560GTX 1 GB Twin Frozr II-OC.

Package and Accessories

All three graphics cards come in packages of essentially the same design: an exterior slip-on cover and an internal box made of thick, heavy cardboard. Gigabyte’s product boasts the largest and most colorful packaging of the three:

 

 

 

At the same time, despite their smaller size, EVGA’s and MSI’s boxes are just as informative as Gigabyte’s. They provide all the necessary information about the peculiarities of the products, bundled accessories and system requirements. I would also like to add that all three graphics cards are very securely packed, so they should survive most transportation challenges without any problems.

The accessories bundled with all three GeForce GTX 560 models are very similar. Each card comes with two additional power cables, one DVI-to-VGA adapter, one miniHDMI-to-HDMI adapter, a CD with drivers and utilities, manuals and promotional materials:

 

In addition, the EVGA graphics card includes a company logo sticker, while the MSI card adds a coupon for a free download and activation of the Lara Croft and the Guardian of Light game. None of the three tested products includes an SLI bridge among the accessories. The EVGA and MSI graphics cards are made in China, the Gigabyte card in Taiwan. All three cards come with a three-year warranty.

PCB Design and Functionality

All three cards look very different:

  

  

  

Gigabyte GeForce GTX 560 Ultra Durable has the flashiest exterior with its large cooling fans. Despite the seeming differences, Gigabyte’s and EVGA’s solutions have PCBs of the same length, 210 mm, as opposed to the longer 223 mm MSI PCB. However, if we take the entire graphics card with the cooler into consideration, Gigabyte is the longer card of the two: its cooler makes it 15 mm longer and 18 mm taller than the EVGA card. As for MSI, its total length is even greater at 240 mm. This makes the EVGA solution the most compact of the three.

All three cards have two DVI-I outputs and one miniHDMI output. They are combined with the back-panel vent grid for exhausting warm air out of the system case. MSI again added a touch of uniqueness by shaping the grid after its company name:

The GeForce GTX 560 solutions from Gigabyte and EVGA have ordinary grids without any decorative twists of that kind, which might otherwise reduce the grid’s exhaust efficiency.

However, when it comes to SLI support and additional power supply, all three makers agreed on the implementation: each of the cards has one MIO connector (only 2-way SLI is supported) and two six-pin additional power connectors:

 

I would like to remind you that the maximum power consumption of the Nvidia GeForce GTX 560 is declared at 150 W, and the minimum recommended power supply capacity for a system equipped with this graphics accelerator is 450 W. Gigabyte, however, recommends using at least a 500 W PSU.

Now let’s take a look at the graphics cards’ PCBs:

 

The major differences can be found in the VRM layout. The EVGA and Gigabyte products have 4+1 and 3+1 phase VRM circuitry respectively, where one phase is assigned to the memory chips. The MSI card’s voltage regulator is also designed as 3+1, but each phase in the GPU part is doubled, so this circuitry could be called “virtually six-phase”. Besides, MSI used only high-quality capacitors and chokes. Gigabyte’s “Ultra Durable” series also uses only high-quality Japanese capacitors and an optimized PCB that, according to the manufacturer, reduces the GPU temperature by 5-10%, increases the overclocking potential by 10-30% and improves power efficiency by a similar margin. The EVGA solution is a modest one: no big promises of any kind.

The readable GPU marking is the same on all three graphics cards, GF114-325-A1, while the barely visible marking differs only in the production week of 2011. The EVGA graphics card uses a GPU manufactured in week 11, the Gigabyte in week 12 and the MSI in week 7.

The hardware GPU configuration corresponds exactly to Nvidia’s specifications: GeForce GTX 560 has 336 unified shader processors, 56 texturing units and 32 raster back ends. I would like to remind you that the GTX 560 core frequency may vary between 810 and 950 MHz, and the shader domain frequency, which is always twice the core clock, between 1620 and 1900 MHz. All three graphics cards in today’s review have different clock speeds. EVGA GeForce GTX 560 Superclocked works at 851/1701 MHz, Gigabyte GeForce GTX 560 Ultra Durable at 830/1660 MHz, and MSI GeForce N560GTX Twin Frozr II-OC at 870/1740 MHz. In other words, the latter has the fastest GPU of the three. However, the previously reviewed Palit GeForce GTX 560 Sonic Platinum had an even faster GPU: 900/1800 MHz.
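To put these numbers into perspective, here is a small illustrative sketch (not part of the original review, just a back-of-the-envelope calculation) that derives the shader domain frequency from the core clock and expresses each card’s factory overclock relative to the 810 MHz lower bound:

# Illustrative sketch: on GF114 the shader domain runs at twice the core clock,
# so each card's factory overclock can be expressed as a percentage over the
# 810 MHz lower bound mentioned above.

BASE_CORE_MHZ = 810  # lower bound of the GeForce GTX 560 core frequency range

cards = {
    "EVGA GTX 560 Superclocked": 851,
    "Gigabyte GTX 560 Ultra Durable": 830,
    "MSI N560GTX Twin Frozr II-OC": 870,
    "Palit GTX 560 Sonic Platinum": 900,
}

for name, core_mhz in cards.items():
    shader_mhz = core_mhz * 2                        # shader domain = 2 x core
    oc_percent = (core_mhz / BASE_CORE_MHZ - 1) * 100
    print(f"{name}: {core_mhz}/{shader_mhz} MHz (+{oc_percent:.1f}%)")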

The amount of video memory is the same on all three graphics cards: each has 1 GB of GDDR5 in eight FCBGA chips. However, the chips themselves are different. The EVGA and MSI cards have Samsung chips marked K4G10325FE-HC04, while the Gigabyte graphics card carries Hynix H5GQ1H24AFR-T2C:

 

Nevertheless, both chip types have the same 0.4 ns nominal access time and 5000 MHz theoretical effective frequency. In 3D mode the video memory of the cards works at slightly different frequencies: 4104 MHz effective on the EVGA card, 4008 MHz on the Gigabyte and 4080 MHz on the MSI. As for the other specifications, the chips run at 1.5 V, the cards have a 256-bit memory bus, and the effective memory frequency drops to 270 MHz in power-saving mode.
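As a rough illustration of how these figures fit together (my own sketch, not data taken from the chip datasheets), the 0.4 ns access time corresponds to the 5000 MHz effective rating under the usual 2000 / t(ns) convention for DDR-type memory, and the peak memory bandwidth follows from the effective frequency and the 256-bit bus:

# Rough sketch relating the quoted memory specs: effective GDDR5 frequency from
# the rated access time and peak bandwidth over the 256-bit bus at the 3D-mode
# frequencies listed above.

BUS_WIDTH_BITS = 256

def effective_mhz(access_time_ns: float) -> float:
    return 2000.0 / access_time_ns               # 0.4 ns -> 5000 MHz effective

def peak_bandwidth_gb_s(eff_mhz: float) -> float:
    return eff_mhz * BUS_WIDTH_BITS / 8 / 1000   # MHz * bits -> GB/s

print(effective_mhz(0.4))                        # 5000.0 (chip rating)
for card, mem_mhz in [("EVGA", 4104), ("Gigabyte", 4008), ("MSI", 4080)]:
    print(f"{card}: {peak_bandwidth_gb_s(mem_mhz):.1f} GB/s")
# EVGA: 131.3 GB/s, Gigabyte: 128.3 GB/s, MSI: 130.6 GB/s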

The screenshots below show the graphics cards specifications as reported by GPU-Z utility:

  

Cooling System Design and Performance

Besides differing in packaging, accessories, PCB design and clock frequencies, all three GeForce GTX 560 graphics cards also have their own unique coolers:

 

The cooler on the EVGA GeForce GTX 560 Superclocked consists of a copper base, three 6 mm heatpipes coming out of it and three aluminum heatsink arrays, one of which is soldered directly to the base of the cooler while the other two hang at the ends of the heatpipes. Gigabyte GeForce GTX 560 Ultra Durable features a cooling system with four 6 mm heatpipes forming part of the base (heatpipe direct-touch technology) that hold a heatsink of aluminum fins pressed firmly against the heatpipes. The nickel-plated heatsink of the MSI GeForce N560GTX Twin Frozr II-OC consists of a classical copper base, two 6 mm and two 8 mm heatpipes, and thin, long aluminum fins.

The fans are also very different in each of the coolers:

  

The EVGA graphics card uses one 11-blade 75 mm fan made by AVC. This particular model, DASA0815R2U, is based on a fluid dynamic bearing and is PWM-controlled in the range from 1000 to 4000 RPM. The Gigabyte card uses two 9-blade 96 mm fans tied together by WindForce 2X technology, which defines their placement on the PCB and their rotation direction. The GTX 560 Ultra Durable has both fans installed at a zero angle along the same axis, and both rotate in the same direction (counterclockwise). Gigabyte uses Everflow T129215SM fans with a slide bearing, rotating at 1000-1950 RPM. The MSI card is equipped with a pair of 75 mm 11-blade fans from PowerLogic. This PLD08010S12HH model also uses a slide bearing, and its rotation speed is PWM-controlled between 1500 and 4500 RPM.

Now let’s see how efficient and noisy these coolers are and how well they cope with their task. To accomplish this, we used a test from the Aliens vs. Predator (2010) game. We ran it five times with maximum graphics quality settings in 1920x1080 resolution with 16x anisotropic filtering, but without FSAA, which allows the GPUs to heat up even more:

Besides this mode, we also warmed up the cards with the FurMark version 1.9.9 stability test in 1920x1080 resolution:

We used the MSI Afterburner utility version 2.2.0 Beta 5 to monitor graphics card temperatures, frequencies and fan rotation speeds, as well as the GPU-Z version 0.5.4 utility. All tests were performed inside a closed system case with the ambient temperature at 27.5-28.0°C.

First let’s check out the temperatures of our tested cards during gameplay with automatic fan rotation speed control:


Monitoring screenshots: EVGA, Gigabyte, MSI

Now let’s continue with the tests in the same gaming mode with the fans at their maximum rotation speed:


Monitoring screenshots: EVGA, Gigabyte, MSI

Now we will repeat the test one more time with the load created by FurMark in automatic fan mode:


Monitoring screenshots: EVGA, MSI

…and at maximum fan rotation speed:


Monitoring screenshots: EVGA, MSI

The FurMark charts have no Gigabyte results on them for a reason. The thing is, this particular graphics card behaved really strangely: three minutes into the test the temperature almost fully stabilized at 74°C, which is lower than that of the other two participants, but then the card suddenly stopped sending a signal to the monitor and shut down. We suspect that one of the electronic components on the PCB, or maybe even a specific unit of the GPU, could have overheated (which can happen with heatpipe direct-touch coolers). As a result, we couldn’t check the efficiency of the Gigabyte cooler under this ultimate load.

All other results are summed up on the diagram below for your convenience:

The cooling efficiency of the three GeForce GTX 560 graphics accelerators reviewed today turned out to be pretty predictable. The EVGA cooler works very well, but in automatic mode it yields to the other two competitors. At the same time, the GeForce GTX 560 Superclocked remained the coolest graphics card at maximum fan rotation speed. The Gigabyte GeForce GTX 560 Ultra Durable, with the two largest fans, could have taken the leader’s crown in efficiency if it hadn’t failed the FurMark test. Note that the maximum rotation speed of its fans doesn’t exceed 2000 RPM. As for MSI, its Twin Frozr II once again proved superb, which is why it will soon become available for purchase as a standalone cooler. However, it is impossible to draw a final verdict about the graphics card coolers without checking their acoustic performance. So, let’s move on to the noise tests.

The noise level of each cooler was measured between 1:00 and 3:00 AM in a closed room about 20 m² in size using a CENTER-321 electronic noise meter. The lowest noise reading our meter can register is 29.8 dBA, and the subjectively comfortable noise level in these testing conditions was around 36 dBA (not to be confused with a low noise level). The noise level of each cooler was measured outside the system case, when the only noise source in the lab was the cooler and its fans. The noise meter was installed on a tripod and was always at a 150 mm distance from the cooler fan rotor. The tested cooling systems were placed at the edge of the desk on a sheet of polyurethane foam. The fan rotation speed was adjusted across the entire supported range with our controller by changing the voltage in 0.5 V increments.

Here are the obtained results (the dotted lines show where fans worked in automatic mode during the gaming load tests):

Overall, I can conclude that Gigabyte’s graphics card has the quietest cooler of the three, which is exactly what we expected judging by its design and fan speed range. In 2D mode you can’t hear the Gigabyte fans at all, and the same is true of the EVGA fan. However, we can’t say the same about the MSI fans: the noise from two 75 mm fans rotating at 1800 RPM can be clearly distinguished against the background of a quiet system case. Although, in my subjective opinion, during the gaming tests Twin Frozr II works in a more acoustically comfortable range than the EVGA cooler.

We performed all overclocking experiments using the graphics cards’ proprietary coolers and without changing the GPU voltage. The results showed that EVGA GeForce GTX 560 Superclocked can remain stable at up to 930/1860/4640 MHz, Gigabyte GeForce GTX 560 Ultra Durable – at up to 960/1920/4880 MHz and MSI GeForce N560GTX Twin Frozr II-OC reached the maximum frequencies of 940/1880/5020 MHz:

  

This is a pretty typical GeForce GTX 560 overclocking result. The Gigabyte graphics card achieved the highest GPU frequency, while the MSI card hit the highest memory speed. Of course, it is important to remember that overclocking success depends a lot on the specific sample.

The temperatures of all three testing participants didn’t change much during overclocking (they only gained about 2°C in GPU temperature):


Monitoring screenshots: EVGA, Gigabyte, MSI

However, this only happened because the graphics cards’ cooling fans sped up in automatic mode and prevented the temperature from growing any higher.

Temperatures in SLI mode seem to be much worse:

As you can see, the GPU of the Gigabyte graphics card located at the top of the 2-way SLI tandem heated up to 100°C during ordinary gaming tests, even though this card has a very efficient cooler, as we have just seen above:

Of course, it could be the summer heat, or the closed system case, or the close proximity of the second graphics card or the sound card in the system. But aren’t these the usual conditions in which our system components normally work? Besides, not everyone has a roomy system case with well-organized internal ventilation, like the Antec Twelve Hundred, or a versatile mainboard with multiple PCI-Express slot configuration options, like the Gigabyte GA-X58A-UD9. It is obvious that if you want your SLI or CrossFireX system to run smoothly, you need graphics cards with airflow directed along the PCB and then pushed out of the system case. A liquid-cooling system could be an even better alternative, but that would be a topic for a completely different article.

Testbed Configuration and Testing Methodology

All graphics cards were benchmarked in a closed system case with the following configuration:

In order to reduce the dependence of the graphics cards’ performance on the overall platform speed, I overclocked our 32 nm six-core CPU to 4.5 GHz with the multiplier set at 25x and “Load-Line Calibration” (Level 2) enabled. The processor Vcore was increased to 1.46875 V in the mainboard BIOS:

The 6 GB of system DDR3 memory worked at 1.5 GHz with 7-7-7-16_1T timings and 1.64 V voltage. Turbo Boost and Hyper-Threading technologies were disabled during the test session.

The test session started on July 12, 2011. All tests were performed in Microsoft Windows 7 Ultimate x64 SP1 with all critical updates as of that date and the following drivers:

The graphics cards were tested in today’s most popular resolution only: 1920x1080. The tests were performed in two image quality modes: “Quality+AF16x”, i.e. default texturing quality with 16x anisotropic filtering enabled, and “Quality+AF16x+AA4(8)x”, with 16x anisotropic filtering and 4x full-screen anti-aliasing (MSAA), or 8x if the average framerate was high enough for a comfortable gaming experience. We enabled anisotropic filtering and full-screen anti-aliasing from the game settings or configuration files. If the corresponding options were missing, we changed these settings in the Control Panel of the GeForce/ION drivers. There were no other changes to the driver settings.
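For clarity, the two test modes can be summarized like this (a simple sketch of the settings listed above; the field names are my own, not taken from any tool):

# The two image-quality presets used in the benchmarks.
TEST_MODES = {
    "Quality+AF16x": {"texturing": "Quality", "af": 16, "msaa": 0},
    "Quality+AF16x+AA4(8)x": {"texturing": "Quality", "af": 16, "msaa": 4},  # 8x where fps allows
}
RESOLUTION = (1920, 1080)  # the only resolution tested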

The list of games and applications used in this test session was limited to the most current and resource-consuming titles and all of them were updated to their latest versions. As a result, the list had one popular semi-synthetic benchmarking suite, one technical demo and 12 games of various genres. Here is the complete list of tests used with the settings (all games listed in their release order):

If the game could report minimum fps readings, they were also added to the charts. We ran each game test or benchmark twice and took the better result for the diagrams, but only if the difference between the two runs didn’t exceed 1%. If it did, we ran the test at least one more time to achieve repeatability of results.
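The repeatability rule can be expressed as a short sketch (a hypothetical helper, not the tooling we actually used):

# Sketch of the benchmarking rule described above: run each test twice and take
# the best result, re-running as long as the last two runs differ by more than 1%.

def benchmark_fps(run_pass, max_runs=5):
    """run_pass() performs one benchmark pass and returns its average fps."""
    results = [run_pass(), run_pass()]
    while len(results) < max_runs:
        last_two = sorted(results[-2:])
        if (last_two[1] - last_two[0]) / last_two[1] <= 0.01:  # within 1%
            break
        results.append(run_pass())                             # run once more
    return max(results)                                        # best run goes on the chart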

Performance Tests

Since we are already well familiar with the performance of the Nvidia GeForce GTX 560 from one of our previous reviews, we decided not to compare today’s testing participants against their competitors from the AMD camp. Instead, we will focus on the performance differences between the cards from EVGA, Gigabyte and MSI working at their default frequencies and compare them with the performance of the best-overclocking card of the three, Gigabyte GeForce GTX 560 Ultra Durable at 960/1920/4880 MHz. We will also check how effective an SLI configuration built of two of these graphics cards (MSI and Gigabyte at 870/1740/4080 MHz) will be.

The graphics cards are listed in the diagrams in the same order as we discussed them above. The results for EVGA GeForce GTX 560 1 GB Superclocked are shown in dark gray, Gigabyte GeForce GTX 560 1 GB Ultra Durable in blue, and MSI GeForce N560GTX 1 GB Twin Frozr II-OC in red. The SLI configuration is marked with a darker shade of red. Since all graphics cards have 1 GB of memory, we didn’t mention the memory amount anywhere on the diagrams.

Let’s check out the obtained results.

3DMark 2011

Unigine Heaven Demo

BattleForge: Lost Souls

S.T.A.L.K.E.R.: Call of Pripyat

Metro 2033: The Last Refuge

Just Cause 2

Aliens vs. Predator (2010)

Lost Planet 2

StarCraft 2: Wings of Liberty

Sid Meier’s Civilization V

Tom Clancy's H.A.W.X. 2

Crysis 2

Total War: Shogun 2

DiRT 3

The results obtained in today’s test session do not differ all that much, which is why we decided not to comment on each diagram individually but rather offer an overall analysis. In fact, there is very little difference in performance between the Gigabyte, EVGA and MSI graphics cards. The maximum advantage of the fastest card of the three, the MSI, over the slowest, the Gigabyte, doesn’t exceed 3%, and in most cases there is no difference at all. It would definitely be wrong to base your choice on this number alone. At the same time, if you overclock the Gigabyte GeForce GTX 560, you can gain about 10-14%, which is already much more significant, especially since in Gigabyte’s case you don’t need to upgrade the cooler. The same is true for the other two tested accelerators.

The next diagram shows very clearly the advantages of a GeForce GTX 560 SLI configuration over a single graphics card:

Note that the performance gain from the SLI configuration is not as obvious and impressive as in the case of a CrossFireX system, but nevertheless, two GeForce GTX 560 cards are on average 71% faster without MSAA and 80% faster with MSAA across the board. Nvidia SLI technology is most efficient in such titles as BattleForge: Lost Souls, Sid Meier's Civilization V and Total War: Shogun 2, and least efficient in Aliens vs. Predator (2010) and DiRT 3.
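For reference, the scaling figures above are computed simply as the ratio of the SLI frame rate to the single-card frame rate; here is a toy illustration with made-up frame rates (placeholders, not results from this review):

# Toy example of how the SLI scaling percentage is calculated.

def sli_gain_percent(fps_single: float, fps_sli: float) -> float:
    return (fps_sli / fps_single - 1) * 100

print(sli_gain_percent(40.0, 68.4))   # ~71, matching the average gain without MSAA
print(sli_gain_percent(40.0, 72.0))   # ~80, matching the average gain with MSAA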

If you would like to see more detailed results, please check out the following table.

Conclusion

All three graphics cards discussed in today’s article have a lot in common. They use the same graphics processor and an identical amount of video memory (1 GB), come with very similar accessories and deliver very close performance. Therefore, if the difference in physical dimensions is not among the top criteria on your list, you should pay special attention to such aspects as cooling efficiency, noise and price. In the first two aspects Gigabyte GeForce GTX 560 Ultra Durable is the indisputable leader, even despite the weird issue in the FurMark test. Strange as it might seem, the Gigabyte graphics card even costs less than the other two testing participants: it is selling for $190, while EVGA GeForce GTX 560 Superclocked comes at a slightly higher price of $195, and MSI GeForce N560GTX Twin Frozr II-OC at $217. Of course, this is a very small difference, but it is again in Gigabyte’s favor.

In conclusion we are proud to award Gigabyte GeForce GTX 560 Ultra Durable with our Recommended Buy title:

In any case, we hope that today’s article will help you find a GeForce GTX 560 to your liking.