PCB Design and Functionality
All three cards look very different:
Gigabyte GeForce GTX 560 Ultra Durable has the flashiest exterior with large cooling fans. Despite the apparent differences, the Gigabyte and EVGA solutions have PCBs of the same length – 210 mm, as opposed to the longer 223 mm MSI PCB. However, if we consider the entire graphics card with the cooler, then Gigabyte is the longer of the two: its cooler makes it 15 mm longer and 18 mm taller than the EVGA card. As for MSI, its total length is even greater – 240 mm. Thus, the EVGA solution is the most compact of the three.
All three cards have two DVI-I outputs and one mini-HDMI out. They are combined with a back-panel vent grid for exhausting warm air from the system case. MSI again added a unique touch by shaping the grid into their company name:
Both GeForce GTX 560 solutions from Gigabyte and EVGA have plain grids without decorative twists of that kind, which may reduce the grid's effectiveness.
However, when it comes to SLI support and auxiliary power, all three makers agreed on the implementation: each card has one MIO connector (only 2-way SLI is supported) and two six-pin auxiliary power connectors:
Let me remind you that Nvidia declares the maximum GeForce GTX 560 power consumption at 150 W, and the minimum recommended power supply capacity for a system equipped with this graphics accelerator is 450 W. Gigabyte, however, recommends using at least a 500 W PSU.
Now let’s take a look at the graphics cards’ PCBs:
The major differences can be found in the VRM layout. The EVGA and Gigabyte products have 4+1 and 3+1 VRM circuitry, respectively, where one phase is assigned to the memory chips. As for the MSI card, its voltage regulator is also a 3+1 design, but each phase in the GPU section is doubled, so the circuitry could be called a "virtual six-phase" one. Besides, MSI also used only high-quality capacitors and chokes. Gigabyte's "Ultra Durable" series likewise uses only high-quality Japanese capacitors and an optimized PCB that, according to the manufacturer, lowers GPU temperature by 5-10%, increases overclocking potential by 10-30%, and improves energy efficiency by a similar margin. The EVGA solution is more modest: no big promises of any kind.
The readable GPU marking is the same on all three graphics cards – GF114-325-A1 – while the barely visible marking differs only in the production week of 2011: the EVGA card uses a GPU manufactured in week 11, the Gigabyte in week 12, and the MSI in week 7.
The hardware GPU configuration corresponds exactly to Nvidia's specifications: GeForce GTX 560 has 336 unified shader processors, 56 texture units and 32 raster back ends. Let me remind you that the GTX 560 core frequency may vary between 810 and 950 MHz, and the shader domain frequency between 1620 and 1900 MHz. All three graphics cards in today's review have different clock speeds: EVGA GeForce GTX 560 Superclocked works at 851/1701 MHz, Gigabyte GeForce GTX 560 Ultra Durable at 830/1660 MHz, and MSI GeForce N560GTX Twin Frozr II-OC at 870/1740 MHz. In other words, the latter graphics card has the fastest GPU of the three. However, the previously reviewed Palit GeForce GTX 560 Sonic Platinum had an even faster GPU: 900/1800 MHz.
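Note that each core/shader pair above follows the fixed 2:1 ratio of the GF114 architecture (the 1620-1900 MHz shader range is simply twice the 810-950 MHz core range). A minimal sketch verifying this against the review's own numbers:

```python
# Core/shader clock pairs (MHz) as listed in the review.
PAIRS = {
    "EVGA GTX 560 Superclocked": (851, 1701),
    "Gigabyte GTX 560 Ultra Durable": (830, 1660),
    "MSI N560GTX Twin Frozr II-OC": (870, 1740),
    "Palit GTX 560 Sonic Platinum": (900, 1800),
}

for card, (core, shader) in PAIRS.items():
    # The shader domain runs at twice the core clock; tiny deviations
    # (e.g. 851/1701) come from clock-generator granularity.
    ratio = shader / core
    print(f"{card}: {core}/{shader} MHz, ratio {ratio:.3f}")
```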
The amount of video memory is the same on all three graphics cards: each has 1 GB of GDDR5 in eight FCFBGA-packaged chips. However, the chips themselves are different. The EVGA and MSI cards carry Samsung chips marked K4G10325FE-HC04, while the Gigabyte graphics card carries Hynix H5GQ1H24AFR-T2C:
Nevertheless, both these chips have the same 0.4 ns nominal access time and 5000 MHz theoretical effective frequency. In 3D mode the video memory of the cards works at slightly different frequencies: 4104 MHz on the EVGA card, 4008 MHz on the Gigabyte and 4080 MHz on the MSI. As for other specifications, these chips run at 1.5 V on a 256-bit bus, and drop to a 270 MHz effective frequency in power-saving mode.
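These figures are easy to cross-check. A minimal sketch, assuming the usual reviewer convention that a DDR-type chip's rated effective frequency is 2000 divided by its access time in nanoseconds, and that peak bandwidth is the effective frequency times the bus width in bytes:

```python
def effective_mhz(access_time_ns: float) -> float:
    """Rated effective (DDR) frequency implied by the access time."""
    return 2000.0 / access_time_ns

def bandwidth_gbs(eff_mhz: float, bus_width_bits: int = 256) -> float:
    """Peak memory bandwidth in GB/s: transfers/s times bytes per transfer."""
    return eff_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(effective_mhz(0.4))  # 5000.0 MHz rated for 0.4 ns chips

# Actual 3D-mode clocks from the review imply these peak bandwidths:
for card, mhz in [("EVGA", 4104), ("Gigabyte", 4008), ("MSI", 4080)]:
    print(f"{card}: {bandwidth_gbs(mhz):.1f} GB/s")
```

So all three cards run their memory below the chips' 5000 MHz rating, leaving some headroom for overclocking.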
The screenshots below show the graphics cards' specifications as reported by the GPU-Z utility: