Every graphics card from both Nvidia and AMD supports multi-GPU technology (called SLI and CrossFireX, respectively), the only exception being the junior product series, which are not really meant for gaming. Frankly speaking, these technologies are far from popular, yet their developers keep optimizing them with each new graphics architecture to ensure maximum performance scaling. Even hardcore gamers, let alone ordinary users, prefer to buy a single top-end graphics card instead of two mainstream ones running in SLI or CrossFireX mode. This approach is perfectly justified in the majority of applications, but there are situations when two mainstream cards turn out to be faster than a single top-end card in both average and minimum frame rate, the latter being especially important for playability.
So, in this review we will study the efficiency of Nvidia’s SLI technology with two and three GeForce GTX 660 Ti graphics cards and will compare such multi-GPU configurations with the fastest single-processor card and with the dual-processor GeForce GTX 690. We won’t describe the cards themselves because we’ve recently posted a large roundup of eight GeForce GTX 660 Ti products. Three of them were used to prepare this article.
Testbed Configuration and Testing Methodology
All participating graphics cards were tested in the following testbed:
- Mainboard: Intel Siler DX79SI (Intel X79 Express, LGA 2011, BIOS 0537 from 07/23/2012);
- CPU: Intel Core i7-3960X Extreme Edition, 3.3 GHz, 1.2 V, 6 x 256 KB L2, 15 MB L3 (Sandy Bridge-E, C1, 32 nm);
- CPU cooler: Phanteks PH-TC14PE (2 x 135 mm fans at 900 RPM);
- Thermal interface: ARCTIC MX-4;
- System memory: 4 x 4 GB DDR3 Mushkin Redline (2133 MHz / 9-10-10-28 / 1.65 V);
- Graphics cards:
- NVIDIA GeForce GTX 690 2x2 GB 256 bit GDDR5, 1020/6008 MHz;
- ASUS GeForce GTX 680 DirectCU II TOP 2 GB 256 bit GDDR5, 1137/6008 MHz;
- MSI N660Ti PE 2GD5/OC TwinFrozr IV 2 GB 256 bit GDDR5, 1020/6008 MHz;
- Gigabyte GeForce GTX 660 Ti Ultra Durable 2 GB 256 bit GDDR5, 1020/6008 MHz;
- KFA2 GeForce GTX 660 Ti EX OC 2 GB GDDR5, 1020/6008 MHz;
- System drive: Crucial m4 256 GB SSD (SATA-III, CT256M4SSD2, firmware v0009);
- Drive for programs and games: Western Digital VelociRaptor (300GB, SATA-II, 10000 RPM, 16MB cache, NCQ) inside a Scythe Quiet Drive 3.5” HDD silencer and cooler;
- Backup drive: Samsung Ecogreen F4 HD204UI (SATA-II, 2 TB, 5400 RPM, 32 MB, NCQ);
- System case: Antec Twelve Hundred (front panel: three Noiseblocker NB-Multiframe S-Series MF12-S2 fans at 1020 RPM; back panel: two Noiseblocker NB-BlackSilentPRO PL-1 fans at 1020 RPM; top panel: default 200 mm fan at 400 RPM);
- Control and monitoring panel: Zalman ZM-MFC3;
- Power supply: Seasonic SS-1000XP Active PFC F3 (1000 W, 120 mm fan);
- Monitor: 27” Samsung S27A850D (DVI-I, 2560x1440, 60 Hz).
We used MSI, Gigabyte and KFA2 graphics cards to build our 2-way and 3-way SLI configurations:
The GPU frequencies of these graphics cards were locked at 1020 MHz, and the graphics memory frequency remained unchanged at 6008 MHz:
While the single-processor flagship kept its nominal clock frequencies, we increased the GK104 frequencies of the dual-processor GeForce GTX 690 to match the speed of the GeForce GTX 660 Ti, i.e. to 1020 MHz. There will be no AMD (“red”) cards in today’s test session.
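As a side note, the peak memory bandwidth implied by these clocks follows from the standard formula (the helper function below is just an illustration, not part of our testing toolkit):

```python
# Peak memory bandwidth = bus width (in bytes) x effective data rate.
# All values are taken from the card specs quoted above.

def mem_bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    """Return peak memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

# A 256-bit bus at 6008 MHz effective GDDR5 clock:
print(mem_bandwidth_gbs(256, 6008))  # ~192.3 GB/s
```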
In order to reduce the dependence of graphics card performance on the overall platform speed, we overclocked our 32 nm six-core CPU to 4.625 GHz by setting the multiplier to 37x and the BCLK frequency to 125 MHz, with “Load-Line Calibration” enabled. The processor Vcore was increased to 1.49 V in the mainboard’s BIOS:
Hyper-Threading technology was enabled. The 16 GB of DDR3 system memory worked at 2 GHz with 9-11-10-28 timings and 1.65 V voltage.
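The resulting CPU clock rate is simply the product of the base clock and the multiplier (a trivial sketch; the function name is ours):

```python
# Core clock = base clock (BCLK) x multiplier.
# Values from the overclocking settings described above.

def cpu_clock_ghz(bclk_mhz: float, multiplier: int) -> float:
    """Return the resulting CPU core clock in GHz."""
    return bclk_mhz * multiplier / 1000.0

# 125 MHz BCLK x 37 multiplier:
print(cpu_clock_ghz(125, 37))  # 4.625
```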
The test session started on September 8, 2012. All tests were performed in Microsoft Windows 7 Ultimate x64 SP1 with all critical updates as of that date and the following drivers:
- Intel Chipset Drivers 188.8.131.521 WHQL from 07/27/2012 for the mainboard chipset;
- DirectX End-User Runtimes from November 30, 2010;
- Nvidia GeForce 306.23 beta driver from 08/27/2012 for Nvidia based graphics cards.
We ran the tests in two resolutions: 1920x1080 and 2560x1440 pixels. The tests were performed in two image quality modes: “Quality+AF16x” – the drivers’ default texturing quality with 16x anisotropic filtering enabled – and “Quality+AF16x+MSAA 4(8)x” – 16x anisotropic filtering plus full-screen 4x or 8x antialiasing whenever the average frame rate remained high enough for comfortable gameplay. We enabled anisotropic filtering and full-screen antialiasing from the game settings; if the corresponding options were missing, we changed these settings in the GeForce driver Control Panel, where we also disabled Vsync. There were no other changes in the driver settings.
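The resolution and quality combinations above form a simple four-entry test matrix per title; a minimal sketch (the labels are illustrative, taken from the mode names used in this review):

```python
from itertools import product

# The two resolutions and two image quality modes described above.
resolutions = ["1920x1080", "2560x1440"]
modes = ["Quality+AF16x", "Quality+AF16x+MSAA 4(8)x"]

# Each game or benchmark is run in every combination.
test_matrix = list(product(resolutions, modes))
print(len(test_matrix))  # 4 combinations per title
```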
The list of games and applications used in this test session includes two semi-synthetic benchmarking suites, one technical demo and seventeen games of various genres, all with updates installed as of the beginning of the test session. We added two new titles – F1 2012 and Borderlands 2:
- 3DMark Vantage (DirectX 10) – version 184.108.40.206, Performance and Extreme profiles (only basic tests);
- 3DMark 11 (DirectX 11) – version 220.127.116.11, Performance and Extreme profiles;
- Unigine Heaven Demo (DirectX 11) – version 3.0, maximum graphics quality settings, tessellation at “extreme”, AF16x, 1280x1024 resolution with MSAA and 1920x1080 with MSAA 8x;
- S.T.A.L.K.E.R.: Call of Pripyat (DirectX 11) – version 1.6.02, Enhanced Dynamic DX11 Lighting profile with all parameters manually set at their maximums, we used our custom cop03 demo on the Backwater map;
- Metro 2033: The Last Refuge (DirectX 10/11) – version 1.2, official benchmark, “High” image quality settings; tessellation, DOF and MSAA4x disabled; AAA antialiasing enabled; two consecutive runs of the “Frontline” scene;
- Just Cause 2 (DirectX 11) - version 18.104.22.168, maximum quality settings, Background Blur and GPU Water Simulation disabled, two consecutive runs of the “Dark Tower” demo;
- Aliens vs. Predator (2010) (DirectX 11) – Texture Quality “Very High”, Shadow Quality “High”, SSAO On, two test runs in each resolution;
- Lost Planet 2 (DirectX 11) – version 1.0, maximum graphics quality settings, motion blur enabled, performance test “B” (average in all three scenes);
- Sid Meier’s Civilization V (DirectX 11) – version 22.214.171.1248, maximum graphics quality settings, two runs of the “diplomatic” benchmark including five heaviest scenes;
- Total War: Shogun 2 (DirectX 11) – version 2.0, built in benchmark (Sekigahara battle) at maximum graphics quality settings;
- Crysis 2 (DirectX 11) – version 1.9, we used Adrenaline Crysis 2 Benchmark Tool v.126.96.36.199. BETA with “Ultra High” graphics quality profile and activated HD textures, two runs of a demo recorded on “Times Square” level;
- Hard Reset Demo (DirectX 9) – benchmark built into the demo version with Ultra image quality settings, one test run;
- Batman: Arkham City (DirectX 11) – version 1.2, maximum graphics quality settings, physics disabled, two sequential runs of the benchmark built into the game;
- Battlefield 3 (DirectX 11) – version 1.4, all image quality settings set to “Ultra”, two successive runs of a 110-second scripted scene from the beginning of the “Going Hunting” mission;
- Nexuiz (DirectX 11) – version 1.0, Ultra settings, double run of the built-in benchmark with High and Ultra profiles;
- DiRT Showdown (DirectX 11) – version 1.0, built-in benchmark at maximum graphics quality settings (“Ultra” preset) on the “Nevada” track;
- Sniper Elite V2 Benchmark (DirectX 11) – version 1.05, we used Adrenaline Sniper Elite V2 Benchmark Tool v188.8.131.52 BETA with maximum graphics quality settings (“Ultra” profile), Advanced Shadows: HIGH, Ambient Occlusion: ON, Stereo 3D: OFF, two sequential test runs;
- Sleeping Dogs (DirectX 11) – version 1.0, built-in benchmark, maximum graphics quality settings, Hi-Res Texture Pack installed, FPS Limiter and V-Sync disabled, a double run of the benchmark with antialiasing at the Normal and Extreme levels;
- F1 2012 (DirectX 11) – version 1.0, built-in benchmark with “Ultra” image quality settings;
- Borderlands 2 (DirectX 11) – version 1.0, built-in benchmark with maximum image quality settings and maximum PhysX level, FXAA enabled.
If the game allowed recording minimum fps readings, they were also added to the charts. We ran each game test or benchmark twice and took the better result for the diagrams, but only if the difference between the two runs didn’t exceed 1%. If it did, we ran the tests at least one more time to achieve repeatability of results.
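The repeatability rule described above can be sketched roughly as follows (`run_benchmark` is a hypothetical stand-in for launching an actual game test; the 1% tolerance and the retry limit are the only values taken from the text):

```python
# Sketch of the repeatability rule: run twice, accept the best
# result only if the last two runs agree within 1%; otherwise
# keep rerunning (up to a sanity limit of extra runs).

def stable_best(run_benchmark, tolerance=0.01, max_runs=5):
    """Return the best fps result once two consecutive runs agree."""
    results = [run_benchmark(), run_benchmark()]
    while True:
        last_two = results[-2:]
        best, worst = max(last_two), min(last_two)
        if (best - worst) / best <= tolerance or len(results) >= max_runs:
            return max(results)
        results.append(run_benchmark())

# Example: two runs within 1% of each other are accepted as-is.
runs = iter([60.0, 60.3])
print(stable_best(lambda: next(runs)))  # 60.3
```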