Testbed Configuration and Testing Methodology
All graphics cards were benchmarked in a closed system case with the following configuration:
- Mainboard: Gigabyte GA-X58A-OC (Intel X58 Express, LGA 1366, BIOS F5c from 09/06/2011);
- CPU: Intel Core i7-980X Extreme Edition, 3.33 GHz, 1.225 V, 6 x 256 KB L2, 12 MB L3 (Gulftown, B1);
- CPU cooler: Thermalright Archon (Thermalright TY-140 fans at 600-1260 RPM);
- Thermal interface: ARCTIC MX-4;
- System memory: 3 x 2 GB OCZ Platinum Low-Voltage DDR3 Triple Channel (Spec: 1600 MHz / 7-7-7-24 / 1.65 V);
- Graphics cards:
- System drive: RAID-0 of 2 x Kingston V-series SNV425S2128GB SSD (SATA-II, 128 GB, MLC, Toshiba TC58NCF618G3T controller);
- Drive for programs and games: Western Digital VelociRaptor (300GB, SATA-II, 10000 RPM, 16MB cache, NCQ) inside Scythe Quiet Drive 3.5” HDD silencer and cooler;
- Backup drive: Samsung Ecogreen F4 HD204UI (SATA-II, 2 TB, 5400 RPM, 32 MB, NCQ);
- System case: Antec Twelve Hundred (front panel: three Noiseblocker NB-Multiframe S-Series MF12-S2 fans at 1020 RPM; back panel: two Noiseblocker NB-BlackSilentPRO PL-1 fans at 1020 RPM; top panel: standard 200 mm fan at 400 RPM);
- Control and monitoring panel: Zalman ZM-MFC2;
- Power supply: Xigmatek “No Rules Power” NRP-HC1501 1500 W (with a default 140 mm fan);
- Monitor: 30” Samsung 305T Plus.
In order to lower the dependence of graphics card performance on the overall platform speed, I overclocked our 32 nm six-core CPU to 4.5 GHz, with the multiplier set at 25x and “Load-Line Calibration” (Level 2) enabled. The processor Vcore was increased to 1.46875 V in the mainboard BIOS.
The 6 GB of system DDR3 memory worked at 1.44 GHz with 7-7-7-16_1T timings and 1.5 V voltage. Turbo Boost and Hyper-Threading technologies were disabled during our test session.
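The overclocked frequencies follow directly from the base clock (BCLK) and the multipliers. A quick sanity check of the arithmetic, assuming a 180 MHz BCLK (implied by the 25x multiplier at 4.5 GHz) and an 8x memory ratio, neither of which is stated explicitly in the text:

```python
# Sanity-check the overclocked frequencies from BCLK and multipliers.
# The 180 MHz BCLK and 8x memory ratio are inferred, not quoted from the BIOS.
cpu_multiplier = 25
cpu_target_ghz = 4.5

bclk_mhz = cpu_target_ghz * 1000 / cpu_multiplier  # 4500 / 25 = 180.0 MHz
mem_ratio = 8                                      # DDR3 memory ratio (assumed)
mem_mhz = bclk_mhz * mem_ratio                     # 180 * 8 = 1440.0 MHz, i.e. 1.44 GHz

print(f"BCLK: {bclk_mhz:.0f} MHz, memory: {mem_mhz:.0f} MHz")
```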
The test session started on October 8, 2011. All tests were performed in Microsoft Windows 7 Ultimate x64 SP1 with all critical updates as of that date and the following drivers:
- Intel Chipset Drivers 126.96.36.1992 WHQL from 10.04.2011 for the mainboard chipset;
- DirectX End-User Runtimes from November 30, 2010;
- AMD Catalyst 11.10 preview driver from 10.07.2011 for AMD-based graphics cards;
- Nvidia GeForce 285.37 beta driver from 09.26.2011 for Nvidia-based graphics cards.
The graphics cards were tested at two resolutions: today’s most popular 1920x1080 and the maximum 2560x1600. The tests were performed in two image quality modes: “High Quality + AF16x” – maximum texturing quality with 16x anisotropic filtering enabled – and “High Quality + AF16x + MSAA 4(8)x” – 16x anisotropic filtering plus full-screen 4x anti-aliasing (MSAA), or 8x if the average framerate was high enough for a comfortable gaming experience. We enabled anisotropic filtering and full-screen anti-aliasing in the game settings or configuration files. If the corresponding options were missing, we changed these settings in the control panels of the Catalyst and GeForce drivers. There were no other changes to the driver settings.
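The two resolutions and two quality modes define a small test matrix that every game is run through. A minimal sketch of how those runs could be enumerated (the labels mirror the article; the structure itself is illustrative, not the harness actually used):

```python
# Enumerate the benchmark test matrix described above.
from itertools import product

resolutions = ["1920x1080", "2560x1600"]
quality_modes = [
    "High Quality + AF16x",            # 16x anisotropic filtering only
    "High Quality + AF16x + MSAA 4x",  # plus 4x (or 8x) full-screen anti-aliasing
]

test_matrix = list(product(resolutions, quality_modes))
for resolution, mode in test_matrix:
    print(f"{resolution:>9}  {mode}")

print(f"{len(test_matrix)} configurations per game")  # 4 configurations per game
```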
The list of games and applications used in this test session includes two popular semi-synthetic benchmarking suites, one technical demo and 15 games of various genres:
- 3DMark Vantage (DirectX 10) – version 188.8.131.52, Performance and Extreme profiles (only basic tests);
- 3DMark 2011 (DirectX 11) – version 184.108.40.206, Performance and Extreme profiles;
- Unigine Heaven Demo (DirectX 11) – version 2.5, maximum graphics quality settings, tessellation at “extreme”, AF16x, 1920x1080 resolution with MSAA 4x;
- BattleForge: Lost Souls (DirectX 11) – version 1.2 (08.18.2011), maximum image quality settings, shadows enabled, SSAO technology enabled, two runs of the built-in benchmark;
- S.T.A.L.K.E.R.: Call of Pripyat (DirectX 11) – version 1.6.02, Enhanced Dynamic DX11 Lighting profile with all parameters manually set at their maximums, we used our custom cop03 demo on the Backwater map;
- Left 4 Dead 2 (DirectX 9) – version 220.127.116.11, maximum graphics quality settings, d81 demo (two runs) on “Gold Stream (Beta)” map of the “Alpine Creek” level;
- Metro 2033: The Last Refuge (DirectX 10/11) – version 1.2, maximum graphics quality settings, official benchmark, “High” image quality settings; tessellation, DOF and MSAA4x disabled; analytical anti-aliasing (AAA) enabled, two consecutive runs of the “Frontline” scene;
- Just Cause 2 (DirectX 11) – version 18.104.22.168, maximum quality settings, Background Blur and GPU Water Simulation enabled, two consecutive runs of the “Dark Tower” demo;
- Aliens vs. Predator (2010) (DirectX 11) – Texture Quality “Very High”, Shadow Quality “High”, SSAO On, two test runs in each resolution;
- Lost Planet 2 (DirectX 11) – version 1.0, maximum graphics quality settings, motion blur enabled, performance test “B” (average in all three scenes);
- StarCraft 2: Wings of Liberty (DirectX 11) – version 1.0, all image quality settings at “Ultra”, Physics “Ultra”, reflections On, two 2-minute runs of our own jt1 demo;
- Sid Meier’s Civilization V (DirectX 11) – version 22.214.171.1248, maximum graphics quality settings, two runs of the “diplomatic” benchmark including five heaviest scenes;
- Tom Clancy's H.A.W.X. 2 (DirectX 11) – version 1.04, maximum graphics quality settings, shadows On, tessellation Off (not available on Radeon), two runs of the test scene;
- Crysis 2 (DirectX 11) – version 1.9, Adrenaline Crysis 2 Benchmark Tool v126.96.36.199 BETA, “Ultra High” graphics quality profile, High-Definition textures enabled, two runs of the demo in “Times Square” scene;
- Total War: Shogun 2 (DirectX 11) – version 2.0, built-in benchmark (Sekigahara battle) at maximum graphics quality settings;
- DiRT 3 (DirectX 11) – version 1.2, built-in benchmark at maximum graphics quality settings on the “Aspen” track;
- World of Planes (DirectX 9) – alpha-version (from 08.19.2011), maximum image quality settings, one test run;
- Hard Reset Demo (DirectX 9) – benchmark built into the demo version with Ultra image quality settings, one test run.
If a game allowed recording minimum fps readings, they were also added to the charts. We ran each game test or benchmark twice and took the better result for the diagrams, provided the difference between the two runs didn’t exceed 1%. If it did exceed 1%, we ran the test at least one more time to achieve repeatability of results.
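The run-until-repeatable rule above can be sketched as follows. The `run_benchmark` callable, the run cap, and the fall-back behavior are illustrative assumptions, not the actual test harness:

```python
# Sketch of the repeatability rule: accept the best of the runs so far only if
# the two best results agree within 1%; otherwise keep re-running.
def best_repeatable_result(run_benchmark, tolerance=0.01, max_runs=10):
    """run_benchmark() returns an average-fps float for one benchmark pass."""
    results = [run_benchmark(), run_benchmark()]
    while len(results) <= max_runs:
        best, second = sorted(results, reverse=True)[:2]
        if (best - second) / best <= tolerance:  # within 1% -> accept best run
            return best
        results.append(run_benchmark())          # otherwise run at least once more
    return max(results)  # fall back to the best observed result (assumed policy)

# Illustrative usage with canned fps readings instead of a real benchmark run:
readings = iter([60.0, 57.0, 59.8])              # third run converges with the first
print(best_repeatable_result(lambda: next(readings)))  # 60.0
```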