Although the Palit GeForce GTX 680 Jetstream comes pre-overclocked from the factory, we managed to raise the GPU frequency of our sample slightly further and to achieve a substantial increase in its memory frequency. The GPU was stable at a base clock of 1155 MHz (and a boost clock of 1233 MHz). The graphics memory could work at 6900 MHz.
Most interestingly, the temperature of our Palit remained at the same 78°C under peak load, at the same fan speed!
Palit’s cooler is highly efficient indeed.
Testbed Configuration and Testing Methodology
All graphics cards were tested in a system with the following configuration:
- Mainboard: Intel Siler DX79SI (Intel X79 Express, LGA 2011, BIOS 0460 from 03/27/2012);
- CPU: Intel Core i7-3960X Extreme Edition, 3.3 GHz, 1.2 V, 6 x 256 KB L2, 15 MB L3 (Sandy Bridge-E, C1, 32 nm);
- CPU cooler: Phanteks PH-TC14PE (2 x 135 mm fans at 900 RPM);
- Thermal interface: ARCTIC MX-4;
- System memory: DDR3 4 x 4GB Mushkin Redline (Spec: 2133 MHz / 9-11-10-28 / 1.65 V);
- Graphics cards:
- Nvidia GeForce GTX 690 2 x 2 GB, 256 bit, GDDR5, 915/6008 MHz;
- Palit GeForce GTX 680 Jetstream 2 GB, 256 bit, GDDR5, 1006/6300 MHz;
- Nvidia GeForce GTX 680 2 GB, 256 bit, GDDR5, 1006/6008 MHz;
- Gigabyte Radeon HD 7970 Ultra Durable 3 GB, 384 bit, GDDR5, 1000/5800 MHz;
- Sapphire Radeon HD 7970 OC Dual-X 3 GB, 384 bit, GDDR5, 1000/5800 MHz;
- System drive: Crucial m4 256 GB SSD (SATA-III, CT256M4SSD2, BIOS v0009);
- Drive for programs and games: Western Digital VelociRaptor (300 GB, SATA-II, 10000 RPM, 16 MB cache, NCQ) inside a Scythe Quiet Drive 3.5” HDD silencer and cooler;
- Backup drive: Samsung Ecogreen F4 HD204UI (SATA-II, 2 TB, 5400 RPM, 32 MB, NCQ);
- System case: Antec Twelve Hundred (front panel: three Noiseblocker NB-Multiframe S-Series MF12-S2 fans at 1020 RPM; back panel: two Noiseblocker NB-BlackSilentPRO PL-1 fans at 1020 RPM; top panel: standard 200 mm fan at 400 RPM);
- Control and monitoring panel: Zalman ZM-MFC3;
- Power supply: Xigmatek “No Rules Power” NRP-HC1501 1500 W (with a default 140 mm fan);
- Monitor: 30” Samsung 305T Plus.
The frequencies of the GeForce GTX 680 graphics cards tested in SLI configuration were adjusted to their nominal values.
In order to lower the dependence of graphics card performance on the overall platform speed, I overclocked our 32 nm six-core CPU to 4.625 GHz by setting the multiplier to 37x and the BCLK frequency to 125 MHz, with “Load-Line Calibration” enabled. The processor Vcore was increased to 1.46 V in the mainboard BIOS.
Hyper-Threading technology was enabled. The 16 GB of system DDR3 memory worked at 2 GHz with 9-11-10-28 timings and 1.65 V voltage.
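As a quick sanity check, the final CPU frequency follows directly from the multiplier and the base clock; a minimal sketch:

```python
# Sanity check of the CPU overclock described above:
# effective clock = multiplier x base clock (BCLK).

def cpu_clock_ghz(multiplier: int, bclk_mhz: float) -> float:
    """Effective CPU frequency in GHz for a given multiplier and BCLK."""
    return multiplier * bclk_mhz / 1000

print(cpu_clock_ghz(37, 125))  # 4.625 (GHz), matching the overclock above
```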
The test session started on May 1, 2012. All tests were performed in Microsoft Windows 7 Ultimate x64 SP1 with all critical updates as of that date and the following drivers:
- Intel Chipset Drivers 22.214.171.1240 WHQL from 01/26/2011 for the mainboard chipset;
- DirectX End-User Runtimes libraries from November 30, 2010;
- AMD Catalyst 12.3 driver from 03/28/2012 + Catalyst Application Profiles 12.3 (CAP1) from 03/29/2012 for AMD based graphics cards;
- Nvidia GeForce 301.33 beta driver for Nvidia graphics cards.
The graphics cards were tested at two resolutions: 1920x1080 and 2560x1600. The tests were performed in two image quality modes: “Quality+AF16x” (default texturing quality in the drivers with 16x anisotropic filtering enabled) and “Quality+AF16x+MSAA 4(8)x” (16x anisotropic filtering together with full-screen 4x or 8x antialiasing, used when the average framerate was high enough for a comfortable gaming experience). We enabled anisotropic filtering and full-screen antialiasing from the game settings; if the corresponding options were missing, we changed these settings in the Control Panels of the Catalyst and GeForce drivers. We also disabled Vsync there. There were no other changes in the driver settings.
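The resolutions and quality modes above combine into up to four configurations per game; a minimal sketch of that test matrix (the mode labels are shortened for illustration):

```python
from itertools import product

# Test matrix from the methodology above: two resolutions crossed with
# two image quality modes ("Quality+AF16x" and "Quality+AF16x+MSAA 4(8)x").
resolutions = ["1920x1080", "2560x1600"]
quality_modes = ["AF16x", "AF16x+MSAA"]

test_matrix = list(product(resolutions, quality_modes))
for resolution, mode in test_matrix:
    print(f"{resolution} / {mode}")
# Each game is run in up to 4 configurations; the MSAA mode is skipped
# when the framerate is too low for comfortable play.
```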
The list of games and applications used in this test session includes two popular semi-synthetic benchmarking suites, one technical demo and 15 games of various genres:
- 3DMark Vantage (DirectX 10) – version 126.96.36.199, Performance and Extreme profiles (only basic tests);
- 3DMark 11 (DirectX 11) – version 188.8.131.52, Performance and Extreme profiles;
- Unigine Heaven Demo (DirectX 11) – version 3.0, maximum graphics quality settings, tessellation at “extreme”, AF16x, 1280x1024 resolution with MSAA and 1920x1080 with MSAA 8x;
- S.T.A.L.K.E.R.: Call of Pripyat (DirectX 11) – version 1.6.02, Enhanced Dynamic DX11 Lighting profile with all parameters manually set at their maximums, we used our custom cop03 demo on the Backwater map;
- Left 4 Dead 2 (DirectX 9) – version 184.108.40.206, maximum graphics quality settings, proprietary d98 demo (two runs) on “Death Toll” map of the “Church” level;
- Metro 2033: The Last Refuge (DirectX 10/11) – version 1.2, maximum graphics quality settings, official benchmark, “High” image quality settings; tessellation, DOF and MSAA 4x disabled; AAA antialiasing enabled, two consecutive runs of the “Frontline” scene;
- Just Cause 2 (DirectX 11) – version 220.127.116.11, maximum quality settings, Background Blur and GPU Water Simulation disabled, two consecutive runs of the “Dark Tower” demo;
- Aliens vs. Predator (2010) (DirectX 11) – Texture Quality “Very High”, Shadow Quality “High”, SSAO On, two test runs in each resolution;
- Lost Planet 2 (DirectX 11) – version 1.0, maximum graphics quality settings, motion blur enabled, performance test “B” (average in all three scenes);
- StarCraft 2: Wings of Liberty (DirectX 9) – version 1.4.3, all image quality settings at “Extreme”, Physics at “Ultra”, reflections On, two 2-minute runs of our own “bench2” demo;
- Sid Meier’s Civilization V (DirectX 11) – version 18.104.22.1688, maximum graphics quality settings, two runs of the “diplomatic” benchmark including five heaviest scenes;
- Tom Clancy's H.A.W.X. 2 (DirectX 11) – version 1.04, maximum graphics quality settings, shadows On, tessellation Off (not available on Radeon), two runs of the test scene;
- Total War: Shogun 2 (DirectX 11) – version 2.0, built in benchmark (Sekigahara battle) at maximum graphics quality settings;
- Crysis 2 (DirectX 11) – version 1.9, we used Adrenaline Crysis 2 Benchmark Tool v.22.214.171.124. BETA with “Ultra High” graphics quality profile and activated HD textures, two runs of a demo recorded on “Times Square” level;
- DiRT 3 (DirectX 11) – version 1.2, built-in benchmark at maximum graphics quality settings on the “Aspen” track;
- Hard Reset Demo (DirectX 9) – benchmark built into the demo version with Ultra image quality settings, one test run;
- Batman: Arkham City (DirectX 11) – version 1.2, maximum graphics quality settings, physics disabled, two sequential runs of the benchmark built into the game;
- Battlefield 3 (DirectX 11) – version 1.3, all image quality settings set to “Ultra”, two successive runs of a scripted scene from the beginning of the “Going Hunting” mission 110 seconds long.
If a game allowed recording the minimum fps readings, they were also added to the charts. We ran each game test or benchmark twice and took the better result for the diagrams, but only if the difference between the two runs didn’t exceed 1%. If it did, we ran the test at least once more to achieve repeatable results.
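The repeatability rule above can be sketched as follows; `run_benchmark` is a hypothetical stand-in for launching an actual game benchmark and reading back its average fps:

```python
# A sketch of the repeatability rule described above: run each benchmark
# twice, keep the best result if the two best runs agree within 1%, and
# otherwise keep rerunning until two results do.

def best_repeatable_result(run_benchmark, tolerance=0.01, max_runs=10):
    """Return the best fps once the two best runs differ by <= tolerance."""
    results = [run_benchmark(), run_benchmark()]
    while len(results) <= max_runs:
        best, runner_up = sorted(results, reverse=True)[:2]
        if (best - runner_up) / best <= tolerance:
            return best
        results.append(run_benchmark())
    return max(results)  # fall back to the best of all runs

# Example with canned fps readings instead of a real benchmark run:
readings = iter([60.2, 59.9])
print(best_repeatable_result(lambda: next(readings)))  # 60.2
```

`max_runs` is an assumed safety cap, not something stated in the article; it just keeps the sketch from looping forever on a noisy benchmark.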