We measured noise levels with a CENTER-321 electronic noise-level meter in a closed, quiet room measuring about 20 square meters. The meter was set on a tripod at a distance of 15 centimeters from the graphics card, which was installed on an open testbed. The mainboard with the graphics card was placed on the edge of a desk on a foam-rubber tray.
The bottom limit of our noise-level meter is 29.8 dBA, whereas the subjectively comfortable (not low, but comfortable) level of noise measured from that distance is about 36 dBA. The speed of the graphics card’s fan was adjusted with a controller that changed the supply voltage in 0.5 V steps.
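As an illustration of this procedure, here is a minimal Python sketch of how a voltage-stepped noise sweep can be post-processed against the meter’s 29.8 dBA floor and the 36 dBA comfort threshold. The readings and function names are purely illustrative, not our actual measured data:

```python
# Hypothetical post-processing of a fan noise sweep: the controller steps the
# supply voltage in 0.5 V increments and a dBA reading is taken at each step.
# The values below are made up for illustration.

METER_FLOOR_DBA = 29.8   # bottom limit of the CENTER-321 meter
COMFORT_DBA = 36.0       # subjectively comfortable level at 15 cm

def summarize_sweep(readings):
    """readings: list of (voltage_V, noise_dBA) tuples."""
    rows = []
    for volts, dba in readings:
        rows.append({
            "voltage": volts,
            # Anything quieter than the meter floor is reported as the floor.
            "dBA": max(dba, METER_FLOOR_DBA),
            "comfortable": dba <= COMFORT_DBA,
        })
    return rows

sweep = [(5.0, 29.5), (7.0, 33.2), (9.0, 36.0), (12.0, 43.8)]
for row in summarize_sweep(sweep):
    print(row)
```

Clamping to the meter floor reflects the fact that readings below 29.8 dBA cannot be trusted with this instrument.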
For comparison, the next diagram also includes the results of the reference Nvidia GeForce GTX 680 and AMD Radeon HD 7970 cards (the vertical dotted lines indicate the top speed of the fans in automatic regulation mode):
The DirectCU II cooler from ASUS is not only efficient but also quiet, as it has already proved in our earlier tests. Unfortunately, we can’t say the same about the cooler of the MSI R7970 Lightning. The latter is louder than the reference AMD Radeon HD 7970 and is in fact the loudest of the four tested cards. MSI’s programmers should have written lower fan speed limits into the card’s BIOS to make it quieter. MSI might also consider changing the supplier of the Twin Frozr IV’s 100 mm fans, because Power Logic fans have never been perfect in terms of noise.
Testbed Configuration and Testing Methodology
Nvidia GeForce GTX 670 and its competitors were tested in a system with the following configuration:
- Mainboard: Intel Siler DX79SI (Intel X79 Express, LGA 2011, BIOS 0460 from 04/24/2012);
- CPU: Intel Core i7-3960X Extreme Edition, 3.3 GHz, 1.2 V, 6 x 256 KB L2, 15 MB L3 (Sandy Bridge-E, C1, 32 nm);
- CPU cooler: Phanteks PH-TC14PE (2 x 135 mm fans at 900 RPM);
- Thermal interface: ARCTIC MX-4;
- System memory: DDR3 4 x 4GB Mushkin Redline (Spec: 2133 MHz / 9-11-10-28 / 1.65 V);
- Graphics cards:
- Asus GeForce GTX 680 DirectCU II TOP 2 GB, 256 bit, GDDR5, 1137/6008 MHz and 1212/7168 MHz;
- Nvidia GeForce GTX 680 2 GB, 256 bit, GDDR5, 1006/6008 MHz;
- MSI R7970 Lightning 3 GB, 384 bit, GDDR5, 1070/5600 MHz and 1165/7160 MHz;
- AMD Radeon HD 7970 3 GB, 384 bit, GDDR5, 925/5500 MHz;
- System drive: Crucial m4 256 GB SSD (SATA-III, CT256M4SSD2, BIOS v0009);
- Drive for programs and games: Western Digital VelociRaptor (300GB, SATA-II, 10000 RPM, 16MB cache, NCQ) inside Scythe Quiet Drive 3.5” HDD silencer and cooler;
- Backup drive: Samsung Ecogreen F4 HD204UI (SATA-II, 2 TB, 5400 RPM, 32 MB, NCQ);
- System case: Antec Twelve Hundred (front panel: three Noiseblocker NB-Multiframe S-Series MF12-S2 fans at 1020 RPM; back panel: two Noiseblocker NB-BlackSilentPRO PL-1 fans at 1020 RPM; top panel: standard 200 mm fan at 400 RPM);
- Control and monitoring panel: Zalman ZM-MFC3;
- Power supply: Xigmatek “No Rules Power” NRP-HC1501 1500 W (with a default 140 mm fan);
- Monitor: 27” Samsung S27A850D.
In order to lower the dependence of the graphics cards’ performance on the overall platform speed, I overclocked our 32 nm six-core CPU to 4.625 GHz by setting the multiplier to 37x and the BCLK frequency to 125 MHz, with “Load-Line Calibration” enabled. The processor Vcore was increased to 1.46 V in the mainboard BIOS.
Hyper-Threading technology was enabled. 16 GB of DDR3 system memory worked at 2 GHz with 9-10-10-28 timings and 1.65 V voltage.
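The resulting CPU clock follows directly from the multiplier and base clock quoted above; a one-line arithmetic check:

```python
# CPU frequency = multiplier x base clock (BCLK); values from the text.
multiplier = 37
bclk_mhz = 125
cpu_mhz = multiplier * bclk_mhz
print(cpu_mhz / 1000, "GHz")  # 4.625 GHz
```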
The test session started on May 14, 2012. All tests were performed in Microsoft Windows 7 Ultimate x64 SP1 with all critical updates as of that date and the following drivers:
- Intel Chipset Drivers 188.8.131.520 WHQL from 01/26/2011 for the mainboard chipset;
- DirectX End-User Runtimes libraries from November 30, 2010;
- AMD Catalyst 12.4 driver + Catalyst Application Profiles 12.4 (CAP1) for AMD based graphics cards;
- Nvidia GeForce 301.42 WHQL driver for Nvidia based graphics cards.
The graphics cards were tested at two resolutions: 1920x1080 and 2560x1440. The tests were performed in two image quality modes: “Quality+AF16x” – default texturing quality in the drivers with 16x anisotropic filtering enabled – and “Quality+AF16x+MSAA 4(8)x” with 16x anisotropic filtering and full-screen 4x or 8x antialiasing enabled if the average framerate was high enough for a comfortable gaming experience. We enabled anisotropic filtering and full-screen antialiasing from the game settings. If the corresponding options were missing, we changed these settings in the Control Panels of the Catalyst and GeForce drivers. We also disabled Vsync there. There were no other changes in the driver settings.
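These combinations amount to a small test matrix per game; a sketch of how it can be enumerated (the labels are ours and purely illustrative):

```python
# Enumerate the test matrix described above: 2 resolutions x 2 quality modes.
from itertools import product

resolutions = ["1920x1080", "2560x1440"]
modes = ["Quality+AF16x", "Quality+AF16x+MSAA 4(8)x"]

matrix = [(res, mode) for res, mode in product(resolutions, modes)]
for res, mode in matrix:
    print(res, "/", mode)
print(len(matrix), "combinations per game")
```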
The list of games and applications used in this test session includes two popular semi-synthetic benchmarking suites, one technical demo and 15 games of various genres:
- 3DMark Vantage (DirectX 10) – version 184.108.40.206, Performance and Extreme profiles (only basic tests);
- 3DMark 2011 (DirectX 11) – version 220.127.116.11, Performance and Extreme profiles;
- Unigine Heaven Demo (DirectX 11) – version 3.0, maximum graphics quality settings, tessellation at “extreme”, AF16x, 1280x1024 resolution with MSAA and 1920x1080 with MSAA 8x;
- S.T.A.L.K.E.R.: Call of Pripyat (DirectX 11) – version 1.6.02, Enhanced Dynamic DX11 Lighting profile with all parameters manually set at their maximums, we used our custom cop03 demo on the Backwater map;
- Left 4 Dead 2 (DirectX 9) – version 18.104.22.168, maximum graphics quality settings, proprietary d98 demo (two runs) on “Death Toll” map of the “Church” level;
- Metro 2033: The Last Refuge (DirectX 10/11) – version 1.2, maximum graphics quality settings, official benchmark, “High” image quality settings; tessellation, DOF and MSAA4x disabled; AAA antialiasing enabled, two consecutive runs of the “Frontline” scene;
- Just Cause 2 (DirectX 11) - version 22.214.171.124, maximum quality settings, Background Blur and GPU Water Simulation disabled, two consecutive runs of the “Dark Tower” demo;
- Aliens vs. Predator (2010) (DirectX 11) – Texture Quality “Very High”, Shadow Quality “High”, SSAO On, two test runs in each resolution;
- Lost Planet 2 (DirectX 11) – version 1.0, maximum graphics quality settings, motion blur enabled, performance test “B” (average in all three scenes);
- StarCraft 2: Wings of Liberty (DirectX 9) – version 1.4.3, all image quality settings at “Extreme”, Physics at “Ultra”, reflections On, two 2-minute runs of our own “bench2” demo;
- Sid Meier’s Civilization V (DirectX 11) – version 126.96.36.1998, maximum graphics quality settings, two runs of the “diplomatic” benchmark including five heaviest scenes;
- Tom Clancy's H.A.W.X. 2 (DirectX 11) – version 1.04, maximum graphics quality settings, shadows On, tessellation Off (not available on Radeon), two runs of the test scene;
- Total War: Shogun 2 (DirectX 11) – version 2.0, built in benchmark (Sekigahara battle) at maximum graphics quality settings;
- Crysis 2 (DirectX 11) – version 1.9, we used Adrenaline Crysis 2 Benchmark Tool v.188.8.131.52 BETA with the “Ultra High” graphics quality profile and HD textures activated, two runs of a demo recorded on the “Times Square” level;
- Hard Reset Demo (DirectX 9) – benchmark built into the demo version with Ultra image quality settings, one test run;
- Batman: Arkham City (DirectX 11) – version 1.2, maximum graphics quality settings, physics disabled, two sequential runs of the benchmark built into the game;
- Battlefield 3 (DirectX 11) – version 1.4, all image quality settings set to “Ultra”, two successive runs of a 110-second scripted scene from the beginning of the “Going Hunting” mission;
- DiRT Showdown (DirectX 11) – version 1.0, built-in benchmark at maximum graphics quality settings (“Ultra” preset) on the “Nevada” track.
If a game allowed recording minimum fps readings, we added them to the charts as well. We ran each game test or benchmark twice and took the better result for the diagrams, but only if the difference between the two runs didn’t exceed 1%. If it did, we ran the test at least one more time to achieve repeatable results.
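The repeatability rule above can be sketched as a simple loop. `run_benchmark` is a stand-in for an actual benchmark run, and the fps numbers are invented for illustration:

```python
# Run a benchmark at least twice, keep the best result, and re-run while the
# two best results differ by more than 1% (up to a run cap).

def best_repeatable_result(run_benchmark, tolerance=0.01, max_runs=5):
    results = [run_benchmark(), run_benchmark()]
    while len(results) < max_runs:
        top_two = sorted(results, reverse=True)[:2]
        # Stop once the two best runs agree to within the tolerance.
        if abs(top_two[0] - top_two[1]) / top_two[0] <= tolerance:
            break
        results.append(run_benchmark())
    return max(results)

# Example with canned fps values standing in for real runs:
fps_values = iter([61.0, 60.7, 60.9])
print(best_repeatable_result(lambda: next(fps_values)))  # 61.0
```

Capping the number of runs keeps a noisy benchmark from looping indefinitely.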