Technical Specifications and Recommended Pricing
Testbed and Methods
The tests were performed in a closed system case. Our testbed was configured as follows:
- Mainboard: Gigabyte GA-X58A-UD9 (Intel X58 Express, LGA 1366, BIOS F6c beta from 07/18/2011);
- Processor: Intel Core i7-980X Extreme Edition, 3.33 GHz, 1.225 V, 6 x 256 KB L2, 12 MB L3 (Gulftown, B1);
- Thermal interface: Arctic Cooling MX-2;
- Memory: DDR3 3 x 2 GB OCZ Platinum Low-Voltage Triple Channel (Spec: 1600 MHz / 7-7-7-24 / 1.65 V);
- Graphics cards:
- Gigabyte GeForce GTX 560 1 GB Ultra Durable (GV-N56GOC-1GI) 830/1660/4008 MHz;
- Inno3D iChiLL GTX 560 Ti 1 GB 930/1860/4200 MHz;
- System drive: RAID 0 array of two Kingston V-Series SNV425S2/128GB SSDs (SATA-II, 128 GB each, MLC, Toshiba TC58NCF618G3T);
- HDD for games and programs: Western Digital VelociRaptor WD3000HLFS (SATA-II, 300 GB storage capacity, 10,000 RPM, 16 MB cache, NCQ) inside Scythe Quiet Drive 3.5” silencer and cooler chassis;
- Backup HDD: Samsung EcoGreen F4 HD204UI (SATA-II, 2000 GB, 5400 RPM, 32 MB cache, NCQ);
- System case: Antec Twelve Hundred (front panel: three Noiseblocker NB-Multiframe S-Series MF12-S2 fans at 1080 RPM; back panel: two Noiseblocker NB-BlackSilent PRO PL-1 fans at 1080 RPM; top panel: standard 200 mm fan at 400 RPM);
- Control and monitoring panel: Zalman ZM-MFC2;
- Power supply: Xigmatek “No Rules Power” NRP-HC1501 1500 W (with a default 140 mm fan).
Since we couldn’t get the new ZEROtherm CoolMaxx 4000 cooler to work on any of the AMD Radeon HD graphics cards we had at the time of tests, we checked its performance only on Nvidia GeForce GTX cards. Unfortunately, both graphics cards use their own proprietary coolers, which are more efficient than the reference coolers to begin with:
That is why we can only compare the CoolMaxx 4000 against these two proprietary coolers. Well, looks like ZEROtherm’s challenge just got harder :)
The testing programs were installed under Microsoft Windows 7 Ultimate x64 SP1. We used the DirectX End-User Runtimes libraries (November 2010) and Nvidia GeForce/ION 280.25 graphics card drivers. The GPUs were warmed up with two 12-minute runs of FurMark version 1.9.1 in stability test mode with the “Burn-in” option enabled and the resolution set to 1920x1080. We enabled 16x anisotropic filtering in the driver control panel to increase the GPU operational load:
We also measured the graphics card temperatures in game mode using five runs of the Aliens vs. Predator game at 1920x1080 with maximum graphics quality settings but without antialiasing:
Testing the graphics cards in this mode shows their temperatures under a typical gaming load.
The tests were run at least twice for each type of load, with a temperature stabilization period of about 10-12 minutes between the two test cycles. The ambient temperature was checked next to the system case with an electronic thermometer accurate to 0.1°C, which can also track temperature changes over the preceding 6 hours. During our test session the room temperature stayed within 25.0-25.8°C.
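The logic of this procedure — take repeated runs, record the peak GPU temperature of each, and relate it to the measured ambient temperature — can be sketched as a small helper. The function name and the sample readings below are purely illustrative assumptions, not data from the actual test session:

```python
# Illustrative sketch of the reporting step described above.
# The temperature samples are made-up values; in a real session they
# would come from a hardware-monitoring tool during each FurMark run.

def summarize_run(samples_c, ambient_c):
    """Return the peak temperature of one run and its delta over ambient."""
    peak = max(samples_c)
    return peak, peak - ambient_c

# Two benchmark runs (GPU temperatures in deg C, hypothetical):
run1 = [38, 55, 67, 74, 78, 79, 79]
run2 = [39, 56, 68, 75, 79, 80, 80]
ambient = 25.4  # measured next to the system case

results = [summarize_run(r, ambient) for r in (run1, run2)]
# Repeating each load at least twice guards against outliers;
# report the hotter of the two runs.
peak, delta = max(results)
print(f"peak: {peak} deg C, {delta:.1f} deg C above ambient")
```

Comparing coolers by the delta over ambient rather than the raw peak compensates for the small room-temperature drift (25.0-25.8°C) noted above.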