Testbed and Methods
The graphics cards were benchmarked in a closed system case with the following configuration:
- Mainboard: DFI LANPARTY DK X48-T2RS (Intel X48, LGA775, BIOS dated 29/08/2008)
- CPU: Intel Core 2 Extreme QX9650 (3.0GHz, 1.25V, 2x6MB L2 cache, 4x333MHz FSB, Yorkfield, C0 revision)
- CPU cooler: Thermalright SI-128 SE (Scythe Ultra Kaze at 1320rpm)
- Thermal interface: Gelid GC1
- System memory:
- 2 x 1024MB Corsair Dominator TWIN2X2048-9136C5D DDR2 SDRAM (Specs: 1142MHz, 5-5-5-18, 2.1V)
- 2 x 1024MB CSX DIABLO CSXO-XAC-1200-2GB-KIT DDR2 SDRAM (Specs: 1200MHz, 5-5-5-16, 2.4V)
- Disk subsystem: Western Digital VelociRaptor (300GB, SATA-II, 10,000rpm, 16MB cache, NCQ)
- HDD cooling and sound insulation system: Scythe Quiet Drive for 3.5-inch HDD
- Optical drive: Samsung SH-S183L DVD-burner (SATA-II)
- System case: ASUS ASCOT 6AR2-B Black&Silver (with 120mm 960rpm Scythe Slip Stream system fans for intake and exhaust and another such fan, at 800rpm, on the side panel)
- Control and monitoring panel: Zalman ZM-MFC2
- Power supply: Thermaltake Toughpower (W0218, 1500W, 140mm fan)
- Monitor: 24-inch BenQ FP241W (1920x1200@60Hz)
To minimize the CPU’s influence on the graphics cards’ performance, I overclocked the CPU to 4.00GHz at a voltage of 1.575V for the duration of the tests.
The system memory worked at a frequency of 1066MHz with 5-4-4-12 timings (Performance Level = 6) at a voltage of 2.30V.
The tests were run under Windows Vista Ultimate Edition x86 SP1. I used the latest drivers available at the moment of my tests (some screenshots were made before or after the tests and may show different driver versions):
- Intel Chipset Drivers version 18.104.22.1687
- DirectX dated August 2008
- ATI Catalyst 8.10 HF (8.542)
- Nvidia ForceWare 178.24 (15/10/08, WHQL)
- Physics driver PhysX 8.09.04 (16/09/08, WHQL)
The graphics card drivers were set to High Quality. Thus, all the optimizations available in the ForceWare and Catalyst drivers were disabled, except for the Catalyst A.I. option, which was left at its default value (Standard). I turned full-screen antialiasing and anisotropic filtering on from the menu of each game. If a game didn’t provide such options, I enabled FSAA and AF from the control panels of ForceWare and Catalyst. The Transparency Antialiasing (multisampling) option was turned on in ForceWare, and Adaptive Antialiasing (multisampling) was turned on in ATI Catalyst.
The graphics cards were tested at two resolutions: 1280x1024 and 1920x1200. I didn’t test them at 1680x1050 to save time.
The cards were benchmarked in the following games and applications:
- 3DMark 2006 (Direct3D 9/10) – build 1.1.0, default and extreme (1920x1200, 16x AF and 4x AA) settings
- 3DMark Vantage (Direct3D 10) – v1.0.1, Performance and Extreme profiles (basic tests only)
- Unigine Tropics Demo version 1.0 (Direct3D 10), integrated benchmark, resolution of 1280x1024 pixels without AF and FSAA and 1920x1200 with 16xAF and 4xFSAA
- World in Conflict (Direct3D 10) – version 22.214.171.124(b89), “Very High” graphics quality profile, UI texture quality = Compressed; Water reflection size = 512; DirectX 10 rendering enabled
- Enemy Territory: Quake Wars (OpenGL 2.0) – version 1.5, highest graphics quality settings, d5 demo on the Salvage level, Finland
- Call of Duty 4: Modern Warfare MP (Direct3D 9) – version 1.7.568, “Extra” quality of visuals, d3 demo on the Bog level
- Unreal Tournament 3 (Direct3D 9) – version 1.3, highest graphics quality settings (level 5), Motion Blur and Hardware Physics enabled, a flyby of the “DM-ShangriLa” map (two cycles) using HardwareOC UT3 Bench v126.96.36.199
- Devil May Cry 4 (Direct3D 10) – game version 1.0, “Super High” quality settings, the final result is the average frame rate in two subsequent runs of the second scene of the benchmark (SCENE2)
- S.T.A.L.K.E.R.: Clear Sky (Direct3D 10) – game version 1.5.05, Improved Full DX10 Lighting profile plus 16x anisotropic filtering and other maximum graphics quality settings, my own s04 demo record (a triple run of the test)
- Crysis Warhead (Direct3D 10) – game version 188.8.131.520, “Enthusiast” quality profile, the card is tested twice on the Frost level from Crysis Warhead Benchmark Tool beta 0.29
- Far Cry 2 (Direct3D 10) – version 1.00, Ultra High settings profile, a double run of the Ranch Small test from Far Cry 2 Benchmark Tool (v184.108.40.206)
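For the games tested with multiple runs (e.g. the two runs of Devil May Cry 4’s SCENE2, the triple run of the S.T.A.L.K.E.R. demo), the reported result is the average frame rate across those runs. A minimal sketch of that averaging, with purely illustrative numbers (the review itself relied on each tool’s own output):

```python
def average_fps(runs):
    """Return the mean frame rate across repeated benchmark runs."""
    if not runs:
        raise ValueError("need at least one run")
    return sum(runs) / len(runs)

# Hypothetical example: two runs of the same scene
print(round(average_fps([62.4, 63.0]), 1))  # 62.7
```

Averaging repeated runs simply smooths out run-to-run variation from disk caching and background activity.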
I added the results of three more cards to this review for the sake of comparison. The first is the HIS Radeon HD 4870 512MB at its default (750/3600MHz) and overclocked (845/3920MHz) frequencies. Comparing this card with the Palit Radeon HD 4870 1024MB at the same frequencies will show whether the Radeon HD 4870 gains anything from having twice the standard amount of graphics memory. Second, the pre-overclocked Leadtek GeForce GTX 260 will be compared with a BFG GeForce GTX 260 896MB so that we can see the difference in performance between the two versions of the same card at identical frequencies. Third, an XFX GeForce GTX 280 1024MB will help us see how far the new GeForce GTX 260 falls behind Nvidia’s flagship product. By the way, the XFX card comes with a full version of Far Cry 2 — worth mentioning, since graphics cards are usually bundled with obsolete games.