Testbed and Methods
The graphics card was benchmarked in a system case with the following configuration:
- Mainboard: DFI LANPARTY DK X48-T2RS (Intel X48, LGA775, BIOS 10.03.2008)
- CPU: Intel Core 2 Extreme QX9650 (3.0GHz, 1.25V, 2x6MB L2 cache, 4x333MHz FSB, Yorkfield, C0 revision)
- CPU cooler: Thermalright SI-128 SE (Scythe Ultra Kaze at 1320rpm)
- Thermal interface: Gelid GC1
- System memory:
- 2 x 1024MB Corsair Dominator TWIN2X2048-9136C5D DDR2 SDRAM (Specs: 1142MHz, 5-5-5-18, 2.1V)
- 2 x 1024MB CSX DIABLO CSXO-XAC-1200-2GB-KIT DDR2 SDRAM (Specs: 1200MHz, 5-5-5-16, 2.4V)
- Disk subsystem: Western Digital VelociRaptor (300GB, SATA-II, 10,000rpm, 16MB cache, NCQ)
- HDD cooling and sound insulation system: Scythe Quiet Drive for 3.5-inch HDD
- Optical drive: Samsung SH-S183L DVD-burner (SATA-II)
- System case: ASUS ASCOT 6AR2-B Black&Silver (with 120mm 960rpm Scythe Slip Stream system fans for intake and exhaust and another such fan, at 800rpm, on the side panel)
- Control and monitoring panel: Zalman ZM-MFC2
- Power supply: Thermaltake Toughpower (W0218, 1500W, 140mm fan)
- Monitor: 24-inch BenQ FP241W (1920x1200@60Hz)
To minimize the CPU’s influence on the graphics card’s performance, I overclocked the CPU to 4.00GHz at a voltage of 1.575V before the tests.
The system memory worked at a frequency of 1000MHz with 5-4-4-12 timings (Performance Level = 6) at a voltage of 2.175V.
The tests were run under Windows Vista Ultimate Edition x86 SP1 (with all the critical updates available as of November 10, 2008). I used the latest drivers available at the time of my tests:
- Intel Chipset Drivers version 18.104.22.1687
- DirectX libraries dated November 2008
For each GeForce/ForceWare driver I installed the PhysX pack that was available at the time of the driver’s release or included in the driver bundle. The drivers were tested in the order of their release. Each driver/PhysX pair was installed only after the previous pair had been uninstalled and the system had been cleaned with Driver Sweeper 1.5.5.
The drivers were set at High Quality and the Transparency Antialiasing (Multisampling) option was turned on. Vertical synchronization was forced off. Other settings were left at their defaults. I turned full-screen antialiasing and anisotropic filtering on from the menu of each game. If the game didn’t provide such options, I enabled FSAA and AF from the control panel of the GeForce driver.
The graphics cards were tested at two resolutions: 1280x1024 (or 1280x960) and widescreen 1920x1200. I used the following applications: two synthetic benchmarks, one techno-demo, and eleven games of various genres:
- 3DMark 2006 (Direct3D 9/10) – build 1.1.0, at the default and custom (1920x1200, 16x AF and 4x AA) settings
- 3DMark Vantage (Direct3D 10) – v1.0.1, Performance and Extreme profiles (basic tests only)
- Unigine Tropics Demo version 1.1 (Direct3D 10) – integrated benchmark, highest graphics quality settings, run at 1280x1024 without AF and FSAA and at 1920x1200 with 16x AF and 4x FSAA
- World in Conflict (Direct3D 10) – version 22.214.171.124(b89), “Very High” graphics quality profile, UI texture quality = Compressed; Water reflection size = 512; DirectX 10 rendering enabled
- Enemy Territory: Quake Wars (OpenGL 2.0) – version 1.5, highest graphics quality settings, d5 demo on the Salvage level, Finland
- Call of Duty 4: Modern Warfare MP (Direct3D 9) – version 1.7.568, “Extra” quality of visuals, d3 demo on the Bog level
- Unreal Tournament 3 (Direct3D 9) – version 1.3, highest graphics quality settings (level 5), Motion Blur and Hardware Physics enabled, a flyby of the “DM-ShangriLa” map (two cycles) using HardwareOC UT3 Bench v126.96.36.199
- Devil May Cry 4 (Direct3D 10) – game version 1.0, “Super High” quality settings, the final result is the average frame rate in two subsequent runs of the second scene of the benchmark
- S.T.A.L.K.E.R.: Clear Sky (Direct3D 10) – game version 1.5.07, Improved Full DX10 Lighting profile plus 16x anisotropic filtering and other maximum graphics quality settings, my own s04 demo record (a triple run of the test)
- Crysis Warhead (Direct3D 10) – game version 188.8.131.520, “Very High” quality profile, the card is tested twice on the Frost level from HardwareOC Crysis WARHEAD Bench v184.108.40.206
- Far Cry 2 (Direct3D 10) – version 1.00, Ultra High settings profile, a double run of the Ranch Small test from Far Cry 2 Benchmark Tool (v220.127.116.11)
- X3: Terran Conflict (Direct3D 10) – version 18.104.22.168, maximum quality of textures and shadows, fog enabled, the More Dynamic Light Sources and Ship Color Variations parameters are turned on, the final result is the average frame rate in one run of the four demos
- Left 4 Dead (Direct3D 9) – version 22.214.171.124, maximum quality, d3 demo (two runs) on Level 3 “No Mercy”, first scene “The Seven”
- Lost Planet: Colonies (Direct3D 10) – version 1.0, Maximum Quality settings, DX10 HDR Rendering, integrated benchmark consisting of two scenes
The last game is rather old, yet I added it to the list because the GeForce 180.48 driver claimed to improve the speed of this game by as much as 80%! I just wanted to verify that claim.
I tested the cards twice in each application (not to be confused with a double run of the demos). The final result, shown in the diagrams, is the better fps/score value of the two cycles.
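The aggregation described above can be summarized in a short sketch (this is not the author’s actual tooling, and the fps figures are hypothetical, for illustration only): within one test cycle a game’s demo may be run several times and averaged, and the reported result is the better of the two full cycles.

```python
# Sketch of the result-aggregation methodology (assumed tooling, not the
# author's own scripts). Fps numbers below are purely hypothetical.

def cycle_result(run_fps):
    """Average fps over the runs of a demo within a single test cycle."""
    return sum(run_fps) / len(run_fps)

def reported_result(cycle1_runs, cycle2_runs):
    """Best of the two full test cycles -- the value shown in the diagrams."""
    return max(cycle_result(cycle1_runs), cycle_result(cycle2_runs))

# Example: two runs of a demo per cycle, two cycles per application.
print(reported_result([41, 40], [42, 41]))  # -> 41.5
```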