
Testbed and Methods

The graphics card was benchmarked in a system case with the following configuration:

  • Mainboard: DFI LANPARTY DK X48-T2RS (Intel X48, LGA775, BIOS 10.03.2008)
  • CPU: Intel Core 2 Extreme QX9650 (3.0GHz, 1.25V, 2x6MB L2 cache, 4x333MHz FSB, Yorkfield, C0 revision)
  • CPU cooler: Thermalright SI-128 SE (Enermax Magma fan at 1260rpm)
  • Thermal interface: Gelid GC1
  • System memory:
    • 2 x 1024MB Corsair Dominator TWIN2X2048-9136C5D DDR2 SDRAM (Specs: 1142MHz, 5-5-5-18, 2.1V)
    • 2 x 1024MB CSX DIABLO CSXO-XAC-1200-2GB-KIT DDR2 SDRAM (Specs: 1200MHz, 5-5-5-16, 2.4V)
  • Graphics card: HIS Radeon HD 4870
  • Disk subsystem: Western Digital VelociRaptor (300GB, SATA-II, 10,000rpm, 16MB cache, NCQ)
  • HDD cooling and sound insulation system: Scythe Quiet Drive for 3.5-inch HDD
  • Optical drive: Samsung SH-S183L DVD-burner (SATA-II)
  • System case: ASUS ASCOT 6AR2-B Black&Silver (with 120mm 960rpm Scythe Slip Stream 120 system fans for intake and a 960rpm Enermax Magma on the side panel)
  • Control and monitoring panel: Zalman ZM-MFC2
  • Power supply: Thermaltake Toughpower (W0218, 1500W, 140mm fan)
  • Monitor: 24-inch BenQ FP241W (1920x1200@60Hz)

To minimize the CPU’s influence on the graphics card’s performance, I overclocked the quad-core CPU to 4.00GHz at 1.575V before the tests.

The system memory worked at a frequency of 1000MHz with 5-4-4-12 timings (Performance Level = 6) and 2.175V voltage.

The tests were run under Windows Vista Ultimate Edition x86 SP1 (with all critical updates available as of December 19, 2008). I used the latest drivers available at the time of testing:

  • Intel Chipset Drivers version 9.1.1.1004 Alpha
  • DirectX libraries dated November 2008

The drivers were tested in the order of their release: each new driver was installed only after the previous one had been uninstalled and the system had been cleaned with Driver Sweeper 1.5.5. The following changes were made in Catalyst Control Center: the graphics quality level was raised from Quality to High Quality, the Adaptive Antialiasing option was set to Quality, and Vertical Synchronization was Always Off. Other settings were left at their defaults. I turned full-screen antialiasing and anisotropic filtering on from within each game’s menu. If the game didn’t provide such options, I enabled FSAA and AF from the driver’s Control Panel.

The graphics cards were tested at two resolutions, 1280x1024 and 1920x1200, both with and without 16x anisotropic filtering and 4x or 8x full-screen antialiasing, in the following games and applications:

  • 3DMark 2006 (Direct3D 9/10) – build 1.1.0, at the default and custom (1920x1200, 16x AF and 4x AA) settings
  • 3DMark Vantage (Direct3D 10) – v1.0.1, Performance and Extreme profiles (basic tests only)
  • Unigine Tropics Demo version 1.1 (Direct3D 10), integrated benchmark, highest graphics quality settings, resolution of 1280x1024 pixels without AF and FSAA and 1920x1200 with 16x AF and 4x FSAA
  • World in Conflict (Direct3D 10) – version 1.0.0.9(b89), “Very High” graphics quality profile, UI texture quality = Compressed; Water reflection size = 512
  • Enemy Territory: Quake Wars (OpenGL 2.0) – version 1.5, highest graphics quality settings, d5 demo on the Salvage level, Finland
  • Call of Duty 4: Modern Warfare MP (Direct3D 9) – version 1.7.568, “Extra” quality of visuals, d3 demo on the Bog level
  • Unreal Tournament 3 (Direct3D 9) – version 1.3, highest graphics quality settings (level 5), Motion Blur and Hardware Physics enabled, a flyby of the “DM-ShangriLa” map (two cycles) using HardwareOC UT3 Bench v1.3.0.0
  • Devil May Cry 4 (Direct3D 10) – game version 1.0, “Super High” quality settings, the final result is the average frame rate in two subsequent runs of the second scene of the benchmark
  • S.T.A.L.K.E.R.: Clear Sky (Direct3D 10) – game version 1.5.07, Improved Full DX10 Lighting profile plus 16x anisotropic filtering and other maximum graphics quality settings, my own s04 demo record (a triple run of the test)
  • Crysis Warhead (Direct3D 10) – game version 1.1.1.690, “Very High” quality profile, the card is tested twice on the Frost level from HardwareOC Crysis WARHEAD Bench v1.1.1.0
  • Far Cry 2 (Direct3D 10) – version 1.02, Ultra High settings profile, a double run of the Ranch Small test from Far Cry 2 Benchmark Tool (v1.0.0.1)
  • X3: Terran Conflict (Direct3D 10) – version 1.2.0.0, maximum quality of textures and shadows, fog enabled, the More Dynamic Light Sources and Ship Color Variations parameters are turned on, the final result is the average frame rate in one run of the four demos
  • Left 4 Dead (Direct3D 9) – version 1.0.0.5, maximum quality, a double run of the d3 demo (No Mercy level, The Seven scene)
  • Lost Planet: Colonies (Direct3D 10) – version 1.0, Maximum Quality settings, DX10 HDR Rendering, integrated benchmark consisting of two scenes
  • Grand Theft Auto 4 (Direct3D 9) – version 1.0.1.0, High texture quality, Highest rendering quality, double run of the integrated benchmark

I tested the cards twice in each application (not to be confused with a double run of a demo within one test). The final result, shown in the diagrams, is the best fps/score value out of the two cycles. The minimum frame rate is also shown whenever possible.
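The scoring rule above can be sketched in a few lines of code. This is a hypothetical illustration, not the author's actual tooling: the function name and the per-cycle fps lists are assumptions made for the example.

```python
# Hypothetical sketch of the best-of-two-cycles scoring described above:
# each application is run in two full cycles, the cycle with the higher
# average fps is reported, along with that cycle's minimum frame rate.

def score_benchmark(cycle1_fps, cycle2_fps):
    """Pick the better of two benchmark cycles.

    Each argument is a list of fps samples from one complete cycle.
    Returns (average_fps, minimum_fps) of the cycle with the higher average.
    """
    avg1 = sum(cycle1_fps) / len(cycle1_fps)
    avg2 = sum(cycle2_fps) / len(cycle2_fps)
    best_cycle = cycle1_fps if avg1 >= avg2 else cycle2_fps
    return max(avg1, avg2), min(best_cycle)

# Example: the second cycle is often slightly faster once caches warm up.
avg_fps, min_fps = score_benchmark([55, 60, 48], [58, 62, 50])
```

The minimum of the winning cycle is what would feed the "bottom speed" bars in the diagrams.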
