Pages: [ 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 ]

Testbed Configuration and Testing Methodology

All participating graphics cards were tested in a system with the following configuration:

We’re already preparing two large reviews of off-the-shelf Radeon HD 7790 and GeForce GTX 650 Ti Boost cards, so today we’ll limit ourselves to their photos and specifications.

This ASUS card is pre-overclocked from 1000/6000 to 1075/6400 MHz, but we dropped its frequencies to the reference level for this test session. In addition, we benchmarked our ASUS Radeon HD 7790 at the highest frequencies we could achieve, namely 1220/6640 MHz.
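For reference, the overclocking headroom implied by the frequencies above can be expressed as a percentage of the reference clocks; this is a small illustrative sketch (the helper name is ours, not from the review):

```python
def overclock_gain(reference_mhz: int, overclocked_mhz: int) -> float:
    """Return the frequency increase as a percentage of the reference clock."""
    return (overclocked_mhz - reference_mhz) / reference_mhz * 100

# Reference 1000/6000 MHz vs. the maximum stable 1220/6640 MHz quoted above.
gpu_gain = overclock_gain(1000, 1220)   # GPU core clock
mem_gain = overclock_gain(6000, 6640)   # effective memory clock

print(f"GPU: +{gpu_gain:.1f}%, memory: +{mem_gain:.1f}%")
# GPU: +22.0%, memory: +10.7%
```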

 

The reference Nvidia GeForce GTX 650 Ti Boost worked at the standard clock rates of 980/6008 MHz, so we didn’t do anything about its frequencies.

The overclocked card was stable at a GPU frequency of 1140 MHz (1193 in boost mode) and a memory frequency of 6648 MHz.

 

Besides competing with each other, the Radeon HD 7790 and GeForce GTX 650 Ti Boost will be compared with products of similar pricing. The Radeon HD 7850 is represented by a product from HIS whereas the Radeon HD 7770 GHz Edition is a reference card from AMD.

 

 

The screenshot of the HIS Radeon HD 7850 shows an increased GPU frequency but we dropped it to the standard level (840 MHz) for our testing.

The GeForce GTX 650 Ti Boost is bracketed in Nvidia’s hierarchy by the GeForce GTX 660, represented in this review by an ASUS product, and by the ordinary GeForce GTX 650 Ti which is represented by a Gigabyte card.

 

 

As you can see, we tested them at the standard clock rates.

In order to lower the dependence of the graphics cards’ performance on the overall platform speed, we overclocked our 32 nm six-core CPU to 4.8 GHz by setting its multiplier at 48x and its BCLK frequency at 100 MHz, with “Load-Line Calibration” enabled. The processor’s Vcore was increased to 1.385 V in the mainboard BIOS:
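As a quick sanity check on the figures above, the final CPU frequency is simply the multiplier times the base clock (the function name is ours, for illustration only):

```python
def cpu_clock_mhz(multiplier: int, bclk_mhz: float) -> float:
    """Final CPU frequency in MHz: core multiplier times base clock (BCLK)."""
    return multiplier * bclk_mhz

# A 48x multiplier at a 100 MHz BCLK gives the 4.8 GHz quoted above.
print(cpu_clock_mhz(48, 100) / 1000)  # 4.8 (GHz)
```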

Hyper-Threading technology was enabled. The 16 GB of system DDR3 memory worked at a frequency of 2133 MHz with 9-11-10-28 timings at 1.65 V.

The test session started on March 31, 2013. All tests were performed in Microsoft Windows 7 Ultimate x64 SP1 with all critical updates as of that date and the following drivers:

Since we are discussing lower-performance graphics cards, we only tested them in one resolution: 1920x1080. Lower resolutions make less sense to look at because Full HD monitors are getting more and more affordable. The tests were performed in two image quality modes: “Quality + AF16x” (the default texturing quality in the drivers with 16x anisotropic filtering enabled) and “Quality + AF16x + MSAA 4x” (16x anisotropic filtering plus full-screen 4x antialiasing). We enabled anisotropic filtering and full-screen antialiasing from the game settings; if the corresponding options were missing, we changed these settings in the control panels of the Catalyst and GeForce drivers. We also disabled Vsync there. There were no other changes in the driver settings.

As usual, we expanded and updated our benchmarking suite. We removed the Unigine Heaven benchmark, which practically duplicates the newer Valley test. Besides the previously added Resident Evil 6, Crysis 3 and Tomb Raider (2013), we also included such new gaming titles as StarCraft II: Heart of the Swarm and BioShock Infinite. Of course, we installed all the latest game updates available at the time of testing. As of today, our benchmarking suite includes two popular semi-synthetic benchmarks, one demo and 15 resource-consuming games of various genres:

  • 3DMark 2011 (DirectX 11) – version 1.0.3.0, Performance and Extreme profiles;
  • 3DMark 2013 (DirectX 9/11) – version 1.0, benchmarks in “Cloud Gate”, “Fire Strike” and “Fire Strike Extreme” scenes;
  • Unigine Valley Bench (DirectX 11) – version 1.0, maximum image quality settings, AF16x and/or MSAA 4x, 1920x1080 resolution;
  • Resident Evil 6 Bench (DirectX 9) – version 1.0, all settings adjusted for maximum quality, FXAA3HQ antialiasing, Blur enabled, 1920x1080 and 2560x1440 resolutions;
  • S.T.A.L.K.E.R.: Call of Pripyat (DirectX 11) – version 1.6.02, Enhanced Dynamic DX11 Lighting profile with all parameters manually set at their maximums, we used our custom cop03 demo on the Backwater map;
  • Metro 2033: The Last Refuge (DirectX 10/11) – version 1.2, maximum graphics quality settings, official benchmark, “High” image quality settings; tessellation, DOF and MSAA4x disabled; AAA antialiasing enabled, two consecutive runs of the “Frontline” scene;
  • Aliens vs. Predator (2010) (DirectX 11) – Texture Quality “Very High”, Shadow Quality “High”, SSAO On, two test runs in each resolution;
  • Total War: Shogun 2: Fall of the Samurai (DirectX 11) – version 1.1.0, built-in benchmark (Sekigahara battle) at maximum graphics quality settings and enabled MSAA 8x in one of the test modes;
  • Crysis 2 (DirectX 11) – version 1.9, we used Adrenaline Crysis 2 Benchmark Tool v.1.0.1.14. BETA with “Ultra High” graphics quality profile and activated HD textures, two runs of a demo recorded on “Times Square” level;
  • Battlefield 3 (DirectX 11) – version 1.4, all image quality settings set to “Ultra”, two successive runs of a scripted scene from the beginning of the “Going Hunting” mission 110 seconds long;
  • Sniper Elite V2 Benchmark (DirectX 11) – version 1.05, we used Adrenaline Sniper Elite V2 Benchmark Tool v1.0.0.2 BETA with maximum graphics quality settings (“Ultra” profile), Advanced Shadows: HIGH, Ambient Occlusion: ON, Stereo 3D: OFF, two sequential test runs;
  • Sleeping Dogs (DirectX 11) – version 1.5, we used Adrenaline Sleeping Dogs Benchmark Tool v1.0.0.3 BETA with maximum image quality settings, Hi-Res Textures pack installed, FPS Limiter and V-Sync disabled, two consecutive runs of the built-in benchmark with quality antialiasing at Normal and Extreme levels;
  • F1 2012 (DirectX 11) – update 10, we used Adrenaline Racing Benchmark Tool v1.0.0.13 with “Ultra” image quality settings during two laps on Brazilian “Interlagos” race track with 24 other cars and a drizzling rain; we also used “Bonnet” camera mode;
  • Borderlands 2 (DirectX 9) – version 1.3.1, built-in benchmark with maximum image quality settings and maximum PhysX level, FXAA enabled;
  • Hitman: Absolution (DirectX 11) – version 1.0.446.0, built-in test with Ultra image quality settings, with enabled tessellation, FXAA and global lighting;
  • Crysis 3 (DirectX 11) – version 1.0.1.3, all graphics quality settings at maximum, Motion Blur amount – Medium, lens flares – on, FXAA and MSAA4x modes enabled, two consecutive runs of a scripted scene from the beginning of the “Swamp” mission 110 seconds long;
  • Tomb Raider (2013) (DirectX 11) – version 1.1.732.1, all image quality settings set to “Ultra”, V-Sync disabled, FXAA and 2x SSAA antialiasing enabled, TessFX technology activated, two consecutive runs of the benchmark built into the game;
  • StarCraft II: Heart of the Swarm (DirectX 10) – version 2.0.6.25180, texture quality at “Ultra”, all other image quality settings at maximum, two consecutive runs of a pre-recorded demo with a mass battle 110 seconds long;
  • BioShock Infinite (DirectX 11) – version 1.1.21.26939, we used Adrenaline Action Benchmark Tool v1.0.2.1, two consecutive runs of the built-in benchmark with “Ultra” and “Ultra+DOF” quality settings.

If the game allowed recording the minimum fps, we also added it to the charts. We ran each game test or benchmark twice and took the best result for the diagrams, but only if the difference between the two runs didn’t exceed 1%. If it did, we reran the test at least one more time to achieve repeatable results.
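The repeatability rule described above can be sketched as a small loop; `run_benchmark` here is a hypothetical stand-in for launching the actual game test, and the names are ours rather than any tool the review used:

```python
def repeatable_best(run_benchmark, tolerance=0.01, max_extra_runs=5):
    """Run the test twice and take the best result, but only if the two
    runs agree within `tolerance` (1%); otherwise keep re-running until
    two consecutive results are repeatable."""
    results = [run_benchmark(), run_benchmark()]
    for _ in range(max_extra_runs):
        best, worst = max(results[-2:]), min(results[-2:])
        if (best - worst) / best <= tolerance:
            return best  # the two latest runs agree: keep the better one
        results.append(run_benchmark())
    return max(results)  # fall back to the best observed result

# Example: the first pair of runs differs by 10%, so extra runs are made
# until two consecutive results land within 1% of each other.
fps = iter([100.0, 90.0, 99.8, 100.0])
print(repeatable_best(lambda: next(fps)))  # 100.0
```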

 