
At the end of last year we tested AMD’s Radeon HD 6850 and HD 6870 graphics cards in CrossFireX mode and were impressed with the performance gains of such configurations over the respective single cards, which reached up to 100%. AMD did in fact make it clear that the multi-GPU technology in the new Barts and Cayman processors had been overhauled not only on the software but also on the hardware level, so CrossFireX configurations could indeed be expected to work more effectively across more games and 3D benchmarks. We confirmed this with the above-mentioned cards, but what about AMD’s top-of-the-line single-chip product, the Radeon HD 6970? Will it be as effective in a CrossFireX configuration, and how hot will two such graphics cards get when installed next to each other in a system case? We will try to answer these questions in this review. To make our testing less narrowly focused and give it a competitive streak, we will also add the results of the Nvidia GeForce GTX 570 in single-card and SLI modes. Thus, we will see how the multi-GPU technologies from Nvidia and AMD compare in terms of efficiency.

Testbed and Methods

All graphics cards were benchmarked in a closed system case with the following configuration:

In order to reduce the dependence of the graphics cards’ performance on the overall platform speed, we overclocked our 32 nm six-core CPU to 4.5 GHz with the multiplier set at 24x and “Load-Line Calibration” (Level 2) enabled. The processor Vcore was increased to 1.475 V in the mainboard BIOS.

The 6 GB of system DDR3 memory worked at a frequency of 1.5 GHz with 7-7-7-16_1T timings at 1.64 V. Turbo Boost and Hyper-Threading technologies were disabled during the test session.
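As a quick sanity check of these numbers, here is a minimal arithmetic sketch. The 187.5 MHz base clock and the x8 memory ratio are not quoted in the text and are merely inferred from the 4.5 GHz / 24x and 1.5 GHz figures above.

```python
# Back-of-the-envelope check of the overclocked platform clocks described above.
# The base clock and the DRAM ratio are inferred values, not figures quoted in the article.

cpu_multiplier = 24              # CPU multiplier set in the mainboard BIOS
cpu_target_mhz = 4500            # 4.5 GHz CPU overclock

base_clock_mhz = cpu_target_mhz / cpu_multiplier        # 4500 / 24 = 187.5 MHz
memory_ratio = 8                                        # assumed DRAM : BCLK ratio
memory_mhz = base_clock_mhz * memory_ratio              # 187.5 * 8 = 1500 MHz (DDR3-1500)

print(f"Implied base clock:       {base_clock_mhz:.1f} MHz")
print(f"Implied memory frequency: {memory_mhz:.0f} MHz at 7-7-7-16_1T and 1.64 V")
```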

The AMD Radeon HD 6970 graphics cards are reference samples and work at their nominal clock frequencies. Both Nvidia GeForce GTX 570 cards also follow the reference design, but one of them came factory overclocked, so we brought its frequencies back to the nominal level before starting the tests.

The test session started on February 19, 2011. All tests were performed in Microsoft Windows 7 Ultimate x64 with all critical updates as of that date and the following drivers:

The graphics cards were tested in games at the two currently most popular resolutions: 1920x1080 and 2560x1600. The tests were performed in two image quality modes: “High Quality + AF16x” – maximum texturing quality with 16x anisotropic filtering enabled, and “High Quality + AF16x + AA4(8)x” – the same with 4x full-screen anti-aliasing (MSAA) added, or 8x if the average framerate was high enough for a comfortable gaming experience. We enabled anisotropic filtering and full-screen anti-aliasing from the game settings or configuration files. If the corresponding options were missing, we changed these settings in the Control Panel of the Catalyst and GeForce/ION drivers. Vertical sync was always off in the driver control panels.
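For clarity, the sketch below simply enumerates the test matrix described above. It is an illustration rather than our actual test harness, and the 60 fps threshold for a “comfortable” framerate is our own assumption.

```python
# Hypothetical enumeration of the test matrix described above: two resolutions
# crossed with two image quality modes. MSAA is raised from 4x to 8x only when
# the average framerate is high enough; the exact threshold is an assumption.

RESOLUTIONS = ["1920x1080", "2560x1600"]
QUALITY_MODES = ["High Quality + AF16x", "High Quality + AF16x + AA4(8)x"]
COMFORTABLE_FPS = 60  # assumed definition of a "comfortable gaming experience"

def pick_msaa(average_fps):
    """Return the MSAA level used in the second quality mode."""
    return 8 if average_fps >= COMFORTABLE_FPS else 4

for resolution in RESOLUTIONS:
    for mode in QUALITY_MODES:
        print(f"{resolution:>9}  |  {mode}")

print("MSAA at 75 fps average:", pick_msaa(75.0), "x")
```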

The list of benchmarking games and applications has been updated only minimally since our last review. Besides the usual patch updates for the synthetic benchmarks, we recorded a new test scene (d60) for Left 4 Dead 2 and switched from the previously used test scene A to test scene B. Moreover, we have temporarily added a beta version of the new Crysis 2, which will be replaced with the full game as soon as it becomes available. Unfortunately, Crysis 2 only supports DirectX 9 for now.

So, the complete list of test applications includes two popular semi-synthetic benchmarking suites, one technical demo and 18 games of various genres. Here is the complete list of the tests used together with their settings (all games are listed in their release order; a short sketch of how such a suite could be laid out for scripted runs is given right after the list):

  • 3DMark Vantage (DirectX 10) – v1.0.2.1, Performance and Extreme profiles (basic tests only);
  • 3DMark 11 (DirectX 11) – version 1.0.0.1, Performance and Extreme profiles;
  • Unigine Heaven Demo (DirectX 11) – version 2.1, maximum graphics quality settings, tessellation at “extreme”, AF16x, 1280x1024 resolution without AA and 1920x1080 resolution with AA 4x;
  • Crysis (DirectX 10) – game version 1.2.1, “Very High” settings profile, two runs of “Assault harbor” test from Crysis Benchmark Tool version 1.0.0.5;
  • Far Cry 2 (DirectX 10) – version 1.03, “Ultra High” settings profile, two runs of the Ranch Small test from Far Cry 2 Benchmark Tool (v1.0.0.1);
  • BattleForge: Lost Souls (DirectX 11) – version 1.2 (02.10.2011), maximum image quality settings, shadows enabled, SSAO technology enabled, two runs of the built-in benchmark;
  • Resident Evil 5 (DirectX 10.1) – version 1.2, variable benchmark with maximum graphics quality settings without motion blur, we took AVG values from the third scene for further analysis, because it was the most resource-hungry;
  • S.T.A.L.K.E.R.: Call of Pripyat (DirectX 11) – version 1.6.02, Enhanced Dynamic DX11 Lighting profile with all parameters manually set at their maximums, we used our custom cop03 demo on the Backwater map;
  • Borderlands: The Secret Armory of General Knoxx (DLC) (DirectX 9) – version 1.4.1, “timedemo1_p” demo with maximum image quality settings;
  • Grand Theft Auto IV - Episodes From Liberty City (DirectX 9) – version 1.1.2.0, the test from “The Ballad of Gay Tony” scene, “Very High” image quality settings, “View Distance” = 23%;
  • Left 4 Dead 2: The Sacrifice (DirectX 9) – version 2.0.6.0, maximum graphics quality settings, d60 demo (two runs) on “1. Docks” map of the “Sacrifice” level;
  • Metro 2033: The Last Refuge (DirectX 10/11) - version 1.2, official benchmark, “High” image quality settings; tessellation, DOF and MSAA4x disabled; analytical anti-aliasing (AAA) enabled, two consecutive runs of the “Frontline” scene;
  • Just Cause 2 (DirectX 11) - version 1.0.0.2, maximum quality settings, Background Blur and GPU Water Simulation enabled, two consecutive runs of the “Dark Tower” demo;
  • Aliens vs. Predator (2010) (DirectX 11) – Texture Quality “Very High”, Shadow Quality “High”, SSAO On, two test runs in each resolution;
  • Lost Planet 2 (DirectX 11) – version 1.0, maximum graphics quality settings, motion blur enabled, performance test “B” (average in all three scenes);
  • StarCraft 2: Wings of Liberty (DirectX 11) – version 1.0, all image quality settings at “Ultra”, Physics “Ultra”, reflections On, two 2-minute runs of our own jt1 demo;
  • Mafia 2 (DirectX 11) – version 1.0.0.3, maximum graphics quality settings, two runs of the built-in benchmark;
  • Sid Meier’s Civilization V (DirectX 11) – version 1.0.1.135, maximum graphics quality settings, two runs of the “diplomatic” benchmark including the five heaviest scenes;
  • F1 2010 (DirectX 11) – version 1.01, built-in benchmark at Ultra quality settings including one lap on the “Silverstone” track;
  • Tom Clancy's H.A.W.X. 2 (DirectX 11) – version 1.04, maximum graphics quality settings, shadows On, tessellation Off (not available on Radeon), two runs of the test scene;
  • Crysis 2 (DirectX 9) – beta version, “Hardcore” graphics quality profile, two runs of the demo in the “Central Park” scene.
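Below is the minimal sketch mentioned above of how a few entries from this list could be encoded for scripted runs. The structure and field names are our own illustration rather than any actual tool; the version numbers and settings are taken from the list.

```python
# Illustrative encoding of a few entries from the benchmark list above.
# The structure and field names are hypothetical; version numbers and settings
# are copied from the list, and "runs" is given only where the list states it.

BENCHMARKS = [
    {"name": "3DMark Vantage", "api": "DirectX 10", "version": "1.0.2.1",
     "settings": "Performance and Extreme profiles (basic tests only)"},
    {"name": "Far Cry 2", "api": "DirectX 10", "version": "1.03",
     "settings": "Ultra High profile, Ranch Small test", "runs": 2},
    {"name": "Left 4 Dead 2: The Sacrifice", "api": "DirectX 9", "version": "2.0.6.0",
     "settings": "maximum quality, d60 demo on the Docks map", "runs": 2},
]

for bench in BENCHMARKS:
    runs = bench.get("runs", "n/a")
    print(f'{bench["name"]} ({bench["api"]}, v{bench["version"]}): {runs} run(s)')
```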

If a game allowed recording the minimum fps, these readings were also added to the charts. We ran each game test or benchmark twice and took the best result for the diagrams, but only if the difference between the two runs didn’t exceed 1%. If it did, we ran the test at least one more time to achieve repeatable results.
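This run-twice rule is easy to express as a small piece of pseudocode. The sketch below is only an illustration of the policy, with `run_benchmark` standing in for any of the tests listed above rather than a real tool.

```python
# Sketch of the repeatability policy described above: run each test twice, keep
# the best result if the two latest runs agree within 1%, otherwise run again.
# `run_benchmark` is a hypothetical callable that returns an average fps value.

def measure(run_benchmark, tolerance=0.01, max_extra_runs=5):
    results = [run_benchmark(), run_benchmark()]
    # Keep rerunning until the two most recent results agree within the tolerance.
    while abs(results[-1] - results[-2]) / max(results[-1], results[-2]) > tolerance:
        if len(results) - 2 >= max_extra_runs:
            break  # give up after a reasonable number of extra runs
        results.append(run_benchmark())
    return max(results)  # the best result is the one that goes on the diagram

# Example with a dummy benchmark that always reports the same framerate:
print(measure(lambda: 87.3), "fps")
```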

 