Testbed and Methods

We decided to pit the GeForce 6600 GT against last season's stars, the GeForce FX 5950 Ultra and the RADEON 9800 XT, since they have eight pixel pipelines like the new NVIDIA GPU, run at comparable frequencies and, most importantly, sell for $200-300 today, which puts them in the same price category. Users ready to spend that sum on a graphics card will surely compare these offers against one another.

With the arrival of new solutions (RADEON X800 XT/PRO and GeForce 6800/Ultra/GT), the older 8-pipeline GPUs, once performance leaders, have moved a class down into the mainstream sector. Regrettably, this is not a quick process, and such cards still cost quite a sum, since the new-generation products remain rare guests in shops. Prices will certainly fall, though, and these ex-leaders will then have to face competition from the new breed of mainstream GPUs, the GeForce 6600 and RADEON X700. So we want to see how competitive the GeForce 6600 is against well-known combatants like the RADEON 9800 XT and the GeForce FX 5950 Ultra. The latter two GPUs have an added bonus: their 256-bit memory bus. In theory, it should give them an advantage in high resolutions and/or with full-screen antialiasing, as the rough bandwidth calculation below illustrates. We will see shortly how big this advantage really is.
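To put that 256-bit advantage into numbers, here is a minimal back-of-the-envelope sketch of peak memory bandwidth (bus width in bytes multiplied by effective memory clock). The clock figures in the comments are the nominal reference values we assume for these GPUs; actual boards may be clocked differently.

    // Rough peak-bandwidth comparison: bandwidth = (bus width / 8) * effective memory clock.
    // Clock values below are assumed nominal figures, not measurements.
    #include <cstdio>

    int main() {
        struct Card { const char* name; int busBits; double effClockMHz; };
        const Card cards[] = {
            { "GeForce 6600 GT (128-bit)",       128, 1000.0 },  // assumed 500 MHz DDR
            { "RADEON 9800 XT (256-bit)",        256,  730.0 },  // assumed 365 MHz DDR
            { "GeForce FX 5950 Ultra (256-bit)", 256,  950.0 },  // assumed 475 MHz DDR
        };
        for (const Card& c : cards) {
            double gbps = (c.busBits / 8.0) * c.effClockMHz * 1e6 / 1e9;
            std::printf("%s: %.1f GB/s\n", c.name, gbps);
        }
        return 0;
    }

Under these assumptions the 256-bit cards enjoy roughly a 1.5-2x raw bandwidth edge, which matters most when antialiasing and high resolutions make the workload bandwidth-bound.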

Besides the above-mentioned cards, we also included a GeForce 6800 GT with the PCI Express interface and a RADEON X600 XT. The latter is in fact promoted by ATI as a direct competitor to the GeForce 6600, which works at lower frequencies than the 6600 GT.

Unfortunately, we didn't have a mainboard with both AGP and PCI Express slots, so we had to run the GeForce FX 5950 Ultra and RADEON 9800 XT tests on our AMD64 platform. Their results can therefore only be compared indirectly. Our test platforms were configured as follows:

The Intel Prescott platform:

  • Intel Pentium 4 560 (Socket 775, 3.60GHz, 1MB L2 cache);
  • Intel Desktop Board D925CXC;
  • 1GB DDR2 PC2-4300 (533MHz) SDRAM (Micron Technology, 2x512MB);
  • Samsung SpinPoint SP0812C (Serial ATA-150, 8MB buffer);
  • Creative SoundBlaster Audigy 2 sound card;
  • Microsoft Windows XP Pro SP2, DirectX 9.0c;
  • ATI CATALYST 4.8, NVIDIA ForceWare 65.76.

The AMD Athlon 64 platform:

  • Athlon 64 3400+ CPU (2.20GHz, 1MB L2 cache);
  • ASUS K8V Deluxe mainboard;
  • OCZ PC-3200 Platinum EB DDR SDRAM (2x512MB, CL2.5-3-2-8);
  • Seagate 7200.7 HDD (SerialATA-150, 8MB buffer);
  • Creative SoundBlaster Audigy 2 sound card;
  • Microsoft Windows XP Pro SP2, DirectX 9.0c;
  • ATI CATALYST 4.8, NVIDIA ForceWare 61.77 (version 60.85 for 3DMark03).

We used the following benchmarks and games:

First Person 3D Shooters:

  • Call of Duty;
  • Doom III;
  • Unreal Tournament 2004;
  • Halo: Combat Evolved;
  • Far Cry;
  • Painkiller;
  • Counter-Strike: Source Beta;
  • Highly Anticipated DirectX 9 Game 1;
  • Highly Anticipated DirectX 9 Game 2.

Third Person 3D Shooters:

  • Splinter Cell: Pandora Tomorrow;
  • Prince of Persia: Sands of Time;
  • Hitman: Contracts;
  • Tomb Raider: Angel of Darkness;
  • Thief: Deadly Shadows;
  • Max Payne 2: The Fall of Max Payne.

Simulators:

  • IL-2 Sturmovik: Aces in the Sky;
  • Lock On;
  • Colin McRae Rally 04.

Strategy games:

  • Command & Conquer Generals: Zero Hour;
  • Perimeter.

Semi-synthetic Benchmarks:

  • Half-Life 2 Stress Test;
  • Aquamark3.

Synthetic Benchmarks:

  • Futuremark 3DMark03 build 340.

As usual, we selected the settings in each game that would produce the best-looking picture on the screen; the settings were identical for all the tested cards. We disabled the anisotropic and trilinear filtering optimizations in ForceWare 61.77 before running the tests. The other NVIDIA driver version, ForceWare 65.76, needs a comment.

The developer renamed the Anisotropic optimization option to Anisotropic mip filter optimization and added an Anisotropic sample optimization option. The first setting has really just changed its name; its effect remains the same as in previous versions of ForceWare: when enabled, it replaces trilinear with bilinear filtering on all texture stages except the first one, which yields a certain performance gain at the cost of a practically unnoticeable loss of image quality. The new Anisotropic sample optimization option performs a series of optimizations on texture samples, probably reducing the number of texture lookups where this can be done without compromising the image quality much, but again it doesn't touch the first texture stage. The sketch below illustrates the idea behind the mip filter optimization.
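The following is a minimal conceptual sketch of that behavior, not the driver's actual code: the chooseMipFilter helper is our own illustration of how full trilinear filtering is preserved on texture stage 0 while the secondary stages fall back to bilinear when the optimization is on.

    // Conceptual illustration (hypothetical helper, not NVIDIA driver code):
    // the mip filter optimization keeps trilinear filtering on stage 0 and
    // downgrades all further texture stages to bilinear filtering.
    #include <cstdio>

    enum class MipFilter { Bilinear, Trilinear };

    MipFilter chooseMipFilter(int textureStage, bool optimizationEnabled) {
        if (optimizationEnabled && textureStage > 0)
            return MipFilter::Bilinear;   // secondary stages lose trilinear blending
        return MipFilter::Trilinear;      // the first stage always keeps full trilinear
    }

    int main() {
        for (int stage = 0; stage < 4; ++stage) {
            MipFilter f = chooseMipFilter(stage, /*optimizationEnabled=*/true);
            std::printf("stage %d: %s\n", stage,
                        f == MipFilter::Trilinear ? "trilinear" : "bilinear");
        }
        return 0;
    }

Since secondary texture stages usually carry detail or light maps rather than base textures, dropping the trilinear blend there is what keeps the visual penalty hard to notice.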

We are glad to see that control over all these optimizations is in the hands of the user. NVIDIA plays fair and trusts the user; we think the user community will value this.

 