
The Leadtek WinFast A400 TDH features the NVIDIA GeForce 6800 GPU that works at a frequency of 325MHz.

This graphics card is equipped with 128MB of ordinary DDR SDRAM in 2.8ns chips from Hynix that work at 700MHz.

The NVIDIA GeForce 6800 is a simpler chip than the GeForce 6800 Ultra/6800GT – some of its pixel pipelines are turned off. The reduced power consumption and heat dissipation of the GeForce 6800 gave me some hope that this processor would overclock well. Contrary to my expectations, this sample was rather unwilling to increase its core frequency: the maximum frequencies at which the card remained stable were 390/950MHz. That is, the core frequency grew by exactly 20%, while the memory frequency grew more, by 36%. To install a water-cooling system without leaving the memory chips bare, I simply took the memory heatsink from the GeForce 6800 GT reference card and put it on Leadtek’s GeForce 6800: the two cards have different wiring, but the mounting holes for the cooling system are identical.

After installing the water-cooling system, the card’s maximum stable frequencies were 400/975MHz (+23% on the GPU and +39% on the memory).
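The percentages above follow directly from the stock and maximum clock rates. A quick check, using a small helper function of my own (not part of any overclocking tool):

```python
def overclock_gain(base_mhz, max_mhz):
    """Return the frequency increase as a percentage of the base clock."""
    return (max_mhz - base_mhz) / base_mhz * 100

# Stock clocks of the Leadtek WinFast A400 TDH: 325MHz core, 700MHz memory.
# With the stock cooler the card reached 390/950MHz:
print(round(overclock_gain(325, 390)))  # core: 20
print(round(overclock_gain(700, 950)))  # memory: 36
# With water cooling it reached 400/975MHz:
print(round(overclock_gain(325, 400)))  # core: 23
print(round(overclock_gain(700, 975)))  # memory: 39
```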

Before getting to the tests, I can’t help sharing my thoughts on a question that worries every serious overclocker with respect to the GeForce 6800: is it possible to turn on all 16 pipelines of the NVIDIA GeForce 6800?

Yes, it is! This is easy: just flash the BIOS from the GeForce 6800 Ultra. After the BIOS re-flash, the graphics processor enables all 16 pixel pipelines and the card becomes… NON-OPERATIONAL!

Why? Because those four missing pipelines weren’t disabled for nothing – they have certain defects, which show up as artifacts in 3D applications once you enable them. And you can’t easily return things to normal: after flashing back the BIOS from the 12-pipeline GeForce 6800 you’ll find that the situation remains the same – the card still has 16 pipelines along with those 3D artifacts. To disable the defective pipelines and restore the card’s operability you will need special software and a specially modified version of the BIOS.

Thus, although it is possible to transform a GeForce 6800 into a GeForce 6800 Ultra/GT by turning on all 16 pipelines, there’s no practical gain from that so far: the disabled pipelines are 100% sure to have defects, and turning them on results in various visual artifacts.

There’s no sense in hoping that some of NVIDIA’s GeForce 6800 chips have 16 “healthy” pipelines, disabled for no particular reason – NVIDIA is not in a position to make such gifts today. The chips are tested at the factory and sorted immediately. Fully operational chips become the GeForce 6800 Ultra/GT, while those with certain defects – defects that can be avoided by disabling a few pipelines – become the 12-pipeline GeForce 6800.

It is of course possible that the NV40 manufacturing process will improve after a while. In that case, there won’t be enough defective chips to produce the GeForce 6800 in mass quantities, and NVIDIA will have to turn pipelines off in fully operational chips. By then, however, NVIDIA will certainly have introduced various protection techniques, and any kind of conversion of the GeForce 6800 will be much harder to pull off.

Testbed and Methods

The testbed was configured as follows:

  • AMD Athlon 64 3400+ CPU;
  • ASUS K8V-SE mainboard;
  • 2x512MB TwinMOS PC3200 DDR SDRAM, CL 2.5.

Software:

  • Windows XP Pro SP1;
  • DirectX 9.0b;
  • Detonator 61.34 for graphics cards on NVIDIA’s GPUs;
  • Catalyst 4.5 for graphics cards on ATI’s GPUs.

ATI’s Catalyst driver enables optimizations of tri-linear and anisotropic filtering even at the maximum quality settings, and you cannot turn them off from the driver’s control panel. So, to create similar environments for all the test participants, I selected the maximum quality mode for the graphics cards on ATI’s chips and the “Quality” mode with enabled tri-linear and anisotropic filtering optimizations for cards on NVIDIA’s GPUs.

The internal settings of each game were set for the maximum possible image quality. There are two test modes: “pure speed” and “eye candy”; the latter uses 4x full-screen anti-aliasing and 16x anisotropic filtering (8x anisotropy for the NVIDIA GeForce FX 5950 Ultra).
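The two test modes can be summarized as a small configuration table. The sketch below is purely illustrative – the field names and the helper function are my own, not tied to any driver API or benchmarking tool:

```python
# Illustrative summary of the two benchmark modes used in this review.
TEST_MODES = {
    "pure speed": {"fsaa": None, "anisotropy": None},
    "eye candy": {"fsaa": "4x", "anisotropy": "16x"},
}

def anisotropy_for(card, mode):
    """The GeForce FX 5950 Ultra is limited to 8x anisotropic filtering."""
    level = TEST_MODES[mode]["anisotropy"]
    if level == "16x" and card == "GeForce FX 5950 Ultra":
        return "8x"
    return level

print(anisotropy_for("GeForce 6800", "eye candy"))           # 16x
print(anisotropy_for("GeForce FX 5950 Ultra", "eye candy"))  # 8x
```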

