
Introduction

April 4, 2004, was a remarkable day in the 3D graphics realm. Having previously lost the lead to ATI Technologies, NVIDIA Corporation announced a new graphics processor codenamed NV40. This chip made NVIDIA a technological leader again, as it was the first consumer graphics solution to feature such revolutionary technologies as next-generation pixel and vertex shaders (Shader Model 3.0), floating-point color representation and others.

As a sign of departure from the past, NVIDIA dropped the FX letters from the names of the graphics cards based on the new GPU, and the GeForce 6800 cards were truly brilliant in all the benchmarks, wresting the crown from the RADEON 9800 XT. This was not an easy victory, though. The chip came out very complex, consisting of 222 million transistors, and acceptable chip yields could only be achieved at frequencies of 350-400MHz. Besides that, the chip's high heat dissipation made a clumsy and noisy dual-slot cooling system necessary. But even with all these drawbacks the release of the GeForce 6800 Ultra was a big step forward for NVIDIA as well as for the industry at large.

Soon after that, on May 4, ATI Technologies replied with the release of the R420 processor and R420-based graphics cards. Unlike NVIDIA, ATI took an evolutionary rather than revolutionary approach: the RADEON X800 was in fact a greatly improved RADEON 9800 rather than something completely new. That approach was quite justifiable at the time: the R420 was a relatively simple chip (160 million transistors against the NV40's 222 million), and coupled with new dielectric materials this simplicity allowed ATI to raise the frequency of the new solution to 520MHz, achieving a very high level of performance.

The NV40 and R420 were in fact equals in their basic technical characteristics. Each chip had 16 pixel pipelines and 6 vertex processors, but the RADEON X800 XT was generally faster than the GeForce 6800 thanks to its higher clock rates. NVIDIA's card couldn't turn its Shader Model 3.0 support to its advantage since there were no games capable of using this feature at the time. Even the Far Cry patch that added SM 3.0 to that game didn't change anything, as the same patch also added Shader Model 2.0b, which was implemented in the competing processors from ATI.

So, NVIDIA held the crown of the king of 3D graphics, but only for a very short while. Moreover, the difficulties of producing such a complex chip as the NV40 almost immediately resulted in a shortage of GeForce 6800 Ultra cards (well, ATI's RADEON X800 XT and PRO were not abundant, either). Later on, ATI split the RADEON X800 family into two lines by releasing the high-performance R480 (RADEON X850) and the mass-market-oriented 0.11-micron R430. The maximum frequency of the R480 chip reached 540MHz, whereas the maximum clock rate of NVIDIA's NV40- and NV45-based solutions was only 425MHz (on "special edition" graphics cards from certain manufacturers). The top models of NVIDIA's graphics cards were still inferior in performance to their counterparts from the ATI camp.

The announcement of the multi-GPU SLI technology helped NVIDIA offer more performance than the ATI RADEON X850 XT Platinum Edition could deliver. Yet, a solution consisting of two GeForce 6800 Ultra/GT graphics cards turned out to be too expensive, awkward and power-hungry, and it also required a special mainboard based on the nForce4 SLI chipset. On the other hand, people who wanted the best performance money could buy didn't care about these things at all, and NVIDIA's multi-GPU technology became quite popular.

So, by the middle of 2005 ATI's trump cards were:

  • RADEON X850 XT Platinum Edition, the world’s fastest graphics card
  • The widest range of high-performance graphics processors (five models each for the AGP and PCI Express buses)

NVIDIA had a few aces, too:

  • Multi-GPU SLI technology, the planet’s fastest graphics solution
  • Formal technological superiority, since the GeForce 6 supported Shader Model 3.0, High Dynamic Range, etc.
  • The GeForce 6600 series, which was enjoying success in the mid-range sector of the market

In other words, both GPU developers offered products that were the best in some way or another, but neither could offer a chip that was both the fastest and the most feature-rich. Today, NVIDIA and ATI both need a new graphics processor that would return the crown of absolute leader to one of them. NVIDIA was the first to announce a new-generation GPU, forcing ATI Technologies to hurry up with an answer.


Discussion

Comments currently: 35
Discussion started: 06/22/05 07:40:19 PM
Latest comment: 02/02/07 02:48:19 PM
