
The launch of Nvidia’s GeForce 200 architecture was far from successful. Having focused on developing the world’s most complex and highest-performing monolithic graphics core, Nvidia found itself outmaneuvered when ATI responded to the G200 with its simpler and much cheaper RV770 processor. G200-based solutions, already expensive to produce because of the chip’s complexity, became outright unprofitable when Nvidia had to cut their prices to keep them competitive. Even that price cut did little for the appeal of those products: a chip of about 1.4 billion transistors manufactured on a 65nm process simply could not run at high frequencies. The senior model of the new series barely reached GPU clock rates of 600/1300MHz, while the junior model was clocked even lower and could match the Radeon HD 4870 in neither performance nor price.

But as time went by, Nvidia steadily corrected its errors. First, the company increased the performance of the junior GeForce GTX 260 by unlocking some of the GPU’s functional subunits; the card gained the “Core 216” suffix and became competitive with the ATI Radeon HD 4870. Later, this very model was the first to use the new 55nm revision of the G200 core together with a greatly simplified and cheaper-to-make PCB design, which was vital for G200-based solutions under heavy pricing pressure from ATI. The GeForce GTX 285 was simplified in the same way. Thanks to the improved process technology, the G200b chip offered higher overclocking potential, making the successor to the GeForce GTX 280 competitive with the ATI Radeon HD 4850 X2 in many applications. Besides, the 55nm G200 helped Nvidia strike back in the premium dual-processor segment: announced on January 8, 2009, the GeForce GTX 295 did indeed dethrone the ATI Radeon HD 4870 X2.

However, Nvidia’s position in the less expensive segments is not as strong. The monstrous G200, even in its new 55nm incarnation, is unsuitable for such products due to its complexity and high manufacturing cost, and the company does not yet have a new mainstream core. All it has is the 55nm version of the G92 chip, which has therefore been chosen for the new mainstream graphics card that extends the GeForce 200 series downwards. The new product is not just a renamed GeForce 9800 GTX+, though, as we will explain below.

First of all, we should remind you that the G200 chip itself is not a truly innovative design: it differs from the G92 only in the number of functional subunits and in its ALU design (see our review for details). The G92, in turn, traces its origins back to the G80, announced in November 2006; the key difference between those two is the architecture of their texture processors.

Thus, Nvidia’s current graphics architecture is old by the standards of the IT industry. This can be viewed either as a lack of progress or as a sign of the architecture’s maturity. In fact, G92-based solutions could be counted among the GeForce 200 family without much of a stretch, since there are no fundamental differences between the G92 and the G200. What Nvidia’s development department has been busy with all this time is beyond the scope of this review, but we do know a few things about the next generation of Nvidia’s graphics cores.

So, based on the 55nm G92b core, the GeForce 9800 GTX+ proved a worthy rival to the highly popular Radeon HD 4850, but there were pitfalls beyond the confusion in Nvidia’s product nomenclature. The card inherited its PCB from the GeForce 9800 GTX, which had been developed as the flagship of the GeForce 9 series to replace the GeForce 8800 GTX/Ultra. Such a complex and expensive PCB would not do for products with recommended prices of $129 and $149, so something had to be done about it.

 