
Pages: [ 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 ]

ATI Radeon HD 3870 X2: Work Principles

Here are the specifications of the ATI Radeon HD 3870 X2 card in comparison with Nvidia’s counterparts:

In fact, except for GPU and memory frequencies, the new card is no different from two Radeon HD 3870 cards working in CrossFire mode. Theoretically, the ATI Radeon HD 3870 X2 boasts huge potential and should be competitive against the GeForce 8800 GTX/Ultra as well as the GeForce 8800 GTS 512MB, yet you can never be sure with multi-GPU solutions: you can only see what they are capable of by testing them in real 3D applications. But before we benchmark the new card, let's see how it is designed.

The two RV670 GPUs are joined together on a single PCB in CrossFire mode by means of the Compositing Engine available in each graphics core. In earlier versions of CrossFire this engine was external, implemented as a Xilinx Spartan-3 FPGA; it was first integrated into the graphics core in the ATI RV570 chip used on the Radeon X1950 Pro. The logical scheme of CrossFire looks like this:

Here, each GPU has its own PCI Express bus, which would normally mean the mainboard must have two PCI Express x16 slots and the chipset must support CrossFire technology. Such support is implemented in AMD's and Intel's chipsets but not in Nvidia's, which doesn't suit the ATI Radeon HD 3870 X2: positioned as a single graphics card, it is expected to work with both its graphics cores on any mainboard with any chipset.
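To make the division of labor between the two GPUs concrete, here is a minimal, purely illustrative sketch of Alternate Frame Rendering (AFR), the rendering mode CrossFire typically uses with two cores: frames are assigned round-robin, rendered independently, and the compositing engine re-serializes them into display order. All function and variable names here are hypothetical; the real scheduling lives inside the graphics driver and hardware.

```python
# Illustrative sketch of Alternate Frame Rendering (AFR) with two GPUs.
# The actual logic is implemented in the driver; names here are hypothetical.

def assign_frames_afr(frame_ids, gpu_count=2):
    """Round-robin frame assignment: even frames to GPU 0, odd frames to GPU 1."""
    return {frame: frame % gpu_count for frame in frame_ids}

def composite(assignments):
    """The compositing engine merges finished frames back into display order."""
    return [f"frame {f} (GPU {g})" for f, g in sorted(assignments.items())]

schedule = assign_frames_afr(range(4))
print(composite(schedule))
# ['frame 0 (GPU 0)', 'frame 1 (GPU 1)', 'frame 2 (GPU 0)', 'frame 3 (GPU 1)']
```

The round-robin split is what makes AFR scale well in the best case (each GPU renders only half the frames) and also what makes it fragile: if a game's frames depend on each other, or the driver lacks a profile for the game, the scheme degrades toward single-GPU performance.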

There have been attempts to create a universal multi-GPU solution that occupies only one expansion slot ever since 2000. We can recall the ATI Rage Fury Maxx, which fell victim to faulty software AFR and generally buggy drivers. The last such attempt by Nvidia was more of a success. Working on its GeForce 7900 GX2 and later the GeForce 7950 GX2, the company tried to solve the problems of inter-GPU communication and compatibility by means of a special 48-lane PCI Express switch. It looked like this:

Alas, Nvidia did not achieve full compatibility: its graphics card would not start up on some mainboards. The company's website even had a special page listing certified mainboards the GeForce 7950 GX2 was guaranteed to work with. Besides that, the old weakness of Nvidia's SLI technology, its dependence on driver support, was conspicuous too: the card's performance could plummet to the level of a GeForce 7900 GT or even lower if a particular game was not explicitly supported in the driver.

That solution had potential, however, and Nvidia could have spent the time and money to polish its dual-GPU card to perfection, but it preferred to abandon the GeForce 7950 GX2 in favor of the new G80 core and the GeForce 8800 series of graphics cards, which not only delivered higher performance but also featured a more advanced unified architecture.


