
Now that actual PCI Express-supporting chipsets have appeared, this new interface, actively promoted by a score of industry-leading companies, has moved beyond lab samples and started its conquest of the world. So far, the pace is slow: only two families of Intel's chipsets (the 925 and 915 series) currently support PCI Express, but similar chipsets from VIA, SiS and NVIDIA are on the way. The real battle for the PCI Express market will break out in the last weeks of this year.

Transitioning to PCI Express Now

According to common opinion, the PCI Express bus was developed to replace two well-established industry standards – PCI and AGP – because their bandwidths have become insufficient for current applications. This is beyond argument for the PCI bus with its 133MB/s, but AGP is another matter, which we will discuss now.

The AGP bus is in fact a special variety of PCI, optimized to provide the maximum speed of data transfers from the system RAM to the graphics card, but not in the opposite direction. As you know, this was done to allow textures to be stored in system memory, but it soon turned out that the AGP DIME mode could not deliver acceptable performance. As a result, no one used AGP texturing, save for Intel, whose i740 graphics chip stored all texture data in the computer's system memory.

Much time has passed since then, and the amount of memory installed on graphics cards has increased significantly. The AGP bus is now mainly used to pump the textures into the local graphics memory once and to transfer vertex data afterwards; the graphics processor works with the textures directly, taking them from the local memory, which is clocked at much higher frequencies than the system RAM.

The AGP 8x standard boosted the data-transfer rate to 2.1GB/s, but the transition from AGP 4x to 8x didn't actually bring any advantage, as the bandwidth of the earlier standard was quite enough to "feed" all the necessary data to the graphics processor.

Still, the advent of the PCI Express interface implies the transition from AGP to PCI Express x16, the latter having a bandwidth of 4GB/s in each direction. This speed is hardly called for today, but it may come in handy as games appear with complex graphics that require processing huge amounts of geometry and texture data.
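As a rough sanity check, the bandwidth figures quoted above follow directly from the link parameters: classic PCI is a 32-bit bus at 33MHz, AGP 8x is a 32-bit bus making eight transfers per 66MHz clock, and PCI Express 1.0 runs each lane at 2.5GT/s with 8b/10b encoding (8 payload bits out of every 10 bits transmitted). A small sketch of the arithmetic (the function names are ours, for illustration only):

```python
# Back-of-the-envelope bandwidth arithmetic for the buses discussed above.
# All figures are peak rates per direction; real-world throughput is lower.

def pci_bandwidth_mb():
    # Classic PCI: 32-bit (4-byte) bus at ~33 MHz, one transfer per clock.
    return 33.33e6 * 4 / 1e6              # ~133 MB/s

def agp8x_bandwidth_gb():
    # AGP 8x: 32-bit bus, 66 MHz base clock, 8 transfers per clock.
    return 66.66e6 * 8 * 4 / 1e9          # ~2.1 GB/s

def pcie_x16_bandwidth_gb(lanes=16):
    # PCI Express 1.0: 2.5 GT/s per lane; 8b/10b encoding leaves
    # 8 payload bits per 10 bits on the wire; divide by 8 for bytes.
    return lanes * 2.5e9 * (8 / 10) / 8 / 1e9   # 4.0 GB/s for x16

print(round(pci_bandwidth_mb()))         # ~133
print(round(agp8x_bandwidth_gb(), 1))    # ~2.1
print(round(pcie_x16_bandwidth_gb()))    # 4
```

Note that PCI Express, unlike AGP, is full-duplex, so the x16 link offers 4GB/s upstream and downstream simultaneously.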

Both major players in the PC graphics field – ATI Technologies and NVIDIA Corporation – welcomed the new bus, but took opposite approaches to implementing PCI Express in their products. NVIDIA preferred to leave its current GPUs as they were and added PCI Express support by means of a special bridge chip. ATI Technologies, on the contrary, endowed its X600 and X300 chips with native support for the new bus. We are going to see the worth of these two approaches now, as we've got a mainstream graphics card built on a GPU from each company for today's tests: the PowerColor X600 XT and the Albatron Trinity PCX 5750.

As you can easily guess, the former card is based on the ATI RADEON X600 GPU, while the latter is in fact a PCI Express version of the NVIDIA GeForce FX 5700. Both cards came to us in their colorful retail packages with documentation, adapters, cables, software and all. The package with the “PowerColor” label and the ATI logo was on top of the parcel, so we’ll open it up first.
