by Alexey Stepin
12/06/2004 | 06:29 PM
Now that actual PCI Express-supporting chipsets have arrived, this new interface, actively promoted by a score of industry-leading companies, has moved beyond lab samples and started its conquest of the world. So far, at a slow pace – only two families of Intel’s chipsets (the 925 and 915) currently support PCI Express, but similar chipsets from VIA, SiS and NVIDIA are coming up. The real battle for the PCI Express market is going to break out in the last weeks of this year.
According to the common opinion, the PCI Express bus has been developed to replace two well-established industry standards – PCI and AGP – because their bandwidths have become insufficient for current applications. This goes without argument for the PCI bus with its 133MB/s, but AGP is another matter, which we will discuss now.
The AGP bus is in fact a special variety of PCI, optimized to provide the maximum speed of data transfers from the system RAM to the graphics card, but not in the opposite direction. As you know, this was done to allow storing textures in the system memory, but it soon turned out that the AGP DIME mode didn’t allow achieving acceptable performance. As a result, no one used AGP texturing, save for Intel, whose i740 graphics chip stored all texture data in the computer’s system memory.
Much time has passed since then, and the amount of memory installed on graphics cards has increased significantly. The AGP bus is now mainly used to pump the textures into the local graphics memory once and to transfer vertices later; the graphics processor works with the textures directly, taking them from the local memory, which is clocked at much higher frequencies than the system RAM.
The AGP 8x standard boosted the data-transfer rate to 2.1GB/s, but the AGP 4x-8x transition didn’t actually give any advantages as the bandwidth of the earlier standard was quite enough to “feed” all the necessary data to the graphics processor.
Still, the advent of the PCI Express interface implies the transition from AGP to PCI Express x16, the latter having a bandwidth of 4GB/s in each direction. This speed is hardly called for today, but it may come in handy as games appear with complex graphics that require processing huge amounts of geometry and texture data.
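As a back-of-the-envelope check, the peak rates quoted above follow directly from each bus’ clock rate, width and encoding. The sketch below (our own illustration, not part of the original comparison) reproduces the 133MB/s, 2.1GB/s and 4GB/s figures; real-world throughput is lower because of protocol overhead:

```python
# Rough peak-bandwidth arithmetic for the three buses discussed above.
# These are theoretical maximums; actual throughput is always lower.

def pci_bandwidth_mbs(clock_mhz=33.33, width_bits=32):
    """Classic PCI: 33.33 MHz x 32 bits, one transfer per clock."""
    return clock_mhz * 1e6 * width_bits / 8 / 1e6  # MB/s

def agp8x_bandwidth_gbs():
    """AGP 8x: 66 MHz base clock, 8 transfers per clock, 32-bit wide."""
    return 66e6 * 8 * 4 / 1e9  # GB/s

def pcie_x16_bandwidth_gbs():
    """PCIe 1.0 x16: 2.5 GT/s per lane, 8b/10b encoding, 16 lanes."""
    return 2.5e9 * 8 / 10 / 8 * 16 / 1e9  # GB/s, per direction

print(round(pci_bandwidth_mbs()))       # ~133 MB/s
print(round(agp8x_bandwidth_gbs(), 1))  # ~2.1 GB/s
print(round(pcie_x16_bandwidth_gbs()))  # ~4 GB/s
```

Note that PCI Express, unlike AGP, offers the full rate in both directions simultaneously, which is why its figure is quoted per direction.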
Both major players in the PC graphics field – ATI Technologies and NVIDIA Corporation – welcomed the new bus, but took opposite approaches to the implementation of PCI Express in their products. NVIDIA preferred to leave its current GPUs as they were and supported PCI Express by means of a special converter chip. ATI Technologies, on the contrary, endowed its X600 and X300 chips with native support of the new bus. We are going to see the worth of these two approaches now as we’ve got two mainstream graphics cards on GPUs from both companies for our today’s tests. They are PowerColor X600 XT and Albatron Trinity PCX 5750.
As you can easily guess, the former card is based on the ATI RADEON X600 GPU, while the latter is in fact a PCI Express version of the NVIDIA GeForce FX 5700. Both cards came to us in their colorful retail packages with documentation, adapters, cables, software and all. The package with the “PowerColor” label and the ATI logo was on top of the parcel, so we’ll open it up first.
PowerColor’s card comes in a rather small package of a sober design, which some people may actually prefer.
There is nothing redundant in the decoration – just an official combination of black and blue with a small patch of color in the center. The special texture on the black-and-blue background makes the package look as if it were made of carbon fiber rather than ordinary cardboard. The back side of the package contains all the information about the technical characteristics of the product. You find the following inside the box:
The discs with software are packed into a special envelope labeled ProPack. The accessories to the PowerColor X600 XT are truly rich, but let’s take a look at the graphics card proper.
The crimson-red coloration of the printed circuit board of the PowerColor X600 XT looks cute, but the cooling system can astonish anyone: a roughly-processed aluminum heatsink, fastened to the board with two traditional spring clips, carries a queer-looking disc with narrow slits. There are strange rubber tendrils and a rubber ring in the center, too.
We are quite at a loss as to what this thing is meant to symbolize. The disc doesn’t seem to add any beauty to the device; moreover, it almost blocks the airflow to the fan that blows at the heatsink. The only thing this rubber disc can do is prevent the otherwise efficient cooling system from working normally.
The heatsink sits tight on the memory chips – the thermal pads have an appropriate thickness, but the memory chips on the back side of the PCB get no cooling at all. Moreover, two pads out of four are smaller and don’t wholly cover their chips, which may result in poorer heat transfer.
The PowerColor X600 XT uses the reference PCB design which is rather complex due to the relatively high clock rates – there are a lot of elements even on the front side of the PCB. There are also seats left for a Rage Theater chip with its companions that endow the card with the VIVO functionality. These seats are empty in our sample as the next snapshot shows:
The back side of the PCB carries only a handful of small elements. There is no additional power connector – the power the device receives through the PCI Express x16 slot is quite sufficient.
The card carries 128 megabytes of graphics memory; modern FBGA-packaged 2.5ns chips from Hynix are employed here. The chips are rated to work at 400 (800DDR) MHz, but they are clocked at 350 (700DDR) MHz on the PowerColor card. Checking the card after installing it into the system, we were surprised to find that the frequency of the graphics core equaled only 350MHz. That’s strange for a graphics card with “XT” in its name, because the original RADEON X600 XT from ATI is clocked at 500MHz GPU and 365 (730DDR) MHz memory. So, the frequencies are severely cut down on the card from PowerColor. The low GPU clock rate can have a most negative effect on performance, since even the reference RADEON 9600 PRO and X600 PRO cards clock their processors at 400MHz, not to mention the RADEON 9600 XT clocked at half a gigahertz.
Running a little ahead, we should confess that the PowerColor card, being based on the reference PCB and equipped with normal 2.5ns memory, easily reached 500/365 (730DDR) MHz when overclocked – in other words, the nominal frequencies of the RADEON X600 XT. The overclocked PowerColor was stable throughout our tests and delivered the performance of a full-fledged RADEON X600 XT.
So why did the manufacturer drop the clock rates of its product? Is it an attempt to make the device cheaper? Well, the use of the reference PCB and the full-speed memory doesn’t confirm this supposition. We may be dealing with something like the FX Power Pack! Ultra/1300XT Golden Sample from Gainward, which had the design of the GeForce FX 5900 Ultra and offered excellent overclockability, but whose default clock rates equaled those of the cheaper GeForce FX 5900 XT. But Gainward positioned its card as an overclocker’s choice from the start, giving all the necessary information to the potential customer, while PowerColor never mentions that its card works at reduced clock rates. There’s of course a possibility that this is a problem of our particular sample, but otherwise people who want to buy a RADEON X600 XT may be disappointed with the PowerColor X600 XT, not knowing beforehand about its real technical characteristics. We think this approach can only harm the manufacturer’s own reputation – customers’ confidence is easily lost, but hard to regain.
The PowerColor X600 XT is no leader in acoustic characteristics – its small high-speed fan has to suck air in through the narrow slits in the “decorative” disc that covers it almost completely, so it’s no wonder it produces quite an annoying hissing sound. The term “noiseless” has nothing to do with this card, of course. We also found a bright red LED on the fan, but this highlighting was practically invisible through the thick layer of red rubber.
The card from PowerColor overclocked to 520MHz GPU and 380 (760DDR) MHz memory in our tests. That’s excellent considering the starting point (350MHz GPU and 350 (700DDR) MHz memory), but only average if we compare it to the frequencies of the reference RADEON X600 XT. Considering this good overclockability, it is all the stranger that the default frequencies of the card are set so low.
The quality of the onscreen image in 2D applications is what can be called the norm for the majority of modern graphics cards, i.e. it is sharp in all resolutions up to 1600x1200@75Hz inclusive.
Let’s now take a look at Albatron’s offer.
The Trinity PCX 5750 comes in a box a couple of centimeters wider than the package of the PowerColor X600 XT, but with a design that is the complete opposite of PowerColor’s. Here, we have an extremely bright and eye-catching color scheme.
The bright colors attract the eye, while the picture on the front side of the box reminds you about the gaming purpose of the product: you see a humanoid war machine colored aggressive red and a cute-looking girl, probably the pilot of the steel monster.
You may call this design gaudy or tasteless, but it certainly gets the job done – attracting the potential customer. A note about the packaging: the card from PowerColor is packed into a white box that’s wrapped into a soft cover you have to remove before opening, whereas the box from Albatron can be opened right away. The box, whose back side describes the technical capabilities of the product, includes the following:
Well, it is rather measly even compared to the PowerColor X600 XT, not to mention the luxuries of ASUS’ products. They didn’t even include a DVI-I-to-D-Sub adapter, which is an indispensable accessory for any modern graphics card! There are no cables for attaching the card to video equipment, either.
Take a look at the photos of the two graphics cards and you'll realize that Albatron and PowerColor took opposite approaches to designing their products:
Albatron's card looks cute enough with its dark-blue PCB and silvery-blue cooling system fastened to the PCB with three spring clips. The combination of dark blue and silver is a winning one – we should give Albatron some additional points for design at least.
The cooler of the Trinity PCX 5750 has one serious defect, though. It has a poor contact with the memory chips, despite the rubber-like pads. The pads are just not thick enough for a proper contact - in our sample of the card some of the pads stuck to the memory chips leaving a gap between themselves and the heatsink sole. Other pads just touched the heatsink gently. The memory chips on the back side of the PCB are not cooled at all, like on the PowerColor X600 XT. Otherwise, the cooling system of the Trinity PCX 5750 looks reassuring enough. Besides the GPU core and the memory chips the cooler also keeps the AGP-PCI Express bridge cool (as you remember, NVIDIA implements the new interface through a bridge chip, so the GeForce PCX 5750 is in fact a special version of the GeForce FX 5700).
The small open-die chip under the graphics processor is that very bridge that links the two graphics buses.
The PCB design isn't complex, save for the area on the back side near the GPU and the AGP-PCI Express bridge, where there are a lot of small elements. The card from Albatron doesn't use an additional power connector, either, as the PCI Express bus can supply up to 75 watts of power to the attached device. The Trinity PCX 5750 consumes far less than that.
This graphics card carries 128 megabytes of graphics memory accessed across a 128-bit bus: there are eight 3.6ns chips from Samsung in the obsolete TSOP package. The memory is rated for 275 (550DDR) MHz but is actually clocked at 250 (500DDR) MHz – probably because some Trinity PCX 5750 samples come with cheaper 4ns memory from Hynix. The graphics processor of this card is clocked at 425MHz.
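For reference, a card’s peak memory bandwidth follows directly from the bus width and the effective (DDR) clock rate. A quick sketch (the function name is ours, for illustration only):

```python
def memory_bandwidth_gbs(clock_mhz, bus_width_bits, ddr=True):
    """Peak memory bandwidth in GB/s: effective clock x bus width."""
    effective_mhz = clock_mhz * (2 if ddr else 1)  # DDR transfers data twice per clock
    return effective_mhz * 1e6 * bus_width_bits / 8 / 1e9

# Trinity PCX 5750: 250 (500DDR) MHz on a 128-bit bus
print(memory_bandwidth_gbs(250, 128))  # 8.0 GB/s
# PowerColor X600 XT as shipped: 350 (700DDR) MHz on a 128-bit bus
print(memory_bandwidth_gbs(350, 128))  # 11.2 GB/s
```

The gap between these two figures goes a long way toward explaining the performance differences in the bandwidth-hungry eye candy modes below.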
This graphics card couldn’t boast a total lack of noise – the small fan produced some noise which was quite perceptible against the sounds of other system components, but there was no annoying hiss as with the PowerColor X600 XT. So, the acoustic properties of the Trinity PCX 5750 can be considered acceptable.
Overclocking yielded excellent results: the card was stable at 540/300 (600DDR) MHz, its nominal frequencies being 425/250 (500DDR) MHz! Thus, this graphics card surpassed all other GPUs from NVIDIA in the core frequency, including the GeForce FX 5800 Ultra with its 500MHz. The relatively small memory overclocking gain is explained by the simplified PCB design and the use of TSOP-packaged chips with a rather high access time by today’s standards. Anyway, 300 (600DDR) MHz is excellent for memory chips rated to work at 275 (550DDR) MHz.
The quality of 2D image was no worse than with the PowerColor X600 XT, i.e. it was normal in all resolutions up to 1600x1200@75Hz inclusive. Well, it’s quite difficult to meet a graphics card that would output a bad 2D image nowadays – the manufacturers usually take care about that.
We explored the two above-described graphics cards on a testbed configured as follows:
We used the following games and benchmarks for our tests:
First Person 3D Shooters:
Third Person 3D Shooters:
We selected the best possible graphics quality settings in each application – the same for cards on GPUs from ATI and NVIDIA. The only exception is Doom 3: we used the Medium Quality settings for this game as recommended by id Software for mainstream graphics cards equipped with 128MB of graphics memory. We also enabled tri-linear and anisotropic filtering optimizations in the ForceWare driver. Again, since the tested cards are middle-range products, we don’t publish the absolutely unplayable results we got in some operational modes of the more demanding games.
We see the outcome of PowerColor’s experiments with the clock rates of its card – it is much slower than the reference RADEON X600 XT. What’s more, the PowerColor card is even worse than the original RADEON X600 PRO! On the other hand, this doesn’t prevent the PowerColor X600 XT from outperforming the Trinity PCX 5750.
We see a rare picture when we turn on full-screen anti-aliasing and anisotropic filtering – the Albatron Trinity PCX 5750, a card with a NVIDIA GPU, becomes a leader. It is competing with the RADEON X600 XT and leaves it behind at overclocking. That’s just the beginning, though. Let’s see what we have in other tests.
The game is very harsh on the computer’s graphics subsystem, so we only used two low resolutions and didn’t enable anti-aliasing or anisotropic filtering.
As you see, none of the graphics cards provides an acceptable fps rate here, but the reference RADEON X600 XT is the closest to doing so. We are not talking about 1280x1024 resolution – you don’t get more than 20fps with any of the cards.
The d3dm4 level is simple as it is intended for multiplayer games and has no monsters. Three graphics cards have a more or less acceptable speed here – the overclocked PowerColor X600 XT, the overclocked Albatron Trinity PCX 5750 and the original RADEON X600 XT. They are predominant in 1280x1024 resolution, too, but you won’t have a comfortable play in this display mode anyway. After all, Doom 3 isn’t for mainstream hardware; it likes powerful graphics processors, huge amounts of RAM, and fast CPUs.
The Albatron Trinity PCX 5750 is an outsider here, as it can only reach the level of the RADEON X600 XT through overclocking.
The Albatron wins two resolutions out of three in the eye candy mode and almost overtakes the original RADEON X600 XT at overclocking.
It’s almost the same on the Metallurgy level, although this map is simpler compared to Torlan.
PowerColor’s experiments with the clock rates of their card played a bad trick on it in this test – the Albatron has the same performance and even becomes the leader in 1600x1200. The specifics of this level must have helped it somewhat – its closed environments and complex geometry are better suited to the fast vertex processors of the Albatron Trinity PCX 5750.
We won’t publish the results we got in 1600x1200 as all the participating graphics cards yielded less than 15fps in that resolution.
There’s parity in this shooter – the Albatron Trinity PCX 5750 has almost the same results as the PowerColor X600 XT. Of course, the latter is better at executing pixel shaders this game abounds in, but its greatly reduced GPU frequency prevents it from showing its best.
We had no doubts about the winner of the Far Cry test as the RADEON X600 in its various incarnations is more efficient at handling pixel shaders. Still, the PowerColor X600 XT has the worst result at the nominal frequencies due to the low GPU clock rate. This is true for the eye candy mode with its full-screen antialiasing and anisotropic filtering, too.
The only difference on the Regulator level is that the gap between the PowerColor and the original RADEON X600 XT is much smaller. The Albatron remains in last place, and that’s natural considering the specifics of the GeForce FX architecture and Far Cry’s shader-heavy game engine.
Painkiller is among those few games that feel all right under any conditions. Thanks to its well-written engine, the game is playable even on the slowest of graphics cards. That’s why we can offer you the numbers for all the resolutions and modes:
Once again the Albatron Trinity PCX 5750 finds itself in last place; the top position is shared by the reference RADEON X600 XT and the overclocked PowerColor X600 XT. Note that all the participating cards allow for a comfortable play in any resolution, but the game is far from being primitive – its levels are beautiful and complex and there are shader-based special effects everywhere.
With full-screen anti-aliasing and anisotropic filtering enabled, two cards – the original RADEON X600 XT and the overclocked PowerColor – maintain playability up to 1600x1200 resolution, notching 30fps in it. So, Painkiller is really a rare combination of beauty and simple tastes.
Different RADEON X600 XT models are competing here; the PowerColor X600 XT has the worst results working at the default frequencies. When overclocked, it overtakes the RADEON X600 XT made by ATI. Although working in the DirectX 8.0 mode, the Albatron has quite noncompetitive results.
It’s all the same in the second demo we made in this game.
This game makes an extensive use of various special effects created by means of pixel shaders, so the outcome is quite expected – the graphics cards on the X600 GPU are again in the lead. The overclocked Albatron Trinity PCX 5750 is, however, close to the competitor that works at the default frequencies and even overtakes it in 1600x1200.
The PowerColor X600 XT and the Albatron Trinity PCX 5750 have the same speeds at their default frequencies, but the PowerColor is far ahead at overclocking.
The PowerColor X600 XT is slightly faster than the Albatron Trinity PCX 5750 in the pure speed mode. The overclocked PowerColor is beyond competition in this test.
This is a classic case – when we enable full-screen antialiasing and anisotropic filtering, the higher load on the graphics memory subsystem proves to be an advantage for the GPUs from ATI Technologies. The overclocked PowerColor X600 XT still holds its first place.
This time we limit ourselves to the first two resolutions of the pure speed mode, since the performance delivered by the test participants is low even in them.
Various flavors of the RADEON X600 XT win this test, but the PowerColor is the last among them at its regular GPU and memory frequencies. It becomes the best at overclocking, though. Once again we’re left wondering why PowerColor dropped the GPU clock rate of its card so low.
Lock On is a highly demanding game, too, but less so than Aces in the Sky. The tested cards are advancing neck and neck, but the PowerColor X600 XT of course wins at overclocking, since its memory speeds up better than the obsolete and slow memory of its competitor. In the eye candy mode the Albatron is behind the PowerColor X600 XT at the default frequencies, but overtakes it at overclocking.
Based on the GeForce FX architecture, the Albatron Trinity PCX 5750 suffers from its main disadvantage – low pixel shader performance. That’s why this card is behind the rest of the devices.
It seems like a tie in the eye candy mode, as the Albatron and PowerColor have almost the same results. The overclocked PowerColor X600 XT is beyond competition, though.
The PowerColor X600 XT is not far ahead of the Albatron Trinity PCX 5750 and loses to the original ATI RADEON X600 XT. Overclocking helps the card from PowerColor to win once again.
It’s almost the same in the eye candy mode, but the gap between the PowerColor and the Albatron is somewhat wider.
None of the graphics cards provides an acceptable fps rate, but still the cards with the RADEON X600 GPU have better results.
Aquamark3 enjoys high fill rates and fast vertex processors, so this is one of the few tests the Albatron Trinity PCX 5750 manages to win, overclocking aside. But if we overclock the two tested cards, the PowerColor X600 XT comes out ahead, again.
Albatron’s card feels less confident under a high load – it just steps back to the last place, while the various scions of the RADEON X600 family have the podium for themselves. The PowerColor X600 XT delivers roughly the performance of the ATI RADEON X600 PRO.
3DMark03’s totals seem to tell quite clearly that the Albatron Trinity PCX 5750 is the slowest card present, and the overclocked PowerColor X600 XT is the fastest. Well, totals can be misleading sometimes, so let’s check each of the game tests independently.
Despite all the simplicity of the first game test, Albatron’s card again has the lowest result, while the PowerColor X600 XT keeps on the same level with the RADEON X600 PRO. With full-screen anti-aliasing and anisotropic filtering enabled, the Trinity PCX 5750 outputs more frames per second than the PowerColor.
The competitors have the same performance in the second game test, but the Trinity PCX 5750 gains the upper hand in the eye candy mode.
Strangely enough, the Albatron Trinity PCX 5750 slows down in the third game test and even falls behind the PowerColor X600 XT clocked at its default frequencies. Why strange? Because NVIDIA’s architecture has always been highly efficient in the second and third tests of the 3DMark03 suite. This must be the bad influence of the slow memory. On the other hand, the Albatron is ahead in the eye candy mode.
As expected, the PowerColor X600 XT wins the fourth game test which uses complex version 2.0 pixel shaders. ATI’s VPUs are overall more efficient with such shaders than NVIDIA’s.
Drawing a conclusion to this review is simple, as the only point where the PowerColor X600 XT is inferior to the Albatron Trinity PCX 5750 is appearance. The latter looks prettier, but loses to the competitor in almost all parameters, from the obsolete GeForce FX architecture, which is inefficient in modern applications, and the slow TSOP-packaged memory to the scanty accessories, where even a DVI-I-to-D-Sub adapter is missing. As for their comparative performance, one glance over the following diagram should answer all of your questions:
So, the Albatron is slower than the PowerColor in almost every test or, more rarely, has the same performance, for example in Doom 3. The only exception is the beta version of the second next-generation game, where the lack of pixel shaders coupled with the complex scene geometry allows the Trinity PCX 5750 to win. The overclocked PowerColor X600 XT, however, leaves no chance at all to the Albatron Trinity PCX 5750, as it reaches the level of the reference RADEON X600 XT (the bright orange line in our diagram).
The Albatron is doing better in 1024x768 with full-screen antialiasing and anisotropic filtering enabled, but the PowerColor X600 XT still wins more tests.
Speaking about the device from PowerColor, we are really astonished at the extremely low default GPU frequency, as the card is manufactured according to the reference design and can work without problems at the normal frequencies of a RADEON X600 XT – 500/365 (730DDR) MHz. If the manufacturer had wanted to make an inexpensive product, it would have certainly used a simplified design or at least equipped the card with slower (i.e. cheaper) memory chips. But in reality, the PowerColor X600 XT has the reference PCB and fast FBGA-packaged 2.5ns memory chips, like any regular RADEON X600 XT. The reduction of the default GPU frequency from 500 to 350MHz makes the PowerColor card, in some cases, slower than even the RADEON X600 PRO, which clocks its GPU at 400MHz. Again, the PowerColor can easily overclock to the normal frequencies of the RADEON X600 XT and deliver the appropriate performance. The most annoying thing, though, is that the manufacturer doesn’t warn customers about the frequency reduction, which is unfair towards people who are shopping for a RADEON X600 XT-based card.
As for the Albatron Trinity PCX 5750, we have a beautiful and well-made product, which is regrettably outdated now. We’re looking forward to the more progressive GeForce 6600 series appearing in shops, so purchasing a graphics card with the GeForce FX architecture for the PCI Express platform seems to be a hasty decision now. As our tests confirm, the GeForce 6600 provides more performance as well as support for Shader Model 3.0.
According to Pricewatch.com, a PowerColor X600 XT would cost you about $190, which is rather steep for a card with reduced frequencies. You may remember that the recommended price of the GeForce 6600 GT is $199 – that’s just a little more than the price of the PowerColor card, but the GeForce 6600 GT gives you much more performance. Moreover, the cheaper GeForce 6600 is going to come out soon, which is faster even than the original RADEON X600 XT, not to mention the PowerColor X600 XT. This high price doesn’t allow us to recommend that card to you. It may be of some interest to overclockers, again if the price suits them.
The Albatron Trinity PCX 5750, according to the same search system, costs about $135, and that’s more appropriate than the price of the PowerColor. On the other hand, the Albatron card uses the outdated GeForce FX architecture, which makes it less suitable for modern games. Its excellent overclockability may be interesting to overclockers, but otherwise we don’t think it wise to purchase this card – it would be better to wait for the GeForce 6600.
So, the two mainstream graphics cards for the PCI Express platform we have reviewed today can suit the current owners of this platform or people who are going to transition to it right now, but don’t want to pay big sums for a GeForce 6800 GT or a RADEON X800 XT/PRO. If you’re going to make this transition a bit later on, you may want to wait for GeForce 6600/6600 GT cards to appear in shops. After these new cards arrive, prices for devices of the RADEON X600 and GeForce PCX families will surely go down, making them a more attractive buy for users with a limited budget.
Now let me sum up the highs and lows of the two tested solutions:
PowerColor X600 XT
Albatron Trinity PCX 5750