The leading graphics chip manufacturers, who keep making their accelerators more and more powerful, have faced a paradox: older models often proved more popular than the new ones. It is only natural: most users are not keen to pay an exorbitant price for yet another upgraded graphics subsystem, knowing that applications which make full use of all the new technologies will appear much later than the chips themselves. The same logic guides large PC OEMs, so they equip only a limited number of new systems with these novelties. This made graphics chip and card manufacturers focus on the low-end market sector and start developing value solutions.
3dfx was the first company to promote several products from the same family at once. Back when Voodoo3 3000 was regarded as an up-to-date accelerator, the company also offered a cheaper version, Voodoo3 2000, and then the even cheaper Velocity 100. Thanks to this rich choice of cards, 3dfx earned the highest profits and won a good share of OEM contracts. It sticks to the same strategy today, offering both the expensive Voodoo5 5500 and the cheaper Voodoo4 4500.
NVIDIA followed in the footsteps of 3dfx. It launched the GeForce2 GTS accelerator (at the moment of the announcement, cards based on it cost about $300) simultaneously with a cheaper version, GeForce2 MX (with cards intended to stay within $150). NVIDIA proved to have chosen the right strategy: GeForce2 MX-based graphics cards are very popular among both users and PC assemblers. Incidentally, it is thanks to this value product that NVIDIA overtook ATI in sales volume.
ATI, whose pricier graphics cards now compete seriously with NVIDIA's, had no intention of missing its chance and launched a 32MB version of RADEON with SDR memory. That is exactly the product we will now discuss in detail.
ATI also plans to introduce an even cheaper version of RADEON aimed first of all at 2D applications. Known as RADEON VE, it differs tangibly from RADEON SDR: it has no T&L engine and only one rendering pipeline. On the other hand, RADEON VE has a peculiarity of its own: it supports dual-monitor configurations. For that purpose ATI licensed the technology from Appian, a recognized leader in this field.
Before we get down to the test results, let's have a look at the architecture of ATI RADEON 32MB SDR. This card carries the same RADEON graphics processor we have already seen on ATI's more expensive accelerators. The company managed to reduce the cost of the product by using SDR SDRAM instead of DDR SDRAM, which offers twice the memory bus bandwidth. This approach differs a bit from NVIDIA's. As you remember, to create the cheaper GeForce2 MX, NVIDIA didn't just shift to SDR memory but also changed the GeForce2 GTS graphics core: the 4 rendering pipelines were cut down to 2, and the core frequency was lowered to 175MHz (GeForce2 GTS originally ran at 200MHz). The core frequency of RADEON 32MB SDR is 166MHz (the same as RADEON 32MB DDR and the OEM version with 64MB DDR). In other words, RADEON 32MB SDR can be regarded as a slower model only when compared with the retail RADEON 64MB DDR. Still, NVIDIA GeForce2 MX works at a higher core frequency than RADEON 32MB SDR.
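The bandwidth gap between the SDR and DDR versions is easy to reproduce with a back-of-the-envelope calculation (a sketch; the 128-bit bus and 166MHz clock are the figures quoted in this review):

```python
# Peak memory bandwidth: bus width (bytes) * clock * transfers per clock.
# SDR transfers once per clock, DDR twice -- that alone doubles bandwidth.

def bandwidth_gb(bus_bits, clock_mhz, transfers_per_clock=1):
    mb_per_sec = bus_bits / 8 * clock_mhz * transfers_per_clock
    return mb_per_sec / 1024  # MB/s -> GB/s

sdr = bandwidth_gb(128, 166)     # RADEON 32MB SDR: ~2.6 GB/s
ddr = bandwidth_gb(128, 166, 2)  # RADEON DDR versions: ~5.2 GB/s
print(round(sdr, 1), round(ddr, 1))
```

Same chip, same clock, half the bandwidth: that is the whole cost-cutting trick.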
Since the main rivals of ATI RADEON 32MB SDR are graphics cards based on NVIDIA GeForce2 MX, which fall into the same price group, and probably 3dfx Voodoo4 4500, it makes sense to compare the features of all three cards:
| | ATI RADEON 32MB SDR | NVIDIA GeForce2 MX | Voodoo4 4500 |
|---|---|---|---|
| Graphics processor | RADEON | NV11 | VSA-100 (Napalm) |
| Number of rendering pipelines | 2 | 2 | 2 |
| Number of texturing units per pipeline | 3 | 2 | 1 |
| Fillrate w/o multitexturing | 333 mln pixels/sec* | 350 mln pixels/sec | 333 mln pixels/sec |
| Fillrate with multitexturing | 333 mln pixels/sec* | 350 mln pixels/sec | 166 mln pixels/sec |
| Memory bus width | 128bit | 128bit | 128bit |
| Memory bus bandwidth | 2.6GB/sec | 2.6GB/sec | 2.6GB/sec |
| Manufacturing technology | 0.18 micron | 0.18 micron | 0.25 micron |

\* Not counting the effect of HyperZ, which is discussed later in this article.
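The fillrate figures in the table follow directly from the pipeline count, the core clock and the texture units per pipeline. A small sketch (the RADEON and GeForce2 MX clocks are quoted in this review; the 166MHz for Voodoo4 4500 is implied by its 333 Mpix/s figure):

```python
from math import ceil

# Pixel fillrate = pipelines * core clock, divided by the number of passes
# a pipeline needs to apply all textures: ceil(textures / TMUs per pipe).
def fillrate_mpix(pipes, clock_mhz, tmus_per_pipe, textures):
    return pipes * clock_mhz // ceil(textures / tmus_per_pipe)

radeon = fillrate_mpix(2, 166, 3, 2)  # 332 -> the table's "333" with rounding
gf2mx  = fillrate_mpix(2, 175, 2, 2)  # 350: 2 TMUs cover 2 textures per clock
v4500  = fillrate_mpix(2, 166, 1, 2)  # 166: a single TMU forces a second cycle
```

This is why Voodoo4 4500 loses exactly half its fillrate under multitexturing while the other two lose nothing.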
As the table shows, 3dfx Voodoo4 4500 is no serious opponent to the other two cards, since its multitexturing fillrate is only half that of the competitors. The architectures of GeForce2 MX and RADEON 32MB SDR, however, are very similar.
The only striking difference between the graphics processors is the number of texturing units per rendering pipeline: three in RADEON against two in GeForce2 MX. So, theoretically, in applications that take advantage of RADEON's ability to lay 3 textures over a pixel per clock, RADEON 32MB SDR should perform much better than NVIDIA's chip. Unfortunately, no such applications are available yet. All modern games overlay at most 2 textures per pixel, so this advantage of RADEON goes unused for now. As for the classic multitexturing mode, if we disregard RADEON's specific functions for more efficient use of the memory bus, GeForce2 MX looks like the faster solution thanks to its higher core clock and hence higher fillrate.
The T&L engine of RADEON 32MB SDR shows the same performance as the Charisma engine of ATI's more expensive graphics cards: 30 mln triangles per second. That is a bit more than the T&L engine of GeForce2 MX provides (25 mln triangles).
Now let's see how things stand in practice. ATI RADEON with SDRAM looks very much like its elder brother with DDR SGRAM memory. However, since it carries a different type of memory, its layout differs completely from that of the DDR SGRAM cards.
The card carries four 8MB 6ns SDR SDRAM chips from Micron, 32MB in total. The nominal memory frequency is 166MHz.
The memory and graphics chip frequencies of ATI RADEON 32MB SDR are synchronized. According to the specs, both run at 166MHz, but as the PowerStrip utility indicates, the real frequency is only 160MHz.
Take a look at the screenshot taken from PowerStrip:
The card is built on a green PCB and carries a heatsink with a fan, which has already become a traditional feature of ATI graphics cards. The heatsink is fixed to the graphics processor with thermal glue and provides the required cooling.
The card also has unpopulated spots for TV-Out and DVI-Out. As ATI states on its site, retail RADEON 32MB SDR cards should have a TV-Out, while the DVI-Out spot is there just in case.
Before we finally pass on to the test results, it makes sense to focus on ATI's proprietary HyperZ technology used in all RADEON graphics cards. As we will see a bit later, at high resolutions in 32-bit color RADEON performs better than GeForce2 MX, although the theoretical fillrate figures suggest just the opposite. It is HyperZ that makes ATI RADEON as fast as it is and brings the theory to naught. The technology unloads the memory bus, whose bandwidth happens to be the main bottleneck of today's classically architected graphics accelerators.
The main idea of HyperZ is to unload the memory bus by cutting down the number of pixel Z-coordinate transfers; in other words, it optimizes the work of the Z-buffer. This is achieved through three techniques used in RADEON: Hierarchical Z, Z-Compression and Fast Z-Clear.
With Hierarchical Z, RADEON first checks the visibility of a pixel and only then renders it. As a result, the accelerator's workload gets tangibly smaller when the scene is rich in overlapping objects: pixels hidden behind those closer to the observer are never engaged in rendering.
Z-Compression losslessly compresses the data stored in the Z-buffer, so pixel depth information travels over the bus in compressed form, unloading the memory bus. Of course, this puts extra strain on the graphics processor, but in the long run the enhancement yields a certain performance gain. It also eases the load on graphics memory, which would be busier if the data were stored uncompressed.
The third algorithm within HyperZ is known as Fast Z-Clear. It allows clearing the Z-buffer much faster before each new frame is rendered, working on large blocks of the buffer at once instead of rewriting every single value.
Finally, ATI promises that these three algorithms together increase the effective fillrate by about 20%, i.e. HyperZ brings the fillrate of ATI RADEON 32MB SDR to approximately 400 mln pixels per second. With HyperZ taken into account, the theoretical fillrate of RADEON 32MB SDR turns out higher than that of NVIDIA GeForce2 MX. But keep in mind that since the key principle of HyperZ is to unload the memory bus, the 20% increase is possible only when bandwidth really is the main problem of the graphics subsystem, i.e. in 32-bit color at high resolutions.
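The arithmetic behind that claim, just to check it:

```python
# HyperZ adds no raw fillrate; it wastes less bandwidth, which in
# memory-limited modes behaves like extra fillrate.
base = 2 * 166           # nominal fillrate: 2 pipelines * 166MHz = 332 Mpix/s
effective = base * 1.2   # ATI's promised +20%
print(round(effective))  # ~398 -> the "approximately 400", vs 350 for GeForce2 MX
```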
To estimate the practical effect of HyperZ, we tested RADEON 32MB SDR in 3DMark2000 with HyperZ enabled and disabled:
As we assumed, HyperZ grows more efficient as the resolution gets higher: the gain in fillrate reaches 30% at the maximum resolution! We have to admit that HyperZ is no marketing trick but a really smart technology providing a substantial performance gain.
The CD features driver version 4.12.3050. After installation, the ATI icon appears in the system tray.
You may use this icon to quickly open the display settings page, the device manager and a help file describing possible problems and remedies.
The display settings of the ATI RADEON 32MB SDR drivers provide a good deal of property pages that help adjust the card for work in different modes.
On the Direct3D page you can disable V-Sync, change anti-aliasing settings, enable or disable texture compression and choose the Z-buffer configuration.
OpenGL settings can also be changed on a corresponding page.
Apart from the above-mentioned pages, you get a gamma correction option. You can also check what driver version you are using.
In addition to the driver and utilities, the CD contains a lot of useful things: a user's manual, a utility that reports errors, an uninstaller and a demo movie showing what your ATI RADEON graphics card is capable of.
We used the following testbed for our practical experiments:
- Intel Pentium III 866 CPU;
- ASUS CUSL2 mainboard;
- 256MB (2 x 128MB) PC133 SDRAM by Hyundai;
- IBM DTLA-372050 HDD;
- Microsoft WINDOWS 98.
We used driver version 4.12.3050, shipped with the card. We also tried a newer driver version downloaded from ATI's web-site, but observed no changes in either performance or supported functions.
For a better comparison we took three more graphics cards:
- ATI RADEON 32MB DDR;
- SUMA Platinum GeForce2 MX 32MB (NVIDIA GeForce2 MX);
- 3dfx Voodoo4 4500.
We selected the following testing tools:
- Quake3 Arena version 1.16n (OpenGL).
- 3DMark2000 Pro version 1.1 (D3D).
First come the results obtained in Quake3 Arena (demo001).
The chart clearly shows that ATI RADEON SDR performs worse than its competitors in 16-bit color. In 32-bit color the situation is slightly different: although RADEON SDR failed to catch up with its elder brother, it managed to overtake the other two competitors. The performance gap between RADEON SDR and NVIDIA GeForce2 MX still exists, but it is rather insignificant.
Now let's see how the card works in Direct3D. For this purpose we checked its performance in 3DMark2000 Pro:
In 16-bit color at low resolutions NVIDIA GeForce2 MX is the indisputable leader, but ATI RADEON SDR starts pressing it at higher resolutions. In 32-bit color the situation is much the same; the only resolution where ATI RADEON SDR performs worse than NVIDIA GeForce2 MX is 800x600x32.
Since ATI offers no overclocking utilities, we used PowerStrip version 3.0 beta, which supports RADEON SDR.
The graphics card was overclocked to 190/190MHz and worked really well. Here is the performance of the overclocked card in Quake3 Arena (demo001):
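For reference, the headroom this represents, relative to both the real 160MHz clock reported by PowerStrip earlier and the nominal 166MHz spec:

```python
nominal, actual, overclocked = 166, 160, 190

gain_vs_actual = (overclocked / actual - 1) * 100    # ~18.8% over the real clock
gain_vs_nominal = (overclocked / nominal - 1) * 100  # ~14.5% over the spec
print(round(gain_vs_actual, 1), round(gain_vs_nominal, 1))
```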
From the technological point of view, ATI RADEON 32MB SDR is more advanced than GeForce2 MX and has a more promising future.
Compared with GeForce2 MX, RADEON SDR features a more powerful T&L unit and more texture units per rendering pipeline. That should pay off in games laying three textures over a pixel per clock, and such games are sure to appear soon, since the next generation of NVIDIA graphics chips, NV20, will also have three texture units per pipeline. Moreover, the unique HyperZ technology lets RADEON, despite its lower frequencies, beat GeForce2 MX at high resolutions and provides a significant performance gain in complex 3D scenes.
However, the $150 price of this card is too much for a basic version with no TV-Out, no dual-display support and no DVI-Out. The sum looks blatantly high considering that similar GeForce2 MX-based cards sell for $100-$110. So, for ATI RADEON 32MB SDR to occupy its niche on the market, the price should be lowered to the level of GeForce2 MX-based cards.