ATI offers a wide range of graphics cards based on the RADEON core. The most expensive of them are monstrous creatures bundled with a TV tuner, 64MB of memory, Video-Outs, etc. Then comes the "common" RADEON DDR with 32MB of memory, followed by RADEON SDR and RADEON LE. But even the latter are too expensive for Low-End office PCs. Moreover, they are positioned as gaming solutions rather than office ones. So, for the first time in its long history, ATI has decided to launch a relatively cheap office-oriented graphics card. Thus, the RADEON VE chip and a graphics card of the same name came into being.
This card is based on a modified RADEON chip deprived of one pixel pipeline and the T&L unit. Instead, the developers integrated a second RAMDAC, a second CRT controller, and TV- and DVI-Outs. As a result, all the features of a fully-fledged dual-display graphics card, plus TV- and DVI-Outs, are concentrated in one single chip. Since it is much cheaper to manufacture a graphics card with all the functions embedded in one chip than to build a card with a whole bunch of additional components, RADEON VE costs considerably less than its competitors that also support dual-monitor configurations. For example, NVIDIA also has a presence in this market sector with its GeForce2 MX chip, which differs from the more expensive GeForce2 GTS by the absence of two pixel pipelines and the availability of TwinView dual-display technology.
Of course, here also comes Matrox with its G450, which is intended for non-gaming needs and supports DualHead technology, which allows connecting a second display to the system.
Such giants as Appian Graphics, a well-known developer of High-End graphics cards supporting multi-monitor configurations, offer quite expensive solutions and don't touch the Low-End sector that's why they can hardly be regarded as competitors here. However, it didn't prevent ATI from getting Appian Graphics' license for HydraVision technology to implement it in RADEON VE. We'll focus on this innovation later in the review.
We selected the following graphics card to test:
- ATI RADEON VE 32MB DDR SDRAM based on RADEON VE chip (test sample);
- SUMA Platinum GeForce2 MX Special Edition 32MB SDRAM based on NVIDIA GeForce2 MX chip;
- Matrox Millennium G450 32 MB DDR SDRAM based on G450 chip.
Our testbed had the following configuration:
- Intel Pentium III 1000MHz CPU;
- ASUS CUSL2 mainboard based on i815E;
- 2 x 128MB PC133 SDRAM;
- Fujitsu MPE3084AE 8.4GB 5400rpm HDD;
- Hitachi GD2500 DVD-drive.
We tested with the following software:
- Windows 98 SE;
- Windows 2000 + Service Pack 1;
- DirectX 8.0a for Windows 98;
- DirectX 8.0a for Windows 2000;
- Ziff-Davis Winbench99 v1.2;
- Quake3 Arena v.1.17;
- Unreal Tournament + patch 4.36;
- InterVideo WinDVD 2000.
- For GeForce2 MX we took Detonator 10.80 (released on 07.03.2001) for Windows 98 and Windows 2000.
- For ATI RADEON VE we chose drivers ver. 4.13.7089 (released on 05.03.2001) for Windows 98 and ver. 5.13.3102 (released on 26.02.2001) for Windows 2000.
- For Matrox Millennium G450 we installed drivers ver. 6.03.026 for Windows 98 and ver. 1.13.032 for Windows 2000.
Before passing over to detailed consideration of multi-monitor support and other peculiarities of RADEON VE and its rivals, we'll put together the main characteristics of the tested samples:
| | ATI RADEON VE | Matrox Millennium G450 | SUMA Platinum GeForce2 MX SE |
| --- | --- | --- | --- |
| Texturing units per pipeline | 3 | 1 | 2 |
| Fillrate (mln pixels per second) | 150 (w/o HyperZ) | 250 | 350 |
| Graphics memory type | 64bit DDR SDRAM | 64bit DDR SDRAM | 128bit SDRAM |
| First RAMDAC | 300MHz, embedded | 360MHz, embedded | 350MHz, embedded |
| Second RAMDAC | 300MHz, embedded | 230MHz, embedded | 150MHz, external |
| Supported displays and interfaces | VGA: CRT monitors; DVI: FPD monitors; S-Video, RCA (composite out via converter) | VGA: CRT monitors; S-Video, RCA (composite out via converter) | VGA: CRT monitors; S-Video, RCA (composite out via converter) |
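As a sanity check on the fillrate row: peak single-texel fillrate is simply the core clock multiplied by the number of pixel pipelines. A minimal sketch; the clock speeds below are commonly cited figures for these chips, not values stated in this review:

```python
def fillrate_mpix(core_clock_mhz: float, pixel_pipelines: int) -> float:
    """Peak fillrate in millions of pixels per second:
    one pixel per pipeline per core clock."""
    return core_clock_mhz * pixel_pipelines

# GeForce2 MX: commonly cited 175 MHz core, 2 pipelines
print(fillrate_mpix(175, 2))  # 350.0, matching the table

# G450: commonly cited 125 MHz core, 2 pipelines
print(fillrate_mpix(125, 2))  # 250.0, matching the table
```

RADEON VE's single pipeline is what keeps its figure so low despite the three texturing units, since multiple TMUs per pipeline raise texel throughput, not pixel fillrate.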
First we investigated the capabilities of RADEON VE in its various output modes, comparing them to those of Matrox Millennium G450 and SUMA Platinum GeForce2 MX SE.
Once the driver for RADEON VE (for Windows 98 or Windows 2000) is installed, a new page appears in the desktop properties:
On this page you may set the primary and the secondary output device. The card can display the image on monitors, LCD panels with a DVI interface, and TVs. It doesn't matter which device is the primary and which is the secondary: ATI provides an option to swap them. And thanks to two absolutely identical RAMDACs, you needn't worry about any quality loss when connecting the display to the secondary Out instead of the primary one.
RADEON VE can clone the image to both Outs or run in expanded desktop mode, splitting the desktop into two parts and showing each part on its own display.
Expanded Desktop (two monitors)
Expanded Desktop (monitor+TV)
You don't have to set the same resolution for both devices. For instance, you may have a large 1600x1200 desktop on the monitor accompanied by an 800x600 portion shown on the TV screen. In both Windows 98 and Windows 2000 the expanded desktop mode is supported by the operating system, and in Windows 2000 you'll find a HydraVision page in the desktop properties, where you may adjust the devices' settings and determine the primary and the secondary display. By the way, Windows 2000 allows expanding the desktop both vertically and horizontally.
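The expanded desktop is essentially a virtual rectangle enclosing both displays' areas. A rough sketch of how the combined desktop size could be computed; the layout and resolutions are just the example figures from the text, not how any particular driver does it:

```python
from typing import NamedTuple

class Display(NamedTuple):
    x: int  # top-left corner within the virtual desktop
    y: int
    w: int  # horizontal resolution
    h: int  # vertical resolution

def virtual_desktop(displays):
    """Bounding box of all attached displays; the horizontal or
    vertical arrangement is whatever coordinates were assigned."""
    right = max(d.x + d.w for d in displays)
    bottom = max(d.y + d.h for d in displays)
    return right, bottom

# Monitor at 1600x1200 with an 800x600 TV placed to its right
monitor = Display(0, 0, 1600, 1200)
tv = Display(1600, 0, 800, 600)
print(virtual_desktop([monitor, tv]))  # (2400, 1200)
```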
The TV-Out of RADEON VE also deserves attention. The top resolution available for the TV set is 1024x768; at higher resolutions, panning mode is enabled. Note that the maximum resolution GeForce2 MX offers for its S-Video Out is only 800x600, and the same is true of Matrox Millennium G450.
To adjust the image on the TV screen, click "TV" on the "Displays" page. In the newly opened window you can select the TV format and tune the image quality:
We can point out only one drawback of RADEON VE's TV-Out implementation: the color gamma is a bit poor. All the other features of our today's hero are not in the least worse than those of its competitors.
Overlay Support, DVD
Overlay technology implies transferring the graphics data not to the video buffer, but to a separate area of the graphics memory, where it can be additionally processed on the hardware level. Note that overlay frame size and color depth do not depend on the resolution and the color depth mode of the desktop.
The overlay buffer may be displayed on the desktop or on the TV screen, for instance. Moreover, many graphics cards allow the users to adjust the overlay colors regardless of the desktop settings. Overlay is displayed on the desktop with the help of the "chromakey" technology. The Windows operating system draws a window for the overlay and fills it with a key color (commonly it is purple). When the graphics controller is rendering the data to DAC and comes across this color, it replaces it with the overlay buffer data, presumably scaling it out to the window size or to the full screen.
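The chromakey substitution described above can be illustrated with a toy sketch: wherever the scanned-out desktop contains the key color, the pixel is taken from the overlay buffer instead. The color values and frame layout here are purely illustrative:

```python
KEY_COLOR = (255, 0, 255)  # the "purple" key color mentioned above

def composite(desktop_row, overlay_row, key=KEY_COLOR):
    """Replace key-colored desktop pixels with overlay pixels,
    mimicking the DAC-side chromakey substitution."""
    return [o if d == key else d
            for d, o in zip(desktop_row, overlay_row)]

desktop = [(0, 0, 0), KEY_COLOR, KEY_COLOR, (10, 10, 10)]
overlay = [(1, 1, 1), (2, 2, 2), (3, 3, 3), (4, 4, 4)]
print(composite(desktop, overlay))
# [(0, 0, 0), (2, 2, 2), (3, 3, 3), (10, 10, 10)]
```

In real hardware the substitution happens on the fly during scan-out, with the overlay data scaled to the window or the full screen, which is why the overlay's size and color depth are independent of the desktop's.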
RADEON VE supports overlays both in Windows 98 and Windows 2000. However, in the clone mode, when a DVD movie is played, there is no picture on the secondary screen, while the primary one works perfectly. And if we try to show the movie in full-screen mode, the secondary display simply gets disconnected, no matter whether it is a TV set or a monitor. The same thing happens in Windows 2000.
Among other graphics cards, GeForce2 MX acts the same way, and G450 shows the movie on both displays without any trouble at all.
In the expanded desktop mode RADEON VE and GeForce2 MX show nothing in the movie window, if even a tiny part of it is dragged to the other part of the desktop.
DVD on two displays (monitor+TV)
However, if a situation like that takes place in Windows 2000, RADEON VE does provide a picture, while GeForce2 MX still shows nothing on the second display.
As for G450, due to DualHead DVD Max, it is capable of showing full-screen DVD movie on the secondary display as well as on the primary one.
GeForce2 MX features a similar function called Video Mirror:
Still, in Windows 2000 DualHead DVD Max of G450 and Video Mirror of GeForce2 MX refused to work. Unfortunately, RADEON VE supports no functions of the kind.
As a final word about overlays, we would like to stress that GeForce2 MX provides the vastest opportunities for overlay control.
Speaking of the color settings for the entire display, Matrox Millennium G450 offers the poorest choice:
ATI licensed HydraVision technology from Appian Graphics. This technology is integrated into the operating system to help the user optimize his work by means of dual-display support and multi-desktop modes.
HydraVision ensures up to nine desktops with different applications running on each of them, if you wish. For instance, you may have your favorite solitaire or tetris on one desktop and the boring MS Word on another, then set a hotkey to shift from one desktop to another depending on the boss's location, thus enjoying the checkerwork of active and passive leisure time :-)
With the help of HydraVision you may also set the number of desktops you need and assign a name to each of them:
Besides, you may determine which desktop dialogs appear on, make the PC remember window locations, and the like:
Windows whose size isn't fixed can be maximized to full screen or to the entire expanded desktop. For this purpose a special window control button appears.
To the left of it you'll find another new button opening a window with HydraVision settings for the current application:
As a whole, HydraVision is a pleasantly simple and convenient technology aimed at making life easier for those who have to work with several huge and complicated applications at a time.
2D Performance and Image Quality
We measured 2D performance using a set of tests included in Ziff-Davis Winbench99. The tests were run at the resolution set to 1024x768x32. Here is what we got:
The same tests run in Windows 2000 look as follows:
All the cards show almost identical 2D performance. Only in Windows 2000, where performance dropped across the board, did RADEON VE lag behind a little.
2D image quality, on the other hand, is subjective and cannot be quantified. It seemed to us that Matrox Millennium G450 and ATI RADEON VE gave the best image quality on all the monitors we used (Daewoo CMC 1511b, Relisys 772, ViewSonic P775 and Samsung SyncMaster 900IFT), while GeForce2 MX showed a slight blurring effect in modes above 1280x1024.
We really liked the image quality on the TV screen provided by Matrox Millennium G450, while ATI RADEON VE demonstrated slight horizontal blurring. But bear in mind, please, that we used S-Video-to-RCA converters shipped together with the cards, so if the TVs were connected via S-Video the situation could be different.
3D Performance

As we predicted, the card based on GeForce2 MX is the leader. It is equipped with two pipelines carrying two texturing units each. Besides, it is the only chip among those tested that features hardware geometry support. RADEON VE lost this feature for the sake of a cheaper chip, and G450 never had it at all. Nevertheless, let us see the results.
Quake III Arena, standard settings:
RADEON VE allows playing Quake III Arena quite comfortably up to 1024x768x32. We observed a notable performance drop when switching to 32bit color, because the memory bus (though it is DDR memory) is only 64bit wide. HyperZ technology certainly gives some positive effect, but it is hard to assess because of the relatively modest results shown by RADEON VE.
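The bus-width argument is easy to check with rough arithmetic: a 64bit DDR bus moves 8 bytes twice per memory clock. A back-of-the-envelope sketch; the memory clocks used here are commonly cited figures for these cards, not values stated in this review:

```python
def ddr_bandwidth_gbs(bus_bits: int, mem_clock_mhz: float) -> float:
    """Peak bandwidth of a DDR bus in GB/s (2 transfers per clock)."""
    return bus_bits / 8 * 2 * mem_clock_mhz * 1e6 / 1e9

def sdr_bandwidth_gbs(bus_bits: int, mem_clock_mhz: float) -> float:
    """Peak bandwidth of an SDR bus in GB/s (1 transfer per clock)."""
    return bus_bits / 8 * mem_clock_mhz * 1e6 / 1e9

# RADEON VE: 64bit DDR at a commonly cited ~183 MHz memory clock
print(round(ddr_bandwidth_gbs(64, 183), 2))  # 2.93 GB/s

# GeForce2 MX: 128bit SDR at a commonly cited ~166 MHz lands
# in the same ballpark despite the twice-wider bus
print(round(sdr_bandwidth_gbs(128, 166), 2))  # 2.66 GB/s
```

At 32bit color every rendered pixel costs roughly twice the frame-buffer traffic of 16bit color, so a bandwidth-limited card like this one loses proportionally more speed in 32bit modes.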
For G450 a playable resolution was 800x600x16 and, perhaps, 800x600x32.
As expected, GeForce2 MX won indisputable leadership.
The tests in Unreal Tournament were run at maximum quality settings to compensate for the engine's main shortcoming, its very strong CPU dependence. For RADEON VE in Windows 2000 we had to set "UseAGPTextures" to "false", otherwise the system restarted every time we tried to run Unreal Tournament. That's why the results shown by RADEON VE may be somewhat lower at 1024x768, for in this mode RADEON VE effectively worked as a PCI graphics card.
Here are the results obtained:
In Windows 98 RADEON VE pleased us with a negligible performance decrease after switching to 32bit color, in spite of the narrow memory bus. The reason is that performance was limited by the CPU, not by the graphics card. So even RADEON VE, not to mention GeForce2 MX, whose results remain unchanged across resolutions, successfully rendered all the episodes in Unreal Tournament at these resolutions.
G450, however, failed to endure such strain, revealing a tangible change in performance as the resolution changed. This card offered more or less acceptable performance only at a modest 800x600x16, and only in Windows 98.
Conclusion

RADEON VE by ATI has produced a favorable impression in our test lab. Its strong points are multi-display support, good 2D quality, a high-quality TV-Out implementation, a DVI interface and, of course, multiple-desktop support with all the functions of HydraVision.
Still, there are some drawbacks worth pointing out: overlay control comes down to adjusting brightness, while the numerous TV-Out settings lack brightness control. Most probably the situation will improve when new drivers for RADEON VE appear.
One of the few things that can't be fixed by drivers is the low 3D performance. We suspect ATI's engineers went overboard cutting the chip's 3D capabilities, even though the card isn't positioned as a gaming solution.
We believe that RADEON VE will win a certain share of the market, especially thanks to its considerably lower price in comparison with other graphics cards. ATI intends it for those who need a high-quality graphics card with great 2D and dual-display support or just a graphics card with DVI interface.
The 3D performance of RADEON VE is on a par with that of NVIDIA Riva TNT2 Ultra. Taking into account its DirectX 7 support (and partial DirectX 8 support) and attractively low price, this graphics card is also a tempting option for gamers on a budget.