The graphics chip market was very promising, but chaotic, in the mid-nineties. There were dozens of companies offering graphics processors, but since there were no clear 3D standards, most of them had vanished into oblivion by 1998. In the late nineties the remaining pioneers, namely S3, Trident, Matrox and some others, were unable to sustain the rapid pace of innovation while remaining profitable, and today we see practically nothing new from them. As a result, there are now two major discrete graphics suppliers on the market: NVIDIA Corporation and ATI Technologies. Other companies release graphics chips very rarely and cannot offer enough performance for the highly competitive 3D market. This week a new hope was born – the XGI Volari family of graphics processors.
The new graphics breed announced by eXtreme Graphics Innovation targets several desktop market segments; in addition, a number of solutions are offered for the mobile field as well.
The vast majority of technologies found in the Volari V5 and Volari V8 graphics chips were inherited from SiS' Xabre line and were developed as part of the Xabre II project. Unsurprisingly, the Volari V5 and V8 chips support DirectX 9.0 capabilities; what is very unusual these days, though, is that the parts support a technology allowing two chips to work in tandem!
Currently there is not a lot of information concerning the architecture of the Volari V5 and V8 graphics processors, but we still managed to find some details for you.
Both the XGI Volari V5 and Volari V8 support DirectX 9.0 and OpenGL 1.4 capabilities, but there are concerns regarding their performance in complex Pixel Shader 2.0 and Vertex Shader 2.0 environments. The XGI Volari V5 and V8 have only two vertex shader pipelines, in contrast to four on the RADEON 9700/9800 and three on the GeForce FX 5800/5900. As for pixel shader units, the V8 boasts four, while the V5 has two. The V5 and V8 also differ in the number of rendering pipelines – four and eight respectively.
Following the best traditions of NVIDIA and ATI, XGI will offer two versions of the Volari V5 and two versions of the Volari V8: Ultra and “non-Ultra”, clocked at 350 and 300MHz respectively. XGI does not disclose which manufacturing technology it uses for the Volari chips, but keeping in mind that the chips are neither too complex nor particularly speedy, I would expect a 0.13 micron process in order to maximize yields.
XGI's Volari V5/V8 Ultra-based graphics cards will come with 750MHz memory, while the “non-Ultra” solutions will make do with 650MHz DDR SDRAM. In the future the company may also offer V5/V8 products with DDR-II memory clocked at 900 or even 1000MHz. Since XGI does not advertise any 256-bit memory bus, I assume that the chips still use a 128-bit bus.
The Volari V5 and V8 graphics processors have a built-in 400MHz RAMDAC, but require the external XV301 chip for TMDS transmission and/or dual-monitor output. Furthermore, the Volari V5/V8 chips feature an integrated Cipher video processor and support for hardware screen rotation.
A very interesting peculiarity of the Volari V5 and V8 chips is their ability to work in pairs – the so-called XGI Volari Duo technology. In theory this delivers twice the speed of single-chip products, but it leads to pretty high production costs and potential driver issues. Historically, a few consumer graphics cards have carried multiple graphics processors; the best-known are the Obsidian Voodoo2 SLI, 3dfx Voodoo5 5500 and ATI Rage Fury MAXX. The Voodoo2 SLI was too expensive for the general public and, for obvious reasons, was never adopted by the mass market. The Voodoo5 5500 was priced at $399, but it simply cost too much for 3dfx to produce while still being slower than the GeForce2 GTS. The Rage Fury MAXX was left behind by the original GeForce256 and plagued by driver issues, among other troubles. After the collapse of the MAXX, ATI dropped the idea of dual-chip RADEON 256-based graphics cards. All in all, there has never been a truly successful dual-chip consumer graphics solution, but XGI still wants to try its luck in the segment…
No actual graphics cards were announced alongside the GPUs.