High-ranking executives of Nvidia Corp. said during various interviews that even though the company does not currently offer an up-to-date graphics card carrying two graphics processing units (GPUs), it does not rule out a situation in which its flagship will again feature two chips. The reason why Nvidia did not produce a multi-chip graphics board based on its latest GPU is that the chip is exceptionally expensive and consumes too much power.

In recent years ATI, the graphics product group of Advanced Micro Devices, has concentrated on improving its multi-GPU technology to address the market of ultra high-end graphics sub-systems either with several graphics cards or with dual-chip graphics boards. Meanwhile, Nvidia Corp. decided to focus on large monolithic graphics chips, which are quite hard to develop and manufacture. According to ATI, two relatively small GPUs can be faster and cheaper to make, whereas Nvidia disagrees, claiming that multi-GPU technologies rely on software optimizations and do not work in all video games. Nevertheless, Nvidia’s previous flagship product, the GeForce 9800 GX2, carried two chips.

After ATI recently recaptured the market of high-end graphics adapters with its ATI Radeon HD 4870 and 4850 X2, Nvidia started to criticize multi-GPU boards again; however, it did not rule out the possibility of launching something similar in the future to recapture the premium market segment.

“We have got nothing against GX2s and recently, we just had another GX2 with the 9800 GX2. It has its advantages and its disadvantages, so I don’t know that there’s any particular philosophical approach that we take here. We just have to look at the market and build the right product. […] We think our approach is the right approach. The best approach is to do both. If we could offer a single chip solution at $399, it certainly does not preclude us from building a two-chip solution at something higher,” said Jen-Hsun Huang, chief executive of Nvidia, during a conference call with financial analysts.

But it will be tough for Nvidia to create a dual-chip graphics board using the code-named G200 GPU: the GeForce GTX 280 chip consumes about 180W, and two such processors would consume about 360W, an amount that far exceeds the actual power consumption of an office personal computer. Even the cut-down version of the G200, the GTX 260, consumes about 136W, which is only slightly more than the ATI Radeon HD 4870, which is manufactured using a 55nm fabrication process. However, the G200 is considerably more expensive to manufacture due to its less advanced 65nm process technology, which is one reason why Nvidia cannot offer its top-of-the-range single-chip solution for $399, as Mr. Huang suggested.

Still, Nvidia does not miss the opportunity to advertise the multi-GPU capability of its single-chip graphics cards, which can offer performance similar to that of dual-GPU graphics cards from ATI/AMD.

“Is that [ATI Radeon HD 4870 X2] product the best product you can buy for a PC? A lot of people care about power consumption. Two GTX 260s have lower power consumption than one 4870 X2. In the Asia-Pacific region, for example, they’re sensitive about power consumption,” said Bryan Del Rizzo, an Nvidia spokesman, in an interview with the Hexus.net web-site.

In fact, according to measurements by X-bit labs, the Nvidia GeForce GTX 260 consumes 136W (which means no less than 272W for two graphics cards, not counting the power consumption of Nvidia’s high-end chipsets that are compulsory for operation of Nvidia SLI multi-GPU technology), whereas the ATI Radeon HD 4870 X2 consumes around 280W, only slightly more than the competing tandem. Besides, the GeForce GTX 260 costs $299, or about $600 for a pair that also requires an expensive Nvidia nForce SLI-based mainboard, while the ATI Radeon HD 4870 X2 retails for $549 and can operate on any platform.
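As a rough illustration only, the trade-off can be tallied with the figures quoted above (a back-of-the-envelope sketch; it deliberately leaves out the SLI chipset overhead and mainboard cost, for which the article gives no exact numbers):

    # Back-of-the-envelope tally of the two configurations discussed above.
    # All figures are those quoted in the article; SLI chipset power is omitted.
    setups = {
        "2x GeForce GTX 260 (SLI)": {"power_w": 2 * 136, "price_usd": 2 * 299},
        "1x Radeon HD 4870 X2":     {"power_w": 280,     "price_usd": 549},
    }

    for name, figures in setups.items():
        print(f"{name}: ~{figures['power_w']} W, ~${figures['price_usd']}")

On card power alone the GTX 260 pair draws marginally less than the single Radeon HD 4870 X2, but it costs more and ties the buyer to an Nvidia SLI platform, which is the trade-off described above.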

Potentially, Nvidia will be able to create a high-performance, up-to-date dual-GPU graphics card using the 55nm version of the code-named G200 chip, which is expected to arrive later this year, according to Mr. Huang, who recently said that by the end of fiscal year 2009 all of the company’s graphics processors would be made using 55nm process technology. However, pricing of such a card may be exceptionally high, since the 55nm technology does not promise considerable cost reductions.

“Hopefully, by the end of the year, we should be top-to-bottom 55nm,” Mr. Huang said.
