High-ranking executives of Nvidia Corp. said during various interviews that even though the company does not currently have an up-to-date graphics card carrying two graphics processing units (GPUs), it does not rule out that its flagship product will again feature two chips. The reason why Nvidia did not produce a multi-chip graphics board based on its latest GPU is that the chip is exceptionally expensive and consumes too much power.

In recent years ATI, the graphics product group of Advanced Micro Devices, has concentrated on improving its multi-GPU technology to address the market of ultra high-end graphics sub-systems either with several graphics cards or with dual-chip graphics boards. Meanwhile, Nvidia Corp. decided to focus on large monolithic graphics chips, which are quite hard to develop and manufacture. According to ATI, two relatively small GPUs can be faster and cheaper to make, whereas Nvidia disagrees, claiming that multi-GPU technologies rely on software optimizations and do not work in all video games. Nevertheless, Nvidia’s previous flagship product was the Nvidia GeForce 9800 GX2, which carried two chips.

After ATI recently recaptured the market of high-end graphics adapters with its ATI Radeon HD 4870 and 4850 X2 boards, Nvidia started to criticize multi-GPU boards again; however, it did not rule out the possibility of launching something similar in the future to recapture the premium market segment.

“We have got nothing against GX2s and recently, we just had another GX2 with the 9800 GX2. It has its advantages and its disadvantages, so I don’t know that there’s any particular philosophical approach that we take here. We just have to look at the market and build the right product. […] We think our approach is the right approach. The best approach is to do both. If we could offer a single chip solution at $399, it certainly does not preclude us from building a two-chip solution at something higher,” said Jen-Hsun Huang, chief executive of Nvidia, during a conference call with financial analysts.

But it will be tough for Nvidia to create a dual-chip graphics board using the code-named G200 GPU: the GeForce GTX 280 chip consumes about 180W, and two such processors would consume about 360W, an amount that far exceeds the actual power consumption of an office personal computer. Even the cut-down version of the G200, the GeForce GTX 260, consumes about 136W, which is just slightly more than the ATI Radeon HD 4870 that is manufactured using a 55nm fabrication process. However, the G200 is considerably more expensive to manufacture due to its less advanced 65nm process technology, which is one reason why Nvidia cannot offer its top-of-the-range single-chip solution for $399, as Mr. Huang suggested.

Still, Nvidia does not miss an opportunity to advertise the multi-GPU capability of its single-chip graphics cards, which in pairs can offer performance similar to dual-GPU graphics cards from ATI/AMD.

“Is that [ATI Radeon HD 4870 X2] product the best product you can buy for a PC? A lot of people care about power consumption. Two GTX 260s have lower power consumption than one 4870 X2. In the Asia-Pacific region, for example, they’re sensitive about power consumption,” said Bryan Del Rizzo, an Nvidia spokesman, in an interview with the Hexus.net website.

In fact, according to X-bit labs’ measurements, the Nvidia GeForce GTX 260 consumes 136W (which means no less than 272W for two graphics cards, not counting the power consumption of Nvidia’s high-end chipsets that are compulsory for the operation of Nvidia’s SLI multi-GPU technology), whereas the ATI Radeon HD 4870 X2 consumes around 280W, just a little more than the competing tandem. Besides, the GeForce GTX 260 costs $299, or about $600 for a pair that also requires an expensive Nvidia nForce SLI-based mainboard, while the ATI Radeon HD 4870 X2 retails for $549 and can operate on any platform.
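The comparison boils down to simple arithmetic on the figures cited above. The short Python sketch below uses only those numbers (136W and $299 per GeForce GTX 260, 280W and $549 for the Radeon HD 4870 X2) and, as an assumption, ignores the extra power draw and cost of the SLI-capable mainboard mentioned in the article.

# Back-of-the-envelope comparison using only the figures cited in the article.
# Assumption: chipset/mainboard power and cost for SLI are left out.

gtx260 = {"power_w": 136, "price_usd": 299}    # single GeForce GTX 260
hd4870x2 = {"power_w": 280, "price_usd": 549}  # ATI Radeon HD 4870 X2

sli_power = 2 * gtx260["power_w"]              # 272W for two cards in SLI
sli_price = 2 * gtx260["price_usd"]            # about $600 for the pair

print(f"GTX 260 SLI: {sli_power} W, ${sli_price}")
print(f"HD 4870 X2:  {hd4870x2['power_w']} W, ${hd4870x2['price_usd']}")
print(f"Power delta: {hd4870x2['power_w'] - sli_power} W in favour of the SLI pair")
print(f"Price delta: ${sli_price - hd4870x2['price_usd']} in favour of the X2")

As the output shows, the SLI pair comes out roughly 8W ahead on board power but about $50 behind on price, before the cost of the required nForce SLI mainboard is factored in.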

Potentially, Nvidia will be able to create a high-performance, up-to-date dual-GPU graphics card using the 55nm version of the code-named G200 chip, which is expected to arrive later this year, according to Mr. Huang, who recently said that by the end of fiscal year 2009 all the company’s graphics processors will be made using 55nm process technology. However, pricing of such a card may be exceptionally high, since the 55nm technology does not lead to a considerable reduction of manufacturing cost.

“Hopefully, by the end of the year, we should be top-to-bottom 55nm,” Mr. Huang said.
