News
 


Although Intel Corp.’s code-named Larrabee discrete graphics processing unit (GPU) largely remains a mystery, Nvidia Corp. continues to criticize Intel’s technology and to predict its commercial failure. According to Nvidia, x86 instruction set compatibility will do little to benefit Larrabee and will also prevent Intel from creating a truly high-performance graphics chip.

“Intel’s chip is lugging along this x86 instruction set, and there is a tax you have to pay for that. I think their argument is mostly a marketing thing,” said William Dally, chief scientist at Nvidia, in an interview with the New York Times newspaper.

Intel claims that compatibility with x86 instructions will make Larrabee better suited to so-called general-purpose processing on graphics processing units (GPGPU), including calculations of physics effects, artificial intelligence and so on. However, since x86 processing cores are physically larger than the proprietary cores inside ATI Radeon or Nvidia GeForce GPUs, fewer of them will fit on a die of a given size. As a result, Larrabee’s total horsepower may trail that of the traditional GPUs it will compete against.
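The area argument above can be sketched with simple arithmetic. All of the numbers below are purely hypothetical (they are not real die or core sizes for Larrabee, GeForce or Radeon); they only illustrate why a larger per-core area means fewer cores on the same die:

```python
def cores_that_fit(die_area_mm2, core_area_mm2):
    """Return how many cores of a given size fit in a fixed die area."""
    return int(die_area_mm2 // core_area_mm2)

# Hypothetical figures, for illustration only.
DIE_AREA = 400.0              # total die budget, mm^2
X86_CORE_AREA = 10.0          # an x86 core: decoders, legacy support add area
PROPRIETARY_CORE_AREA = 4.0   # a streamlined graphics-only core

x86_cores = cores_that_fit(DIE_AREA, X86_CORE_AREA)
gpu_cores = cores_that_fit(DIE_AREA, PROPRIETARY_CORE_AREA)

print(f"x86-compatible cores per die: {x86_cores}")    # 40
print(f"proprietary cores per die:    {gpu_cores}")    # 100
```

With these made-up numbers, the proprietary design packs 2.5 times as many cores into the same silicon budget, which is the trade-off the article describes: x86 compatibility is paid for in die area, and therefore in raw parallel throughput.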

It is interesting to note that ATI, the graphics products group of Advanced Micro Devices (whose primary business is microprocessors), also sees few benefits of x86 for graphics. According to the company, it currently makes sense to continue using proprietary cores that are not compatible with x86 instructions.

Intel did not respond to the comments issued by Nvidia.

Tags: Intel, Larrabee, GPGPU, Nvidia, GeForce, ATI, Radeon, AMD, x86

Discussion

Comments currently: 2
Discussion started: 04/13/09 10:25:36 PM
Latest comment: 04/13/09 11:14:14 PM


1. 
Only time will tell. But remember: if Larrabee is slower but still close to the Nvidia/ATI GPUs, its success could be guaranteed, since, being in fact an x86 GPU, theoretically all existing applications could take advantage of the huge extra power. Think about it.
[Posted by: TAViX  | Date: 04/13/09 10:25:36 PM]

2. 
I don't care if it's Nvidia casting doubt on Intel's architecture or Intel hyping up its own products. Until I see third-party benchmarks, these engineers, CEOs, etc. are just blowing hot air.
[Posted by: phatboye  | Date: 04/13/09 11:14:14 PM]

