Intel Corp. said on Friday that, due to continuing delays and development problems, its first-generation Larrabee graphics processor has been cancelled. As a result, the company will not release its first discrete graphics processing unit (GPU) next year, as promised. It remains to be seen whether Intel will ever release a Larrabee graphics processor at all.

“Larrabee silicon and software development are behind where we hoped to be at this point in the project. As a result, our first Larrabee product will not be launched as a standalone discrete graphics product. Rather, it will be used as a software development platform for internal and external use,” said Nick Knupffer, a spokesman for Intel.

The Larrabee graphics processor had been delayed for years, so it is hardly a surprise that Intel decided to cancel the initial chip. It is more than likely that the product’s actual performance in video games was considerably lower than the company had hoped and lagged behind ATI Radeon HD 5800-series and Nvidia GeForce “Fermi” GF100 graphics cards.

Back in September 2009 Intel showed off working Larrabee graphics processors for the first time. The company claimed that Larrabee’s main advantage was the ability to program the whole rendering pipeline, something that is not possible even on the latest DirectX 11 graphics processing units. Unfortunately, Intel could not demonstrate any tangible advantages of Larrabee back then: during the chip’s first public showcase it chose to show a rather outdated ray-tracing demo instead of a modern video game.

Intel’s Larrabee used multiple in-order x86 CPU cores augmented by a wide vector processing unit, as well as fixed-function co-processors, most likely texture addressing units, texture filtering units, render back-ends and other necessary parts of a modern graphics processing unit (GPU). According to Intel, incorporating in-order Atom-like x86 cores into a massively parallel stream processing unit provided “dramatically higher performance per watt and per unit of area than out-of-order CPUs on highly parallel workloads” and also “greatly increased the flexibility and programmability” of the architecture compared to standard GPUs.

Intel had been stressing for about two years that Larrabee could be programmed the same way as a central processing unit. That indisputably added flexibility, but it was likely to be exploited only by a handful of programmers, simply because modern video game consoles – for which the vast majority of games are developed and from which they are ported to personal computers – cannot offer the same functionality as Larrabee.

After years of talking about Larrabee and the importance of the x86 instruction set on the graphics market, it looks like Intel wants to take a break and reconsider its approach to graphics processing. It remains to be seen whether the company’s next Larrabee will still support x86 and sacrifice certain special-purpose functions in favour of ultimate programmability at the cost of performance.

“We remain committed to delivering world-class many-core graphics products to our customers,” added Mr. Knupffer.

Tags: Intel, Larrabee, GPGPU, 32nm


