

A high-ranking executive of Intel Corp. said in an interview that the vast majority of customers hardly need high-performance graphics. While it is clear that far from everyone plays video games, the claim is still reminiscent of the famous remark about 640 kilobytes of RAM being enough for everyone.

“When people think about graphics, they think about 3D war games and more realism. I'm not going to dismiss this, but (this market) attracts a relatively small amount of people. I think what a significant portion of consumers really want is media,” said Intel executive vice president Dadi Perlmutter, in an interview with the CNET web-site.

The media is really what the vast majority of consumers want and need. They need decoding of highly-demanding Blu-ray (MPEG4-AVC, MPEG4-MVC, VC1) streams along with advanced playback of Adobe Flash 10.1 video, which is something Intel’s integrated graphics cannot provide. The biggest chipmaker admits that its chips do not offer certain multimedia functionality; however, Intel does not seem to consider this a major drawback.

“To be fair, in the past few years, other than this year, AMD with ATI had a better integrated graphics solution than Intel,” said Mr. Perlmutter.

One man allegedly said around thirty years ago that 640KB of random access memory would be enough for everyone. Today, even 4GB of RAM may not be enough, but Bill Gates is the richest man on the planet.

Tags: Intel, Larrabee, AMD, ATI, Radeon, Nvidia, Geforce, GPGPU


Comments currently: 11
Discussion started: 03/07/10 04:59:48 PM
Latest comment: 03/10/10 09:53:16 AM


He's right.

This doesn't mean that there is a limit on graphics. As long as Moore's law continues, graphics power will rise approximately along with it.

At the current point in time, the main purpose of discrete cards is graphically intensive games. Even for games, better graphics yield very little improvement in quality; the important thing is the idea and its implementation, not realism in general.

However, as GPGPU becomes more and more widespread, we'll see whether integrated graphics can cope or whether such applications will often benefit from discrete cards.
[Posted by: CSMR | Date: 03/07/10 04:59:48 PM]

He's right indeed, and I'd even go further: not only do pretty much only gamers need graphics cards, but GPGPU is a desperate attempt by GPU makers to become relevant for other uses. Let's face it: more than two cores are useless for 95% of users, and so are graphics cards, and so is GPGPU.

I'm even thinking of replacing my desktop PC with an ARM-based Plug (or EM0501), because I feel bad about wasting all those CPU cycles.
[Posted by: obarthelemy | Date: 03/07/10 05:43:10 PM]

Alright, time for me to play devil's advocate. Hopefully I can voice some of the thoughts of readers who disagree with the two posters above.

I would argue that, the way things are going, floating-point operations are, and will continue to be, of immense value moving forward. A GPU has these in spades, while a CPU does not. As we approach the future of Web 3.0 and 1080p streaming video for things even as trivial as homemade YouTube videos, not to mention other in-browser applications, this capability will be needed by anyone wanting more than a single tab open or to experience that future.

Add to this that even Joe Six-pack would like to be able to throw in a new game without worrying whether it will run at his display's native resolution. It should JUST WORK. I would argue that today's native resolution for the masses is 1366x768, something Intel graphics simply cannot handle with their current transistor budget. I believe there is a strong argument to be made for culling the lowest tier (64-bit bus) of graphics cards and incorporating them onto the CPU die. There is no reason this couldn't or shouldn't take place.

I think AMD has precisely the right idea with Fusion. With the stagnation and questionable usefulness for the masses of anything above four cores, something useful should be done with the extra space afforded by smaller process technologies. I completely understand that many may disagree, but one cannot deny that raising the lowest common denominator not only makes life more enjoyable for the masses, but spurs innovation while creating a better base experience for those who choose to opt for more.
[Posted by: turtle | Date: 03/07/10 07:09:07 PM]
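The "GPU has floating-point operations in spades" point above can be put in rough numbers with a peak-throughput back-of-the-envelope calculation. The hardware figures below (a quad-core 3.0 GHz CPU with 4-wide SSE, and a 1600-ALU 850 MHz GPU) are illustrative assumptions for circa-2010 parts, not specifics from the article or the comment:

```python
def peak_gflops(units, clock_ghz, flops_per_cycle):
    """Theoretical single-precision peak: units x clock x FLOPs per cycle."""
    return units * clock_ghz * flops_per_cycle

# Quad-core desktop CPU: 4 cores, 3.0 GHz,
# 8 SP FLOPs/cycle (4-wide SSE multiply + add)
cpu = peak_gflops(4, 3.0, 8)        # 96 GFLOPS

# High-end GPU: 1600 stream processors, 0.85 GHz,
# fused multiply-add counted as 2 FLOPs/cycle
gpu = peak_gflops(1600, 0.85, 2)    # 2720 GFLOPS, i.e. ~2.7 TFLOPS

print(f"CPU: {cpu:.0f} GFLOPS, GPU: {gpu:.0f} GFLOPS, ratio: {gpu / cpu:.1f}x")
```

These are theoretical peaks that real workloads never reach, but the order-of-magnitude gap is what makes GPUs attractive for floating-point-heavy tasks.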

It's oft-repeated, but Gates never actually said that.


QUESTION: "I read in a newspaper that in 1981 you said '640K of memory should be enough for anybody.' What did you mean when you said this?"

ANSWER: "I've said some stupid things and some wrong things, but not that. No one involved in computers would ever say that a certain amount of memory is enough for all time."

Gates goes on a bit about 16-bit computers and megabytes of logical address space, but the kid's question (will this boy never work at Microsoft?) clearly rankled the billionaire visionary.

"Meanwhile, I keep bumping into that silly quotation attributed to me that says 640K of memory is enough. There's never a citation; the quotation just floats like a rumor, repeated again and again."
[Posted by: bluvg | Date: 03/07/10 10:28:21 PM]

We're not disagreeing: Fusion is not discrete graphics. Intel's integrated graphics, currently "Intel HD Graphics", takes the same approach and is becoming quite good.
Integrated graphics can support high resolutions (including the 4+ megapixels common on 30" displays) and more than one monitor. The problem again is games; maybe they need to become more resolution-independent.
[Posted by: CSMR | Date: 03/08/10 06:09:01 AM]

I think GPU capabilities for processing complex tasks will become ever more important. We are seeing the tip of the iceberg here, and it's HUGE.
[Posted by: beck2448 | Date: 03/08/10 10:04:00 AM]

I'd like to present this scenario to people at Intel:
1. People stop buying high-end GPUs for games.
2. If you don't play games on a PC, you also don't need a high-end (expensive) CPU.
3. For everything else you can buy cheaper but good-enough AMD CPUs.

[Posted by: zaratustra06 | Date: 03/08/10 02:26:47 PM]

"The media is really what the vast majority of consumers want and need. They need decoding of highly-demanding Blu-ray (MPEG4-AVC, MPEG4-MVC, VC1) streams along with advanced playback of Adobe Flash 10.1"

What a bunch of carcinogenic, E. coli-polluted baloney this guy serves. It resembles all that ray-tracing "vivid pictures" hype they pushed as the next big thing that could only be done with their software pipeline... and in fact all those fuzzy, furry things that RT bloggers hailed as the future in late 2007 we could already enjoy back in 2002, with the R300 aka Radeon 9700 inside :wink:

They sell whatever crap they want... and all we can do is obey. Thank god they still haven't delivered that software-optimized-rendering kind of crap. It's just another fishy PhysX war on the horizon. And to think some of us believed the physics-emulation hype would bring us something good back in 2005.

And btw, we already have all that media-decoding capability in our current chipsets if we use AMD/Nvidia, while the brilliant Intel IGP is way behind, wasting CPU cycles on "GPU-enhanced" media decoding. So x264 is a drain on resources only with GMA crap inside; the 2006 GMA 965 claimed 720p decoding... well, my 4-year-old Athlon XP 2600+ was capable of the same at the time (only one poor 2GHz core and single-channel DDR333 memory).
[Posted by: OmegaHuman | Date: 03/09/10 04:00:24 AM]

Sure, and no one needs more than 640KB of RAM. The add-on GPU will be around for many years to come for gaming, CAD, publishing, and so MUCH more, so when Intel makes a foolish statement like this, if I were a stockholder I would be questioning the value of my investment in this company. Microsoft is a bottom-line company, and everyone knows how the quality of its products has declined over the past decade in terms of cleaning out bugs prior to launch, while Apple ditched one advantage it had over the PC: the PowerPC, which had a stronger FPU and more efficient pipeline stages. Sure, the price was awful on those Power Macs, but you got what you paid for, and the Cube was and still is cute, even for me as a PC user (still want one). Plus, I don't see any single-die CPUs out there that can do 1-2 TFLOPS.
[Posted by: nforce4max | Date: 03/10/10 04:57:18 AM]

Wait, people don't need high-end graphics? Interesting. Then why was Intel trying to create Larrabee? When it comes down to it, a high-end GPU is about as needed as a high-end CPU, and really both are overkill for the average PC user.
[Posted by: knowom | Date: 03/10/10 08:24:27 AM]

Interesting. Well, it's true: SUPER GPUs are useless, but so are 90% of modern desktop computer technologies. AMD Athlon II X2/X3/X4 chips are overpowered, and the entire Intel Nehalem family is simply overkill. DDR 400 is still out there today and does the job, DDR2 800/1066 is more than enough, and DDR3 is just unnecessary. The truth is that a Sempron 140 with 1GB of DDR2, a 780G or a GeForce 8200, and at least a 500GB HDD is enough for 70% of the "normal" audience; Intel Core 2 Duo and AMD Athlon II X2 are already more than they need. The rest is only for enthusiasts and benchmark addicts; and I'm in that group.

That's the truth about the desktop sector. Workstations need more cores, more memory, more hard disks. Servers are a universe apart.

Dear Intel (and the same goes for Microsoft/Apple/AMD/Nvidia and Co.), don't be too pragmatic, please.
[Posted by: Akula | Date: 03/10/10 09:53:16 AM]


