Enter the Scene, Intel
Intel Corp., the world’s largest maker of microprocessors, has always been aggressive across markets, seeking leadership in every segment it enters. But the company has also been pragmatic: it left the market for standalone graphics chips in 1999 and essentially exited the flash memory and special-purpose XScale processor businesses in 2006.
But Intel knows that without further development comes stagnation and, according to sources familiar with the matter, it is going to either enter the market for discrete graphics processors or, at the very least, invest heavily in the development of built-in graphics cores to make them far more competitive against discrete solutions.
Intel has been re-creating itself from a chipmaker into a provider of platform solutions for a year and a half now, and it even changed its logo to emphasize that Intel is not a provider of something “Inside”, but everywhere. To highlight the changes and to stress its rapid technology development, the world’s largest producer of microprocessors introduced its “Leap Ahead” motto and campaign early in 2006. In addition, Intel began entering the consumer electronics market with its Viiv platform, which effectively takes the Intel brand into living rooms.
By the end of 2008, Intel may have seven fabs producing chips on 45nm, 65nm, or even more advanced process technologies using 300mm wafers. While Intel has not provided guidance on its manufacturing capacity growth or actual output (thousands of wafers per month), seven fabs is a formidable collection of high-tech manufacturing facilities, one that will allow the Santa Clara, California-based chipmaker to fabricate astonishing quantities of CPUs, core-logic sets, and other chips.
All in all, Intel is trying to sell as many chips per computer as possible: a processor, a core-logic set, a wireless network controller, and so on. Why not produce a discrete graphics processor and enter the rather lucrative market for graphics cards?
Even though nothing is known about Intel’s possible strategy for the discrete GPU market, entering it might allow the company to achieve several goals:
- Load the fabs to the maximum level;
- Sell more chips in general;
- Sell rather lucrative graphics chips;
- Gain experience to create platforms for game consoles;
- Gain experience to create platforms and technologies for handhelds;
- Gain experience to create solutions for consumer electronics;
- Provide its customers with an all-in-one solution consisting of a processor, chipset, wireless network controller, and graphics controller;
- Increase its influence on the PC industry.
What the world’s largest chipmaker – and other companies in the industry – knows for sure is that Intel essentially destroyed the market for discrete audio processors once its integrated audio capabilities reached a level that satisfies casual consumers. While unit demand for discrete graphics processors gradually rises, their market share shrinks, and at some point in the future developers of high-end discrete graphics processors may find themselves in a tough situation: they may sell very advanced chips, yet spend too much on developing the technologies inside them. That means those companies will be balancing between profits and losses, much like today’s makers of exotic sports cars, Bugatti for example.
Certainly, designers of graphics processors may develop their chipsets and chips for other markets – consumer electronics, handhelds, and so on – but this means refocusing their companies and paying less and less attention to the discrete graphics chip business.
The main question right now is whether it makes sense for Intel to enter the discrete GPU market. The answer is yes, provided that Intel can offer a full lineup of discrete processors; if not, the company would only capture very niche markets among its devoted partners. Certainly, if the company launches its discrete GPUs, it will gain the aforementioned advantages. On the other hand, there is always the threat that the market for standalone GPUs will erode: the R&D investments will not vanish, but the profits will be negligible.