Intel Corp. does not currently see many advantages in building a memory controller into its central processing units (CPUs), as the firm has managed to increase performance without such integration in its forthcoming Core 2 processors. Nevertheless, the company says it would integrate the appropriate circuits into its CPUs in the future, and it could even add graphics capabilities to its chips.
Speaking at a conference on June 13, 2006, Intel admitted that a built-in memory controller helps to reduce memory access latencies and eliminates the memory controller hub as an additional component. But the world’s largest maker of microprocessors defended its current stance, saying that by incorporating larger caches – on-chip memory pools – it can reduce the impact of memory latency. At the same time, building a memory controller into the processor increases the die size and power consumption of the CPU and reduces flexibility in supported memory types.
Nevertheless, according to the eWeek web-site, Mr. Bhandarkar indicated that his company “probably” would “put the memory controller on the chip at some point”, though without giving any details or timeframes. In the same vein, Intel also “is looking” at integrating the graphics controller with the processor, but again, no timetable has been set.
Rumours about Intel’s plans to incorporate a memory controller into its processors have been floating around for a couple of years now, even though no one knows which future chip from Intel will get the feature. At the same time, this is the first time the company has spoken about building a graphics core – which would obviously increase the die size, transistor count and power consumption of the CPU – into its microprocessors since the failure of the project code-named Timna, which combined an Intel Pentium III core, a memory controller and a graphics controller on a single chip.
Obviously, a project that revives an idea that is about eight years old may be targeted primarily at low-end personal computers. For example, it could enable very affordable systems for developing countries. However, such an all-in-one chip would hardly boast support for the latest technologies.