

Intel’s code-named Nehalem processors have been projected to tangibly change Intel-based personal computer (PC) architecture thanks to a new micro-architecture and a built-in memory controller. But the actual transformation of the Intel PC platform may be even more dramatic, as the forthcoming chips will also feature a built-in graphics core and PCI Express support.

The first member of the Nehalem processor family will be the chip code-named Bloomfield, aimed at extreme desktops/workstations, high-end desktops and servers, as well as some other rather expensive systems. The central processing unit (CPU) will have a built-in triple-channel memory controller, will use the QuickPath Interconnect (QPI) bus to connect to other chips within the system and will utilize the LGA1366 form-factor. In general, Bloomfield-based PC platforms will resemble the traditional AMD64 system architecture with a CPU, North Bridge and I/O controller.
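The headline benefit of the triple-channel controller is raw memory bandwidth, and the gain over a dual-channel design is simple to estimate. A minimal sketch of the arithmetic, assuming DDR3-1066 memory (a plausible speed grade; the exact memory speeds Nehalem will support were not confirmed at the time, and the helper name is hypothetical):

```python
# Theoretical peak memory bandwidth: channels x transfer rate x bus width.
# DDR3-1066 performs 1066 million transfers/s over a 64-bit (8-byte) bus.

def peak_bandwidth_gbs(channels, transfers_per_sec, bus_bytes=8):
    """Theoretical peak bandwidth in GB/s (decimal gigabytes)."""
    return channels * transfers_per_sec * bus_bytes / 1e9

ddr3_1066 = 1066e6  # transfers per second, per channel

print(f"dual-channel:   {peak_bandwidth_gbs(2, ddr3_1066):.1f} GB/s")  # ~17.1
print(f"triple-channel: {peak_bandwidth_gbs(3, ddr3_1066):.1f} GB/s")  # ~25.6
```

These are theoretical peaks; sustained bandwidth depends on the controller's efficiency, which is exactly where an on-die controller is expected to help.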

But the microprocessors from the Nehalem family aimed at the mainstream market – code-named Lynnfield and Havendale – will differ substantially from Bloomfield, which will catalyze dramatic changes to PC architecture going forward, reports the PC Watch web-site.

Intel’s Lynnfield processor is a monolithic quad-core microprocessor based on the Nehalem micro-architecture in the LGA1160 form-factor with a dual-channel DDR3 memory controller as well as a PCI Express 2.0 x16 interface to connect add-on graphics cards.
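For scale, the per-direction bandwidth of that on-die PCI Express 2.0 x16 link can be estimated from the standard PCIe 2.0 signalling figures (a minimal sketch; the helper name is an illustration, not an Intel API):

```python
# PCI Express 2.0: 5 GT/s per lane with 8b/10b line coding,
# so only 8 of every 10 transferred bits carry payload data.

def pcie2_bandwidth_gbs(lanes):
    """Theoretical per-direction bandwidth of a PCIe 2.0 link in GB/s."""
    transfers = 5e9         # transfers per second, per lane
    data_fraction = 8 / 10  # 8b/10b encoding overhead
    data_bits = lanes * transfers * data_fraction
    return data_bits / 8 / 1e9  # bits -> bytes -> GB

print(f"x16 link: {pcie2_bandwidth_gbs(16):.0f} GB/s per direction")  # 8
```

Moving this 8 GB/s-per-direction link onto the CPU package removes one hop between the graphics card and the memory controller.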

Intel’s Havendale processor is a multi-chip module (MCM) in the LGA1160 form-factor containing a dual-core CPU based on the Nehalem micro-architecture as well as a graphics and memory controller hub (GMCH) that features a dual-channel DDR3 memory controller, a PCI Express 2.0 x16 interface to connect add-on graphics cards and an integrated graphics core. It is projected that both chips on the MCM will be made using 45nm process technology.

Since both Lynnfield and Havendale have the memory controller as well as the PCI Express interconnection inside, there will be no need for a GMCH (or North Bridge) on the mainboard. Instead, the new processors will connect directly to the platform controller hub (PCH) code-named Ibex Peak, which will carry the hard drive controller, wired and wireless network controllers, monitor physical interfaces, the PCI controller and other input/output as well as platform-related capabilities.

While today’s mainstream personal computers usually employ three chips that provide the core functionality of the system – CPU, (G)MCH and I/O controller – in the Nehalem era mainstream systems will be based on only two chips: CPU and PCH. Both Lynnfield and Havendale are projected to emerge in the first half of 2009.

Intel officials did not comment on the news-story.


Comments currently: 26
Discussion started: 11/29/07 06:55:35 AM
Latest comment: 11/30/07 11:43:08 AM


>yah... these chips will surely get AMD pissing in their pants!

Only if all AMD stuff falls into a coma. Also, if you know Intel – and you don’t – it will probably be nothing special.
0 0 [Posted by:  | Date: 11/29/07 08:28:07 AM]

AMD is way too slow to make tweaks and enhancements to their products... take for example the A64 and X2... it took them years (almost 5 years, give or take) to ramp up speed from 2GHz to 3GHz...

with the Phenom design still buggy... and their 65nm SOI isn't working as they expected... it may take a year or two to see the actual potential of the Phenom... even AMD's 45nm process will definitely be delayed at the rate they are progressing...

Nehalem, on the other hand... will still get its roots from the Core 2 architecture, except for the integrated memory controller... and Core 2 is showing great potential at 45nm...

if Intel gets their memory controller right... it will blow away AMD's advantage in most memory bandwidth benchmarks... and it might level the playing field on the server side...

but still, the Core 2-based Xeons are already blowing smoke up the new Opterons' a$$....
0 0 [Posted by:  | Date: 11/29/07 10:32:39 AM]
The K8 core is very old, and a new processor beating it is to be expected. People need to understand that old hardware will be slower than new hardware. Also, AMD is spreading their resources too thin, and either their employees are assholes or the executives are providing a bad work environment for them.

AMD goes for the performance rating rather than increasing the clock. Increasing the clock just adds more heat.

Sure, a smaller process node does better with clock speed, but you do not see faster-clocked processors at each smaller scale that the processor manufacturers convert to. Intel rarely optimizes their processors for each scale. They just keep going smaller.
0 0 [Posted by:  | Date: 11/29/07 11:16:59 AM]
quote: AMD goes for the performance rating rather than increasing the clock.

that is crap... clock speed is still a major factor in chip performance... AMD goes for performance rating?!? the Phenoms have been benchmarked and they are barely better than the A64 X2s clock for clock... you call that performance rating?!? It's actually called pathetic...
0 0 [Posted by:  | Date: 11/29/07 07:52:32 PM]
Say what you want to say. These days it is all about performance ratings, not clock.
0 0 [Posted by:  | Date: 11/30/07 03:35:50 AM]

"Both Lynnfield and Havendale are projected to emerge in the first half of 2009."

Er..... I think that's wrong.... Don't they mean the first half of 2008?
If I remember Intel's roadmap, the first Nehalems come next summer, then the 32nm parts come in early 2009.

Good article, but a little too technical for average dummies like me to understand. Just say it---- These chips rock!
0 0 [Posted by:  | Date: 11/29/07 10:33:15 AM]

As always, Intel provides a lot of acronyms instead of just saying System-on-Chip. Also, they add their proprietary connection to external devices, which is closed. HyperTransport is a lot better and faster, and it is open. Intel including a graphics controller will introduce more problems such as heat. In the past, Intel graphics has lacked reliability in 3D applications and other operating systems. They have also cut corners with limited resolutions. As with all Intel products, it takes two tries to get things just right.
0 0 [Posted by:  | Date: 11/29/07 11:39:48 AM]

"Intel including a graphics controller will introduce more problems such as heat."

What kind of crack are you using?!? since when did a system with integrated graphics ever get hotter than a system with a discrete graphics card?!?

the answer, idiot, is never!!! most laptops even ship with built-in graphics because it requires so little power! Integrated graphics wasn't designed for intensive 3D applications! Even ATI's solutions for cheap AMD laptops (like the X1250) actually suck!!!

besides, who in their right mind runs a 3D application on a low-end integrated solution when it's known that it can't handle it?!? Oh... yah... that would be only you!!
0 0 [Posted by:  | Date: 11/29/07 08:00:59 PM]
I do not use integrated graphics. I use discrete graphics in my notebook. People are jealous that it can get 5 to 6 hours on battery. Though both ATI and nVidia provide better integrated graphics than Intel has ever had to offer.

There are a lot of idiots who still run 3D on an integrated graphics chipset. Yes, they get hot. You need to go to a real reviewer instead of some lame game reviewer that just came out of grade school.
0 0 [Posted by:  | Date: 11/30/07 03:49:49 AM]

So far we have heard nothing of Intel's success with an integrated memory controller in the chip, which they tried long ago and failed at. Anyway, let's say they'll succeed this time; shall we also expect them to succeed in:

[1] going triple-channel
[2] implementing graphics chips?!
[3] integrating PCI-E controllers?!
[4] going native quad-core instead of two duals stuck on the same chip?!
[5] performing as expected
[6] hitting no frequency limit despite the apparent increase in transistor count and the complexity of such an architecture.

all within 1.5 years max?!!!!! that's just crazy talk IMO, though I would really LOVE to be proven wrong.

Another thing: how much will a high-end motherboard cost?! $50?! that'll be interesting to see.

Another thing: I hate the part where high-end chips support triple-channel while mid-range/entry-level chips are going dual-channel.
0 0 [Posted by:  | Date: 11/29/07 10:47:38 PM]

@I'm an AMD Fanboy and I'm crap!
Good for you.

Back on Topic:
Actually, I also believe this is too big to be done within 1 year or even 1.5 years, though only time will tell.

If this is done, overcoming the "this is just too big to be even accepted" points mentioned, I think this might become the greatest year in the history of microprocessors EVA

0 0 [Posted by:  | Date: 11/30/07 05:08:02 AM]

dude... have you been reading the news from a few months ago???

Nehalem has already taped out!!! that news was last September.

Taped out – it means the design has been finalized and sent to the fabs, so that engineering samples and prototypes can be built for validation and testing. After validation and testing, the foundries are fitted out for production...
0 0 [Posted by:  | Date: 11/30/07 07:47:51 AM]

I read that, but apparently I totally forgot about it... my bad

though that leaves us waiting for:
- integrated graphics
- PCI-E 2.0 support
0 0 [Posted by:  | Date: 11/30/07 10:22:59 AM]

