

The first details about microprocessors based on the code-named Haswell micro-architecture for mainstream desktops and notebooks have emerged on the Internet. Instead of increasing the number of cores inside its microprocessors, Intel Corp. will continue to improve efficiency to boost performance while aggressively lowering the power consumption of its chips.

Intel Haswell microprocessors for mainstream desktops and laptops will be structurally similar to the existing Core i-series "Sandy Bridge" and "Ivy Bridge" chips: they will continue to feature two or four cores with Hyper-Threading technology, along with a graphics adapter that shares the last level cache (LLC) with the processing cores and works with the memory controller via the system agent, according to a slide (which resembles those from Intel) published by the ChipHell website. On the micro-architectural level, however, the chip will be very different: its x86 cores will be based on the brand new Haswell micro-architecture, and its graphics engine, based on the Denlow architecture, will support new features such as DirectX 11.1, OpenGL 3.2+ and so on.

The processors that belong to the Haswell generation will continue to rely on a dual-channel DDR3/DDR3L memory controller, now with DDR power gating support to trim idle power consumption. The chip will have three PCI Express 3.0 controllers, a further-improved Intel Turbo Boost technology, power-aware interrupt routing for power/performance optimizations and other enhancements. Notably, Haswell-generation chips will sport new packages, including LGA 1150 for desktops as well as rPGA and BGA for laptops.

The new processors for mobile applications will continue to have thermal design power between 15W and 57W (15W, 37W, 47W and 57W), from ultra low-voltage up to extreme edition models, while desktop chips will have TDPs in the range between 35W and 95W, just like today. However, in a bid to open the door to various new form-factors, such as ultrabooks, Intel has implemented a number of aggressive measures to trim power consumption even further from Ivy Bridge levels, including power-aware interrupt routing for power/performance optimizations, configurable TDP and LPM, DDR power gating, power optimizer (CPPM) support, idle power improvements, the latest power states, etc.

The most important improvements of Haswell are at the level of the x86 core micro-architecture. It is believed that the new micro-architecture will be substantially different from the current Nehalem/Sandy Bridge generations, which will enable further scalability and performance increases. In addition, Haswell will support numerous new instructions, including AVX2, bit manipulation instructions, FPMA (floating point multiply accumulate) and others. The Denlow graphics core of Haswell will also sport substantially boosted performance and will be certified to run many professional applications.
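To illustrate what the FPMA (floating point multiply accumulate) instructions accelerate, the sketch below shows, in plain Python, the multiply-accumulate reduction at the heart of dot products and matrix math. This is an illustrative sketch, not Intel's implementation: on FMA-capable hardware, each `x * y + acc` step would map to a single fused instruction with one rounding step, instead of separate multiply and add operations.

```python
def dot(xs, ys):
    """Multiply-accumulate reduction: the loop pattern FMA hardware fuses.

    Each iteration computes acc = x * y + acc. A fused multiply-accumulate
    instruction performs the multiply and the add as one operation, which
    roughly doubles peak floating-point throughput for kernels like this.
    """
    acc = 0.0
    for x, y in zip(xs, ys):
        acc = x * y + acc  # one fused multiply-add per element on FMA hardware
    return acc

print(dot([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # 1*4 + 2*5 + 3*6 = 32.0
```

Dense linear algebra, signal processing and 3D transforms are dominated by exactly this pattern, which is why a fused instruction matters for per-watt performance.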

Intel did not comment on the news story.

Tags: Intel, Ivy Bridge, Haswell, Denlow, 22nm, DirectX, OpenGL


Comments currently: 20
Discussion started: 11/10/11 12:58:47 PM
Latest comment: 07/13/16 11:16:11 AM


Hotham 1.0 support?
0 0 [Posted by: bluvg  | Date: 11/10/11 12:58:47 PM]

I won't jump to conclusions; it's better to bench them when they arrive. Soon.
0 0 [Posted by: xentar  | Date: 11/10/11 08:30:39 PM]
Yes, dud, because it's just not possible that 2 people could hold a similar viewpoint that's counter to the hordes of people trying to echo the "popular viewpoint" that the talking heads on the tech news sites told them they should have.

Let's have a look at the server logs, clearly beenthere has an alias, and they're probably both JF-AMD attempting to do Intel-style deceptive PR, at the behest of that wily old President Obama...
0 1 [Posted by: otis_spunkmeyer  | Date: 11/11/11 04:10:11 AM]
TDP is OK, as long as performance improves. Second, there is an improvement over the second-generation Core 2 that I still use today.
0 0 [Posted by: xentar  | Date: 11/10/11 08:22:32 PM]
Adding extra cores is a bad idea if your TDP increases without much of a performance increase.
1 1 [Posted by: xentar  | Date: 11/10/11 08:28:07 PM]
I think the idea is that well-threaded apps will invariably see an increase with more cores, and that the extra cores can be turned off when they're not in use. This is just Intel trying not to give you more cores, as they are running out of ways to increase performance, but they want to save that 8-core upgrade to sell to you later, rather than playing that card now.
1 2 [Posted by: otis_spunkmeyer  | Date: 11/11/11 04:12:20 AM]
You know, with Intel, even if you add cores and turn the extra ones off, it will cost you money. Intel is not putting all its ace cards on the table in one game.
0 1 [Posted by: xentar  | Date: 11/11/11 04:46:20 AM]
Are we playing poker, or are we buying a CPU? I don't care about Intel's strategy, and I don't mind paying a premium for extra cores, but I do mind paying the 500% premium Intel is asking for on hexacores, just on principle. It's a rip-off.
0 1 [Posted by: darth_gayder  | Date: 11/11/11 03:14:21 PM]
The ace card is a metaphor. Good for you, you can buy anything you want, but unlike everybody here you're not a wise buyer.

I would rather buy an octa-core than extra cores or a hexa-core if the task on the octa-core can be finished in 30 seconds and consumes less power than the extra cores or hexa-core would. A 20 to 40 watt difference is a big issue if you are running servers or heavy-task applications.

That's why reviews and benchmarks exist, for comparison. Remember that.
0 0 [Posted by: xentar  | Date: 11/11/11 05:35:42 PM]
In fact, the 15W ULV MCM part is the lowest-power high-performance x86 platform that Intel has developed in the recent decade or more.

Even the Core 2/Montevina SFF platform with a 10W CPU and a 7W+ chipset consumes more.
0 0 [Posted by: Anton  | Date: 11/11/11 03:14:01 PM]
You are suggesting that 15 watts on 22nm is an achievement compared to 17 watts on 65nm, especially considering the abysmal bump in clock speed and the lack of additional cores?
0 0 [Posted by: darth_gayder  | Date: 11/11/11 03:16:41 PM]
15 watts on 22nm is good only if performance increases, which is not possible at 45nm or 65nm: increasing clock speed at 65nm or 45nm results in high power consumption and heating issues.

Try to bench 15 watts at 22nm against 17 watts at 65nm; I'm sure the gap is big.

An example: my AMD Athlon X2 5800 compared to my AMD Athlon II X2 250. The Athlon II X2 250 performs better in my business than its predecessor, the Athlon X2 5800, and I bought it at a lower price.

Haswell, I believe, is for ultra-notebooks and entry-level desktops. Adding extra cores would defeat that purpose. Why? 1) It would increase die size and manufacturing cost, so on the consumer side the price would be higher compared to a four-core Haswell. 2) If you are running an ultra-notebook, you will get less time on battery with extra cores. 3) TDP will increase and heat will be higher too.

0 0 [Posted by: xentar  | Date: 11/11/11 06:53:12 PM]
These chips are designed for mobility; 2 to 4 cores are sufficient for most workloads these days. Additional cores would only yield diminishing returns in performance relative to battery life.
1 1 [Posted by: dudde  | Date: 11/12/11 06:52:33 PM]
I 100% agree. Still, only 4 cores in laptop Haswell chips is stupid, and with TDP not being any lower than Sandy Bridge! If only AMD could kick Intel's butt, then maybe we'd get an improvement.
One good thing about this is that my old 3GHz Penryn laptop won't become outdated quite so quickly.
0 0 [Posted by: danwat1234  | Date: 01/14/12 08:42:32 PM]

Hi, I'm reading a technology site and yet I still make stupid assumptions that clockspeed, TDP and number of cores are all that matters. I do not consider IPC, idle power, average power, or benchmark results that prove my claim of "0-10%" performance improvements to be BS!
0 0 [Posted by: Dansolo  | Date: 11/23/11 12:45:49 PM]

I have my money saved up. I skipped the first- and second-gen Core i series and stuck with my good old Core 2 Duo until I needed an upgrade, and now is around that time, after sticking with my Core 2 Duo for 4 years.
0 0 [Posted by: SteelCity1981  | Date: 12/01/11 06:15:59 AM]


