

At its Financial Analyst Day event on Thursday, Advanced Micro Devices said that its chips in 2014 will be able to automatically decide which of its heterogeneous cores to use for a particular task. The context switching between x86 and stream processing cores will allow AMD to greatly speed up performance of its chips and optimize their power consumption.

"Ultimately, as we complete our roll-out of heterogeneous system architecture, we will have a very fine-grain control of where is computing [performed], it is an optimization," said Mark Papermaster, chief technology officer of AMD.

At present AMD's accelerated processing units - which integrate x86 general-purpose processing cores and highly-parallel stream processors on a single chip - can offload certain tasks to their Radeon stream processors only when software explicitly targets them. Given the limitations of the current architecture, the x86 and Radeon cores must use dedicated memory pools, which tends to be inefficient. Essentially, today the software completely controls which compute resources are used and when.
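To make the current model concrete, here is a minimal Python sketch (all function names are hypothetical, and plain lists stand in for real device buffers) of how offload works today: the application, not the chip, decides where a task runs, and data must be copied between the separate x86 and Radeon memory pools around every GPU launch.

```python
def run_on_cpu(data):
    # Serial x86 path: fine for small or branch-heavy work.
    return [x * 2 for x in data]

def run_on_gpu(data):
    # Stand-in for an OpenCL-style round trip: copy to GPU memory,
    # run a kernel, copy the result back. The two copies model the
    # overhead of the dedicated memory pools the article mentions.
    gpu_buffer = list(data)                # host -> device copy
    result = [x * 2 for x in gpu_buffer]   # kernel executes
    return list(result)                    # device -> host copy

def process(data, use_gpu):
    # The programmer hard-codes the choice; the hardware has no say.
    return run_on_gpu(data) if use_gpu else run_on_cpu(data)

print(process([1, 2, 3], use_gpu=False))  # [2, 4, 6]
print(process([1, 2, 3], use_gpu=True))   # [2, 4, 6]
```

Both paths produce the same answer; the point is that the `use_gpu` flag is fixed in the application's source code rather than chosen by the hardware or runtime.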

But after several more years of evolution, involving development of hardware as well as software, compilers and tools, accelerated processing units in 2014 will be able to decide dynamically (possibly only for new programs) which type of core is better suited to execute a given task, thanks to new software as well as special features of the chips.

Dynamic context switching between the different types of cores will not only greatly improve the performance of such chips, but will also optimize power consumption, since the most efficient hardware will be used for each operation.
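The dispatch decision described above can be pictured as a cost comparison the runtime makes per task. The sketch below is purely illustrative (the cost figures and unit names are made up, not AMD's actual model): a tiny stream processor launch overhead versus per-element efficiency trade-off decides where each task goes.

```python
# Hypothetical per-unit cost model: (energy per element, fixed launch overhead).
# The Radeon side is cheap per element but pays a large launch cost,
# so small tasks stay on the x86 cores.
COST_MODEL = {
    "x86":    (1.0, 0.0),
    "radeon": (0.2, 50.0),
}

def pick_unit(task_size):
    """Return the unit with the lowest estimated energy for this task."""
    return min(COST_MODEL,
               key=lambda unit: COST_MODEL[unit][0] * task_size
                              + COST_MODEL[unit][1])

print(pick_unit(10))    # x86: launch overhead dominates small tasks
print(pick_unit(1000))  # radeon: per-element efficiency wins at scale
```

With these numbers, a 10-element task costs 10.0 units on x86 versus 52.0 on the stream processors, while a 1000-element task costs 1000.0 versus 250.0, so the dispatcher flips between units exactly as the article describes.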

Earlier AMD expected to release "fully fused" Fusion chips in 2015 or beyond.

Tags: AMD, Fusion, ATI, Radeon, Steamroller, Excavator, Piledriver, Bulldozer, Sea Islands, Southern Islands, 28nm, 20nm


Comments currently: 23
Discussion started: 02/02/12 12:59:20 PM
Latest comment: 02/16/12 09:17:04 AM


What does 'HSA' stand for?
0 0 [Posted by: Prosthetic_Head  | Date: 02/02/12 12:59:20 PM]

HSA is the new name for FSA. FSA was disclosed at the 2011 AFDS.
0 0 [Posted by: sirroman  | Date: 02/02/12 03:40:32 PM]
HSA (Heterogeneous Systems Architecture)
1 0 [Posted by: PsiAmp  | Date: 02/02/12 04:30:49 PM]

Which year in the Gregorian calendar is AMD year 2014?
0 0 [Posted by: uibo  | Date: 02/02/12 01:08:59 PM]

0 0 [Posted by: madooo12  | Date: 02/02/12 09:57:53 PM]
one can only hope...
0 0 [Posted by: uibo  | Date: 02/03/12 08:46:29 AM]

HSA = Heterogeneous Systems Architecture

AnandTech has the complete story posted.
3 5 [Posted by: beenthere  | Date: 02/02/12 02:49:21 PM]

Just a little possum stirring. I refer to the AMD Reality Check at FX GamExperience, AMD vs Intel (see the Legit Reviews article). This showed that even the most basic newer computer is able to surpass the software available today. Most if not all software available is at least 5 years behind the hardware development. The hardware from either CPU manufacturer is so far ahead of the software available that it is near pointless having a high-end system. I am pretty sure my old P3 700 laptop would run most if not all the commercial software, and my old AMD 4200 with 4 gig of DDR2-800 will play most games if I move my HD6770 card to that system.
This is due to lazy programmers or the O/S being way out of date.
1 3 [Posted by: tedstoy  | Date: 02/03/12 03:42:17 AM]

That's one of the reasons AMD isn't adding more cores on the next gen products. The O/S and software can't even effectively use the current core counts except to a degree in servers.

As far as software is concerned, programmers do what they are told to do by management. Most software companies aren't interested in quality software, just increased sales. Whatever is the cheapest means to crank crap out and sell for the highest profit is what software (and hardware) companies typically do.

The AMD Blind Test proved that CPU specifications are far less important than actual system performance, which many people have known for years since AMD intro'ed Athlon.
4 5 [Posted by: beenthere  | Date: 02/03/12 06:46:56 AM]
What are you talking about?

Did you even read that details of that test?

The first system used Sandy Bridge graphics, which is an absolute joke. HD3000 is worthless for any modern 3D games.

Obviously the GPU in Llano would have made that system far faster.

The 2nd test system pitted the FX-8150 vs. the 2700k but paired an HD7970 in an Eyefinity setup. That's essentially a 100% GPU-limited situation, since an HD7970 is not fast enough for modern games in Eyefinity.

I think it's important to understand CPU vs. GPU limitations when doing this testing. If you pair a fast CPU with a slow GPU, of course it's going to be horrible for games (i3 2100 vs. AMD's A8-3850 Llano APU). Similarly, if you pair a very fast CPU with a GPU but increase the GPU workload to levels too ridiculous for a single GPU to handle, you will bring performance so low that the GPU is completely a bottleneck (i.e., a single HD7970 cannot power 3x 1920x1080P displays in modern games at good frames).

What happens when you add 2 fast GPUs for multi-monitor or high-resolution gaming? The bottleneck shifts more to the CPU, and an overclocked FX-8150 gets creamed by an overclocked 2500k.

I can assure you that your AMD 4200+ will fail miserably in modern games even when paired with an HD7970. Even when the E6400 was paired with a low-end GTX460, it became a huge bottleneck in games. And the E6400 is a faster CPU than your AMD 4200+.
2 2 [Posted by: BestJinjo  | Date: 02/04/12 06:30:47 AM]

3 7 [Posted by: beenthere  | Date: 02/04/12 09:55:48 AM]
- collapse thread

I am saying, they could have paired an i5 2400 vs. 2700k and obtained the same results when pairing an HD7970 with Eyefinity.

What happens in CPU limited games like SKYRIM or Starcraft 2, etc?


There isn't a single CPU in the entire new Bulldozer desktop lineup that makes sense. I suppose one can make the argument for AMD based on Phenom II X4 955/960T for multi-tasking and gaming on a budget. However, the entire FX4000-8000 line is a big joke. Why buy FX8120 or FX8150 when 2500k beats them in most situations and consumes far less power? Even i5 2400 is miles better and costs less than both!

Once IVB comes out, 2012 is a complete writeoff for AMD on the desktop.

Don't bring business practices into the equation. AMD used to have amazing products: the Athlon XP (esp. the 2500+ Barton), the Athlon 64 was much better than the Pentium 4, and the Athlon X2 completely embarrassed the Pentium D. I don't want to hear excuses for why or what. AMD makes horrible desktop CPUs at the moment and there is 0 reason to buy one unless you can get it for way cheaper or just want to support the underdog. Performance-wise and power-consumption-wise, they make 0 sense, especially since the FX8000 line costs more than the i5 2400 and loses to it.

The new FX series is even worse than Phenom II. At least Phenom II is priced pretty low. I have no idea what AMD expects to do on April 8th when IVB brings another 20% more performance within 77W TDP power envelope. I think FX8120 would need to drop to $120 to make any sense at that point.

"AMD's products deliver everything they need or want - even in blind tests."

Not true at all. Let's fire up WOW, Starcraft 2, Diablo 3, SKYRIM, GTAIV and see what happens. What happens in Photoshop, or when encoding video for your tablet/smartphone using Quick Sync?

You have been promoting AMD products at Xbitlabs at every opportunity, which really questions your integrity given the obvious bias for all AMD products you have. I would have understood such devotion during Athlon XP --> Athlon 64 X2 era. But now, it's just pure fanboism on your part.

Did you even bother reading Xbitlabs' own review on FX series?

The FX-8150 consumes almost 60W more power and loses to the i5-2500 in just about everything meaningful.

When fully overclocked, FX8150 consumes > 270W of power vs. 5.0ghz 2500k/2600k and loses to them, badly!

Even if a hypothetical user can't tell the difference in performance in GPU-limited games (and, say, doesn't play CPU-limited games at all), they would have to deal with the extra heat dissipation and higher electricity costs. Non-power users would be better served with a dual-core i3 in the first place.

Bulldozer makes no sense at all at current prices. Even if Bulldozer was as fast as Intel's offering, the power consumption differences alone are already a huge negative. But of course its performance is lacklustre and it also costs more!

i5 2400 $189 > FX8120 $199
i5 2500k $229 > FX8150 $269

Why would anyone get the FX series?
2 1 [Posted by: BestJinjo  | Date: 02/04/12 11:23:11 AM]

^^^^ You've missed the point. I'll explain again.

AMD sells products that meet the needs of 90% of PC users. Less than 5% of PC users buy the fastest CPU/GPU available. The masses of PC users are where the money is made and the market that AMD is catering to. That is why AMD's APUs are selling so well and why AMD gained 130 new OEMs with Llano - which is taking laptop market share from Intel. Did I mention that AMD sold 30 MILLION APUs?

Trinity laptop will basically end Intel's Ultrabook sales because Trinity will perform better and cost $200-$400 less.

Both the Phenom II and FX CPUs provide all the performance that 90% of consumers desire and at a very affordable price. That is why people buy these chips. AMD mobos also tend to cost less than Intel mobos so there is additional savings by going with AMD. FX CPUs OC nicely and scale well so that's another reason why people buy AMD. AMD's FX power consumption/scaling control is more advanced than Intel's. AMD CPUs typically use less power real world than Intel's, especially in server apps.

If you believe that Intel is best for you then that is what you should buy. Those who know better and who have a moral compass are voting with their wallets by purchasing AMD products.
4 5 [Posted by: beenthere  | Date: 02/04/12 01:09:41 PM]

(i.e., a single HD7970 cannot power 3x 1920x1080P displays in modern games at good frames)

Good frames is a matter of opinion that no tests or logic can change.
You say the 7970 cannot run modern games in a 3-monitor setup with good frames. I call BS and say it can; you just feel that it is not good enough for your tastes. That does not mean the hardware is bad. The same thing can be said about CPUs as well.
1 0 [Posted by: veli05  | Date: 02/16/12 08:39:11 AM]

If you believe that Intel is best for you then that is what you should buy. Those who know better and who have a moral compass are voting with their wallets by purchasing AMD products.

I completely agree with this statement. I refuse to buy Intel because I refuse to support a company that feels it is ethically acceptable to extort, coerce, and bludgeon their customers as a valid business practice. I couldn't give two s&$^ts if their stuff is faster; AMD's is fast enough for the wallet-conscious consumer.
1 1 [Posted by: veli05  | Date: 02/16/12 09:17:04 AM]

