

The performance that Advanced Micro Devices' eight-core processor demonstrates in real-world applications is far from impressive: the chip barely outperforms competing quad-core central processing units from Intel. The reason the long-awaited Bulldozer fell below expectations is not only that it was late, but also that AMD adopted design techniques that did not allow it to tweak performance, according to an ex-AMD engineer.

According to Cliff A. Maier, an AMD engineer who left the company several years ago, the chip designer decided to abandon the practice of hand-crafting various performance-critical parts of its chips and to rely completely on automatic tools. While tools that automatically implement certain technologies in silicon speed up the design process, they cannot ensure maximum performance and efficiency.

Automated Design = 20% Bigger, 20% Slower

"The management decided there should be such cross-engineering [between AMD and ATI teams within the company] ,which meant we had to stop hand-crafting our CPU designs and switch to an SoC design style. This results in giving up a lot of performance, chip area, and efficiency. The reason DEC Alphas were always much faster than anything else is they designed each transistor by hand. Intel and AMD had always done so at least for the critical parts of the chip. That changed before I left - they started to rely on synthesis tools, automatic place and route tools, etc.," said Mr. Maier in a forum post noticed by web-site.

A wafer with AMD Orochi dies used for AMD Opteron "Interlagos"/"Valencia" and AMD FX "Zambezi" microprocessors

Apparently, automatically-generated designs are 20% bigger and 20% slower than hand-crafted designs, which translates into higher transistor count, larger die size, higher cost and worse power efficiency.

"I had been in charge of our design flow in the years before I left, and I had tested these tools by asking the companies who sold them to design blocks (adders, multipliers, etc.) using their tools. I let them take as long as they wanted. They always came back to me with designs that were 20% bigger, and 20% slower than our hand-crafted designs, and which suffered from electro-migration and other problems," the former AMD engineer said.

Inefficiencies in Design?

While it is unknown whether AMD used automatic design flow tools for everything, certain facts point to inefficient pieces of design within Bulldozer. Officially, AMD claims that the Zambezi/Orochi processor consists of around 2 billion transistors, which is a very large number.

AMD Orochi floorplan

AMD has publicly said that each Bulldozer dual-core CPU module with 2MB of unified L2 cache contains 213 million transistors and occupies 30.9mm². By contrast, the die size of one processing engine of the Llano processor (11-layer 32nm SOI, K10.5+ micro-architecture) is 9.69mm² (without L2 cache), which indicates that AMD succeeded in shrinking the elements of its new micro-architecture so as to keep the die size and production cost of the novelty low.

As a result, all four CPU modules with L2 cache within the Zambezi/Orochi processor consist of 852 million transistors and occupy 123.6mm² of die space. Assuming that the 8MB of L3 cache (six transistors per bit cell) accounts for around 405 million transistors, that leaves a whopping ~800 million transistors for various input/output interfaces, the dual-channel DDR3 memory controller, and assorted logic and routing inside the chip.

Some 800 million transistors - which take up a lot of die space - is an incredibly high number for I/O, memory controller, logic, etc. For example, Intel's entire Core i-series "Sandy Bridge" quad-core chip with integrated graphics consists of 995 million transistors.
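The budget sketched above can be checked with quick back-of-the-envelope arithmetic. The figures below come from the article itself; the six-transistor (6T) SRAM bit cell and the rounding are my assumptions, so treat this as an estimate rather than an official breakdown:

```python
# Back-of-the-envelope check of the Orochi/Zambezi transistor budget.
# All counts are in millions of transistors.

total = 2_000                 # AMD's "around 2 billion" figure

# Four dual-core Bulldozer modules at 213M transistors / 30.9 mm^2 each
modules = 4 * 213             # 852
module_area = 4 * 30.9        # 123.6 mm^2

# 8MB of L3 cache, assuming a conventional 6T SRAM bit cell
l3_bits = 8 * 1024 * 1024 * 8           # 8MB expressed in bits
l3 = round(l3_bits * 6 / 1e6)           # ~403, which the article rounds to ~405

# Whatever is left: I/O, DDR3 memory controller, logic, routing
uncore = total - modules - l3           # ~745, rounded up to ~800 in the text

print(modules, l3, uncore)
```

The exact remainder works out to roughly 745 million transistors, which the article rounds to ~800 million; either way it dwarfs what a typical "uncore" should need.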

While it cannot be confirmed, it looks like AMD's Orochi/Zambezi carries several hundred million transistors that are the result of heavy reliance on automated design tools.

The Result? Profit Drop!

As a consequence of the inefficient design and relatively low performance, AMD has to sell its eight-core FX-series processors (315mm² die size) for up to $245 in 1000-unit quantities. By contrast, Intel sells its hand-crafted Core i-series "Sandy Bridge" quad-core chips (216mm² die size) for up to $317 in 1000-unit quantities. Given that both microprocessors are made using 32nm process technology [and thus have comparable per-transistor/per-square-millimeter die cost], the Intel chip carries a much better profit margin than AMD's microprocessor.
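To put a rough number on that margin gap: using only the 1000-unit tray prices and die sizes quoted above, and assuming (as the article does) that per-mm² wafer cost is comparable on the same 32nm node while ignoring yields and packaging, revenue per square millimeter of silicon works out as follows:

```python
# First-order price-per-die-area comparison from the figures above.
amd_price, amd_die = 245, 315       # FX-8150: up to $245, 315 mm^2
intel_price, intel_die = 317, 216   # Core i7 "Sandy Bridge": up to $317, 216 mm^2

amd_per_mm2 = amd_price / amd_die       # ~$0.78 per mm^2 of die
intel_per_mm2 = intel_price / intel_die # ~$1.47 per mm^2 of die

# Intel collects roughly 1.9x more revenue per mm^2 of silicon sold
print(round(intel_per_mm2 / amd_per_mm2, 1))
```

Under these assumptions Intel earns nearly twice as much per unit of die area, which is the gap the paragraph above describes.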

AMD did not comment on the story.

Tags: AMD, Bulldozer, Zambezi, 32nm, Globalfoundries


Comments currently: 40
Discussion started: 10/14/11 05:18:19 AM
Latest comment: 12/08/11 02:06:47 AM


This wafer on the picture contains Athlon X4, not Bulldozer's
1 1 [Posted by: Tristan  | Date: 10/14/11 05:18:19 AM]

AMD, you made a big flop with Bulldozer. I don't know how AMD will manage to sell this piece of sh*t.
At least they have good notebook parts.
3 2 [Posted by: Filiprino  | Date: 10/14/11 06:10:32 AM]

So this mess should tell AMD to stop using SoC design flows and start hand-crafting its designs again.
5 2 [Posted by: SteelCity1981  | Date: 10/14/11 06:43:44 AM]

Another positive, upbeat AMD commentary by Xbit, based on unknowns and one engineer's recollection of what was happening years ago when he left AMD. There's nothing like quality research and reporting to skew public opinion and spread FUD.
10 6 [Posted by: beenthere  | Date: 10/14/11 08:56:59 AM]


What the article does not mention is that Intel has been using mostly automated design tools for its chips for a long time. Very little of Intel's designs nowadays is hand-crafted.
6 5 [Posted by: quasi_accurate  | Date: 10/14/11 09:27:19 AM]
+1 to you and beenthere. I guess the Intel-owned tech media had their FUD planned months in advance.

Bulldozer is actually pretty good. The best thing they can scream is "OMG regressions!". Every Intel arch has had regressions, and the so-called "ticks" usually wind up being slower than what they replaced.

The runner-up Intel talking point is "ZOMG! You'll only get 60FPS in your favorite game! Bulldozer sucks!". Of course, I only saw Bulldozer being tested with Nvidia GPUs, I wonder if an AMD GPU might change that any...
4 6 [Posted by: dukie_bref  | Date: 10/14/11 03:39:01 PM]
2 5 [Posted by: hansmuff  | Date: 10/14/11 11:22:03 AM]
then stop coming here if you don't like it.
5 7 [Posted by: SteelCity1981  | Date: 10/14/11 11:29:04 AM]

We are very disappointed in AMD... this is really not good. At least not good for the consumer. Intel will keep their prices high and will slow down their updates, since AMD can't offer anything against Intel at this moment.
9 1 [Posted by: 3Dkiller  | Date: 10/14/11 11:48:56 AM]

So true! I counted on BD to put the pressure on Intel, so I could either:
1. Get a Sandy Bridge with a really nice discount.
2. Maybe even get a socket 2011 CPU at a reasonable price.
2 0 [Posted by: eltoro200  | Date: 10/15/11 02:50:47 PM]

It's inexcusable that a chip uses twice as many transistors as a Phenom II X6 while having little or no performance benefit in general and actually performing worse in many games. Yeah, maybe this is fake, but so is AMD's marketing right now. The FX chip fails.
5 2 [Posted by: megamanx00  | Date: 10/14/11 12:22:04 PM]

You didn't seem to think it was so inexcusable when it happened with the Pentium IV vs. the Pentium III.
7 3 [Posted by: Jack Ripoff  | Date: 10/14/11 03:25:19 PM]
Lol yeah. The problem is that AMD can't ban Intel like Intel did, and that AMD doesn't have the same brand recognition Intel has.

Pentium 4s sold like hot cakes even when they were slower than the Pentium III.
5 1 [Posted by: Filiprino  | Date: 10/15/11 08:43:23 AM]
Both of you are wrong. Pentium 4 was always faster than the Pentium III. It's kind of funny how things get exaggerated.

Pentium III had better IPC, but when released, the Pentium 4 was running at 1.5 GHz, Pentium III 1.0 GHz. Pentium 4 was to reach 2 GHz on that same process, Pentium III got to 1.1 GHz.

Pentium 4 at 1.5 GHz was faster than the Pentium III. Comparing process to process, 2.0 GHz was way faster than 1.1 GHz for the Pentium III.
1 2 [Posted by: TA152H  | Date: 10/18/11 07:50:29 AM]
Pentium 4 versions went down to 1.3 GHz (and, at "budget" speeds, often paired with very suboptimal, for P4, SDRAM).

Pentium III Tualatin went up to 1.4 GHz - but, hidden from public view, not really mentioned by Intel PR (however, some market segments just wouldn't buy into the P4 bullshit; likewise, many average folks in less affluent places - and hence less likely to buy overpriced trash, to fall for PR), so people like you often miss it when forming their "historical" myths.

Intel very clearly wanted to kill the "inconvenient" Tualatins - they were even made incompatible with existing S370 boards for no good reason. Of course, the Tualatin design was then picked up by the Haifa-based Intel Israel team (after Intel marketroids finally fully realised the total unsuitability of P4 for laptops; I used some laptops with P4, what a waste); they morphed it into the Pentium M, which in turn very directly spawned the Core Duo. And the rest is (real) history.

Core 2 Duo was only slightly faster than Core Duo, which in turn was almost identical (just with 2nd core) to Pentium M, which in turn was only slightly faster than PIII Tualatin.
The perception that C2D, when released, smoked P4 came mostly from how horrible P4 was in the first place.

PIII-S (where Intel "graciously" didn't castrate full L2 from shipping cores) Tualatins were faster than any Willamette-based P4 (and smoked those 1.5 GHz debut P4's)

Even better at budget levels - there were tons of people with Tualatin-based Celerons, at 1.2 to 1.4 GHz - budget CPUs also often faster than "premium" Willamette P4's. Worse, some people misinformed by Intel PR, were replacing those Tualatin Celerons with 1.7 - 1.8 Celerons Willamette - which were very clearly slower (having ridiculously - for the needs of flawed P4 architecture - low amounts of L2 and slow RAM)
4 1 [Posted by: zima  | Date: 11/08/11 06:01:05 AM]
Yes, I did think it was inexcusable then, which is when I bought my first AMD Duron, followed by an Athlon XP, and later the Athlon 64 X2. Of course, that was then, and now it is AMD who looks like the fool.
1 0 [Posted by: megamanx00  | Date: 10/17/11 09:18:11 AM]

As usual we see a lot of emotion and many false conclusions.

Bulldozer CPUs actually work quite well, but they aren't the giant killer that some folks hoped for. Depending on the application, Zambezi will work just fine for most people and provide good value. In enterprise, the Bulldozer-based Opterons are in high demand, with Cray getting the first 10,000 CPUs for its supercomputers.
7 7 [Posted by: beenthere  | Date: 10/14/11 12:40:17 PM]

until they realize that they have been scammed by the so-called performance chip
6 6 [Posted by: dudde  | Date: 10/14/11 09:22:47 PM]

I'd rather believe statements from people who were actually at AMD than some commenters on some board. These statements fit pretty well with how Bulldozer performed. It's power-hungry, with poor performance/watt and poor performance/dollar, and in some cases performance was equal to or less than a Phenom II 1100T. I would call that a very inefficient design. Also, Xbit's review showed that a single core of the 2600K with HT on outperformed an entire 2-"core" module of the FX-8150.

Sure, I would agree more complex designs now have to be automated, but that doesn't excuse AMD from putting out a poorly performing CPU.

This isn't FUD that Xbit is spreading here, it just fits too well with what Bulldozer gave us; it gave us to quote Bit-tech, a "stinker".
5 4 [Posted by: RtFusion  | Date: 10/14/11 12:44:41 PM]

When you describe Bulldozer it looks like you're describing the Pentium IV. Yet it sold nevertheless, didn't it?
2 3 [Posted by: Jack Ripoff  | Date: 10/14/11 03:26:52 PM]
You're right, P4 did sell. But that doesn't excuse Intel at the time for making such a processor and neither does AMD in the case of Zambezi.

In my view, it IS inexcusable for AMD to put out the FX-8150, which performed equal to or worse than a 1100T in some benches. Sure, it does catch up to the 2500K in other benchmarks, but it did so with much higher clocks and greater power consumption.

Intel freed themselves from Netburst and went back and adapted what they did from Pentium Pro and made the Core architecture and look where Intel is today. But with the case of AMD, they can't go back to K10 and shrink it and make enhancements to it. Bulldozer IS their architecture now for the foreseeable future and their claims of getting 10-15% for each iteration is laughable at best.

Again, P4 did sell and Zambezi will sell as well. But would you really want to purchase old and moldy cheese that's been delayed month after month with poor performance? I don't think so.

Oh and, in Anandtech's review, it does mention the similarities between the FX-8150 and the Pentium 4:
0 0 [Posted by: RtFusion  | Date: 10/14/11 04:29:56 PM]
There is, or was, a fab in Austin, Texas, but chances are that was squandered as well. A pity; AMD was once a good company with a decent product. Expect that one day their only asset will be ATI and nothing else. They should just make decent ARM CPUs with Radeon graphics instead.
2 2 [Posted by: nforce4max  | Date: 10/16/11 06:59:26 AM]

If AMD has no people resources to hand-craft chips, I have a question: what are all those 11,000+ payroll staff of AMD doing? What are AMD's offices in India and China doing? Maybe that is where the problems of the bad design are hidden?
2 3 [Posted by: Azazel  | Date: 10/14/11 08:29:33 PM]

AMD is in big trouble. This BD is a fail, and their market value and debt load is strangling their ability to compete with Intel, who makes more profit per quarter than AMD is actually worth.
0 1 [Posted by: beck2448  | Date: 10/15/11 04:01:26 PM]

After reading this article and the thread Maier posted in, the line that rings loudest is the one Maier himself posted: "AMD is indeed dead."

Read post number 206 in the following page.


I am an AMD fan, and after K10 and the Bulldozer-core processors it is real to me that AMD is now dead.
1 2 [Posted by: tecknurd  | Date: 10/15/11 04:59:06 PM]

After this failure I won't be investing in anything AMD at retail prices for a long time. BD is not even mobile-worthy, let alone for gaming or workstation use. As for the Phenom generation, good luck, because anyone with a brain is going to realize that these will come at a high premium before too long, and Intel is no value option. As for Intel, I may as well experiment with a cheap 1156 build.
2 4 [Posted by: nforce4max  | Date: 10/16/11 06:56:14 AM]

Wow, it's amazing how many technically clueless enthusiasts exist and the distorted views they spew.

For the record AMD has sold over 12 MILLION APUs in the past 9 months and demand FAR exceeds supply. Trinity will be coming in early '12 and will maintain AMD's laptop performance lead.

It's also worth noting that demand for Bulldozer based Opterons and FX-8150's far exceed supply because these are good CPUs.

AMD's 7000-series GPUs will also show in early '12, so there is a lot of goodness available to those who are not technically illiterate.
3 5 [Posted by: beenthere  | Date: 10/16/11 01:35:59 PM]
The APUs are not that good either. Also, APU is a bad name for marketing because it is hard for the general public to grasp. An APU only works if the user demands only graphics; beyond that it comes off as a costly chip.

12 million APUs are meaningless if HP and Dell have limited options. The majority of people do not care about APUs because it is a complicated name to begin with.

The Bulldozer-based Opterons and the new FX models are pathetic. The reason they are pathetic is the instructions per cycle of each core. Having eight cores does not increase instructions per cycle. Software does not scale in an 8xN way because it gets complicated to calculate the true output. If AMD had made the Bulldozer core efficient from the start, we would see different numbers. The Bulldozer core is actually in the red by some 30 percent.

Of course, go ahead and defend the latest AMD 7000 series in a cowardly way. AMD got ATI's sickness, so the software for their hardware is poor. AMD Radeon graphics may perform well, but reliability and stability are what hurt those cards.

You really do not know what you are talking about.
1 1 [Posted by: tecknurd  | Date: 10/17/11 11:36:18 PM]
he never bothered to read the benchmarks, or can't understand them in the first place! AMD pays him to provide blind propaganda
0 1 [Posted by: dudde  | Date: 10/18/11 02:47:36 PM]

AMD knew it had a flop on its hands before releasing, because it fired its CEO before releasing BD.
0 0 [Posted by: taltamir  | Date: 10/16/11 01:27:50 PM]

As usual, completely baseless comments from people with no direct insight as to the happenings at AMD.
3 4 [Posted by: beenthere  | Date: 10/16/11 01:32:24 PM]

Not that I am an engineer or anything, but... remember the time of the Pentium 4? That was Intel's attempt to bring a new architecture tailored for high clock speeds, and look how it sucked compared to the Pentium 3 and Athlon. They were actually forced to come back to the Pentium 3 architecture and bear with that for a while. Be sure they haven't abandoned P4 and will bring back an updated architecture when the time comes. Coming back to BD: it's AMD's P4, and guess what? It doesn't suck as much as P4 did back then. BD is a forward-looking architecture, and it's not fair to compare it clock-for-clock with older architectures before software titles are recompiled. And it's not tailored for high single-threaded performance. It will (depending on adoption) have its share in servers and high-performance computers. Guess which CPUs the upcoming fastest supercomputer will use? To the ones bashing AMD: google how many AMD patents Intel is using. No intent to start flaming though. Peace )))
0 1 [Posted by: MJ  | Date: 10/17/11 06:16:30 AM]

Yup, BD-based Opterons are showing a 35% improvement over previous models, and AMD can't produce them fast enough. Cray bought the first 10,000 for its supercomputers and has standing orders for many more. It's unfortunate that CPU design is too complicated for many enthusiasts to understand. They are also swayed by subjective opinions that are not necessarily accurate.

The good news is people are free to buy whatever products make them happy. AMD's biggest problem is meeting demand for its Bulldozer-based products and Llano - which is a good thing for AMD, despite the carping by those lacking the technical ability to comprehend what the BD architecture brings to the table.
3 2 [Posted by: beenthere  | Date: 10/17/11 07:18:14 AM]

And the new stepping is coming out soon. I am waiting to see what that does.
0 0 [Posted by: Rikaroo  | Date: 10/17/11 09:35:52 AM]

Just a quick thought: an ex-AMD engineer "explains", or should we say "shares his assumptions"? How can you explain anything about the chip design if you left the company several years ago? By the way, is he actually employed right now, or is the only thing he does "explaining" design peculiarities of chips he has never actually touched? Again, these are just my thoughts.
1 0 [Posted by: MJ  | Date: 10/17/11 12:56:48 PM]

Here is a very interesting post at HardOCP from some ex-engineers from AMD:
3 0 [Posted by: RtFusion  | Date: 10/17/11 05:28:43 PM]

Thank you indeed for that link.
It's very interesting and helps a lot.
2 1 [Posted by: Azazel  | Date: 10/17/11 09:15:14 PM]

Have you seen the power consumption graph of the AMD Bulldozer FX-8150? When you reduce the multiplier of the FX-8150 so that the clock frequency is below 2.5GHz, Bulldozer consumes <10W, so making Trinity with upgraded Bulldozer-based Piledriver modules at a lower clock frequency would give it a lower TDP, i.e. <10W. But it doesn't end there: Trinity is not just a CPU, it's an APU, so AMD has to optimize the integrated HD 7000-series graphics too.

You may see the graph here - Bulldozer Power Consumption Graph
1 0 [Posted by: actoo  | Date: 10/18/11 09:31:51 AM]

Now you know where all those 800 million transistors went
1 0 [Posted by: TAViX  | Date: 12/08/11 02:06:47 AM]

