

We continue to publish our series of news stories called "Trends of 12", covering some of the things we expect to materialize this year. In many cases, the stories in the series emphasize trends rather than make exact predictions. In others, we try to tell you something new.

From 2003 to 2006, the world's second largest maker of central processing units, Advanced Micro Devices, was the indisputable leader in high-performance desktop personal computers, as its processors offered something that Intel-based systems simply could not. In 2012, the situation is entirely different: Intel Corp. is not only the indisputable leader in the high-end desktop market, but the only player in this market, as its arch-rival simply has no offerings that can seriously compete against it.

Advanced Micro Devices intends to release its second-generation accelerated processing units code-named Trinity in mid-May. The chips will feature new x86 general-purpose cores code-named Piledriver. Even AMD itself does not expect a breakthrough in general-purpose x86 performance from the Piledriver micro-architecture, which is evident from AMD's own official ads. With the release of Intel's Ivy Bridge, the only choice for performance enthusiasts, both regular and "no compromise", is an Intel processor.

Based on a recent review conducted by X-bit labs, the quad-core Intel Core i7-3770K outperforms the eight-core AMD FX-8150 by almost two times in many cases. The problem will hardly be solved by increases in micro-architecture efficiency or clock speed of AMD's Trinity and Vishera chips.

The gap between AMD FX "Bulldozer" and Intel Core "Ivy Bridge" is simply too significant at the moment on the desktop level. Essentially, if you want really high performance in x86-bound applications, you unfortunately do not have much choice.

Tags: AMD, Intel, Ivy Bridge, Sandy Bridge, 22nm


Comments currently: 36
Discussion started: 04/29/12 11:44:52 AM
Latest comment: 05/03/12 01:51:47 PM


Another inflammatory AMD story created by Anton. SOS, DD.

The sub-title claiming AMD is unable to provide a viable gaming solution for high-end gamers is an outright lie. Both Phenom II and FX CPUs provide a viable and excellent solution for high-end gaming, though they may not be as fast as Intel's top CPUs. In addition, the FX-8150 is very fast in BF3, which properly uses the AMD multi-core FX CPU design.

This is a very dishonest story - not because of the CPU performance but because of the stated conclusions which are untrue.

Slanted journalism is a dirty business...

It's the end of the month, so X-bit labs needs to get its page hit count up to keep advertising revenue peaked. What a disgrace.
27 17 [Posted by: beenthere  | Date: 04/29/12 11:59:45 AM]


2) PIIs have been going EoL for the past few quarters now. Heck, even a C2Q is a "viable high-end gaming solution" if you compare it to a PII (OEMs are trying to empty their inventory of the last shipment of Q9505).

If someone asks me for a CPU to stick into a high-end gaming rig, I don't know what else I can suggest apart from an i5 2400-and-above processor.
4 2 [Posted by: Marburg U  | Date: 04/29/12 02:39:27 PM]
But you're defining high end as fastest, which excludes the 2400. High end means playing games at 2560 or using Quadros/FirePros.

When the FX was the fastest, I don't remember seeing boutique shops with custom systems, which says to me Intel paid them off too.

They dropped the bottom out of the PC industry with the lowest priced "performance" CPU ever. They changed their FMAC implementation a few times, leaving AMD holding the bag. AVX seems to be an afterthought to hurt XOP. And I have two Opteron systems. One had to turn off AVX because it's not supported YET under Hyper-V.

This means that, indeed, Bulldozer is ahead of its time and would have been more so if it had come out in 2009.
1 0 [Posted by: BaronMatrix  | Date: 04/30/12 03:48:43 PM]
Power consumption is much worse for Bulldozer

Even if it were true that Bulldozer was as fast as Ivy Bridge, why would anyone get an FX8120/8150, overclock it to 4.6-4.8GHz, and put up with 319-340W of extra power consumption at load over a 5.0GHz 3570K or 4.8-4.9GHz 3770K?
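To put the commenter's claimed power delta in perspective, here is a rough sketch of the extra electricity it would cost per year. Only the 319-340W range comes from the comment; the 330W midpoint, the 2 hours of load per day and the $0.12/kWh rate are illustrative assumptions, not figures from the review.

```python
# Rough yearly electricity cost of the cited extra load-power draw.
extra_watts = 330       # midpoint of the commenter's 319-340W range
hours_per_day = 2       # assumed daily time at full load
price_per_kwh = 0.12    # assumed electricity price, USD

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
yearly_cost = extra_kwh_per_year * price_per_kwh
print(round(extra_kwh_per_year, 1))  # ~240.9 kWh of extra energy per year
print(round(yearly_cost, 2))         # ~28.91 USD per year under these assumptions
```

Under these (adjustable) assumptions, the overclocked FX would add roughly another CPU price-gap's worth of cost over a few years of ownership.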

Gaming performance with multiple fast GPUs is much worse with Bulldozer

However, we know Bulldozer is not even as fast. AMD's CPU may be fine for the majority of games when paired with a mid-range GPU such as the HD7870, but start adding an HD7970 or two of those and the bottleneck appears.

Limited longevity, imminent CPU bottlenecking is much more likely with Bulldozer in the future

Further, even if you have 1 mid-range GPU where an FX4100 or something similar is adequate, sooner or later you'll upgrade your GPU to something faster and you'll be CPU limited. Suddenly, you'll have to buy a new motherboard and a new CPU and upgrade in 2 years. So much for saving $70-80 over Intel's superior CPU. How much $ did people waste going from Phenom II to Bulldozer? Ironically, that Core i5 750/760 @ 3.8GHz+ or i7 860 @ 3.8GHz+ from 2009 still mops the floor with the FX series in games. So again, Intel was the smarter and cheaper option all this time; just as it continues to be with IVB.

People who buy high-end GPUs don't really buy $100-120 budget CPUs. The $80-90 savings for, say, an FX4100 isn't worth it over a $212 3570K.

Most importantly, we can keep our CPU for 3-4 years now and upgrade GPUs 2-3x in that timeframe. 2500k @ 4.5ghz or 3570K @ 4.5ghz will last another 3-4 years while FX8150 is already bottlenecking GTX680/HD7970 OC in some games (Civilization V, SKYRIM, Starcraft II, etc.).

Why Bulldozer currently doesn't make sense

Why bother buying a processor that doesn't even cost less and consumes a lot more power but at best offers similar performance and in some cases far worse performance? Also, for guys like us who spend $400-600 on GPU hardware, it simply makes no sense to buy something crappy like a $120 FX4100. Might as well spend $90 more and get the real deal - 2500k/3570K. Over the course of 3-4 years of CPU ownership, that $90 savings is just $25-30 a year....
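The commenter's amortization math above can be sketched directly. The $90 price gap and the 3-4 year ownership window come from the comment itself; the exact per-year figures bracket the "$25-30 a year" the commenter quotes.

```python
# Spreading the claimed $90 CPU price gap over the stated ownership window.
price_gap = 90  # claimed extra cost of a 2500K/3570K over an FX4100, USD
for years in (3, 4):
    print(f"{years} years: ${price_gap / years:.2f} per year")
```

Whether $22.50-$30.00 per year is negligible is of course the commenter's judgment call, not a fact about the hardware.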

Seriously, do you work for AMD? It's perfectly fine to purchase AMD's CPUs if you are a fan of AMD, but there is no question they are inferior in performance/watt, gaming performance and overclocking performance. You seem to be defending the Bulldozer line-up of chips every time it's put in a bad light, which ironically is the truth. Accusing X-bit labs of being biased against AMD is laughable since they have praised the Athlon XP+, Athlon 64 and Athlon X2. You can just search for older reviews.

Even AMD advises reviewers to test their HD7950/7970 cards with Intel's CPUs......
25 5 [Posted by: BestJinjo  | Date: 04/29/12 08:25:15 PM]
BS. As usual. Why do you like being abused by an anti-trust-loving, OEM-depriving bunch of crap heads?

Intel purposefully dropped Core 2 prices to hurt their ONLY competitor - as if normal growth of 12% was not good enough.

Just like with 3DNow, Intel used "cheating" to get their implementation mainstream. They continue the same behavior, and it embarrasses me.

Free market? Not with Intel around.
3 3 [Posted by: BaronMatrix  | Date: 04/30/12 03:51:37 PM]
What are we, babies now? You still live in the past, not the present. Who cares what happened in the past? The present is that Intel is the leader in performance and also the leader in low power consumption. AMD can catch up to Intel, but not with the Bulldozer micro-architecture.
3 1 [Posted by: tecknurd  | Date: 04/30/12 06:00:46 PM]
BestJinjo, I agree. Some people do not know any better. All they know is the worst and they cannot see anything else.
2 1 [Posted by: tecknurd  | Date: 04/30/12 05:57:40 PM]
I just find it amusing that people here are accusing Xbitlabs of bias when they have continuously praised AMD's CPUs when they were superior to Intel's offerings.

Furthermore, Intel has made unsuccessful CPUs in the past, including the Pentium 4 and Pentium D. I know many of us made fun of PrescHOT. People are trying to make this a fanboy war of Intel vs. AMD. It's not about Intel vs. AMD, but a case of supporting an inferior product vs. spending $ on a much superior alternative. Efficient markets with rational consumers imply supporting the latter, not the former. Supporting an inferior alternative is simply being irrational, regardless of any brand preferences.

It's also ironic that some people here advocate supporting AMD because they desire competition, and yet supporting Bulldozer is supporting the idea that a hot, slow and expensive CPU is actually good for us consumers. I know when Intel released the Pentium 4 and Pentium D, I skipped right over them and grabbed an Athlon 64 3000+; later I upgraded to an X2 3800+. Why would we throw $ at supporting an inferior product, since that actually sends a signal to the company that making crap products is good enough?

Here is another way to look at it. Recall Fermi being criticized by gamers for 2 years ("Thermi", nuclear reactor, etc.). Well, Jen-Hsun Huang didn't ignore the constructive criticism but spent considerable resources and got the engineers to rethink the GPU architecture from a performance/watt perspective, and what did we end up with? The most power-efficient high-end GPU with the Kepler architecture.

Perhaps AMD should actually do the same and listen to consumer criticism of low IPC, high power consumption, low performance/core, etc. and work to improve their architecture. Sitting there and spending marketing dollars to convince people that Bulldozer provides a similar gaming experience in an "AMD sponsored" event isn't doing them any favours. In fact, it makes the company look worse because they are not admitting they need to improve the product!

Intel even admitted their failure with Netburst and what did we get? Conroe.

Fermi and Netburst are just two modern examples in semiconductors off the top of my head where the first step to a turn-around was admitting that those companies had inferior products. There are all kinds of examples like this in business, like Domino's completely rethinking their pizza recipe!
17 1 [Posted by: BestJinjo  | Date: 04/30/12 09:33:32 PM]


Translation "Compare a 3.5 Ghz quad core to a 4.8 ghz octo-core...that'll show Intel!!"
9 3 [Posted by: AnonymousGuy  | Date: 04/29/12 04:54:28 PM]

Duh? After benchmark after benchmark result in the last several years, who would be surprised? What else is new? This must be a slow Sunday afternoon.
2 1 [Posted by: Tukee44  | Date: 04/29/12 02:18:45 PM]

1. The basic statement isn't really relevant; all high-end CPUs today can run everything. Intel is the undisputed leader today, sure, but that isn't the same as AMD "can't compete", because AMD offers a better price tag. Today I would still go Intel if I bought a new system, unless I needed to cut down what I pay for the system; then AMD becomes a more reasonable choice.

2. An i7 for comparison? You realise that of the CPUs you compare, one is almost twice the price of the other? A quick look at a few sites here gives a price tag around 3000kr for the i7 3770 and 1600kr for the 8150.
In 3 of the benchmarks, the difference might be said to be equal to the difference in price. Overall, not even close.

3. Where AMD is competing WELL at the moment, however, is with their A-series chips; I'm seeing them be amazingly popular.

4. My E8400 Core 2 still manages to outrun even a >3GHz i5 pretty well in some apps. Does that mean that their new CPUs are worthless? Nope, just as my friend's AMD 1090T isn't bad just because SC2 favours Intel chips. His CPU can still outrun mine to a ridiculous degree in other apps.

Personally I want to see AMD updating their CPUs with an upsized/faster L1 and a MUCH faster L2. They bungled those with Bulldozer.
4 4 [Posted by: DIREWOLF75  | Date: 04/29/12 03:33:17 PM]

I refuse to feed the beast. They give the Free Market system a black eye.

OEMs: We need you to lower CPU prices so we can actually make a few bucks.

Intel: (maniacal laughter)

OEMs: We may be suckers. Is Trinity ready?
1 0 [Posted by: BaronMatrix  | Date: 04/30/12 03:53:24 PM]

I agree that Bulldozer is horrible: high power consumption, very poor IPC and a large die size.

For the same price, the i5 3570 beats the FX-8150 in most benchmarks. Even in the very few cases where the FX-8150 wins, the difference is always negligible and unnoticeable. However, in the cases where the i5 3570 wins, the difference can sometimes be massive.
3 1 [Posted by: maroon1  | Date: 04/29/12 03:48:35 PM]

People do under-estimate the power of their current CPUs.
My FX4100 @ 4.2GHz (water cooled) did max out my Radeon 7970. I had 100% GPU usage for the most part in games.

It wasn't until I crossfired my 7970 that I saw a huge CPU bottleneck.

My minimum frames per second did go up with CrossFire (as my CPU usage went from 89% to 100%), but the FX4100 @ 4.2GHz can't keep up to CrossFire with really high-end cards.

Now I have the option of a third video card... Hopefully an upgrade to Steamroller on AM3+.
2 3 [Posted by: campdude  | Date: 04/29/12 04:25:57 PM]

For Intel the IGP is a bonus, and for AMD the IGP is their life. If you use demanding office or 2D apps (Photoshop, SQL, etc.), a weak IGP is enough. But for a gamer, I think SB/IB with an add-on card is the best choice. Unless you don't have enough money but still insist on being called a (low-res) gamer; then an A6-A10 without an add-on card is the best solution for you. (And who the hell can buy an i7 but can't buy an add-on GPU?)
0 1 [Posted by: jpunk  | Date: 04/30/12 10:00:05 PM]

That graph is not really relevant. I do not see how you did the comparison. I see a spike in iTunes... as far as I know it's not such a great performing app, or I may be wrong. Then you have the games, where there are some great deltas. What are you using there for a GPU, resolution, detail?! C'mon! Then you have the more "serious" apps, 3ds Max and the Adobe apps, that do not show such a great difference. There is also the price delta between the two CPUs that is... significant. AMD can compete and competes. It's not the king of the hill, but it offers good value, and this matters most in my little village.
2 3 [Posted by: HHCosmin  | Date: 04/30/12 01:55:25 AM]

Hopefully AMD can enhance Piledriver and Trinity enough to gain market share back from Intel. I want Intel to put all their hard work into Haswell.
0 0 [Posted by: DirectXtreme  | Date: 04/30/12 12:35:01 PM]

I want them to be broken up...
0 1 [Posted by: BaronMatrix  | Date: 04/30/12 03:56:05 PM]


I agree with beenthere that this is slanted journalism, even though I lean toward AMD. "Viable" refers to performance, and it should not be used as a euphemism for competition.
1 3 [Posted by: TeemuMilto  | Date: 05/01/12 10:07:48 AM]


