Although Advanced Micro Devices recently announced plans to design various system-on-chips based on the ARM architecture, the company believes that the x86 instruction set will remain very important for the industry for decades to come. Quite possibly, the claim was made to debunk rumours that AMD plans to halt development of high-performance x86 chips in order to concentrate on low-power products.
“There are no doubts that x86 is going to be a huge portion of our business. I think that it is going to be an important segment of our business for 5 – 10+ years. The x86 is going to be here long after I am retired. […] There will be x86 applications just like there are mainframe applications today, 25 – 30 years later. That is not going to fundamentally change,” said Rory Read, chief executive of AMD, at Credit Suisse Technology Conference earlier this week.
Throughout its history, the x86 architecture has fought many battles, some of which determined the future of the whole computing industry. x86 managed to become the de facto microprocessor standard for servers, which replaced mainframes and essentially killed off proprietary processor architectures. Most of x86's wins came down to high performance at relatively low cost.

But today x86 faces tough competition from the ARM architecture, which is more energy efficient and lets chip designers make extremely low-cost chips that consume only a fraction of the power that x86-based solutions do. In many cases very high performance is not needed, but power consumption is crucial. As a result, ARM-based microprocessors are taking revenue share from x86.
It is quite obvious that the need for high performance will remain: complex programs that process huge amounts of data, sophisticated video games, video editing software and many other applications take advantage of high-end chips that rely on the x86 instruction set. In fact, the performance gap between ARM and x86 chips will remain rather large for years to come.
AMD understands that very well, which is why its chief executive, who openly said that he did not want to compete against Intel Corp. for maximum performance, believes that x86 is here to stay. Perhaps the claim also means that AMD will continue to develop high-end x86 solutions, even though it is tremendously hard for the company to remain competitive in that segment.
Tags: AMD, x86, Bulldozer, Piledriver, Excavator, Steamroller, Intel, ARM, Cortex
Comments currently:
49
Discussion started: 11/30/12 04:29:47 AM
Latest comment: 12/14/12 07:16:14 PM
1.
I have two questions.
1. Who from AMD stated this?
2. What do the AMD fanbois who "CLAIM" that x86 is dying off have to say about this?
[Posted by: Avon4Balls | Date: 11/30/12 04:29:47 AM]
x86 will be around because it's needed by scientists, industry and government agencies to run powerful software. ARM is good for Facebook and the like. Intel and AMD will always have business for the x86 architecture. But both companies are being forced to diversify their portfolios to meet the fragmentation of the market, led primarily by the demand for portable handheld internet devices.
Oh, if you didn't read, it was Rory Read who you agree with. Fancy that, Intel troll.
[Posted by: linuxlowdown | Date: 11/30/12 05:10:45 AM]
AMD is trying hard to diversify! But what did they get? They lost focus on their core product! They lost their identity in the market that made them a leader a decade ago! AMD is taking the path towards becoming a second-rate custom chip maker! And in a few more years, they will be scraps sold off to the Chinese market!
[Posted by: dudde | Date: 12/01/12 10:41:06 PM]
Utter verbal diarrhoea.
[Posted by: linuxlowdown | Date: 12/02/12 12:20:47 AM]
wow! moron calls out to himself! you've been @$$ kissing for far too long!
[Posted by: dudde | Date: 12/14/12 07:14:00 PM]
2.
AMD Expects x86 to Remain Important Architecture for 25 – 30 Years
Of course, it will.
Will AMD stay in the x86 business for at least the next 2-3 years? That's the big question.
[Posted by: Azazel | Date: 11/30/12 06:33:48 AM]
aside from laptop/server/desktop cpus and the respective gpus/gpgpus, consoles either use amd chips or are rumored to in the next gen, so why do you believe they won't be around in the coming years?
wii 97.18 million gpus so far
xbox 360 Worldwide: 70 million gpus so far
wii u 1.2 million until 26 Nov gpus
ps4 is rumored to have an a10 apu
xbox 720 is rumored to have a 7000 or 8000 series AMD gpu.
aside from those facts, the fx-83xx is dominating in productivity suites and competing with the $1k cpus from intel.
so you wonder if amd will go out of business in 2 or 3 years? [sarcasm]I predict in 3 years, xbox is coming in '13, ps4 is coming in '14 and after about 100 million units sold, without those from wii u. So in 2015 after all those products they are going to die, 3 is the magic number[/sarcasm]
that's a relatively small answer for such a big question of yours.
[Posted by: Yorgos | Date: 11/30/12 09:40:50 AM]
"Rumors" are quite different from "Facts".
So, for all those numbers you gave us, please tell us where the money is and why AMD is sinking.
Looks like many were stealing money from AMD.
They are doing the same thing now and they are going to leave the carcass to rot.
Don't be fooled by the sudden jump in AMD shares these past 2 days. They sold the building; that is why. So basically AMD is now "homeless".
[Posted by: Avon4Balls | Date: 11/30/12 10:29:10 AM]
Sale-and-leaseback agreements are very common. For example, Peugeot had to do one for its Paris headquarters earlier this year. Btw, by your thinking, all renters are homeless.
[Posted by: linuxlowdown | Date: 11/30/12 05:49:01 PM]
You can't discuss logic and common sense on these boards ever since Avon and his crew joined the scene.
[Posted by: BestJinjo | Date: 12/01/12 08:55:43 AM]
Avon, stop posting nonsense.
Sold the building? You are CLUELESS about basic business concepts. You should stop posting on any business-related topic until you get a degree in business, such is the lack of your knowledge in this area.
Many companies in the world do a corporate sale/leaseback as a strategy to inject immediate cash flow into the business while "exchanging" a high depreciation expense on the income statement for a lease expense, which tends to be smaller, thus increasing earnings/net income. The company then leases back the building over 10-20 years.
"For some corporations, the objective in sale/leaseback transactions is simply to show increased earnings. This is accomplished by converting equity - in the form of buildings with little or no debt - into cash, which may be needed for business expansion or other purposes, such as immediate cash flow infusion for day-to-day operations/payment of obligations.
Sale/leasebacks provide an opportunity to raise cash while maintaining operating control of the property as if the property were still owned by the corporation. The sale/leaseback, in such a case, is an excellent way to raise capital and, therefore, is a financing technique. Annual lease costs can be equated to the interest charged in a traditional financing situation. If a corporation can borrow funds at 10% for 15 years and if a sale/leaseback has an effective lease cost of 9%, then the sale/leaseback can be an effective vehicle."
http://www.conway.com/geofacts/pdf/50910.pdf
Sale leaseback is NOT the same as selling the building which is called "Disposition of real estate assets," whereby the actual legal ownership of the asset is transferred from one party to another.
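The borrowing-versus-leaseback comparison in the quoted passage is simple arithmetic. A minimal sketch, with all dollar amounts and rates hypothetical (not AMD's actual figures):

```python
# Toy model of the financing comparison quoted above: raising the same
# capital by borrowing at 10% vs. a sale/leaseback with a 9% effective
# lease cost. All figures are hypothetical, for illustration only.

def annual_borrowing_cost(principal: float, rate: float) -> float:
    """Interest-only annual cost of a traditional loan."""
    return principal * rate

def annual_lease_cost(sale_price: float, effective_rate: float) -> float:
    """Annual lease payment in a sale/leaseback, as an effective rate."""
    return sale_price * effective_rate

capital = 150_000_000  # hypothetical proceeds from selling a building
borrow = annual_borrowing_cost(capital, 0.10)  # borrow at 10%
lease = annual_lease_cost(capital, 0.09)       # leaseback at 9% effective

print(f"borrow: ${borrow:,.0f}/yr  lease: ${lease:,.0f}/yr  "
      f"saving: ${borrow - lease:,.0f}/yr")
```

On these toy numbers the leaseback raises the same cash at a lower annual cost while the company keeps operating control of the property, which is why the quoted text describes it as a financing technique rather than a plain asset sale.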
Over the course of following your posts, I have realized you haven't got a clue about how businesses function and have a very narrow understanding of finance and accounting concepts, never mind the specific intricacies of the semiconductor space. I don't know what your career is, but you should stop posting about business-related topics, since you don't grasp the concepts.
[Posted by: BestJinjo | Date: 12/01/12 08:37:51 AM]
How is the FX-83xx competing with $1K Intel? Please explain.
[Posted by: maroon1 | Date: 11/30/12 12:40:42 PM]
Why should it? $1K consumer CPUs comprise less than 2% of Intel's sales. Why would a company 60x smaller waste billions trying to make a $1K CPU that >98% of consumers won't buy? In fact, Intel tacks a nearly $500 premium onto the i7-3930K and calls it the X series. Only a clueless Intel fanboy would spend $1K on a 3970X over the 3930K (or someone whose company pays for the CPU for work).
Until people grasp the concept that to manufacture a high-performance CPU you need a massive R&D budget and a class-leading fabrication facility/supplier, they are going to be wasting their time waiting for AMD to compete with Intel above the $225 range. Of course most of us have understood this since 2006, except clueless AMD-haters like Avon and 1234. Here we are nearly 7 years later and people still don't understand how the semiconductor space works.
If I gave you $10 million, you still could not compete with BMW in the car-making business, because just to get started you would spend millions on state-of-the-art manufacturing facilities and R&D and have no money left over for marketing/advertising and so on. AMD can't magically design a faster 32nm CPU than a 22nm CPU. How is this rocket science? Amazing that people still don't understand how a manufacturing node is related to transistor density and performance/watt.
[Posted by: BestJinjo | Date: 12/01/12 08:47:02 AM]
http://www.techpowerup.co...5W-TDP.html?cp=3#comments
that's for both replies to my comment.
@avon4balls amd made a bad comeback with bulldozer; they lost tons of money because of the long r&d and the poor performance that didn't win enough customers. Do you have any proof that ms or sony are going to ditch amd gpus after a non-problematic cooperation over the last 8 years?
I see a processor with great potential to scale up to 50 cores or whatever they want. they are going to get even better ipc in the next generations. I was about to get a 3770k plus a good motherboard, because I compile a lot of stuff and my i7-720qm laptop cpu is weak for my verilog simulations, but I changed my mind after those results.
[Posted by: Yorgos | Date: 11/30/12 02:41:24 PM]
LOL!! Your own link proves that you are wrong about the FX-8350 competing with $1K Intel when it comes to performance.
Not to mention that the user in that link was cherry-picking benchmarks that favor the FX-8350. So even with cherry-picking you fail to prove your previous comment.
[Posted by: maroon1 | Date: 11/30/12 05:18:41 PM]
you are right, and the benchmarks compiled with icc don't favor anyone.
explain why those benchmarks favor amd. are they compiled with amd's compiler? do they use some private amd instruction set? or are you just addicted to trolling?
I am the only one here posting benchmarks and actual numbers, and I am the one who gets all the hatred (a.k.a. thumbs down).
[Posted by: Yorgos | Date: 11/30/12 06:41:39 PM]
The reason I said it favors the FX-8350 is that the guy in that link picked the best-case scenarios for the FX-8350 out of different reviews. He deliberately ignored all other benchmarks, including all lightly-threaded ones.
For example, he could have picked these benchmarks:
http://www.xbitlabs.com/i.../amd-fx-8350/3dsmax-2.png
http://www.xbitlabs.com/i...pu/amd-fx-8350/winrar.png
http://www.xbitlabs.com/i...amd-fx-8350/starcraft.png
http://media.bestofmicro....57628/original/itunes.png
http://media.bestofmicro....27/original/handbrake.png
http://media.bestofmicro....riginal/skyrim%201920.png
Also, why are you ignoring the other part of my comment? You have yet to explain how the FX-8350 performs on par with $1K Intel. The i7-3930K/3960X smokes the FX-8350 in almost every multi-threaded and lightly-threaded benchmark.
[Posted by: maroon1 | Date: 11/30/12 07:56:55 PM]
The link I posted has a dozen benchmarks in which the fx8350, a $200 cpu, is behind by at most 10% in some and ahead of a $1000 cpu in others. I don't see where the "smoking" is.
Also, I talked about productivity, yet you post 2 game benchmarks, one known to favor a particular gpu architecture.
If that isn't a win for a $200 cpu, on a socket that is going to last at least 2 more years, then what is it?
explain to me what the "best case scenario" for amd is. is there a secret compiling trick that favors amd's micro-architecture and does the opposite for intel? do those benchmarks prove that vishera is competing with a 5-times-more-expensive product?
do you know the meaning of productivity suites and unbiased program compilation?
http://openbenchmarking.o...lt/1210227-RA-AMDFX835085
this link provides a bunch of productivity benchmarks, with programs compiled with oss compilers and run on an oss o.s., favoring no architecture.
[Posted by: Yorgos | Date: 11/30/12 08:34:51 PM]
GCC has a heavy but not very noticeable Intel bias. The underlying descriptors that tell the compiler what to look for in an architecture are much better in Intel's case than in AMD's. Intel's GCC optimizations are tuned with real-world tests, while AMD's GCC optimizations are not tested well, or there is a lot of copy-and-paste. The best compiler to use for AMD is VS11 (2012); better yet, Windows 8 is the best platform to run Bulldozer/Piledriver on.
[Posted by: seronx | Date: 12/01/12 02:20:05 AM]
"The link I posted has a dozen of benchmarks that fx8350, a 200$ cpu, is behind in some benchmarks at most 10%"
At most 10%?! That comment proves how dishonest you are.
Here it wins by 40% against FX-8350
http://img.techpowerup.org/121023/Capture041.jpg
Here it wins by 43%
http://img.techpowerup.org/121026/Capture058.jpg
Both of those benchmarks were taken from your own biased link. In most of the other benchmarks, it was over 20% faster. And yet you say 10% at most ?!! Stop being dishonest.
[Posted by: maroon1 | Date: 12/01/12 05:05:26 AM]
What kind of an argument is that, a $195 CPU competing with a $999 offering? Let's compare a 3 Series BMW to a Porsche 911 GT2 RS, shall we?
For what it is, FX8350 is actually very decent. It's a good alternative to i5-3570K for those who need a CPU for multi-threaded tasks. Us gamers will choose i5-3570K but I know if I was doing video encoding or rendering, I'd take FX8350 in a heartbeat at $195 over an i5.
In relative standing to i7, the FX8350 is no worse than X1100 X6 was to Lynnfield and SB i7s. If all you care about is games, then AMD hasn't been worth buying for gamers since 2006, way before Bulldozer/Vishera. This constant anti-AMD hate is more than just Bulldozer. Just sayin'.
[Posted by: BestJinjo | Date: 12/01/12 08:52:30 AM]
Yes, the console angle was the first thing which crossed my mind. Unless the new consoles fail miserably, AMD should be able to stay alive thanks to them.
Also it's rumoured that the next version of Surface Pro from Microsoft will feature an AMD APU.
[Posted by: ET3D | Date: 12/02/12 06:11:14 AM]
3.
Why equate instruction set with performance levels or power consumption? While the overall design is affected somewhat by the instruction set, I think it's mostly that, historically, x86 CPUs went for maximum performance while ignoring power, and ARM went for more or less the opposite. With the number of transistors to spare today, I assume x86 instruction decoding isn't a big undertaking.
x86 companies have recently started focusing more on power, and ARM's performance is approaching what is sufficient for normal computing while still targeting low power.
If Intel were to produce a Pentium 3 at 1GHz on current process technology, it would be very low power, but x86 just the same. If someone were to produce an ARM CPU with more execution units and higher clocks, it would enter desktop-range power consumption and performance.
[Posted by: sanity | Date: 11/30/12 06:43:47 AM]
4.
Intel is making x86 important. AMD is making x86 unimportant.
[Posted by: Tristan | Date: 11/30/12 12:09:08 PM]
Stop trolling.
[Posted by: linuxlowdown | Date: 11/30/12 05:51:21 PM]
5.
fingers ... hand ... getting the picture ?
[Posted by: ratnik | Date: 11/30/12 06:59:48 PM]
With all due respect, who can understand your psychotic thinking?
[Posted by: linuxlowdown | Date: 11/30/12 11:20:55 PM]
6.
The portable appliance market is just another revenue stream for AMD. In this economy it's smart to diversify.
As for performance x86 desktop solutions, AMD has been planning for over five years to eventually offer APUs in the entry, mid-level and high-end desktop segments. This is a smart and cost-effective approach that will meet most PC users' needs very well, use less power, produce less heat and deliver equal or better performance than all but the very highest-priced discrete CPU/GPU combos.
Consumers win and fanbois lose.
[Posted by: beenthere | Date: 11/30/12 07:29:22 PM]
Moron! AMD will be irrelevant in the x86 market in 3-5 yrs. They're slowly abandoning the market that built AMD in the first place! The plan to go to ARM is their exit plan!!!
[Posted by: dudde | Date: 12/01/12 04:05:53 PM]
7.
Just give me a CPU socket that will last for 3 years, and a commitment to develop at least mainstream x86 CPUs with no overclocking or ISA restrictions, and I will pledge allegiance to AMD even if they try to kick me in the nuts.
[Posted by: Marburg U | Date: 12/01/12 12:53:24 PM]
They did kick you in the nuts and still you keep licking their balls! they will give you nothing!
[Posted by: dudde | Date: 12/01/12 10:33:30 PM]
Intel is the one to put you into slavish bondage. And fanbois like you get off whilst paying the high prices for their services. Y'all do anything masochistic to feel the Intel inside.
[Posted by: linuxlowdown | Date: 12/02/12 12:39:53 AM]
go get a real job moron! if you want to buy anything, you need to earn something to get it.. or are you still sucking your mother's tits for snacks?!?
[Posted by: dudde | Date: 12/14/12 07:16:14 PM]
8.
25-30 years sounds realistic. The title of this article is another way of saying that quantum computing will replace x86 and reach the mainstream in 25-30 years. Intel's current CEO gave the same answer in an interview: a 25+ year technology.
AMD's future is not only in x86 but in ARM-based designs as well, and of course in their exciting and promising HSA/GPGPU tech. The company still generates multi-billion-dollar revenue as of 2011. The real reason AMD is so far behind Intel is its lack of access to cutting-edge process technology, which has become insanely difficult and expensive to develop since the migration to the 45nm node and beyond.
[Posted by: texasti | Date: 12/03/12 12:42:04 PM]