
Although Advanced Micro Devices recently announced plans to design various systems-on-chip based on the ARM architecture, the company believes that the x86 instruction set will remain very important for the industry for decades to come. Quite possibly, the claim was made to debunk rumours that AMD plans to halt development of high-performance x86 chips in a bid to concentrate on low-power products.

“There are no doubts that x86 is going to be a huge portion of our business. I think that it is going to be an important segment of our business for 5 – 10+ years. The x86 is going to be here long after I am retired. […] There will be x86 applications just like there are mainframe applications today, 25 – 30 years later. That is not going to fundamentally change,” said Rory Read, chief executive of AMD, at the Credit Suisse Technology Conference earlier this week.

Throughout its history, the x86 architecture has fought many battles, some of which determined the future of the whole computing industry. x86 became the de facto microprocessor standard for servers, which replaced mainframes and essentially killed off proprietary processor architectures. The vast majority of x86's wins came down to high performance at relatively low cost.

But today, x86 faces tough competition from the ARM architecture, which is more energy efficient and lets chip designers build extremely low-cost chips that consume only a fraction of the power of x86-based solutions. As it turns out, in many applications very high performance is not needed, but low power consumption is crucial. As a result, ARM-based microprocessors are now taking revenue share away from x86.

It is quite obvious that the need for high performance will remain: there are complex programs that process huge amounts of data, there are sophisticated video games, there is video-editing software, and there are many other applications that take advantage of high-end chips relying on the x86 instruction set. In fact, the performance gap between ARM and x86 chips will remain rather large for years to come.

AMD understands that very well, which is why its chief executive, who has openly said that he does not want to compete against Intel Corp. for maximum performance, believes that x86 is here to stay. Perhaps the claim also means that AMD will continue to develop high-end x86 solutions, even though it is tremendously hard for the company to remain competitive in that market segment.

Tags: AMD, x86, Bulldozer, Piledriver, Excavator, Steamroller, Intel, ARM, Cortex

Discussion

Comments currently: 49
Discussion started: 11/30/12 04:29:47 AM
Latest comment: 12/14/12 07:16:14 PM


1. 
show the post
12 18 [Posted by: Avon4Balls  | Date: 11/30/12 04:29:47 AM]

 
x86 will be around because it's needed by scientists, industry and government agencies to run powerful software. ARM is good for Facebook and the like. Intel and AMD will always have business for the x86 architecture. But both companies are being forced to diversify their portfolios to meet the fragmentation of the market, led primarily by the demand for portable handheld internet devices.

Oh, if you didn't read, it was Rory Read who you agree with. Fancy that, Intel troll.
16 15 [Posted by: linuxlowdown  | Date: 11/30/12 05:10:45 AM]
 
AMD is trying hard to diversify! But what did they get? They lost focus on their core product! They lost their identity in the market that made them a leader a decade ago! AMD is taking the path towards becoming a second-rate custom chip maker! And in a few more years, they will be scraps to be sold off to the Chinese market!
9 10 [Posted by: dudde  | Date: 12/01/12 10:41:06 PM]
 
Utter verbal diarrhoea.
10 10 [Posted by: linuxlowdown  | Date: 12/02/12 12:20:47 AM]
 
show the post
1 6 [Posted by: dudde  | Date: 12/14/12 07:14:00 PM]

2. 
AMD Expects x86 to Remain Important Architecture for 25 – 30 Years
Of course, it will.
Will AMD stay in the x86 business for at least another 2-3 years? That's a big question.
14 15 [Posted by: Azazel  | Date: 11/30/12 06:33:48 AM]

 
show the post
5 9 [Posted by: Yorgos  | Date: 11/30/12 09:40:50 AM]
 
show the post
10 15 [Posted by: Avon4Balls  | Date: 11/30/12 10:29:10 AM]
 
Leaseback agreements are very common. For example, Peugeot had to do one for their Paris headquarters earlier this year. Btw, by your thinking, all renters are homeless.
12 13 [Posted by: linuxlowdown  | Date: 11/30/12 05:49:01 PM]
 
show the post
10 14 [Posted by: BestJinjo  | Date: 12/01/12 08:55:43 AM]
 
Avon, stop posting nonsense.

Sold the building? You are CLUELESS about basic business concepts. You should stop posting on any business-related topic until you get a degree in business, such is your lack of knowledge in this area.

Many companies in the world do a corporate sale/leaseback as a strategy to inject immediate cash flow into the business while "exchanging" a high depreciation expense on the income statement for a lease expense, which tends to be smaller, thus increasing earnings/net income. The company then leases back the building over 10-20 years.

"For some corporations, the objective in sale/leaseback transactions is simply to show increased earnings. This is accomplished by converting equity - in the form of buildings with little or no debt - into cash, which may be needed for business expansion or other purposes, such as immediate cash flow infusion for day-to-day operations/payment of obligations.

Sale/leasebacks provide an opportunity to raise cash while maintaining operating control of the property as if the property were still owned by the corporation. The sale/leaseback, in such a case, is an excellent way to raise capital and, therefore, is a financing technique. Annual lease costs can be equated to the interest charged in a traditional financing situation. If a corporation can borrow funds at 10% for 15 years and if a sale/leaseback has an effective lease cost of 9%, then the sale/leaseback can be an effective vehicle."
http://www.conway.com/geofacts/pdf/50910.pdf
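The financing comparison in the quoted passage can be sketched as rough arithmetic. The 10% borrowing rate and 9% effective lease cost are the passage's own figures; the $50M building value is a made-up number purely for illustration:

```python
# Sketch of the sale/leaseback trade-off described in the quote above.
# The rates (10% traditional borrowing vs. 9% effective lease cost) come
# from the quoted passage; the $50M building value is hypothetical.
building_value = 50_000_000

annual_interest = building_value * 0.10  # carrying cost of traditional financing
annual_lease = building_value * 0.09     # carrying cost under the sale/leaseback

savings = annual_interest - annual_lease
print(f"Annual interest at 10%: ${annual_interest:,.0f}")
print(f"Annual lease at 9%:     ${annual_lease:,.0f}")
print(f"Annual saving:          ${savings:,.0f}")
```

On these assumed numbers the leaseback carries roughly $500,000 less per year than borrowing, while also converting the building's equity into immediate cash, which is the point the quote makes.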

A sale/leaseback is NOT the same as selling the building outright, which is called "disposition of real estate assets," whereby the actual legal ownership of the asset is transferred from one party to another.

Over the course of following your posts, I have realized you haven't got a clue about how businesses function and have a very narrow understanding of finance and accounting concepts, never mind the specific intricacies of the semiconductor space. I don't know what your career is, but you should stop posting about business-related topics since you don't grasp the concepts.
12 14 [Posted by: BestJinjo  | Date: 12/01/12 08:37:51 AM]
 
How is the FX-83xx competing with a $1K Intel? Please explain.



7 4 [Posted by: maroon1  | Date: 11/30/12 12:40:42 PM]
 
show the post
11 14 [Posted by: BestJinjo  | Date: 12/01/12 08:47:02 AM]
 
http://www.techpowerup.co...5W-TDP.html?cp=3#comments

that's for both the replies of my comment.

@avon4balls AMD made a bad comeback with Bulldozer; they lost tons of money because of the long R&D and the poor performance that didn't win enough customers. Do you have any proof that MS or Sony are going to ditch AMD GPUs after a non-problematic cooperation over the last 8 years?

I see a processor with great potential to scale up to 50 cores or whatever they want. They are going to achieve even better IPC in the next generations. I was about to get a 3770K plus a good motherboard, because I compile a lot of stuff and my i7-720QM laptop CPU is too weak for my Verilog simulations, but I changed my mind after those results.

2 4 [Posted by: Yorgos  | Date: 11/30/12 02:41:24 PM]
 
LOL!! Your own link proves that you are wrong about the FX-8350 competing with a $1K Intel when it comes to performance.

Not to mention that the user in that link was cherry-picking benchmarks that favor the FX-8350. So even with cherry-picking, you fail to prove your previous comment.
2 4 [Posted by: maroon1  | Date: 11/30/12 05:18:41 PM]
 
You are right, and the benchmarks compiled with ICC don't favor anyone.
Prove to me why those benchmarks favor AMD. Are they compiled with AMD's compiler? Are they using some AMD-private instruction set? Or are you just addicted to trolling?
I am the only one here posting benchmarks and actual numbers, and I am the one who gets all the hatred (a.k.a. thumbs down).
3 5 [Posted by: Yorgos  | Date: 11/30/12 06:41:39 PM]
 
The reason I said it favors the FX-8350 is that the guy in that link picked the best-case scenarios for the FX-8350 out of different reviews. He deliberately ignored all other benchmarks. Also, he ignored all lightly-threaded benchmarks.

For example, he could have picked these benchmarks:
http://www.xbitlabs.com/i.../amd-fx-8350/3dsmax-2.png
http://www.xbitlabs.com/i...pu/amd-fx-8350/winrar.png
http://www.xbitlabs.com/i...amd-fx-8350/starcraft.png
http://media.bestofmicro....57628/original/itunes.png
http://media.bestofmicro....27/original/handbrake.png
http://media.bestofmicro....riginal/skyrim%201920.png

Also, why are you ignoring the other part of my comment? You have yet to explain how the FX-8350 performs on par with a $1K Intel. The i7-3930K/3960X smokes the FX-8350 in almost every multi-threaded and lightly-threaded benchmark.





3 3 [Posted by: maroon1  | Date: 11/30/12 07:56:55 PM]
 
show the post
2 5 [Posted by: Yorgos  | Date: 11/30/12 08:34:51 PM]
 
GCC has a heavy but not very noticeable Intel bias. The underlying descriptors that tell the compiler what to look for in an architecture are much better in Intel's case than in AMD's. Intel's GCC optimizations are done with real-world tests, while AMD's GCC optimizations are not tested well, or there is a lot of copy-and-paste. The best compiler to use for AMD is VS11 (2012); better yet, Windows 8 is the best platform to run Bulldozer/Piledriver on.
3 1 [Posted by: seronx  | Date: 12/01/12 02:20:05 AM]
 
The link I posted has a dozen of benchmarks that fx8350, a 200$ cpu, is behind in some benchmarks at most 10%,


At most 10%?!! This comment proves how dishonest you are.

Here it wins by 40% against the FX-8350:
http://img.techpowerup.org/121023/Capture041.jpg

Here it wins by 43%:
http://img.techpowerup.org/121026/Capture058.jpg

Both of those benchmarks were taken from your own biased link. In most of the other benchmarks, it was over 20% faster. And yet you say 10% at most?!! Stop being dishonest.



2 3 [Posted by: maroon1  | Date: 12/01/12 05:05:26 AM]
 
What kind of an argument is that -- a $195 CPU competing with a $999 offering? Let's compare a 3 Series BMW to a Porsche 911 GT2 RS, shall we?

For what it is, the FX-8350 is actually very decent. It's a good alternative to the i5-3570K for those who need a CPU for multi-threaded tasks. We gamers will choose the i5-3570K, but I know that if I were doing video encoding or rendering, I'd take the FX-8350 at $195 over an i5 in a heartbeat.

In relative standing to the i7, the FX-8350 is no worse than the Phenom II X6 1100T was to Lynnfield and SB i7s. If all you care about is games, then AMD hasn't been worth buying for gamers since 2006, way before Bulldozer/Vishera. This constant anti-AMD hate is about more than just Bulldozer. Just sayin'.
11 12 [Posted by: BestJinjo  | Date: 12/01/12 08:52:30 AM]
 
Yes, the console angle was the first thing which crossed my mind. Unless the new consoles fail miserably, AMD should be able to stay alive thanks to them.

Also it's rumoured that the next version of Surface Pro from Microsoft will feature an AMD APU.
0 2 [Posted by: ET3D  | Date: 12/02/12 06:11:14 AM]

3. 
show the post
3 13 [Posted by: sanity  | Date: 11/30/12 06:43:47 AM]

4. 
Intel is making x86 important. AMD is making x86 unimportant.
10 8 [Posted by: Tristan  | Date: 11/30/12 12:09:08 PM]

 
Stop trolling.
12 14 [Posted by: linuxlowdown  | Date: 11/30/12 05:51:21 PM]

5. 
show the post
1 5 [Posted by: ratnik  | Date: 11/30/12 06:59:48 PM]

 
With all due respect, who can understand your psychotic thinking?
11 11 [Posted by: linuxlowdown  | Date: 11/30/12 11:20:55 PM]

6. 
The portable appliance market is just another revenue stream for AMD. In this economy, it's smart to diversify.

As for performance x86 desktop solutions, AMD has been planning for over five years to eventually offer APUs in the entry-level, mid-range and high-end desktop segments. This is a smart and cost-effective approach that will meet most PC users' needs very well, use less power, produce less heat and deliver equal or better performance than all but the very highest-priced discrete CPU/GPU combos.

Consumers win and fanbois lose.
13 9 [Posted by: beenthere  | Date: 11/30/12 07:29:22 PM]

 
show the post
8 12 [Posted by: dudde  | Date: 12/01/12 04:05:53 PM]

7. 
Just give me a CPU socket that will last for 3 years, and commit to developing at least mainstream x86 CPUs with no overclocking or ISA restrictions, and I will pledge allegiance to AMD even if they try to kick me in the nuts.
1 3 [Posted by: Marburg U  | Date: 12/01/12 12:53:24 PM]

 
show the post
8 11 [Posted by: dudde  | Date: 12/01/12 10:33:30 PM]
 
Intel is the one putting you into slavish bondage. And fanbois like you get off whilst paying high prices for their services. Y'all would do anything masochistic to feel the Intel inside.
10 9 [Posted by: linuxlowdown  | Date: 12/02/12 12:39:53 AM]
 
show the post
0 6 [Posted by: dudde  | Date: 12/14/12 07:16:14 PM]

8. 
25-30 years sounds realistic. The title of this article is another way of saying that quantum computing will replace x86 and reach the mainstream in 25-30 years. Intel's current CEO gave the same answer in an interview: a 25+ year technology.

AMD's future is not only in x86 but in ARM-based designs as well, and of course in their exciting and promising HSA/GPGPU tech. The company still generated multi-billion-dollar revenue as of 2011. The real reason AMD is so far behind Intel is its lack of access to cutting-edge process technology, which has become insanely difficult and expensive to develop since the migration to the 45nm node and beyond.
0 1 [Posted by: texasti  | Date: 12/03/12 12:42:04 PM]
