
Advanced Micro Devices and ARM Holdings on Monday announced an initiative that promises to change the datacenter industry. Under the terms of the agreement, AMD will offer datacenter-class microprocessors based on both the ARM and x86 architectures. The first AMD Opteron chips based on the ARM architecture and the AMD/SeaMicro Freedom fabric are projected to emerge in 2014.

"AMD led the data center transition to mainstream 64-bit computing with AMD64, and with our ambidextrous strategy we will again lead the next major industry inflection point by driving the widespread adoption of energy-efficient 64-bit server processors based on both the x86 and ARM architectures. Through our collaboration with ARM, we are building on AMD's rich IP portfolio, including our deep 64-bit processor knowledge and industry-leading AMD SeaMicro Freedom supercompute fabric, to offer the most flexible and complete processing solutions for the modern data center," said Rory Read, president and chief executive officer at AMD.

AMD Set to Design 64-Bit ARM Server Processors, Will Not Drop x86

AMD's Opteron microprocessors based on the x86 and ARM architectures will integrate the high-performance, production-proven Freedom fabric originally designed at SeaMicro. The Freedom fabric can connect thousands of processor cores, memory, storage and input/output devices at speeds of up to 1.28Tb/s (160GB/s). SeaMicro's fabric supports multiple processor instruction sets, which makes it compatible with both AMD's x86 and ARM technologies.

The integration of the fabric into central processing units and eventually enterprise-class accelerated processing units will allow AMD to address different server markets with largely unified offerings that deliver different levels of performance and power consumption. AMD does not appear to have plans to put x86 and ARM cores into the same server chips, but it will develop common building blocks to be used for both x86 and ARM system-on-chips/microprocessors as well as various accelerated processing units. Essentially, AMD plans to integrate its leading-edge stream processing capabilities, advanced Freedom fabric and other innovative technologies into ARM-based server chips so that it can quickly become a leading player in a market that is about to enter a period of rapid growth.

AMD stresses that the x86 architecture is not going to disappear from servers or PCs in the coming years. Therefore, the chip designer will continue to develop x86 Opterons in addition to ARM Opteron products. It is also worth noting that AMD does not rule out creating a server chip based on its Jaguar ultra-low-power x86 microarchitecture. It is possible that in the future AMD's Opteron lineup will include processors based on three different architectures: high-performance x86 (Steamroller, Excavator), ultra-low-power x86 (Jaguar) and high-performance ARM (ARMv8 and successors).

"Over the past decade the computer industry has coalesced around two high-volume processor architectures – x86 for personal computers and servers, and ARM for mobile devices. Over the next decade, the purveyors of these established architectures will each seek to extend their presence into market segments dominated by the other. The path on which AMD has now embarked will allow it to offer products based on both x86 and ARM architectures, a capability no other semiconductor manufacturer can likely match," observed Nathan Brookwood, research fellow at Insight 64.

 

Offering different products with different architectural peculiarities for different markets is part of AMD's ambidextrous strategy. It remains to be seen how successful AMD will be with such different offerings and whether internal competition between ARM and x86 [and potential cannibalization of the latter by the former] will do more harm than good.

Datacenter Owners and Software Designers Eager to Port Programs to ARM Architecture

Many large server customers find that they do not necessarily take advantage of all the performance their servers offer. In some cases, the cost of datacenter equipment and hardware equals the cost of the electricity it consumes over its lifetime. Therefore, cutting power consumption is crucial for the majority of datacenter owners. Although most server software is not yet compatible with the 64-bit ARMv8 architecture, many big server customers, such as Facebook, are eager to port their applications to non-standard platforms themselves in order to save on energy costs.

AMD claims that it is working with several hardware OEMs and numerous software makers to ensure that the right ecosystem exists for x86 and ARM server central processing units. AMD and its partners believe that software developers will quickly port their programs to the 64-bit ARMv8 architecture and that the slow transition to x86-64 will not repeat itself.
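
To give a sense of what such a port typically involves at the source level, consider the minimal, hypothetical C sketch below; it is not taken from AMD or any of its partners. Portable server code generally just recompiles for ARMv8, while architecture-specific paths such as hand-tuned assembly or SIMD intrinsics are isolated behind standard compiler macros like __x86_64__ and __aarch64__:

#include <stdio.h>

int main(void)
{
    /* Portable logic compiles unchanged on both architectures; only
       hand-optimized code paths need per-architecture branches like
       these standard GCC/Clang predefined macros. */
#if defined(__x86_64__)
    puts("Built for 64-bit x86 (e.g. an x86 Opteron)");
#elif defined(__aarch64__)
    puts("Built for 64-bit ARM (ARMv8/AArch64)");
#else
    puts("Built for another architecture");
#endif
    return 0;
}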

Rory Read, chief executive officer of AMD, believes that ARM-based servers are going to capture a double-digit percentage of the server market in three to five years. Mr. Read's optimism regarding ARM-based servers is shared by numerous high-ranking executives across the industry. The actual penetration of ARM into the commercial server segment will depend on the availability of compatible software and hardware. But software makers will only port their applications if there is a market for them, which means actual deployments. So, at the end of the day, the success of ARM and AMD in servers once again depends on software developers…

Tags: AMD, ARM, Opteron, SeaMicro, Freedom, Steamroller, Excavator, Jaguar, ARMv8

Discussion

Comments currently: 111
Discussion started: 10/29/12 02:19:52 PM
Latest comment: 11/06/12 02:19:35 PM


1. 

 
That goes to show just how small your level of thinking is. Doesn't mean anything for consumers, as if. You do realize everything that happens in the big business world, all the technology employed, researched, and produced for them, eventually works its way down to consumer-grade products right?
5 6 [Posted by: mmstick  | Date: 10/29/12 02:27:49 PM]
Reply
 
 
And where's your second troll head? It hasn't popped up yet today. I was expecting some of the ol' troll tag teaming with this news. Now go find a bridge to hide under, perhaps a sandy or ivy one. I will trade barbs with you any day of the week. But don't forget to try to make intelligent comments on the articles.
5 6 [Posted by: linuxlowdown  | Date: 10/29/12 03:24:01 PM]
Reply
 
 
All good silicon needs some burning in. Hope you got a good heatsink like Noctua or Thermaltake.
6 3 [Posted by: linuxlowdown  | Date: 10/29/12 03:44:33 PM]
Reply
 
 
Even a high school kid can solder, get real. Every computer repair kit has a good pair of soldering tools. Also, that's a bad idea, for various reasons. You have nothing to gain from that, but a lot to potentially lose.
5 4 [Posted by: mmstick  | Date: 10/29/12 04:12:38 PM]
Reply
 
 
A lot of us bought AMD's CPUs when they were better during A64/X2 era but we don't sit here and trash talk AMD's inability to compete with Intel knowing the company is 75x smaller. AvON, if someone gave you $10 million, you still couldn't make me better ice cream than Haagen Dazs.

No one cares what you'll do with your delidded IVB since in 7 months a $225 Haswell will crush it.
6 4 [Posted by: BestJinjo  | Date: 10/29/12 06:14:02 PM]
Reply
 
 
Does anyone else see the irony in this statement? It says a lot about your character, Avon, if you think you are cool buying two Intel CPUs back to back even though that is completely unnecessary, overkill, and a waste of money. Intel brainwashing at its finest lol
4 1 [Posted by: veli05  | Date: 10/30/12 08:37:21 AM]
Reply
 
It's not hard to see irony and contradiction all over Avon and 123's posts. They type so many comments like this that, psychologically they must have conditioned themselves to believe in their own words.
2 1 [Posted by: mmstick  | Date: 10/30/12 07:52:42 PM]
Reply
 
 
 
In everything?

Notice how the average performance of a $199 Vishera is higher than i5-3570K?
http://techreport.com/rev...350-processor-reviewed/14

How can that be if IVB kills it in "everything"?

Stop trolling and get off this awesome website.
5 5 [Posted by: BestJinjo  | Date: 10/29/12 05:50:46 PM]
Reply
 
Actually, 123 is correct that Ivy Bridge beats Vishera/Piledriver. That graph only shows overall performance: if you do all the tasks that TechReport tested, that is your overall performance, but not everybody will be doing all of those tasks on a daily basis, only part of them. Click on gaming and things get into perspective. Taking that gaming graph as a measure of performance vs. price, the i5-3570K is actually the better buy and the better value overall.

I have an i3-3225 and it may score low compared to the A10-5800K, but the gaming graph shows a completely different story. I do not do any gaming, but the i3-3225 has equal performance to an FX-8350 and the i3-3225 is cheaper.

Comparing the FX-8350 to an i3-3225 on power consumption, the i3 wins. Even the i5-3570K has lower power consumption than the FX-8350.

Saying that FX-8350 is best at multi-threading will be AMD light bulb of the day. The amount of programs that are multi-threaded are at low numbers, so multi-threading does not count.
3 5 [Posted by: tecknurd  | Date: 10/29/12 11:37:32 PM]
Reply
 
"The amount of programs that are multi-threaded are at low numbers, so multi-threading does not count."

I'm sorry? There are tons of multithreaded software in existence. In fact, it's rare to find a program that isn't multithreaded. Even modern games are getting well threaded.

There is another factor too, a factor benchmarks can't measure, responsiveness of the system. I've used i7 systems, and FX systems. FX systems feel smoother, a lot smoother with applications and multitasking. Look tecknurd, people who need the power and buy products like FX-8350 are already power users who do a lot of multitasking. These aren't people from the dawn of GUI OS's who still use one program at a time.
3 3 [Posted by: mmstick  | Date: 10/30/12 02:04:54 AM]
Reply
 
Except the sub $200 processor is trading blows with the 3770k on Linux.
3 4 [Posted by: mmstick  | Date: 10/29/12 09:35:57 PM]
Reply
 
 
Don't you have something better to do? You know, with as much time and effort you spend on these articles, one would wonder if you are paid to do this.
5 5 [Posted by: mmstick  | Date: 10/29/12 05:20:55 PM]
Reply
 
Nah, he's not paid. He does it during his lunch time at Intel. Dude, you need to leave your desk sometimes.
2 3 [Posted by: linuxlowdown  | Date: 10/29/12 11:44:42 PM]
Reply
 
I missed you troll. You better finally sell your intel shares fast on this news. Intel are going down town. The Atom will be smashed.
2 2 [Posted by: linuxlowdown  | Date: 10/29/12 11:41:03 PM]
Reply
 
Apparently you completely missed the point... again.... I think it's time you got off this site and did some research. Let's start from the top, shall we?

Where did I state that consumers are buying server grade products? All technology starts in the server world, all the way back to the ENIAC, and even the Internet. Going back farther, people didn't see the need for telephones as they were only for big important businesses and government facilities, but eventually that tech made its way down to the consumer world. The same thing goes for the invention of electricity, where people said they were fine with their kerosene lamps, but eventually even that made its way down to the consumer world. The same thing is true for all technology involved in computers. RAM, motherboards, protocols, standards, everything begins in the server world. Server products are server products because of the premium you are paying for higher-end technology. After the technology has had its use in the server world and the servers are moving to bigger and better things, you find server technology trickling down to consumer products. Let me guess, you are using a multicore processor, where do you think the first multicore processor came from? It was an AMD Opteron server processor.
3 4 [Posted by: mmstick  | Date: 10/29/12 04:18:20 PM]
Reply
 
 
Can you please remind us about what this conversation was already about? Instead of ignoring the point of the discussion and bashing both me and AMD, can you get back to the point? Oh, that's right, you don't read comments, you just hit the thumbs down button on everyone and comment on how much you hate AMD.

You have this pattern of derailing the topic as soon as you have lost the argument.
3 3 [Posted by: mmstick  | Date: 10/29/12 05:16:43 PM]
Reply
 
All his responses will lead to the same conclusion: AMD sucks, they failed, blah blah blah.

Then he goes out and buys IVB, what a joke. 7 months before Haswell launches, he wastes $ on a SB refresh that needs delidding to even make it worthwhile over Sandy. He doesn't even sound like an intelligent Intel consumer. Who would be stupid enough to buy into Socket 1155 at this point when it's near EOL as early as June 2013? He can't even do simple things like timing his Intel CPU upgrades properly, and yet he criticizes AMD's strategy.
5 5 [Posted by: BestJinjo  | Date: 10/29/12 05:45:52 PM]
Reply
 
 
 
You never answered me. You said this development has no effect on the end user; I explained that yes it does and how, and you did not respond (you lose by default then). Face it, the events here are big news for everyone, and the fruits of this will make their way down to consumers in the near future whether you like it or not.

I'm sorry you are upgrading a perfectly good processor to something that isn't really an upgrade, even for gaming, more or less a sidegrade if anything. Meanwhile, I have an army of ten FX-8120's in a cluster configuration that can eat your little 'ivy' alive in performance per watt and sheer computational power. Unlike you, if I buy hardware, I need a lot of power, not one computer, not two, but as many as money can buy.
1 2 [Posted by: mmstick  | Date: 10/29/12 09:51:36 PM]
Reply
 
That's impressive. What do you do with it?
2 2 [Posted by: linuxlowdown  | Date: 10/29/12 11:27:47 PM]
Reply
 
More than your brain could grasp. A lot of batch jobs involving x264, neroaacenc, ffmpeg, various types of servers from web hosting, game hosting, SAMBA and FTP file servers, email, or whatever else I need to do at the moment. Idle cycles are spent in distributed computing projects to help medical researchers. As I've said earlier, my home network is more advanced than a typical corporate network. The power consumption really isn't that bad, it's actually quite low. Desktop hardware is good enough to be used as server hardware these days, especially FX processors.
1 2 [Posted by: mmstick  | Date: 10/30/12 02:12:41 AM]
Reply
 
No, I grasp that - I checked the logs and my brain reported it was ok the last time it checked itself. I thought you might do Folding@Home with your cluster.
3 1 [Posted by: linuxlowdown  | Date: 10/30/12 05:19:16 AM]
Reply
 
 
 
Compared to Sandy Bridge, Ivy Bridge is about comparable in overclocking headroom. That is why it is a sidegrade.

In any case, the frequency potential of the new Ivy Bridge processors turned out to be below our expectations. We didn’t manage to overclock them even to the same heights as the previous-generation Sandy Bridge. So, we can state that the overclocking potential of the newcomers has become worse, which may have been caused by the reduction of the geometrical die size of the new Ivy Bridge. Its overall size is 25% smaller than the Sandy Bridge die, and the computational cores have become only half the size of the Sandy Bridge cores. However, contemporary approach to processor die cooling doesn’t allow increasing the heat flow density in equal proportion, which causes local overheating of some parts of the processor cores during overclocking. High operational CPU core frequencies indirectly confirm that this problem indeed exists, although the processor cooler remained practically cold in this case.


http://www.xbitlabs.com/a...70k-i5-3570k_9.html#sect0

That is straight from X-bit's review of the 3770K and 3570K. Read up on the tech before you make uninformed statements about it.
1 1 [Posted by: veli05  | Date: 10/30/12 08:43:14 AM]
Reply
 
 
ROFL, 20-30FPS? From what, 150FPS to 180FPS? Get real, your monitor doesn't even refresh fast enough for that kind of framerate. I'm sorry Avon, but I'm an avid gamer myself. I bought a $500 GPU to go along with my FX-8120, and I own a Steam list of over 400 games. There isn't a single game that I own where the framerate is less than the refresh rate of my 60Hz 1920x1200 monitor, and that includes the most demanding game on this planet, Shogun 2.

You speak of making upgrades wisely, yet you are replacing a perfectly good CPU with something that isn't that much better. You want to talk about fanboyism ideas and religions. Take a look in the mirror Intel fanboy, Intel isn't like a religion to you, it IS your religion.
2 3 [Posted by: mmstick  | Date: 10/30/12 02:08:00 AM]
Reply
 
 
Get real. My system used to be an overclocked Phenom II X6 1100T, then I upgraded to this FX-8120; there was a decent improvement in framerate, but more importantly better responsiveness. Your problem is that you have a 6950 and not a 7950; it has nothing to do with your CPU, and improving your CPU isn't going to magically improve your framerates more than you already get.

Avon, you are way too contradictory: first you ignore my comments, then you say I claim idiotic things, and then you go off on the fanboy wagon. ROFL, didn't you bother to read my comment? I game on a 28-inch 1920x1200 monitor. I may not get 150FPS in games like Shogun 2, but having an Intel CPU wouldn't magically get me there because graphics cards aren't strong enough. No, what you are stating, that 20-30FPS improvement, is ONLY possible to see in a benchmark when you are running a game at a really low resolution and you are already getting high framerates like 150. The joke's on you, Avon. In fact, even in those 'CPU tests', where they put the games at low resolution in order to test how effectively a CPU can utilize a GPU in these low-demand environments, the difference with AMD is still so small, perhaps 5%, that any overclock overcomes it, therefore making your FPS point moot.

Name one game that I cannot get at least 60FPS in with my 7950 and FX-8120@4GHz, I dare you. Shogun 2 is the most demanding game on the market, and I have no problems running it perfectly smoothly. BF3 is so well threaded that even using Intel has 0% advantage. Please enlighten me about these nonexistent games of yours that I cannot run well; I'd like to play them.
2 1 [Posted by: mmstick  | Date: 10/30/12 03:41:30 AM]
Reply
 
 
Really now, out of all games you pick one with the least need for CPU. F1 2012 is not a demanding game at all, my FPS does not go below 100 with 1920x1200 8xMSAA on ultra. Even if I were to get more than 100 FPS I wouldn't be able to tell the difference because my refresh rate is only 60Hz.
2 2 [Posted by: mmstick  | Date: 10/30/12 04:21:09 PM]
Reply
 
 
AMD FANBOI, you should have gone for an Intel i5-3570K. It's far more powerful than ANYTHING AMD WILL EVER HAVE.
1 2 [Posted by: 123  | Date: 10/31/12 12:55:42 PM]
Reply
 
Thanks for stating the obvious Mr. Obvious:

AMD Market cap = 1.46B
Intel Market cap = 109.82B

You and others here seem to be in denial, expecting a company 75x smaller to compete with Intel in the high-end x86 CPU space. AMD doesn't even have its own fabs. Get real.

Having said that $199 Vishera offers great performance/$ overall, landing between i5-3470 and i7-2600K in performance:
http://techreport.com/rev...350-processor-reviewed/14

^That's not bad for a company 75x smaller, in the eyes of anyone who isn't delusional about where AMD stands in terms of company size/resources.

It's actually embarrassing for NV that they lost the single-GPU performance crown for GPUs this generation to a company that's so small!
4 4 [Posted by: BestJinjo  | Date: 10/29/12 05:43:35 PM]
Reply
 
 
It 'seems' no one agrees with you or your troll party.
3 1 [Posted by: mmstick  | Date: 10/29/12 09:30:59 PM]
Reply
 
AvONbaCK, it seems everyone but you who followed AMD's public strategy knew this announcement had nothing to do with desktop/server x86 CPUs, but rather a new direction for AMD. AMD said nearly a year ago that they are no longer interested in competing with Intel in the high-end CPU race. Wake up man, wake up! It has been publicly known information.

You are seriously lost. AMD is licensing ARM to be able to enter the market much more quickly than if they had chosen to make their own efficient CPU design. Frankly, Intel can't make a more efficient server CPU than ARM, so it's doubtful AMD could have either, given the urgency of being first in this micro-server space.

It looks like you and your other troll friends are the only delusional consumers who are still thinking AMD will produce an Intel beating consumer CPU. You might as well ask Honda to make a Bugatti Veyron beating supercar with a fraction of the resources VW Group has. Stop being delusional.
4 5 [Posted by: BestJinjo  | Date: 10/29/12 04:43:40 PM]
Reply
 
 
What do you mean looks like?

Have you and AvONbaCK been living under a rock or something? AMD announced a long time ago that they are done in the high-end CPU space. They will only make gradual improvements of about 10-15% per year, but that's it. No more chasing to beat Intel in high-end x86 CPUs.

Why do you think AMD took a step into the microserver market in March of this year with the acquisition of Silicon Valley startup SeaMicro for $334 million? Their strategy change was evident for a long-time to anyone who actually followed this company closely. But it looks like you and AvON are still surprised by this news.

I told him already:
- AMD doesn't have the $ to design a faster CPU than Intel on paper
- AMD doesn't have 1 node manufacturing advantage that it needs to execute the design
- Therefore, outside of multi-threaded apps by virtue of offering more cores in AMD x86 CPUs, Intel will continue to hold the edge for a long-time.

If you are willing to spend > $200 on a gaming CPU, you go Intel, end of story. But that market of PC gamers is small. Even when AMD had a good gaming CPU during the XP and A64/X2 eras, it barely made a dent. Most consumers will buy Intel CPUs on brand name and marketing alone even if AMD were to design a better CPU (not that it's happening).
4 4 [Posted by: BestJinjo  | Date: 10/29/12 05:01:22 PM]
Reply
 
 
Except AMD has the best graphics card on the market. Both in gaming, the professional workstation world, and in science. One 7950 of mine completes well over 3,000 Help Conquer Cancer work units in a day, meanwhile Mr. GTX 680 is barely scraping 400 work units per day. Everyone at WCG is swapping their NVIDIA cards for AMD cards because that is where the power is.
2 1 [Posted by: mmstick  | Date: 10/29/12 09:54:16 PM]
Reply
 
 
You seem to not understand that there is a market for power efficient servers that don't require the high-processing power of x86 10-core+ CPUs. For those uses, no one in the world has a more efficient CPU compared to what ARM offers. You realize no company in the world has been able to dethrone ARM in the smartphone/tablet CPU space? Why would you think AMD can do it when even Intel can't do it?

There are already companies moving into this direction as Austin, Texas-based start-up Calxeda is also focusing on ARM-based processors for data centers.

You seem to be blaming RR for where AMD is today without realizing AMD's problems started in 2006 when they bought ATI. They ran out of $ to design high-end x86 CPUs for desktops, laptops or servers. It's amazing it took 6 years before they ended up where they are today. It's a miracle AMD survived this long against Intel. It is you who is delusional since you continue to want AMD to do something that's financially impossible given their resources.

You seriously have no idea what you are talking about. You are asking a micro-brewery in Wisconsin to create a better selling beer to compete with Stella Artois or Heineken. You are asking Mazda to make a 911 competitor. Are you joking? AMD never had a chance against Intel and it was only a matter of time before they gave up. We knew it was coming sooner or later.

AMD Market cap = 1.46B
Intel Market cap = 109.82B

You and others here seem to be in denial, expecting a company 75x smaller to compete with Intel in the high-end x86 CPU space. AMD doesn't even have its own fabs. Get real.

The only reason AMD even had a glimpse of brilliance during the A64/X2 eras was mostly because Intel made a mistake with NetBurst. If Intel had brought Banias to market instead, AMD would have been behind in every generation of CPUs since its existence.
5 4 [Posted by: BestJinjo  | Date: 10/29/12 05:29:53 PM]
Reply
 
A64 wasn't the first time Intel was beaten by AMD, K7 Athlons outperformed both P3 and P4 at the beginning. Just to add it to the picture.
5 1 [Posted by: Martian  | Date: 10/29/12 05:48:57 PM]
Reply
 
I am pretty sure the highest-end P3s beat out K7s. Those Tallatin P3s were pretty fast. Also, Pentium 4 is Netburst gone wrong, which is what I mentioned already as the only real time when Intel royally messed up. Other than A64/X2, AMD was always behind and competed on price/performance and overclocking. CPUs like XP 2500+ Barton were awesome for the $ and gaming, but overall they still lost to the best Pentiums of that era. So even with Netburst, AMD was barely ahead until they launched A64 and that was mostly Intel's flop with P4. With A64/X2, AMD still had just 25% CPU market share despite having a superior CPU. In reality though Intel was never even behind in architecture since it delayed C2D and massaged Benias. Had Intel launched Benias to compete with A64 intead of Pentium 4, AMD would have been behind as well.

My main point is now AMD has no $ at all to compete with Intel in the high-end CPU space and people who keep saying how AMD is a failure are id**ts since it's like asking an auto company 75x smaller to compete with Toyota or GM or Ford. Delusional!

You know why VW Group loses $4 million on each Bugatti Veyron they sell? Because they can. Small car companies cannot afford to make the best supercars, just like small PC companies cannot afford to make the best CPUs. Why people still expect AMD to make a faster CPU than Intel is mind-blowing. The funny thing is they offer no alternative strategies but continue to live in their dream world where AMD should keep spending millions trying to beat Intel when it has failed to do so for 6 years in a row.
5 5 [Posted by: BestJinjo  | Date: 10/29/12 05:57:13 PM]
Reply
 
As I wrote, K7 Athlons clearly beat the P3 (even the late server Tualatin) and early P4s as well.
5 1 [Posted by: Martian  | Date: 10/29/12 06:25:39 PM]
Reply
 
 
@BestJinjo
That is due to their own doing and wrong decisions. And most importantly, bad management and the board that pulled the plug. What did they gain? "Failure." That is the simple and short story.
5 5 [Posted by: AvONbaCK  | Date: 10/29/12 07:21:49 PM]
Reply
 
@BestJinjo
Well, they were at 5 to 6 billion recently, so why did it drop to 1.46B? I think you should ask yourself that question. It's common sense, nothing really hard about it.
4 6 [Posted by: AvONbaCK  | Date: 10/29/12 06:27:27 PM]
Reply
 
 
It should be noted that AMD's market cap is so small because they are failing to make money, not because the company itself is somehow tiny. When they were doing well in 2005, their market cap was close to 30B, and Intel was at 90B.

Market cap represents the value of the company, not just the size.
4 0 [Posted by: pulzar  | Date: 10/29/12 08:53:11 PM]
Reply
 
That's right. Apple for example has a disproportionately large market cap compared to company size and current profits. So market cap also represents punters' bet on future market share/profits/company growth. AMD's smaller cap represents punters' bet that the company's profits will nosedive over the coming quarters through loss of market share in the future and also punters' loss of faith in its plan for future growth, making it difficult for AMD to issue shares (or the financial like) at a higher price to raise capital. That is why Read has to come out swinging with AMD's new plans to shore up investors before it becomes a negative feedback loop. The "ambidextrous" plan he proposes today is, in other words, not to put all the AMD eggs in one basket when there are Intel piggies about. Creating more opportunities for business profit out from underneath the shadow of Intel will attract more investors.
3 2 [Posted by: linuxlowdown  | Date: 10/30/12 01:39:16 AM]
Reply
 
 
@AvONbaCK - Like there's nothing certain in life? Can't fault your insights there. Btw, this does mean something to consumers because micro servers are designed to run web services. And web services are for whom? That's right. A quick shout out to all the Yanks on the east coast preparing for Sandy. Good luck. Our thoughts are with you.
4 5 [Posted by: linuxlowdown  | Date: 10/29/12 02:34:19 PM]
Reply
 
It's just how technology has worked in this world for thousands of years. The highest grade technology fresh out of research and development goes to the highest echelons of power. In this case this is server grade use. Consumer hardware is older technology that has aged to the point where it is suitable for mass production and use in the consumer market.

All developments here are a sort of insight of what we will see in consumer products in the future.
3 3 [Posted by: mmstick  | Date: 10/29/12 04:24:36 PM]
Reply
 
AvONbaCK, it seems you really don't understand x86 CPU business, nor the current trends in the marketplace outside of x86 CPUs.

AMD was the last company on earth who could produce a competitive x86 CPU processor to Intel's. But even AMD has conceded that it is impossible to make a superior x86 processor given the fraction of the engineering and financial resources they have. As I told you in another thread, you also don't seem to get that Intel will have a full node lithography advantage over AMD for at least 5-10 years.

Therefore, instead of wasting hundreds of millions of dollars AMD doesn't have to satisfy your specific CPU gaming needs, they are looking at a big picture of where the world is going and trying to invest into those opportunities. It may or may not work but trying to compete with Intel in the x86 high-end CPU/server space is futile and AMD knows it. Since you haven't offered any alternatives to what AMD's management should be doing, but only continue to criticize everything they have been doing, it looks like it's impossible to discuss this topic with you objectively as you'll hate any direction AMD goes unless they magically create some Haswell beating CPU.
4 4 [Posted by: BestJinjo  | Date: 10/29/12 04:39:00 PM]
Reply
 
 
You are calling me an idiot?

I didn't invent market or consumer trends. Tell all the consumers who don't care about traditional PC space anymore or corporations who want more choices in the server market space when it comes to efficient CPUs.

Last quarter iPad sold more units than the entire desktop PC OEM market. Next year iPad will sell more units than the entire laptop PC market. You can deny it all you want but the # of PC gamers like us buying GTX680s in SLI and Core i7 3770K @ 5.0ghz is tiny. It's practically immaterial in making $ overall. Even for Intel most of the $ is in servers/workstations, while for Nvidia it is in the professional GPU space. Less than 10% of NV's desktop Kepler revenue comes from their $300+ GPUs, while the entire consumer desktop GPU division for NV is less than 20% of the entire company (in other words $300+ desktop GPUs are less than 2% of cash flows for NV). That shows you how small the market of PC enthusiasts is overall for these companies.

AMD never said they will stop making x86 products or stop focusing on APUs. The whole point of this deal is to integrate x86 and ARM CPUs into servers and allow them to work together using the SeaMicro software fabric. No other company right now knows how to get x86 and ARM processors to work together. The APU x86 strategy is there to stay, but even with Trinity being a competitive all-around CPU vs. Core i3s, it will hardly sell enough to make a difference since the traditional PC market is shrinking, not growing. The PC market is dying, as consumers no longer care about laptops or desktops. None of my friends want a new laptop or desktop. They are all buying iPad, tablets and smartphones every 2 years like clock work, while their desktops/laptops are from Core 2 Duo eras.

The major growth in the next 5 years are more efficient servers, smartphones and tablets. No need to waste $ designing $300-1000 CPUs if you know you can't beat Intel anyway. It's common sense to anyone except you and AvonX and that guy jmlg or w/e his name is. AMD failed with Phenom I / II and Bulldozer during the time when they had even more $ than today. No company in the world will beat Intel in the x86 high-end CPU space for at least 5-10 years at this rate. AMD knows it can't do it unless the company grows 5-10x the size of today. And the only way to grow is to move into other markets where there is strong growth.
4 5 [Posted by: BestJinjo  | Date: 10/29/12 05:03:56 PM]
Reply
 
That's because your friends are idiots who just want their new Apple toy every time one comes out with an upgrade. Fool, the desktop is not dead just yet. I never said AMD could beat Intel at x86; however, AMD is spreading its wings too far and I can't wait to see them GO DOWN. AMD doesn't need to grow to 5-10x the size, they need to focus on APUs AND ONLY APUs, below 10 watts that is. AMD is doomed as a company, bye bye Rory.
4 5 [Posted by: 123  | Date: 10/29/12 06:29:54 PM]
Reply
 
Well that's your problem, not AMDs. Why should they focus on APUs and lose out on all the opportunities and customers who want their other products? Haven't you seen their graphics card division consistently producing better graphics cards than NVIDIA year after year lately? I mean really, even in the workstation and scientific field AMD graphics cards are heralded as the king of performance. I get well over 3,000 work units in a day of Help Conquer Cancer complete off a single HD 7950, meanwhile the people with GTX 680s barely even manage 400 work units in a day. A guy with dual GTX 590s could only barely manage 1500 work units in a day.
2 3 [Posted by: mmstick  | Date: 10/29/12 09:58:07 PM]
Reply
 
@BestJinjo - 123 is jmlxg's new handle. jmlxg username got banned for being rude. It's still Jesse Lee.

http://disqus.com/google-...0f04271177ca6ae209b9223a/
3 4 [Posted by: linuxlowdown  | Date: 10/30/12 12:02:14 AM]
Reply
 
 
Actually, the people being ticked off are 'Intel fanboys' trying to act cool, when on the inside they are hurting. You can hear the crying through your comments.
3 2 [Posted by: mmstick  | Date: 10/30/12 03:44:08 AM]
Reply
 
You're on fire today mmstick. I like it. How much cola have you been drinking? :-)
3 2 [Posted by: linuxlowdown  | Date: 10/30/12 05:24:19 AM]
Reply
 
 
Says the guy who uses all caps and talks like an immature child. How can anyone take a childish comment like yours seriously?
2 2 [Posted by: mmstick  | Date: 10/30/12 07:39:05 PM]
Reply
 
Self admitted troll, ban his ip please.
2 2 [Posted by: veli05  | Date: 10/31/12 07:36:17 AM]
Reply
 

2. 
This is pretty big news. ARM decided to throw its lot in with AMD, and if they can make their products worthwhile to AMD's and ARM's existing and hopefully new server customers, the future could look pretty good for both companies. The client space is getting less and less important as tablet technology and general computing mobility mature, so having more datacenter options to choose from, and at a more energy-efficient level, is definitely something worth investing in.
5 4 [Posted by: veli05  | Date: 10/29/12 02:25:09 PM]
Reply

3. 
This is a good idea, if one remembers what Intel did to Apollo the chip designer. ARM do and think they are taking the lesser of 2 evils.
3 5 [Posted by: tedstoy  | Date: 10/29/12 04:55:11 PM]
Reply

4. 
$BestJinjo$
"(...) while for Nvidia it is in the professional GPU space. Less than 10% of NV's desktop Kepler revenue comes from their $300+ GPUs, while the entire consumer desktop GPU division for NV is less than 20% of the entire company (in other words $300+ desktop GPUs are less than 2% of cash flows for NV)"

I don't want to hijack a thread where you're competing with your friend, but please explain this nonsense.

Where do you think NV's money is coming from if not from the GPU market? Their Tegra 2/3 chips failed to be adopted; how many smartphone units (SPU) have those sold? Not to mention that those came at a fraction of the price their CPUs are sold at.

Or are you implying that their GPU revenues are even more catastrophic than Tegra's?


"I am pretty sure the highest-end P3s beat out K7s. Those Tallatin P3s were pretty fast. Also, Pentium 4 is Netburst gone wrong, which is what I mentioned already as the only real time when Intel royally messed up."

It's Tualatin, and it's just a 0.13um P6 derivative that was nowhere near competitive with the 0.18um Palomino (K7 w/ SSE), which was a furnace, by the way, but in those days the power/compute ratio didn't matter. It fell off because the synchronous FSB was a 1980s archetype and because, even though it had a lot of resources, P6 lacked IPC until it was redone in the Core 2 incarnations. Banias/Dothan, aka P6-M, were just the same old power-sane P6 attached to a quad-pumped FSB, which was ... part of the NetBurst architecture. And that gave them a more appealing, modern look.

And when you feel like making the "NetBurst arch" (P4P) explainable to us, please tell us why and where "it went so wrong" as you said. I explained above one part that was good, and there were many other good parts.


"AMD was the last company on earth who could produce a competitive x86 CPU processor to Intel's. But even AMD has conceded that it is impossible to make a superior x86 processor given the fraction of the engineering and financial resources they have. As I told you in another thread, you also don't seem to get that Intel will have a full node lithography advantage over AMD for at least 5-10 years."

AMD and its silicon manufacturing partners are already a full node shrink behind Intel, but it seems that has made them no worse at keeping their CPUs competitive than in the times when they were lagging by only a few months.

Why do you think AMD ever wished for a superior x86 CPU? To stay competitive with Intel?
AMD always offered only a more appropriate solution for the PC market, one that was on the same level as Intel's, just a year behind. In the days of the Pentium & K5 that wasn't so. Just like nowadays with their 32nm Bulldozer vs. Intel's 32nm Sandy Bridge. Intel this time even felt confident enough to delay its new Ivy Bridge CPU lineup, claiming some bogus issues with current/future chipsets (iirc).

Why did AMD ever wish to implement the x86-64 (AMD64) instruction set on what was already Intel's home turf? And apart from that, every other instruction set AMD introduced fell flat, like their failed attempts to establish the 3DNow! or SSE5 SIMD subsets. The latter was canceled during implementation in favor of crappy AVX, which meant that the Bulldozer conceived on the proven 45nm SOI node never saw daylight, resulting in a production delay of two whole years.

Why did AMD so eagerly expect to build its Bulldozer modular architecture?

(BTW, it's incomprehensible to read a thread where you two are competing in your economic skills, mixing stock market caps with companies' products and so on. AvONbaCK and that $$unnamed$$ are freakin' flamers telling us nothing.)
4 3 [Posted by: OmegaHuman  | Date: 10/29/12 08:24:51 PM]
Reply

 
 
Idiots like "BestJinjo" think that just because AMD cannot compete in the PC Desktop Market all of a suden its dying off. LoL
Actually Desktop PC's has always been the future and always this stupid consoles has been holding us back.
If they focused on Desktop PC's games and apllications in general would be YEARS ahead. Its a shame really that idiots like this are ruining the market.
5 5 [Posted by: AvONbaCK  | Date: 10/29/12 08:49:50 PM]
Reply
 

5. 
ARM's CEO had to endure the incompetence of his travel staff, who booked him to fly to San Francisco through New York in the middle of the chaos surrounding Hurricane Sandy....
The same will happen with this new "Project Win #2" which will marry ARM w/ x86. I don't see a single benefit of ARM w/ x86.
A 14nm node in 2014 also looks like fiction. It's impossible from any angle.
Rory should have worked in the marketing business.
2 2 [Posted by: Azazel  | Date: 10/29/12 08:41:28 PM]
Reply

 
 
 

6. 
It's a great opportunity for AMD; at least they won't be hedged in by the Intel clause where AMD's x86 can't be higher than 35% for two consecutive quarters if they want to keep their x86 license.

The bad thing is that AMD is once again streaming fully toward servers, just like during the AMD64 introduction. I'd be more than pleased to see a few cheap-to-implement ARM cores in current x86-64 based PC solutions, and I don't like the feeling that I'd have to wait until AMD becomes charitable enough to implement them in a PC-grade CPU. As we don't need the Freedom fabric in those, it should be done sooner rather than later. An Excavator core on 22/20nm seems the perfect candidate to me.

I certainly wouldn't like to see AMD once again needing to consult Intel on how long to delay their products in someone's favor, like they did with Bulldozer, favoring Intel and their AVX.
1 2 [Posted by: OmegaHuman  | Date: 10/29/12 08:51:52 PM]
Reply

 
 
You do realize AMD has more than one market right?
2 2 [Posted by: mmstick  | Date: 10/30/12 03:46:25 AM]
Reply
 
 
Why so much hatred over having choice? .... Ahh okay I just googled it. "Decidophobia" the fear of having choice. Sounds just horrible, you poor tortured soul

I hope you can find help. Take care.
1 0 [Posted by: JBG  | Date: 11/06/12 02:19:35 PM]
Reply

7. 
AMD made very good decision by listening to my advice to them which I've written one month ago in the comments to the article below:

http://www.xbitlabs.com/discussion/70475.html

Joking

Seriously, that is the best decision AMD has made in the last few years, so Rory looks like a wise man now!

After initial deployment in servers, AMD's ARMv8 64-bit SoCs will definitely appear in tablets and smartphones, which will replace current desktop PCs and laptops very soon.

In 2-3 years from now there will be no desktop PCs and no Laptops, there will be only tablets with detachable or bluetooth keyboards and smartphones which you'll be able to connect via wireless interfaces to Ultra HD TVs and Monitors.

And in 4-5 years from now we will not have to carry heavy laptops to work - we'll come with smartphone at the office and wireless keyboard, mouse, monitor will connect to our smartphone automatically and all data will be processed and stored on our smartphones.

You can come back to my comments in a few years and double-check it
2 3 [Posted by: calzahe  | Date: 10/30/12 12:07:51 AM]
Reply

 
I think that it is apparent that Rory Read got the CEO gig at AMD for proposing the ambidextrous market plan back in 2011 during his job interview process. It needs someone with good contacts in the industry to pull it off. Having been CEO at Lenovo previously, he probably has Warren East on speed dial.
2 1 [Posted by: linuxlowdown  | Date: 10/30/12 05:32:36 AM]
Reply
 


9. 
that collection was for INTEL but for me. I was excited about those 8 cores at 4 GHz for a little VirtualBox system, but it was a long wait since AMD's last generation of 8 cores failed miserably. The new generation seems to be a bit better, but I need to skip lunch and dinner to get enough $ for the fun.

0 2 [Posted by: idonotknow  | Date: 10/30/12 04:00:21 AM]
Reply

 
 
The only loser we see is the 'Intel fanboy' here. He thinks that just because a processor from brand X is 15% faster that it deserves the extra $200 and so he must type pointless drivel to AMD articles. Meanwhile, everyone who is actually using processors doesn't care about your opinion.
2 1 [Posted by: mmstick  | Date: 10/30/12 07:42:18 PM]
Reply
 

10. 
Haha, and Intel said ARM will "come and go"

The future is AMD and ARM bitches.
1 1 [Posted by: medo  | Date: 10/30/12 04:04:12 AM]
Reply

 

11. 
Let's think a little bit here. They said AMD should not buy ATI, that it was a bad move; now they work together flawlessly. Radeon is AMD's biggest advantage.

So now they are gonna work with ARM, and somebody is still questioning their moves, saying they are wrong.

Come on.
1 2 [Posted by: kingpin  | Date: 10/30/12 07:18:00 AM]
Reply

 
 
 
They are simply here to instate their opinions that anything involving AMD will automatically 'fail', despite results actually showing otherwise. As well, from a business perspective, the death of AMD is very much impossible due to how well rooted they are.
2 1 [Posted by: mmstick  | Date: 10/30/12 07:43:45 PM]
Reply
 
 
I'm not a part of your world, certainly. Your world is that of a delusion, a delusion I wish to be no part of. While we are enjoying our Linux and AMD, you can enjoy the sinking ship that is your wallet with your Wintel.
2 1 [Posted by: mmstick  | Date: 11/01/12 06:35:37 AM]
Reply

12. 
THIS COMMENTS FORUM IS FULL OF RETARDED TROLLS LOL
2 0 [Posted by: JBG  | Date: 11/06/12 02:07:31 PM]
Reply
