Discussion on Article:
AMD: Engineers Are Gradually Shifting to System-on-Chip Development Model.
The number of veteran engineers let go can be counted on your fingers; the level 1 and level 2 engineers account for the vast majority of the layoffs. These are the very engineers who would have left AMD anyway within 2-3 years of joining. It's the norm in this industry. Not a big deal. Layoffs are bad for morale at any company, but it's not as if all those engineers were going to stay with AMD for long.
I know a couple of guys who got laid off; they are already at other companies, and one is about to start next month. It was irritating for them for a short while, but that's about it.
> higher pay scale and aim for a promotion ...
It works for almost every job.
Your employer would rather lose you and waste their own time than cough up another buck. It doesn't matter what it costs, as long as you don't get that buck.
The other employer would gladly cough up two extra bucks, offer a promotion (or a more important-sounding title), and have you do less work.
You QUICKLY (4-6 years if you are fast) reach a point where you simply are not going to get more unless you move far away or do something different.
Soon enough everyone catches on that if you are not getting over $200 per day, your telephone interview is OVER (click!); so whoever calls had better start the conversation by saying how valuable you are and how much they pay before asking IF you would come in for an interview.
At higher levels you actually do have to be a bit nicer, since there is more wiggle room (in pay) and fewer openings; but by then you are close to the point of packing up and leaving town (to move where the money is).
In the end you reach the last brick in the 'Yellow Brick Road': you are switching jobs, moving, or running the company while the owner plays with his new toy (a yacht, an Airstream, or a more interesting company).
The days of young people and newcomers being suckers who slave for low pay ended more than a decade ago.
The majority of the people laid off from the engineering core were level 1 or level 2, plus some senior engineers. They are younger than the MTSes and would have left even a stable company in search of higher pay and quicker promotions. They will land jobs elsewhere without much difficulty.
You don't understand SoC if you try to equate it with GPU; the reason GPUs are not mentioned in the same breath is that they have their own design flow. A smaller GPU can be part of an SoC, but the core GPU business is still driven by high-performance, mainstream, and entry-level needs. The new GPUs are well on their way, and the future products are also well defined. Some leaks will happen as usual, and in time all will be revealed.
SoC is a different matter, and it's not news to anyone who follows the trends; Papermaster already described AMD's SoC philosophy to the press in August 2012.
Papermaster and Lisa have nowhere clearly said they are not interested in improving x86; the only comment was from Rory, that they don't want to compete directly with Intel for top supremacy. That does not automatically imply x86 is not going to be improved. Absurd logic, to say the least.
The funny part is that you start with 'IMO'; if it's your opinion, then it can't be a statement from them.
SoC development involves making the CPU cores, GPU, FCH, DDR PHY, I/O PHYs, etc. easy to integrate. A methodology is adopted at design time to cater to easy integration, which lets each team focus on its specifics: if your block is a core, getting the best reusable core is the priority. There is nothing to stop any engineer from getting the best performance out of the core they are designing.
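The integration idea above can be sketched in software terms: if every IP block exposes the same standard interface, the top level just wires ports together and never needs per-block special cases. This is purely an illustrative analogy, not real AMD tooling; the class and port names are hypothetical.

```python
# Illustrative analogy only: SoC integration modeled as software composition.
# Block names ("cpu_core0", "gpu", "fch", "ddr_phy") are hypothetical stand-ins.

class IPBlock:
    """A reusable IP block exposing one standard interconnect interface."""
    def __init__(self, name):
        self.name = name

    def ports(self):
        # Every block integrates the same way: one request/response port pair.
        return {"req": f"{self.name}.req", "rsp": f"{self.name}.rsp"}

class SoC:
    """Top level wires standard ports into the fabric; no special cases per block."""
    def __init__(self, blocks):
        self.blocks = blocks

    def integrate(self):
        fabric = []
        for block in self.blocks:
            p = block.ports()
            fabric.append((p["req"], p["rsp"]))
        return fabric

chip = SoC([IPBlock("cpu_core0"), IPBlock("gpu"), IPBlock("fch"), IPBlock("ddr_phy")])
print(chip.integrate())
```

The point of the analogy: because integration is uniform, the core team is free to optimize what is inside the block, which is why "easy to integrate" does not imply "low performance".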
There is no logic behind the claim that an SoC is automatically a poor-performing solution. Rather than making sweeping statements, why don't you give a technical reason why that is? Let's hear it first.
Ah yes, let's not play games here anymore. I know very well that you know about the Jaguar and upcoming Steamroller architectures. Jaguar is a big evolution from Bobcat, and so is Steamroller compared to Bulldozer. I have seen enough of your past activity here to know you understand them very well, so claiming you don't understand their significance is a waste of time.
So the question remains: why are you deliberately posting inaccurate and fabricated statements that will naturally cause informed people to respond, thereby giving you the chance to launch another sarcastic attack that has nothing to do with the topic in the first place?
There is a word for that behavior; you know very well what it is.
I'll give you this chance to respond. I have already seen your tactics here:
Ah no, my friend, I have seen this over and over in your posts going back to last year on this site. It's your standard practice to state things and then mock the person who corrects you. Why are you backpedaling now? Why not mock me as well? Afraid you might be called out this time?
And besides, that's a polite form of expression. It's my assumption, not an assertion. Isn't it?
No other way around it; it's pretty clear: 'IMO' followed by 'clearly stated'.
No, it's not a crime to be wrong; it's just that in your case it's pretty usual, and then you mock whoever answers you in other posts.
I don't know any Rupley, and I don't need online analyst/journalist opinions to know which architectural changes are significant.
If this person suggests that the L2 cache is all the difference there is in Jaguar, that is a good laugh for anyone with a sound background in computer architecture.
The changes are already clearly spelled out in the press slides. It's the pipeline as a whole that gets an uplift: new logic to support new instructions; branch predictor changes, which alone are a big deal; more decoupled logic that is hinted at but not yet disclosed to the press; L2 operation changes; the L2 interface design and its implications for fine-grain macro gating; more out-of-order resources; a total FPU revamp; and so on.
Steamroller, as I expect, will be almost the same, adding one decoder per core and some minor tweaks.
Incorrect. Again, if you really cared you would go through the slides presented at Hot Chips 24; you would know Steamroller brings more improvements than Piledriver ever did over Bulldozer. Listing them again is not needed; the slides do a good job of explaining it, enough for even someone with no computer architecture background to understand.
In any case, it's my opinion. Is it prohibited to express one's opinion on this site? The truth, in any case, we'll learn within a year, when we see real products.
No, it's not a crime to express an opinion, but making underhanded comments that will definitely elicit a reaction has been your forte over the last year in this place, judging by your posts. Those may be your opinions, but there is an agenda there; that is the problem.
In any case, everybody is free to correct me.
Second, I am posting my own opinion, as many others do as well.
Is it open discussion or not?
Ah! That is what I want to hear, but why the change of heart now? You are only saying this because you know what happened to certain individuals who were running amok here and causing trouble.
The thing is, people have corrected you in the past, but you resort to ad hominem attacks.
linuxlowdown wrote: Stop trolling and sucking eggs
Is that normal speech for an educated person?
You know the answer to that yourself; your past behavior here speaks for itself.
Look at the article I linked in the previous reply: what connection does your 'Intel sack' comment have to the topic? It was designed to irritate people into responding, and when you didn't get a response you posted it again with silly comments on phrases like 'ground breaking'. It's obvious what you are trying to do here: you don't like people being enthusiastic about AMD, and you like to provoke them into a reaction for your own amusement. No different from some other people who vanished from here.
If you act decent and talk properly, the rest will talk properly to you; you have to earn it. You certainly have enough class to be the bigger man and present your point differently without being snide about it.
And, NO, I am not afraid of you. It's your choice to make the site as clean as virgins.
Everybody sees the degradation of the entire online IT press. They reprint the same news and the same official releases, and conduct nearly identical benchmarks under the control of the manufacturers. Traffic is no longer a problem for them, because they get their money from those same manufacturers.
And, sorry again, do you understand the difference between "opinion" and "facts"? Facts happen only in the past. An opinion is how an individual sees those facts, as well as his or her vision of future events. That's all. An opinion cannot be "right" or "wrong"; an opinion is always subjective.
I expressed my opinion in such a clearly joking form that everybody with a sense of humor understood it. Whom did I irritate?
Already given in the posts above; it's the same behavior as in the past, and the interesting part is that you mostly show up when the topic has anything to do with AMD. You are fishing for reactions.
And Jeff Rupley I do know; I just don't know any plain 'Rupley'. Be more specific with names.
The ridiculous part is that he never said anything of the sort; it was his presentation, with slides, that covered everything I mentioned about the big changes in Jaguar and it being a big evolution of Bobcat. It's right there in his slides. And you will still say it's just an L2 cache added to 4 independent cores? Is that what his multi-slide presentation contained? 'Two' details spread over double-digit slides? Really?
I don't see the humor in this.
Your version of the story makes Jeff look like an armchair commentator with no clue; please go back and revisit the whole talk, or read through his material, before making absurd claims.
You don't understand the difference, yet you try to lecture others.
'Stream processors' is a generic term, yet NI and SI were vastly different architectures. Everywhere you go, AMD makes a point of saying it's the biggest change to their GPU designs of late.
Similarly, 'module' is a generic term; the microarchitecture changes are what matter. And no, they are not the same.
The site is open to discussion, but you were not discussing; you were mocking people for amusement.
I saw that in the article I linked earlier, where you posted the same thing twice; the second time you got someone to take the bait.
Who are you trying to kid here?
I expressed my opinion in such a clearly joking form that everybody with a sense of humor understood it. Whom did I irritate?
Nice try, but people don't post the same joke twice in the same thread. You were trying to get noticed, and it's obvious why.
Linuxlowdown's reaction is what anyone would have when someone does what you did.
Here’s what I’ve found: in many stories and in some video games, Azazel is a cute little goat-looking demon representing the world's sins and temptations (it represents Satan for Christians; Wikipedia).
What a cute and appropriate name for someone who invests so much time posting inaccurate statements and fishing for heated exchanges with passionate members, using them for his own amusement.
Vanakkuty, you guys are way too patient with these kinds, but the two of you kept me entertained during my lunch break. Thanks!
Azazel... I’ll propose renaming my little girls’ hamster that way!
There is no risk to discrete GPUs or APUs for the foreseeable future. That means the right IP still exists for the right applications, if that is one of your fears.
The only way to match or beat Intel in the CPU race is to engineer a more efficient CPU architecture. To do this you need to spend a colossal amount of money, and you still might not come close, since Intel's CPU architectures are excellent. So to stay alive, a company must look at other opportunities: SoCs, HPC graphics for professionals, and so on.
AMD losing to Intel in the high-end CPU race was a foregone conclusion because Intel has been 1-1.5 nodes ahead of AMD for a long time now.
32-nm, 1 year late. (2009 hit production 2010, AMD2011/12/13) (Awkward Specs)
28-nm, 2 years late. (2010 hit production 2012, AMD2013/14/15) (Denser than 22-nm)
20-nm, 3 years late. (2011 hit production 2014, AMD2015/16/17) (Denser than 16-nm)
14-nm, 3 years late. (2012 hit production 2015, AMD2016/17/18) (Simply 20-nm with FinFETs)
I wonder whether FD-SOI being half the cost of bulk with FinFETs would affect the roadmap that AMD has stated.
By 2017, the PC market will have grown from the 390-400 million units of 2011 to around 790-800 million units.
So nothing is shrinking in the PC market.
The other markets are GROWING very fast, but that isn't happening to the detriment of the PC.
The tablet market will reach 400 million units in 2017, so it will be just as big as the PC market is right now.
That means AMD really needs to develop solutions that let it tap into that huge tablet market, but ignoring a market selling 800 million units a year is definitely a bad idea.
So yes, go for tablets and SoCs, but don't think for a second that the PC is becoming irrelevant in any way.
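As a quick sanity check of the growth claim above: going from roughly 395 million units in 2011 to roughly 795 million in 2017 implies about 12% compound annual growth. The unit figures are the post's own projections, not verified market data; the midpoints below are assumptions for the arithmetic.

```python
# Back-of-the-envelope check of the implied annual growth rate, using the
# post's own projection: ~395M PC units in 2011 growing to ~795M by 2017.
start_units = 395e6   # midpoint of the 390-400 million figure quoted above
end_units = 795e6     # midpoint of the 790-800 million projection
years = 2017 - 2011

cagr = (end_units / start_units) ** (1 / years) - 1
print(f"implied compound annual growth: {cagr:.1%}")  # roughly 12% per year
```

Doubling over six years always works out to about 12% a year, whatever the absolute unit numbers turn out to be.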
But for x86, AMD is the #2 player.
I don't see any point in AMD doing yet another ARM SoC like a zillion other companies are doing.
There is absolutely nothing AMD could do better than those other companies.
AMD lost that opportunity when it sold its old handheld division to Qualcomm (which benefited a LOT from it).
But creating a mobile x86 SoC based on the Jaguar core might make some sense, since Intel is getting design wins for Atom, which is inferior to Bobcat, Jaguar's predecessor.