Discussion on Article:
Specifications of Sony PlayStation 4 Development Kit Get Published.
Personally, I wish they'd release a model without any kind of drive at all so I could just slap a 2TB one in there without having a wasted hard drive. They could either cut that out of their production costs or lower the price of the console to make up for it; either way, I really wish that were an option.
As for the console specs, there are a lot of rumored specs out there; some have got parts of it right, but most of it is inaccurate.
Wait for the Jaguar reviews to come out and then decide what it is worth. If you are still unsatisfied, you can sell those shares you said you have in AMD. Making up pseudo-scientific analysis is pointless.
You don't really know where I work; I haven't told anyone. I have my means of knowing a lot about various companies, their roadmaps, products, specs, etc. AMD is one of them.
AMD has told the world what's in the slides. You don't have any idea of the final specs for Kabini and Temash, yet you say they're no good for said purpose.
You claim you have AMD shares, yet most of your posts are uninformed material made from assumptions or pure misinformation when it comes to AMD. The distinct impression here is that your claim of holding AMD shares is being used as a cover for your poor posts. For a shareholder, you have a very poor understanding of AMD as a company.
Also, what sense would it make for them to send out BD samples if they are going to use Jaguar? They are different architectures, in case you didn't know.
Now you do.
You don't even know how game development is done. Using higher-end machines is always part of the development process; they have always been the test beds from which developers de-tune to meet target specs. Performance modelling is a big part of any software development process. This tells me your vocation has nothing to do with software development. That's fine, but making up misinformation doesn't help your case.
There are a lot of specs thrown out there for the consoles; most of them are wrong, and some have got it right. Yes, I also know the full details of both consoles, which is one reason I don't engage in spec speculation. That is for the posters here to enjoy.
Oh, and I haven't overlooked the fact that you were blind-reading posts last time, asking me to look before posting when in fact it was you who replied to the wrong person (me), accusing me of needing to look before posting. I was conversing with another person entirely.
You haven't shown the least bit of decency to apologize for your mistake.
I suggest you tidy up your act a bit; your profile isn't very convincing, especially given your actions.
You don't know anything about the subject.
You don't send out development machines with a completely different architecture unless there are no better choices. BD has two integer units per module, and optimizing for it is very different. That's the point.
Also, games show better performance with fewer, more powerful processors. Didn't you know that?
I have AMD shares, and I know what AMD says about their products. Some blow-hard without any knowledge doesn't trump what AMD says, except in his own little world.
AMD has told everyone what to expect from Jaguar, and it's not ideal for a game console. Small people think they know more than companies, but I trust the companies more than some random internet poster who acts like he knows more than he does.
It doesn't add up. It's a poor choice.
By the way, I know you're probably poor, but do you want to put up some money on whether I own AMD shares? If not, who's being the hypocrite by not apologizing?
Oh, and in case logic isn't your strong point (and it clearly isn't), I do think AMD will get these design wins, just not with the chip as it's being described. It's the wrong choice for the product, and the dev kits don't approximate it; they do approximate the much more powerful PD architecture. So, while nothing is ever certain, either they'll have to HEAVILY modify the Jaguar or use a PD derivative. The latter is far more likely.
So you're still saying you responded to the right person, eh?
It's your first post from the top in that article. Pretty self-explanatory, since my post wasn't even addressed to you; others noticed it and mocked you for your double vision.
I shall link the article for everyone to see.
And off you go, since trolling seems to be your agenda.
You are out of your element on pretty much everything you say, as others constantly correct you. I already tried the same, and it went over your head. Too much for you to understand, perhaps.
You are a clueless person who is neither a software developer nor an E.E. engineer. Really, please try to educate yourself about basic software development processes. Performance modelling is an everyday practice for developers of specialized workloads like games, databases, and high-transaction HPC applications, to name a few. The modelling is applied depending on the target performance you aim to tune for.
To call this nonsense is just hilarious! It's a pity you pretend to understand something when it's awfully clear you are blatantly lying.
You are displaying poor understanding once again. The next-gen consoles are aiming for base resolutions much higher than the average CPU benchmark resolutions you see in reviews, and games become more GPU-limited at higher resolutions. The overheads on a console are also completely different from the overheads in a PC environment.
Apparently you don't know that either.
Says the one who had no understanding of the flow of semiconductor design through to fabrication until I explained it to you in a much-oversimplified way, yet you still couldn't understand and conveniently ignored replying to that post. Felt like too much for you? Then don't engage in a subject that is way out of your league.
Here is that article where you got schooled:
AMD hasn't given out final specs, so estimation is irrelevant.
I already told you I work as a P.D. engineer and I have my means of knowing things about companies of special interest; AMD is one of them. It will become clear to you if you pay attention to the subtle hints I drop from time to time.
You should also prepare to eat humble pie when Kabini and Temash are in reviewers' hands. It will be a hard pill for your ego to swallow.
Your owning shares is about as real as the claims of people like David Wilcock and Anton LaVey, who made various fringe-science claims. These guys spouted a lot of pseudo-scientific gibberish that fooled, even impressed, the uninformed. That doesn't work around here.
Everyone can now see how you lack any decency, from that link I posted above. Your apology for that is still pending.
I know exactly which companies are involved in the development of these console chips. I also haven't confirmed or denied any particular company or companies, AMD included, having bagged the design wins.
Twisting words won't help you.
I don't leak info of that sort to anyone. But know this: you are in for one big surprise, and it won't be good for your ego, because you won't understand how it works out. Your posts here show you don't even have a basic tech-savvy understanding of microprocessors, let alone the ability to debate computer architecture with an engineer.
Once again, look up 'performance modelling'; maybe there is some hope for you if you heed the advice to educate yourself.
And you can save the claims about having shares in AMD; you obviously have a very poor understanding of AMD compared to even the average Joe on the street. You can take your AvonX/Jml/123 succession plans and ambitions elsewhere. We need serious posters, or at the very least well-mannered ones; the other kind is not conducive to good discussion.
Did it not occur to you that even an FX-4300 system uses more than 140W of power without the GPU?
Even A10-5800K CPUs & AM3 motherboards use too much power for a small console case:
There are actually two key reasons why Bulldozer/Piledriver may not be a good fit for a console without a significant reduction in clocks: price and power consumption.
Considering how awful the CPUs in the PS3/360 were, an 8-core out-of-order Jaguar with 128-bit floating point and the AVX instruction set would be many times faster. Maybe you need to be reminded again that Cell was just a 1-core CPU with 6 primitive SPE engines usable for games. Coding for the PS3 is much harder than for x86 because Cell is built of 1 PPE and up to 8 SPEs (6 available in the PS3). These SPE units are exceptionally difficult to extract high levels of parallelism from because they are not full cores. They are highly specialized companion cores and require special care to keep busy and running without going idle. So going from that to an 8-core Jaguar is a huge upgrade, since you get 8 actual stand-alone cores, each able to perform any calculation you want. You couldn't do that with SPEs.
x86 is also much more capable than these PowerPC architectures. Capcom developers once stated that one PowerPC core in the Xbox 360 has about 2/3 the performance of a Pentium 4 running at the same clock speed. Jaguar is expected to have about 1.88x the IPC of a Pentium 4, because it is projected to reach roughly 88% of the IPC of a Core 2 Duo.
A Cell/Xbox 360 CPU core has 66% of the IPC of a Pentium 4. Jaguar has 1.88x the IPC of a Pentium 4.
Cell/Xbox 360: 3.2GHz * 0.66 = 2.112GHz Pentium 4 equivalent
Jaguar: 1.6GHz * 1.88 = 3.008GHz Pentium 4 equivalent
That means a single Jaguar core is roughly 42% faster (3.008 / 2.112 = ~1.42) than a single Cell/Xbox 360 CPU core. With 8 of these cores, the CPU will be many times faster than the PS3's Cell or the Xbox 360's CPU.
If you can't understand this simple math, I can't simplify it further for you. I'm not even going to get into the technical discussion of the processors themselves and how huge the 128-bit floating-point upgrade from Bobcat is. You say the performance will be "very poor"; maybe compared to a modern Core i5/i7, but not compared to the anemic CPUs inside the PS3/360.
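The arithmetic above is easy to check. Here's a minimal sketch in Python; note the IPC ratios (0.66 and 1.88) are the rough estimates quoted in this thread, not measured figures:

```python
# Convert a core's real clock into a "Pentium 4 equivalent" clock by
# scaling it with the core's IPC relative to a Pentium 4.

def p4_equivalent_ghz(clock_ghz, ipc_vs_p4):
    """Clock scaled by IPC relative to a Pentium 4 at the same clock."""
    return clock_ghz * ipc_vs_p4

# Thread's assumed IPC ratios, not measured data:
cell_equiv = p4_equivalent_ghz(3.2, 0.66)    # Cell / Xbox 360 PPE core
jaguar_equiv = p4_equivalent_ghz(1.6, 1.88)  # projected Jaguar core

print(f"Cell/360 core   ~ {cell_equiv:.3f} GHz Pentium 4")    # 2.112
print(f"Jaguar core     ~ {jaguar_equiv:.3f} GHz Pentium 4")  # 3.008
print(f"Per-core speedup ~ {jaguar_equiv / cell_equiv:.2f}x") # ~1.42x
```

Under these assumptions the per-core speedup works out to about 1.42x, i.e. roughly 42% faster per core before counting the jump from effectively 1 full core to 8.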
Your math, wherever it comes from, is based on messed-up numbers.
I already showed you where the performance was HALF that of a Celeron.
I have both a Bobcat and a Core 2-based Pentium. 88%, my ass. You're talking about something you know nothing about.
Jaguar will be about +15%, but then a Core 2 using dual-channel memory with a discrete card will be at least 15% faster than my Pentium.
They aren't close. The Bobcat is too slow to even run Netflix in high definition; the Pentium does it with ease.
You have to use real numbers, not numbers from one particular benchmark you like, especially 3DMark. It's a much more complex architecture. Jaguar is good at what it is, but performance isn't one of its more important design criteria. It's more like a VIA Nano than anything.
Power (PowerPC) is a joke for games and always has been. Not only do you have the PS3/360 to prove how inefficient and slow PowerPC CPUs are, but the fastest CPUs for games have all been x86 in the last 10 years, specifically from AMD and Intel. Do you need me to look up gaming benchmarks for the Apple G5 vs. Core 2 Duo from when Apple decided PowerPC CPUs were too slow and power-hungry and ditched them for Intel's? Those benchmarks are not pretty.
Also, did you just conveniently ignore that the Xbox 360's CPU was PowerPC-based and had worse IPC than a Pentium 4? Now you're starting to sound like one of those Cell fans who compare CPUs based on floating-point performance and theoretical GFLOPS.
"I have a Bobcat and a Pentium based Core, and they aren't even close. So, we're in fantasy land about this processor that has to be valid for the next seven years or so. That's your first mistake."
Bobcat has just 2 cores. Per the article, the PS4 may have 8 Jaguar cores, or 4x the number of cores. Also, what does "has to be valid for the next seven years or so" mean? People still buy PS2, Wii, and PS3/360 consoles despite all of them having outdated graphics. For a console to last, it's purely about sales, not graphics. If you just want to talk about graphics or maxing out a console, then the PS3/360 were 'outdated' in 2007 when Crysis came out on the PC. Regardless, the Xbox 720/PS4 can't make the same lasting impression, if you will, as their predecessors, because the PS3 cost Sony $800+ to manufacture while the Xbox 360 cost $525. Sony's market cap is barely above $15 billion. Given that the PS3 was overall unprofitable for Sony since inception, after you take into account hardware losses plus software/accessory profits, it stands to reason the PS4 won't be some $700-800 console sold for $499-599 at retail. Not sure what exactly you were expecting.
"Second mistake is not realizing they can bring the power down on the PD pretty dramatically with lower voltages, and still end up with superior performance to the Jaguar. They are very different in performance."
I never said they are not different. I would much rather a console used something like an A10-6700 than an 8-core Jaguar.
As I mentioned already, even a quad-core Piledriver system (CPU + motherboard) uses way too much power for a console. An A10-5800K system uses 148W of power without the GPU being active:
Piledriver 6000-series parts are still going to be made on the 32nm node. All I am doing is commenting on what happens if the consoles have a Jaguar CPU and giving reasons why even a quad-core Piledriver might be too costly or too power-hungry. This is not about what hardware you would put into the PS4 if you were on the project, but what we are hearing based on rumors.
"Last, and most importantly, in case you didn't realize it, Jaguar is a different architecture than Bulldozer. You don't send out Bulldozer samples, if you are going to have a Jaguar based product. You wouldn't optimize the same way."
Aren't you contradicting yourself? You selectively choose to believe that the Bulldozer dev kit rumor is true but the Jaguar rumor is false. What if both of those rumors are false, or just the Bulldozer-in-the-dev-kit one is?
You also seem to love comparing things on paper, touting how awesome the Power CPU architectures in the PS3/360 were while ignoring real-world results.
The Wii U's CPU is a 3-core 1.2GHz out-of-order design, and the GPU is a low-end RV7xx-series part with just 12.8GB/sec of memory bandwidth.
Care to explain why games like Trine 2 look so much better on the Wii U despite such "weak specs" compared to the PS3/360 consoles?
Trine 2 is one of the first games where the developer didn't just port the 360/PS3 version to the Wii U but ported the PC game directly and took advantage of its more modern hardware:
"During an interview with NintendoLife, Frozenbyte’s sales manager Mikael Haveri commented that some of the content in Trine 2 for Wii U wouldn’t run on the PS3 or Xbox 360 without downscaling the graphics."
If the Wii U is already capable of graphics superior to the PS3/360 with some optimization for its 1.2GHz 3-core out-of-order CPU, don't you think the PS4/Xbox 720 will blow the PS3/360 away graphically with an 8-core out-of-order Jaguar and an HD 7000-series GPU? Yes, they will, without any effort.
Ah, wonderful, so throwing personal insults is part of your vast 'technical knowledge'? Errm, not sure about that, but one thing you can be sure of is earning your ban.
@BestJinjo, it's a waste of time to engage in conversations with illiterate people of this nature; they merely make you waste your time by getting you to type up all the valid info, tiring you out. Unless you don't get tired, in which case have at it. They get a rise out of seeing people take pains to correct their nonsense.
But he has reached the end of the line with that insult he directed at you.