Specifications of a Sony PlayStation 4 software development kit have been published by a web-site, confirming the PC-like nature of the next-generation video game system from Sony Corp. and the fact that it will rely on a central processing unit and a graphics processing unit designed by Advanced Micro Devices. While the devkit's specs are not the actual specification of the console, they provide a lot of clues.

The Sony PlayStation 4 development kit, model number DVKT-KS000K, is based on an eight-core AMD FX-series processor with “Bulldozer” cores and an AMD Radeon HD “R1000-series” graphics processing unit (a DirectX 11.1-class Sea Islands product based on the Graphics Core Next architecture with some tweaks), and is equipped with 8GB of system memory and 2.2GB of graphics memory, reports the Kotaku web-site. The system also features a Blu-ray disc drive, a 160GB hard drive, four USB 3.0 ports, two Ethernet ports, an HDMI video/audio output and an optical output for 2.0, 5.1 and 7.1 audio.

Keeping in mind the fact that AMD’s high-performance x86 Bulldozer and Piledriver cores are pretty large and expensive to make (FX-series microprocessors based on both types of cores are currently manufactured using a super-high-performance 32nm HKMG process technology on expensive SOI wafers), it is unlikely that the Sony PlayStation 4 will rely on a version of the FX-series chip with eight cores. As reported before, both the Microsoft Xbox “Durango” and the Sony PlayStation 4 “Orbis” are going to be based on highly-integrated system-on-chips featuring eight low-power/low-cost AMD Jaguar x86 64-bit cores.



AMD’s Jaguar looks very promising on paper and has a number of advantages that may be especially valuable for game consoles, including a 128-bit floating point unit (FPU) with enhancements and double-pumping to support 256-bit AVX instructions, as well as an improved integer unit with a new hardware divider, larger schedulers and more out-of-order resources. In addition, AMD Jaguar supports the same new instructions as Bulldozer/Piledriver, including SSE4.1, SSE4.2, AES, PCLMUL, AVX, BMI, F16C and MOVBE (which is why the PS4 devkit can use already available AMD FX-series chips). Another advantage of using AMD’s Jaguar for video game consoles is the relatively simple design of the core, which lets Microsoft and Sony order chips powered by Jaguar from different foundries without major problems porting the design to a different process technology. Yet another benefit of Jaguar is its small size (just 3.1mm² per core, without L2 cache), which makes it possible to integrate eight such cores into one chip without significantly increasing costs.
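As a rough illustration of the instruction-set support listed above, the sketch below maps the article's extension names to the flag spellings Linux uses in the "flags" line of /proc/cpuinfo and checks a sample flags string (the sample is illustrative, not a dump from real hardware):

```python
# Map the extension names from the article to the Linux /proc/cpuinfo
# flag spellings (e.g. PCLMUL is reported as "pclmulqdq", BMI as "bmi1").
JAGUAR_ERA_EXTENSIONS = {
    "SSE4.1": "sse4_1",
    "SSE4.2": "sse4_2",
    "AES": "aes",
    "PCLMUL": "pclmulqdq",
    "AVX": "avx",
    "BMI": "bmi1",
    "F16C": "f16c",
    "MOVBE": "movbe",
}

def supported_extensions(flags_line: str) -> dict:
    """Return {extension name: True/False} for a /proc/cpuinfo flags line."""
    flags = set(flags_line.lower().split())
    return {name: flag in flags for name, flag in JAGUAR_ERA_EXTENSIONS.items()}

# Illustrative flags line (a subset only, not taken from a real CPU):
sample = "fpu sse sse2 sse4_1 sse4_2 aes pclmulqdq avx f16c movbe"
print(supported_extensions(sample))
```

On a real Linux machine one would feed in the actual flags line read from /proc/cpuinfo instead of the sample string.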

Obviously, AMD’s Jaguar is substantially behind the company’s high-end x86 cores when it comes to general-purpose performance, and therefore some operations may take a long time to complete unless special-purpose accelerators are integrated or the consoles rely heavily on GPGPU (general-purpose computing on GPUs) technologies.

The specifications of the PlayStation 4 development kit confirm that the PS4 “Orbis” will be largely based on off-the-shelf components tightly integrated into a custom system-on-chip along with Sony’s own IP. The next-generation PlayStation is expected to get eight all-new x86 cores with the latest instructions as well as a future-generation AMD Radeon HD DirectX 11.1-class graphics engine, which should provide decent performance and decent quality of visuals. What remains to be seen is what secret sauce Sony will add to the custom SoC designed by AMD, as the company will hardly rely on pure PC technologies, given its traditions.

What is a bit alarming is that this PS4 development kit is reportedly in use today. Less than a year before the launch of the console, at least some game developers are not working with final hardware, and the final console is reportedly going to be significantly less powerful than the PlayStation 4 “Orbis” devkit.

Sony did not comment on the news story.

Tags: Sony, Playstation, Xbox, Xbox Next, Microsoft, Loop, Durango, Odin, Omni, Orbis

Discussion

Comments currently: 20
Discussion started: 01/25/13 11:02:00 AM
Latest comment: 01/29/13 08:57:50 PM


1. 
I don't see anything alarming about using the devkit now. I suspect that the Jaguar based custom CPUs for these consoles will be far more capable than most people expect.
10 2 [Posted by: beenthere  | Date: 01/25/13 11:02:00 AM]

2. 
How about fixing load times? Anyone remember when Sony and Microsoft said load times would be faster on the PS3 and Xbox 360, when they were just as long as they were on the Xbox and PS2? I mean, that would be a plus right off the bat. Throwing in an SSD would make load times a lot faster. I hope that 160GB drive is an SSD; I'd settle for an 80GB SSD over a regular 160GB hard drive any day.
8 2 [Posted by: SteelCity1981  | Date: 01/25/13 01:39:05 PM]

 
The load times weren't much faster because the consoles were handling way more data even with much more RAM; the N64 had the same problem: cartridge data transfer speeds didn't negate the need for RAM speed/amount (and Nintendo was full of crap in their marketing of the system). Sony and Microsoft pretty much had the exact same issue.

Personally, I wish they'd release a model without any kind of drive at all so I could just slap a 2TB one in there without having a wasted hard drive; they could either just cut that out of their production costs or lower the price of the console to make up for it. Either way, I really wish that was an option.
2 2 [Posted by: Facelord  | Date: 01/26/13 10:50:24 AM]

3. 
(post hidden due to negative rating)
0 10 [Posted by: TA152H  | Date: 01/26/13 02:48:03 PM]

 
You are basing your Jaguar info on what? Do you have a Jaguar sample with you right now? Nope, you don't.

As for specs on the consoles, there are a lot of specs out there as rumor; some have got some of it right, while most of it is inaccurate.

Wait for the Jaguar reviews to be out and then decide what it is worth. If you are still unsatisfied, you can sell the shares you said you have in AMD. Making up pseudo-science analysis is pointless.
7 1 [Posted by: vanakkuty  | Date: 01/27/13 03:39:22 AM]
 
(post hidden due to negative rating)
0 6 [Posted by: TA152H  | Date: 01/28/13 03:02:46 PM]
 
Yeah, I know a lot about these things, working in this industry as a VLSI/PD engineer, so save the patronizing talk about the subject; you don't know it.

You don't really know where I work; I haven't told anyone. I have my means of knowing a lot about various companies, roadmaps, products, their specs, etc. AMD is one of them.

AMD has told the world what's in the slides; you don't have any idea about the final specs for Kabini and Temash, yet you say it's no good for said purpose.

You claim you have AMD shares, yet all your posts are mostly uninformed material made from assumptions or pure misinformation when it comes to AMD. The distinct feeling here is that your claim of having AMD shares is being used as a cover for your poor posts. For a shareholder, you have a very poor understanding of AMD as a company.

"The reality is, the IPC of such a device is probably too low for a console that needs to last for another six years. There are better choices out there.

Also, what sense would it make that they send out BD samples, and are going to use the Jaguar? They are different architectures, in case you didn't know.

Now you do."

You don't even know how development of games is done. Using higher-end machines is always part of the development process; they have always been the test beds from which developers de-tune to meet target specs. Performance modelling is a big part of any software development process. This tells me your vocation has nothing to do with software development. That's fine, but making up misinformation doesn't help your case.

There are a lot of specs thrown out there for the consoles; most of it is wrong, some have got it right. Yes, I also know the full details on both consoles, which is a reason why I don't engage in such discussions of spec speculation. That is for posters in here to enjoy engaging in.

********

Can you read, or just a fan-boi without knowledge?

Oh, and I haven't overlooked the fact that you were blind-reading posts last time, asking me to look before posting when in fact it was you who replied to the wrong person (me), accusing me of needing to look before posting. I was conversing with another person entirely.

You haven't shown the least bit of decency to apologize for your mistake.

I suggest you tidy up your act a bit; your profile isn't very convincing, especially given your actions.
4 1 [Posted by: vanakkuty  | Date: 01/28/13 07:19:08 PM]
 
Actually, I didn't make any mistakes, except to post with someone who's obviously a liar.

You don't know anything about the subject.

You don't send development machines for completely different architectures unless there are no better choices. BD has two integer units, and optimizing for it is very different. That's the point.

Also, games show better performance with fewer, more powerful processors. Didn't you know that?

I have AMD shares, and I know what AMD says about their products. Some blowhard without any knowledge doesn't trump what AMD says, except in his own little world.

AMD has told everyone what to expect from Jaguar, and it's not ideal for a game console. Small people think they know more than companies, but I trust the companies more than some random internet poster who acts like he knows more than he does.

It doesn't add up. It's a poor choice.

By the way, I know you're probably poor, but do you want to put up some money on whether I own AMD shares? If not, who's being the hypocrite by not apologizing?

Oh, and in case logic isn't your strong point, and it clearly isn't, I do think AMD will get these design wins, just not with the chip as it's being described. It's the wrong choice for the product, and the dev kits don't approximate it, but do approximate the much more powerful PD architecture. So, while nothing is ever certain, either they'll have to HEAVILY modify the Jag or use a PD derivative. The latter of the two is far more likely.
0 2 [Posted by: TA152H  | Date: 01/29/13 11:26:02 AM]
 
"Actually, I didn't make any mistakes except to post with someone who's obviously a liar."

So you are still saying you responded to the right person, eh?

It's your first post from the top in that article, and pretty self-explanatory when my post wasn't even for you; others have noticed it and mocked you for double vision.

I shall link the article for everyone to see.

http://www.xbitlabs.com/n...of_Revenue_This_Year.html

And off you go, since trolling seems to be your agenda.

You are out of your element on pretty much everything you say, as others constantly correct you. I already tried the same, and it went over your head. Too much for you to understand, perhaps.

"You don't send development machines for completely different architectures, unless there are no better choices. BD has two integer units, and optimizing for it is very different. That's the point."

You are a clueless person who is neither a software developer nor an EE engineer. Really, please try to educate yourself about basic software development processes. Performance modelling is an everyday thing for many developers of specialized tasks like games, databases and high-transaction HPC applications, to name a few. The modelling is applied depending on what target performance you aim to tune to.

To say this is nonsense is just hilarious! It's a pity you pretend to understand something when it's awfully clear you are blatantly lying.

"Also, games show better performance with fewer, more powerful processors. Didn't you know that?"

You are showing poor understanding once again. The next-generation consoles are aiming to push base resolutions much higher than the average CPU-benchmark resolutions you see in reviews; it is more GPU-limited at higher resolutions. The overheads faced on a console are also completely different from the overheads in a PC environment.

Apparently you don't know that either.

"I have AMD shares, and I know what AMD says about their products. Some blowhard without any knowledge doesn't trump what AMD says, except in his own little world."

Says the one who had no understanding of the flow of semiconductor design through to fabrication until I had to explain it to you in a much oversimplified way, yet you still couldn't understand and conveniently ignored replying to that post. Felt like too much for you? Then don't engage in a subject that is way out of your league.

Here is the article where you got schooled:

http://www.xbitlabs.com/n...ymore_Head_of_Nvidia.html

"AMD has told everyone what to expect from Jaguar, and it's not ideal for a game console. Small people think they know more than companies, but I trust the companies more than some random internet poster who acts like he knows more than he does."

AMD hasn't given out final specs, so estimation is irrelevant.

I already told you I work as a PD engineer and have my means of knowing things about companies of special interest; AMD is one of them. It will become clear to you if you pay attention to the subtle hints I drop from time to time.

You should also prepare to eat humble pie when Kabini and Temash are in reviewers' hands. It will be a hard pill for your ego to swallow.

"By the way, I know you're probably poor, but do you want to put up some money on whether I own AMD shares? If not, who's being the hypocrite by not apologizing?"

You owning shares is about as real as the fringe-science claims of people like David Wilcock or Anton LaVey. These guys said a lot of pseudo-scientific gibberish that fooled, even impressed, the uninformed. That doesn't work around here.

Everyone can see from that link I posted above how you lack any decency. Your apology on that is still pending.

"Oh, and in case logic isn't your strong point, and it clearly isn't, I do think AMD will get these design wins, just not with the chip as it's being described. It's the wrong choice for the product, and the dev kits don't approximate it, but do approximate the much more powerful PD architecture. So, while nothing is ever certain, either they'll have to HEAVILY modify the Jag or use a PD derivative. The latter of the two is far more likely."

I know exactly which companies are involved in the development of these console chips. I also haven't confirmed or denied any particular company or companies having bagged the design wins, and that includes AMD.

Twisting words won't help you.

I don't leak info of that sort to anyone. But know this: you are in for one big surprise, and it won't be good for your ego, because you won't understand how it works out. Your posts here show you don't even have a basic tech-savvy understanding of microprocessors, let alone the ability to debate computer architecture with an engineer.

Once again, look up "performance modelling"; maybe there is some hope for you if you heed the advice to educate yourself.

And you can save the claims about having shares in AMD; you obviously have a very poor understanding of AMD compared to even an average Joe on the street. You can take your AvonX/Jml/123 succession plans/ambitions elsewhere; we need serious posters, or at the very least well-mannered ones. The other kind is not conducive to good discussion.
1 0 [Posted by: vanakkuty  | Date: 01/29/13 08:33:29 PM]
 
Fantasy? Xbitlabs is just reporting on various rumors; the authors are not conclusively stating that Jaguar will be in next-generation consoles. Regardless, even though you continue to deny that an 8-core Jaguar would level the Cell/Xenon CPUs into the ground, even if it's slower than an 8-core Bulldozer/PD, you still haven't explained how you could actually include a Bulldozer/PD inside a PS4, given that an FX-8350 system uses > 200W at load without the GPU being loaded:
http://www.xbitlabs.com/a.../amd-fx-8350_8.html#sect0

Did it not occur to you that even an FX-4300 system uses > 140W of power without the GPU?
http://www.xbitlabs.com/a...20-6300-4300_8.html#sect0

Even A10-5800K CPUs & AM3 motherboards use too much power for a small console case:
http://www.computerbase.d...st-amd-fx-8350-vishera/9/

There are actually two key reasons why Bulldozer/PD may not be a good fit for a console without significant reduction in clocks: Price and power consumption.

Considering how awful the CPUs were in the PS3/360, an 8-core out-of-order Jaguar CPU with 128-bit floating point and the AVX instruction set would be many times faster. Maybe you need to be reminded again that the Cell was just a 1-core CPU with 6 primitive SPE engines. Coding for the PS3 is much harder than for x86 because it is built of 1 PPE and up to 8 SPEs (6 in the PS3). These SPE units are exceptionally difficult to get high levels of parallelism out of because the SPEs are not full cores; they are highly specialized companion cores and require special care to keep busy and running without going idle. So going from that to an 8-core Jaguar is a huge upgrade, since you would get 8 actual stand-alone cores, each able to perform any calculation you want. You couldn't do this with SPEs.

x86 is also much more capable than the PowerPC architectures. Capcom developers once stated that one PowerPC core in the Xbox 360 has 2/3 the performance of a Pentium 4 running at the same clock speed. Jaguar is expected to have 1.88x the IPC of a Pentium 4 because it will have 88% of the IPC of a Core 2 Duo.
7 1 [Posted by: BestJinjo  | Date: 01/27/13 06:30:47 PM]
 
(post hidden due to negative rating)
1 8 [Posted by: er_wendigo  | Date: 01/28/13 03:47:20 AM]
 
OK, since you clearly didn't understand anything I said, I will spell it out in the simplest terms:

A Cell/Xbox 360 CPU core has 66% of the IPC of a Pentium 4. Jaguar has 1.88x the IPC of a Pentium 4.

Cell/Xbox 360: 3.2GHz * 0.66 = 2.112GHz-equivalent Pentium 4

Jaguar: 1.6GHz * 1.88 = 3.008GHz-equivalent Pentium 4

That means a single Jaguar core is more than 40% faster than a single Cell/Xbox 360 CPU core. If they put in 8 of these cores, the CPU will be many times faster than the PS3's Cell or the Xbox 360's CPU.

If you can't understand this simple math, I can't simplify it further for you. I am not even going to get into the technical details of the processors themselves and how the 128-bit floating point upgrade from Bobcat is huge. You say the performance will be "very poor": maybe compared to a modern Core i5/i7, but not compared to the anemic CPUs inside the PS3/360.
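The arithmetic in the comment above boils down to multiplying clock speed by IPC relative to a Pentium 4. A minimal sketch, taking the thread's own 0.66 and 1.88 IPC estimates at face value (they are forum estimates, not measured values):

```python
# "Pentium 4-equivalent" clock: clock in GHz times IPC relative to a P4.
def p4_equivalent_ghz(clock_ghz: float, ipc_vs_p4: float) -> float:
    return clock_ghz * ipc_vs_p4

cell_core = p4_equivalent_ghz(3.2, 0.66)    # Cell/Xenon core at 3.2GHz
jaguar_core = p4_equivalent_ghz(1.6, 1.88)  # Jaguar core at the rumored 1.6GHz

print(round(cell_core, 3))                    # 2.112
print(round(jaguar_core, 3))                  # 3.008
print(round(jaguar_core / cell_core - 1, 2))  # 0.42, i.e. ~42% faster per core
```

Note the per-core advantage on these numbers is about 1.42x; the much larger overall gain claimed in the thread comes from having eight such cores, not from single-core speed.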
6 1 [Posted by: BestJinjo  | Date: 01/28/13 01:02:12 PM]
 
(post hidden due to negative rating)
0 7 [Posted by: TA152H  | Date: 01/28/13 03:07:22 PM]
 
I already showed you the math before, but you keep ignoring it. E-450 + 15% IPC is 88% of the performance of a 1.6GHz Core 2 Duo. Core 2 Duo has a 100% (or 2.0x) increase in IPC over the Pentium 4. I also told you that, logically, your opinion doesn't work: Sony and MS would never replace their 7-year-old consoles with consoles that have a slower CPU. All details aside, whatever CPU they go with, it'll be much faster.
6 1 [Posted by: BestJinjo  | Date: 01/28/13 03:48:13 PM]
 
Did you ever hear the expression "garbage in, garbage out"?

Your math, wherever it is, is based on messed-up numbers.

I already showed you where the performance was HALF of a Celeron's.

I have both a Bobcat and a Pentium-based Core 2. 88% my ass. You're talking about something you don't know anything about.

The Jag will be +15%, but then, a Core 2 using dual-channel memory with a discrete card will be at least 15% faster than my Pentium.

They aren't close. The Bobcat is too slow to even run Netflix in high def. The Pentium does it with ease.

You have to use real numbers, not numbers from one particular benchmark you like, especially 3DMark. It's a much more complex architecture. Jaguar is good at what it is, but performance isn't one of the more important design criteria. It's more like a VIA Nano than anything.
 
(post hidden due to negative rating)
0 7 [Posted by: TA152H  | Date: 01/28/13 03:14:13 PM]
 
"POWER is the highest performing processor in the world, by the way. It's a superior instruction set to x86, and always was, so let's leave that idiotic argument out of this, since it's not important anyway."

Power is a joke for games and always has been. Not only do you have the PS3/360 to prove how inefficient and slow PowerPC CPUs are, but the fastest CPUs for games have all been x86 in the last 10 years, specifically from AMD and Intel. Do you need me to look up gaming benchmarks for the Apple G5 vs. Core 2 Duo from when Apple said PowerPC CPUs were too slow and power-hungry and ditched them for Intel's? Those benchmarks are not pretty.

Also, did you just conveniently ignore that the Xbox 360's CPU was PowerPC-based and that CPU had worse IPC than a Pentium 4? Now you are just starting to sound like one of those Cell fans who compare CPUs based on floating point performance and theoretical Gflops.

"I have a Bobcat and a Pentium based Core, and they aren't even close. So, we're in fantasy land about this processor that has to be valid for the next seven years or so. That's your first mistake."

Bobcat is just 2 cores. Per the article, the PS4 may have 8 Jaguar cores, or 4x the number of cores. Also, what does "has to be valid for the next seven years or so" mean? People still buy PS2, Wii and PS3/360 consoles despite all of those having outdated graphics. For a console to last, it's purely about sales, not graphics. If you just want to talk about graphics or maxing out a console, then the PS3/360 were 'outdated' in 2007 when Crysis came out on the PC. Regardless, the Xbox 720/PS4 can't make the same kind of lasting impression as their predecessors, because the PS3 cost Sony $800+ to manufacture, while the Xbox 360 was $525. Sony's market cap is barely above $15 billion. Given that the PS3 was overall unprofitable for Sony since inception, after you take into account hardware losses plus software/accessory profits, it stands to reason the PS4 won't be some $700-800 console sold for $499-599 at retail. Not sure what you were expecting exactly.

"Second mistake is not realizing they can bring the power down on the PD pretty dramatically with lower voltages, and still end up with superior performance to the Jaguar. They are very different in performance."

I never said they are not different. I would much rather a console use something like an A10-6700 than an 8-core Jaguar.
http://techreport.com/new...eveal-higher-clock-speeds

As I mentioned already, even a Piledriver quad-core system (CPU + mobo) uses way too much power for a console. An A10-5800K system uses 148W of power without the GPU being active:
http://www.computerbase.d...-im-cpu-vergleichstest/8/

The Piledriver 6000 series is still going to be made on the 32nm node. All I am doing is commenting on what happens if the consoles have a Jaguar CPU and providing reasons why even a quad-core Piledriver might be too costly or too power-hungry. This is not about what hardware you would put into the PS4 if you were on the project, but what we are hearing based on rumors.

"Last, and most importantly, in case you didn't realize it, Jaguar is a different architecture than Bulldozer. You don't send out Bulldozer samples, if you are going to have a Jaguar based product. You wouldn't optimize the same way."

Aren't you contradicting yourself? You selectively choose to believe that the Bulldozer dev-kit rumor is true but the Jaguar rumor is false. What if both of those rumors are false, or the Bulldozer CPU in the dev kit is false?

You also seem to love comparing things on paper and touting how awesome the Power CPU architectures in the PS3/360 were while ignoring real-world results.

The Wii U's CPU is a 3-core 1.2GHz OoO design, and the GPU is a low-end RV7xx-series part with just 12.8GB/sec of memory bandwidth.

Care to explain why games like Trine 2 look so much better on the Wii U despite such "weak specs" compared to the PS3/360 consoles?
http://www.eurogamer.net/...lfoundry-trine-2-face-off

Trine 2 is one of the first games where the developer didn't just port the 360/PS3 version to the Wii U but ported the PC game directly and took advantage of its more modern hardware:

"During an interview with NintendoLife, Frozenbyte’s sales manager Mikael Haveri commented that some of the content in Trine 2 for Wii U wouldn’t run on the PS3 or Xbox 360 without downscaling the graphics."
http://www.geek.com/artic...-wii-u-graphics-20121010/

If the Wii U is already capable of superior graphics to the PS3/360 with some optimization for its 1.2GHz 3-core OoO CPU, don't you think the PS4/Xbox 720 will blow the PS3/360 away graphically with an 8-core OoO Jaguar and an HD 7000-series GPU? Yes, they will, without any effort.
6 1 [Posted by: BestJinjo  | Date: 01/28/13 04:39:11 PM]
 
(post hidden due to negative rating)
0 5 [Posted by: TA152H  | Date: 01/29/13 12:02:48 AM]
 
TA152H wrote to BestJinjo:
"Jaguar is expected to have 88% of the IPC of the Core 2 Duo? By whom? You and your dog?"

Ah, wonderful, so throwing personal insults is part of your vast "technical knowledge"? Errm, not sure about that, but one thing you can be sure of is earning your ban.

@BestJinjo, it's a waste of time to engage in conversations with illiterate people of this nature; they merely waste your time by making you post all the valid info, thereby tiring you out typing it all up. Unless you don't get tired, in which case have at it. They get a rise out of seeing people take the pains to correct their nonsense.

But he has reached the end of the line with that insult he directed at you.
0 0 [Posted by: vanakkuty  | Date: 01/29/13 08:57:49 PM]

4. 
(post hidden due to negative rating)
1 7 [Posted by: jihadjoe  | Date: 01/27/13 02:59:29 AM]