

Discussion on Article:
Microsoft Xbox Next and PlayStation 4 to Feature Eight-Core AMD Jaguar Microprocessors – Report.

Started by: massau | Date 01/22/13 02:40:27 AM
Comments: 33 | Last Comment:  12/19/15 04:39:59 PM



RAM is cheap at the moment, so they had better put a lot more in, like 16GB. Then you could be sure they have enough RAM for the next 8 years or so (since consoles live for up to 8 years).
4 6 [Posted by: massau  | Date: 01/22/13 02:40:27 AM]

Cost is only one factor. You need a lot of DDR3 chips to get to 16GB cheaply and in a small space, and they're probably limited to 8 chips, maybe 16 if they put them on both sides of the motherboard. That means using the higher-density 8Gb chips, which are not cheap; look up how much a single 16GB DDR3 DIMM costs.
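As a rough sketch of the chip math above (in Python; the 16GB target and chip counts are the figures from this post):

```python
# Per-chip DRAM density (in gigabits) needed to reach a 16 GB total
# with a limited number of DDR3 packages on the board.
TARGET_GB = 16

def density_gbit(chip_count):
    """Density each chip must have, in gigabits, to reach TARGET_GB."""
    return TARGET_GB / chip_count * 8  # DRAM density is quoted in bits

print(density_gbit(8))   # 16.0 -> 16Gb chips needed with only 8 packages
print(density_gbit(16))  # 8.0  -> the pricey high-density 8Gb chips
```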
1 0 [Posted by: sollord  | Date: 01/22/13 05:36:18 AM]
You don't need as much memory in a console as you do in a PC. You don't run full Windows in the background while playing on a console.
4 0 [Posted by: john_gre  | Date: 01/22/13 11:02:03 AM]
90% of games cannot use more than 2GB (3.5GB for execution) of RAM anyway, because they are still coded as 32-bit applications...
4 4 [Posted by: TAViX  | Date: 01/22/13 05:36:27 AM]
These consoles will be 64-bit: "AMD Jaguar x86 64-bit cores".
4 2 [Posted by: flyboy294  | Date: 01/22/13 09:45:00 AM]
The hardware will be, yes; the question is whether the software will be as well.
Personally I would expect that shift to be made, but it's not something I would bet on.
3 0 [Posted by: DIREWOLF75  | Date: 01/22/13 10:23:48 AM]
16GB isn't as important in a console as it is in a PC, since there isn't the overhead of running a full OS.
5 1 [Posted by: KeyBoardG  | Date: 01/22/13 07:24:44 AM]

Surprise!... A CPU based on AMD's Jaguar micro-architecture, not Bulldozer. Well done, AMD. Bring that CPU to the desktop as well...
6 1 [Posted by: tks  | Date: 01/22/13 04:26:30 AM]

[hidden post]
1 4 [Posted by: sollord  | Date: 01/22/13 05:43:14 AM]
I think the 15% is at the same frequency (i.e. IPC); Jaguar also clocks higher and uses less power, so it ends up being around 40-50% faster with the frequency increase while using the same power.
3 0 [Posted by: parkerm35  | Date: 01/22/13 08:52:33 AM]
I am honestly surprised. I was thinking the PS4 might use a 4-core Trinity APU with a dedicated GPU. Interesting that they are going with a 1.6GHz 8-core Jaguar setup instead of a 3.0GHz 4-core AMD APU. I would personally have gone with the faster quad-core, as it's a lot more difficult to code games to use 8 cores efficiently.
3 0 [Posted by: BestJinjo  | Date: 01/22/13 04:33:28 PM]

8 cores should be good enough for a while, and I'm sure game developers will take advantage of all of those cores in time.
2 0 [Posted by: SteelCity1981  | Date: 01/22/13 11:08:55 AM]

Yes, programmers make games for whatever hardware the target machine has. If it's 8 cores, games will soon start to be optimized to utilize them all.
Optimizing (finally!) for more parallel execution is clearly the future, and it's long overdue. It's a shame that the advantages of existing architectures (e.g. more cores, 64-bit CPUs) are not utilized more, even though they have been available for years...
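As a toy illustration of the kind of parallel decomposition being argued for (a hypothetical workload in Python, not actual console code):

```python
# Spread independent chunks of per-frame work (AI, physics, audio, ...)
# across a pool of workers, one chunk per core of a hypothetical 8-core CPU.
from concurrent.futures import ThreadPoolExecutor

def game_task(n):
    """Stand-in for an independent chunk of game work."""
    return sum(i * i for i in range(n))

with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(game_task, [10_000] * 8))

print(len(results), "chunks completed")  # 8 chunks completed
```

Real engines use C/C++ job systems rather than Python threads; this only illustrates the fan-out idea.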
2 0 [Posted by: snakefist  | Date: 01/23/13 02:31:09 AM]
In theory that sounds great, but it didn't work out that way at all for the Xbox 360/PS3. The Xbox 360 had a 3-core, 6-thread CPU, while the PS3 had a single main core with 6 supporting smaller cores. How many games are threaded beyond 4 threads on those consoles, or on the PC? Those consoles have been out since 2005/2006. Game code is notoriously poorly multi-threaded.

Sounds like the reason they are going for an 8-core Jaguar is lower power consumption and cost savings, not performance. It's basically a tablet/low-end laptop CPU, not a performance CPU. An A10-5800K or Richland quad would have been faster and easier to optimize for, with only 4 cores.
1 0 [Posted by: BestJinjo  | Date: 01/23/13 09:47:25 AM]
I don't know how many games are threaded across those cores on the consoles, but I'm sure not very many, considering the GPU and system RAM would bottleneck any real potential of the extra cores for graphics. The only real use I can see for those cores on the Xbox 360 and PS3 is managing multiple operations, more than anything else, e.g. for Kinect and the PS3's wand.

Sony is waiting to see what hardware the Xbox 720 brings out so it can try to outdo it.

Maybe we will see a Richland APU in the PS4. As reported, Sony is going with a Fusion-based APU, but as of yet no one knows anything further than that. If the PS4 goes with Richland, that would definitely make it faster than the Xbox 720's Jaguar APU.
0 0 [Posted by: SteelCity1981  | Date: 01/23/13 08:03:01 PM]

Since they are spending extra money on 8 cores, and each of these cores is not high-performance, MS and Sony expect game developers to code smartly in order to efficiently utilize all the available cores and get the highest performance. Plus they will have to design game engines to use GPU compute whenever possible.

Soon, normal laptops with a decent APU will be able to run the latest PC ports of Xbox Next/PS4 games.
1 0 [Posted by: gjcjan  | Date: 01/22/13 11:10:57 AM]

While they are not high-performance cores like desktop Bulldozer/Vishera/SB/IVB/Haswell CPUs, compared to the Cell and the Xbox 360's Xenon it'll still be a huge improvement, as those processors had lower IPC than a Pentium 4. The Cell was actually a 1-core CPU with 6 supporting SPE engines. Without optimizations, that CPU was a complete dog.

Looks like the next consoles from Sony and Microsoft will supposedly share similar CPUs/architecture, which should make cross-platform development much easier. The 720 is essentially a Windows PC, and the PS4 is basically a Linux PC. This way, developers could port things easily between the 720, PC, PS4 and SteamBox. I am looking forward to decently ported console games on the PC now that the Xbox Next / PS4 will have x86 CPUs and hardware parts that are very similar to PC components.
3 2 [Posted by: BestJinjo  | Date: 01/22/13 04:28:56 PM]
This is good news for PC gaming: ports made easy.
3 0 [Posted by: vid_ghost  | Date: 01/22/13 07:10:25 PM]
[hidden post]
1 5 [Posted by: TA152H  | Date: 01/22/13 09:17:04 PM]
Of course IPC is very important; haven't you learned this from the Core 2 Duo vs. Pentium 4/D days? Or are you still stuck comparing GHz?

A Xenon core at 3.2ghz is slower than one Pentium 4 3.2ghz core. In the context of modern CPUs, developers have estimated the IPC of the Xenon at just 1/4 of a Core i7. A Jaguar core at 1.6Ghz is going to be faster than a Xenon at 3.2ghz, not slower like you are suggesting.

You are comparing an in-order Xenon to out of order Jaguar with 128-bit floating point and 256-bit AVX instructions. If they code to take advantage of AVX, the CPU will cream the Xenon. Even without it, it will trounce it.

The entire Xbox 360 Xenon CPU is only 70-85% as fast as just one core of a 1st-generation i7.

You are giving Xenon way too much credit. A 2.13ghz Core 2 Duo was 50-70% faster in games compared to G5-based IBM PowerPC CPUs that Apple replaced at the time. Xenon is very similar to the G5 PowerPC architecture. A Jaguar is a lot closer in IPC to Core 2 Duo, and you are getting 8 of those cores with updated modern instructions that exponentially increase throughput, plus out of order architecture. It will mop the floor with the tri-core Xenon.

Did you ever play Dark Souls on the Xbox 360/PS3? It's a 15 fps chugfest in Blighttown (a heavily CPU-limited area), despite the game rendering at just 1024x720.

An Athlon X4 620 at 2.6GHz can maintain a 30 fps minimum at 2560x1600.

Sorry, but the CPUs in the PS360 are slow and always have been. The key reason games run well on those consoles is graphical short-cuts (rendering at 20-30 fps, no DX11 effects, low-resolution textures, texture pop-in, rendering at low resolutions like Black Ops 2 at 880x720 or Uncharted 3 at 896x504), plus the fact that developers spend years coding directly to the metal (no API/OS overhead) and optimizing for the outdated hardware.
4 1 [Posted by: BestJinjo  | Date: 01/23/13 09:55:39 AM]
[hidden post]
1 4 [Posted by: TA152H  | Date: 01/23/13 11:16:03 AM]
E-450 1600mhz = 1030 3DMark06
Intel Core 2 Duo T5470 1600mhz = 1365 3DMark06

Jaguar has a >15% increase in IPC over Bobcat. (0.95 --> 1.10 or 16% increase in IPC)

1030 * 1.16 IPC increase => 1195.

That's 88% of the IPC of the Core 2 Duo or 1.88x the IPC of Pentium 4, and Pentium 4 IPC > Xenon! Your 3.2ghz Xenon core would not even be as fast as a single 1.6Ghz Jaguar core and the Xenon has just 3 of those vs. 8 Jaguar cores. Your math doesn't add up.
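Spelling out the arithmetic in this post (the 3DMark06 scores and the 16% uplift are the figures quoted above):

```python
# Per-clock (IPC) comparison at 1.6 GHz: Bobcat + Jaguar uplift vs. Core 2 Duo.
e450_score = 1030      # AMD E-450 (Bobcat) 3DMark06 CPU score
c2d_score = 1365       # Core 2 Duo T5470 3DMark06 CPU score
jaguar_uplift = 1.16   # claimed ~16% IPC gain of Jaguar over Bobcat

jaguar_est = e450_score * jaguar_uplift
print(round(jaguar_est))                 # 1195
print(round(jaguar_est / c2d_score, 2))  # 0.88, i.e. ~88% of C2D per clock
```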

You are saying that Jaguar's IPC is "not even close" to C2D? 88% looks close enough. You didn't even take into account new instructions it's getting like 256-bit AVX, etc. Also, that's just 15% IPC increase in general apps, not games. Jaguar doubles the FP pipe from 64-bit to 128-bit for one pass SSE execution. Gaming performance will improve.

Also, these are just rumored specs, not confirmed. Either way, what you are saying makes little sense: Sony and MS would not replace the CPUs in the Xbox 360/PS3 with slower CPUs. That would make it impossible to make next-generation games on their consoles, and it would defeat the purpose of adding next-generation GPUs too, because the consoles would be completely CPU-limited. So even if we ignore all the specs for a second, what you are saying is not even logical. With the PS4 rumored to feature a GCN GPU with 18 compute units (1152 stream processors) clocked at 800MHz, its 1.84 TFLOPs of performance puts it close to an HD 7850 2GB. The engineers at Sony wouldn't be so stupid as to pair an HD 7850-class GPU with a CPU that's barely as fast as (or, as you are implying, slower than) the Xbox 360's Xenon/Cell. Sorry TA152H, your logic doesn't add up.
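The rumored 1.84 TFLOPs figure follows directly from those specs (a quick check using the standard GCN peak-FLOPs formula; the spec numbers are the rumor's, not confirmed):

```python
# Peak single-precision throughput for the rumored PS4 GPU (GCN).
compute_units = 18
sp_per_cu = 64        # GCN has 64 stream processors per compute unit
clock_ghz = 0.8       # 800 MHz
flops_per_clock = 2   # a fused multiply-add counts as 2 FLOPs

stream_processors = compute_units * sp_per_cu
tflops = stream_processors * flops_per_clock * clock_ghz / 1000
print(stream_processors, round(tflops, 2))  # 1152 1.84
```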

Like I said, the CPUs in the Xbox 360/PS3 can't even hold 30 fps in Blighttown at 1024x720, while a lowly Athlon X4 620 at 2.6GHz manages 30 fps at 2560x1600. That tells you right there how weak the CPUs in the current consoles are. Obviously no one is expecting a Core i7 4770K in a PS4, but whatever CPU they go with, it will mop the floor with the Cell on all real-world metrics that matter for games.
4 1 [Posted by: BestJinjo  | Date: 01/23/13 02:02:04 PM]
[hidden post]
0 4 [Posted by: TA152H  | Date: 01/23/13 05:41:29 PM]
You forgot that the E-450 Bobcat part I put up is a dual-core CPU. An 8-core Jaguar is 4x that core count, plus a 15% increase in IPC.

So we'd have an 8-core Jaguar at 1.6ghz with all cores utilized being:

E-450 1030 3DMark06 * 1.15 * 4 times as many cores = 4738.

Obviously there is no perfect core scaling, but a Core i5-2500K scores 5853. I am telling you, an 8-core Jaguar would destroy a 3-core Xenon.
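The scaling estimate above as a quick check (perfect core scaling is the simplifying assumption made in this post):

```python
# Idealized scaling from a dual-core E-450 score to an 8-core Jaguar estimate.
e450_dual_core = 1030  # E-450 3DMark06 CPU score (2 Bobcat cores)
ipc_uplift = 1.15      # ~15% IPC gain claimed for Jaguar
core_multiple = 4      # 8 Jaguar cores vs. 2 Bobcat cores

estimate = e450_dual_core * ipc_uplift * core_multiple
print(round(estimate))  # 4738, vs. ~5853 for a Core i5-2500K
```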
3 0 [Posted by: BestJinjo  | Date: 01/24/13 09:26:55 AM]
Yeah, as I said, let's take these rumors with a grain of salt. Here is a different rumor stating the PS4 might have a Bulldozer CPU.
3 0 [Posted by: BestJinjo  | Date: 01/23/13 03:23:59 PM]
[hidden post]
0 3 [Posted by: TA152H  | Date: 01/24/13 01:02:20 AM]
It could be the Piledriver FX-8300 series. If the consoles are aiming at 1920x1080 with AA, then unless their GPU is at least a GTX 670, the console will be 90% GPU-limited with an FX-8300-series chip, because CPU-limited games like MMOs and RTS games aren't on consoles, while GPU-limited games like BF3, Sleeping Dogs, Max Payne 3 and Hitman: Absolution are.

You've got to consider that once you are talking about next-generation DX11 games, raise the resolution from the 880x720-1280x720 of most PS360 games, and add anti-aliasing, the workload will shift almost entirely to the GPU. They could use a slightly downclocked Bulldozer or Vishera to keep power consumption in check.

Given that any Intel i5/i7 is out of the question due to Intel's costs/margins, Bulldozer/Vishera would actually be the second-best possible processor to use in a console long-term. They could play around with Turbo (say, when 2 of 4 modules are active it could Turbo to 4.0GHz, and when all 4 modules are used, clocks are limited to 2.8-3.0GHz or so). Even a 2.8-3.0GHz Bulldozer/Vishera would be a much better choice than any IBM CPU or an 8-core Jaguar.
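The module-aware Turbo idea could be sketched as a simple lookup (the clocks here are the hypothetical numbers from this post, with an interpolated value for 3 active modules):

```python
# Hypothetical Bulldozer-style Turbo: fewer active modules -> higher clock.
TURBO_GHZ = {1: 4.0, 2: 4.0, 3: 3.4, 4: 3.0}  # active modules -> clock (GHz)

def turbo_clock(active_modules):
    """Allowed clock, in GHz, for a given number of active modules."""
    return TURBO_GHZ[active_modules]

print(turbo_clock(2))  # 4.0 -> half the modules loaded, full Turbo
print(turbo_clock(4))  # 3.0 -> all modules loaded, limited clock
```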
3 0 [Posted by: BestJinjo  | Date: 01/24/13 09:31:06 AM]

Somehow the high volume of relatively slow memory + mystery hardware accelerators reminds me of the Caustic ray-tracer hardware shown at CES ... but probably no connection. What other heterogeneous architecture accelerators could be useful alongside the usual GPU, video accelerator, and cryptography accelerator? I guess vision accelerators could have special hardware for edge and motion detection.
0 0 [Posted by: gc  | Date: 01/22/13 10:19:22 PM]

Seems legit. They will want to keep power consumption down to avoid Red Ring of Death and Yellow Light of Death failures, and AMD's Jaguar will do the job just right: all the performance they need from a customized architecture, while drawing less power.
2 0 [Posted by: K1107  | Date: 01/23/13 02:16:19 AM]

RROD was simply a result of cheap thermal paste/glue in the cooling system. Over time the heatsink would separate from the chip and the chip would overheat and die, naturally. If MS had spent $2 on high-quality thermal paste/glue, it wouldn't have been a problem even with 250W of system power consumption.

YLOD was related to Nvidia's infamous bump-gate scandal, when they decided to cut costs and use lead-free solder. As the GPU heated and cooled over time, cracks/fissures appeared in the solder and the GPU would lose contact with the motherboard at the solder-ball contact points. This is why you can temporarily "revive" a PS3 with a heat gun applied across the GPU/motherboard, and why to properly fix a PS3's YLOD you have to 'reball' the RSX GPU with new solder.

Again, neither of these issues was caused by high power consumption; they were due to cost cutting and the use of low-quality components. This is no different from GeForce 8 cards eventually dying due to bump-gate solder fissures (like YLOD), or a cooler mounted incorrectly on a 50W GPU frying the chip when you power the system on (like RROD).

Consider that if you can cool two GTX 680s in a GTX 690, with its nearly 300W of power usage, in such a small space, then it's doable if you are willing to pay for high-end cooling parts. A 95W CPU/APU and a 100W HD 7970M would have been perfectly manageable to cool in a console. You can even fit a GTX 660 Ti/670 with a Core i7 2600K inside a chassis the size of the original PS3 -- the Alienware X51.

Seems like the primary reason they are going with the low-power 8-core Jaguar CPU instead of a Trinity/Richland APU is cost cutting.
2 0 [Posted by: BestJinjo  | Date: 01/23/13 10:52:13 AM]

No SSDs, long load times, no thank you.

At least the PC ports won't suffer as much from dumbed-down AI (8GB vs. 512MB of memory). The x86 processor helps in this respect too. Even with an SSD, consoles will never support mods, and you'll have to spend a fortune on games.
0 0 [Posted by: ericore  | Date: 02/08/13 02:13:47 PM]

