
The not-yet-officially-announced next-generation video-game systems from Microsoft Corp. and Sony Corp. are projected to utilize microprocessor technology that is not yet available for personal computers. The recent rumours about the Xbox Next and PlayStation 4 “Orbis” imply that the two systems will be powered by Advanced Micro Devices’ code-named Jaguar x86 cores, which will make it to the market later this year.

Jaguar Micro-Architecture to Power Next-Gen Consoles

According to Eurogamer.net’s Digital Foundry, both Microsoft Xbox “Durango” and Sony PlayStation 4 “Orbis” are going to be based on highly-integrated system-on-chips featuring AMD Jaguar x86 64-bit cores. The SoCs are projected to be conservatively clocked at around 1.6GHz, which should ensure maximum possible yields as well as low temperatures for the multi-core solutions. Keeping in mind that video game consoles are designed to last for many years, it is possible that the SoCs inside the future PlayStation and Xbox will feature certain tweaks, optimizations and innovations that will not be available on personal computers for a while.

The idea of using AMD’s low-power/low-cost cores instead of high-performance x86 cores has both pros and cons. On the one hand, AMD’s Jaguar looks very promising on paper and has a number of advantages that may be especially valuable for game consoles, including a 128-bit floating point unit (FPU) with enhancements and double-pumping to support 256-bit AVX instructions, as well as an innovative integer unit with a new hardware divider, larger schedulers and more out-of-order resources. On the other hand, AMD’s Jaguar is substantially behind the company’s high-end x86 cores when it comes to general-purpose performance, and therefore some operations may take a long time to complete unless special-purpose accelerators are integrated or the consoles rely heavily on GPGPU [general-purpose computing on GPUs] technologies.
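
To illustrate what those AVX instructions look like in practice, below is a minimal, hypothetical C sketch (not code from either console's SDK) that adds eight single-precision floats with one 256-bit instruction; on Jaguar, the double-pumped 128-bit FPU would execute each such operation in two passes:

    /* Minimal AVX sketch; compile with e.g. gcc -mavx avx_add.c */
    #include <immintrin.h>
    #include <stdio.h>

    int main(void)
    {
        float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
        float c[8];

        __m256 va = _mm256_loadu_ps(a);    /* load 8 single-precision floats */
        __m256 vb = _mm256_loadu_ps(b);
        __m256 vc = _mm256_add_ps(va, vb); /* one 256-bit add: 8 floats at once */
        _mm256_storeu_ps(c, vc);

        for (int i = 0; i < 8; i++)
            printf("%.1f ", c[i]);         /* prints 9.0 eight times */
        printf("\n");
        return 0;
    }

The same AVX binary code runs on any AVX-capable core; a double-pumped design such as Jaguar simply delivers roughly half the per-instruction throughput of a CPU with a native 256-bit unit.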

Perhaps the main advantage of using AMD’s Jaguar for video game consoles is the relatively simple design of the core, which lets Microsoft and Sony manufacture Jaguar-powered chips at different foundries without major problems porting the design to a different process technology. Yet another benefit of Jaguar is its small size (just 3.1mm² per core, excluding L2 cache), which allows eight such cores to be integrated into one chip without significantly increasing cost.

Graphics Performance Remains A Mystery

Based on earlier rumours, Digital Foundry also suggests that the graphics processing unit inside the PlayStation 4 “Orbis” is projected to offer higher raw performance than the graphics core of the Xbox “Durango”. However, the web-site claims that the SoC inside Microsoft’s future console incorporates an additional graphics core. Such a GPU makes sense for GPU compute as well as for multimedia and graphics applications that do not require high-performance graphics (Xbox Live apps). Obviously, the same core can assist the main GPU engine when needed; hence, the actual difference in graphics processing performance between the two consoles remains unclear.

Many media sources have also reported that while the Sony PlayStation 4 “Orbis” will rely on 4GB of high-speed GDDR5 memory, the Microsoft Xbox Next “Durango” will feature 8GB of mainstream DDR3 memory as well as a high-speed eDRAM buffer for the GPU, with some fixed-function logic to compensate for the slower memory.
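
As a back-of-envelope illustration of what those memory choices could mean, the short C sketch below computes peak bandwidth from per-pin data rate and bus width; the 256-bit bus widths and the 5.5Gbps GDDR5 / DDR3-2133 data rates are illustrative assumptions, not confirmed specifications:

    /* Back-of-envelope peak-bandwidth figures for the rumoured memory setups.
     * Bus widths and data rates are assumptions for illustration only. */
    #include <stdio.h>

    static double peak_gb_s(double gbps_per_pin, int bus_width_bits)
    {
        return gbps_per_pin * bus_width_bits / 8.0;  /* bits/s -> bytes/s */
    }

    int main(void)
    {
        /* e.g. GDDR5 at 5.5Gbps per pin on a hypothetical 256-bit bus */
        printf("GDDR5, 256-bit @ 5.5Gbps: %.0f GB/s\n", peak_gb_s(5.5, 256));
        /* e.g. DDR3-2133 (2.133Gbps per pin) on a hypothetical 256-bit bus */
        printf("DDR3-2133, 256-bit:       %.0f GB/s\n", peak_gb_s(2.133, 256));
        return 0;
    }

Under those assumed figures, the GDDR5 configuration would offer roughly 176GB/s against about 68GB/s for the DDR3 setup, which is the kind of gap the rumoured eDRAM buffer and fixed-function logic would be intended to hide.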

It is expected that both consoles will be available this holiday season for $350-$400. Formal announcements by Sony and Microsoft are projected for the first half of 2013.

Microsoft and Sony did not comment on the news story.

Tags: Sony, Playstation, Xbox, Xbox Next, Microsoft, Loop, Durango, Odin, Omni, Orbis

Discussion

Comments currently: 32
Discussion started: 01/22/13 02:40:27 AM
Latest comment: 02/08/13 02:13:47 PM

1. 
RAM is cheap at the moment, so they had better put a lot more in, like 16GB; then you could be sure they have enough RAM for the next eight years or so (because consoles live for up to eight years).
4 6 [Posted by: massau  | Date: 01/22/13 02:40:27 AM]

 
Cost is only one factor. You need a lot of DDR3 chips to get to 16GB cheaply, and in a small space they're probably limited to 8 chips, maybe 16 if they put them on each side of the motherboard. That means you need to use the higher-density 8Gb chips, which are not cheap; look up how much a single 16GB DDR3 DIMM costs.
1 0 [Posted by: sollord  | Date: 01/22/13 05:36:18 AM]
 
You don't need as much memory in a console as you need in a PC. You don't run full Windows in the background while playing on a console.
4 0 [Posted by: john_gre  | Date: 01/22/13 11:02:03 AM]
 
90% of games cannot use more than 2GB (3.5GB for execution) of RAM anyway, because they are still coded as 32-bit...
4 4 [Posted by: TAViX  | Date: 01/22/13 05:36:27 AM]
 
These consoles will be 64-bit: "AMD Jaguar x86 64-bit cores".
4 2 [Posted by: flyboy294  | Date: 01/22/13 09:45:00 AM]
 
The hardware will be, yes; the question is whether the software will be as well.
Personally I would expect that shift to be made, but it's not something I would place a bet on.
3 0 [Posted by: DIREWOLF75  | Date: 01/22/13 10:23:48 AM]
 
16GB isn't as important in a console as it is in a PC, since there isn't the overhead of running a full OS.
5 1 [Posted by: KeyBoardG  | Date: 01/22/13 07:24:44 AM]

2. 
Surprise! A CPU based on the AMD Jaguar micro-architecture, not Bulldozer. Well done, AMD. Bring that CPU to the desktop as well...
6 1 [Posted by: tks  | Date: 01/22/13 04:26:30 AM]

 
[post hidden]
1 4 [Posted by: sollord  | Date: 01/22/13 05:43:14 AM]
 
I think the 15% is at the same frequency (IPC); they clock higher and use less power, so it ends up being around 40-50% faster with the frequency increase whilst using the same power.
3 0 [Posted by: parkerm35  | Date: 01/22/13 08:52:33 AM]
 
I am honestly surprised. I was thinking the PS4 might use a 4-core Trinity APU with a dedicated GPU. Interesting that they are going with a 1.6GHz 8-core Jaguar setup instead of a 3.0GHz 4-core AMD APU. I would personally have gone with the faster quad-core, as it's a lot more difficult to code games efficiently across 8 cores.
3 0 [Posted by: BestJinjo  | Date: 01/22/13 04:33:28 PM]

3. 
8 cores should be good enough for a while, and I'm sure game developers will take advantage of all of those cores in time.
2 0 [Posted by: SteelCity1981  | Date: 01/22/13 11:08:55 AM]

 
Yes, programmers make games for whatever hardware the target machine has. If it is 8 cores, games will soon start to be optimized to utilize them all.
Optimizing (finally!) for more parallel execution is clearly the future, and it is long overdue; it's a shame that the advantages of existing architectures, e.g. more cores, 64-bit CPUs, etc., are not utilized more, even though they have been available for years...
2 0 [Posted by: snakefist  | Date: 01/23/13 02:31:09 AM]
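
As a rough sketch of the data-parallel split that an 8-core design pushes developers towards, the hypothetical C/pthreads snippet below carves one frame's worth of per-entity updates into eight slices, one per core (illustrative only, not code from any actual engine):

    /* Split a per-frame update across eight worker threads.
     * Compile with e.g. gcc -pthread frame_split.c */
    #include <pthread.h>
    #include <stdio.h>

    #define NUM_CORES 8
    #define NUM_ITEMS 8000

    static float positions[NUM_ITEMS];

    struct slice { int begin, end; };

    static void *update_slice(void *arg)
    {
        struct slice *s = arg;
        for (int i = s->begin; i < s->end; i++)
            positions[i] += 0.016f;       /* pretend per-entity update */
        return NULL;
    }

    int main(void)
    {
        pthread_t threads[NUM_CORES];
        struct slice slices[NUM_CORES];
        int chunk = NUM_ITEMS / NUM_CORES;

        for (int t = 0; t < NUM_CORES; t++) {
            slices[t].begin = t * chunk;
            slices[t].end   = (t == NUM_CORES - 1) ? NUM_ITEMS : (t + 1) * chunk;
            pthread_create(&threads[t], NULL, update_slice, &slices[t]);
        }
        for (int t = 0; t < NUM_CORES; t++)
            pthread_join(threads[t], NULL);

        printf("updated %d entities on %d threads\n", NUM_ITEMS, NUM_CORES);
        return 0;
    }

Real engines use job/task schedulers rather than spawning fixed per-frame threads, but the underlying idea of splitting per-frame work into core-sized chunks is the same.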
 
In theory that sounds great, but it didn't work out that way at all for the Xbox 360/PS3. The Xbox 360 had a 3-core, 6-thread CPU, while the PS3 had a single main core with 6 smaller supporting cores. How many games are threaded beyond 4 threads on those consoles, or on the PC? Those consoles have been out since 2005/2006. Game code is notoriously poorly multi-threaded.

It sounds like the reasons they are going for an 8-core Jaguar are lower power consumption and cost savings, not performance. It's basically a tablet/low-end laptop CPU, not a performance CPU. An A10-5800K or Richland quad would have been faster and easier to optimize for with its 4 cores.
1 0 [Posted by: BestJinjo  | Date: 01/23/13 09:47:25 AM]
 
I don't know how many games are threaded across those cores on the consoles, but I'm sure not very many, considering the GPU and system RAM would bottleneck any real potential of those extra cores for graphics use. The only real use I can see for those cores on the Xbox 360 and PS3 would be managing multiple operations more than anything else, like for Kinect and the PS3's wand.

Sony is waiting to see what hardware the Xbox 720 brings out so it can try to outdo it.

http://www.tomshardware.c...h-Release-Date,20599.html

Maybe we will see a Richland APU in the PS4, as Sony is reported to be going with a Fusion-based APU, but as of yet no one knows anything further than that. If the PS4 goes with Richland, that would definitely make it faster than an Xbox 720 Jaguar APU.
0 0 [Posted by: SteelCity1981  | Date: 01/23/13 08:03:01 PM]

4. 
Since they are spending extra money on 8 cores, and each of those cores is not high-performance, MS and Sony expect game developers to code smartly in order to utilize all the available cores efficiently and get the highest performance. Plus they will have to design game engines to use GPU computation whenever possible.

Soon, normal laptops with a decent APU will be able to run PC ports of the latest Xbox Next/PS4 games.
1 0 [Posted by: gjcjan  | Date: 01/22/13 11:10:57 AM]

 
While they are not high-performing cores like desktop Bulldozer/Vishera/SB/IVB/Haswell CPUs, compared to the Cell and the Xbox 360's Xenon it'll still be a huge improvement, as those processors had lower IPC than a Pentium 4. The Cell was actually a one-core CPU with 6 supporting SPE engines. Without optimizations, that CPU was a complete dog.

Looks like the next consoles from Sony and Microsoft will supposedly share similar CPUs/architecture that will make cross-platform development much easier. The 720 is essentially a Windows PC, and the PS4 is basically a Linux PC. This way, developers could port things easily from the 720, PC, PS4 and SteamBox. I am looking forward to decently ported console games on the PC now that Xbox Next / PS4 will have x86 CPUs and hardware parts that are very similar to PC components.
3 2 [Posted by: BestJinjo  | Date: 01/22/13 04:28:56 PM]
 
This is good news for PC gaming: ports made easy.
3 0 [Posted by: vid_ghost  | Date: 01/22/13 07:10:25 PM]
 
[post hidden]
1 5 [Posted by: TA152H  | Date: 01/22/13 09:17:04 PM]
 
Of course IPC is very important; haven't you learned this from the Core 2 Duo vs. Pentium 4/D days? Or are you still stuck comparing GHz?

A Xenon core at 3.2GHz is slower than one Pentium 4 core at 3.2GHz. In the context of modern CPUs, developers have estimated the IPC of the Xenon at just 1/4 of a Core i7's. A Jaguar core at 1.6GHz is going to be faster than a Xenon at 3.2GHz, not slower as you are suggesting.

You are comparing an in-order Xenon to out of order Jaguar with 128-bit floating point and 256-bit AVX instructions. If they code to take advantage of AVX, the CPU will cream the Xenon. Even without it, it will trounce it.

The entire Xenon CPU in the 360 is only 70-85% as fast as just one core of a 1st-generation i7:

http://www.eurogamer.net/...terview-metro-2033?page=4

You are giving Xenon way too much credit. A 2.13GHz Core 2 Duo was 50-70% faster in games than the G5-based IBM PowerPC CPUs Apple was replacing at the time, and Xenon is very similar to the G5 PowerPC architecture. Jaguar is a lot closer in IPC to a Core 2 Duo, and you are getting 8 of those cores with updated modern instructions that significantly increase throughput, plus an out-of-order architecture. It will mop the floor with the tri-core Xenon.

Did you ever play Dark Souls on Xbox 360/PS3? It's a 15 fps chugfest in Blighttown (a heavily CPU-limited area), despite the game rendering at just 1024x720.

An Athlon X4 620 at 2.6GHz can maintain a 30 fps minimum at 2560x1600:
http://gamegpu.ru/images/...e%20Edition/ds%20proz.png

Sorry, but the CPUs in the PS3/360 are slow and always have been. The key reasons games run well on those consoles are graphical shortcuts (rendering at 20-30 fps, no DX11 effects, low-resolution textures, texture pop-in, rendering games at low resolution like Black Ops 2 at 880x720 or Uncharted 3 at 896x504) and the fact that developers spend years coding directly to the metal (no API/OS overhead) and optimizing for that outdated hardware.
4 1 [Posted by: BestJinjo  | Date: 01/23/13 09:55:39 AM]
 
[post hidden]
1 4 [Posted by: TA152H  | Date: 01/23/13 11:16:03 AM]
 
E-450 @ 1600MHz = 1030 in 3DMark06
Intel Core 2 Duo T5470 @ 1600MHz = 1365 in 3DMark06
http://www.notebookcheck....Benchmarklist.2436.0.html

Jaguar has a >15% increase in IPC over Bobcat. (0.95 --> 1.10 or 16% increase in IPC)
http://semiaccurate.com/2...jaguar-core/#.UQBcYidZUeo

1030 * 1.16 IPC increase => 1195.

That's 88% of the IPC of the Core 2 Duo, or 1.88x the IPC of the Pentium 4, and Pentium 4 IPC > Xenon! Your 3.2GHz Xenon core would not even be as fast as a single 1.6GHz Jaguar core, and the Xenon has just 3 of those cores vs. 8 Jaguar cores. Your math doesn't add up.

You are saying that Jaguar's IPC is "not even close" to C2D? 88% looks close enough. You didn't even take into account the new instructions it's getting, like 256-bit AVX, etc. Also, that's just a 15% IPC increase in general apps, not games. Jaguar doubles the FP pipe from 64-bit to 128-bit for single-pass SSE execution. Gaming performance will improve.

Also, these are just rumored specs, not confirmed. Either way, what you are saying makes little sense: Sony and MS would not replace the CPUs in the Xbox 360/PS3 with slower CPUs. That would make it impossible to make next-generation games on their consoles and would defeat the purpose of adding next-generation GPUs, because the consoles would be completely CPU-limited. So even if we ignore all the specs for a second, what you are saying is not even logical. With the PS4 rumored to feature a GCN GPU with 18 Compute Units (1152 stream processors) clocked at 800MHz, its 1.84 Tflops of performance puts it close to an HD7850 2GB. The engineers at Sony would not be so stupid as to pair an HD7850-class GPU with a CPU that's barely as fast as (or, as you are implying, slower than) the Xbox 360's Xenon/Cell. Sorry TA152H, your logic doesn't add up.

Like I said, the CPU in the Xbox 360/PS3 can't even hold 30 fps in Blighttown at 1024x720, while a lowly Athlon X4 620 at 2.6GHz manages 30 fps at 2560x1600. That tells you right there how weak the CPUs in the current consoles are. Obviously no one is expecting a Core i7-4770K in a PS4, but whatever CPU they go with, it will mop the floor with the Cell on all real-world metrics that matter for games.
4 1 [Posted by: BestJinjo  | Date: 01/23/13 02:02:04 PM]
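
For reference, the 1.84 Tflops figure quoted above follows directly from the rumoured GPU numbers, assuming the usual 2 FLOPs per stream processor per clock for a fused multiply-add; a quick check:

    /* Quick check of the rumoured PS4 GPU throughput figure:
     * stream processors * 2 FLOPs/clock (FMA) * clock speed. */
    #include <stdio.h>

    int main(void)
    {
        int    stream_processors = 1152;   /* 18 CUs x 64 SPs, as rumoured */
        double clock_ghz         = 0.8;    /* 800MHz, as rumoured */
        double tflops = stream_processors * 2 * clock_ghz / 1000.0;
        printf("%.2f TFLOPS\n", tflops);   /* prints 1.84 */
        return 0;
    }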
 
[post hidden]
0 4 [Posted by: TA152H  | Date: 01/23/13 05:41:29 PM]
 
You forgot that the E-450 Bobcat part I put up is a dual-core CPU. An 8-core Jaguar has 4x the cores of that part, plus a 15% increase in IPC.

So we'd have an 8-core Jaguar at 1.6ghz with all cores utilized being:

E-450 1030 3DMark06 * 1.15 * 4 times as many cores = 4738.

Obviously there is no perfect core scaling, but for reference a Core i5 2500K scores 5853. I am telling you, an 8-core Jaguar would destroy a 3-core Xenon.
3 0 [Posted by: BestJinjo  | Date: 01/24/13 09:26:55 AM]
 
Yeah, as I said, let's take these rumors with a grain of salt. Here is a different rumor stating the PS4 might have a Bulldozer CPU:

http://techreport.com/new...-bulldozer-cpu-radeon-gpu
3 0 [Posted by: BestJinjo  | Date: 01/23/13 03:23:59 PM]
 
[post hidden]
0 3 [Posted by: TA152H  | Date: 01/24/13 01:02:20 AM]
 
It could be a Piledriver FX-8300 series part. If the consoles are aiming at 1920x1080 with AA, then unless their GPU is at least a GTX 670, the console will be 90% GPU-limited with an FX-8300 series CPU, because CPU-limited games like MMOs and RTS games aren't on consoles, while GPU-limited games like BF3, Sleeping Dogs, Max Payne 3 and Hitman: Absolution are:

http://pctuning.tyden.cz/...enomu-po-core-i7?start=16

You've got to consider that once you are talking about next-generation DX11 games, raise the resolution from the 880x720-1280x720 of most PS3/360 games and add anti-aliasing, the workload will shift almost entirely to the GPU. They could use a slightly downclocked Bulldozer or Vishera to keep power consumption in check.

Given that any Intel i5/i7 is out of the question due to Intel's costs/margins, Bulldozer/Vishera would actually be the second-best possible processor to use in a console long-term. They could play around with Turbo (say, when 2 of 4 modules are active it could Turbo to 4.0GHz, and when 4 modules are used clocks are limited to 2.8-3.0GHz or something). Even a 2.8-3.0GHz Bulldozer/Vishera would be a lot better than any IBM CPU or an 8-core Jaguar.
3 0 [Posted by: BestJinjo  | Date: 01/24/13 09:31:06 AM]

5. 
Somehow the high volume of relatively slow memory + mystery hardware accelerators reminds me of the Caustic ray-tracer hardware shown at CES ... but probably no connection. What other heterogeneous architecture accelerators could be useful alongside the usual GPU, video accelerator, and cryptography accelerator? I guess vision accelerators could have special hardware for edge and motion detection.
0 0 [Posted by: gc  | Date: 01/22/13 10:19:22 PM]

6. 
Seems legit. They will want to keep power draw down to keep away the Red Ring of Death and Yellow Light of Death, and AMD's Jaguar will do the job just right: all the performance they need from a customized architecture, while drawing far less power.
2 0 [Posted by: K1107  | Date: 01/23/13 02:16:19 AM]

 
RROD was simply a result of cheap thermal paste/glue in the cooling system. Over time the heatsink would separate from the chip and the chip would naturally overheat and die. If MS had spent $2 on high-quality thermal paste/glue, it wouldn't have been a problem even with 250W of system power consumption.

YLOD was related to Nvidia's infamous bump-gate scandal, when they decided to cut costs and use lead-free solder. As the GPU heated and cooled over time, cracks/fissures would appear in the solder and the GPU would lose contact with the motherboard at the solder-ball contact points. This is why you can "revive" a PS3 temporarily with a heat gun applied across the GPU/motherboard. Hence, to fix the PS3's YLOD you have to 'reball' the RSX GPU with new solder.

http://www.youtube.com/watch?v=Oo1I2Bu2IU8

Again, neither of these issues was caused by high power consumption; they were due to cost cutting and the use of low-quality components. This is no different from GeForce 8 cards eventually dying due to bump-gate solder fissures (YLOD), or mounting a cooler incorrectly on a 50W GPU so it fries when you power the system on (RROD).

When you can cool two GTX 680s in a GTX 690, with its near-300W power usage, in such a small space, you know it's doable if you are willing to pay for high-end cooling parts. A 95W CPU/APU and a 100W HD 7970M would have been perfectly adequate to cool in a console. You can even fit a GTX 660 Ti/670 with a Core i7-2600K inside a chassis the size of the original PS3: the Alienware X51.
http://www.theverge.com/2...8359/alienware-x51-review

Seems like the primary reason they are going with a low-power 8-core Jaguar CPU instead of a Trinity/Richland APU is cost cutting.
2 0 [Posted by: BestJinjo  | Date: 01/23/13 10:52:13 AM]

7. 
No SSDs, long load times, no thank you.

At least the PC-ported games won't suffer as much from dumbed-down AI (8GB vs. 512MB). The x86 processor helps in this respect too. Even with an SSD, consoles will never support mods, and you'll have to spend a fortune on games.
0 0 [Posted by: ericore  | Date: 02/08/13 02:13:47 PM]
