Dear forum members,
We are delighted to inform you that our forums are back online. Every single topic and post is back in its place, and everything works just as before, only better. Welcome back!


Discussion on Article:
Sony Sets Up Huge Conference in Late February, Could Introduce PlayStation 4 “Orbis”.

Started by: SteelCity1981 | Date 01/05/13 10:31:05 AM
Comments: 10 | Last Comment:  01/23/14 01:44:48 PM



An A10-5800K with a CrossFire config mated to a Radeon 6670 would be great. It would give the PS4 a decent amount of room to come out with some decent-looking games, and it would help keep the cost of the console down to boot.
1 2 [Posted by: SteelCity1981  | Date: 01/05/13 10:31:05 AM]

Crossfire on high-end cards is already not ideal, since it has a higher prevalence of micro-stutter than SLI due to the lack of dedicated hardware frame metering. Crossfire with low-end, slow GPUs is going to be a stutter-fest galore. They would be better off going with an A10 + a dedicated HD7000/8000 GPU. The GPU in the APU could be used to power certain indie titles/arcade games that aren't GPU-intensive, cutting down on power consumption in those titles. It could also accelerate 4K movie playback, etc., allowing the dedicated GPU to go into ZeroCore power-saving mode. The dedicated GPU would then be used for next-gen 3D games where more power is needed.

Also, if AMD is building them a custom A10 chip, there is no reason to assume an A10-5800K with its outdated VLIW Trinity GPU will be used. AMD has already launched 28nm GCN parts in the mobile space, and there shouldn't be any reason why Sony wouldn't want at least a GCN GPU inside their APU.

A 384SP GCN mobile part mops the floor with a 480SP VLIW part.

It would be short-sighted at this point to use an A10-5800K.

I think a mobile variant of the desktop HD7850-7870 2GB is the minimum that PS4 needs to drive next gen graphics (so roughly equivalent to an HD7950M for its dedicated GPU). The console has to last 7-8 years from launch, not just be good enough to play 2013 PC games.
1 1 [Posted by: BestJinjo  | Date: 01/07/13 09:47:52 AM]
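The micro-stutter point above is easy to illustrate with a toy calculation: alternate-frame rendering can deliver a perfectly healthy average FPS while the individual frame-to-frame gaps alternate between very short and very long. A minimal sketch, with made-up timestamps chosen to mimic unmetered AFR pacing (none of these numbers come from real hardware):

```python
# Illustrative only: micro-stutter shows up as variance in frame-to-frame
# delivery times, not in the average FPS. Timestamps below are invented to
# mimic AFR (alternate-frame rendering) pacing without frame metering.

def frame_times_ms(timestamps_ms):
    """Deltas between consecutive frame presentation timestamps."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def avg_fps(timestamps_ms):
    deltas = frame_times_ms(timestamps_ms)
    return 1000.0 * len(deltas) / sum(deltas)

def pacing_spread_ms(timestamps_ms):
    """Max minus min frame time: a crude micro-stutter indicator."""
    deltas = frame_times_ms(timestamps_ms)
    return max(deltas) - min(deltas)

# Single GPU: even 20 ms pacing (50 FPS, smooth).
single = [0, 20, 40, 60, 80, 100, 120]
# AFR pair: the SAME 50 FPS average, but frames arrive in 5 ms / 35 ms bursts.
afr = [0, 5, 40, 45, 80, 85, 120]

print(avg_fps(single), pacing_spread_ms(single))  # 50.0, 0 ms spread
print(avg_fps(afr), pacing_spread_ms(afr))        # 50.0, 30 ms spread
```

Both setups report 50 FPS, but the AFR pattern alternates 5 ms and 35 ms frames, which the eye reads as stutter; hardware frame metering exists precisely to even out that spread.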
Isn't micro-stutter a driver issue, i.e. down to AMD's drivers? I thought direct hardware access and better optimization would do away with issues like micro-stutter. Maybe I'm right, maybe I'm wrong, whatevs. I'm sure if they still have the awesome guys who designed the Vita, they'll impress me in the end. I don't think they could've done any better with the Vita's hardware design.
0 0 [Posted by: Facelord  | Date: 01/26/13 10:21:52 PM]

A system built today that seems like it would match its video quality - a 77W i5 CPU and a Radeon 8850 - would come in under 200 watts, unlike the first model of the PS3, which had a 380-watt PSU. Accounting for a more integrated and optimized design, it's likely to use around 200W: a 77W CPU and a 130W GPU, or however it's made. ZeroCore's standby mode is likely to keep idle power draw down.

The inclusion of Blu-ray in the first system was a major addition; it added something like $200 to the price of the system. There doesn't seem to be any indication that a release in two years or so will coincide with many more 4K television monitors being sought out by the average PS4 consumer, unless LG releases one for around $2,000, as they currently sell for $15,000 or so. Then again, I'm reminded that the PS3 was released in 2006, and 2011 was the year 1080p monitors/TVs actually became inexpensive (<$150). So it'll support 4K, but the displays won't be affordable until 2019. I'm quite happy with my ATSC broadcasts; I can't imagine more OTA bandwidth.
1 1 [Posted by: qubit  | Date: 01/05/13 02:07:23 PM]

I am pretty sure an i5 is out of the question for cost reasons. I think it's a 99% done deal that the PS4 will use some custom AMD APU. The other possibility I've read about is an IBM PowerPC CPU. I haven't read any credible source reporting an Intel CPU in the PS4. The key unknown is whether it will also have a separate dedicated GPU to go along with it.

I agree with your point about power consumption. The PS3 originally used between 195 and 240W of power.


I think the greatest obstacle to the PS4's performance will be cost. They could easily fit a 130-140W GPU with a 65W AMD APU (the APU could be rated at 100W, but its integrated GPU would be disabled in 3D games, where the primary dedicated GPU takes over, so the CPU portion would draw about 65W).
2 0 [Posted by: BestJinjo  | Date: 01/07/13 10:02:41 AM]
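The power-budget reasoning above is just addition, and can be sanity-checked against the PS3 figure quoted earlier. A back-of-envelope sketch; every wattage below is a commenter's estimate, not a confirmed PS4 spec:

```python
# Back-of-envelope power budget for the hypothetical console discussed
# above. All wattages are the commenters' estimates, not real PS4 specs.

budget = {
    "APU (CPU portion; iGPU disabled in 3D games)": 65,  # of a ~100W rated APU
    "dedicated GPU": 135,                                # midpoint of 130-140W
}

total = sum(budget.values())
print(f"Estimated draw under 3D load: {total} W")  # 200 W

# Compare against the measured original-PS3 range quoted above (195-240W):
ps3_low, ps3_high = 195, 240
print(ps3_low <= total <= ps3_high)  # True
```

Under those assumptions the proposed combination lands inside the launch PS3's measured envelope, which is the commenters' point: the limit is cost, not thermals.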

I'll be happy when both new systems come out, the next Xbox and PlayStation. The games ported (sigh...) to PC won't seem so shit then, at least until a dedicated PC game comes out and shows them once again what can be done. Then it will be another waiting game for the PS5 and Xbox 1440 (whatever they call it, lol).

Not so sure about 4K TVs. Going from large tube TVs to 1080p flat screens took time, as people didn't know what to expect. Now that most people have 1080p, they know what 4K will bring and are already waiting for the screens. Technology is not only getting cheaper but is being pushed out faster than ever before (think smartphones), so I think 4K TVs will drop in price much faster than 1080p TVs did. I'd say 2-3 years this time around, so 2015-16. LG has already started the momentum: a 55-inch 4K OLED 3D TV at $10,000, a third of what 4K TVs cost just months ago. One more price drop and it's in consumer territory. Hell, I paid $7,500 many, many years ago just for a crappy rear-projection TV with a crap picture, then went to a 1080p LCD for less than half that price.

The only alarming thing I'm finding is in the PC monitor industry: why are they so far behind, and so overpriced, on anything beyond 1080p screens? We should have high-PPI 24-30 inch screens with 4K resolution and 120Hz. Mobile phones have better displays than PC monitors now, and TVs have 4K at twice the screen size or more, so what's the hold-up with PC monitors? Give me a 27-inch 4K 120Hz monitor for under $1,200 and I'm sold tomorrow!
0 1 [Posted by: ozegamer  | Date: 01/05/13 07:15:43 PM]
Not 4K, but getting there in the horizontal direction...
2 0 [Posted by: qubit  | Date: 01/05/13 10:05:35 PM]
Not bad at all. Better (so far) than a 3-screen setup; I find that bezels ruin the whole immersion on multi-screen setups.
0 0 [Posted by: ozegamer  | Date: 01/07/13 06:12:05 PM]

