It is well known that Sony Corp. is developing the next iteration of its popular PlayStation console, the PlayStation 4. What remains unknown is the hardware that will power it. According to a media report, the PlayStation 4 will use graphics processing technology designed by Advanced Micro Devices.

Former AMD employees told the Forbes web-site that the company is working on graphics processing technology for the next-generation PlayStation 4 video game console. The ex-employees, who naturally remained anonymous, did not provide any details or evidence of the actual proceedings.

At present the information should be considered a rumour. A custom AMD Radeon graphics chip inside the PS4 would mean that Sony either has to drop compatibility with PS3 titles on its new console or pay additional royalties to Nvidia Corp., whose chip powers the current PlayStation 3 system.

If the rumours about AMD's custom Radeon graphics processors inside both the Xbox Next (Loop, Durango) and the PlayStation 4 are correct, the company has a reason to celebrate: powering the next-generation consoles of all three major platform holders (Microsoft, Nintendo and Sony) would be a massive success. Such a market position could be very favourable for AMD, as it would allow the company to scale its graphics processing architecture beyond consoles, the primary game platforms of today, to the new types of hardware that will be the gaming platforms of tomorrow.

Among other advantages, having Radeon HD architecture inside every next-generation console would give AMD a major edge in the personal computer market: all major game designers would have to optimize their titles for AMD's architecture, and Radeon graphics chips for PCs would therefore have an advantage over competing solutions.

However, developing three separate graphics cores for the next-generation video game consoles means that AMD will have to pull resources from other projects, such as next-generation GPUs for PCs or ultra-portables, and dedicate them to solutions for the console platform holders.

AMD and Sony did not comment on the news story.

Tags: Sony, PlayStation, AMD, Radeon, Nvidia

Discussion

Comments currently: 43
Discussion started: 02/23/12 04:54:34 PM
Latest comment: 04/06/12 11:28:47 PM

1. 
Gee, great, so the PS4 will use an HD 6670 graphics card to go along with the next Xbox and the HD 4850 found in the soon-to-be-released Wii U.

So at the end of the day, all 3 consoles are "RUMORED" to have 5-year-old 40nm technology by the time they are released!

Nothing but complete disappointment will come from this, and PC gaming will again be stuck in the stone age of graphical improvement due to low-specced ports from ancient console hardware.

I just wish they at least started with a 28nm-based GPU like the HD 7750 (the price is about 10% more), and not a 40nm one that they will shrink to 28, 20 and 14nm in years to come, as they did for the PS3 and Xbox 360.
5 5 [Posted by: vid_ghost  | Date: 02/23/12 04:54:34 PM]

Exactly. It's like if the Xbox 360 and PS3 had both had an R500 GPU or a 7950GT at the time: it wouldn't have mattered at all whether they used NV or AMD. The end result is we got 7 years of mostly crappy console ports. The few gems such as Crysis, BF3 and Metro 2033 are notable stand-outs. How fast the GPU is will dictate how good the games will be. They can put in GCN, but if it's some low-end model, it's a waste.
3 3 [Posted by: BestJinjo  | Date: 02/23/12 06:12:40 PM]
You're posting only garbage. Where did you read about HD6670 or HD4850?!? Provide a link!
3 3 [Posted by: TAViX  | Date: 02/24/12 03:53:54 AM]
He is posting garbage? Actually, he is right, based on current rumors.

The rumored GPU in Wii U is HD4850. "The last reported dev kit GPU for the Wii U had a Radeon HD 4850 powering it but needed to be underclocked due to overheating. The developers have said that the target specs were going to be the HD 4850 but dev kits were still being modified; which probably means size, RAM and clock speeds."
http://www.nintengen.com/...-gpu-vs-xbox-720-gpu.html

The rumored GPU in Xbox 720 is HD6670.
http://www.ign.com/articl...s-powerful-as-current-gen

Maybe you should do your research next time before you accuse other posters here of providing false information. We'd gladly read sources that refute this information. Until then, vid_ghost is posting currently available info.
3 2 [Posted by: BestJinjo  | Date: 02/24/12 01:42:59 PM]

2. 
Why do I have a feeling that Sony is going for an APU-style CPU/GPU setup to reduce cost and hit its sales targets? If they do a Trinity APU, maybe something along the lines of an A10-5800 quad-core CPU/Radeon 7660, that wouldn't be too bad. I mean, it's not like Microsoft or Nintendo are coming out with anything more powerful, as we have seen their next-gen specs already.
6 2 [Posted by: SteelCity1981  | Date: 02/23/12 05:04:03 PM]

They want to keep the Cell going so as to avoid paying for new tech and recoding the Sony PS3 OS software to work with x86 CPUs.

Also, the lag from the CPU having to share memory with the graphics would be much worse than with a 6670 or 4850 in the next Xbox and Wii U.

Overall I still hope that all this info on 2013-released consoles is incorrect and that they will wake up and put in a 28nm GPU; by that time the price would have come down and 28nm would have matured a lot, with better yields.

I think the real reason for the HD 6670 GPUs is that MS has chosen it as the GPU packaged with the SDK development kits so developers can start making games for the system... when they release the system with an HD 7750 28nm GPU, they will send out new development kits with final hardware specs so those games can be adjusted to work with the new hardware...

By the 2013 release they may even go up to an HD 7770 or HD 7850 custom-made GPU; if they do, it will not only decimate the Wii U but also put it back in line as the cheap system that can't get ports and sits in the corner collecting dust.
2 4 [Posted by: vid_ghost  | Date: 02/23/12 05:18:45 PM]

Doubtful. I don't expect MS to put a 7770 or 7850 in their next-gen console; they want to keep the production cost down and the sales price competitive. Plus, the hardware is already being sent out to game developers to test, so I doubt we will see such GPUs in the Xbox 720 or whatever they call it. Whatever we see at E3 this year will end up being what we see on store shelves when it's released.
3 1 [Posted by: SteelCity1981  | Date: 02/23/12 06:00:19 PM]
Even if they shove an HD7970 in there, us PC guys are already waiting until we can get a 30-inch 4K LCD screen. In 5 years, we might be gaming on 4K OLED monitors with graphics cards 5-6x more powerful than the HD7970. Basically, we all know that no matter what they put inside the consoles, it will be obsolete. The only question is how long it will take before the new consoles are obsolete.

The GPU in the PS3 was previous-generation high-end (7950GT). The GPU in the 360 was near high-end at the time (R500 on a unified architecture, roughly the performance of the R520 X1900XT). Look what happened to those: 7 years later, completely obsolete.

In this context, even if they put an HD6970 or HD7870 in there (the equivalent of the GPUs in the 360/PS3 at the time), in 5-7 years they will be holding us back anyway.

Consoles just need to go back to being updated every 4-5 years, not every 7-8 years as appears to be the case now. The much longer console life cycle is what's really driving this console-port trend.

Think about it: it's like buying an HD7970 for $600 and holding it for 6 years vs. buying a new $200 GPU every 2 years for the next 6 years. That's why, no matter what GPU they put in, it will eventually be a disappointment, since you can't upgrade it!
3 4 [Posted by: BestJinjo  | Date: 02/23/12 06:21:12 PM]
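That upgrade-cadence comparison is easy to put rough numbers on. Below is a minimal Python sketch under loudly hypothetical assumptions, not figures from the article: performance at a fixed price tier grows a flat 40% per year (the rate debated further down this thread), and a $600 card starts out twice as fast as a $200 one.

    # Hypothetical model of the "$600 once" vs. "$200 every 2 years" plans.
    # Assumptions (illustrative only): a fixed price tier gains 40%/year,
    # and a $600 card starts 2x as fast as a $200 card.
    YEARLY_GAIN = 1.40

    def perf(tier_multiplier, purchase_year):
        """Performance of a card bought in `purchase_year`, relative to a
        $200-tier card bought in year 0."""
        return tier_multiplier * YEARLY_GAIN ** purchase_year

    plan_a = perf(2.0, 0)  # one $600 card at year 0, kept for 6 years
    plan_b = perf(1.0, 4)  # $200 cards at years 0, 2, 4; the year-4 card
    print(round(plan_a, 2))  # 2.0
    print(round(plan_b, 2))  # 3.84

Under those assumptions both plans spend the same $600 in total, but the staggered buyer finishes the 6 years on a card roughly twice as fast, which is the commenter's point.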
OLED? Well, since the blue OLED burns out in 4 to 6 years at, give or take, 8 hours of use per day, you'll have to run out and buy another one. OLEDs are great, but they have a short lifespan.
1 1 [Posted by: SteelCity1981  | Date: 02/23/12 07:13:44 PM]
I am pretty sure the commercial OLED TVs will have an operating life cycle of 100,000 hours, similar to today's LEDs and plasmas. Also, I don't use my TV for 8 or even 6 hours. I can't imagine anyone who has that much time: 9-5 jobs (minimum) plus commute. 6 hours of TV usage per day means doing nothing else but coming home from work and playing games non-stop? That's not reality for most adults, I hope.

Either way, I am pretty sure commercial OLED panels will have sufficient operating life.

My main point is that we enthusiast PC gamers might get a console to augment our gaming experience for console exclusives (some of which are great games). However, the PC will still offer the superior gaming experience in terms of higher-resolution screens (4K is coming sooner or later), 3D gaming, keyboard-and-mouse controls, the mod community, as well as retaining the flexibility of dramatically increasing graphics performance through GPU upgrades.

I hope they bring out powerful consoles, of course, so developers will be more likely to make DX11 games from the ground up, with full-blown tessellation, smarter AI, and so on. But I am not expecting miracles. Also, I believe the rumors point to the next Xbox launching in the fall of 2013, and the PS4 later. By then, we'll probably have the HD8000 series and a Kepler refresh. Basically, no matter what the consoles have, the more important question is whether developers will finally start to make games that look much better and are more realistic/interesting to play.

Take away the great lighting model of BF3 and the EA Sports character animations, and the game is still barely better than Crysis from 4 years ago (ugly textures, e.g. sandbags...). PC graphics have stagnated severely. Sure, you can throw massive amounts of AA, super-sampling, transparency AA, etc. at it, but that's like putting lipstick on a pig.

Syndicate just came out - yet another console port with OK graphics.
5 2 [Posted by: BestJinjo  | Date: 02/23/12 09:05:19 PM]
Lots of people: kids come home after school and play games on it for 3 hours or whatever, then parents take over and watch another 3 or 4 hours of TV. Then you've got retirees who watch a lot of TV. My grandfather and the Weather Channel come to mind. lol

They don't check out the reason for it:

"Longevity

OLED TVs currently have one major flaw, and this is it! Currently the Blue LED only has around a 10,000-14,000 hour lifespan, although recent developments have gone some way into correcting this. However the 100,000 that standard LED TV manufacturers are quoting are far superior to OLED for the time being."

http://www.tvoled.co.uk/oledvsled.html

0 0 [Posted by: SteelCity1981  | Date: 02/23/12 10:50:04 PM]
Blue OLEDs historically have had a lifetime of around 14,000 hours to half of original brightness (five years at 8 hours a day) when used for flat-panel displays. However, you are not accounting for any improvements in OLED technology over the next 5 years. Trust me, as OLED takes over from LED, it's only a matter of time before mass manufacturing of OLED displays and subsequent revisions increase their lifespan, perhaps even pushing their expected life past that of LCD displays. Already in 2007, experimental OLEDs were created which can sustain 400 cd/m2 of luminance for over 198,000 hours for green OLEDs and 62,000 hours for blue OLEDs.

The screen in the PS Vita is OLED. I doubt Sony would release the Vita with an OLED screen if it thought the blue OLEDs in it would only last 5 years. They'd have lawsuits on their hands.
0 2 [Posted by: BestJinjo  | Date: 02/24/12 02:38:27 PM]
There is a big difference between a small 7-inch screen and an OLED TV: the bigger the screen size, the more contrast and light emission it puts out, which puts a lot more wear on the LEDs themselves. I mean, you can't compare a Vita screen to a 55-inch OLED screen, because the 55-inch OLED screen is obviously going to have to put out a lot more contrast and light emission than some 5-inch display. So those 198,000 hours you speak of basically apply to small pocket-size screens, not big TV screens.
0 0 [Posted by: SteelCity1981  | Date: 02/25/12 12:48:33 AM]
I used to worry about that, then I realised: hang on a sec, I am constantly updating my PC every year anyway. If my screen lasts me 3-5 years, then that is good enough; new screen technology will be out by then, so I would want to replace my screen anyway. I have a perfectly good first-gen 24-inch LCD in the cupboard now, as I have updated to a 27-inch 120Hz screen. My point being, I am sure I could have got another few years out of that 24-incher, but it was time for an upgrade.
1 0 [Posted by: ozegamer  | Date: 02/24/12 05:55:09 PM]
I update my PC every 5 years or so, but I bought a nice 19-inch SPVA-panel LCD monitor in 2003 and I still use it. I plan to keep it around even if I get a bigger monitor.
0 0 [Posted by: mikato  | Date: 02/27/12 11:31:43 AM]
In 5-6 years you WON'T have cards 5-6 times faster than today, simply because each new year the performance improvement of the top-gen card is at most 40-50%. You do the math.
0 1 [Posted by: TAViX  | Date: 02/24/12 03:58:02 AM]
I did the math. At a 40% increase each year over the previous year, in 5 years what is available would be 5.38 times faster than the original. After 6 years it would be 7.53 times faster than the original. Maybe you should do the math before basing your opinions on made-up results.
3 0 [Posted by: PhreighnQ  | Date: 02/24/12 10:11:20 AM]
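For anyone who wants to check that compounding claim, it is a two-liner in Python (assuming, as the thread does, a flat 40% year-over-year gain):

    # Compound a flat 40% annual performance gain over n years.
    for years in (5, 6):
        print(years, round(1.40 ** years, 2))  # 5 -> 5.38, 6 -> 7.53

The printed 5.38x and 7.53x match the figures quoted above; the disagreement in the thread is over whether 40% per year is a realistic rate, not over the arithmetic.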
We will for sure. Gains are far larger than 40-50% between generations, because you aren't counting refreshes, or the fact that more advanced graphical features produce an exponential performance drop on older cards. For example, the GTX480 is ~51% faster than the GTX280, but the GTX580 is 73% faster than the GTX280, and the GTX480 and GTX580 are the same generation.
http://www.computerbase.d...schnitt_leistung_mit_aaaf

Most reviews test brand-new cards on the games of the time and never revisit them. Imagine if you had tested the HD5870 in September of 2009, when almost no demanding games other than Crysis were around. Now the HD5870 is 75-100% faster than the HD4870. In most of today's games, the HD5870 is about 50% slower than the HD7970, but in DX11 games that gap grows to 100%.

To drive my point home, take a look at this:
http://techreport.com/articles.x/18682/6

7900GTX = March 9, 2006
GTX480 = March 26, 2010 (4 years)

The GTX480 is easily 5x faster than the 7900GTX, and the 7900GTX is faster than the GPU in the PS3. So it's not inconceivable that even if the engineers somehow manage to put an HD7970 in the next consoles, in 5 years the GPUs in our PCs will be 5x faster. Plus, we will have moved on to DX13 and whatever other goodness the next generations bring.

Even now, with some overclocking, the HD7970 is ~2x faster than the HD6970. The pace of GPU increases hasn't slowed down so much as the games are not demanding enough to expose the benefits of new hardware, because they are still light on tessellation and compute.

Let's revisit this in 2 years: come spring 2014, you'll see we'll have a GPU that's already 2x faster than the HD7970 in the modern games of that time.

Of course we don't even have to discuss the possibility of an HD7970 in a console since the GPU itself consumes almost 200W.
0 2 [Posted by: BestJinjo  | Date: 02/24/12 01:58:47 PM]
DX13, o.O

Now if only it were real-time ray tracing by then. I can dream, can't I?
1 0 [Posted by: ozegamer  | Date: 02/24/12 05:59:19 PM]
Actually, the GPU in the PS3 (RSX) was smack in between a 7800GTX and a 7900GT (closer to a 7900GT, though). The GPU in the 360 (Xenos) was phenomenal in that it was faster than the X1800XT that finally became available that November. Xenos actually had a more advanced architecture than the X1900XTX that came out a couple of months later (unified shaders a la R600, rather than just expanded shaders a la R580). Imagine: the X1800XT, which was delayed past the 90nm Xenos, actually launched with an MSRP of over $500.

Can you imagine a console launching with graphics more powerful than a $500 video card of today (i.e., the HD 7970)? That's why the Xbox 360 was absolutely phenomenal for its time circa 2005. Not even the RSX graphics in the PS3, released a full year later, could beat Xenos. It was quite a revolution, and the reason why we continue to see impressive games on these platforms today.

However, the shift will be directed towards 3DTVs. Stereo 3D technology needs leaps-and-bounds greater graphics processing power, along with true 1080p rendering with full antialiasing. This, combined with far greater tessellation (polygon) and texture/pixel fillrates, requires several orders of magnitude more processing power.

The Sega Dreamcast was able to make games look nearly as good as the PS2, or even the Wii, although the PS2 was something like 10 times as powerful. It was the revolutionary leap of unified shader architecture (DX9.0 "d") that marked such a revolution. However, I have the feeling that the upcoming consoles will just be more of an evolution rather than a revolution (like the PS2/Xbox/Gamecube just being an evolution over the Dreamcast).

The Wii U definitely needs 28nm-fabbed HD 7750 graphics (based on the super-efficient GCN architecture), even if slightly underclocked. The Xbox 720 definitely needs 28nm-fabbed HD 7850 graphics (roughly equivalent to an HD 6950), and much better if Microsoft truly hopes for it to have a chance against Sony's PS4, which will most likely come out later on. MS probably knows that Sony will be more aggressive with the graphics, especially since MS won't be implementing such cutting-edge graphics as they did with the Xbox 360's Xenos. Sony is definitely going to want to make sure that the PS4 has a clear advantage over the Xbox 720 regarding the gfx. I wouldn't be surprised if Sony indeed jumps ahead to 20nm graphics even more powerful than today's HD 7970, just to establish a clear sense of dominance like they always wanted (for the PS2 it was the overall package, especially with a head start before the Xbox and Gamecube; for the PS3, which came out a year behind the competition, it was Blu-ray; and for the PS4, which would also be late, in 2014, it would have to be the graphics).

BTW, Microsoft already has the Kinect (and would likely be integrating Kinect 2.0 with the Xbox 720 right from the start), so Sony would have to copy MS this time around, rather than copying Nintendo's Wiimote as they did last time, by designing the PS4 to have built-in Kinect-like capabilities.

2 0 [Posted by: Bo_Fox  | Date: 02/24/12 10:39:15 PM]
I agree with everything you said up to and including this point:

"However, I have the feeling that with the upcoming consoles, it will just be more of an evolution rather than a revolution"

But then in the next paragraph you describe the opposite: the possibility of Sony putting in a GPU at least as fast as an HD7970, and the Wii U having an HD7750 graphics card. The current rumors for both the Wii U and the Xbox, based on development kits, already point to far slower GPUs.

Also, Sony is bleeding cash due to losses in its TV business and many of its other businesses. It would be a miracle if the PS4 had some insane hardware. If anything, based on the poor sales of the Vita, Sony would be better off releasing a cheap $249-299 console and getting a head start by underpricing MS.

While the GPU in the 360 was better than the PS3's, in the end it didn't really give the 360 much superior graphics. The PS3 and 360 are more or less comparable in graphics, and one might even argue that PS3 exclusives such as Uncharted and God of War have better graphics.

If Sony spends a lot of money trying to ram a very powerful GPU inside the PS4, they'd need to add a more expensive PSU, more complex PCB/circuitry, more expensive cooling, etc.

Why go through all that hassle? Unless the GPU in the PS4 is 2-3x more powerful than in the Wii U/Xbox 720, it's doubtful that developers would devote extra time to the PS4, since that would require the game to be coded separately for it. That's not working out that well for the PC already, despite its 10-12x GPU horsepower advantage. If the PS4 is way more powerful, in the best-case scenario its games will run at higher frame rates with higher-resolution textures and farther draw distances, with less texture pop-in. That's about it.

If the PS4 does get a GPU around the level of an HD7970, it would only happen, imho, if they delay the launch to late 2014/2015.

Sony has openly stated that they will not design the PS4 like they did the PS3, where they lost $400+ on each console sold, since the PS3's bill of materials was initially over $900.

The HD7970 is a $550 GPU right now with ~200W power consumption on 28nm. There is no way that's going into a console unless they shrink it to 14nm or something.
0 2 [Posted by: BestJinjo  | Date: 02/25/12 08:25:50 AM]

3. 
"Among other advantages, Radeon HD architecture inside every next-generation consoles will provide AMD a major advantage on the market of personal computers as all major game designers will have to optimize their titles for AMD architecture and therefore the Radeon graphics chips for PCs will have an advantage over competing solutions."

Even if true, what is more important is not the architecture but how fast the GPU is. If they put in an HD7750-class GPU, for all we care it makes zero difference. What it would mean, rather, is 8 more years of crappy-looking console ports. Sure, they might run faster on AMD cards, but in the end a slow GPU will only hurt us PC gamers.

What matters more is not whether the GPU is made by AMD or NV, but how powerful it is. If they throw an HD7750 in there and optimize games for GCN, that's still not doing anyone any favors, since the HD7750 will quickly bottleneck graphics evolution on the PC. Besides, no one can predict what features next-generation PC graphics cards will have. If the consoles stick around for 5-10 years, GCN may itself be obsolete vs. whatever next-generation architectures NV or AMD introduce. And suddenly all those optimizations for what is now a "modern" GCN architecture will be thrown out the window if the new architecture is a complete redesign.

We also know that the Wii U won't use GCN, so already there will be at least 2 different AMD architectures between the Wii U and the Xbox 720.
1 3 [Posted by: BestJinjo  | Date: 02/23/12 06:08:26 PM]

Right, and NV could easily optimize their GPU design to run really fast in Radeon-optimized games. Ever heard of a GeForce card running ATI's own Ruby demo faster than any Radeon card could run it? Yep, it has happened many times before. Now the HD 7970 can finally run tessellation faster than the GTX 580 in most cases. It's like AMD CPUs being able to run Intel's SSE5 code just as efficiently.
1 0 [Posted by: Bo_Fox  | Date: 02/24/12 10:54:09 PM]

4. 
I think MS is working on 2 projects. The HD6670-specced chip may be for something portable, not for the console.
1 2 [Posted by: csimmo  | Date: 02/23/12 09:05:23 PM]

5. 
"However, developing three separate graphics cores for the next-generation video game consoles means that AMD will have to pull resources from other projects, such as next-generation GPUs for PCs or ultra-portables, and dedicate them to solutions for the console platform holders."

It doesn't mean that at all! It means that they might offer the same solution to everyone :D And they had better offer a next generation of the GCN architecture too, one that doesn't suck so much, because NVIDIA is about to eat them!
0 1 [Posted by: Zingam  | Date: 02/24/12 12:17:51 AM]

6. 
I just want to say that Sony has everything, and I mean every breed of rabbit in their hat, to finally build the most amazing PC. No! Personal Entertainment Computer (PEC, maybe) ever! The PS3 is almost everything already... they just have to do like Apple does and they would win both the PC gamer and the console gamer forever! All they have to do is make the PS4 a ridiculously customized Android (it's a Linux OS anyway) optimized for smooth, solid PlayStation 4 gaming as the goal; we all know they already did this in the bloody phone market. The rest you can all imagine: finally FULL mouse/keyboard support, still Move-operated games, AND Android! It's like having a Linux computer that even your mom can operate, yet still a console in every aspect, since both hardware and OS would be Sony's and could be kept unchanged for years to come! That would be overkill! Even non-gamer PC and Mac people would want this, since it would be the first computer they could master. This would be heaven for developers like me too. Finally there would be a platform with touch/Move AND consistent specs to optimize for!!! So Sony could basically conquer the world! And tell Microsoft AND Apple goodbye like a boss! Name one person in the world who would not want a video game console that is also a freaking high-end PC with an easy-to-use Android OS that anyone can use, is somewhat portable and easy to set up, and has Blu-ray and full HTPC capabilities, and heck, why not telephony also, so you can call people like freaking Cpt. Picard for god's sake!!! This is so good and at the same time so possible... man...
3 0 [Posted by: fullgrip  | Date: 02/24/12 03:29:50 AM]

Dude, you should be Sony's CEO! The greatest and last one!
2 0 [Posted by: Zingam  | Date: 02/24/12 04:00:35 AM]

7. 
nice
1 0 [Posted by: TAViX  | Date: 02/24/12 03:47:53 AM]

8. 
0 3 [Posted by: Zingam  | Date: 02/24/12 04:03:48 AM]

There is a backup plan, Zingam: invest our monies in Apple stock, since the Apple sheep never disappoint. I'll ride the wave to $800 a share and use the profits for my OLED PC upgrade. Economy be damned, the Apple sheep prevail!! (At least so far they have.) Did you know you increase your chances of getting a job if you browse Monster on a MacBook Pro at Starbucks? That's been my strategy to subsidize my PC hobby.

0 2 [Posted by: BestJinjo  | Date: 02/24/12 02:10:17 PM]

9. 
The PC's greatest strength is that the platform is upgradeable.
On the other side, gaming consoles have better price value due to the 5-6 year life cycle of a console generation.
The PlayStation 3 cost a lot because of its 9-core Cell processor, which even now, 7 years after launch, represents real engineering value. I honestly believe the GPU architecture should be a next-gen mid-range chip from AMD's Radeon 8xxx line, probably an SoC/APU made on 32 or 28nm wafers.
AMD should really make an effort to do this well, because all major AAA titles would then be optimized for AMD's architecture for the first time in history. It's even possible that the reference design will be manufactured by Intel on 22nm wafers with tri-gate transistors.
Knowing Sony, they will probably try to manage it all.
1 0 [Posted by: Zola  | Date: 02/24/12 05:35:46 AM]

10. 
Sad for Nvidia, but it says something about Radeon quality.
0 0 [Posted by: Blackcode  | Date: 02/24/12 10:24:50 AM]

Radeon quality is awesome, but MS's penny-pinching ways on cooling and TIM had better not result in RROD #2. "Guys, it's on 28nm. We don't need an active heatsink."
1 2 [Posted by: BestJinjo  | Date: 02/24/12 05:53:58 PM]

11. 
Good. AMD can stick to the mediocre console game market, as it has never been interested in the bespoke quality of PC gaming.
1 2 [Posted by: TeemuMilto  | Date: 02/27/12 07:18:27 AM]

12. 
In the name of God.
Hello.
The Sony corporation will use the best GPU (7990 with 4GB XDR2) in the PS4.
If God wants. (100%)
0 0 [Posted by: ya_alimadad2006  | Date: 02/29/12 03:09:49 AM]

