

An Asian web-site has published a picture of a graphics card claimed to be based on the code-named R600 graphics processor from ATI, the graphics product group of Advanced Micro Devices. The board appears to be huge and is said to have extremely high power requirements.

The leaked picture shows a special “long” version of ATI’s code-named R600-based graphics card aimed at original equipment manufacturers and system integrators. The version designed for the DIY market will be shorter, according to the report.

VR-Zone web-site claims that there will be two versions of the ATI R600 XTX: one for OEMs/SIs and the other for retail. Both feature 1GB of GDDR4 memory on board, but the OEM version is 12.4” long, whereas the retail version is 9.5” long. The power consumption of the AMD R600 graphics card is reportedly 270W for the 12.4” version and 240W for the 9.5” version. The reason for the difference between the boards is unclear.

Specifications of ATI R600 published by a web-site earlier resemble the specs revealed by other sources back in mid-2006, but do not match them fully:

  • 64 4-way SIMD unified shaders, 128 shader operations per cycle;
  • 32 texture mapping units, 16 raster operation units;
  • 512-bit memory interface with a full 32-bit connection per chip;
  • 230W thermal power envelope.
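A quick back-of-the-envelope sketch of what those memory-interface figures imply. The bus width and per-chip connection come from the list above; the GDDR4 effective data rate is a hypothetical figure assumed for illustration, not something the report states:

```python
# Rumored R600 memory subsystem, back-of-the-envelope.
BUS_WIDTH_BITS = 512          # total memory interface (from the report)
BITS_PER_CHIP = 32            # per-chip connection (from the report)
ASSUMED_DATA_RATE_GTPS = 2.0  # hypothetical GDDR4 effective data rate, GT/s

# A full 32-bit connection per chip on a 512-bit bus implies this many chips:
chips = BUS_WIDTH_BITS // BITS_PER_CHIP

# Theoretical peak bandwidth: bus width in bytes times effective data rate.
bandwidth_gb_s = (BUS_WIDTH_BITS / 8) * ASSUMED_DATA_RATE_GTPS

print(chips)           # 16 memory chips
print(bandwidth_gb_s)  # 128.0 GB/s at the assumed data rate
```

Sixteen chips would also be consistent with the rumored 1GB of GDDR4 (16 chips of 512Mb each), though the report itself does not break the capacity down per chip.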

The web-site also claimed that the next-generation graphics chip from AMD’s graphics division, formerly known as ATI Technologies, will support so-called GPU clustering, which allows 2ⁿ GPUs to be installed (4, 8, 16, 32, etc.). It is unclear whether this is anything new, as ATI’s graphics chips have supported multi-GPU operation in professional solutions from companies like Evans & Sutherland for many years. In addition, Level505 reports that the R600 chip is compatible with a “draft DX10.1 vendor-specific cap removal” application programming interface, something that is unlikely to be utilized for a substantial amount of time.

Officials for Advanced Micro Devices did not comment on the news story.


Comments currently: 29
Discussion started: 02/09/07 07:35:52 AM
Latest comment: 02/11/07 09:59:12 AM


That is fugly.
0 0 [Posted by:  | Date: 02/09/07 07:35:52 AM]

OEM version is 12.4" long ( 270W )
retail is 9.5" long (240W)

So this is a picture of some early OEM prototype version NOT the retail one.
0 0 [Posted by:  | Date: 02/09/07 03:48:28 PM]

I'm getting tired of the continually increasing power consumption of graphics cards. For me, it's just a pain to try to figure out if I have to change power supplies for the millionth time, or if I should look into running a dedicated circuit to my wall jack. It's getting ridiculous. I can't imagine how "green" hardcore gamers feel.

CPUs somehow reversed the increasing power consumption trend; GPU companies should really look into it. Are we going to need a cryogenic freezer one day to use a card that can run Unreal 10?
0 0 [Posted by:  | Date: 02/09/07 08:35:32 AM]

That's because major CPU architectures appear approximately every 5 years, while GPUs have a 1-year architectural lifecycle. They don't have the luxury of a handful of architectural optimizations plus one or two fabrication die shrinks like the CPUs do.

I do hear you though and don't disagree but it's the pace of gaming which is its own bane.
0 0 [Posted by:  | Date: 02/09/07 09:10:50 AM]
I have to disagree with you, because G80 took 4 years to go from a concept/idea to working hardware.
Let's face it, graphics card makers are far from shifting their focus to a more power-economical product rather than a faster product - no matter what being faster takes.
0 0 [Posted by:  | Date: 02/09/07 02:02:14 PM]

That thing had better have some damn good performance, or it is going to FLOP.

The only reason *I* can imagine for the longer OEM card is that the retail one has to fit into existing cases, which the OEM card won't in many modern systems. Willing to bet that the retail cooling solution is going to be inferior and LOUD. Or maybe I'm just still bitter about my jet-engine-loud x850xt cooler that still overheated before I switched to a Zalman.

Seriously though, I agree with the sentiments above. GPU companies are killing the gaming industry. How? They are pandering to a few hardcore gamers who will spend ANYTHING for the top performance. Then the developers play right into it, releasing games that REQUIRE such serious hardware - since the hardcore will slam any game that doesn't with bad reviews. Then, 90% of those who would be gaming on the PC end up giving up in frustration since their brand new $200 video card "isn't good enough for medium quality".

I wish that the nVidia/ATI war would go the way of the AMD/Intel war and focus on beating each other on price as much as performance. The AMD/Intel war brought high performance CPUs down to a price the average user could get more power than he ever needed. nVidia/ATI seem to be interested in making sure nobody will ever be able to survive by buying anything under $200 by promoting games (through developer agreements) that will not even run on their low-end hardware.
0 0 [Posted by:  | Date: 02/09/07 10:52:49 AM]

Damn that is awesome!!

High-end graphics cards are all about excess: being big, powerful and having high performance. It's never been about being efficient or cost-effective.

0 0 [Posted by:  | Date: 02/09/07 11:25:14 AM]

MMM Excuse me, Intel Conroe (Core 2) is big, powerful and has high performance, yet is more efficient than the NetBurst (Pentium 4) disaster in almost all aspects !!
0 0 [Posted by:  | Date: 02/09/07 01:57:13 PM]

I don't know why everyone is always so angry about how big or how expensive or how powerful the next generation of video cards is. Both ATI and Nvidia have previously stated that the next gen of cards would push the power envelope further, while saying that future planned architectures will be smaller and more power-efficient.

Compared to CPUs, video cards are advancing at a much faster rate, with new architectures annually and consequently near-double speed increases every year. CPUs have finally begun to get faster again with C2D, yet it still takes 2-3 years to see a doubling in performance.
0 0 [Posted by:  | Date: 02/09/07 02:34:55 PM]

Yeah, this will suit my next purchase for my rig. Damn happy landing time. This monster will surely beat G80, and hopefully will beat G81. 1GB GDDR4 seems delicious : P
0 0 [Posted by:  | Date: 02/10/07 02:25:41 AM]

This card will make computer hardware come alive again after looking dull in early 2007
0 0 [Posted by:  | Date: 02/10/07 05:03:32 AM]

With a power consumption this high, yes, it can only be AMD
0 0 [Posted by:  | Date: 02/10/07 04:12:06 AM]
