News
 


A picture of a graphics card claimed to be based on the code-named R600 graphics processor from ATI, the graphics product group of Advanced Micro Devices, has been published by an Asian web-site. The board appears to be huge and is said to have exceptionally high power requirements.

The leaked picture shows a special “long” version of a graphics card based on ATI’s code-named R600 chip, aimed at original equipment manufacturers and system integrators. The version designed for the DIY market will be shorter, according to the report.

The VR-Zone web-site claims that there will be two versions of the ATI R600 XTX: one for OEM/SI and the other for retail. Both feature 1GB of GDDR4 memory on board, but the OEM version is 12.4” long, whereas the retail version is 9.5” long. The power consumption of the AMD R600 graphics card is 270W for the 12.4” version and 240W for the 9.5” version. The reason for the difference between the boards is unclear.

The specifications of the ATI R600 published by a web-site earlier resemble the specs revealed by other sources back in mid-2006, but do not match them completely:

  • 64 4-Way SIMD Unified Shaders, 128 Shader Operations/Cycle;
  • 32 texture mapping units, 16 raster operation units;
  • 512-bit memory interface, with a full 32-bit connection per memory chip (see the sketch after this list);
  • 230W thermal power envelope.
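
For context, these memory figures imply sixteen 32-bit memory chips on a 512-bit bus; peak bandwidth then follows from the effective GDDR4 data rate, which the report does not state. Below is a minimal Python sketch of that arithmetic, using a purely hypothetical data rate:

```python
# Arithmetic implied by the rumored memory configuration.
BUS_WIDTH_BITS = 512   # rumored memory interface width
BITS_PER_CHIP = 32     # "full 32-bit connection per memory chip"
DATA_RATE_GTPS = 2.0   # hypothetical GDDR4 effective data rate (assumption, not from the report)

chips = BUS_WIDTH_BITS // BITS_PER_CHIP                 # -> 16 memory chips
bandwidth_gb_s = (BUS_WIDTH_BITS / 8) * DATA_RATE_GTPS  # 64 bytes/transfer * 2.0 GT/s

print(f"Memory chips: {chips}")                         # Memory chips: 16
print(f"Peak bandwidth: {bandwidth_gb_s:.0f} GB/s")     # Peak bandwidth: 128 GB/s
```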

The web-site also claimed that the next-generation graphics chip from AMD’s graphics division, formerly known as ATI Technologies, will support so-called GPU clustering, which allows installing 2ⁿ GPUs (4, 8, 16, 32, etc.). It is unclear whether this is anything new, however, as ATI’s graphics chips have supported multi-GPU configurations in professional solutions from companies like Evans & Sutherland for many years. In addition, Level505 reports that the R600 chip is compatible with a “draft DX10.1 vendor-specific cap removal” application programming interface, something that is unlikely to be utilized for a substantial amount of time.
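
The “2ⁿ” constraint above simply means that the number of GPUs in such a cluster must be a power of two. A minimal Python sketch of that check (the function name is ours, purely for illustration):

```python
def is_power_of_two_cluster(gpu_count: int) -> bool:
    """Return True when gpu_count is a power of two, e.g. 4, 8, 16 or 32."""
    return gpu_count >= 2 and (gpu_count & (gpu_count - 1)) == 0

# The sizes named in the report all satisfy the constraint; 3 and 6 do not.
print([n for n in (3, 4, 6, 8, 16, 32) if is_power_of_two_cluster(n)])  # [4, 8, 16, 32]
```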

Officials from Advanced Micro Devices did not comment on the news story.

Discussion

Comments currently: 29
Discussion started: 02/09/07 07:35:52 AM
Latest comment: 02/11/07 09:59:12 AM


1. 
That is fugly.
[Posted by:  | Date: 02/09/07 07:35:52 AM]

 
OEM version is 12.4" long ( 270W )
retail is 9.5" long (240W)

So this is a picture of some early OEM prototype version NOT the retail one.
[Posted by:  | Date: 02/09/07 03:48:28 PM]

2. 
I'm getting tired of the continually increasing power consumption of graphics cards. For me, it's just a pain to try to figure out whether I have to change power supplies for the millionth time, or whether I should look into running a dedicated circuit to my wall jack. It's getting ridiculous. I can't imagine how "green" hardcore gamers feel.

CPUs somehow reversed the increasing power consumption trend; GPU companies should really look into it. Are we going to need a cryogenic freezer one day to use a card that can run Unreal 10?
[Posted by:  | Date: 02/09/07 08:35:32 AM]

 
That's because, on the CPU side, major architectures are revealed approximately every 5 years. GPUs have a 1-year architectural lifecycle. They don't have the luxury of a handful of architectural optimizations plus one or two fabrication die shrinks like CPUs do.

I do hear you, though, and I don't disagree, but it's the pace of gaming that is its own bane.
[Posted by:  | Date: 02/09/07 09:10:50 AM]
 
I have to disagree with you, because the G80 took 4 years to go from a concept/idea to working hardware.
Let's face it, graphics card makers are far from shifting their focus to a more power-economical product rather than a faster one, no matter what being faster takes.
[Posted by:  | Date: 02/09/07 02:02:14 PM]

3. 
That thing had better have some damn good performance, or it is going to FLOP.

The only reason *I* can imagine for the longer OEM card is that the retail one has to fit into existing cases, which the OEM card won't in many modern systems. Willing to bet that the retail cooling solution is going to be inferior and LOUD. Or maybe I'm just still bitter about my jet-engine-loud X850 XT cooler that still overheated before I switched to a Zalman.

Seriously though, I agree with the sentiments above. GPU companies are killing the gaming industry. How? They are pandering to a few hardcore gamers who will spend ANYTHING for the top performance. Then the developers play right into it, releasing games that REQUIRE such serious hardware - since the hardcore will slam any game that doesn't with bad reviews. Then, 90% of those who would be gaming on the PC end up giving up in frustration since their brand new $200 video card "isn't good enough for medium quality".

I wish that the nVidia/ATI war would go the way of the AMD/Intel war and focus on beating each other on price as much as on performance. The AMD/Intel war brought high-performance CPUs down to a price at which the average user could get more power than he ever needed. nVidia/ATI seem to be interested in making sure nobody will ever be able to get by buying anything under $200, by promoting games (through developer agreements) that will not even run on their low-end hardware.
[Posted by:  | Date: 02/09/07 10:52:49 AM]

4. 
Damn that is awesome!!

High-end graphics cards are all about excess: being big, powerful and high-performance. They have never been about being efficient or cost-effective.



[Posted by:  | Date: 02/09/07 11:25:14 AM]

 
Mmm, excuse me, Intel's Conroe (Core 2) is big, powerful and high-performance, yet more efficient than the disastrous NetBurst (Pentium 4) in almost all respects!!
[Posted by:  | Date: 02/09/07 01:57:13 PM]

5. 
I don't know why everyone is always so angry about how big or how expensive or how powerful the next generation of video cards is. Both ATI and Nvidia have previously stated that the next gen of cards would further push the power envelope, while saying that future planned architectures will be smaller and more power-efficient.

Compared to CPUs, video cards are advancing at a much faster rate, with new architectures annually and consequently near-double speed increases every year. CPUs have finally begun to get faster again recently with C2D, yet it still takes 2-3 years to see performance double.
[Posted by:  | Date: 02/09/07 02:34:55 PM]

6. 
Yeah, this will suit my next purchase for my rig. Damn, happy landing time. This monster will surely beat the G80, and hopefully will beat the G81 too. 1GB of GDDR4 seems delicious :P
[Posted by:  | Date: 02/10/07 02:25:41 AM]

 
This card will make computer hardware come alive after looking dull in early 2007.
[Posted by:  | Date: 02/10/07 05:03:32 AM]

7. 
With power consumption this high, yes, it can only be AMD.
[Posted by:  | Date: 02/10/07 04:12:06 AM]

