Discussion on Article:
Directly Unified: Nvidia GeForce 8800 Architecture Review
PS. A pair of Corsair Dominator memories too.
I'm looking forward to it!
I see a lot of big numbers, but I honestly have not seen many new innovations. I am hoping to see ATI make some innovative moves with their new chip. I'll be disappointed if ATI simply boosts transistor counts and spec numbers to be as fast as the 8800.
Unified architecture with streaming processors, DirectX 10, higher-quality AF, CSAA, GPGPU with a C compiler. I'd say the improvements are at least satisfactory.
On AnandTech it was said that this chip took 4 years to develop and cost $475 million; it usually takes less time.
I agree with James.
GPGPU support is the most impressive new feature in my eyes. I'd love to play with it or see some applications using it :)
What you call "high-quality AF" came with the GeForce3 chip, as it was the first GPU with 8-tap Euclidean anisotropic filtering. So did the GeForce4, which, despite its incredible DX8 performance, lost some tests to an ATI Radeon 8500 when mid-to-high resolutions and heavy anisotropic filtering were applied, due to the high demands of its aniso method.
And much later came the FX series.
Just a little clarification.
Anyway, check The Tech Report's review to see for yourself that the G80's AF quality is higher than that of ATI's R580+ in real games.
DirectX 10 - Nvidia is first. The Xbox 360 has some DirectX 10 features.
Higher-quality AF - available since ATI's X1800.
CSAA - ATI has had Temporal AA since the R300, which would give a 12x equivalent.
GPGPU with a C compiler - there is already software running on ATI GPUs, like Folding@Home; ATI opened their GPUs to application programmers a year ago.
Forget about temporal AA. You need 60+ fps @ 6xAA to be able to use it. The new 16x CSAA modes may look even better, perhaps run faster, and you can use them at 30 or 40 fps (min fps)...
"GPGPU- with a C compiler - There is already software running in Ati GPU like Folding Home, Ati already opened their GPU to applications programmers 1 year ago."
F@H uses standard DX9 SM3 code which could run on G7x too if there weren't some hardware limitations (they use _very_ long shaders, _much_ longer than in games). ATI did not open their GPUs for GPGPU computing but they fixed their shader compiler in their drivers so the F@H client works.
nVidia's C compiler is something different and it remains to be seen how useful it is.
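For anyone curious what "a C compiler for the GPU" means in practice, here is a minimal, purely illustrative sketch of a CUDA-style kernel. The SAXPY example, the buffer names and the launch parameters are my own assumptions, not taken from the article or from Nvidia's documentation:

// saxpy.cu -- minimal CUDA sketch; assumes the CUDA toolkit and a G80-class card.
#include <cstdio>
#include <cuda_runtime.h>

// Each thread computes one element of y = a*x + y.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    float *x = 0, *y = 0;
    cudaMalloc((void **)&x, n * sizeof(float));   // device buffers
    cudaMalloc((void **)&y, n * sizeof(float));
    // ... fill x and y with cudaMemcpy from host arrays (omitted) ...

    // Launch enough 256-thread blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaThreadSynchronize();

    cudaFree(x);
    cudaFree(y);
    return 0;
}

The point is simply that the kernel is ordinary C with a few extensions, rather than a pixel shader bent into doing math.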
I think you are wrong about the GPGPU.
An article previewing the architecture, while other sites are already providing benchmark results?!
To Tom's: you suck.
To Xbit: respect!
It seems like Nvidia's new 8x CSAA is pretty much equal to ATI's 6x FSAA, with roughly 6 samples per anti-aliased jaggy when zooming in at least 400% using ATI's Compressonator program.
Of course, Nvidia's new 8x pure MSAA is true 8x, which is better than ATI's 6x MSAA.
Notice how Xbit Labs will probably tell you that Nvidia's 8xQ FSAA (true 8x MSAA) brings a large performance hit, significantly larger than that of ATI's 6x FSAA. I bet that's why full 8x MSAA had not been done on video cards before the G80: the performance hit was too large and impractical in any current games. (Ah, those stubborn engineers, only allowing what is really needed by at least 5% of the market.) Now the G80 has the power, sitting at the very top of the current DX9 market, just as the GeForce 2 Ultra once sat at the top of the DX7 market (and, on a side note, can still run some current DX9 games with little or bearable visual-quality loss, such as Half-Life 2 in DX7 mode).
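To put a rough number on that performance hit, here is a small back-of-the-envelope sketch of the framebuffer cost of true multisampling. The assumptions (4 bytes of color plus 4 bytes of Z/stencil per sample, no compression, 1600x1200) are mine, not figures from the review:

// msaa_cost.cu -- back-of-the-envelope framebuffer size for various MSAA modes.
// Assumes 4 bytes of color + 4 bytes of Z/stencil per sample and no compression.
#include <cstdio>

int main()
{
    const long long width = 1600, height = 1200;
    const long long bytes_per_sample = 4 + 4;   // color + Z/stencil

    const int modes[] = {1, 4, 6, 8};
    for (int samples : modes) {
        long long bytes = width * height * samples * bytes_per_sample;
        printf("%dx MSAA at 1600x1200: ~%lld MB of framebuffer\n",
               samples, bytes / (1024 * 1024));
    }
    return 0;
}

Under those assumptions, 8x at that resolution is on the order of 117 MB of samples to store and resolve every frame, which suggests why earlier cards stopped at 4x or 6x.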
I think the major reason for 12 memory chips (768 MB) or 10 (640 MB) is that Nvidia could not find enough affordable GDDR3 or GDDR4 for a 1 GB configuration. Obviously, it was either too expensive or too scarce: perhaps Samsung only offered the densities needed for 1 GB of video memory as 2 GHz-plus GDDR4, or had not yet started manufacturing 128 MB chips (for 1 GB of video memory). But Nvidia really needed more than 512 MB for the resource-hungry 8800s, without running into problems supplying enough cards to meet demand.
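For what it's worth, the 768 MB and 640 MB figures fall straight out of the bus layout with the 64 MB (512 Mbit) GDDR3 chips the launch cards carry; a quick sketch of the arithmetic (host-side only, numbers from the published GTX/GTS specs):

// g80_memory.cu -- how chip count maps to capacity and bus width on G80 boards.
#include <cstdio>

int main()
{
    const int chip_mb = 64;        // one 512 Mbit GDDR3 chip
    const int chip_bus_bits = 32;  // each chip hangs off its own 32-bit channel

    const int chip_counts[] = {12, 10};   // 8800 GTX, 8800 GTS
    for (int chips : chip_counts) {
        printf("%2d chips -> %d MB on a %d-bit bus\n",
               chips, chips * chip_mb, chips * chip_bus_bits);
    }
    return 0;
}

That gives 768 MB on a 384-bit bus for the GTX and 640 MB on a 320-bit bus for the GTS.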
I bet that such high FSAA modes will quickly become popular and then standard in the near future.
I have an 8800 GTS and hoped to find some answers.