Dear forum members,
We are delighted to inform you that our forums are back online. Every topic and post is back in its place, and everything works just as before, only better. Welcome back!


Discussion on Article:
ATI Radeon HD 6800: Generation Next?

Started by: GavinT | Date 11/02/10 12:03:28 PM
Comments: 22 | Last Comment:  11/25/10 12:08:16 AM



Isn't the 6000 series meant to be a mix of last generation Cypress and next generation Northern Islands architectures? That would surely make Southern Islands a stepping stone between Cypress and Northern Islands, a half generation so to speak.
0 0 [Posted by: GavinT  | Date: 11/02/10 12:03:28 PM]

It is, but the Radeon HD 6800 should nevertheless be considered the second generation of ATI's DX11 architecture.
0 0 [Posted by: Vader@Xbit  | Date: 11/03/10 12:21:04 PM]

Sadly, you aren't comparing apples to apples. The image quality of the new 68x0, and since Catalyst 10.10 of the 58x0 as well, has been severely reduced. Texture flickering is more than a small problem for AMD nowadays, and there are quite a few in-depth articles about it on German-language IT websites, none of which has been translated so far. You should have a look at HT4U, TweakPC and especially ComputerBase (http://www.computerbase.d...0/bericht-radeon-hd-6800/), which clearly show that you need to compare AMD's HQ mode with nVidia's default quality to get a more or less similar IQ. And even then, nVidia is slightly better. The quality of nVidia's HQ is simply unachievable with AMD, a very frustrating problem for people who would like to buy one of their products.
0 0 [Posted by: sanity  | Date: 11/02/10 10:16:51 PM]

We are considering the possibility of using default driver settings in future articles.
0 0 [Posted by: Vader@Xbit  | Date: 11/03/10 12:18:51 PM]
"We are considering the possibility of using default driver settings in future articles."
Strange reply, especially when there are no remarks suggesting that you would provide IQ comparisons in currently released games. If you are going to reduce IQ via the nVidia control panel, I think it's only appropriate to show IQ comparisons explaining why.
In any case, I'm not sure who would consider any future reviews valid when enabling default settings turns on all the optimizations in the nVidia control panel while reducing them in CCC. It's really that simple.

Oh, and BTW, the review linked earlier still had "Enabled Surface Optimizations" checked. This made the videos invalid.
0 0 [Posted by: CPUGuy  | Date: 11/04/10 11:29:56 AM]
I didn't say that we'll definitely use default settings. We're just considering it. Perhaps we won't. But the majority of gamers use these default settings anyway.
0 0 [Posted by: Vader@Xbit  | Date: 11/04/10 06:15:21 PM]
Fair enough. But remember, AMD and Nvidia implement:
-AF differently
-AA differently (CSAA vs MLAA)
-profiles in general differently
-optimizations differently

I, as well as others, would simply like to see control panel settings made equitable between cards.
0 0 [Posted by: CPUGuy  | Date: 11/05/10 11:29:32 AM]

How can you even say this absurdity: "Even in static screenshots from Fallout: New Vegas it is next to impossible to tell MLAA from ordinary 4x MSAA, even if you use Compressonator. Gamers are unlikely to feel any difference, especially during actual gameplay". 4x MSAA roughly equal to MLAA in quality? Seriously? Try some other games; the difference is so big I thought you were joking!
0 0 [Posted by: TAViX  | Date: 11/03/10 02:14:54 AM]
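[Editor's note: claims that two AA modes are indistinguishable in static screenshots can be checked numerically rather than by eye. A minimal sketch of such a check, using hypothetical 2x2 pixel data rather than real screenshots; the function name and data are illustrative, not from any real tool:]

```python
def mean_pixel_diff(img_a, img_b):
    """Mean absolute per-channel difference between two same-sized images.

    Each image is a flat list of (r, g, b) tuples; 0.0 means the images
    are identical, 255.0 means maximally different.
    """
    assert len(img_a) == len(img_b), "images must be the same size"
    total = 0
    for (ra, ga, ba), (rb, gb, bb) in zip(img_a, img_b):
        total += abs(ra - rb) + abs(ga - gb) + abs(ba - bb)
    return total / (3 * len(img_a))

# Hypothetical 2x2 "screenshots": a hard aliased edge vs. a blended one.
msaa = [(0, 0, 0), (255, 255, 255), (0, 0, 0), (255, 255, 255)]
mlaa = [(0, 0, 0), (128, 128, 128), (0, 0, 0), (255, 255, 255)]
print(mean_pixel_diff(msaa, mlaa))  # 31.75
```

A near-zero result would support "next to impossible to tell apart"; a large one would support TAViX's objection.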

We'll check later. I personally think that all these "Super-Duper Ultra Progressive" AA modes are total overkill, if not outright pointless marketing blah-blah. In 99% of real cases, 4x MSAA is more than enough, with transparency AA enabled, of course.
0 0 [Posted by: Vader@Xbit  | Date: 11/03/10 12:15:05 PM]
Indeed. At higher resolutions, using more than 8x AA is pointless. Garbage like Edge Detect and 24x AA just makes the image worse.
MLAA, on the other hand, is IDEAL for games that don't have an AA implementation, like Metro, Batman, GTAIV, etc. But I will NEVER prefer MLAA over 4x AA when the latter is available.
0 0 [Posted by: TAViX  | Date: 11/04/10 08:43:00 AM]
Just wait a little; we're going to update the MLAA section soon. A spoiler: with MLAA, micro-geometry looks awful.
0 0 [Posted by: Vader@Xbit  | Date: 11/04/10 06:17:30 PM]

Try the new 10.10d hotfix driver and see if that yields better results.
0 0 [Posted by: GavinT  | Date: 11/03/10 05:36:19 AM]

How many gamers out there are using the 2560x1600 resolution?

Over the years, following review sites, I have observed an inflation of benchmark settings. As video cards improve, settings and resolutions are pushed higher and higher in order to see practical performance differences between GPUs.

The fact remains that on a 22-inch monitor, 1280x800 with maximum quality settings provides a very enjoyable gaming experience. And a monster card is not required to achieve this.

The fundamental question is: do we want to play games, or to engage in an arms race toward the ultimate absolute performance?

For example, despite its flashy effects and high hardware requirements, many people soon find that Crysis can become boring.
0 0 [Posted by: BernardP  | Date: 11/03/10 08:45:10 AM]

1280x800 on a 22" monitor will be a non-native, non-pixel-to-pixel mode, so the picture will look like crap. All modern 22" monitors are 1920x1080 already, and even older models are 1680x1050. 19" and some 20" models are 1600x900.
0 0 [Posted by: Vader@Xbit  | Date: 11/03/10 12:09:52 PM]
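[Editor's note: whether a lower resolution can be shown pixel-to-pixel on a panel is simple arithmetic: the panel-to-render scale factor must be the same integer in both axes. A small sketch of that check, using the resolutions mentioned above; the function name is illustrative:]

```python
def pixel_perfect_scale(render, panel):
    """Return the integer scale factor if `render` (w, h) maps
    pixel-to-pixel onto `panel` (w, h), else None."""
    rw, rh = render
    pw, ph = panel
    if pw % rw == 0 and ph % rh == 0 and pw // rw == ph // rh:
        return pw // rw
    return None

print(pixel_perfect_scale((1280, 800), (1920, 1080)))  # None: 1.5x vs 1.35x
print(pixel_perfect_scale((960, 540), (1920, 1080)))   # 2
```

1280x800 on a 1080p panel fails both ways (fractional scale, mismatched 16:10 vs 16:9 aspect), which is why the upscaled picture looks blurry.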

As the Steam hardware survey shows, 40% of its users are already at 1680x1050 or higher resolutions. And Steam isn't known for a particularly high-end-oriented user base.
0 0 [Posted by: sanity  | Date: 11/03/10 10:42:54 AM]

Then there is no point in having graphics card reviews, and you shouldn't have wasted your time, because at 1680x1050 you are going to be either CPU-limited, or the frame rates will be so high it won't make a difference. And while I agree with you that there are a lot of people spending $350+ on video cards and playing at 1920x1080, which is overkill, I do believe that if you are spending more than $350-400 (at current pricing) you should own a 2560x1600 monitor.
Edit: This was supposed to be a response to BernardP
0 0 [Posted by: jonup  | Date: 11/03/10 10:52:08 AM]

The point is, can you have a 30" monitor and play at its native resolution? Because in non-native modes, picture quality is far from optimal.
0 0 [Posted by: Vader@Xbit  | Date: 11/03/10 12:11:42 PM]

Too bad there are no native DX11 games out, only DX9/DX10 games with a few DX11 features added on.

Also isn't it funny that an almost 2 yr old NVIDIA card (GTX 295) is still the 2nd fastest card in the world?
0 0 [Posted by: LedHed  | Date: 11/03/10 02:11:10 PM]

What are you smoking, because I want some of that?
0 0 [Posted by: jonup  | Date: 11/03/10 03:39:35 PM]

It seems that AMD is just buying time and wants to see NVIDIA's reaction (if there is one). But overall, these chips are not awesome.
0 0 [Posted by: Pouria  | Date: 11/03/10 04:58:00 PM]

sorry, but the second page is a real pain to read.

"at first sight... a step backwards"
"The reduced number of ALUs and texture-mapping units and the lower fillrate parameters may lead an inexperienced user to this conclusion."
"Superficially, the Radeon HD 6800 only seems to have but one advantage over the previous series. Its clock rate is 900 MHz"
"Well, of course this superficial approach to evaluating the new series is incorrect."

you are being condescending and maybe even demeaning to the reader. first, you point out the facts about the 68xx vs 58xx, and then you say that this is a simplified, superficial and inexperienced point of view. that is a load of crap.

i am not arguing with 64-bit being next to useless in games, but you go on to detail the whole plethora of differences in the following pages - it is the exact same architecture with the following changes:
1. dropped 64-bit calculations
2. added tessellator.
that is it.

performance-wise, the 68x0 cards fall on either side of the 5850 and generally won't be able to touch the 5870. granted, they are not supposed to, but then naming them 68x0 was quite a bad move by amd.

by the way, morphological antialiasing is not so much a hardware feature as a software one. it is perfectly usable (with modified drivers to enable the feature) on 58xx cards.
0 0 [Posted by: londiste  | Date: 11/05/10 02:41:12 AM]
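[Editor's note: the point that MLAA is a post-process pass rather than fixed-function hardware can be illustrated with a toy filter. This is a deliberately simplified sketch of the general idea, not AMD's actual shader: find luminance discontinuities along a scanline and blend across them, softening the step the way morphological AA softens jagged silhouettes:]

```python
def mlaa_1d(row, threshold=64):
    """Toy morphological AA on one scanline of grayscale pixels (0-255).

    Wherever two neighbours differ by more than `threshold` (a detected
    'edge'), both pixels are nudged halfway toward their average.
    """
    out = list(row)
    for i in range(len(row) - 1):
        if abs(row[i] - row[i + 1]) > threshold:
            avg = (row[i] + row[i + 1]) // 2
            out[i] = (row[i] + avg) // 2
            out[i + 1] = (row[i + 1] + avg) // 2
    return out

print(mlaa_1d([0, 0, 255, 255]))  # [0, 63, 191, 255]
```

Because the pass only reads the finished frame, it needs no extra samples from the rasterizer, which is exactly why it could be enabled on 58xx cards via modified drivers.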

Disappointing to say the least.
0 0 [Posted by: beck2448  | Date: 11/25/10 12:08:16 AM]

