I think that using Quality Adaptive AA on ATI cards versus Transparency Multisampling AA on Nvidia cards isn't a fair comparison for the ATI cards, given the difference in effectiveness and performance cost of those AA settings. This quality comparison article from Xbit Labs makes the point:
According to the article, "Nvidia's Transparent AA Multi-Sampling hardly does any job at all. ATI's Performance Adaptive AA is clearly better than Nvidia's Transparent AA multi-sampling, but is obviously not as good as Quality Adaptive AA."
And the thing is, using Quality Adaptive AA hurts ATI performance badly, especially in Ubisoft games like Far Cry 2 and HAWX. With my single 4870 TOXIC 1GB at 780 MHz, on my X58 platform with a Core i7 at 3.34 GHz, I get over 40 fps average in HAWX at 1920 resolution on max settings with 4x AA (vs. 27 fps for the 4890 here), and slightly over 43 fps in Far Cry 2 at the same settings (vs. 44 fps here).
I think leaving AA in the default multisampling mode on both the ATI and Nvidia cards would make for a fairer performance comparison.
A response from the reviewer would be greatly appreciated.