UPDATE: Added a quality comparison section covering transparent texture antialiasing modes.
Nothing is perfect: everything we create, buy, or receive as a gift has its advantages and disadvantages. Whenever we make a choice, we have to weigh the pros and cons, decide which advantage matters most to us, and which disadvantage causes the least inconvenience.
Graphics cards are no exception: one has a unique feature set; another consumes less power; a third delivers extreme performance but costs too much; yet another boasts a nice price/performance ratio but produces a lot of noise, and so on.
X-bit labs consistently pays attention to multiple factors, such as performance, power consumption, and noise level. However, one thing our reviews have generally lacked is an image quality (IQ) comparison. At some point in the past we chose roughly equivalent driver settings for ATI and Nvidia hardware to eliminate image quality differences, and then simply kept using them without ever comparing IQ head to head. Recently, though, we discovered that at our test settings Nvidia GeForce 7-series hardware produced considerably lower image quality than ATI due to lower-quality trilinear filtering. In addition, the ATI Radeon X1000 series supports high-quality angle-independent anisotropic filtering. On the other hand, Nvidia offers very fine 8xS antialiasing across its entire lineup, which leaves gamers guessing who actually renders the highest-quality 3D world.
Well, in this article we will try to answer this question!