Image Quality and Speed in 3DMark 2006
As in my GeForce driver review, I want to examine the correlation between the graphics card’s performance and image quality in 3DMark 2006. I ran the last two tests (the ones that support Shader Model 3.0), changing the image quality setting from High Performance to High Quality in the Catalyst driver’s Control Panel; Catalyst A.I. was left at its Standard position. Then I enabled anisotropic filtering and three levels of full-screen antialiasing (2x, 4x, 8x) through 3DMark06’s own settings. The tests were performed with Catalyst 8.12 at a resolution of 1920x1200.
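To make the scope of the testing clearer, the run list above can be sketched as a simple cross product. This is a minimal illustration only: the mode and setting names are hypothetical placeholders, and I am assuming every combination of test, driver quality mode, and filtering/antialiasing setting is benchmarked.

```python
from itertools import product

# Hypothetical names for the test matrix described in the text;
# the actual Catalyst slider positions and 3DMark06 settings may differ.
TESTS = ["Canyon Flight", "Deep Freeze"]          # the two SM3.0 tests
DRIVER_MODES = ["High Performance", "Performance",
                "Quality", "High Quality"]         # Catalyst quality slider
APP_SETTINGS = ["default", "16x AF",
                "2x FSAA", "4x FSAA", "8x FSAA"]   # set in 3DMark06 itself

def build_test_matrix():
    """Enumerate every (test, driver mode, app setting) combination."""
    return [
        {"test": t, "driver_mode": m, "setting": s}
        for t, m, s in product(TESTS, DRIVER_MODES, APP_SETTINGS)
    ]

matrix = build_test_matrix()
print(len(matrix))  # 2 tests x 4 modes x 5 settings = 40 runs
```

Enumerating the matrix up front makes it easy to see how many benchmark runs a full quality-versus-speed comparison requires at a single resolution.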
Here is how the quality of the Canyon Flight scene changes (frame 1350):
And here is how the graphics card’s speed changes depending on the image quality mode:
And now the screenshots for the second part of the diagram above, captured in the Deep Freeze test (frame 1150):
The two sets of screenshots suggest that the card’s performance does not change in the two 3DMark06 tests when the quality mode is adjusted from High Performance to High Quality in the driver, yet the image quality varies greatly. The scenes look even better and more detailed when you turn on 16x anisotropic filtering, but the frame rate drops; the performance hit is less dramatic in Deep Freeze than in Canyon Flight. Full-screen antialiasing, on the contrary, lowers the frame rate more in Deep Freeze than in the other test. The image quality improves steadily as you go from 2x to 8x FSAA.