Influence of Texture Quality Settings on the Graphics Card Performance
Since this is mostly a theoretical investigation, we decided to pay due attention to the way FIC R96P behaves in synthetic benchmarks and to its performance in those applications. First, we carried out a few fillrate tests in order to find out how the graphics chip frequency affects the results. Here is what we obtained:
You can easily notice that with single texturing the memory frequency influences the fillrate more than the graphics chip frequency does. Just compare the cards working at 500/600MHz and 500/500MHz. The fillrate drops evenly as the anti-aliasing level grows.
The results with multi-texturing enabled turn out more interesting. The graphics core frequency becomes a very important factor: just look at the performance gain provided by the frequency increase from 400MHz to 500MHz! You can also see that raising the FSAA level increases the dependence of the fillrate on the memory frequency. This dependence is especially evident on the last two graph lines (for 400/500MHz and 400/600MHz).
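The growing influence of the memory clock under FSAA comes down to bandwidth: anti-aliasing multiplies the samples read and written per pixel. A minimal sketch of peak memory bandwidth makes the difference between the two memory clocks concrete; note that the 128-bit bus width and the treatment of the clocks as effective (DDR) rates are assumptions for illustration, not specifications quoted in this review.

```python
# Rough peak-bandwidth estimate: why the memory clock matters under FSAA.
# Assumptions (not taken from the review): 128-bit memory bus,
# clocks given as effective DDR transfer rates.

def memory_bandwidth_gbs(effective_clock_mhz: float, bus_width_bits: int = 128) -> float:
    """Peak memory bandwidth in GB/s for a given effective memory clock."""
    bytes_per_transfer = bus_width_bits / 8
    return effective_clock_mhz * 1e6 * bytes_per_transfer / 1e9

# The two memory clocks compared in the text:
bw_500 = memory_bandwidth_gbs(500)  # 8.0 GB/s
bw_600 = memory_bandwidth_gbs(600)  # 9.6 GB/s
```

Under these assumptions the 600MHz memory offers roughly 20% more bandwidth, which lines up with the fillrate gap widening as the FSAA level rises.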
For a better comparison, have a look at the results of a similar test carried out on NVIDIA GeForce FX 5600 Ultra:
Well, we don’t notice anything particularly interesting here. The fillrate is on the whole higher than that of FIC R96P, except in the FSAA 6xS mode. As I have already mentioned, the flexible architecture of NV31 allows it to change its pipeline configuration. With a single texture, the graphics processor works in a 4 x 1 configuration, i.e. 4 pipelines with 1 TMU each. However, as soon as multi-texturing is enabled, the chip switches to 2 x 2, i.e. 2 pipelines with 2 TMUs each.
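The consequence of this reconfiguration is easy to work out on paper: the texel fillrate stays the same, but the pixel fillrate is halved. A small sketch of the theoretical figures, assuming a 400MHz core clock purely for illustration:

```python
# Theoretical fillrate of NV31's two pipeline configurations described above.
# The 400 MHz core clock is an assumed value used only for illustration.

def fillrates(pipelines: int, tmus_per_pipe: int, clock_mhz: int):
    """Return (pixel fillrate, texel fillrate) in Mpixels/s and Mtexels/s."""
    pixel = pipelines * clock_mhz
    texel = pipelines * tmus_per_pipe * clock_mhz
    return pixel, texel

single = fillrates(4, 1, 400)  # 4 x 1 mode: (1600, 1600)
multi  = fillrates(2, 2, 400)  # 2 x 2 mode: (800, 1600)
```

In other words, switching to 2 x 2 keeps the multi-texturing texel rate intact at the cost of half the pixel output, which is worth keeping in mind when reading the graphs above.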
As you see, GeForce FX 5600 Ultra loses ground when multi-texturing is enabled. This is probably because the 2 x 2 scheme it uses is less efficient in the modes that involve FSAA.