Before proceeding to the benchmark analysis, let us take a quick look at the technologies Crytek has incorporated into its FarCry game since its release in the spring:
- Shader Model 3.0-optimized rendering engine. The first implementation of the Pixel Shaders 3.0 brought some additional performance to NVIDIA GeForce 6 hardware. It did not, however, improve image quality.
- Shader Model 2.0b-optimized rendering engine deserves the same comment as the Shader Model 3.0 engine – additional speed, no image quality improvements. This basically confirms ATI’s claim that virtually everything that can be done with Shaders 3.0 today can also be done with Shaders 2.0b.
- OpenEXR HDR rendering delivered a major image quality improvement to NVIDIA’s GeForce 6 GPUs. This implementation had been tested for months by Crytek, and while it looks great, it still deserves a more polished implementation. Furthermore, modern hardware cannot run FarCry with HDR at high speeds.
- 3Dc normal map compression leaves a somewhat mixed feeling. It provides a tiny speed bump and no image quality improvement. Basically, it is nearly safe to say that FarCry does not benefit from 3Dc.
So, all in all, the new technologies, except HDR, do not bring radically new advantages to FarCry itself. Since the FarCry engine is available for purchase, we believe that future titles based on the same engine may show real benefits from the new rendering techniques Crytek managed to implement. Crytek itself is now working on another game, which may implement the technologies listed above more thoroughly.
Now, let us talk more about the performance of modern graphics cards in FarCry.
If you want to play FarCry at the highest settings, you will probably want one of the top graphics cards from ATI Technologies or NVIDIA Corp., which cost $399 - $499 these days. While products like the GeForce 6600 GT or the GeForce 6800 offer great performance in FarCry, their performance drops once anisotropic filtering and full-scene antialiasing are enabled.
The main advantage NVIDIA’s GeForce 6 family of graphics cards has over competing RADEON X800 and RADEON X700 hardware is support for HDR. Provided that you spend some time tweaking your game settings, you are likely to find the right balance between image quality and speed and enjoy the game with more natural lighting effects.
ATI’s RADEON X800 hardware, however, delivers better antialiasing and anisotropic filtering quality than the GeForce 6800 series, which is also important. Unfortunately, ATI’s RADEON X800 has some issues with minimum fps that first emerged when Crytek launched its 1.2 patch and that got worse when Shader Model 2.0b support was enabled. Based on our investigation, those speed drops are not that serious, but some may dislike them.
If you choose more affordable hardware, such as the RADEON 9800 series or the GeForce 6600 GT, you are likely to get a very nice price/performance ratio, but you will probably have to trade some image quality for enough speed. NVIDIA’s GeForce FX series and ATI’s RADEON 9600/X600 series can probably handle the game, but it will not look perfect, and you will hardly experience FarCry in all of its glory on such graphics cards.
Speaking of image quality drawbacks, we should say that they may be found on both sides: the NVIDIA GeForce 6 does not render shadows as smoothly as ATI’s RADEON X800, and the two produce lighting of different quality – it is really hard to determine which one is better.