FarCry 1.3: Crytek’s Last Play Brings HDR and 3Dc for the First Time

The FarCry title and its engine have proved to be the most technologically advanced released so far in 2004. The game, in development since 2001, has eventually acquired support for technologies that have yet to make it into the mainstream market: Shader Models 2.0b/3.0, HDR and 3Dc. Patch 1.3 is said to be the last patch for FarCry, but it brings a load of important innovations, and we decided they deserve a great deal of attention.

by Anton Shilov, Alexey Stepin, Yaroslav Lyssenko
11/02/2004 | 09:45 AM

Introduction

Weeks before the most anticipated game of all time – Half-Life 2 – hits the market, Crytek, a second-tier game developer, has updated the most visually attractive game released so far – FarCry – with some new capabilities and has finally officially enabled Shader Models 2.0b and 3.0 in the engine. The innovations Crytek provides today are High Dynamic Range (HDR) lighting and 3Dc normal map compression: both are likely to be used in future titles, and the majority of such titles are expected to run on future hardware.

 

Crytek had a number of reasons to release the FarCry 1.3 patch:

Crytek has achieved all of its goals; today we are here to evaluate performance gains and image quality, and also to find out whether the astonishing High Dynamic Range lighting brings performance down, and by how much.

Currently, neither HDR nor 3Dc is an officially supported capability: Crytek notes that they are at the beta testing stage and provides no warranties on these technologies. Answering the question of whether FarCry has a future, the company's officials say that Crytek is already working on a new engine and a new game project. This means that a true and correct implementation of HDR, 3Dc and some other technologies will only be available in future titles from Crytek, which will be published by EA Games rather than UbiSoft, or in upcoming games running on the FarCry engine.

HDR: Each Good Thing Needs a Proper Implementation

Before we proceed to the description of Crytek's implementation of HDR, let us revisit what HDR actually is, drawing on our original GeForce 6800 Ultra review.

What is HDR?

The major idea of High Dynamic Range is very simple: the lighting parameters (color and intensity) of the pixels forming the image should be described in real physical terms.

Today's universal image description model is the additive, hardware-dependent RGB (Red, Green, Blue) model, which was originally developed for display devices such as the CRT (Cathode Ray Tube), i.e. the regular computer monitor. According to this model, any color can be represented as a sum of three basic colors – red, green and blue – with properly selected intensities. The intensity of each basic color is split into 256 shades (intensity gradations).

The number 256 is a somewhat arbitrary choice that appeared as a compromise between graphics subsystem performance, the requirements of photorealistic imaging and the binary nature of computer calculations. In particular, it was found that 16.7 million shades (256x256x256) are more than enough for images of photographic quality. Moreover, 256 is easily coded in the binary system as 2^8, i.e. 1 byte.

Of course, any color in the RGB model is described by a triad of integers. Note that floating-point numbers (such as 1.6 or 25.4, for instance) cannot be used within this model, and the numbers used are essentially “fake”, i.e. they have nothing to do with real physical lighting parameters.

Certainly, the monitor's dynamic range (and the RGB model describing it) is not enough to represent all real-world images, or at least the part of the real world that the human eye can perceive. The typical consequence is the clipping of all intensities in the upper and lower parts of the range. An example here could be a room with an open window on a sunny summer day: the monitor will correctly display either the room interior or the part of the outdoor scene you can see through the window, but not both.

As far as the computer monitor is concerned, there is hardly anything you can do about it: you cannot increase the screen brightness to the level of the Sun. But if nothing can be done about the monitor, why not give up the RGB model, especially since this can be done absolutely painlessly? Let's describe the image with real physical values of light intensity and color, and let the monitor display all it can, as it will hardly be worse anyway. :) This is exactly the idea behind NVIDIA's HDRI (High Dynamic Range Images): for the pixels of the image we set the intensity and color in real physical values or values linearly proportional to them. Of course, all real (and “fake”) lighting parameters are now described with real numbers rather than integers, so 8 bits per channel will no longer suffice. This approach immediately eliminates all limitations imposed by the RGB model: theoretically, the dynamic range of the image is not limited at all. The question of discreteness and the number of brightness gradations is no longer acute, and the problem of insufficient color coverage is also solved.
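To make the difference tangible, here is a minimal C++ sketch; the luminance values and the convention that 1.0 corresponds to the monitor's maximum brightness are purely our own assumptions for illustration. An 8-bit channel clips everything above the display maximum and squeezes the rest into 256 steps, while a floating-point channel simply keeps the physical value for a later display pass.

    #include <algorithm>
    #include <cstdio>

    int main() {
        // Hypothetical linear luminance values; 1.0 is assumed to be the
        // monitor's maximum brightness (our convention for this example).
        float sunlitWall  = 7.5f;   // far brighter than the display can show
        float dimInterior = 0.02f;  // barely above black

        // 8-bit RGB channel: anything above 1.0 is clipped to 255 and the
        // rest is squeezed into 256 integer steps.
        auto to8bit = [](float v) {
            return static_cast<unsigned>(std::min(v, 1.0f) * 255.0f + 0.5f);
        };

        // Floating-point (HDR) storage keeps the original magnitudes intact,
        // leaving the decision of how to display them to a later pass.
        std::printf("8-bit: wall = %u, interior = %u\n", to8bit(sunlitWall), to8bit(dimInterior));
        std::printf("float: wall = %.2f, interior = %.2f\n", sunlitWall, dimInterior);
        return 0;
    }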

HDR and Hardware

Previous-generation graphics processors from NVIDIA did not support pixel shader output to several buffers simultaneously (Multiple Render Targets) or rendering into a buffer in floating-point representation (FP Render Target). ATI's RADEON R3xx family of graphics chips supported these features from the very beginning, which gave it an advantage over NVIDIA's solutions.

[Screenshots: NVIDIA GeForce 6800, HDR vs RGB rendering]

NVIDIA's GeForce 6 has finally acquired full support for Multiple Render Targets and FP Render Targets, which allowed the company's marketing people to introduce the term NVIDIA HPDR. It should be noted that NVIDIA's HPDR and ATI's HDR approaches are quite different at this time, so it is unclear whether game developers will support both or wait until ATI and NVIDIA come up with a single standard.

NVIDIA uses a compromise variant: the 16-bit OpenEXR “half” format developed by Industrial Light and Magic to describe the physical values. The 16-bit OpenEXR format devotes one bit to the sign, five bits to the exponent and ten bits to the mantissa of each color channel. The dynamic range thus stretches across roughly 9 orders of magnitude: from 6.14*10^-5 to 6.41*10^4.
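For reference, here is a minimal C++ sketch of how such a 16-bit value is decoded. The bit layout (one sign bit, five exponent bits, ten mantissa bits, exponent bias of 15) is that of the standard OpenEXR half type; the code only illustrates the format and has nothing to do with how the GPU handles it in hardware.

    #include <cmath>
    #include <cstdint>
    #include <cstdio>

    // Decode a 16-bit "half" value: 1 sign bit, 5 exponent bits, 10 mantissa
    // bits, exponent bias of 15 (the layout of the OpenEXR half type).
    float decodeHalf(std::uint16_t h) {
        int sign     = (h >> 15) & 0x1;
        int exponent = (h >> 10) & 0x1F;
        int mantissa =  h        & 0x3FF;

        float value;
        if (exponent == 0)          // denormal: no implicit leading 1
            value = std::ldexp(static_cast<float>(mantissa), -24);
        else if (exponent == 31)    // all ones: infinity or NaN
            value = mantissa ? NAN : INFINITY;
        else                        // normal number: implicit leading 1
            value = std::ldexp(1.0f + mantissa / 1024.0f, exponent - 15);

        return sign ? -value : value;
    }

    int main() {
        std::printf("%g\n", decodeHalf(0x7BFF)); // largest finite half: 65504
        std::printf("%g\n", decodeHalf(0x0400)); // smallest normal half: ~6.1e-5
        return 0;
    }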

The process of constructing and outputting an HDR image with the NV40 graphics processor is divided into three steps:

  1. Light Transport – rendering the scene with a high lighting dynamic range and saving the light characteristics of each pixel in a buffer that uses the OpenEXR floating-point data format. NVIDIA stresses that the NV40 supports floating-point data representation at every step of HDR scene creation, ensuring minimal quality loss.
  2. Tone Mapping – translation of the high dynamic range image into an LDR format (RGBA or sRGB); see the sketch after this list.
  3. Color and Gamma Correction – translation of the image into the color space of the display device (a CRT, an LCD monitor or anything else).
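To give an idea of what steps 2 and 3 boil down to, here is a minimal C++ sketch of a simple global tone mapping operator followed by gamma correction. The Reinhard curve and the exposure constant are illustrative choices of ours, not the operator FarCry actually uses.

    #include <algorithm>
    #include <cmath>
    #include <cstdint>

    // Map a linear HDR luminance value (arbitrary physical units) to an 8-bit
    // display value. The exposure constant and the Reinhard curve below are
    // illustrative choices only, not the operator FarCry actually uses.
    std::uint8_t toneMap(float hdrLuminance, float exposure = 0.5f) {
        float scaled = hdrLuminance * exposure;        // exposure adjustment (step 2)
        float mapped = scaled / (1.0f + scaled);       // Reinhard curve: compress to [0, 1)
        float gamma  = std::pow(mapped, 1.0f / 2.2f);  // gamma correction for the display (step 3)
        return static_cast<std::uint8_t>(std::clamp(gamma, 0.0f, 1.0f) * 255.0f + 0.5f);
    }

Unlike plain clipping, nothing is thrown away before this final step, so both very bright and very dark areas can be compressed into the displayable range gracefully.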

FarCry Gets HDR

Crytek decided to implement NVIDIA's (or, if you prefer, the OpenEXR) HDR approach in its FarCry title. The approach requires FP16 blending support, hence HDR will not work on ATI's hardware. Since the capability is officially unsupported, it is only logical that it works with some restrictions (e.g., you must have a certain type of hardware).

Generally speaking, HDR does improve image quality and realism in a number of cases. However, there are quite a few situations where HDR produces unrealistic lighting. Furthermore, it degrades performance dramatically. Despite the drawbacks, HDR in FarCry looks just great. Maybe it is not the perfect approach, but it definitely adds a new look to a game that the majority of gamers have already completed ;)

[Screenshots: NVIDIA GeForce 6800, HDR vs RGB rendering]

In case you are a happy owner of a GeForce 6800- or 6600-series graphics card, here is what you should do, provided that you have FarCry version 1.3, DirectX 9.0c and the NVIDIA ForceWare 66.81 drivers:

Keep in mind that HDR drastically affects performance, so be ready to compromise on your screen resolution. Nevertheless, the additional cost in speed is basically worth it – the jungle definitely looks more realistic during the daytime. At night and indoors, however, HDR is not implemented that well – twilight should not look like day, and a couple of LEDs should not produce the light of a small sun.

3Dc: Get the Speed, Admire Under Microscope

Unlike HDR, normal map compression – sometimes referred to as 3Dc – should not be expected to make any huge difference to image quality in FarCry. 3Dc, however, improves performance in addition to bringing some potential minor image quality gains.

So, What Are 3Dc and Normal Map Compression?

Normal maps are a new step in bump mapping techniques, and they are getting more and more popular today. The idea behind normal maps is that information about the object's surface is stored as a texture, where each texture element holds the three components of the vector perpendicular to the object surface at a given point, i.e. the normal vector.

[Screenshots: ATI RADEON vs NVIDIA GeForce, normal maps compressed vs not compressed]

The use of normal maps makes it possible to obtain a much more detailed and realistic image without increasing the number of polygons used to build it. In practice, a normal map can be created from the difference between a high-polygon model and a simple model of the object. Later on, you can use only the simple model, and the normal map you apply to it will make it look almost as good as the high-polygon reference.
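As a rough illustration of how such a map is consumed at render time, here is a minimal C++ sketch of per-texel diffuse lighting. The 0..255-to-(-1..1) unpacking is the common storage convention; the structure and function names are ours and have nothing to do with the actual FarCry engine code.

    #include <algorithm>
    #include <cstdint>

    struct Vec3 { float x, y, z; };

    // Unpack a normal stored in a texel: each 0..255 component maps to -1..1.
    Vec3 unpackNormal(std::uint8_t r, std::uint8_t g, std::uint8_t b) {
        auto expand = [](std::uint8_t c) { return c / 127.5f - 1.0f; };
        return { expand(r), expand(g), expand(b) };
    }

    // Simple Lambertian (diffuse) term: all the surface detail comes from the
    // per-texel normal rather than from extra geometry.
    float diffuse(const Vec3& n, const Vec3& lightDir) {
        float d = n.x * lightDir.x + n.y * lightDir.y + n.z * lightDir.z;
        return std::max(d, 0.0f);
    }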

Usually, normal maps describing the object surface are applied together with the base textures storing the object's color information. The higher the level of detail of the textures and normal maps, the more realistic the image looks. However, high-resolution textures and normal maps increase the memory bus workload, which is why alternative solutions are necessary to retain a high level of performance. DirectX 9 offers a set of DXTC algorithms that provide efficient texture compression and, despite the information lost during compression, manage to retain acceptable texture quality. The compression of normal maps, however, cannot be done with DXTC algorithms, because sharp changes of the normal vector get lost in compression and we get block artifacts instead.

The 3Dc algorithm, supported by the ATI RADEON X800 at the hardware level, is intended for normal map compression and does not degrade image quality nearly as much. When normal maps compressed with the 3Dc algorithm are used, the RADEON X800 stores them in graphics memory and transfers them to the VPU in compressed form, decompressing them “on the fly”. Restoring the third normal vector component in the pixel shader does not take up too many resources: just a few additional instructions need to be added to the shader.
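The reconstruction relies on the fact that a normal is a unit vector, so the third component follows from the other two. Here is a minimal sketch of those few instructions, written in C++ purely for illustration (in practice they live in the pixel shader):

    #include <algorithm>
    #include <cmath>

    // A 3Dc block stores only the X and Y components of each normal. Since a
    // normal is a unit vector, Z is rebuilt as sqrt(1 - x^2 - y^2); the max()
    // guards against tiny precision errors making the radicand negative.
    float reconstructZ(float x, float y) {
        return std::sqrt(std::max(0.0f, 1.0f - x * x - y * y));
    }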

The main thing you should know about 3Dc is that, when dual-component normal maps are compressed, 3Dc ensures a 2:1 data compression ratio. The overall compression, including the shift from the regular vector description to the dual-component description, reaches a ratio of 4:1 (if, say, the source normal occupies a regular 32-bit texel, dropping the third component and the padding leaves 16 bits, which 3Dc then halves to 8 bits per texel).

FarCry and Compression of Normals

Since NVIDIA currently does not have any algorithm specifically intended for normal map compression, Crytek implemented the normal map compression technology only for ATI's RADEON X800 product line. While you can enable normal map compression by typing “r_TexNormalMapCompressed 1” in the console, or by adding the same key to the game's shortcut, on GeForce FX, GeForce 6 and RADEON 9000-series hardware this will not be true normal map compression, but merely the use of a two-component texture that picks up the same idea of recalculating the Z value in a pixel shader.

[Screenshots: ATI RADEON vs NVIDIA GeForce, normal maps compressed vs not compressed]

Unfortunately, Crytek did not add any new normal maps to the game; therefore, there are hardly any eye-candy improvements.

Testbeds and Methods

Well, the time has come to investigate the performance impact the new FarCry features bring. For that we had two systems: one for AGP and one for PCI Express graphics cards.

Testbed 1:

Testbed 2:

Software:

As usual, we set graphics quality to an equal level: the RADEON and the GeForce hardware produced equal image quality. As usual, we used five demos recorded on the “Pier”, “Regulator”, “Research”, “Training” and “Volcano” levels.

Image Quality Comparison: But Does it Matter?

As usual, we also briefly examine the image quality difference between ATI's and NVIDIA's top-of-the-range graphics cards. While this ensures that there is no image quality (IQ) degradation on any of the boards, a thorough analysis is hardly needed. Since the GeForce 6 and the RADEON X800 today use different render paths, their IQ is not pixel-by-pixel identical. Hence, basically, the only measure here is whether you like it or not.

[Screenshots: ATI RADEON X800 vs NVIDIA GeForce 6800 image quality comparison, FarCry 1.3]

As you can see, NVIDIA has corrected its lighting bug in Shader Model 3.0 mode by adding some intensity. The company, however, did not correct its overly rough shadows.

Performance

HDR: A Costly Thing, a Work for SLI

Since those who enable high dynamic range lighting are looking for extreme image quality, we enabled the maximum level of anisotropic filtering as well.





Wow! The performance impact is just unbelievable: more than 50%. High dynamic range rendering requires tremendous memory bandwidth and quite a lot of additional computing power from the graphics processor and its memory subsystem; even today's high-end graphics cards cannot handle that load. Therefore, we should expect games to adopt the technique only when next-generation graphics processors get 24 – 32 pixel pipelines and advanced memory interfaces. Needless to say, a RADEON 9700 would show a dramatically low framerate with HDR enabled.

[Screenshots: NVIDIA GeForce 6800, HDR vs RGB rendering]

Even though HDR runs slowly, there is already a cure for that. Multi-GPU technology like NVIDIA SLI would probably help boost the speed. Of course, you would need a dual NVIDIA GeForce 6800 Ultra or GT installation, and you should consider carefully whether $1000 for a graphics subsystem is worth the improvement in FarCry.

Still, implementation drawbacks aside, with HDR enabled you get absolutely the best image quality and realism available today.

Normal Map Compression: 3Dc at Work

Let us now take a look at the normal map compression capability of FarCry.

3Dc Impact, Pure Speed





Well, the result is pretty clear: ATI's 3Dc-capable hardware actually gained some speed when normal map compression was enabled. In contrast, NVIDIA's GeForce products lost a few points after we activated normal map compression. Still, the performance numbers are very close with and without 3Dc.

The highest speed increase from 3Dc can be obtained on the “Research” level. However, it is rather strange that on “Research”, as well as in some other cases, ATI's RADEON X800 hardware had a lower minimal fps than comparable NVIDIA GeForce 6 hardware.

3Dc Impact, Eye Candy Mode

We also explored whether there are performance gains in eye candy mode with 3Dc/normal map compression enabled.





The situation in eye candy mode is the same as without antialiasing and anisotropic filtering: 3Dc gets ATI some speed boost on the “Research” level and a pretty much flat benefit in the other cases. The situation where the RADEON X800's minimal fps is lower than that of the comparable GeForce 6800 while its average fps is higher repeats here as well. The reason for that is unclear.

FarCry version 1.3: Out-of-box performance

Now that you know how to enable the unsupported HDR and 3Dc capabilities in FarCry, let us take a final look at the performance of various graphics cards in the game as you get it out of the box, with the latest official drivers from ATI and NVIDIA.

Training Level, Pure Mode

As usual, we begin our investigation with the “Training” level, which includes loads of vegetation, grass and water, along with caverns and huts, just like many other levels of FarCry.



The “Training” level demonstrates the pretty obvious performance advantage the new generation of graphics cards has over the older breed of GPUs.

With the highest possible settings, but without full-scene antialiasing and anisotropic filtering, the GeForce FX and the RADEON 9600 XT/X600 XT already fail to deliver playable fps in 1280x1024.

ATI's RADEON X800 and NVIDIA's GeForce 6800, however, still manage to offer a nice framerate even in 1600x1200. Note that the GeForce 6600 GT can deliver more than 40fps in 1600x1200.

Training Level, Eye Candy Mode



With 4x full-scene antialiasing and 16x anisotropic filtering enabled, few graphics cards actually succeed in showing even 40 frames per second. Needless to say, only the latest graphics cards with high-bandwidth memory and powerful GPUs keep up with the load well in 1280x1024.

As usual, ATI's RADEON X800 XT-series is slightly ahead of the competition with FSAA activated, while the RADEON X800 PRO finds itself somewhere between the GeForce 6800 and the GeForce 6800 GT.

Research Level, Pure Mode

Our demo recorded on the “Research” level this time includes indoor action only. The action in a cavern is representative of other indoor scenes in the game: a plethora of light sources, per-pixel lighting and multi-pass rendering amid rather complex geometry.



“Research” is probably among the toughest levels for modern VPUs: a pretty serious geometry load is combined with numerous light sources. As a result, graphics cards with relatively slow math performance – e.g. the GeForce FX – drop to very low speeds. By contrast, the RADEON 9800-series performs pretty well in 1280x1024.

In general, the performance of the GeForce 6800 and RADEON X800 families of graphics processors is astonishing, and for comparable products it is pretty close. Still, in some cases the RADEON X800's minimal fps is lower than that of competing products. While there are only a few places in the demo where the minimal fps drops significantly, there may be cases where NVIDIA's GeForce 6800-series provides more comfortable gameplay even with a slightly lower average fps score.

Research Level, Eye Candy Mode




With eye candy features enabled, harder times come for the graphics cards. In 1024x768, ATI's RADEON X800 XT-series leaves NVIDIA's GeForce 6800 GT/Ultra behind in terms of minimal and average speed. Nevertheless, already in 1280x1024 the minimal performance the RADEON X800 family has to offer drops to pretty low levels, which means that the GeForce 6800 Ultra/GT produced smoother gameplay thanks to lower variability in framerate.

Pier Level, Pure Mode

The “Pier” level contains loads of vegetation along with water and pretty thorough physics calculations, which is why the fps in our demo here is not as high as on the “Training” level.



All graphics cards except the GeForce FX, the GeForce 6600 (non-GT version) and the RADEON X600 XT delivered excellent performance even in high resolutions on the “Pier” level.

The situation where the RADEON X800-series demonstrates a lower minimal framerate than the comparable GeForce 6800-series persists here as well.

We do not know the reason for such behavior, but here is an observation:

Although we have changed our “Pier” demo since our previous benchmarking sessions, the new one continues to show that the RADEON X800 XT PE's minimal fps in 1600x1200 is around 43, while its average is approximately 85.

Pier Level, Eye Candy Mode

Once FSAA and anisotropic filtering are enabled, ATI's RADEON X800 XT and X800 PRO spread their wings and manage to outperform both the GeForce 6800 Ultra and the GeForce 6800 GT in FarCry based on average fps results. Still, the results demonstrated by NVIDIA's latest hardware are fully playable.



Minimal fps, however, shows a win for NVIDIA.

Regulator Level, Pure Mode

On the “Regulator” level there is, again, complex per-pixel lighting indoors. As in the previous case, the RADEON X800-series requires a number of passes for the lighting, while the GeForce 6800-series requires only one.



All graphics cards, save for the GeForce FX-series, the 6600 and the RADEON X600 XT, demonstrate decent performance numbers. However, NVIDIA's GeForce 6800 Ultra and GT clearly outperform ATI's RADEON X800 and RADEON 9800-series in high resolutions.

Regulator Level, Eye Candy Mode



NVIDIA's GeForce 6800 GT and 6800 Ultra hardware manages to keep up with ATI's RADEON X800 XT-series in eye candy mode; however, the RADEON X800 PRO and RADEON 9800 XT lag behind the competing products from NVIDIA Corp.

Volcano Level, Pure Mode

“Volcano” is yet another perfect example of what “long” pixel shaders can bring: a multitude of light sources clearly impacts performance negatively on hardware that cannot optimize the rendering process, while the RADEON X800 and GeForce 6800 families of graphics chips deliver a significant performance advantage thanks to more efficient programming using Shader Models 3.0 and 2.0b.



Even though all graphics cards demonstrate pretty good results, in high resolutions NVIDIA’s GeForce 6800 Ultra and GT take the lead over competing RADEON X800 products.

Volcano Level, Eye Candy Mode

While the difference between the top boards from ATI and NVIDIA is negligible with AF and FSAA activated, the small advantage is on ATI’s side.



Conclusion

Technologies

Before proceeding to the benchmark analysis, let us take a quick look at the technologies Crytek has incorporated into its FarCry game since its release in the spring:

So, all in all, the new technologies, except HDR, do not bring radically new advantages to the FarCry game. Since the FarCry engine is available for purchase, we believe that future titles based on the same engine may show some benefit from the new rendering techniques Crytek has managed to implement. Crytek itself is now working on another game, which may implement the technologies listed above more thoroughly.

Performance

Now, let us talk more about the performance of modern graphics cards in FarCry.

If you want to play FarCry with the highest settings, you should probably get one of the top graphics cards from ATI Technologies or NVIDIA Corp., which cost $399 - $499 these days. While products like the GeForce 6600 GT or the GeForce 6800 offer great performance in FarCry, their performance drops once anisotropic filtering and full-scene antialiasing are enabled.

The main advantage NVIDIA's GeForce 6 family of graphics cards has over competing RADEON X800 and RADEON X700 hardware is support for HDR. Provided that you spend some time tweaking your game settings, you are likely to find the right balance between image quality and speed and enjoy the game with more natural lighting effects.

ATI's RADEON X800 hardware, however, handles antialiasing and anisotropic filtering better than the GeForce 6800-series, which is also important. Unfortunately, ATI's RADEON X800 has some issues with minimal fps that first emerged when Crytek launched its 1.2 patch and that got worse when Shader Model 2.0b support was enabled. Based on our investigation, those speed drops are not that serious, but some may dislike them.

When you choose more affordable hardware, such as the RADEON 9800-series or the GeForce 6600 GT, you are likely to get a very nice price/performance ratio, but you will probably have to compromise image quality for sufficient speed. NVIDIA's GeForce FX-series and ATI's RADEON 9600-/X600-series can probably handle the game, and it will still look good, but you will hardly experience FarCry in all its glory on such graphics cards.

Talking about image quality drawbacks, we should say that they can be found on both sides: the NVIDIA GeForce 6 does not render shadows as smoothly as ATI's RADEON X800, and the two produce lighting of different quality – it is really hard to determine which is better.