Contemporary Graphics Cards in Call of Duty: World at War

The Call of Duty series is one of the most popular game franchises about World War II. In today’s article we are going to discuss the performance of contemporary graphics accelerators in the latest installment of the series, Call of Duty: World at War.

by Alexey Stepin , Yaroslav Lyssenko
02/08/2009 | 11:05 AM

World War 2 is undoubtedly one of the major events of world history at large and of the 20th century in particular. It is depicted in countless books and movies and, of course, is a popular theme in the video game industry. Indeed, there are a lot of WW2 games of different genres, from global turn-based strategies in which you can test your leadership skills to first-person shooters for those who want to live the life of a rank-and-file soldier and feel all the excitement of hot combat. One of the most popular games about that war is Call of Duty, which has evolved into a complex and far-reaching series with an army of fans.


Unlike such dinosaurs of gaming franchises as Doom or Fallout, Call of Duty does not trace its origins back to prehistoric times. The original game of the series came out in 2003, when 3D graphics accelerators had already become widespread. Developed by Infinity Ward and published by Activision, the game ran on the id Tech 3 engine that had powered the world-famous multiplayer shooter Quake 3: Arena. As opposed to the latter, Call of Duty was not limited to multiplayer. Instead, it offered three full-featured plotlines: for Private Joey Martin of the U.S. 101st Airborne Division, Sergeant Jack Evans of the British 6th Airborne Division and Special Air Service, and Private Alexei Voronin of the 13th Guards Rifle Division of the Soviet Army. This approach helped the gamer see the war from different sides and take part in different episodes of it. The game enjoyed a warm welcome from the audience, receiving a number of awards, and was soon followed by an add-on, Call of Duty: United Offensive.

A full-featured sequel was released on October 25, 2005, on a wide range of platforms: from the PC and Xbox 360 to Mac OS X, PDAs, smartphones and even ordinary cell phones. Call of Duty 2 carried on the tradition of depicting World War 2 from different points of view. Its plotlines focused on the Eastern Front, the North African campaign, and D-Day, when the Allies landed in Normandy. In the game’s finale the gamer was to fight two famous Tiger I tanks at once. The developer kept up its high quality standards, and the second game of the series was received just as warmly as the first one. Call of Duty 2 ran on a new in-house engine that improved its visuals noticeably.

The third installment of the series was developed by Treyarch and Pi Studios. It was a console-only project, never released for the PC. We can only tell you that Polish and Canadian campaigns were added to the traditional warring parties in it.

The fourth game marked a sudden departure from the WW2 theme. CoD 4: Modern Warfare is set in our time, its plot being almost as surrealistic as that of the Red Alert games. Bent on restoring the Soviet Union, Russian ultranationalist Imran Zakhaev provokes a war in the Middle East to draw public attention away from his actions in Russia. The game begins with an SAS group infiltrating an Estonian-registered cargo ship suspected of having a nuclear weapon on board. You mostly play as Sergeant “Soap” MacTavish, but for a few levels he is replaced by Paul Jackson of USMC 1st Force Recon. The other roles the player has to enact throughout the game are episodic. Despite the surrealism, the plot is well-devised and interesting to follow. No wonder the game got positive reviews from all leading critics.

Call of Duty: World at War Plot

The fifth game of the series was again developed by Treyarch which decided to return to the WW2 theme. As we said above, this is a widespread subject in video games, but fortunately, it was such an immense event that you can always find some unexplored aspect of it. This time the developer chose the Pacific campaign and the battle for Berlin.


The game begins with US Marines raiding Makin Island. Historically, this raid had many objectives, including the destruction of the defensive infrastructure, capturing prisoners, obtaining intelligence, and drawing Japanese forces away from the Allied landings on Guadalcanal and Tulagi. The player sees the world through the eyes of Private Miller, who watches a Japanese officer kill a prisoner and prepares to meet his own fate. Fortunately, he is rescued by a squad headed by Sergeant Sullivan. Miller joins them and the game begins.


Throughout the first mission you will clear the island of the defending Japanese, blow up an ammo warehouse and evacuate a wounded comrade. The next mission takes place two years later, depicting the landing on Peleliu, which was almost as action-heavy as the famous D-Day. You surely won’t get bored.




The first mission of the Soviet campaign begins on September 17, 1942, describing the battle of Stalingrad.

The beginning is somewhat macabre: the wounded Private Dimitri Petrenko comes to among the bodies of his fellow soldiers and watches the Nazis execute the rest of his unit. Fortunately, he goes unnoticed by the enemy.



Soon you find out that Petrenko is not the only survivor. You meet Sergeant Reznov, who is bent on killing Nazi General Amsel, the man responsible for the killings in Stalingrad. You will flee from flamethrowers, take part in an attack on a German signal office, and finally hunt down the General.



Like the American campaign, the next Soviet mission takes place a couple of years after the first one. The game is not unlike a TV series because the Soviet and American missions intertwine. Well, you’ll discover all that for yourself as you take on the roles of a flamethrower operator, a T-34 driver, and many others.


And finally you will end up planting the red flag above the Reichstag.

Call of Duty: World at War Gameplay

There is little point in describing the gameplay mechanics of a first-person shooter because the canons of this genre were established long ago, and something new and original appears only very seldom. Call of Duty: World at War offers classical controls, but there are a few things that add to the realism of the game world. First, there is a limit on the weapons you can carry: no more than two at a time, save for grenades, which are counted individually. Some FPS players may be annoyed at that, but it is indeed hard to imagine a real-life soldier carrying half a dozen weapons plus a generous stock of ammo. Of course, you can pick up weapons from killed enemies, replacing any weapon in your arsenal. There are also unique weapon variants equipped with optical sights or with special rifle grenades instead of ordinary bullets.


Then, the gamer can crouch or lie down, which has an appropriate effect on aiming accuracy as well as on the chance of being shot. Since the game doesn’t offer traditional medical kits, hiding behind a boulder or wall is the only way of restoring your health if you have been shot. Of course, wounds do not heal so quickly in the real world, but this is a necessary compromise for the sake of interesting gameplay. Anyway, health restoration kits wouldn’t be appropriate in such a game as Call of Duty: World at War. By the way, it is not so easy to find shelter because the fifth part of the series has inherited one of the most interesting features of the fourth: it calculates realistically how different materials can be shot through. As in the real world, a thin wooden wall won’t save you from an enemy bullet.
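As an illustration only (the game’s actual penetration formulas are not public), the idea of material-dependent shoot-through can be sketched as follows; the material names and numeric values here are made up for the example:

```python
# Illustrative sketch, NOT the actual engine logic: a shot passes through
# a surface only if the round's penetration rating exceeds the material's
# stopping power, and the damage carried through is reduced accordingly.

MATERIALS = {"wood_thin": 0.2, "brick": 0.7, "steel": 0.95}  # made-up values

def damage_after_penetration(base_damage, penetration, material):
    """Return damage carried through the surface, or 0.0 if the shot is stopped."""
    stopping = MATERIALS[material]
    if penetration < stopping:
        return 0.0  # the material stops the bullet outright
    # Otherwise the damage falls off with the material's stopping power.
    return base_damage * (1.0 - stopping)

print(damage_after_penetration(100, 0.5, "wood_thin"))  # thin wood: 80.0 gets through
print(damage_after_penetration(100, 0.5, "brick"))      # brick stops this round: 0.0
```

With values like these, a thin wooden wall barely attenuates a rifle round, which is exactly the behavior the game punishes careless players for.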


In addition to the traditional multiplayer mode, Call of Duty: World at War offers the option of cooperative play, which is quite rare in today’s games. The console versions allow two players to play at the same time in split-screen mode, whereas the PC version can be played by four people simultaneously over a network connection. In this mode you can upgrade your game character, which may be useful in multiplayer battles against other gamers later on. The so-called Death Cards are an interesting innovation, too. Each of them can either endow the player with additional capabilities or make the mission objective harder to achieve (e.g. the Jack of Spades makes the enemies invulnerable except when you shoot them right in the head). There is also an integrated mini-game, Nazi Zombies, which you can access upon completing the single-player campaign.

One of the nastiest drawbacks of the gameplay is the console-like save system, which is based on checkpoints and does not allow you to save whenever you feel like it. It’s not much fun to replay some part of the game because you exited it before reaching the next checkpoint, but on the other hand, you learn to play cautiously, without hoping to save and reload at the hardest moments.

Call of Duty: World at War runs on the same engine as the previous game of the series, Modern Warfare, and does not look as impressive as Crysis, Far Cry 2 or S.T.A.L.K.E.R.: Clear Sky. However, it supports normal mapping, a dynamic lighting model with HDR effects, depth-of-field effects, and dynamic vegetation and water. Besides, the mentioned games have steep graphics subsystem requirements because of their highly detailed game worlds, whereas Call of Duty: World at War is far more modest in this respect. We’ll check out right now just how modest it is.

Testbed and Methods

To investigate the performance of contemporary graphics accelerators in Call of Duty: World at War we put together the following testbed:

According to our testing methodology, the drivers were set up to provide the highest possible quality of texture filtering and to minimize the effect of software optimizations used by default. We enabled transparent texture antialiasing, and we used multisampling mode for both graphics architectures, because ATI solutions do not support supersampling for this function. As a result, our ATI and Nvidia driver settings looked as follows:

ATI Catalyst:

Nvidia GeForce:

14 graphics cards and multi-GPU systems participated in our today’s performance test session. They can be split into three categories according to their price:

Premium/High-End category:

Performance-Mainstream category:

Mainstream category:

We ran the tests in all resolutions, including 2560x1600, only for the Premium category. The Performance-Mainstream category was limited to 1920x1200, and Mainstream solutions were tested at up to 1680x1050. The game doesn’t have strictly defined image quality profiles but allows manually adjusting a number of parameters as well as managing the resolutions of textures, normal maps and reflection maps. We arranged the image quality settings into 5 profiles in order to study the instantaneous performance. Here they are:

We used the last, maximum-quality profile to study the graphics cards’ gaming performance in Call of Duty: World at War. The test sequence included a run from the beginning of the game (from the military prisoner torture facility) to the first checkpoint and back. We used the Fraps utility version 2.9.6 to record the average and minimum frame rates. To minimize the measurement error, we took the average result of three runs for further analysis.
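The averaging described above can be sketched in a few lines of Python; the fps samples below are hypothetical, standing in for the per-second readings Fraps logs during a run:

```python
# Sketch of our result processing: for each of three runs we take the
# average and minimum fps, then average those figures across the runs
# to reduce the measurement error of any single run.

def summarize_runs(runs):
    """runs: list of per-run fps sample lists; returns (avg_fps, min_fps)."""
    per_run_avg = [sum(r) / len(r) for r in runs]
    per_run_min = [min(r) for r in runs]
    avg_fps = sum(per_run_avg) / len(per_run_avg)
    min_fps = sum(per_run_min) / len(per_run_min)
    return round(avg_fps, 1), round(min_fps, 1)

# Three hypothetical runs of per-second fps samples:
runs = [
    [62, 58, 71, 55, 66],
    [60, 57, 69, 54, 65],
    [63, 59, 70, 56, 67],
]
print(summarize_runs(runs))  # → (62.1, 55.0)
```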


Premium/High-End Category

There are no clear leaders at 1280x1024 but, judging by the bottom speeds, the GeForce GTX 280 SLI tandem is better while the single GeForce GTX 280 is worse than the others. Anyway, the frame rate never sinks below 47fps even with the slowest card while the average speed is higher than 60fps.

The single GeForce GTX 280 loses ground at the higher resolution. It is obviously the slowest of the top-performance solutions, having only one GPU with 240 shader processors. Interestingly, the ATI Radeon HD 4870 3-way CrossFireX subsystem has a somewhat lower bottom speed than the single Radeon HD 4870 X2. It must be due to the overhead of synchronizing the three GPUs. At the same time, the third GPU does not provide any advantage in terms of average speed at 1680x1050 on our testbed (with today’s fastest Intel Core i7-965 Extreme Edition processor).

The Radeon HD 4870 X2 has to slow down at 1920x1200, even though its average frame rate doesn’t drop dramatically. Every graphics subsystem can still provide a comfortable speed. The single GeForce GTX 280 is out of play at 2560x1600, however: its bottom speed is below the critical level. The ATI Radeon HD 4870 X2 is quite fast at the highest resolution, but it is the top-end multi-GPU solutions that are contending to be the fastest. ATI’s solution wins by a small margin: you won’t even be able to perceive the difference between the Radeon HD 4870 3-way CrossFireX and the GeForce GTX 280 SLI with the naked eye. Nvidia’s solution is better in terms of noise, temperature and power consumption, requiring up to 360 watts of power, whereas ATI’s 3-way solution has a peak power draw of almost 400 watts. The new GeForce GTX 295 looks even better from this standpoint since it is close to the GeForce GTX 280 SLI tandem in speed but lacks the latter’s drawbacks.

It is easy to make a recommendation in this category. If you care about your acoustic comfort and do not play at resolutions higher than 1920x1200, you are going to be satisfied with the GeForce GTX 280. But if you’ve got a 30-inch monitor, you may want to consider buying a Radeon HD 4870 X2 or GeForce GTX 295. The multi-GPU subsystems built out of multiple graphics cards are not worth the trouble of assembling them. They are expensive, unwieldy, uneconomical and very hot, and don’t even make up for these deficiencies with much higher performance in comparison with the Radeon HD 4870 X2 or GeForce GTX 295. They may only be of some help if you want to use a higher level of antialiasing than the classic 4x MSAA.

Performance-Mainstream Category

The picture is not that clear in this product category. Nvidia’s solutions are superior at low resolutions, the outdated GeForce 9800 GTX+ being almost as fast as the more advanced GeForce GTX 260 Core 216 and even faster in terms of bottom speed. It seems that the frame rate of Call of Duty: World at War depends on the clock rate of the GPU’s execution and texture-mapping subunits rather than on their number. The low results of the Radeon HD 4850 support this theory, as this GPU has the lowest clock rate, only 625MHz.
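A back-of-envelope comparison illustrates the point; this is our own arithmetic based on the cards’ published core clocks and texture-unit counts (worth double-checking against the official specifications), not a measured result:

```python
# Rough texel fillrate estimate: core clock times the number of texture
# units, expressed in gigatexels per second. Published specs assumed:
# GeForce 9800 GTX+ (G92b): 738MHz core, 64 TMUs;
# Radeon HD 4850 (RV770):   625MHz core, 40 TMUs.

def texel_fillrate(core_mhz, tmus):
    """Theoretical peak texel fillrate in GTexels/s."""
    return core_mhz * tmus / 1000.0

cards = {
    "GeForce 9800 GTX+": texel_fillrate(738, 64),
    "Radeon HD 4850":    texel_fillrate(625, 40),
}
for name, rate in cards.items():
    print(f"{name}: {rate:.1f} GTexels/s")
```

Even on paper the 9800 GTX+ holds nearly a twofold texturing advantage, which fits the gap we observed in this game.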

The Radeon HD 4870 1GB catches up with the GeForce 9800 GTX+ at 1680x1050, but Nvidia’s solutions win at 1920x1200 again. The Radeon HD 4850 looks like a clear outsider then as it cannot deliver a playable speed. Although its bottom speed is only 2-3fps lower than the desired minimum, we cannot be sure that it won’t drop even lower in action-heavy scenes.

It is not easy to give a specific recommendation in this product category. Although the GeForce 9800 GTX+ is ahead of the Radeon HD 4870 1GB, it uses an outdated GPU and may be far inferior to ATI’s card in other games. Moreover, its advantage is not big enough to be felt in practice: just a slightly higher average frame rate and a comparable bottom speed. In fact, the most difficult choice is between the Radeon HD 4870 1GB and the GeForce GTX 260 Core 216. The latter has a somewhat lower bottom speed but never sinks below the playable level even at 1920x1200. And it is also faster than its opponent in many other games. Thus, price must be the decisive factor here. Coming at an official price of $239, the ATI Radeon HD 4870 1GB looks appealing, but the GeForce GTX 260 Core 216 will be preferable if it costs the same or even somewhat more.

Mainstream Category

Nvidia’s GeForce 9800 GT is the winner in the mainstream category. It is far faster than its opponent Radeon HD 4830. The latter provides an acceptable frame rate at resolutions up to 1680x1050 inclusive, but has no reserve of speed at all.

The same goes for the Radeon HD 4670 and GeForce 9600 GT, except that these bottom-mainstream products cannot provide comfortable conditions for play at resolutions higher than 1280x1024 unless you turn full-screen antialiasing off.

Summing it up, Nvidia’s solutions are the best choice in the mainstream sector. You should only buy a GeForce 9600 GT if you want to save some money and do not have a monitor capable of working at resolutions higher than 1280x1024 pixels.

Instantaneous Speed and Image Quality

Our test results indicate quite clearly that Call of Duty: World at War prefers Nvidia’s GPUs, probably due to the high clock rates of their execution and texture-mapping subunits. Perhaps the number of raster back-ends also affects the speed of this game, although this parameter is usually unimportant for modern graphics cards in modern games.

To reveal any aberrations in the behavior of ATI’s and Nvidia’s architectures we performed a small additional test with two popular mainstream cards: the ATI Radeon HD 4850 and the Nvidia GeForce 9800 GTX+. We also benchmarked the cards with a less advanced CPU: a Core i7 920, the junior model of Intel’s new series, clocked at 2.66GHz and with a somewhat slower QPI (4.8GT/s against the 6.4GT/s of the senior model of the series). Officially priced below $300, this CPU is a good choice for a modern gaming platform of modest cost, so you may want to learn whether the use of Intel’s flagship CPU has a practical effect in CoD: World at War. And if it does, is it worth the $700 difference in price between the Core i7 920 and the Core i7 965 Extreme Edition?

We used the resolution of 1680x1050 for this test (the most widespread resolution among PC gamers today), recording instantaneous speed in the test scene for one minute. There were five different graphics quality profiles:

Besides recording the performance data, we also used Fraps 2.9.6 to capture a few screenshots. They will help us evaluate the difference in image quality between the mentioned profiles. So, here are the data we obtained:


Neither card is limited by the CPU at the highest graphics quality settings including 4x MSAA: there is no difference between the Core i7 965 Extreme Edition and the Core i7 920. The slump at 50 seconds on the Core i7 920 based system must be accidental. Moreover, the frame rate is never lower than 37fps, which is far above the minimum comfortable level.

We can also see the Nvidia solution enjoying a large advantage over its opponent, as you could see above when we tested performance-mainstream graphics cards. The visual quality is the highest in this mode thanks to full-screen antialiasing.


When we turn 4x MSAA off, the GeForce 9800 GTX+ is still far ahead, but there is a performance ceiling at about 91-93fps. Interestingly, this ceiling is not due to the system’s CPU because it remains the same irrespective of the CPU model. There must be some fundamental limitation imposed by the game engine. As we have already found out, no graphics subsystem, including the GeForce GTX 280 SLI and Radeon HD 4870 3-way CrossFireX, can deliver a frame rate higher than this limit.
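We can only speculate about the cause, but an engine-side frame-time floor would produce exactly this behavior: the cap stays the same no matter how fast the GPU or CPU is. A minimal simulation of such a cap (our own illustration, not the actual engine code, with the 93fps figure taken from our measurements):

```python
# Illustrative frame limiter: if the engine forces each frame to take at
# least 1/93 of a second, the fps can never exceed ~93 regardless of how
# quickly the graphics subsystem actually renders.

CAP_FPS = 93
MIN_FRAME_TIME = 1.0 / CAP_FPS  # engine-imposed frame-time floor, seconds

def run_frames(render_time, n_frames):
    """Simulate n_frames frames that each render in render_time seconds;
    return the resulting average fps under the frame-time floor."""
    total = 0.0
    for _ in range(n_frames):
        # A frame faster than the floor is padded (e.g. by a sleep) up to it.
        total += max(render_time, MIN_FRAME_TIME)
    return n_frames / total

# A GPU rendering a frame in 5 ms (200 fps potential) still hits the cap,
# while a slower 20 ms frame time passes through unaffected:
print(round(run_frames(0.005, 100)))  # → 93
print(round(run_frames(0.020, 100)))  # → 50
```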

The image quality is the same as in the previous mode save for micro-geometry such as wires, thin branches, etc. Such details of the scene look worse without full-screen antialiasing but the difference isn’t huge. We’d recommend this mode for playing the game on a Radeon HD 4850. Turning 4x MSAA off on this card helps raise the bottom speed from 30-31 to 45-50fps to ensure a reserve of speed for the action-heavy scenes the game abounds in.


When the game uses less detailed textures, the GeForce 9800 GTX+ hits the mentioned performance ceiling throughout the entire test, so there is no point in switching to such graphics quality settings on this or a more advanced card from Nvidia. We can also see the CPU affecting the frame rate for the first time here: the Core i7 920 system has wider fluctuations of speed, even though it never sinks below 54fps. The Radeon HD 4850 touches the ceiling occasionally and is never slower than 50fps.

The lower-quality textures of our High profile do not make the game look ugly. It is small details that suffer the most, particularly the relief created by means of normal maps, which can be clearly seen in the screenshots with the rifle. There is no point in using such settings on a Radeon HD 4850 or a better graphics card, though.


The Normal profile aims to achieve an acceptable quality of textures while turning off all the additional special effects. The result is not very eye-pleasing, though. It doesn’t make sense in terms of performance, either. The tested cards both reach the performance ceiling and are not slower than 90fps overall. There is no significant difference between the junior and senior Core i7 processors just like in the previous cases.


As expected, the Low profile looks awful even in comparison with Normal. The game is no fun with such visuals. You may only want to use such settings if you want to run the game on an integrated graphics core.

So, our test data suggest that the speed of Call of Duty: World at War is not limited by the CPU, at least if you use a good enough CPU. The flagship Core i7 model doesn’t seem to be worth the money asked for it because the junior model is just as good. As for image quality settings, you may want to use our High settings for entry-level cards such as Radeon HD 4670 or GeForce 9600 GT, but there is no point in choosing settings lower than Ultra for more advanced graphics solutions.


The WW2 theme has been explored in numerous video games, and we could hardly expect Call of Duty: World at War to be truly original. However, it is a well-made sequel to a popular series, which makes up for the lack of originality. The single-player mode is exciting and diverse thanks to its two main plotlines describing the Pacific campaign and the Eastern Front events, but the cooperative mode available in the PC version of the game is even more fun as you can fight shoulder to shoulder with your friends. We guess the main drawback of the single-player mode is the console-like checkpoint-based save system: you cannot save wherever and whenever you want to. On the other hand, some gamers appreciate this system for keeping the player under constant stress because you cannot easily replay a difficult stretch. The death match mode is almost the same as in Call of Duty: Modern Warfare and includes the same character upgrade system, which allows you to develop your skills, acquire new weapons, and open new game modes.

CoD: World at War is not a breakthrough when it comes to visuals, either. It uses the same rendering engine as CoD: Modern Warfare but looks good enough even for the year 2009 thanks to support for dynamic lighting, HDR, depth-of-field effects, normal mapping, etc. Additionally, the engine features the ability to calculate the penetration of bullets through specific materials, increasing the realism and making the gamer search for really solid shelter. You may regret trying to hide from enemy fire behind a thin wall, just like in the real world. The game looks nice at the highest graphics quality settings, especially together with full-screen antialiasing, but the modest settings of our High profile do not spoil the picture much. It is when you turn off the special effects that the visuals degrade greatly, making the game downright ugly. We don’t recommend doing that if you’ve got a Radeon HD 4670 or a better card.

As for the game’s preferences regarding graphics hardware, it runs the fastest on the GeForce GTX 295, which ensures superb results even at 2560x1600 with 4x MSAA. The Radeon HD 4870 X2 and Radeon HD 4850 X2 are not that impressive. However, the latter is appealing because it costs less but delivers almost the same speed as the senior model. The GeForce GTX 280 is very good at resolutions up to 1920x1200 inclusive. It is no worse than the Radeon HD 4870 X2/4850 X2 in the sum of its consumer properties considering its lower level of noise and power consumption.

Nvidia’s solutions are superior in the mainstream class, the GeForce GTX 260 being especially good. Although this graphics card is not much better than the GeForce 9800 GTX+ in sheer speed, its 55nm GPU boasts excellent electrical and thermal characteristics, surpassing the hotter and less economical Radeon HD 4870 1GB. The latter does ensure a comfortable frame rate at every resolution typically used by owners of mainstream graphics cards, though. The Radeon HD 4850 was slow in our tests. We guess its low results are due to its low core frequency. Besides, the game engine may not be optimized well for the superscalar design of the RV770’s execution subunits. The memory subsystem is not at fault here, as indicated by the results of the GeForce 9800 GTX+.

The GeForce 9800 GT is the best inexpensive graphics card that can be used for gaming. Unlike the Radeon HD 4830, it not only provided a good bottom speed at 1680x1050 with full-screen antialiasing but also ensured a generous reserve of average frame rate. The bottom-mainstream products were limited to 1280x1024 at the same graphics quality settings, and Nvidia’s solutions are superior there again. However, Call of Duty: World at War is not the only game available for play, so we’d recommend you consider the Radeon HD 4830 as well: our other tests suggest this card is competitive even with the GeForce 9800 GT, let alone the GeForce 9600 GT.

Our additional CPU-focused test did not reveal any difference between the two models of Intel Core i7 processors: the senior i7 965 Extreme Edition with a clock rate of 3.2GHz and the junior i7 920 with a clock rate of 2.66GHz and a cut-down QPI. Thus, there is no point in paying about $700 more for the CPU when it comes to playing Call of Duty: World at War. Moreover, we can recommend the Core i7 920 for a modern high-performance gaming platform as a CPU that offers all the features of Intel’s new series at a modest price of below $300.

We hope Call of Duty: World at War will run fast on your computer. A well-made WW2-focused shooter, it is perhaps not free from certain drawbacks, such as the checkpoint-based save system, but it will surely be appreciated by CoD fans and everyone interested in WW2-inspired video games.