

The two leading developers of standalone graphics processing units – ATI, the graphics product group of Advanced Micro Devices, and Nvidia Corp. – have both issued statements claiming that two recently released DirectX 10 benchmarks either favour competing hardware or degrade performance of their own graphics processors.

The Lost Planet and ATI

The first company to complain about the adequacy of benchmark results was AMD, which said that because Lost Planet: Extreme Condition is part of Nvidia’s “The Way It’s Meant to Be Played” (TWIMTBP) program, and because ATI’s specialists had no opportunity to tweak their drivers for the title, performance measurements obtained with the PC demo were not representative.

“Lost Planet is an Nvidia-sponsored title, and one that Nvidia has had a chance to look at and optimize their drivers for. The developer has not made us aware of this new benchmark, and as such the ATI Radeon driver team has not had the opportunity explore how the benchmark uses our hardware and optimize in a similar fashion.  Over the next little while AMD will be looking at this, but in the meantime, please note that whatever performance you see will not be reflective of what gamers will experience in the final build of the game,” a statement by Advanced Micro Devices said.

For some reason, Capcom, the creator of Lost Planet – a title which is also available on Microsoft’s Xbox 360 (a console whose graphics processor, called Xenos, was developed by ATI) and is also part of the Games for Windows program – decided not to respond to AMD’s statement.

Call of Juarez and Nvidia

Shortly after a demo version of Call of Juarez with DirectX 10 effects was released, Nvidia Corp. issued an announcement claiming that the new Call of Juarez DirectX 10 benchmark degrades performance of GeForce 8-series graphics processing units (GPUs) without improving speed or image quality on graphics processors from either ATI or Nvidia.

“The benchmark includes new code that substantially reduces performance of Nvidia hardware. Further, there are no perceivable improvements in visual quality, and ATI/AMD performance was not improved. […] We encourage you to take a long, hard look at this new Call of Juarez DX10 benchmark before deciding to use it, as it does not properly reflect typical DX10 performance on Nvidia GeForce 8 hardware,” a statement by Nvidia reads.

Ironically, Call of Juarez is also part of Nvidia’s TWIMTBP program. However, the game had been known to have issues with Nvidia GeForce hardware for quite some time, and, for unknown reasons, Nvidia remained tight-lipped about those issues and did not release updated drivers to correct them for months.

Nvidia and Techland Trade Accusations

Techland, the developer of Call of Juarez, decided to respond publicly to Nvidia’s accusations. In fact, the software maker in turn accused Nvidia of making misleading claims.

DirectX 10 and FSAA

First, Nvidia accused Techland of using a software FSAA resolve rather than the hardware approach, which automatically degrades performance on GeForce 8 hardware. The company did not provide figures for the performance drop.

“Nvidia’s DirectX 10 hardware-based multi-sampled AA resolve is disabled in the benchmark, with all Nvidia GPUs now being forced to use software-based AA resolve similar to the ATI DirectX 10 GPU, which lacks hardware AA resolve. This artificially deflates performance on Nvidia graphics cards and users are not getting the benefit of a key Nvidia hardware feature,” the claim by Nvidia reads.

In response, the software developer said that a software-based FSAA resolve that relies on pixel shaders allows the game developer to control image quality more precisely when HDR rendering is used. According to Techland, hardware MSAA resolve may no longer be adequate for modern games.

“Before the arrival of DirectX 10, previous graphics APIs only allowed automatic multi-sample anti-aliasing (MSAA) resolves to take place in interactive gaming applications. This automatic process always consisted of a straight averaging operation of the samples for each pixel in order to produce the final, anti-aliased image. While this method was adequate for a majority of graphic engines, the use of advanced high dynamic range rendering (HDR) and other techniques, such as deferred rendering (DR) or anti-aliased shadow buffers require programmable control over this operation due to the nature of the mathematical operations involved. This means that the previous approach using a simple average can be shown to be mathematically and visually incorrect (and in fact it produces glaring artifacts on occasions),” explained a spokesperson for Techland.

“All DirectX 10 graphics hardware which supports MSAA is required to expose a feature called ‘shader-assisted MSAA resolves’ whereby a pixel shader can be used to access all of the individual samples for every pixel. This allows the graphics engine to introduce a higher quality custom MSAA resolve operation. The DirectX 10 version of ‘Call of Juarez’ leverages this feature to apply HDR-correct MSAA to its final render, resulting in consistently better anti-aliasing for the whole scene regardless of the wide variations in intensity present in HDR scenes. Microsoft added the feature to DirectX 10 at the request of both hardware vendors and games developers specifically so that we could raise final image quality in this kind of way, and we are proud of the uncompromising approach that we have taken to image quality in the latest version of our game,” the explanation by game developer states.
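The distinction Techland describes can be sketched numerically: a hardware resolve averages the raw HDR samples and tone-maps the result, while an HDR-correct shader resolve tone-maps each sample before averaging. The following is an illustrative Python sketch, not the game’s actual shader code; the sample values and the simple Reinhard-style tone-mapping operator are assumptions chosen to make the effect visible.

```python
def tonemap(x):
    """Simple Reinhard-style operator mapping HDR intensity into [0, 1)."""
    return x / (1.0 + x)

# Four MSAA samples at an edge pixel: three dark samples, one very bright one
# (e.g. sky seen past a dark silhouette in an HDR scene).
samples = [0.1, 0.1, 0.1, 16.0]

# Naive hardware-style resolve: average the raw samples, then tone-map.
naive = tonemap(sum(samples) / len(samples))

# Shader-assisted ("HDR-correct") resolve: tone-map each sample, then average.
correct = sum(tonemap(s) for s in samples) / len(samples)

print(round(naive, 3), round(correct, 3))  # naive ~0.803, correct ~0.303
```

The single bright sample dominates the naive average, so the whole edge pixel comes out nearly white instead of mostly dark – the kind of “glaring artifact” the spokesperson refers to.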

A Hidden Parameter Degrades Nvidia GeForce Performance

Nvidia also said that a “hidden” parameter in the Call of Juarez DirectX 10 version actually degrades performance of Nvidia hardware without improving image quality.

“A hidden ‘ExtraQuality’ parameter only accessible via a configuration file is automatically enabled when the benchmark is run, no matter what the setting in the configuration file. This setting has no apparent visual quality enhancement, but reduces Nvidia GPU performance,” the statement by the graphics chip developer reads.

According to Techland, “ExtraQuality” is a visual quality setting enabled by default in the DX10 version of Call of Juarez. In benchmark mode, “ExtraQuality” does two things:

  • First, it increases shadow generation distance in order to apply shadowing onto a wider range of pixels on the screen, resulting in better quality throughout the benchmark run.
  • Second, it increases the number of particles rendered with the geometry shader in order to produce more realistic-looking results, such as waterfalls, smoke and falling leaves.

“ExtraQuality is designed as a default setting to reflect the visual improvements made possible by DX10 cards and is not meant to be disabled in any way,” the game developer claims.

Changes to Shaders – The Root of All Evil

Nvidia also accused Techland of making changes to shaders present in Call of Juarez, which allegedly resulted in a performance drop of up to 14% for Nvidia’s GeForce.

“Changes to shaders that deflate Nvidia performance by approximately 10% – 14%, without improving visual quality on either Nvidia or ATI GPUs,” reads the statement by Nvidia’s marketing department.

But the game developer states that all shader updates made in the final version of the Call of Juarez DirectX 10 benchmark were meant to improve performance, visual quality, or both – for example, to allow anisotropic texture filtering on more surfaces than before.

“This includes the use of more complex materials for a wider range of materials. At the same time we implemented shader code to improve performance on the more costly computations associated with more distant pixels. Some materials were also tweaked in minor ways to improve overall image quality. One of the key strengths of Nvidia’s hardware is its ability to perform anisotropic filtering at high performance, so we are puzzled that Nvidia complains about this change when in effect it plays to their strengths,” the spokesman for Techland said.

Nvidia Wants Better Shadows

The graphics chip company also said that Techland had intentionally decided to play down the strength of Nvidia hardware.

“Default settings for the new benchmark have shadow quality set to ‘Low’. Note that Nvidia hardware is much stronger in shadow-mapping than the competition,” reads the claim by Nvidia.

Techland said that even though it enabled “ExtraQuality” by default, setting shadow quality to “Low” was dictated by the need to provide a “good user experience”.

“Default settings were chosen to provide an overall good user experience. Users are encouraged to modify the settings in the ‘CoJDX10Launcher’ as required. Using larger shadow maps is one option that we would encourage users to experiment with, and in our experience changing this setting does not affect Nvidia’s comparative benchmark scores greatly,” the statement says.

Disappointment Everywhere

“Altogether, the code changes present in the new Call of Juarez benchmark only slow down Nvidia hardware significantly while contributing no discernible improvement in visual quality on Nvidia or ATI hardware,” the statement by Nvidia reads.

“We are disappointed that Nvidia have seen fit to attack our benchmark in any way. We are proud of the game that we have created, and we feel that Nvidia can also be proud of the hardware that they have created. Nonetheless these artistic decisions about the settings of a game rightly belong in the hands of the games developer, not the hardware manufacturer,” explained the upset software maker.

Benchmarks Vs. Games

Nvidia, which is currently the No. 1 supplier of discrete GPUs, behaved in a similar way back at the dawn of the DirectX 9 era with Futuremark’s 3DMark03 benchmark, claiming that the software developer had intentionally shown GeForce FX hardware in a bad light without any grounds – a claim the developer disputed. ATI, for its part, merely improved its scores by 1.9% at that time.

But both ATI and Nvidia are much more eager to comment on benchmark results and to improve scores in widely used industry benchmarks than to work carefully with software developers to ensure that their applications run properly and expose all the capabilities of modern graphics processing units.

The ATI Radeon HD 2900 XT shows poor performance in a number of popular games, including S.T.A.L.K.E.R.: Shadow of Chernobyl, Supreme Commander and some others. Moreover, at the launch of the Radeon HD 2900 XT, none of the company’s publicly released drivers actually supported the highly-advertised acceleration of HD DVD content encoded with the VC-1 codec. Yet AMD does not issue statements saying that its hardware runs too slowly in those titles, or explanations of why the previous-generation Radeon X1000 does not accelerate HD content properly. Nonetheless, the firm decided to make a claim about the inadequacy of the Lost Planet demo.

Nvidia GeForce 8 hardware has had issues with Call of Juarez, Splinter Cell: Double Agent and some other titles, including F.E.A.R. and Prey, where the GeForce 8800 GTS 640MB performs in line with or slower than the Radeon X1950 XTX. But the firm only decided to issue a public comment when it came to a benchmark that should show the GeForce 8 and the Radeon HD 2000 in all their glory. The company also did not update its ForceWare driver for Microsoft Windows XP for half a year, causing concerns about the lack of support for the highly-advertised GeForce 7950 GX2 board and SLI technology overall, yet it decided not to “disturb” anyone with improvements for previous-generation hardware. It is also unclear whether the firm plans any updates to support the much-discussed quad SLI technology for those users of high-end machines who plan to migrate to the Microsoft Windows Vista operating system.

Strangely, even now Nvidia advises using non-WHQL drivers to benchmark Lost Planet: Extreme Condition, while ATI provides non-WHQL release candidate 2 drivers for testing the Radeon HD 2400- and 2600-series graphics cards. To ATI’s credit, a WHQL driver for the latter boards is scheduled to arrive next month, in July, whereas its arch-rival has not managed to get an appropriate WHQL driver (version 158.43) out of the door more than a month after it released the Lost Planet: Extreme Condition demo on its nZone web-site.

Obviously, benchmark results and review wins drive the businesses of ATI and Nvidia, primarily in the high-margin retail market, but maybe it is time to start carefully ensuring stability, image quality, high performance and user experience in real-world situations?

