
DirectX 10.1

It’s not the first time in ATI Technologies’ history that the company supports a standard that hasn’t yet become really widespread. For example, the ATI Radeon 8500 supported the Pixel Shader 1.4 specification, part of DirectX 8.1. That specification was more flexible than the Pixel Shader 1.0/1.1/1.3 supported by Nvidia’s chips and helped achieve better visual effects, yet it never really took off during the lifecycle of the Radeon 8000 series because support among contemporary GPUs was limited. The same story happened with Shader Model 2.0b, supported by the ATI Radeon X700/X800/X850 series, as well as with Shader Model 2.0a, which Nvidia promoted with its GeForce FX. DirectX 9 Shader Model 3.0 only became more or less popular after games began to be ported from the Microsoft Xbox 360 and Sony PlayStation 3 consoles, i.e. after the lifecycle of the GeForce 6 and early GeForce 7 series.

It’s all quite natural: game developers always work with the most widespread standards. They aren’t interested in using capabilities provided by GPUs from only one maker, be it ATI or Nvidia. A new standard only becomes popular and widely used when it is supported by a large number of contemporary GPUs and when these GPUs deliver high enough performance to make the new features practical.

When Nvidia released the first GPU to support DirectX 9.0c, the NV40, the features provided by the new standard (Shader Model 3.0) went unused for a long time, not only due to the lack of support from ATI Technologies but also due to the relatively low performance of the GeForce 6800 series. It was only with the third generation of SM3.0-compatible GPUs from Nvidia (GeForce 7600/7900) and the first generation of such products from ATI (Radeon X1000) that the standard truly arrived: DirectX 9 Shader Model 3.0 was supported by both leading GPU developers, whose products delivered enough performance to use it, as well as by the two leading gaming consoles.

Currently, there is a transition from DirectX 9.0c to DirectX 10 as Windows Vista is being promoted as a gaming platform. It’s going to take some time for DirectX 10 to become the dominant standard in the PC game industry, but AMD/ATI is already offering DirectX 10.1 support in its new graphics core. Let’s see what prospects it may have.

According to unofficial sources, DirectX 10.1 is going to be the first and only superset of DirectX 10. It will be officially added to Windows Vista as part of Service Pack 1, to be released in the first half of 2008. The main innovations in DirectX 10.1 include:

  • Cube map arrays allow global scene illumination to run at a good speed by addressing several cube maps in a single rendering pass. This global lighting method involves the calculation of indirect scattered light, refractions, soft shadows, and more accurate reflections.
  • Improved deferred rendering and FSAA: support for independent blending modes for each MRT; mandatory support of 4x MSAA; the ability to write the pixel coverage mask from a shader; sample pattern selection; multi-sampling buffer fetch; support for filters that determine which pixels require antialiasing.
  • Increased number of vertex shader registers: 32 as opposed to 16 in the DirectX 10 specification.
  • Gather4 support: similar to the Fetch4 feature of the ATI Radeon X1000, it allows fetching a 2x2 block of texels from a single-component texture in one instruction. It accelerates the processing of shadow maps and improves the quality of shadowing (see the sketch right after this list).
  • Improved blending and filtering techniques: support for the LOD instruction that returns the level of detail for a filtered texture sample; INT16 for blending and FP32 for filtering (as opposed to INT8 and FP16 in DirectX 10).
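
To make the Gather4 feature more concrete, here is a CPU-side sketch of its semantics: a minimal C++ illustration with names of our own invention, not real API calls. On DX10.1 hardware this is a single shader instruction, and the API defines the exact ordering of the four returned values; the ordering below is illustrative only.

    #include <cmath>
    #include <algorithm>

    // Hypothetical single-channel texture: 'texels' holds width*height floats.
    struct Texture2D {
        const float* texels;
        int width, height;
    };

    struct Float4 { float x, y, z, w; };

    // Sketch of Gather4 semantics: fetch the 2x2 texel block that a bilinear
    // fetch at (u, v) would touch, in one call instead of four point samples.
    Float4 gather4(const Texture2D& tex, float u, float v)
    {
        // Shift by half a texel to land on the same 2x2 block bilinear filtering uses.
        float tx = u * tex.width  - 0.5f;
        float ty = v * tex.height - 0.5f;
        int x0 = (int)std::floor(tx), y0 = (int)std::floor(ty);

        // CLAMP addressing mode at the texture border.
        int x1 = std::min(x0 + 1, tex.width  - 1);
        int y1 = std::min(y0 + 1, tex.height - 1);
        x0 = std::max(x0, 0);
        y0 = std::max(y0, 0);

        Float4 r;
        r.x = tex.texels[y1 * tex.width + x0];   // texel offset (0, 1)
        r.y = tex.texels[y1 * tex.width + x1];   // texel offset (1, 1)
        r.z = tex.texels[y0 * tex.width + x1];   // texel offset (1, 0)
        r.w = tex.texels[y0 * tex.width + x0];   // texel offset (0, 0)
        return r;
    }

A percentage-closer shadow filter can then compare all four depths against the pixel’s light-space depth at once, where plain DirectX 10 would need four separate point-sampled fetches of the shadow map.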

That’s quite a long list of innovations. The techniques for accelerating global scene lighting are especially exciting as they can improve the quality of lighting in games dramatically. But are we going to see these new capabilities of the RV670 in real games during the lifecycle of the ATI Radeon HD 3000 series? Obviously not. First, they can only be utilized after the release of Vista Service Pack 1. Second, the current generation of Nvidia’s GPUs, even the latest G92, does not support cube map arrays and is unlikely to be able to calculate global scene lighting fast enough. The rest of the DirectX 10.1 features do not look like a breakthrough, being either improvements of corresponding DirectX 10 features or standardizations such as the mandatory support of 4x MSAA or standard subpixel masks for antialiasing.
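
Since DirectX 10 and 10.1 hardware will coexist for quite a while, an engine cannot simply assume the new capabilities are present: it has to probe for the higher feature level at run time and fall back to plain DirectX 10 otherwise. Below is a minimal sketch using the Direct3D 10.1 entry point that ships alongside Vista Service Pack 1, with error handling trimmed for brevity:

    #include <d3d10_1.h>
    #pragma comment(lib, "d3d10_1.lib")

    // Try to create a Direct3D 10.1 device; fall back to the 10.0 feature
    // level when the GPU (or its driver) does not expose DirectX 10.1.
    ID3D10Device1* CreateBestDevice()
    {
        const D3D10_FEATURE_LEVEL1 levels[] = {
            D3D10_FEATURE_LEVEL_10_1,   // e.g. RV670 / Radeon HD 3000
            D3D10_FEATURE_LEVEL_10_0,   // e.g. G80/G92, R600
        };

        ID3D10Device1* device = NULL;
        for (int i = 0; i < 2; ++i) {
            HRESULT hr = D3D10CreateDevice1(
                NULL,                        // default adapter
                D3D10_DRIVER_TYPE_HARDWARE,
                NULL, 0,                     // no software rasterizer, no flags
                levels[i],
                D3D10_1_SDK_VERSION,
                &device);
            if (SUCCEEDED(hr))
                return device;               // highest level the hardware accepts
        }
        return NULL;                         // no DirectX 10-class hardware at all
    }

The engine can later branch on device->GetFeatureLevel() to decide whether cube map arrays, Gather4 and the other 10.1-only features may be used.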

So we are quite sure game developers will be indifferent to these innovations at least in the first half of this year. Games with support for DirectX 10.1 can be expected no sooner than the second half of 2008, or even later, when the market is filled with DX10.1-compatible solutions from both GPU developers. It is also possible that the Radeon HD 3000 series just won’t be fast enough to make DX10.1 features usable, and game developers will have to wait for the next, faster generation of GPUs from AMD and Nvidia. There is a piece of good news, though. ATI says DirectX 10.1 is the first and only superset of DirectX 10 planned by Microsoft, so it is bound to be put to use sooner or later, before DirectX 11 arrives.

 