Does this mean that NVIDIA's 16-bit precision is not valid?
ATI Technologies President and COO David Orton said last week that the internal Pixel Shader precision of ATI VPUs will move to 32-bit very soon. Currently, ATI's DirectX 9.0-supporting graphics processors provide only 24-bit internal precision for Pixel Shaders – enough for current shaders and fully compliant with Microsoft's requirements, but still lower than the generally recognized 32-bit processing precision.
Answering questions from the PC Watch web-site, David Orton said that his company does not plan to keep the internal precision of its Pixel Shader processing units at 24-bit. He said that ATI's hardware will soon get 32-bit internal precision, though he declined to comment on which VPU will implement the new feature. Given that Microsoft requires 32-bit internal precision for Pixel Shaders 3.0, due to arrive next year, it is logical to anticipate 32-bit internal precision in ATI's next-generation R420 graphics processor.
Currently, ATI's R300, R350, R360, RV350 and RV360 graphics processors, used in the RADEON 9500-, 9600-, 9700- and 9800-series graphics cards, support 24-bit internal precision for Pixel Shaders – the minimum precision Microsoft requires for DirectX 9.0. NVIDIA's GPUs support 16-bit and 32-bit internal precision for Pixel Shaders.
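To see why the bit width of internal precision matters, here is a minimal sketch of how rounding error grows as the mantissa shrinks. It uses Python's standard `struct` module to round a value through IEEE half precision (FP16, 10-bit mantissa, comparable to NVIDIA's low-precision mode) and single precision (FP32, 23-bit mantissa); ATI's 24-bit format (16-bit mantissa) is not a standard IEEE type, so it is not shown, but its error would land between the two.

```python
import struct

def round_to_fp16(x: float) -> float:
    """Round a Python float through IEEE half precision (10-bit mantissa)."""
    return struct.unpack('e', struct.pack('e', x))[0]

def round_to_fp32(x: float) -> float:
    """Round a Python float through IEEE single precision (23-bit mantissa)."""
    return struct.unpack('f', struct.pack('f', x))[0]

# 1/3 is not exactly representable in binary, so each format rounds it.
third = 1.0 / 3.0
err16 = abs(round_to_fp16(third) - third)   # on the order of 1e-4
err32 = abs(round_to_fp32(third) - third)   # on the order of 1e-8

print(f"FP16 rounding error: {err16:.2e}")
print(f"FP32 rounding error: {err32:.2e}")
```

In a long pixel shader, such per-operation rounding errors accumulate across dozens of instructions, which is why a wider internal format produces visibly cleaner results.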
ATI Technologies also does not rule out 64-bit internal precision, but notes substantial difficulties with its implementation.
Talking about the longer-term future, Orton indicated that Pixel and Vertex Shaders 4.x – a part of DirectX 10 and Longhorn – will be extremely flexible and will resemble each other closely in terms of formats. This means that graphics processing units will take yet another step closer to central processing units. Moreover, the execution units for Pixel Shaders and Vertex Shaders are anticipated to be the same. This will probably concern ATI's future generations of VPUs – the R500 or even the R600.
Orton warned against a too-rapid pace of innovation – according to the President of ATI, this may harm the industry more than benefit it. One serious problem with rapid technology improvement is the ability of software makers to keep the same pace and not fall a couple of steps behind the hardware. Another issue is the necessity of shrinking graphics chips to keep their cost at an affordable level. One more concern is the cost of memory: the 256-bit bus will be here to stay for a long time, and graphics companies will have to develop and utilize high-speed but cost-effective types of DRAM, including GDDR-II and GDDR3.
Basically, all the issues mentioned above mean that graphics technology cycles are very likely to last about 18 to 24 months, not 6 to 12 months as before, just as Mr. Orton indicated earlier this year. It also means that revolutions in hardware are unlikely to happen more frequently than once every 18 to 24 months.