by Anton Shilov
04/28/2010 | 09:53 PM
Several websites have revealed more details about the forthcoming third graphics card in the GeForce GTX 400-series family. As expected, the graphics board will be released on the 1st of June, 2010, and will carry a sub-$300 price tag; however, its performance will be somewhat lower than previously expected.
The GF100 (NV60, G300) graphics processor in the GeForce GTX 460 model will sport only three graphics processing clusters (GPCs) in total so as to maximize yields and minimize the price, according to media reports (1, 2). The more expensive models – the 470 and 480 – have all four GPCs active, but with one or two streaming multiprocessors disabled (each SM has 32 stream processors and 4 texture units). The less expensive model will either have one GPC disabled completely or will have four SMs deactivated across different GPCs. Additionally, Nvidia will disable two out of six memory controllers, two out of six raster operation (ROP) partitions and part of the L2 cache.
As a result, the GeForce GTX 460 graphics chip will have 384 stream processing units, 48 texture units, 32 ROPs, a 256-bit memory interface and 512KB of L2 cache. The graphics board itself will carry 1GB of GDDR5 memory, and since many manufacturers plan to use their own printed circuit boards for the product, it can be expected that memory speeds on different graphics cards will vary.
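The reported specifications follow directly from the unit counts above. As a rough sketch, assuming the per-unit figures implied by the article (32 stream processors and 4 texture units per SM, 4 SMs per GPC, 64-bit memory controllers and 8 ROPs per partition), the numbers work out as follows:

```python
# Hypothetical arithmetic behind the reported GTX 460 configuration.
# Per-unit counts are assumptions based on the GF100 breakdown described
# in the article, not an official Nvidia specification.

SPS_PER_SM = 32    # stream processors per streaming multiprocessor
TEX_PER_SM = 4     # texture units per SM
SMS_PER_GPC = 4    # SMs per graphics processing cluster

def gtx460_specs(gpcs=3, mem_controllers=4, rop_partitions=4):
    """Derive headline specs from the number of active units."""
    sms = gpcs * SMS_PER_GPC                      # 3 GPCs -> 12 SMs
    return {
        "stream_processors": sms * SPS_PER_SM,    # 12 * 32 = 384
        "texture_units": sms * TEX_PER_SM,        # 12 * 4  = 48
        "memory_bus_bits": mem_controllers * 64,  # 4 of 6 controllers -> 256-bit
        "rops": rop_partitions * 8,               # 4 of 6 partitions  -> 32
    }

print(gtx460_specs())
```

Running the sketch reproduces the article's figures: 384 stream processors, 48 texture units, a 256-bit bus and 32 ROPs.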
Previously it was expected that Nvidia would disable only three SMs, following the downgrade pattern of the GeForce GTX 480 and 470; however, it looks like in order to ensure high output, an affordable price and low power consumption, the company decided to deactivate either four SMs or a whole GPC. Potentially, in cases where the chip does not have many defects overall, deactivating a whole GPC may allow Nvidia to clock the product at higher frequencies.
The price of the novelty, according to the Fudzilla website, will be in the range of $279 to $299, which will put it directly against the ATI Radeon HD 5850. Time will tell which graphics solution is better.
Nvidia did not comment on the news story.