by Alexey Stepin, Yaroslav Lyssenko
07/10/2007 | 03:30 PM
It is no secret that Nvidia is currently the leading supplier of discrete graphics solutions with DirectX 10 support. The company’s products cover virtually the entire price range from below $100 to over $600. This firm standing comes largely as the result of a very aggressive marketing strategy. For example, the G80, Nvidia’s first GPU with DirectX 10 support, was announced as far back as November 8, 2006. Two G80-based graphics cards, GeForce 8800 GTX and GeForce 8800 GTS, were announced then, too (for details see our article called Directly Unified: Nvidia GeForce 8800 Architecture Review).
The arrival of the new family might have been called rather too hasty as it took place even before the official announcement of Microsoft’s new OS, Windows Vista. The first production batch of GeForce 8800 had the “wrong resistor” problem and was incompatible with some mainboards (for details see our article called 25 Signs of Perfection: Nvidia GeForce 8800 GTX in 25 Benchmarks). The drivers called for improvements, too. But most of those drawbacks were eventually corrected, and the GeForce 8800 GTX and GTS deservedly enjoyed the reputation of the best gaming graphics cards.
Since top-end graphics cards account for but a small share of the total sales volume, Nvidia later made inroads into the segments of more affordable solutions. On February 12, 2007, it launched the rather successful GeForce 8800 GTS 320MB. Then a second attack followed on the 17th of April with the announcement of the mainstream G84 and G86 GPUs and graphics cards based on them (GeForce 8600 and 8500).
As with the GeForce 8800 GTX, there were some problems with the newer models. In particular, the GeForce 8800 GTS 320MB suffered from inefficient memory management in the driver and would sometimes take a performance hit even in applications where 256MB of memory was quite enough. The GeForce 8600 GTS proved to be rather slow for its class. Still, the new solutions took their places on the market while AMD was only getting ready to introduce its own DirectX 10-compatible GPUs.
On the 2nd of May 2007, Nvidia added another model into the GeForce 8 line-up. That time it was a top-of-the-line solution. The luxurious GeForce 8800 Ultra was not a breakthrough since its performance was not much higher than that of the GeForce 8800 GTX, yet it proved Nvidia’s ability to produce the world’s fastest DirectX 10-compatible graphics cards.
AMD/ATI finally responded on the 14th of May, but the Radeon HD 2900 XT laid no claim to the title of the king of 3D. Our tests showed that it was generally slower than the GeForce 8800 GTX and, in some applications, even slower than the GeForce 8800 GTS (for details see our article called Almost a Champion: ATI Radeon HD 2900 XT Gaming Performance Review). The new card from AMD proved to be a good, if not the best, solution in the $399 price sector, but its high power consumption, driver-related problems, and belated arrival didn’t allow it to shake Nvidia’s positions. The less expensive Radeon HD 2600 XT and Radeon HD 2400 XT were announced along with the Radeon HD 2900 XT, but the company only began mass shipments of these GPUs by the middle of June, which played into Nvidia’s hands again.
The release of inexpensive RV610- and RV630-based products can surely affect the situation on the market of discrete DirectX 10 compatibles, but the consequences of this move by AMD will only become apparent closer to the fall of this year. Meanwhile, Nvidia goes on strengthening its positions on the market of inexpensive DirectX 10 graphics cards. On the 20th of June, the GeForce 8 series was complemented by yet another model, this one belonging to the entry-level price segment. The GeForce 8400 GS is expected to cost less than $89 and replace the outdated GeForce 7600 GS and 7300 GT.
In this review we’ll be talking about Nvidia’s products from the GeForce 8800 series. Their specs are listed in the following table:
As you can see, the GeForce 8 series indeed covers every price segment of the market, but there seems to be a gap between the GeForce 8600 GTS and the GeForce 8800 GTS 320MB. It should be filled with a solution priced at $229-299 that offers better performance than the GeForce 8600 GTS and can rival the Gemini Radeon HD 2600 XT and/or an even more advanced GPU from ATI. Perhaps we’ll see such a solution by the end of this year.
Here is a list of the graphics cards we’ll test today:
We’ll pay special attention to the GeForce 8600 GTS and GeForce 8500 GT because the former is represented by a graphics card with a passive cooling system (Gigabyte GV-NX86S256H Silent Pipe 3) while the latter (represented by MSI NX8500GT-TD256E) is visiting our labs for the first time.
This product comes in Gigabyte’s traditional upright box.
Gigabyte’s graphics card packages are designed to match the included game: the GV-NX76T256D-RH model we reviewed earlier had a box with a Civilization IV design while this box has a picture from Supreme Commander. A sticker on the box reports that the graphics card has only modern solid-state capacitors on board. There is a main cardboard box inside the external colorful wrapping. Besides the graphics card, securely fixed in a polyurethane-foam tray, we found the following accessories:
There are not too many of them, yet everything you need to use the graphics card normally is there. Moreover, Supreme Commander was deservedly praised as Best PC Strategy Game of E3 2006 and is a candidate for the official games list of World Cyber Games 2007. So the user gets a premium-class gaming title besides a silent graphics card.
We didn’t find any flaws in the user manual, so this graphics card kit deserves our praise. However, it is not the accessories that make this graphics card special.
Nvidia and AMD/ATI do not require that mainstream graphics cards based on their GPUs be exactly compliant with the reference designs. This card from Gigabyte is a good illustration of this point because its PCB has a unique design.
The PCB, covered with a blue solder mask, is smaller than the reference one and hardly has any point of similarity with it at all. Even the memory chips are rotated by 90 degrees here. The power circuit is very simple since the card gets all the power it needs from the PCI Express slot alone and does not have an external power connector. The slot can deliver up to 75W while the power consumption of the GeForce 8600 GTS is much lower, not even reaching 50W. We only wonder why Nvidia put an additional power connector on the reference card when they could have done without it and made the whole design cheaper. We doubt that even an overclocked GeForce 8600 GTS can exceed the power delivery capacity of the PCI Express slot.
The card carries a revision 400-A2 G84 chip with support of HDCP and dated the 10th week of 2007. The GPU configuration is standard with 32 unified shader processors, 2 texture-mapping units with 8 address and 8 filter units in each, and 8 raster operators. The GPU is clocked at the reference frequencies of 675MHz for the main domain and 1450MHz for the shader domain. To be exact, the latter frequency is actually 1458MHz (which is exactly divisible by 9, 18, 27, and 54) due to the specifics of the GeForce 8 clock generator.
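The step size of that clock generator is not documented in the materials available to us, but every figure quoted in this review is consistent with the actual frequency being the nominal value rounded up to a multiple of 27MHz. Here is a minimal sketch under that assumption (the 27MHz step is our guess, not an Nvidia specification):

```python
import math

# Assumption: GeForce 8 clock domains run at the nominal frequency
# rounded up to the nearest multiple of a 27MHz base step. This fits
# the figures quoted in the text but is not confirmed by Nvidia.
STEP_MHZ = 27

def actual_clock(nominal_mhz: int, step: int = STEP_MHZ) -> int:
    """Round a nominal clock up to the next multiple of the base step."""
    return math.ceil(nominal_mhz / step) * step

print(actual_clock(1450))  # 1458 (8600 GTS shader domain, 27 x 54)
print(actual_clock(675))   # 675  (8600 GTS main domain, already a multiple)
```

The same rule reproduces the GeForce 8500 GT figures mentioned later in this review: 450MHz becomes 459MHz and 900MHz becomes 918MHz.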
Like the reference card, the Gigabyte is equipped with Samsung K4J52324QE-BJ1A memory working at 1.9V voltage. Four such 512Mb chips provide a total of 256MB of graphics memory with a 128-bit memory bus. The memory frequency is 1000 (2000) MHz, which is the standard for the GeForce 8600 GTS and the rated frequency of the chips.
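These figures translate into the following peak memory bandwidth (a back-of-the-envelope calculation from the numbers above, not a measured value):

```python
def peak_bandwidth_gb_s(clock_mhz: float, bus_width_bits: int,
                        data_rate: int = 2) -> float:
    """Peak memory bandwidth in GB/s for a double-data-rate memory bus."""
    return clock_mhz * 1e6 * data_rate * (bus_width_bits // 8) / 1e9

# 1000MHz (2000MHz effective) GDDR3 on a 128-bit bus
print(peak_bandwidth_gb_s(1000, 128))  # 32.0 GB/s
```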
The PCB design developed by Gigabyte allows installing an HDMI connector in place of the top DVI-I port. It is a logical solution since HDMI is currently gaining popularity, while the support for hardware decoding of the H.264 video format in the G84 and G86 processors makes them suitable for use in a home multimedia center. The Gigabyte card suits this application perfectly with its noiseless cooling system, which we will describe shortly.
Every connector, including the MIO interface, is covered with a protective plastic cap.
As opposed to the second generation of Gigabyte’s passive coolers we described in our Gigabyte GV-NX76T256D-RH review, every component of the Silent Pipe 3 system is located on one side of the PCB.
The system is based around a solid aluminum heatsink with thick diagonal ribs that also serves as a cooler foundation fastened to the PCB with four spring-loaded screws and an X-shaped plastic plate. This foundation carries a copper plate that has contact with the GPU die and directs the heat flow to the two heat pipes.
One pipe, S-shaped, is longer and is meant to distribute heat in the main heatsink more uniformly, besides transferring it to the second, auxiliary heatsink. The other pipe is shorter and transfers heat to the second heatsink. The first pipe has a soft pad that prevents the cooler from becoming misaligned. A yellow thermoplastic material is used as a thermal interface. Its efficiency is lower than that of traditional thermal grease, but this is not a problem considering the modest thermal characteristics of the G84 chip. The Silent Pipe 3 cooler does not cool the memory chips. The QE chips generate little heat even at frequencies of about 1000 (2000) MHz, probably due to reduced voltage.
The second heatsink consists of thin aluminum plates and a copper base. Like in the GV-NX76T256D-RH cooler, the plates extend out of the cooler casing through the slits in the mounting plate to improve overall cooling efficiency. With this design the cooler occupies two slots, but this is an acceptable tradeoff for total noiselessness. Note that the main heatsink extends slightly beyond the edge of the PCB. You should take this into account when choosing a system case if you want to make the GV-NX86S256H the heart of your multimedia system.
The box of this card looks like the box of the NX8600GTS, but the silvery color has been replaced with blue. Even the helmet of the fantastical warrior is now painted blue.
The package itself is simple: an ordinary cardboard box. As opposed to more expensive cards from MSI, there is no protective foamed-plastic tray here. Everything is laid out in cardboard compartments and the card is additionally wrapped in an antistatic pack. Besides the card, the box contains:
Entry-level graphics cards, like this one from MSI, come with a minimum of accessories, and that is just what we have here: the accessories are scanty, yet include everything you need to use the card. There is only one DVI-I → D-Sub adapter because the card is equipped with only one DVI-I port. The brief user manual is designed like a large colorful poster. It is not a very convenient format, but you’ll get the necessary information from it.
Unfortunately, this graphics card does not support HDCP, so you’ll have to use a VGA cable to watch HDCP-protected Blu-ray or HD DVD discs on your LCD panel, which is going to have a negative effect on image quality at resolutions like 1920x1080. You can use the SlySoft AnyDVD HD software to disable the protection instead, but this means spending $79, which negates the price difference between the GeForce 8500 GT and more advanced graphics card models.
The packaging and accessories of the MSI NX8500GT are good for an entry-level graphics card. Let’s now take a closer look at the product considering that it is the first GeForce 8500 GT to ever come to our test labs.
The NX8500GT uses the reference PCB, as indicated by the Nvidia mark, but the card is obviously assembled at MSI’s own facilities because it employs a red solder mask.
Since the G86 consumes even less power than the G84, the power circuit is very simple here. There is no additional power connector, as the load capacity of the PCI Express slot is going to be more than enough for this graphics card. The wiring of the PCB is rather sophisticated, though, due to the use of DDR2 memory.
Hynix HY5PS56162A chips are organized as 16Mx16 and have a capacity of 256Mb each. Eight such chips provide a total of 256MB of graphics memory accessed across a 128-bit memory bus. The chips work at 1.8V voltage and have a rated frequency of 400 (800) MHz. This is the frequency the chips are indeed clocked at by the card, in compliance with Nvidia’s official specifications.
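The memory subsystem figures above are easy to sanity-check (all input values are taken from the text; this is just arithmetic, not a measurement):

```python
# Sanity check of the NX8500GT memory subsystem figures quoted above.
chips = 8
chip_capacity_mbit = 256   # 256Mb per chip
chip_width_bits = 16       # 16Mx16 organization

total_mb = chips * chip_capacity_mbit // 8         # megabits -> megabytes
bus_width_bits = chips * chip_width_bits           # combined bus width
# 400MHz DDR2 transfers data twice per clock
bandwidth_gb_s = 400e6 * 2 * (bus_width_bits // 8) / 1e9

print(total_mb, bus_width_bits, bandwidth_gb_s)  # 256 128 12.8
```

So eight 16Mx16 chips indeed yield 256MB on a 128-bit bus, with a peak bandwidth of 12.8GB/s at the rated 400 (800) MHz frequency.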
The die area of the G86 chip is considerably smaller than that of the G84, as you can expect considering the smaller number of transistors: 210 million against 289 million, respectively. There is no protective frame around the die, just like on G84-based cards. Our graphics card carries a revision 300-A2 G86 chip dated the 4th week of 2007. The GPU contains 16 unified shader processors, 1 TMU with 8 address and 8 filter units, and 8 raster operators. The official GPU clock rates of the GeForce 8500 GT model are 450MHz and 900MHz for the main and shader domains, respectively, but the real frequencies are 459MHz and 918MHz due to the specifics of the clock generator.
Like the reference GeForce 8500 GT, the MSI NX8500GT lacks a MIO connector, but supports SLI technology, even though we doubt many people will use this opportunity. The two cards communicate via the PCI Express bus in SLI mode.
The card has one DVI-I port and one D-Sub port. It also offers a 7-pin mini-DIN connector for display devices with analog video inputs (S-Video, RCA or YPbPr).
The GPU is cooled by a simple round cooler with circular ribbing.
Its “copper” color shouldn’t deceive you: the cooler was milled out of a chunk of aluminum and then painted. The heatsink is blown at by a small translucent fan with a 2-pin connection. It is almost silent because the G86 doesn’t need much cooling. The cooler is secured on the PCB with two plastic spring-loaded clips.
To test the performance of the Nvidia GeForce 8 graphics card family, we assembled the following standard test platform:
Since we believe that the use of trilinear and anisotropic filtering optimizations is not justified in this case, the AMD and Nvidia graphics card drivers were set up to provide the highest possible quality of trilinear and anisotropic texture filtering. We also enabled antialiasing of transparent textures to achieve the best image quality by selecting Transparency antialiasing in multisampling mode in the Nvidia ForceWare drivers. As a result, our settings looked as follows:
We selected the highest possible graphics quality level in each game using the standard tools provided by the game itself. The games’ configuration files weren’t modified in any way. Performance was measured with the games’ own tools or, if those were not available, manually with the Fraps utility (version 2.8.2). We also measured the minimum speed of the cards where possible.
We performed tests in 1280x1024/960, 1600x1200 and 1920x1200 resolutions. GeForce 8500 GT was tested in the most widely spread 1280x1024 resolution only.
The games that didn’t support a 16:10 aspect ratio were run at 1920x1440. We used the “eye candy” mode wherever it was possible without disabling HDR or Shader Model 3.0. Namely, we ran the tests with anisotropic filtering as well as 4x MSAA enabled, turning them on from the game’s menu. If that was not possible, we forced them using the appropriate ForceWare driver settings.
For our tests we used the following games and benchmarks:
The GeForce 8800 Ultra and GeForce 8800 GTX easily solve the task of delivering comfortable gaming conditions in every resolution, including 1920x1200. Note that there is no difference between the two cards in lower resolutions. The GeForce 8800 GTS and the GTS 320MB can be used for playing at 1600x1200 or, perhaps, even at 1920x1200 as their frame rates are never lower than 40fps. The 640MB model is 12% faster than the cheaper 320MB version in the highest of the tested resolutions.
The GeForce 8600 GTS cannot be used in resolutions above 1280x1024 unless you disable 4x MSAA. The hardware capabilities of the G84 are far inferior to those of the full-fledged G80. The GeForce 8500 GT is too slow to be used in modern games at highest graphics quality settings.
This is a very resource-consuming game and the resolution of 1600x1200 is only playable with comfort on the GeForce 8800 GTX and Ultra cards, between which the difference is small. The owner of any version of the GeForce 8800 GTS will have to content himself with playing at 1280x1024; it is only in this resolution that the minimum speed stays near the critical 25fps mark. The GeForce 8600 GTS and 8500 GT are too slow in this test. As for the resolution of 1920x1200, even the GeForce 8800 Ultra looks humble in it: its average speed of 37fps with slowdowns to 22fps is not quite comfortable for playing a dynamic first-person shooter.
Since the game has a frame-rate limiter, you should consider the minimum speeds of the cards in the first place. This parameter determines your playing comfort in Command & Conquer 3.
The seniors of the family are going to deliver highest performance in every resolution. The GeForce 8800 GTS and GeForce 8800 GTS 320MB can’t match the leaders and their speed bottoms out to below 25fps in high resolutions. The GeForce 8600 GTS allows playing this game with comfort in 1280x1024 only, if you use full-screen antialiasing. The GeForce 8500 GT has low performance and you can’t play normally on it unless you lower the resolution, disable FSAA, and reduce the level of detail.
We tested the game in the pure speed mode with enabled anisotropic filtering only, because it has problems when you turn on FSAA.
The senior GeForce 8 models deliver superb performance, making all resolutions, including 1920x1200, playable with comfort. The Ultra model ensures an additional reserve of speed in the highest resolution. The less advanced models are limited to 1600x1200 as their minimum speed is too low for comfortable play in the higher display modes.
The GeForce 8600 GTS isn’t much worse than the GeForce 8800 GTS at 1280x1024 but slows down suddenly in the next resolution as it has few shader processors and low-performance TMUs. Anyway, it does well for a mainstream graphics card. The characteristics of the GeForce 8500 GT are inferior even to those of the G84-based card and it cannot satisfy a gamer who wants to have the best image quality possible.
The current version of Gothic 3 does not support FSAA, so we benchmarked the cards using anisotropic filtering only.
The requirements of this game genre to the minimum and average speed are not as strict as those of first-person shooters, and every GeForce 8800 can be used to play at any resolution, including 1920x1200. The senior models ensure a bigger reserve of speed in the highest display mode, but the juniors provide comfortable playing conditions as well.
The GeForce 8600 cannot give you an acceptable speed even in 1280x1024 due to a lack of computing power and low performance of its TMUs. The 16 TMUs of the G84 chip seem to be equivalent to 8 classic TMUs in performance. The GeForce 8500 GT fails this test: it cannot reach even 10fps in the lowest of the tested resolutions.
The game’s support for HDR is still deficient, therefore we tested the cards at the eye candy settings with 4x FSAA.
It’s like in the previous test: the GeForce 8800 GTX and Ultra easily cope with every tested resolution and are likely to deliver high performance in the yet-exotic 2560x1600 resolution as well. Every version of the GeForce 8800 GTS allows using the native resolution of a 23” or larger LCD monitor, although the speed may occasionally drop below 15fps, which is the critical mark for Neverwinter Nights 2.
The GeForce 8600 GTS has problems with texturing speed again and its average speed is below the desired 15fps even at 1280x1024. The GeForce 8500 GT can’t run this game normally at all. We doubt that its speed can be improved much by disabling FSAA and switching to lower resolutions whereas the reduction of the level of detail makes the game less appealing visually and, accordingly, less fun to play.
We try to get the best graphics quality from each game, so we chose HDR + 16x anisotropic filtering over FSAA because this game does not support FSAA and FP HDR simultaneously. The game cannot run in widescreen modes unless you modify its .INI files, so we tested it in display modes with a 4:3 aspect ratio.
Although the GeForce 8800 GTX and Ultra are faster than the GeForce 8800 GTS/GTS 320MB, they are all limited to 1600x1200. The senior models offer a bigger reserve of speed, but the junior ones are never slower than 30fps. The GeForce 8600 GTS has an average speed of over 30fps at 1280x1024, yet its minimum speed is below 20fps, which is uncomfortable. The GeForce 8500 GT performs even worse, not reaching 20fps at all. You just can’t play normally at such a low speed.
The game doesn’t support FSAA when you enable the dynamic lighting model, but loses much of its visual appeal with the static model. So, we benchmarked the cards in S.T.A.L.K.E.R. using anisotropic filtering only.
The GeForce 8800 Ultra enjoys a substantial advantage over the GeForce 8800 GTX here. It may not mean much in resolutions up to 1600x1200, but the 16% bonus at 1920x1200 won’t be superfluous considering how important it is to aim precisely in S.T.A.L.K.E.R. Both versions of the GeForce 8800 GTS slow down at 1600x1200, so you had better use 1280x1024 if you’ve got one of them.
The less advanced graphics cards from the Nvidia GeForce 8 series don’t suit this game at all. You may try to get an acceptable speed on a GeForce 8600 GTS by disabling the dynamic lighting model, yet the game will look much poorer as a consequence.
The GeForce 8800 series cards can all ensure a comfortable level of performance here. The min speed of the junior models is never lower than 37-38fps even at 1920x1200. This is more than enough for any real-time strategy (you can recall Command & Conquer with its speed limiter set at 30fps). Note also the very small delta of min and average speed of the GeForce 8800 GTS and GTS 320MB in high resolutions.
The results of the weakest GeForce 8 varieties aren’t so brilliant. The GeForce 8600 GTS allows playing in 1280x1024 more or less comfortably, but the GeForce 8600 GT is unlikely to maintain the same speed. The GeForce 8500 GT fails yet another test. This graphics card is obviously not meant for gamers.
The GeForce 8800 GTS and GTS 320MB are limited to 1600x1200 whereas the GeForce 8800 Ultra and GTX deliver enough performance for you to play in the higher resolution as well.
You can try to use your GeForce 8600 GTS to play at a resolution of 1280x1024 but occasional slowdowns to below comfortable level are unavoidable. You’ll probably choose to disable FSAA or lower the resolution. The GeForce 8500 GT is only half as fast as the GeForce 8600 GTS and is far from providing an acceptable speed.
Looking through the overall scores, you can see that the GeForce 8600 GTS is slower than the GeForce 8800 family while the GeForce 8500 GT has a very poor result. Although there was only one instance, namely at 1920x1200 in Battlefield 2142, when the GeForce 8800 GTS 320MB was considerably slower than the 640MB version of the same card, these two differ noticeably in 3DMark06 even at 1280x1024 with FSAA disabled. Note also that the GeForce 8800 Ultra scores over 11,000 points.
The above-mentioned advantage of the GeForce 8800 GTS over the GeForce 8800 GTS 320MB can be seen in the SM2.0 tests only, owing to the first test, which is rich in high-resolution textures. The GeForce 8600 GTS falls farther behind the GeForce 8800 GTS in the SM3.0/HDR tests, which are more computation-heavy than the SM2.0 group. The GeForce 8500 GT cannot score even 1000 points in either case.
Our supposition about the first test is confirmed: the GeForce 8800 GTS outperforms the GeForce 8800 GTS 320MB by a bigger margin than in the second test. The rest of the results look normal and agree with the overall results for this group of tests.
The two SM3.0/HDR tests produce similar results except that the GeForce 8800 Ultra enjoys a bigger lead over the GeForce 8800 GTX in the second test while the advantage of the GeForce 8800 GTS over the GeForce 8800 GTS 320MB is, on the contrary, smaller. The results of the individual tests agree with the overall scores despite the use of full-screen antialiasing.
This test was launched with 4x FSAA, 64-bit FP HDR and Parallax Occlusion Mapping to ensure maximum image quality.
There is a curious discrepancy between the test results expressed in points and in frames per second. The 500-point difference between the scores of the GeForce 8800 GTS 320MB and the GeForce 8800 GTS is about 30 percent, while the difference between their frame rates is only about 14 percent. Moreover, the average frame rates of the GeForce 8600 GTS and GeForce 8500 GT coincide while their scores do not. There must be some flaw in the benchmark itself, especially as the version available for download is still only beta 2.
The tests bring no surprises. Since modern games have harsh system requirements, you can only use high display resolutions in them if you’ve got a senior GeForce 8800, i.e. a GTX or an Ultra. These are capable of delivering comfortable performance in resolutions up to and including 1920x1200, even with full-screen antialiasing enabled to improve the image quality. The exceptions, such as Call of Juarez and S.T.A.L.K.E.R., are rare, and even there the average frame rate provided by the GeForce 8800 Ultra/GTX is quite high. By the way, there is not much point in hunting for a rare and expensive GeForce 8800 Ultra because the GeForce 8800 GTX copes with 3D games just as well, but costs much less. Moreover, the latter card is available in pre-overclocked versions that are hardly any slower than the GeForce 8800 Ultra.
If you want to play your games at 2560x1600, buying a GeForce 8800 Ultra makes sense, although an SLI tandem of two GeForce 8800 GTX cards will deliver higher performance anyway. Thus, the GeForce 8800 Ultra is the only choice when you need the maximum speed possible today, but the GeForce 8800 GTX seems the better buy otherwise.
The GeForce 8800 GTS, with 640MB or 320MB of graphics memory, runs modern games at an acceptable speed and provides comfortable gaming conditions in resolutions up to 1600x1200 and, occasionally, even 1920x1200. The results of the two versions of the card are in fact identical, so purchasing the more expensive 640MB version isn’t reasonable: you may want to add some more money and buy a GeForce 8800 GTX instead. These two cards also have a dangerous rival, the ATI Radeon HD 2900 XT, which beats them in quite a lot of tests. You should choose based on which games you like most, but the GeForce 8800 GTS 320MB seems the best buy to us. It isn’t any slower than the 640MB model in usable resolutions but has an official price of only $299.
The GeForce 8600 GTS doesn’t look good in today’s games. In only a few of them can it deliver acceptable performance at 1280x1024 with the maximum level of detail and transparency antialiasing enabled. This graphics card cannot satisfy a fastidious gamer who wants not only a high speed in his favorite games but a high image quality as well. It is rather a product for undemanding gamers with a limited PC budget as well as for people whose PC is used mainly for multimedia tasks. The 8600 GTS may even be preferable to any GeForce 8800 for the latter application due to its advanced hardware video processor. The low power draw and the simple cooler also make the GeForce 8600 suitable for media centers.
As for the GeForce 8500 GT, this card doesn’t offer enough performance for modern games. It is targeted at undemanding users who want to have DirectX 10 support and the multimedia capabilities of the GeForce 8 series.
Now, we want to add a few words about the particular products tested.
The Gigabyte GV-NX86S256H Silent Pipe 3 is an excellent choice for a home multimedia system. This graphics card has all the pros and cons of a GeForce 8600 GTS, but its unique design does not require additional power, making it easier to assemble the system, and its silent cooler won’t disturb you with its noise when you are watching a movie or listening to music. Moreover, you will be able to play modern games, even though you will probably have to give up FSAA and high display resolutions.
The MSI NX8500GT-TD256E is a good, ordinary GeForce 8500 GT. With its quiet cooler it might have done well in a multimedia system, but the lack of HDCP support negates the advantages of the PureVideo HD processor. And as we said above, the GeForce 8500 GT cannot run modern 3D games fast. The MSI NX8500GT-TD256E may be an improvement for those who have been using an integrated graphics core, though. You may also buy it if you want a card with modern functionality for little money, perhaps as a temporary solution until you’ve got enough cash for a more advanced graphics card.