As you may have heard, on February 6 NVIDIA announced a new line of graphics chips: the GeForce4 Ti and GeForce4 MX series. The GeForce4 Ti chips are currently the fastest--though also the most expensive--and we have already reviewed them here. Now it is the turn of the budget newcomers, which aspire to the kind of popularity NVIDIA's value chips have enjoyed ever since the TNT2 M64.

With the TNT2 M64 and TNT2 Vanta chips NVIDIA started a tradition of cut-down, affordable versions of its high-performance chips. A later entry, the NVIDIA GeForce2 MX, became a worldwide hit shortly after its release and still sits in many PCs.

Time passed, and the GeForce2 MX (MX 200/400) family, essentially a GeForce2 with cut-down 3D capabilities, was succeeded by the GeForce4 MX--a wild architectural cocktail of several chip generations, from GeForce2 through GeForce4.

In this review we'll try to clear up what has finally come out of this effort.

Closer Look: Chips

Within the GeForce4 MX line NVIDIA offers three products. They differ in core and memory frequencies, graphics memory type and, of course, price:

  • GeForce4 MX460: core/memory frequencies of 300/550 (275 DDR) MHz, 128bit 64MB DDR SDRAM in BGA packages. The recommended price of NVIDIA GeForce4 MX460 based graphics cards is $179.
  • GeForce4 MX440: core/memory frequencies of 270/400 (200 DDR) MHz, 128bit 64MB DDR SDRAM. The recommended price of NVIDIA GeForce4 MX440 based graphics cards is $149.
  • GeForce4 MX420: core/memory frequencies of 250/166 MHz, 128bit 64MB SDRAM. The recommended price of NVIDIA GeForce4 MX420 based graphics cards is $99.

As you can see, the cards in this series differ quite a bit. The senior model, based on the GeForce4 MX460, works at higher frequencies than even GeForce3 Ti500 cards and has faster graphics memory, while the junior model, based on the GeForce4 MX420, runs at 250 MHz and makes do with feeble 166 MHz 128bit SDRAM.
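To put the memory gap in numbers, here is a back-of-the-envelope calculation of peak memory bandwidth from the 128bit bus width and the effective clocks listed above. This is only a sketch of the theoretical peaks; real-world throughput is lower:

```cpp
// Back-of-the-envelope peak memory bandwidth for the three GeForce4 MX models,
// computed from the 128-bit bus width and the effective memory clocks above.
#include <cstdio>

int main()
{
    const struct { const char* name; double effective_mhz; } card[] = {
        { "GeForce4 MX460", 550.0 },  // 275 MHz DDR
        { "GeForce4 MX440", 400.0 },  // 200 MHz DDR
        { "GeForce4 MX420", 166.0 },  // plain SDRAM
    };
    const double bus_bytes = 128.0 / 8.0;  // 128-bit bus = 16 bytes per transfer
    for (const auto& c : card)
        std::printf("%s: %.1f GB/s peak\n",
                    c.name, c.effective_mhz * 1e6 * bus_bytes / 1e9);
    // MX460 ~8.8 GB/s, MX440 ~6.4 GB/s, MX420 ~2.7 GB/s: more than a 3x spread
    // within one product line.
}
```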

For comparison we summarize the basic characteristics of NVIDIA GeForce4 MX, GeForce4 Ti, GeForce2 Ti and GeForce3 cards in the following table:

| Characteristic | NVIDIA GeForce2 Ti | NVIDIA GeForce3 Ti200/Ti500 | NVIDIA GeForce4 Ti4600/Ti4400 | NVIDIA GeForce4 MX 460/440/420 |
|---|---|---|---|---|
| Manufacturing technology | 0.18 micron | 0.15 micron | 0.15 micron | 0.15 micron |
| Chip frequency, MHz | 250 | 175/240 | 300/275 | 300/270/250 |
| Graphics memory type | 128bit DDR SDRAM | 128bit DDR SDRAM | 128bit DDR SDRAM | 128bit DDR SDRAM / DDR SDRAM / SDRAM |
| Graphics memory frequency, MHz | 400 (200 DDR) | 400 (200 DDR) / 500 (250 DDR) | 650 (325 DDR) / 550 (275 DDR) | 550 (275 DDR) / 400 (200 DDR) / 166 |
| Number of pixel pipelines | 4 | 4 | 4 | 2 |
| Number of TMUs per pipeline | 2 | 2 | 2 | 2 |
| Max. number of textures laid within 1 pass | 2 | 4 | 4 | 2 |
| Texture filtering methods | Bi-linear, tri-linear, anisotropic (up to level 2) + tri-linear | Bi-linear, tri-linear, anisotropic (up to level 8) + tri-linear | Bi-linear, tri-linear, anisotropic (up to level 8) + tri-linear | Bi-linear, tri-linear, anisotropic (up to level 2) + tri-linear |
| Bump mapping techniques | Emboss, Dot3 | Emboss, EMBM, Dot3 | Emboss, EMBM, Dot3 | Emboss, Dot3 |
| FSAA support | FSAA 2x, 4x (*1) | FSAA 2x, 4x; MSAA 2x, 4x, Quincunx (*1) | MSAA 2x, 4x, Quincunx, 4xS (*1) | MSAA 2x, 4x, Quincunx, 4xS (*1) |
| Hardware T&L unit | yes | yes | yes | yes |
| Pixel shader support | no | yes (versions 1.0, 1.1) | yes (versions 1.0-1.3) | no |
| Vertex shader support | no | yes (version 1.1) | yes (version 1.1) | yes (version 1.1) |
| HSR technology and improved memory performance | data caching | 4x32bit crossbar memory controller, Z-test, lossless Z-buffer compression, data caching | 4x32bit crossbar memory controller, Z-test, lossless Z-buffer compression, quick Z-buffer clear, memory bank preparation, data caching | 2x64bit crossbar memory controller, Z-test, lossless Z-buffer compression, quick Z-buffer clear, memory bank preparation, data caching |
| Multi-monitor support | no | no | yes (2 integrated CRT controllers and RAMDACs) | yes (2 integrated CRT controllers and RAMDACs) |
| Digital monitor support | external third-party TMDS transceiver | external third-party TMDS transceiver | 2 integrated TMDS transceivers | 2 integrated TMDS transceivers |
| TV-Out support | external third-party TV encoder | external third-party TV encoder | integrated TV encoder | integrated TV encoder |
| Hardware DVD decompression | no | no | yes (*2) | yes (*2) |

Thus, from the NVIDIA GeForce4 Ti the new chips inherit the following architectural elements:

  • HSR and graphics memory optimization technologies (Lightspeed Memory Architecture II). The only change in the NVIDIA GeForce4 MX is the split of the memory controller: unlike the GeForce4 Ti and GeForce3, it has two 64bit channels rather than four 32bit ones (see the sketch after this list).
  • The T&L unit, which appears to support vertex shaders: the Direct3D driver reports vertex shader version 1.1 support. Perhaps the NVIDIA GeForce4 MX carries one of the two vertex shader units used in the GeForce4 Ti.
  • Accuview, providing full-screen anti-aliasing via multisampling (2x, 4x, Quincunx, and 4xS--a combination of multi- and supersampling).
  • nView, ensuring support for multi-monitor configurations: two integrated CRT controllers, two integrated RAMDACs, integrated TMDS transceivers and TV-Out support.
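The practical effect of a crossbar controller can be shown with a toy model. This is not NVIDIA's actual controller logic--the burst sizes below are hypothetical round numbers--but it illustrates why independent narrower channels waste less bandwidth on the small scattered requests of 3D rendering than the GeForce2's single wide controller:

```cpp
// Illustrative model only: compare one wide memory channel against two
// independent narrower channels for a stream of small requests.
#include <cstdio>

// Bytes actually moved to satisfy a request on a controller whose minimum
// transfer is `granularity` bytes (requests round up to whole transfers).
long bytes_moved(long request, long granularity)
{
    long transfers = (request + granularity - 1) / granularity;
    return transfers * granularity;
}

int main()
{
    const long requests[] = { 8, 16, 24, 40 };  // small requests, a few texels each
    long wide = 0, split = 0;
    for (long r : requests) {
        wide  += bytes_moved(r, 32);  // single 128-bit channel, 32-byte min burst
        split += bytes_moved(r, 16);  // one of two 64-bit channels, 16-byte burst
    }
    std::printf("1x128bit: %ld bytes moved, 2x64bit: %ld bytes moved\n", wide, split);
    // The narrower channel moves fewer wasted bytes per small request, and two
    // independent channels can also serve two requests concurrently.
}
```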

We have described these architecture elements in detail in our NVIDIA GeForce4 Review.

From the NVIDIA GeForce2 the new chips take pixel pipelines with two texturing units each. Consequently, NVIDIA GeForce4 MX chips are not as free with textures as the GeForce4 Ti: they can lay no more than two textures per pass, and have no EMBM support, no pixel shaders, and no anisotropic filtering above level 2 (8 samples at most). The sketch below shows what the two-texture limit means for an application.
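Here is a minimal OpenGL sketch of laying two textures in a single pass--the per-pass limit of this class of hardware. It assumes an OpenGL 1.3 context and valid texture objects (on older headers the same calls are available as ARB_multitexture entry points); setup code is elided:

```cpp
// Sketch: two textures in one pass, the limit of a 2-TMU-per-pipeline chip.
#include <GL/gl.h>

void draw_dual_textured_quad(GLuint base, GLuint lightmap)
{
    glActiveTexture(GL_TEXTURE0);            // first TMU: base texture
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, base);

    glActiveTexture(GL_TEXTURE1);            // second TMU: lightmap, modulated in
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, lightmap);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

    const float xy[4][2] = { {-1,-1}, {1,-1}, {1,1}, {-1,1} };
    const float uv[4][2] = { { 0, 0}, {1, 0}, {1,1}, { 0,1} };
    glBegin(GL_QUADS);                       // era-appropriate immediate mode
    for (int i = 0; i < 4; ++i) {
        glMultiTexCoord2f(GL_TEXTURE0, uv[i][0], uv[i][1]);
        glMultiTexCoord2f(GL_TEXTURE1, uv[i][0], uv[i][1]);
        glVertex2f(xy[i][0], xy[i][1]);
    }
    glEnd();
    // A third texture (e.g. a detail map) would not fit into this pass on a
    // 2-TMU chip: the application must render a second, blended pass, while a
    // GeForce3/GeForce4 Ti could still lay it within the same pass.
}
```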

We asterisked several issues in the table. Here are some comments about them:

  • *1: NVIDIA GeForce3, GeForce4 Ti and GeForce4 MX support many modes of multisampling, supersampling and combinations thereof. For instance, these chips are capable of 2x, 4x or 4x 9-tap supersampling, though current driver versions do not allow these modes to be enabled, so one has to choose among the modes proven to offer the best quality-to-performance ratio.
    By the way, the NVIDIA GeForce2 can also use supersampling methods other than 2x and 4x: the 1.5x1.5 and 3x3 supersampling modes were available in older drivers, and today they can be enabled with utilities such as RivaTuner.
  • *2: Although NVIDIA says the GeForce4 Ti and GeForce4 MX support hardware DVD decompression, software DVD players have not detected it to date.

Architecturally, NVIDIA GeForce4 MX chips are closer to the GeForce4 Ti than to the GeForce2, so the name that groups them with the GeForce4 class is fair. The catch is that we have grown used to cut-down NVIDIA chips differing from their full-featured counterparts only in clock speeds, not in functionality, as was the case with the TNT2 M64 and the GeForce2 MX. With the GeForce4 MX, NVIDIA wanted to make the chips as cheap as possible, so instead of GeForce4 Ti pixel pipelines it used GeForce2 ones. As a result, GeForce4 MX chips have no pixel shaders or other advanced pixel-processing features.

Closer Look: Graphics Cards

With its memory chips and the heatsink on the graphics core, the GeForce4 MX460 reference card looks very much like NVIDIA's GeForce4 Ti4600/Ti4400 cards. Its core and memory voltage regulators, however, do not occupy as much space, so the card has more modest proportions:

   

This graphics card is equipped with an NVIDIA NV17 chip bearing the "PRO" suffix--the GeForce4 MX460:

Like all the other chips of NVIDIA's new series, it has a metal cover for better heat dissipation. The nominal frequency of the NVIDIA GeForce4 MX460 chip is 300 MHz.

The chip comes with an NVIDIA brand cooler like the one we saw on the GeForce4 Ti4600 reference card:

   

The graphics card is equipped with 64 MB of DDR SDRAM from SAMSUNG in BGA packages.

These chips have a 3.6ns access time, which corresponds to a rated frequency of about 554 (277 DDR) MHz (1 / 3.6ns ≈ 277 MHz). The nominal memory frequency of the NVIDIA GeForce4 MX460, 550 (275 DDR) MHz, is thus close to the chips' maximum.

Although the GeForce4 MX core has integrated TMDS transceivers and a TV encoder, the NVIDIA reference card instead carries two chips from outside manufacturers: a SiI164CT64 TMDS transmitter from Silicon Image and an SAA7102AE TV encoder from Philips.

Testbed and Methods

Our testbed was configured as follows:

  • AMD Athlon XP 2000+ CPU;
  • MSI K7T266 Pro2 v2.0 (VIA KT266A) mainboard;
  • 2x128 MB DDR SDRAM PC 2100 Nanya CL2;
  • Fujitsu MPF3153AH HDD.

Software:

  • Version 6.13.10.6032 driver for Windows XP for ATI chip-based graphics cards;
  • Detonator 27.42 driver for Windows XP for NVIDIA chip-based graphics cards;
  • Windows XP;
  • Max Payne;
  • Serious Sam: The Second Encounter;
  • 3DMark 2001;
  • Quake3 Arena v1.30.

We ran these applications in the following modes:

Max Payne

Quality mode implied the highest graphics quality: disabled full-screen anti-aliasing and anisotropic filtering (in order to provide equal conditions for the tested graphics cards) and 32bit texture and frame buffer color depth.

Speed mode incorporated the lowest graphics quality (16bit texture and frame buffer color depth).

In the Max Payne test we used the benchmark mod and PCGH's Final Scene No.1, both described in detail on the 3DCenter website.

Serious Sam: The Second Encounter

Speed mode: "Speed" graphics quality settings, 16bit color modes;

Quality mode: "Quality" graphics quality settings, 32bit color modes.

For this test we launched a standard "The Grand Cathedral" demo.

The game engine supports TRUFORM technology, which is enabled for the ATI RADEON 8500 by default; we did not disable it.

3DMark 2001

We ran all gaming tests with 32bit frame buffer, 32bit textures and 32 (24) bit Z-buffer, in pure hardware T&L and hardware T&L modes. Changes made in synthetic tests are indicated on the charts.

Quake3 Arena v1.30

All tests were run with the highest graphics quality settings, with trilinear filtering and texture compression enabled.

During the testing we used two modes:

  • 16bit texture and frame buffer color depth
  • 32bit texture and frame buffer color depth

We tested with a standard demo from the Quake3 Arena v1.30 point release.
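For reference, such timedemos are launched from the Quake3 console roughly as follows. The demo file names differ between point releases, so "four" (shipped with the 1.27+ point releases) stands in here as an assumed example, not necessarily the exact file we used:

```
timedemo 1
demo four
```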

Performance


Our tests show that the T&L unit of the NVIDIA GeForce4 MX is similar to the GeForce3's in performance: with its core at 270 MHz, the GeForce4 MX440 is just a trifle slower than the GeForce3 Ti500.

Things are more complicated with the vertex shaders: the vertex processing rate of the GeForce4 MX460/MX440 is lower than that of a GeForce3 Ti200 based card with its 175 MHz core, yet higher than with software vertex shaders executed by the CPU.

This gives us a reason to assume that the NVIDIA GeForce4 MX does have hardware vertex shader support, though its efficiency leaves much to be desired compared to the NVIDIA GeForce3 Ti / GeForce4 Ti, to say nothing of the ATI RADEON 8500 LE.
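Since 3DMark 2001 is a Direct3D 8 benchmark, the hardware-versus-software distinction maps onto the vertex processing flag chosen at device creation. Below is a minimal, hedged sketch of that choice, assuming the DirectX 8 SDK headers; window management and error handling are omitted:

```cpp
// Sketch: selecting hardware vs software vertex processing in Direct3D 8.
// Benchmarks toggle exactly this flag to separate the GPU's vertex unit
// from CPU emulation.
#include <windows.h>
#include <d3d8.h>

IDirect3DDevice8* create_device(bool want_hw_vertex)
{
    IDirect3D8* d3d = Direct3DCreate8(D3D_SDK_VERSION);
    if (!d3d) return 0;

    D3DCAPS8 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    // A GeForce4 MX reports vs_1_1 here despite its GeForce2-class pixel
    // pipeline -- the very point discussed above.
    const bool hw_vs = caps.VertexShaderVersion >= D3DVS_VERSION(1, 1);

    D3DDISPLAYMODE mode;
    d3d->GetAdapterDisplayMode(D3DADAPTER_DEFAULT, &mode);

    D3DPRESENT_PARAMETERS pp = {0};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = mode.Format;   // windowed mode uses the desktop format

    // (A "pure hardware T&L" mode would additionally OR in D3DCREATE_PUREDEVICE.)
    const DWORD flags = (want_hw_vertex && hw_vs)
                            ? D3DCREATE_HARDWARE_VERTEXPROCESSING  // GPU T&L/shaders
                            : D3DCREATE_SOFTWARE_VERTEXPROCESSING; // CPU fallback

    IDirect3DDevice8* device = 0;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, GetDesktopWindow(),
                      flags, &pp, &device);  // desktop window stands in for a real one
    return device;
}
```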

The NVIDIA GeForce4 MX does well in multi-texturing thanks to the memory-saving technologies inherited from the GeForce4 Ti: allowing for test error, it shows the highest result possible. But the situation changes dramatically when only one texture is laid per polygon. In 16bit color the results are almost ideal, whereas with 32bit texture, frame buffer and Z-buffer depth there is a tangible lack of graphics memory bandwidth.

Overall, the polygon fill rate of the NVIDIA GeForce4 MX460/MX440 is the lowest among the graphics cards tested: though the GeForce4 MX architecture makes the most economical use of graphics memory, unlike its competitors the chip has only two pixel pipelines with two texturing units each.

As for games, let's start with 3DMark 2001. We have to resort to this test because there are no proper up-to-date Direct3D games that would allow an adequate assessment of graphics card performance.


On average, the performance of the NVIDIA GeForce4 MX460 in this test equals that of the NVIDIA GeForce3 Ti200. The results of the NVIDIA GeForce4 MX440 are 10-15% lower than those of the MX460.

Considering that the NVIDIA GeForce3 Ti200 has twice as many pixel pipelines as the GeForce4 MX460 and higher vertex shader processing speed, the latter's results are really impressive.

Here is why these results are so high: first, the clock frequencies of the GeForce4 MX460 are much higher, and second, the GeForce4 MX chip is equipped with a more efficient cache and graphics memory controller than the GeForce3 Ti200.


In this test the lineup is nearly the same: the NVIDIA GeForce4 MX460 and GeForce3 Ti200 run shoulder to shoulder, and the GeForce4 MX440 lags 10-15% behind the GeForce4 MX460. At the same time, the GeForce4 MX440 is 10-15% faster than the mainstream card of the previous generation, the NVIDIA GeForce2 Ti.

Results in Max Payne were quite homogeneous, so we conclude that performance in this test depends too heavily on the CPU and the system in general.

Nonetheless, the results of the NVIDIA GeForce4 MX460/MX440 are notably closer to the GeForce3 Ti200 than to the GeForce2 Ti.

Performance in Quake3 Arena is mainly determined by two factors--the polygon fill rate and graphics memory bandwidth. At higher resolutions the memory bus bandwidth has greater influence, while at lower resolutions it's polygon fill rate that matters.

Thanks to its high core frequency and efficient use of graphics memory, GeForce4 MX440 bests the GeForce2 Ti in 32bit modes, and GeForce4 MX460 with its higher clock frequencies is faster than GeForce3 Ti200 in both 16bit and 32bit color modes.
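These results track the theoretical limits well. Below is a quick sketch computing peak fill rate and memory bandwidth from the specifications in the table above; the figures are peaks only, and the GeForce2 Ti lacks the LMA bandwidth-saving techniques, so its usable share of the peak is lower:

```cpp
// Rough theoretical limits behind the Quake3 results: pixel fill rate
// (pipelines x core clock) and memory bandwidth (bus width x effective clock).
#include <cstdio>

int main()
{
    struct Card { const char* name; int pipes; int core_mhz; double mem_eff_mhz; };
    const Card cards[] = {
        { "GeForce4 MX460", 2, 300, 550.0 },
        { "GeForce4 MX440", 2, 270, 400.0 },
        { "GeForce3 Ti200", 4, 175, 400.0 },
        { "GeForce2 Ti",    4, 250, 400.0 },  // no LMA: least efficient use of peak
    };
    for (const Card& c : cards)
        std::printf("%s: %4d Mpixel/s fill, %4.1f GB/s bandwidth\n",
                    c.name, c.pipes * c.core_mhz, c.mem_eff_mhz * 16.0 / 1000.0);
    // GeForce4 MX460: 600 Mpixel/s but 8.8 GB/s -- the lowest fill rate yet the
    // highest bandwidth here, which is why it pulls ahead at high resolutions
    // in 32bit color, where bandwidth dominates.
}
```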

In quality mode (800x600) the NVIDIA GeForce4 MX460/MX440 break into the leading positions. As the resolution grows, the GeForce4 MX440 proves slower than its MX460 brother (and just a little faster than the GeForce2 Ti). At higher resolutions the GeForce4 MX460 boasts marvelous results, outrunning the GeForce3 Ti200 by 10-15%. The GeForce4 MX460 owes this success to the anisotropic filtering enabled in quality mode: it is known to cause significant performance losses on NVIDIA GeForce3 / Ti500 / Ti200 cards, while the GeForce4 MX, limited to level 2 anisotropy, pays a far smaller penalty.

Conclusion

After this test session we must admit that NVIDIA can be proud of its GeForce4 MX chips: they are strong enough to be worthy successors to the GeForce3 Ti200, GeForce2 Ti and GeForce2 MX series.

The NVIDIA GeForce4 MX460 is the senior member of the family. Its performance is close to, or even higher than, that of the GeForce3 Ti200. Unlike the latter, it supports dual-monitor configurations, but it lacks hardware pixel shaders and high-level anisotropic filtering.

The NVIDIA GeForce4 MX440 is better than GeForce2 Ti based cards at every point. Apart from higher performance, it offers full-screen anti-aliasing via multisampling (Accuview), vertex shader support and multi-monitor configurations.

The NVIDIA GeForce4 MX420 was not covered in this review (its time will come soon). Of course, it cannot stand up to the NVIDIA GeForce3 Ti200 or GeForce2 Ti, but with its lower price (thanks to cheap SDRAM) the GeForce4 MX420 will defend NVIDIA's positions in the value graphics card sector, replacing the one-time hit GeForce2 MX400.

As far as prices go, judging by the first cards already on sale, NVIDIA GeForce4 MX based cards should very soon become tempting buys.

Now it's up to you to make the choice! 
