NVIDIA GeForce FX 5900XT (SE/LE) Modding Experience

NVIDIA GeForce FX 5900XT based graphics cards have truly become "the new Ti4200". Overclockers love them for their ability to reach the performance of top NV35/38 products at a considerably lower cost. But even that is far from the limit of their potential. Under lucky circumstances, a humble FX 5900XT can put up a fight against a well-overclocked RADEON 9800XT – our tests prove it: we managed a score of 27,882 points in 3DMark 2001 SE!

by Kirill
06/15/2004 | 03:50 PM

Deep Tuning

While waiting for the GeForce 6800 Ultra to appear in shops, I purchased a Sparkle SP-8835 graphics card based on the NVIDIA GeForce FX 5900XT chip (revision 1.0 of the product, which is why the standard frequencies were 390MHz GPU and 680MHz memory). The board carried 2.8ns memory chips from Hynix, which gave some hope for high overclocking results.

That hope proved true: my sample reached an impressive 480/955MHz (23% and 40% above stock, respectively) with its standard cooling system. Still, I thought that was not enough and took the card up seriously. My tuning targeted the maximum score in 3DMark 2001 SE specifically, but performance grew correspondingly in other 3D applications.

This article follows the chronological order in which I modified the graphics card – that makes it easier to track the effect of each particular change. Each section contains a diagram showing the frequency gain and describes the state of the card after the corresponding mod, since the modifications were made at different times and it was not always possible to compare data obtained under different conditions. I tried to provide the most comprehensive information possible, though.

The reader is assumed to have read the “NVIDIA GeForce FX 5900 XT FAQ: All You Need to Know About It” and to know the particulars of the card he’s going to mod. That’s why I don’t repeat the information given in the FAQ.

This guide applies to all graphics cards with the GeForce FX 5900XT GPU that follow NVIDIA’s reference design; I know of only two exceptions.

Besides the GeForce FX 5900XT proper, other “cut-down” versions of the FX 5900 based on the same PCB design are modded identically: for example, the Prolink PixelView FX 5900 LE, ASUS V9950SE and V9950GE, among others.

Disclaimer

First of all, I refuse to give any warranties whatsoever. This guide describes modifications that the ModLabs.net team successfully made to the construction of the graphics card. Each mod was tested and helped to increase performance. However, we cannot promise that you will be as successful (after all, our own card ended up in the repair shop). We don’t accept any claims concerning damage to your card after the modification – such problems imply mistakes on your part. The author and ModLabs.net are not responsible for any damage caused by repeating what this guide describes.

Then, our disclaimer about volt-modding at large:

Attention! Volt-modding, if detected, voids all warranty obligations. So do everything neatly, so that you can unsolder it all again in case of the card’s death.

You should take up volt-modding only if you fully understand what you are doing and accept the risk of losing the card.

Besides volt-modding, this guide includes other modifications, so a few additional words are necessary. Any physical modification of a graphics card, like the removal of the frame or the gluing of heatsinks onto the memory chips, will also void the manufacturer’s and seller’s warranty if the product cannot be returned to its original look. So you do everything at your own risk. Good luck!

Ingredients

This is a full list of all you need to walk in our footsteps (compiled from the modifications described below):

  - a soldering iron, thin wire and a multimeter;
  - two 10kOhm variable resistors (for the Vgpu and Vmem mods);
  - a flat, sharp knife (for removing the heat-spreader);
  - Arctic Silver Thermal Adhesive and copper heatsinks for the memory chips;
  - additional fans and, optionally, a water-cooling system for the GPU.

Vgpu Modification

My first modification in the “How to make your 5900XT into a racing car” series is a classic Vgpu volt-mod to raise the voltage of the graphics core.

In all 5900XT graphics cards with reference-design PCBs this voltage is controlled by an Intersil ISL6522CB chip (the two above-mentioned exceptions aside). In this respect the 5900XT differs from the plain 5900 (without “XT”), which uses a more advanced ISL6569 controller.

Now let me remind you of a few basic things concerning volt-modding.

The gist of any volt-mod is a modification of the voltage feedback circuit. A resistor of the necessary resistance is installed between the feedback leg (or some other point of the circuit – usually necessary when the chip’s legs are inconvenient to solder to) and the ground. The added resistor ends up in parallel with the lower half of the feedback divider, so the controller sees a smaller fraction of the output voltage at its feedback pin and compensates by raising the real output. In short: by reducing the resistance, we increase the voltage.

And this is very important! If you use variable resistors, make sure the resistor is set to its maximum, not minimum, before the first power-on! If it is set to its minimum, the GPU will receive the maximum possible voltage and will die immediately in 90% of cases! So remember this once and for all: reducing the resistance increases the voltage.
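
To make the arithmetic concrete, here is a minimal Python sketch of the feedback-divider math, assuming a typical buck controller. The 0.8v reference and the divider resistor values are hypothetical illustrations, not the actual values of the ISL6522CB circuit on a 5900XT:

    # Feedback-divider math behind a classic volt-mod (illustration only).
    # The reference voltage and divider values are assumptions, NOT the
    # real values of the ISL6522CB circuit on a 5900XT.

    def parallel(r1, r2):
        # Equivalent resistance of two resistors in parallel.
        return r1 * r2 / (r1 + r2)

    def vout(v_ref, r_top, r_bottom):
        # Output voltage of a regulator with a resistive feedback divider.
        return v_ref * (1 + r_top / r_bottom)

    V_REF = 0.8       # assumed controller reference voltage, volts
    R_TOP = 1500.0    # hypothetical upper divider resistor, ohms
    R_BOT = 2000.0    # hypothetical lower divider resistor, ohms

    print(round(vout(V_REF, R_TOP, R_BOT), 2))  # 1.4 - the stock voltage
    # The mod resistor from the feedback leg to ground sits in parallel
    # with R_BOT; turning it DOWN drives the output voltage UP.
    for r_mod in (10_000, 5_000, 2_500):
        print(r_mod, round(vout(V_REF, R_TOP, parallel(R_BOT, r_mod)), 2))
    # 10000 -> 1.52, 5000 -> 1.64, 2500 -> 1.88

As the loop shows, the output rises as the mod resistance falls – and with a variable resistor near zero the divider’s lower half all but vanishes and the output shoots toward its maximum, which is exactly why you start from the maximum resistance.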

Now, after the preliminaries are over, let’s continue…

Leg 5 of the Intersil ISL6522CB is responsible for feedback. To do the mod, we hang a 10kOhm variable resistor between that leg and the closest ground. There are two more or less convenient candidates for the ground: the seventh (corner) leg of the same Intersil chip or one of the central pins of the power connector.

Voltage is monitored at a not-very-accessible point – on one of the “barrels” (electrolytic capacitors) on the card’s face side. I recommend soldering a wire to the measurement point and hanging the multimeter on this wire – that is much handier than groping for the right spot each time.

A distinguishing trait of many graphics cards of the NVIDIA GeForce FX series is dynamic control of the core frequency. The graphics memory always works at the same clock rate, while the graphics chip has two basic operational modes: its frequency is 300MHz in 2D applications and rises in 3D applications (in our case, the 5900XT GPU goes up to 390MHz). Besides that, the graphics core works at 250MHz when the power connector is empty and at 376MHz when the auto-brake is applied (see the FAQ on the NVIDIA GeForce FX 5900XT).

I don’t say it to impress you with my extensive knowledge of graphics cards, but only to lead you to another peculiarity of the 5900XT (and other GeForce FX chips as well).

The core voltage changes along with the frequency!

The values follow:

  GeForce FX 5900XT: 1.2v in 2D mode and 1.4v in 3D mode
  GeForce FX 5950 Ultra: 1.6v in 3D mode

This is very important when tweaking Vgpu. If you adjust the voltage in the 2D mode, always add 0.2v to the value on the multimeter! That is, if 1.85v is set in the 2D mode, the core switches to 2.05v at the start of any 3D application, which is not going to end well. Okay, you have had another warning, let’s move on…
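
Here is a tiny Python sketch of that rule, using the mode frequencies and the 0.2v offset quoted above:

    # 5900XT GPU clocks in its operational modes (from the text above).
    MODES_MHZ = {"no power connector": 250, "2D": 300, "auto-brake": 376, "3D": 390}

    VGPU_3D_OFFSET = 0.2  # volts the card adds when switching from 2D to 3D

    def multimeter_target_2d(desired_3d_voltage):
        # What the multimeter should show in 2D mode so that the card
        # ends up at the desired voltage once a 3D application starts.
        return round(desired_3d_voltage - VGPU_3D_OFFSET, 2)

    print(multimeter_target_2d(1.95))        # 1.75 in 2D gives 1.95 in 3D
    print(round(1.85 + VGPU_3D_OFFSET, 2))   # 2.05 - the dangerous case above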

As you must have noticed, the standard Vgpu of 5900XT graphics cards is 1.4v in the 3D mode, while the 5950 Ultra runs at 1.6v. Considering how much the two cores have in common, I’d call 1.6v the optimal point for people who don’t want to run any risks: the 5900XT core is, in effect, meant to work at 1.6v but runs at a reduced voltage. The average frequency gain over the standard Vgpu is 50MHz, which is quite enough for everyday work.

But of course we didn’t stop at that. We increased the voltage to 1.95v, which raised the chip frequency to 645MHz (as you remember, 390MHz is the nominal clock rate). The maximum frequency was 615MHz with 1.9v and 617MHz with 1.8v.

The state of the graphics card:

As you can see, the card benefits from Vgpu growth right up to a limit, the core overclocking higher at each step. Why do I mention limits? You’ll learn closer to the end of the article – and you should definitely read it through if you’re about to do the Vgpu volt-mod!

Water-Cooling the Graphics Core

The standard cooler of the Sparkle FX 5900XT graphics card (and of its twin brother from Point of View) is a massive aluminum contraption that looks much like the reference cooler of the Ti4600. There’s one 40mm fan, covered with a protective grid from above. The cooler is installed on the GPU with thermal paste in between and touches the memory chips through non-adhesive, thick (2mm) thermal pads.

This thing cools well, but some modification will help.

The simplest and most efficient thing you can do is add extra airflow. In our case, with two 80mm fans blowing at it, the graphics card didn’t overheat even at a 600MHz GPU frequency and 1.8v Vgpu.

The next step is installation of a water-cooling system on the chip. We used the most aggressive option – a dedicated Koolance EXOS unit with a Koolance water-block for graphics cards.

You shouldn’t expect miracles from water cooling, though. If the card runs at normal temperatures before its installation, you won’t get any perceptible frequency gains. In our case, water cooling added just 10MHz to the GPU clock rate.

The status of the card: the chip’s lid is on, Vgpu is 1.8v.

Note that after the removal of the standard cooler the memory chips remained naked, without any cooling save for the blowers. This didn’t affect their overclockability: still 955MHz.

Removing the Heat Spreader (IHS)

Next, we should improve the contact between the core and the cooling system. The graphics chip on 5900XT cards (and on many other GeForce FX cards) is covered with a special lid that serves as a heat-spreader and also protects the core against physical damage. That would be all right, but this solution adds two more layers between the core and the cooling system, worsening heat transfer: instead of the simple “core – thermal paste – cooler” scheme, we get a sandwich like “core – thermal paste – lid – thermal paste – cooler”. That’s not the most efficient way to cool the GPU down.

The cure is simple: remove the IHS. To do this, you need a paper-knife or any other knife with a flat and sharp blade. The procedure is similar to the removal of the frame from RADEONs.

Choose a corner (I prefer the bottom left one) and carefully push the knife under the heat-spreader. It’s handier to hold the knife diagonally.

Then move the blade deeper, so that it sits between the heat-spreader and the chip.

Then take the knife out and put it in again, tip first, so that the blade is perpendicular to the slit. Don’t push it too hard or too far, as this may damage the unpackaged elements on the chip.

Move the blade to the side, releasing the chip from the IHS.

After you reach the second corner, you can probably take the knife out; in our case, the heat-spreader had lifted enough to be caught with the fingers.

Now slowly and carefully lift the heat-spreader. At some moment, it just tears off, revealing the chip.

Now you only have to wash the thermal paste away and add the cover to your collection (ours includes seven frames from RADEONs and a heat-spreader from a Pentium 4 3.2C).

Alas, this modification kills your warranty for good, since you cannot put the heat-spreader back without it being noticed.

That’s what you get from removing the IHS:

14MHz is no big deal, of course, but it is a bonus anyway, and you may want to repeat this mod now that you’ve started tweaking your card. It takes little time – about 5-10 minutes.

Vmem Modification

It is a peculiarity of 5900 series graphics cards that they are more sensitive to GPU overclocking than to memory speedup: roughly, you get the same performance gain from an extra 1MHz of GPU clock as from an extra 10MHz of memory clock. Still, every jot of performance counts in the race for records, so we performed the Vmem modification too.
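
To put that rule of thumb into numbers, here is a toy Python comparison of the two overclocks our card eventually reached; the 1:10 weighting is the rough estimate above, not a measured model:

    # Crude "GPU-equivalent megahertz" of an overclock, per the rough
    # 1 MHz GPU ~= 10 MHz memory rule of thumb for the 5900 series.
    def gpu_equivalent_mhz(gpu_gain_mhz, mem_gain_mhz):
        return gpu_gain_mhz + mem_gain_mhz / 10

    print(gpu_equivalent_mhz(645 - 390, 0))   # 255.0 - the GPU overclock...
    print(gpu_equivalent_mhz(0, 1075 - 680))  # 39.5 - ...dwarfs the memory one

In other words, even the huge memory overclock described below is worth only about a 40MHz bump on the core, which is why the Vgpu mod came first.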

For mysterious reasons, the majority of those who attempted this mod couldn’t get a single extra megahertz of memory frequency by adjusting Vmem, irrespective of the memory manufacturer (Samsung or Hynix). I can’t explain why I succeeded where others failed – the recipe was the same. Anyway, you may try it – maybe you’ll be lucky?

The 5900XT graphics card carries two Intersil ISL6522CB chips. One of them controls the GPU voltage, as we already know, and the other does the same with respect to memory. So, the second mod should be analogous to the first one.

Yes, we do the same: a 10kOhm variable resistor between legs 5 and 7 of the second ISL6522CB – to be precise, between leg 5 and ground, but leg 7 seems the best ground to me in this case. The same divider math from the sketch above applies.

Monitoring is performed on the front side of the board; again, it’s easier to solder a wire there than to poke the multimeter at the spot each time.

As for the nominal voltage, it varies from 2.5v to 2.85v on different cards, so don’t be surprised to find yours “non-standard”. I don’t have any statistics by brand, but my Sparkle had Vmem = 2.7v.

So what did I get by increasing Vmem? Up to 3.1v the changes were insignificant (an extra 10-15MHz), but beyond that there was real growth: a memory voltage of 3.4v produced a 1GHz memory frequency. On the other hand, from 3.2v on, the memory chips sharply increased their heat generation. At 3.4v, one minute of testing in 3D heated the chips to 70°C even with intensive air cooling, and that temperature soon produced image artifacts due to trivial overheating. So we needed to cool the memory chips somehow…

Heatsinks on Memory Chips

As I noted in the FAQ, the memory chips don’t run very hot on the 5900XT, and ordinary overclocking doesn’t call for additional heatsinks. That’s also why some graphics card manufacturers leave the chips bare.

Meanwhile, extremely high voltages (from 3.2v up) heat the memory so much that its overclockability actually drops.

There are numerous ways of fastening heatsinks to memory chips. Reversibility matters – you don’t want to lose your warranty, right? For example, you can fasten the heatsinks with wire, clips or other such stuff. As for me, I didn’t worry about the warranty, since I had lost it altogether by removing the GPU’s lid. In exchange, I could go for maximum reliability of the fastening and the best heat transfer.

I used a special material, Arctic Silver Thermal Adhesive – a product of the legendary Arctic Silver, whose excellent thermal pastes are known to every overclocker in the world.

It is a two-component compound with adhesive properties. It comes in two syringes, 10 grams in total. Taken separately, each component resembles Arctic Silver 3 thermal paste (on which they are based) and has no adhesive qualities.

You apply one component, then the other, and mix them together – this produces a compound with a setting time of 5-10 minutes. In practice, you put a drop of Part A, then a drop of Part B (near the first, or right on it) on one part, mix them on the spot and spread the layer over the surface, using a credit card for example.

Then press the heatsink to the chip and wait for the glue to set. If you feel confident, you can go on gluing the other heatsinks in the meanwhile…

As for the heatsinks themselves, I used my favorite, time-tested design: a sawed-up copper 1U Xeon heatsink from Titan with its fins bent outward, Zalman-style.

After an hour you can use the card again. Arctic Silver Thermal Adhesive is so strong that it’s really hard to tear a heatsink off the chip, should you ever want to.

As for preserving the warranty look, it’s hard to comment. I managed to tear the heatsinks off my card and remove the remains of the glue from them. On the other hand, I’ve read complaints on forums like “I can’t return my card because I glued heatsinks to it”. I guess some skill is required here, too…

Now, the most interesting part is the effect from the heatsinks. Let’s compare several states of the card:

As you see, you can’t hope for any positive effect from the heatsinks without the Vmem modification – and why should there be any, if the chips stay at about room temperature anyway (blown on by the fans)? At high Vmem, the chips behave more like GDDR2 as far as temperature is concerned. By getting them back to around room temperature with the heatsinks, we gained as much as an extra 60MHz of memory frequency! To be precise, 1075MHz was not fully stable, but acceptable for benchmarking purposes.

A curious fact: the chips are officially marked as Hynix AF-28, i.e. they have a specified access time of 2.8ns. Such chips are rated for 715MHz operation, so by reaching 1075MHz we surpassed the specification by over 50%! As far as I know, this is an unprecedented result for memory chips, whose overclockability depends rigidly on the access time and seldom exceeds 115-120% of the rated frequency.

Of course, I’m almost sure that Hynix remarked 2.2ns chips as AF-28, but I can’t prove it: all official data say the card carries 2.8ns chips…
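
A quick back-of-the-envelope check of these figures in Python – the 2 × 1/t formula for DDR’s effective rate is standard, and the 2.2ns guess comes from the paragraph above:

    # Rated effective DDR frequency from the chip's access time.
    def rated_ddr_mhz(access_time_ns):
        return 2 * 1000.0 / access_time_ns

    print(rated_ddr_mhz(2.8))         # ~714 MHz, i.e. the 715 MHz rating
    print(1075 / rated_ddr_mhz(2.8))  # ~1.50: just over 150% of the spec
    print(rated_ddr_mhz(2.2))         # ~909 MHz, if the chips are really 2.2 ns
    print(1075 / rated_ddr_mhz(2.2))  # ~1.18: within the usual 115-120% range

Note how neatly the remarking hypothesis fits: relative to a 2.2ns rating, our 1075MHz is an ordinary 118% overclock.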

The hardware modifications ended there (later I regretted not taking them further), and the card was ready for setting records. The maximum frequencies it was stable at – meaning it passed 3DMark – were 645/1075MHz. But there’s also the software part of the story.

Wondrous Drivers, or How Do You Bring Together Detonator 44.03 and 5900XT?

There’s one driver from NVIDIA that dramatically improves the performance of NVIDIA’s GPUs in 3DMark 2001: Detonator 44.03. It contains optimizations that push performance up in Dragothic Lo Detail and Nature and slightly add to the score in Dragothic Hi Detail.

This driver is WHQL-certified and is not prohibited by Futuremark, but there’s an unwritten agreement among benchmarkers: if others don’t use this driver, running 44.03 is considered close to cheating and the result won’t get any approval. On the other hand, if version 44.03 is commonly accepted for a certain class of graphics cards, it’s unreasonable to pass up this opportunity for another speed boost. A couple of examples:

  1. We held second place in the GeForce FX 5700 Ultra class, less than 1,000 points behind first. We could easily have won by installing 44.03. However, that would have infringed the fair-play principle, since MickeyMouse used the ordinary driver (and would have outperformed us once again by switching to 44.03 himself).
  2. It seems quite acceptable from the ethical point of view to use Detonator 44.03 with the GeForce FX 5900/5900 Ultra/5950 Ultra, as all the best results in these categories were set by respected members of the overclocking community (Hiwayman, Fugger, Kunaak) with this exact driver. It’s useless to compete with them using another driver.

An attentive reader may ask, “What 5700 Ultra and what 5950? This driver doesn’t recognize the new cards and can’t work with them!” Indeed, it’s impossible to install a freshly-downloaded Detonator 44.03 on the new cards. But there’s one shamanic rite…

The key is simple: add two lines, matching your graphics card, to the nv4_disp.inf file. This file contains a section called [NVIDIA.Mfg] (use the search in Notepad). The section consists of lines of the following kind:

%NVIDIA_NV30.DEV_0301.1% = nv4_SSPoll, PCI\VEN_10DE&DEV_0301

So add another line:

%NVIDIA_NV35.DEV_0332.1% = nv4_NV3x,  PCI\VEN_10DE&DEV_0332

Then there’s the localizable strings section ([Strings]) at the end of the inf file. Add one line to this section too:

NVIDIA_NV35.DEV_0332.1 = "NVIDIA GeForce FX 5900XT"

After that, install the driver as you ordinarily would: it will detect and recognize your GeForce FX 5900XT.

If you’re too lazy to do it yourself, download this modified nv4_disp.inf file with all the info about the 5900XT and place it into the folder with the unzipped Detonator 44.03, overwriting the original file.
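
If you’d rather script the edit, here is a hypothetical Python helper that inserts the two lines described above right after the corresponding section headers. The [NVIDIA.Mfg] and [Strings] names match the Detonator inf layout of that era, but check your copy of the file before running it:

    # Hypothetical helper: patches nv4_disp.inf from the Detonator 44.03
    # package so that it recognizes the GeForce FX 5900XT (device 0332).
    DEVICE_LINE = "%NVIDIA_NV35.DEV_0332.1% = nv4_NV3x,  PCI\\VEN_10DE&DEV_0332\n"
    STRING_LINE = 'NVIDIA_NV35.DEV_0332.1 = "NVIDIA GeForce FX 5900XT"\n'

    def patch_inf(path="nv4_disp.inf"):
        with open(path, encoding="latin-1") as f:
            lines = f.readlines()
        patched = []
        for line in lines:
            patched.append(line)
            header = line.strip().lower()
            if header == "[nvidia.mfg]":    # the device section
                patched.append(DEVICE_LINE)
            elif header == "[strings]":     # the localizable strings section
                patched.append(STRING_LINE)
        with open(path, "w", encoding="latin-1") as f:
            f.writelines(patched)

    patch_inf()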

The effect of the driver change is only noticeable in 3DMark 2001, so there’s no point in installing Detonator 44.03 for gains in games. The 3DMark difference is impressive, though, especially if you’ve been running the same card on 53.03 for a long time…

Of course, this is the purest of application-specific optimizations, but an extra 1,500 points is a real gift for people who strive for the top – and a totally legal one (at least for the 5900XT, from any point of view).

Still, not everything is simple with version 44.03. I didn’t examine this driver with other graphics cards (although, judging by the 690/1080MHz of Hiwayman’s card, the 5900 Ultra at least has no such problem), but the 5900XT loses GPU frequency headroom after its installation. I cannot explain it, but the switch from Detonator 53.03 to 44.03 cost 45MHz of GPU frequency and 35MHz of memory frequency: the above-mentioned 645/1075MHz turned into 600/1040MHz. Even so, this didn’t prevent the card from scoring an extra 1,500 points with Detonator 44.03.

Gentlemen, Start Your Engines!

Now it’s time to describe the process of achieving the record result and the result itself.

All the tests were run on our testbed, aka the “benchmarking dragster”. The key components of the system are described below.

The CPU was cooled by the most efficient off-the-shelf phase-change cooling system for computers, the nVENTIV Mach II GT. As I said above, a dedicated Koolance EXOS system with an original Koolance water-block for the GPU was installed on the graphics card. We also added copper heatsinks to the memory chips and provided additional airflow with a 120mm Sunon fan (75CFM). We performed a Vdimm modification on the mainboard to feed 3.35v to the memory modules.

The most efficient operational mode of the CPU (the multiplier is unlocked, so we can choose) is 305x14, i.e. about 4270MHz. The memory works in the 5:4 mode (about 244MHz) with the lowest timings (2-5-2-2).
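
For reference, the arithmetic behind those clocks, in the same Python as the earlier sketches:

    fsb = 305            # MHz, front-side bus clock
    print(fsb * 14)      # 4270 MHz CPU core clock with the 14x multiplier
    print(fsb * 4 / 5)   # 244 MHz memory clock in the 5:4 mode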

Let’s now get back to the beginning of our experiments and analyze the performance growth.

We had a top-end system and a score of 22 thousand without any extreme methods – it was hard to believe that we would add about 6 thousand more to this number.

LaikrodiZ tried to get the utmost out of the card without phase-change cooling of the CPU. After the Vgpu mod, at 615/955MHz and with the CPU clocked at 3800MHz (ASUS P4P800 mainboard, Kingston HyperX KHX3200 memory and a Sirtec 420W PSU), he scored 23,522. That still stands as the second-best result in the 5900XT rating.

Then I took the card and continued on my own system. The 25-thousand mark was conquered rather easily, and 26K yielded after the Vmem modification. As a result, I stopped at about 26,200. Until then I had been using Detonator 53.03, since version 44.03 is seldom used with the 5900XT. But after 26 thousand I realized I could go higher. The best result for the full-fledged GeForce FX 5900 is 27,166 (Kunaak, XtremeSystems.org Team); the best for the 5950 Ultra is 28,055 (Fugger, XtremeSystems.org Team); and the absolute record among all NVIDIA graphics cards was set by Hiwayman on the GeForce FX 5900 Ultra – an astonishing 29,048. All these records were set with Detonator 44.03, so I decided to switch drivers.

After spending a lot of time working around the driver’s peculiarities (the card would hang at a random moment of the test once the GPU frequency exceeded 600MHz), we managed to make all the system components work right and achieved an impressive score of 27,882. In the unofficial rating across all NVIDIA graphics cards we took third position, after Hiwayman with the 5900 Ultra and Fugger with the 5950 Ultra. We also made it to the second page of the overall ORB rating with our sub-$200 graphics card (as of mid-May 2004). Not bad for a value product, yeah?

And then I pressed a little too hard…

What You Should NOT Do to Your 5900XT

Vcore is originally 1.4v on 5900XT-based graphics cards. The chip itself could probably have worked at twice that voltage, given enough cooling, at least for some time. Indeed, it is not easy to kill the graphics core with too high a voltage (short of overheating it) – you’d have to be either hopelessly unlucky or a downright botcher. But the current passes through the power-supply circuitry before reaching the chip, and not all elements of that circuit have a big safety margin…

So, never increase Vcore of 5900XT-based graphics cards above 2.0v!

There’s an inconspicuous element on the PCB marked IOR 334H. Raise the voltage above 2 volts and you will almost certainly burn it out. Even considering how few people volt-mod the 5900XT (mostly for lack of good guides – I hope this article helps to improve the situation), I know of three absolutely identical cases of this element burning out.

Well, mine was the fourth… All was well at 1.95v, and I decided to raise Vcore just a little, by about 0.1v, to 2.06v. The card started a test run, but the weak link in the power-supply circuit broke down and victoriously committed suicide through self-immolation.

The card went off to the repairmen, and its further fate depends on the availability of the necessary chip, while I move on to more interesting benchmarking objects: the RADEON X800 Pro and GeForce 6800 Ultra.

Thinking the situation over later, I concluded that one more simple modification might have spared me the trouble: the burned-out element and the other power-supply components run very hot, especially after volt-modding. Gluing heatsinks onto them would most probably have solved the problem…

Conclusion

As you can see, skill and the right approach can make the GeForce FX 5900XT perform wonders, show outstanding results and bring joy to its owner. None of the above-described modifications is very sophisticated, so don’t hold back from tapping the potential of your graphics card. Good luck!