Testbed and Methods
I tested the Zalman Reserator 1 system and the ZM-GWB1 water block on a testbed configured as follows:
- Intel Pentium 4 2400MHz CPU overclocked to 3600MHz (800MHz FSB overclocked to 1200MHz); Vcore was left at its default value;
- ASUS P4C800 Deluxe mainboard;
- NVIDIA GeForce FX 5900 Ultra graphics card;
- 2x256MB Kingston HyperX PC3500 DDR SDRAM, CL 2.0.
The Pentium 4 overclocked to 3.6GHz generates quite an amount of heat even without a core voltage increase, while the GeForce FX 5900 Ultra graphics card consumes more power and dissipates more heat than the RADEON X800 Pro or GeForce 6800 GT, and only slightly less than the RADEON X800 XT Platinum Edition. Thus, the Zalman Reserator 1 system together with the ZM-GWB1 block found itself in a very harsh operating environment.
I performed my tests in the following manner: first, I started the system up and left it idle for 2.5 hours (the Windows XP Desktop was on the screen). Then I launched programs that loaded the CPU and the graphics card and watched the temperatures change for 2.5 hours more.
I used the latest version of the Motherboard Monitor to read the CPU temperature and RivaTuner to read the GPU temperature. The temperature of the water in the Reserator and of the room air was measured with a Fluke 54-II thermometer.
I used the CPU water block alone in my first test and loaded the CPU with two copies of the BurnP6 utility. The X-axis of the following graph contains the time passed since the launch of BurnP6 (in minutes); the Y-axis shows the temperature.
Well, 28°C in the Idle mode and 48°C in the Burn mode are good CPU temperatures. Note that the temperatures of the water and the CPU rose in step: the difference between them remained almost constant at about 8°C.
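The near-constant CPU-to-water delta suggests a roughly fixed thermal resistance between the CPU die and the coolant. As a rough illustration only (the CPU's actual heat output was not measured in this test; the 100W figure below is purely an assumption), the effective resistance of the water block can be estimated like this:

```python
# Rough estimate of the water block's effective thermal resistance.
# The ~8 degC CPU-to-water delta stayed nearly constant under load,
# which points to a fixed thermal resistance between die and coolant.
# The CPU power figure is an ASSUMPTION for illustration, not a measurement.

assumed_cpu_power_w = 100.0  # hypothetical heat output of the overclocked P4
delta_t_c = 8.0              # measured CPU-to-water temperature difference

thermal_resistance = delta_t_c / assumed_cpu_power_w  # degC per watt
print(f"Effective thermal resistance: {thermal_resistance:.2f} degC/W")
```

A lower CPU heat output would give a proportionally higher resistance figure, so treat this only as an order-of-magnitude sketch.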
Note also the behavior of the water temperature in the system: it went up rather quickly at first, but the rate of growth diminished as the temperature rose. The explanation is simple: as the water gets warmer, the difference between the temperature of the Reserator's surface and that of the room air grows, and this facilitates the transfer of heat into the air. The water keeps heating up until the system reaches balance at a certain temperature, where the amount of heat generated by the CPU equals the amount of heat given out to the ambient air.
After 2.5 hours of “heating”, the system had nearly reached equilibrium: the water temperature increased by a mere 0.3°C in the last half-hour. I say “nearly” because the system, with its 6.5kg of aluminum and 2.5l of water, has tremendous thermal inertia. Evidently, the CPU load in most real-life applications is going to be lower than with special-purpose CPU-grilling utilities, and the temperature of the processor will be lower, too. The same goes for short-term peak loads: the system won't heat up in so short a time, so the CPU temperature won't reach the values I got in my tests.
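The heat-up behavior described above can be sketched as a first-order lumped thermal model: constant heat input into the water, heat loss proportional to the water-to-room temperature difference. The heat input and loss coefficient below are assumptions chosen only to illustrate the exponential approach to equilibrium and the large time constant implied by 6.5kg of aluminum plus 2.5l of water:

```python
import math

# First-order lumped thermal model of the Reserator.
# C * dT/dt = P - k * (T - T_room)
# P (heat input) and k (loss coefficient) are ASSUMPTIONS for illustration;
# the heat capacity uses the stated 6.5 kg of aluminum and 2.5 l of water.

C = 6.5 * 900 + 2.5 * 4186  # total heat capacity, J/K (aluminum + water)
P = 150.0                   # assumed total heat input from CPU + GPU, W
k = 6.0                     # assumed heat-loss coefficient to room air, W/K
T_room = 25.0               # assumed room temperature, degC

T_eq = T_room + P / k       # equilibrium water temperature, degC
tau = C / k                 # time constant of the exponential approach, s

def water_temp(t_seconds):
    """Water temperature after t seconds, starting from room temperature."""
    return T_eq - (T_eq - T_room) * math.exp(-t_seconds / tau)

print(f"Equilibrium: {T_eq:.1f} degC, time constant: {tau / 60:.0f} min")
print(f"After 2.5 h: {water_temp(2.5 * 3600):.1f} degC")
```

With these made-up numbers the time constant comes out to roughly three quarters of an hour, which matches the qualitative observation: the water creeps toward equilibrium and short load spikes barely move it.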
For my second test, I added the GPU water block to the cooling loop and ran a bot match in the Onslaught mode on the Torlan level of Unreal Tournament 2004. This test heats up both the CPU and the graphics card well. I selected the maximum image quality settings in the game, 1600x1200 resolution, 4x full-screen antialiasing and 8x anisotropic filtering.
The rest of the test conditions were the same: 2.5 hours of idleness and then 2.5 hours of Unreal Tournament 2004. The results follow below:
The CPU was less hot than in the previous test, which is no surprise since the physics and game logic calculations in Unreal Tournament 2004 load the CPU less than a special-purpose maximum-load utility does. However, the GPU and the CPU working together generate more heat than in the previous test: the temperature of the water after 2.5 hours of warming up was higher than in the first test.
The GPU temperature measurements confirm my earlier suspicions: the GPU water block is not well suited to its job. Those 80-90°C of GPU temperature are unsatisfactory, to put it mildly.