Power Consumption, Noise, Overclockability

Despite the increased clock rates of the GF114 chip, Nvidia claims its TDP has grown only from 160 to 170 watts compared to its predecessor. So, we can expect the GeForce GTX 560 Ti to have about the same level of power consumption, too. We verified this claim by running our standard series of electrical tests on our traditional testbed with the following configuration:

  • Intel Core 2 Quad Q6600 CPU (3GHz, 1333 MHz FSB x 9, LGA775)
  • DFI LANParty UT ICFX3200-T2R/G mainboard (ATI CrossFire Xpress 3200 chipset)
  • PC2-1066 SDRAM (2x2 GB, 1066MHz)
  • Enermax Liberty ELT620AWT PSU (620W)
  • Microsoft Windows 7 Ultimate 64-bit
  • CyberLink PowerDVD 9 Ultra/"Serenity" BD (1080p VC-1, 20 Mbit)
  • Crysis Warhead
  • OCCT Perestroika 3.1.0

The new testbed for measuring the electrical characteristics of graphics cards uses a card designed by one of our engineers, Oleg Artamonov, and described in his article PC Power Consumption: How Many Watts Do We Need?. As usual, we used the following benchmarks to load the graphics accelerators:

  • CyberLink PowerDVD 9: FullScreen, hardware acceleration enabled
  • Crysis Warhead: 1600x1200, FSAA 4x, DirectX 10/Enthusiast, "frost" map
  • OCCT Perestroika GPU: 1600x1200, FullScreen, Shader Complexity 8

Except for the maximum load simulation with OCCT, we measured power consumption in each mode for 60 seconds. We limited the run time of OCCT: GPU to 10 seconds to avoid overloading the graphics card's power circuitry. Here are the results:

Of course, the extra active subunits and the higher frequencies affect the power consumption of the new card: it needs 160 watts in 3D applications whereas the GeForce GTX 460 1GB needed only 140 watts. We guess that's an acceptable price for the considerable improvements in performance. Moreover, the new card is even more economical in desktop mode. When playing video, the card does not drop its clock rates immediately, and our testbed reports the peak power consumption; the average is lower, at 18-25 watts. If you connect two monitors with different resolutions simultaneously, the GeForce GTX 560 Ti will not switch into the 51/101MHz mode, using the 405/810MHz mode instead.
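The peak-versus-average distinction above can be sketched in a few lines. The sample trace below is entirely synthetic, purely for illustration; the real readings come from the measurement card, and the mode durations follow the procedure described earlier:

```python
# Minimal sketch of the logging procedure: sample power over each test
# mode's window, then report peak and average draw. All values here
# are illustrative, not actual measurements.

MODE_DURATION_S = {
    "CyberLink PowerDVD 9": 60,
    "Crysis Warhead": 60,
    "OCCT: GPU": 10,  # shortened to protect the card's power circuitry
}

def summarize(samples_w):
    """Return (peak, average) watts from a series of power samples."""
    peak = max(samples_w)
    mean = sum(samples_w) / len(samples_w)
    return peak, mean

# e.g. a video-playback trace that spikes before the clocks drop:
trace = [45, 40, 30, 22, 20, 19, 18]
peak, mean = summarize(trace)
print(f"peak {peak} W, average {mean:.1f} W")
```

This is why a card can look power-hungry in a peak reading while averaging far less over the whole playback session.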

Interestingly, it is the bottom connector (12V 6-pin) that has the highest load in the power-saving modes. But in the standard mode each connector carries a load of 5.5 to 5.9 amperes, or about 70 watts. Thus, there is indeed no need for 8-pin PCIe 2.0 power connectors.
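The per-connector wattage follows directly from the measured currents on the nominal 12 V rail. A quick worked check (connector names are illustrative):

```python
# Worked check of the per-connector load figures quoted above: P = V * I
# on each 12 V line. Connector labels are illustrative.

RAIL_VOLTAGE = 12.0  # volts, nominal PCIe 12 V rail

# Measured currents (amperes) under load, per the 5.5-5.9 A range
measured_current = {
    "6-pin connector #1": 5.5,
    "6-pin connector #2": 5.9,
}

for name, amps in measured_current.items():
    watts = RAIL_VOLTAGE * amps
    print(f"{name}: {amps:.1f} A -> {watts:.1f} W")

# Each connector stays near the 75 W rating of a 6-pin PCIe plug,
# so 8-pin (150 W) connectors are indeed unnecessary here.
```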

Like the senior models of the series, the GeForce GTX 560 Ti can monitor the electric current in its 12V power lines. If the load is too high, which is typical of such stress tests as OCCT: GPU or FurMark, the GPU is switched into a low-performance mode as a protective measure. This feature is optional and may be disabled in custom-designed versions of the GeForce GTX 560 Ti. The OCCT: GPU diagram above illustrates the protective mechanism: the power consumption graph is jagged, similar to those of the GeForce GTX 580 and GTX 570.
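The protection logic can be sketched roughly as follows. This is a hedged illustration of the behavior described above, not Nvidia's actual firmware: the power limit and clock values are assumptions (the limit is placed near the card's 170 W TDP, and the clocks near the card's known 822 MHz and 405 MHz states):

```python
# Illustrative sketch of the current-monitoring protection: if total
# draw on the 12 V lines exceeds a limit, drop to a low-performance
# state; otherwise run at full clocks. Thresholds are assumptions,
# not Nvidia's real firmware values.

POWER_LIMIT_W = 170.0     # assumed cap near the card's 170 W TDP
FULL_CLOCK_MHZ = 822      # reference core clock
THROTTLED_CLOCK_MHZ = 405 # assumed low-performance state

def select_core_clock(rail_currents_a, rail_voltage=12.0):
    """Pick a core clock from instantaneous 12 V rail currents."""
    total_power = sum(rail_currents_a) * rail_voltage
    if total_power > POWER_LIMIT_W:
        return THROTTLED_CLOCK_MHZ  # protection kicks in (FurMark/OCCT)
    return FULL_CLOCK_MHZ

# A FurMark-like spike (8 A + 8 A = 192 W) triggers throttling:
print(select_core_clock([8.0, 8.0]))   # -> 405
# Gaming-like loads of 5.5-5.9 A per line stay at full clocks:
print(select_core_clock([5.5, 5.9]))   # -> 822
```

The oscillation between these two states under a sustained stress load is what produces the jagged power consumption graph mentioned above.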

We guess such protection is appropriate because unrealistic loads like FurMark may damage the graphics card. If you want to run such stress tests anyway, you can use special options, e.g. in GPU-Z; its developers will surely add support for the GeForce GTX 560 Ti as well.

Overall, the new card from Nvidia is quite competitive in terms of power consumption. Although not as economical as the Radeon HD 6870, it is going to be faster in games. It looks good compared to the Radeon HD 6950, too. The GeForce GTX 560 Ti carries on the good tradition of power efficiency started by the GeForce GTX 580.

To check the card's temperature we used a second sample of the GeForce GTX 560 Ti with its original thermal interface. At a room temperature of 25-27°C the GPU reached 78°C. This is a very good result that testifies to the efficiency of the cooler.

As for noise, the card is about equally quiet in 2D and 3D modes because the fan works at 40% in the former and 45% in the latter, even with Crysis Warhead loading the card. The noise level meter could barely detect any difference at a distance of 5 centimeters from the testbed. At a distance of 1 meter, the noise level is only 1 dBA higher than the base level of 38 dBA. In other words, the graphics card was inaudible amidst the noise from the other system components. Admittedly, our testbed is far from quiet, yet the GeForce GTX 560 Ti is not a loud card in any case.

Summing up this section of the review, we can say that the GeForce GTX 560 Ti has a well-balanced combination of electrical, thermal and acoustic characteristics. Now let’s see how it performs in games.
