Measuring Power Consumption
It’s easy to grasp the idea of measuring a graphics card’s power consumption – just recall your basic physics course and the legacy of the wise Georg Ohm. By measuring the voltage on a power circuit and multiplying it by the current flowing through that circuit, we get the amount of power the graphics card draws from it. Since the card is fed from several power circuits, we then sum up the results for each of them to get its total power consumption.
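The arithmetic can be sketched in a few lines of Python; the rail voltages and currents below are purely illustrative numbers, not actual measurements:

```python
# Power drawn from each supply circuit: P = V * I (watts = volts * amps).
# The voltage/current pairs here are made up for illustration only.
rails = {
    "12V": (12.0, 4.5),  # (voltage in volts, current in amps)
    "5V":  (5.0, 2.0),
}

# Total consumption is the sum over all power circuits feeding the card.
total_power = sum(v * i for v, i in rails.values())
print(f"Total power: {total_power:.1f} W")
```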
We can measure the current in a power circuit with the help of a shunt inserted into a break in that circuit. Since the current is the same at every point of a series circuit, the current flowing through the shunt is exactly the current the graphics card draws from that circuit.
I used 5W resistors with a resistance of 0.12 Ohm each as shunts, connecting them in parallel in groups of four and assembling a simple adapter to sit between the power cable and the graphics card:
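A quick sketch of why four 0.12-Ohm resistors in parallel give the 0.03-Ohm shunt mentioned below:

```python
# Parallel resistors combine as 1/R_total = sum(1/R_i).
# Four identical 0.12-ohm resistors per shunt, as described in the text.
resistances = [0.12] * 4
shunt = 1 / sum(1 / r for r in resistances)
print(f"Shunt resistance: {shunt:.2f} Ohm")  # 0.03 Ohm
```

A side benefit of paralleling four resistors is that the dissipated power is shared between them, so the 5W rating is never approached.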
You can see in the snapshot that this adapter contains two perfectly identical shunts, attached to the 5V and 12V rails. The two middle wires – the grounds – go from one connector to the other without breaks:
The resulting resistance of each shunt was 0.03 Ohm – good enough for me, as the voltage drop across the shunts never exceeded 0.15V even under the heaviest loads. Overall, the resistance should be low enough for the voltage not to sag too much across the shunt, but high enough for the drop to be measured by an ordinary voltmeter. I used a professional digital multimeter from UNI-T, the UT70D model.
Let’s move on. The voltage drop across the shunt divided by the shunt resistance gives the current flowing through it. Multiplying this current by the voltage that reaches the card after the shunt yields the power the graphics card consumes on that rail.
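Putting the whole measurement together in a small sketch (the readings below are illustrative placeholders, not my actual measurements):

```python
# One rail's power from shunt measurements.
# All three input values are illustrative, not real readings.
shunt_resistance = 0.03  # ohms (four 0.12-ohm resistors in parallel)
v_drop = 0.12            # volts, measured across the shunt
v_rail = 11.90           # volts, measured on the card side of the shunt

current = v_drop / shunt_resistance  # Ohm's law: I = U / R
power = current * v_rail             # P = I * U
print(f"Current: {current:.1f} A, Power: {power:.1f} W")
```

The same calculation is repeated for each rail, and the results are summed to get the card’s total power draw.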