We see exactly the same thing during final rendering. Overclocking the processor without adjusting its core voltage can actually save some energy: the task completes sooner, so the total energy consumed drops. More aggressive overclocking, where we do increase the processor core voltage, causes higher power consumption on the contrary, despite the higher performance.
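The tradeoff above comes down to energy being the product of power and time: a modest overclock finishes the render sooner without drawing much more power, while a voltage bump makes power grow faster than the runtime shrinks. A minimal sketch of that arithmetic, with all wattages and speedups being hypothetical illustration values, not measured figures:

```python
# Back-of-the-envelope energy model for a fixed rendering workload.
# All numbers below are hypothetical, chosen only to illustrate the tradeoff.

def task_energy_wh(avg_power_w: float, task_time_h: float) -> float:
    """Energy consumed to finish the task: average power x time."""
    return avg_power_w * task_time_h

# Stock settings: the render takes 1.0 hour at 95 W average draw.
stock = task_energy_wh(95.0, 1.0)

# Overclock at stock voltage: ~15% higher clock cuts render time roughly
# proportionally, while power rises only slightly.
oc_stock_vcore = task_energy_wh(100.0, 1.0 / 1.15)

# Aggressive overclock with raised core voltage: dynamic power grows
# roughly with frequency x voltage^2, so it outpaces the time savings.
oc_raised_vcore = task_energy_wh(140.0, 1.0 / 1.25)

print(round(stock, 1), round(oc_stock_vcore, 1), round(oc_raised_vcore, 1))
# The stock-voltage overclock uses less total energy than stock,
# while the raised-voltage overclock uses noticeably more.
```

The point is that for a fixed amount of work, higher instantaneous power draw can still mean lower total energy, as long as the runtime shrinks faster than the wattage grows.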
Note that quad-core processors are almost always better than dual-core ones in terms of energy efficiency: they deliver higher performance-per-watt, but only in tasks optimized for multi-core systems. At the same time, the energy-efficiency leadership goes to Intel processors: the previous-generation ones based on Core microarchitecture, as well as some newer CPUs based on Nehalem microarchitecture, including Clarkdale and Lynnfield solutions, but not the older Bloomfield ones. It looks like Socket AM3 platforms are far from being the most efficient choices.
3D games create a completely different type of load. Here, processor overclocking doesn't shorten the workload, because how long the game runs depends solely on the gamer, not on the fps rate. Therefore, the dependence between overclocking and power consumption is different in this case.
CPU overclocking may, however, increase the fps rate, namely the number of frames the CPU prepares and transfers to the graphics card for rendering. As a result, it is not only the processor's power consumption that increases: the graphics card's consumption grows as well, because it receives additional load. Therefore, any type of overclocking applied to a gaming system always increases power consumption. Overclocking can only save energy when the time it takes the processor to complete a task is directly tied to its performance. Games are not one of those cases, so there overclocking may only be interesting for the sake of better visual quality and quicker response to the gamer's actions.
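The gaming case can be sketched the same way: the session length is fixed by the player, so total energy scales directly with system power draw, and any overclock that raises CPU and GPU wattage raises the energy bill with it. Again, all wattages here are hypothetical illustration values:

```python
# In a game, the session length is set by the player, not by the CPU,
# so energy scales directly with power draw. Hypothetical numbers only.

def session_energy_wh(cpu_w: float, gpu_w: float, hours: float) -> float:
    """Energy for a gaming session of fixed length."""
    return (cpu_w + gpu_w) * hours

HOURS = 2.0  # the session does not get shorter when fps goes up

# Stock settings.
stock = session_energy_wh(95.0, 150.0, HOURS)

# Overclocked: the CPU draws more, and the extra frames it feeds the
# graphics card push the GPU's draw up as well.
overclocked = session_energy_wh(130.0, 170.0, HOURS)

print(stock, overclocked)
# The overclocked system always consumes more energy over the same
# session, because the time factor never shrinks.
```

Unlike the rendering case, there is no runtime reduction to offset the higher wattage, which is why overclocking a gaming system always costs energy rather than saving it.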