Although the power consumption of computers rarely decreases drastically, their performance improves substantially every year (at least for the typical PC).
And the processing power improvements of the past couple of decades actually matter a great deal to the energy situation the world is in.
Computer technology evolves faster than most other fields, and has outpaced other electronics so thoroughly that it is now replacing them.
My point: If the power consumption of a computer were to remain at ~40 watts (it varies) while the amount of information it processes doubled, that would effectively double its efficiency relative to older computers. So even though the average power consumption of computers has not been decreasing drastically, their performance per watt has been improving, and that translates into real energy savings.
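That performance-per-watt idea can be sketched in a few lines of Python. The 40-watt figure comes from the article; the workload units are purely illustrative placeholders:

```python
# Sketch of performance-per-watt, using illustrative numbers.
# The 40 W figure is from the article; the workload units are hypothetical.

def performance_per_watt(work_units: float, watts: float) -> float:
    """Efficiency expressed as work processed per watt consumed."""
    return work_units / watts

old = performance_per_watt(work_units=1.0, watts=40.0)  # baseline machine
new = performance_per_watt(work_units=2.0, watts=40.0)  # double throughput, same power

print(new / old)  # → 2.0: same wattage, double the work, double the efficiency
```

The key point is that efficiency here is a ratio, so it doubles whenever throughput doubles at constant power.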
If performance doubles, one computer can do the job of two older computers. Two older computers would use about 80 watts of power combined; one newer computer typically consumes 40 watts. This is most relevant to servers and supercomputers, because those setups often spread a large workload across multiple machines (distributed computing) that no single computer could handle alone.
If a web server setup consisting of 100 computers is replaced with newer computers that can each handle twice the workload while still using 40 watts, only 50 new computers are needed to replace the 100 old ones. The 100 old computers would use a combined 4,000 watts of power, while the new ones would use half as much: 2,000 watts.
This means that people running web server setups on old computers can roughly halve their power consumption simply by upgrading to faster machines (the energy efficiency improvement could even be an order of magnitude greater if the old computers are many years old).
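The consolidation arithmetic above can be checked with a short sketch. The server count, wattage, and 2× speedup are the article's figures, not measured data:

```python
# Sketch of the server-consolidation example from the text:
# 100 old servers at 40 W each, replaced by new servers that each
# handle twice the workload at the same 40 W.

old_servers = 100
watts_each = 40
speedup = 2  # each new server does the work of two old ones

new_servers = old_servers // speedup  # 50 servers needed
old_power = old_servers * watts_each  # 4,000 W total before
new_power = new_servers * watts_each  # 2,000 W total after

print(old_power, new_power, old_power - new_power)  # → 4000 2000 2000
```

Swapping in a larger speedup (say, 10× for very old hardware) shows how the savings scale with the performance gap.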
Another fact to note about computer power consumption is that the wattage printed on the power supply (PSU) label is not the computer's actual power consumption. It is the peak power handling capacity of the power supply, not the nominal energy usage. PSU ratings have been increasing over the years, which leads people to think computers are using more electricity when they are simply becoming more powerful.
Just to clarify: upgrading your own personal computer will not necessarily save you energy. The points above are general; you need to compare the specifics of your current computer and the one you'd upgrade to in order to gauge your savings potential.
Unfortunately, computer power demand has been increasing for years, and that has offset much of these efficiency improvements.
Photo Credit: dottavi
I have a keen interest in physics-intensive topics such as electricity generation, refrigeration and air conditioning technology, energy storage, geography, and much more. My website is: Kompulsa.