

Improved Computer Performance Translates into Greener Computing Overall


July 12th, 2012 by Nicholas Brown
Although the power consumption of computers rarely decreases by much, their performance improves drastically every single year (for the common PC).

And the processing-power improvements of the past couple of decades actually matter a great deal to the energy situation the world is in.

Original IBM PC.

Computer technology evolves faster than most other technologies, and it has outpaced other electronics so thoroughly that it is actually replacing them.

My point: If the power consumption of a computer were to remain at ~40 watts (it varies) while the amount of information it processes doubled, that would effectively double its efficiency relative to older computers. So, even though the average power consumption of computers has not been decreasing drastically, their performance per watt has been improving, and this translates into real energy savings.
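To frame the same arithmetic as a metric, what matters is performance per watt. Here is a minimal sketch in Python; the throughput figures are made up purely for illustration:

```python
# Performance per watt: the efficiency metric behind this argument.
# Throughput figures below are hypothetical, for illustration only.

old_ops_per_sec = 1.0e9   # an older machine's processing rate
new_ops_per_sec = 2.0e9   # doubled performance
watts = 40                # power draw assumed unchanged

old_perf_per_watt = old_ops_per_sec / watts
new_perf_per_watt = new_ops_per_sec / watts

print(new_perf_per_watt / old_perf_per_watt)  # 2.0 -- efficiency doubled
```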

If performance doubles, one computer can do the job of two older ones. Two older computers would draw a combined 80-plus watts, while a single new one typically consumes about 40 watts. This is most relevant to server and supercomputer networks, because those setups spread a large workload across multiple machines (distributed computing), since no single computer can handle it alone.

If a web server setup consisting of 100 computers is replaced with newer machines that can each handle twice the workload while still drawing 40 watts, only 50 of the new computers are needed to replace the 100 old ones. The 100 old computers would draw a combined 4,000 watts; the 50 new ones would draw half as much: 2,000 watts.
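The consolidation math is simple enough to sketch in a few lines of Python. This is a back-of-the-envelope model only; the 40 W draw and the 2× speedup are the illustrative figures from the example above, not measurements:

```python
import math

def consolidation_savings(old_count, old_watts, speedup, new_watts):
    """Estimate the power saved by replacing old servers with faster ones.

    old_count -- number of old servers in the setup
    old_watts -- average draw of one old server, in watts
    speedup   -- how many old servers one new server can replace
    new_watts -- average draw of one new server, in watts
    """
    new_count = math.ceil(old_count / speedup)  # round up to whole machines
    return new_count, old_count * old_watts, new_count * new_watts

# The example above: 100 servers at 40 W each, replaced by machines
# that each do the work of two while still drawing 40 W.
new_count, old_total, new_total = consolidation_savings(100, 40, 2, 40)
print(new_count)             # 50 servers
print(old_total, new_total)  # 4000 W vs. 2000 W
```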

This means that people running web server setups built from old computers can easily cut their power consumption roughly in half by upgrading to faster machines (the efficiency improvement could even be an order of magnitude greater if the old computers are many years old).

Another fact worth noting about computer power consumption: the wattage printed on the power supply unit (PSU) label is not the computer's actual power consumption. It is the peak power the supply can deliver, not the nominal energy usage. That rating has been creeping upward over the years, which leads people to think computers are using more electricity when they are simply becoming more powerful.
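A quick illustration of how misleading the label can be. The numbers here are hypothetical, assuming a 500 W-rated supply in a machine that actually draws about 60 W at the wall under a typical load:

```python
# The PSU label is a capacity rating, not a consumption reading.
# Both wattage figures below are hypothetical.

HOURS_PER_YEAR = 24 * 365

psu_label_watts = 500  # the peak the supply is built to deliver
measured_watts = 60    # what a wall-socket power meter might actually show

label_kwh = psu_label_watts * HOURS_PER_YEAR / 1000
actual_kwh = measured_watts * HOURS_PER_YEAR / 1000

print(f"Naive estimate from the label: {label_kwh:.0f} kWh/year")
print(f"Estimate from measured draw:   {actual_kwh:.0f} kWh/year")
```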

Just to clarify: Upgrading your own personal computer may not necessarily save you energy. The points above are general; you need to evaluate the specifics of your current computer and of the one you'd be upgrading to in order to gauge your savings potential.

Unfortunately, overall computer power demand has been increasing for years, and that has offset these efficiency improvements.

Photo Credit: dottavi




About the Author

Nicholas Brown writes on CleanTechnica, Gas2, Kleef&Co, and Green Building Elements. He has a keen interest in physics-intensive topics such as electricity generation, refrigeration and air conditioning technology, energy storage, and geography. His website is: Kompulsa.com.



  • Goaway

    This posting is very ill-informed.

    Yes – computing efficiency is always improving, but this is more than offset by hugely increased consumption.

    Yes – there is progress and awareness on power issues – but these are slow, incremental changes.

    • http://www.kompulsa.com/ Nicholas

      I know that computing power demand has been increasing, but we are referring to distributed computing systems. There are many distributed server systems that are not exploding in size like many others are. As a matter of fact, the organizations doing this would have to grow in popularity to increase server power demand, and this is not the case. Many of them are relying on old computers, and they could handle the same server load with fewer computers.

      My main point is that power demand grows with or without the fast computer efficiency improvements we have been seeing. Be thankful for these improvements; without them, overall power consumption would have exploded even more. You’re being pessimistic. People and organizations around the world (especially in developed countries) are still adopting computers for the first time. If computer performance didn’t increase so much, they would be using far more computers than are needed now for the same tasks, because traditional computers simply cannot do as much.

  • Jonesey

    Where is the data behind these assertions? I worked in IT for 15 years, and I *never* saw two computers replaced by one computer. The author of this post should look up Parkinson’s Law, which applies to computers more than to just about anything else.

    Increased computing power (processor speed, hard drive space, network bandwidth) has so far led inexorably to increased use of that power to do more stuff (processing more data that went unprocessed before, storing more data, transferring more over the network), not to doing the same amount of stuff with fewer computers or other related hardware.

    There are opportunities for saving energy in the IT field, but this post does not explore them.

    • http://www.kompulsa.com/ Nikodean1

      Did you work with distributed computing systems?

      I made it clear in this post that it is specifically about distributed computing for server and supercomputer systems. Yes, I know that computing power demand has been increasing, but please remember that if we stuck with the old computers, they would not be able to handle that increased workload, and, in a case where surplus computers were used to compensate for increasing demand, they would end up wasting more energy than modern ones do. Also, imagine how much more energy would be used if old computers were used to do the same job.

      Far more would be needed. My point is that improved computer performance has facilitated all of this; even though power demand increases, computer improvements compensate for it. It is just technological advancement. The computing power demanded by server networks does not automatically balloon to match that of modern computers every time new ones are introduced.

      You also failed to realize that this post specifically pertains to improved computer performance, not “how” to improve computing efficiency. 

    • http://www.kompulsa.com/ Nicholas

      I admit that computer power demand has been increasing, but what the article says about new computers being able to handle greater workloads at a much higher efficiency is technically correct. I read a little about Parkinson’s law. Thank you for pointing out what you did. By the way, I did not say that computers have been replaced by fewer ones, but, in the scenario I mentioned, it can be done, assuming the users are not just another example of Parkinson’s law.
