AMD commissions server power consumption study

— 2:06 PM on February 15, 2007

Intel and AMD have taken turns touting the benefits of power efficiency for a while now, but AMD has kicked things up a notch with a study by Dr. Jonathan G. Koomey, a Staff Scientist at the Lawrence Berkeley National Laboratory. The study (PDF), which AMD commissioned, says total data center power consumption in the United States jumped from 23 billion kWh in 2000 to 45 billion kWh in 2005. That figure covers not only the servers themselves, but also cooling and auxiliary equipment, which account for roughly half of the total. To put things in perspective, 45 billion kWh works out to energy costs of around $2.7 billion per year and represents 1.2% of total U.S. electricity consumption. According to the study, that's also comparable to the amount of power used by all the color TVs in the country.
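
For the curious, here's a quick back-of-the-envelope check of what those figures imply. Keep in mind that the roughly $0.06/kWh average electricity price and the ~14% yearly growth rate below are our own inferences from the numbers above, not values quoted by the study:

# Sanity-checking the study's U.S. figures. The implied price and
# growth rate are inferences from the article's numbers, not
# figures quoted by the study itself.
us_2000_kwh = 23e9   # total U.S. data center use in 2000 (kWh)
us_2005_kwh = 45e9   # total U.S. data center use in 2005 (kWh)
annual_cost = 2.7e9  # reported yearly energy cost (USD)

implied_price = annual_cost / us_2005_kwh                     # ~$0.060 per kWh
implied_growth = (us_2005_kwh / us_2000_kwh) ** (1 / 5) - 1   # ~14.4% per year

print(f"Implied average price: ${implied_price:.3f}/kWh")
print(f"Implied U.S. annual growth: {implied_growth:.1%}")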

Chart: Total electricity use for servers in the U.S. and the world in 2000 and 2005, including cooling and auxiliary equipment. Source: AMD.

Interestingly, the study says server power consumption in the rest of the world climbed from 35 billion kWh to 78 billion kWh over the same period, suggesting that server power demands outside the U.S. are growing at an even faster pace. Working out the compound growth rates from those numbers (see the sketch below) bears that out.
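
A minimal check, again inferring compound annual growth rates from the article's own numbers rather than quoting the study:

# Compound annual growth rates implied by the 2000-to-2005 figures.
row_rate = (78e9 / 35e9) ** (1 / 5) - 1  # rest of world: ~17.4% per year
us_rate = (45e9 / 23e9) ** (1 / 5) - 1   # United States: ~14.4% per year
print(f"Rest of world: {row_rate:.1%}/yr vs. U.S.: {us_rate:.1%}/yr")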

The study concludes that, if power consumption per server remains constant, the amount of electricity used by servers worldwide will go up 40% by 2010. If power consumption per server increases at the same rate as it did between 2000 and 2005, however, the study says worldwide server electricity demands will be 76% higher in 2010 than they were in 2005.
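
In absolute terms, and assuming the worldwide 2005 baseline is simply the sum of the U.S. and rest-of-world figures above (our inference, not a total the study states here), those two scenarios look like this:

# Projecting worldwide server electricity use to 2010 under the study's
# two scenarios. The ~123 billion kWh baseline is the sum of the U.S.
# and rest-of-world 2005 figures above, an inference on our part.
world_2005_kwh = 45e9 + 78e9       # ~123 billion kWh worldwide in 2005
flat_2010 = world_2005_kwh * 1.40  # ~172 billion kWh if per-server power holds steady
trend_2010 = world_2005_kwh * 1.76 # ~216 billion kWh if it keeps climbing
print(f"2010, flat per-server power:   {flat_2010 / 1e9:.0f} billion kWh")
print(f"2010, rising per-server power: {trend_2010 / 1e9:.0f} billion kWh")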
