August 1, 2011 – Data centers continue to consume electricity at a rapid rate, though growth has slowed in recent years compared with earlier projections. That slowdown is due to reduced installation of new equipment, not efficiency improvements, according to a new study from IT energy analyst Jonathan G. Koomey.
The study, “Worldwide Electricity Used in Data Centers,” updates Koomey’s 2008 study of the same name. Servers remain the largest and most significant energy drains in data centers. The new study found that electricity used by data centers worldwide grew by more than 50 percent from 2005 to 2010, far short of the doubling seen from 2000 to 2005. In the U.S., data center energy consumption rose 36 percent over the past five years, roughly two-thirds less than the growth rate of the five years prior, according to Koomey.