How Much Energy Does the Internet Use?

by Mike Rinaldi on 6/18/12 4:11 PM

“No trees were killed in the sending of this message, but a large number of electrons were terribly inconvenienced.” It’s a funny email signature, but how many inconvenienced electrons does it take to power the internet?

In 2011, the digital universe (the total amount of information created and replicated) reached 1.8 trillion gigabytes, and it is doubling in size every two years. Much of that information is housed in data centers around the world, and running those data centers requires a huge amount of electrical energy.

A 10-megawatt (MW) data center can use as much energy as a small town, at a cost of around $300,000 a month. Couple that with the fact that there are over 500,000 data centers in the world, according to Emerson Network Power, and data centers account for roughly 2% of all electrical energy used globally. Assuming 20.3 petawatt-hours as the world's annual electrical energy consumption, that means running the internet uses roughly 406 terawatt-hours per year.
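
For readers who want to check the math, here is a quick back-of-the-envelope sketch in Python. The 2% share and 20.3 petawatt-hour figure come from the paragraph above; the $0.04-per-kWh electricity rate is an assumption used only to show how a $300,000 monthly bill for a 10 MW facility is plausible.

```python
# Back-of-the-envelope check of the figures above. The electricity price
# is an assumed value for illustration, not measured data.
world_annual_twh = 20_300      # 20.3 petawatt-hours expressed in terawatt-hours
data_center_share = 0.02       # ~2% of global electricity use

data_center_twh = world_annual_twh * data_center_share
print(f"Data centers: ~{data_center_twh:.0f} TWh per year")   # ~406 TWh

# Rough monthly bill for a 10 MW facility at an assumed $0.04 per kWh.
facility_kw = 10_000           # 10 MW in kilowatts
hours_per_month = 730          # average hours in a month
price_per_kwh = 0.04           # assumed average commercial rate, USD
monthly_cost = facility_kw * hours_per_month * price_per_kwh
print(f"Monthly energy bill: ~${monthly_cost:,.0f}")           # ~$292,000
```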

The odd thing is that in traditional data centers, only half of the energy consumed does useful work in the digital universe: powering the servers that hold our emails, social networking profiles, and the like. The other half goes into cooling those servers, or is lost as heat when electricity is converted between alternating current (AC) and direct current (DC).
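
In data center terms, that split is captured by Power Usage Effectiveness (PUE), the ratio of total facility energy to the energy delivered to IT equipment. A minimal sketch, using hypothetical kilowatt-hour figures, shows how a 50/50 split works out to a PUE of about 2.0:

```python
# Illustrative Power Usage Effectiveness (PUE) calculation. The kilowatt-hour
# figures are hypothetical; only the 50/50 split comes from the text above.
it_load_kwh = 500_000          # energy that actually powers the servers
overhead_kwh = 500_000         # energy spent on cooling and AC/DC conversion losses

total_facility_kwh = it_load_kwh + overhead_kwh
pue = total_facility_kwh / it_load_kwh
print(f"PUE: {pue:.1f}")       # 2.0 -- only half of each kilowatt-hour reaches the IT load
```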

 


Internet Energy Usage

Click here to read the full energy usage article

Topics: Emerson Network Power, data center infrastructure, reduce cost, Data Center, data center design, kW per rack, data center infrastructure management, DVL, electrical distribution, reduce downtime, data center outages
