Bracing for the Cloud

Digital Economy Requires Massive Amounts of Electricity


To understand the voracious growth and scale of investment in the global information and communications technology ecosystem, consider this fact: the cloud is now approaching 10 percent of the world’s electricity generation. At the individual level, when you count all components of usage – not just charging – the average iPhone consumes more energy annually than a medium-sized refrigerator. A new report from Digital Power Group CEO Mark Mills, titled “The Cloud Begins with Coal,” calculates just how much electricity (much of it generated from coal) and physical infrastructure will be needed to power our digital economy.

August 13, 2013 | Breakthrough Staff

They weigh less than five ounces, but according to recent data, when you count everything that matters, the average iPhone consumed more energy last year than a medium-sized refrigerator. By the numbers, a refrigerator from the Environmental Protection Agency’s Energy Star ratings list uses about 322 kWh per year. In contrast, the average iPhone accounted for roughly 361 kWh of electricity through its wireless data use alone – and closer to 388 kWh once battery charging and network connections are added in. Considering that a smartphone represents just one device in the ocean of the world’s information and communications technology (ICT) ecosystem, it almost seems superfluous to say that the digital economy is poised to consume massive amounts of energy.

The argument bears repeating, however. Recent media coverage of the cloud suggests that improvements in energy efficiency will curb its energy consumption. In his new report, The Cloud Begins With Coal: Big Data, Big Networks, Big Infrastructure, and Big Power (August 2013), Mark Mills, CEO of Digital Power Group, argues that, despite suggestions otherwise, such efficiency gains will have very little impact on overall ICT power consumption. The study, which was sponsored by the National Mining Association and the American Coalition for Clean Coal Electricity, concurs (perhaps unexpectedly) with earlier studies undertaken by Greenpeace, and further illustrates that the rapid growth of the global digital economy is transforming our energy ecosystem in unprecedented ways.

According to Mills, the global ICT system is now approaching 10 percent of the world’s electricity generation. By current calculations, the cloud uses about 1,500 TWh of electricity annually, roughly equal to the combined electricity generation of Japan and Germany. In the near future, hourly Internet traffic will exceed the Internet’s total annual traffic in the year 2000. The global ICT ecosystem now also consumes as much electricity as all global lighting did circa 1985 (see the graph below).
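For readers who want a sense of scale, here is a quick back-of-envelope sketch. The world generation figure below is an assumed round number in the range of recent IEA totals, not a figure from the Mills report; with it, the 1,500 TWh estimate works out to a share of roughly 7 percent – consistent with “approaching 10 percent.”

    # Rough check of the ICT share of world electricity generation
    # (world_generation_twh is an assumed round figure, not from the report)
    ict_twh = 1500                  # estimated annual ICT electricity use, TWh
    world_generation_twh = 22000    # assumed annual world electricity generation, TWh
    share = ict_twh / world_generation_twh
    print(f"ICT share of world generation: {share:.1%}")   # about 6.8%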


Graph taken from The Cloud Begins With Coal

It is difficult to ascertain how much energy the global ICT system will require, largely because the information age constitutes a ‘blue-whale’ economy in which energy use is largely invisible to the public. In fact, according to Mills, current estimates probably understate global ICT energy use by as much as 1,000 TWh, since up-to-date data remain undisclosed and, as he asserts, many recent trends (such as wireless broadband) have yet to be added to the energy accounting. What we do know, however, is that much of the future electric demand growth – the EIA forecasts a 15 percent aggregate rise in US electric demand over the next two decades – will come from new ICT.
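For scale, that aggregate EIA figure converts to a modest annual rate when spread over two decades; the only input below is the 15 percent figure cited above.

    # Convert the EIA's aggregate 15% rise over two decades to an annual rate
    aggregate_rise = 0.15   # 15% total growth over the period
    years = 20
    annual_rate = (1 + aggregate_rise) ** (1 / years) - 1
    print(f"Implied annual growth: {annual_rate:.2%}")   # about 0.70% per year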

According to Mills, the rapid growth of ICT means that the type of electricity demanded over the next few decades will be significantly different from that of the past. Unlike many of the energy services that have driven past growth – lighting, heating and cooling, transportation – the ICT ecosystem consists of always-on, electricity-consuming devices and infrastructure. In his Forbes column earlier this year, Mills argued: “Demand for resilient and reliable delivery is rising far faster than the absolute demand for power. The future will not be dominated by finding ways to add more renewables to a grid, but by ways to add more resiliency and reliability.” The fundamentally different nature of ICT is demonstrated by comparing its energy density to that of conventional services. As Mills writes:

The average square foot of a [cloud] data center uses 100 to 200 times more electricity than does a square foot of a modern office building. Put another way, a tiny few thousand square foot data room uses more electricity than lighting up a 100,000-square-foot shopping mall.
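A rough illustration of that density gap, using assumed power densities that are in the range of commonly published figures rather than numbers taken from the report (on the order of 1–2 watts per square foot for mall lighting, and 100 or more watts per square foot for a data room):

    # Illustrative comparison: a small data room vs. lighting a large mall
    # (both power densities below are assumed typical values, not from Mills' report)
    data_room_sqft = 3000
    data_room_w_per_sqft = 100        # assumed data-center power density
    mall_sqft = 100000
    mall_lighting_w_per_sqft = 1.5    # assumed lighting power density
    data_room_kw = data_room_sqft * data_room_w_per_sqft / 1000      # 300 kW
    mall_lighting_kw = mall_sqft * mall_lighting_w_per_sqft / 1000   # 150 kW
    print(data_room_kw > mall_lighting_kw)   # True: the data room draws more power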

Previous studies have looked into different aspects of the digital universe. A 2012 report by Greenpeace International called How Clean Is Your Cloud? argued that data centers are a primary driver of electricity demand growth. Researchers estimated that a single data center could require as much electricity as nearly 180,000 homes. Many more data centers (the largest the size of seven football fields) are popping up across the globe in remote towns and suburbs, and, combined, are expected to need upwards of 1,000 TWh annually – roughly as much electricity as Japan uses for all purposes.
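To put the 180,000-home figure in perspective, here is a rough conversion. The per-household number below is an assumption on the order of average US household electricity use, not a figure from the Greenpeace report.

    # Rough scale of a data center using as much electricity as ~180,000 homes
    homes = 180000
    kwh_per_home_per_year = 11000    # assumed average US household consumption, kWh/year
    annual_twh = homes * kwh_per_home_per_year / 1e9     # convert kWh to TWh
    average_load_mw = annual_twh * 1e6 / 8760            # spread over 8,760 hours in a year
    print(f"{annual_twh:.1f} TWh per year, about {average_load_mw:.0f} MW of average load")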


Graph taken from The Cloud Begins With Coal; Source: Microsoft Global Foundation Services

But data centers alone are not responsible for the surge in ICT electricity use. A 2013 study by the Centre for Energy-Efficient Telecommunications (CEET) argued that much of the growth comes from wireless networks, such as Wi-Fi and 3G, used to access cloud services. According to the authors’ calculations, by 2015 the “wireless cloud” will consume up to 43 TWh, compared to only 9.2 TWh in 2012, representing a 460 percent increase. Wireless cloud users worldwide will grow from 42.8 million in 2008 to just over 998 million in 2014, representing a 69 percent annual growth rate.  And Mills’ study extends the ICT energy accounting to include the much broader universe of wireless network connectivity beyond just the cloud.
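The 69 percent annual growth figure can be recovered from the endpoint numbers CEET cites, via a standard compound-growth calculation over the six years from 2008 to 2014; all inputs below are the figures quoted above.

    # Compound annual growth rate of wireless cloud users, 2008-2014 (CEET figures)
    users_2008_millions = 42.8
    users_2014_millions = 998.0
    years = 2014 - 2008
    cagr = (users_2014_millions / users_2008_millions) ** (1 / years) - 1
    print(f"Annual growth rate: {cagr:.0%}")   # about 69%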


Graph taken from The Cloud Begins With Coal; Data Source: Ericsson Mobility Report, June 2013

These growth trends can be seen at the individual level as well. Take the iPhone example given at the outset. Based on the most recent data from NPD Connected Intelligence, the average Verizon Wireless iPhone user consumed about 1.58 GB of data per month in 2012, or roughly 19 GB per year. Multiply 19 GB by 19.1 kWh – the amount of energy A.T. Kearney reports is needed to deliver one GB of data – and you find that the average iPhone uses about 361 kWh of electricity per year for data alone. Add the electricity used to charge the phone annually (3.5 kWh) and the electricity needed for each connection (23.4 kWh), and you have a grand total of roughly 388 kWh per year. By Mills’ calculations, watching an hour of video weekly on a smartphone or tablet consumes more electricity annually in the remote networks than two new refrigerators use in a year.
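The arithmetic behind those figures, laid out step by step (all inputs are the ones cited above; small differences from the rounded 361 and 388 kWh totals in the text come from rounding the monthly data figure):

    # Annual electricity attributable to an average iPhone, using the figures above
    data_gb_per_month = 1.58          # NPD Connected Intelligence, Verizon iPhone users, 2012
    kwh_per_gb = 19.1                 # A.T. Kearney estimate for delivering 1 GB of data
    charging_kwh_per_year = 3.5       # annual battery charging
    connection_kwh_per_year = 23.4    # annual per-connection network overhead
    data_kwh_per_year = data_gb_per_month * 12 * kwh_per_gb   # about 362 kWh
    total_kwh_per_year = data_kwh_per_year + charging_kwh_per_year + connection_kwh_per_year
    print(f"Data alone: {data_kwh_per_year:.0f} kWh; total: {total_kwh_per_year:.0f} kWh per year")
    # Compare with an Energy Star refrigerator at about 322 kWh per year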

As Mills’ analysis notes, the information sector is now the fastest-growing sector in the US economy: over $1 trillion of our economy is associated with information and data, more than twice the share of GDP related to transportation (including vehicle manufacturing). The rapid rise in digital traffic is the driving force behind the enormous growth in global investment in ICT infrastructure (up $8 trillion within a decade). And, as Mills points out – both in the analysis and in the report’s title itself – coal has been, and is forecast to continue to be, the dominant source of global electricity supply.

“In every credible forecast – including from the EIA, IEA, BP, Exxon – coal continues to be the largest single source of electricity for the world,” says Mills. “Coal’s dominance arises from the importance of keeping costs down while providing ever-greater quantities of energy to the growing economies, and as the IEA recently noted, the absence of cost-effective alternatives at the scales the world needs.”

Google said as much in 2011 in its white paper “Google’s Green PPAs: What, How, and Why”:

Neither the wind nor the sun are constantly available resources. They come and go with the weather, while Google’s data centers operate 24x7. No matter what, we’d need to be connected to the grid to access “conventional” power to accommodate our constant load. The plain truth is that the electric grid, with its mix of renewable and fossil generation, is an extremely useful and important tool for a data center operator, and with current technologies, renewable energy alone is not sufficiently reliable to power a data center.

The company scorecard produced by Greenpeace in 2012 further demonstrates coal’s dominance. Of the 14 companies studied, Apple, Amazon, and Microsoft were given the poorest ratings, both for their lack of clean energy sourcing and for their lack of transparency about their cloud infrastructure.


Scorecard taken from Greenpeace report How Clean Is Your Cloud?

If Mills is right that ICT will fundamentally change the way we use electricity – by putting a premium on reliable, round-the-clock power generation – we need to think seriously about how to power the information sector with cheaper, cleaner alternatives to coal. That will require making technologies that can provide reliable, baseload power cheaper and more readily available.

 

Photo Credit: Gadgetadda.com


Comments

  • A very large portion of the energy used at server farms is used for cooling.

    These farms would be far more energy efficient if they were located in Wyoming, where for much of the year cooling can be achieved by opening a window. Also, it is far more efficient to transport data to a server farm in Wyoming than it is to transport coal to a generator almost anywhere else, and that efficiency also leads to big energy savings.

    By jackbenimble on 2013 08 15



  • If you are correct, then I should be able to run my fridge from a little lithium-ion battery. Perhaps an electrical engineer should check your figures.

    By Pat Garret on 2013 08 16



    • I don’t think he’s saying that the smart phone itself expends more energy than a fridge; he’s saying that the overall energy consumption for an operational smart phone, which would also include the energy used to send info over a wireless network like wi-fi/bluetooth/3G/4G, is higher – since most fridges don’t have to request info to be sent over a wireless network. The main difference is that you won’t necessarily see a bigger number on your own electric bill in the short run. But I guess we’re focusing on the big picture here, since somebody somewhere will have to pay for it somehow, which may ultimately make the price of electricity rise in the maybe-not-so-distant future.

      By wait a minute on 2013 08 21



  • Refrigerators and smartphones are not comparable. Such comparisons tell nothing about the net energy impact of smartphones and other computing devices. For example, a smartphone lets you dispense with an always-on landline phone. It lets you manage e-mail without turning on your desktop or laptop computer. It lets you manage your bank account without driving to a bank branch. It lets you watch videos without ordering DVDs to be delivered by Netflix.

    It also lets you read e-books and newspapers without anyone having to pulp millions of trees and truck paper all over the country. In fact, it saves an estimated 98 percent of the energy required to produce and deliver a paper newspaper.  A smartphone also helps you navigate traffic and locate empty parking spaces, saving time and gas. It helps you use car-sharing services and find local transit options so you don’t need to buy a vehicle at all. It even allows you to remotely control a thermostat in your home so you can save energy.  Your refrigerator can’t do any of that, just as your phone can’t chill beer. Why would anyone try to compare the two?

    Many people believe that the power used by computers is a lot more than it actually is, and that it’s growing by leaps and bounds. Neither of these beliefs is true. They reflect a false assumption that the economic significance of IT must lead to high energy usage – another incorrect belief. The truth is that IT has beneficial environmental effects that vastly outweigh the direct environmental impact of the electricity that it consumes.

    The important story is that while computers use electricity, they are but a marginal contributor to total electricity consumption, and while energy efficiency is a good idea, the more important focus should be on the capabilities enabled by IT to serve the broader society. Computers use a few percent of all electricity, but they can help us to use the other 95+ percent of electricity (not to mention natural gas and oil) a whole lot more efficiently. When you look at things in the aggregate, the electricity used by computers pays for itself many times over in efficiency gains. It is about leveraging technology to produce efficiency gains for society as a whole.

    The findings of the study are myopic, in that it misses the big picture and focuses on one aspect while ignoring its context and net impact. It was clear that the author of the study had an interest in promoting coal, just based on the conclusions drawn. Then, seeing that the study was sponsored by the National Mining Association and the American Coalition for Clean Coal Electricity, it makes sense that the author would seek to manipulate the methods of the study to reach pre-defined conclusions that advance the objectives of its sponsors. The author had an interest in the outcome (i.e., to appease the sponsor of the research). Ergo, the “research” was not independent and unbiased. In sum, the “study” is not research; it is just garbage.

    By Zeromein on 2013 09 04



  • I believe this article contains an error (as does one of its sources). The article states that it takes 19.1 kW to deliver 1GB of data. Should that be kWh rather than kW? The source (AT Kearny) has the same error.

    The AT Kearny source also has an additional error in that it states that energy use per connection is 23.4 kW. That amount of power would set users on fire. They probably mean 23.4 kWh per connection per year.

    There is a big difference between a kW and a kWh.

    By Barney Greinke on 2013 11 07

