For 14 years, I (and other scientists) have been debunking a myth created by the coal industry that the Internet is an energy hog. My first debunking (coauthored with California Energy Commissioner Art Rosenfeld) dates back to December 1999, “The Internet Economy and Global Warming,” which the UK Guardian called a “seminal work.” I even testified to Congress on this not once but twice (see Senate testimony here).
In fact, not only is the Internet not an energy hog, it often saves energy through dematerialization and substitution.
Consider those movie videos you stream. You used to have to drive your car to Blockbuster to get them. Then you got DVDs in the mail. Now you get bits over the Internet. Sounds like progress. Heck, you may have noticed that transportation oil consumption in this country peaked years ago in part because Millennials weaned on the Internet and mobile devices aren’t buying cars and driving them as much as previous generations.
Still, everyone from Time magazine to Grist has run myth-filled articles about “The Surprisingly Large Energy Footprint of the Digital Economy” or “Your iPhone uses more electricity than your fridge.” The primary source for these myths is a series of “reports” that Mark Mills wrote for the coal industry, most recently “The Cloud Begins With Coal.”
Now you’d think that a study designed to prove the Internet economy must be fed ever-growing amounts of coal to keep running — a study that just happens to be “sponsored by the National Mining Association and the American Coalition for Clean Coal Electricity” (!) — would raise some red flags for journalists. Alas, no.
But global climate negotiators need not fear they are threatening to take down the Web. Just 60 seconds on Google directs one to countless debunkings of Mills, mostly by Dr. Jon Koomey, who is a research fellow at Stanford and former staff scientist at Lawrence Berkeley National Laboratory, where he became the world’s foremost authority on the electricity consumption of the Internet.
Dr. Koomey is working on a post for Climate Progress that redoes some of the erroneous calculations Mills and others are pushing for the umpteenth time. In the meantime, he has a quick post up, “Wild claims about electricity used by computers that just won’t die (but should),” in which he links to more than a dozen debunkings of Mills over the past 14 years.
As Dr. Koomey puts it:
Bottom line: Mr. Mills has made so many incorrect claims that he simply shouldn’t be treated as a serious participant in discussions about electricity used by information technology (IT) equipment. He cherry picks numbers to suit his narrative, and creates the appearance of doing real research by including many footnotes, but almost invariably he overestimates the amount of electricity used by IT equipment. Last time many important people were misled by his antics. I hope they are smarter this time.
Hope springs eternal, but, sadly, so does misinformation.
At least MSN got the story right, explaining why the new claims are false corporate propaganda, “Three other researchers doubt the study, a separate paper conflicts with it, and the support behind the research comes with an obvious agenda.” MSN points out, “Gernot Heiser, a professor at the University of New South Wales in Sydney and co-author of a 2010 study on power consumption in smartphones, echoed Koomey’s sentiments that Mills’ work was flawed.”
Below is an excerpt from a 2010 Climate Progress post, “Debunking the myth of the internet as energy hog, again: How information technology is good for climate.” (See also my January 2009 CP post, “Ignore the media hype and keep Googling — The energy impact of web searches is very LOW.”)
For some reason, the power used by computers is a source of endless fascination to the public. Most folks think that the power used by computers is a lot more than it actually is, and that it’s growing at incredible rates. Neither one of these beliefs is true, but they reflect a stubborn sense that the economic importance of IT somehow must translate into a large amount of electricity use. That incorrect belief masks an important truth: Information technology has beneficial environmental effects that vastly outweigh the direct environmental impact of the electricity that it consumes.
That’s the start of guest commentary from Dr. Jonathan Koomey, a project scientist at Lawrence Berkeley National Laboratory…
Here is more of what he wrote:
Back in 1999, a cleverly written article was published in Forbes magazine, claiming that the Internet used 8% of all US electricity, that all computers (including the Internet) used 13% of US electricity, and that this total would grow to half of all electricity use in ten to twenty years. Most major U.S. newspapers and business magazines, many respected institutions, and politicians of both political parties cited these assertions (the first one even came up in Doonesbury at about the same time). Alas, most people took leave of their critical faculties when evaluating them.
Joe Romm, Amory Lovins, and I spent a few person-years of effort between us demonstrating in the scientific literature that these assertions were all false (for a compilation of that work, go here). The Internet, as defined by the Forbes authors, used less than 1% of US electricity in 2000, all computers used about 3%, and there is no way, short of repealing the laws of arithmetic, for the total to grow to half of all electricity use in one to two decades (see the Epilogue in Koomey 2008 for a summary). Joe also showed that the high-level statistics on growth in energy, electricity, and carbon emissions in the Internet era all showed exactly the opposite of what the above claims would imply: the growth rates were significantly lower in the Internet era (1996 to 2000) than in the preceding four-year period, even though GDP growth was higher during the Internet era.
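The “laws of arithmetic” point is easy to check for yourself. Here is a minimal back-of-the-envelope sketch in Python; the 3% and 50% figures come from the paragraph above, and everything else is just compound-growth math (not a calculation from Koomey’s papers):

```python
# Back-of-the-envelope check of the Forbes claim: could computers' share
# of US electricity grow from ~3% (the actual figure circa 2000) to 50%
# within one to two decades? Compute the compound annual growth rate in
# *share* that would be required -- growth over and above any growth in
# total electricity use.

def required_annual_growth(start_share, end_share, years):
    """Compound annual growth rate needed to move between two shares."""
    return (end_share / start_share) ** (1 / years) - 1

for years in (10, 20):
    rate = required_annual_growth(0.03, 0.50, years)
    print(f"Over {years} years: {rate:.1%} per year, every year")
```

Sustaining roughly 15% (over 20 years) to over 30% (over 10 years) annual growth in electricity share, compounded for the entire period, is what the Forbes claim quietly assumed — which is why no plausible accounting ever supported it.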
Unfortunately, variations of these myths persist to this day. In early 2009, the normally reliable Sunday Times of London reported that generating the electricity needed for a Google search emitted half as much carbon as did boiling a cup of tea, but this claim proved to be spurious (see Mills and Koomey 2009). As recently as April 12, 2010, Energy Tribune published an op-ed by Robert Bryce repeating the falsehoods in the Forbes article and confusing several important issues on this topic. And the ongoing concern over total electricity used by data centers continues to generate news coverage (for example, see this article in the Guardian), even though these facilities use only about 1% of world electricity use and their efficiency is improving rapidly over time.
In my view, the really important story is that while computers use electricity, they are not a huge contributor to total electricity consumption, and while it’s a good idea to make computers energy efficient, it’s even more important to focus on the capabilities information technology (IT) enables for the broader society. Computers use a few percent of all electricity, but they can help us to use the other 95+% of electricity (not to mention natural gas and oil) a whole lot more efficiently.
As an example of this latter point, consider downloading music versus buying it on a CD. A study that is now “in press” at the peer-reviewed Journal of Industrial Ecology showed that the worst case for downloads and the best case for physical CDs resulted in 40% lower emissions of greenhouse gases for downloads when you factor in all parts of the product lifecycle (Weber et al. 2009). When comparing the best case for downloads to the best case for physical CDs, the emissions reductions are 80%. Other studies have found similar results (see Turk et al. 2003, Sivaraman et al. 2007, Gard and Keoleian 2002, and Zurkirch and Reichart 2000). In general, moving bits is environmentally preferable to moving atoms, and whether it’s dematerialization (replacing materials with information) or reduced transportation (from not having to move materials or people, because of electronic data transfers or telepresence), IT is a game changer.
Another area where IT can help us is in getting smarter and more capable, so we can use our resources more efficiently. This could take the form of better sensors and controls in buildings and industry, like the wireless sensor networks that can be quickly and cheaply distributed in existing structures without wiring. Or it could involve more widespread use of software to make better energy-related decisions, such as Lawrence Berkeley National Laboratory’s Home Energy Saver or the private sector tool called Wattbot, both of which I’ve worked on over the years. Or it could involve computer controls in automobile engines, which reduce criteria pollutant emissions and improve fuel economy at the same time. Or it might mean smart meters that track electricity use minute by minute. Or it might involve the various companies that scan utility bills for big corporations and “roll up” those bills into analysis software that gives companies visibility into their actual energy costs (see, for example, AdvantageIQ). All of these examples and more are enabled by cheap, abundant, and powerful information technology.
And there is good reason to believe that trends in information technology are going to make these positive developments even more pervasive and important. We’re all familiar with Moore’s law, which describes the rate of change in transistors per chip over time (doubling every year from the mid-1960s to the mid-1970s, and doubling every two years since the mid-1970s), with correspondingly rapid reductions in costs per transistor. However, few people are aware that there’s a similarly regular trend in the electrical efficiency of computers that has persisted for two decades longer than Moore’s law, and applies to all electronic information technology, not just microprocessors. The electrical efficiency of computation, defined as the number of computations we can do per kilowatt-hour consumed, has doubled roughly every year and a half since the mid-1940s (see Koomey et al. 2009, below)….
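To get a feel for how fast that trend compounds, here is a minimal sketch in Python. The 1.5-year doubling time is the figure quoted above; the specific year spans are just illustrative:

```python
# Sketch of the efficiency trend described above (often called "Koomey's
# law"): computations per kilowatt-hour doubling roughly every 1.5 years.
# Gains multiply, so a fixed doubling time means exponential improvement.

def efficiency_gain(years, doubling_time=1.5):
    """Multiplicative gain in computations per kWh after `years`."""
    return 2 ** (years / doubling_time)

print(f"Over 15 years: {efficiency_gain(15):,.0f}x more computations per kWh")
print(f"Over 60 years: {efficiency_gain(60):.2e}x more computations per kWh")
```

Fifteen years at that pace is ten doublings, or roughly a thousandfold improvement — which is why a laptop today can do work that once required a data center’s worth of electricity.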
IT is one technology that should give us hope about meeting an aggressive warming limit of 2 degrees C (or less) from preindustrial times. Never before has society had to confront a challenge like this, but never before have we had such a powerful technology moving so rapidly in the right direction. And if we combine ubiquitous mobile computing with rapid advances in solar photovoltaic technologies (like in the Big Belly trash compactor, for example), the possibilities for truly game changing societal innovation are breathtaking.
Of course, this story is as much about personal and institutional change as it is about technology, and without a focus on the human and organizational evolution (as well as a stiff price on carbon) we’ll continue on our currently unsustainable path. But one important piece of facing the climate challenge is falling rapidly into place: Information technology allows us to dematerialize, reduce transportation emissions, and get smarter faster. There’s no time to waste in putting it to work.
So Google, YouTube, blog, and Flickr as much as you want. If you are worried about your carbon footprint, buy 100% green power and do an efficient retrofit on your house to cover your emissions — and let the Internet keep saving people energy and resources.
Indeed, replacing material consumption and transportation with electricity is almost certainly a good thing from a climate perspective, since it is considerably easier to generate carbon-free electricity than it is to have carbon-free transportation or carbon-free versions of books and newspapers and inventories and offices.