To preserve a livable climate, we need technology deployment. That’s what drives innovation, as Gates himself used to argue.
So I listened to Bill Gates’ TED Speech a few hours after he gave it in Long Beach, CA. Let’s just call that an IT miracle.
It wasn’t 80% crap like his recent piece on energy.
Quite the reverse, it was more like a miraculous ice cream cone made up of 80% homemade chocolate-chocolate chip ice cream and only 20% bat guano. Curiously, the guano kind of stands out when you lick it, and that’s what people talk about.
Since TED is all hush-hush, most people get only the snippets the media shares, such as HuffPost’s headline: “Bill Gates’ TED Speech 2010: ‘We Need Energy Miracles’.” Mongabay.com reported:
Gates said the world needs to reduce carbon emissions to zero by 2050 and suggested researchers spend the next 20 years developing new technologies and the following 20 years implementing them.
But I’ve got the scoop for you — and I’ll post the transcript when I get it.
Yes, Bill Gates keeps diminishing the value of aggressive action now, which is just plain suicidal. We need both massive technology deployment now and much more innovation. But the former is the sine qua non for having any chance to preserve a livable climate. Ironically, the former is also the key to the latter, something Gates himself used to argue. Strangely, Gates strongly praises Gore’s book even though its main thrust is directly at odds with Gates’.
This post will:
- Look at what’s good in the speech.
- Explain why “Energy Miracles” are widely overrated as a strategy for preserving a livable climate.
- Explain why tech deployment is the key to the kind of innovation Gates wishes for.
- Raise the issue some technologists have raised with me: Is Gates a hypocrite?
WHAT’S GOOD IN GATES’ TED SPEECH
Let’s start with the homemade chocolate-chocolate chip ice cream.
First, Gates has finally gone on the record as to how serious a threat is posed by global warming and unrestricted emissions of greenhouse gases. He warns it could lead to starvation around the planet. He notes:
Now the exact amount of how you map from a certain increase in CO2 to what temperature will be and where the positive feedbacks are, there’s some uncertainty there — but not very much. And there’s certainly uncertainty about how bad the effects will be, but they will be extremely bad.
This is something many, many of us have been waiting for him to do, particularly because of his expansive philanthropic work with Warren Buffett (see Gates Foundation strategy raises key question: Can the problems of the developing world be solved by ignoring global warming?).
Gates is unequivocal on the science: “CO2 is warming the planet.” He understands that we have to get near zero emissions by mid-century, especially the rich countries. He talked to the “top scientists” and learned that “until we get near to zero, the temperature will continue to rise.”
He recognizes “the IPCC is not necessarily the worst case” in terms of impacts — though by now, that conclusion still deserves a “Duh” (see Intro to global warming impacts: Hell and High Water). For the plausible worst-case, see UK Met Office: Catastrophic climate change, 13-18°F over most of U.S. and 27°F in the Arctic, could happen in 50 years, but “we do have time to stop it if we cut greenhouse gas emissions soon.”
He doesn’t attack efficiency and renewables and immediate action with a string of dubious or illogical claims as he recently did (see “Bill Gates disses energy efficiency, renewables, and near-term climate action while embracing the magical thinking of Bjorn Lomborg (and George Bush)”). Woo hoo!
Indeed, he notes that “we do need a market incentive” — a price for carbon either in the form of “cap-and-trade” or an “energy tax.”
He further asserts we can achieve a factor of 3 to 6 in efficiency gain across the board. Here is where he dives into the guano.
He fails to spell out just how aggressive we must be in technology deployment to achieve that efficiency gain. After all, we now have the ability to dramatically increase the efficiency of almost every major human enterprise — cost-effectively. We don’t need energy miracles; we need to address market and regulatory barriers.
He also correctly asserts that even if we achieve all of that efficiency, we can’t possibly solve the climate problem without multiple, massively scaled carbon-free energy sources. He identifies the five most likely candidates for massive scaling as carbon capture and storage, nuclear power, wind, and solar (both PV and solar thermal). But he spends most of his time talking about nuclear and raising questions about renewables (transmission and storage), while pushing the notion that “We Need Energy Miracles.”
By miracles, he says, he doesn’t mean things that are “impossible.” The “microprocessor” and the “personal computer” are the “miracles” he means. As we’ll see, the PC in particular doesn’t match his (new) theory of how you get mass deployment of low cost innovative technology.
He doesn’t diss action now, but says that action now is “equally or maybe less important” than accelerating the pace of innovation breakthroughs.
And yes, when asked about the timescale issue, he does say “we need 20 years to invent and 20 years to deploy” his energy miracles.
Bizarrely, he says “a lot of great books have been written about” this subject and “I’ll be sending you” the new book by Gore, Our Choice. But had he read the book — or even picked it up — then he would have noticed that it is almost directly at odds with his argument. Right there on the back jacket next to Gore’s picture is an excerpt from the introduction by Gore beginning:
It is now abundantly clear that we have at our fingertips all of the tools we need to solve the climate crisis. The only missing ingredient is collective will….
Our Choice gathers in one place all of the most effective solutions that are available now and that, together, will solve this crisis.
As CNN reports, Gates ended his remarks:
If he could wish for anything in the world, Gates said he would not pick the next 50 years’ worth of presidents or wish for a miracle vaccine.
He would choose energy that is half as expensive as coal and doesn’t warm the planet.
While Gates understands we need a price on carbon to make coal power more expensive, what he simply doesn’t understand — or, rather, what he no longer understands — is that the best way to drop the price of carbon-free power is through deployment.
WHY DEPLOYMENT, FAR MORE THAN R&D, IS THE KEY TO BOTH INNOVATION AND STABILIZING AT OR BELOW 2°C
I was acting assistant secretary (and principal deputy assistant secretary) of energy for energy efficiency and renewable energy from 1995 to 1998, helping to run the billion-dollar federal office in charge of research, development, demonstration, and deployment of most low-carbon technologies, including three of Gates’ would-be miracles. For much of that time I was in charge of technology and market analysis for the office. Since then, I have written a number of books on low-carbon technology development and deployment.
So I have thought a lot about whether Gates is right that we need multiple “energy miracles” developed through a $10 billion-a-year government R&D effort to stabilize at 350 to 450 ppm.
Put more quantitatively, the question is this: What are the chances that multiple (4 to 8+) carbon-free technologies that do not exist today can each deliver the equivalent of 350 Gigawatts baseload power (~2.8 billion Megawatt-hours a year) and/or 160 billion gallons of gasoline cost-effectively by 2050? [Note — that is about half of a stabilization wedge.] For the record, the U.S. consumed about 3.7 billion MW-hrs of electricity in 2005 and about 140 billion gallons of motor gasoline.
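A quick sketch makes the half-wedge arithmetic concrete. The 350 GW and 3.7 billion MWh figures are from the text; the ~90% capacity factor is my assumption about what “baseload” implies here:

```python
# Rough check of the half-wedge benchmark in the text:
# 350 GW of baseload capacity, run at a typical baseload capacity factor.

CAPACITY_GW = 350
HOURS_PER_YEAR = 8760
CAPACITY_FACTOR = 0.90  # assumed; not a figure from the post

annual_mwh = CAPACITY_GW * 1000 * HOURS_PER_YEAR * CAPACITY_FACTOR
print(f"{annual_mwh / 1e9:.2f} billion MWh/year")  # ~2.76, matching the ~2.8 above

# For scale: the post says the U.S. consumed ~3.7 billion MWh in 2005
US_2005_MWH = 3.7e9
print(f"{annual_mwh / US_2005_MWH:.0%} of 2005 U.S. electricity use")
```

In other words, each of these hoped-for breakthroughs would have to deliver roughly three-quarters of total 2005 U.S. electricity consumption.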
Put that way, the answer to the question is painfully obvious: “two chances — slim and none.” Indeed, I have repeatedly challenged readers and listeners over the years to name even a single technology breakthrough with such an impact in the past three decades, after the huge surge in energy funding that followed the energy shocks of the 1970s. Nobody has ever named a single one that has even come close.
Yet somehow the government is not just going to invent one TILT (Terrific Imaginary Low-carbon Technology) in the next few years — we are going to invent several TILTs comparable to the microprocessor. Seriously. Hot fusion? No. Cold fusion? As if. Space solar power? Come on, how could that ever compete with solar baseload (aka CSP)? Hydrogen? It ain’t even an energy source, and after billions of dollars of public and private research in the past 15 years — including several years running of being the single biggest focus of the DOE office on climate solutions I once ran — it still has no chance whatsoever of delivering a major cost-effective climate solution by midcentury, if ever (see “California Hydrogen Highway R.I.P.”).
I don’t know why the energy miracle crowd can’t see the obvious — so I will elaborate here. I will also discuss a major study that explains why deployment programs are so much more important than R&D at this point. Let’s keep this simple:
- To stabilize below 450 ppm, we need to deploy by 2050 some 12 to 14 stabilization wedges (each delivering 1 billion tons of avoided carbon) covering both efficient energy use and carbon-free supply (see here). The technologies we have today, plus a few that are on the verge of being commercialized, can provide the needed low-carbon energy [see "How the world can stabilize at 350 to 450 ppm: The full global warming solution (updated)"].
- Myriad energy-efficient solutions are already cost-effective today. Breaking down the barriers to their deployment now is much, much more important than developing new “breakthrough” efficient TILTs, since those would simply fail in the marketplace because of the same barriers. Cogeneration is perhaps the clearest example of this.
- On the supply side, deployment programs (coupled with a price for carbon) will always be much, much more important than R&D programs because new technologies take an incredibly long time to achieve mass-market commercial success. New supply TILTs would not simply emerge at a low cost. They need volume, volume, volume — steady and large increases in demand over time to bring the cost down, as I discuss at length below.
- No existing or breakthrough technology is going to beat the price of power from a coal plant that has already been built — the only way to deal with those plants is a high price for carbon or a mandate to shut them down. Indeed, that’s why we must act immediately not to build those plants in the first place.
- If a new supply technology can’t deliver half a wedge, it won’t be a big player in achieving 350-450 ppm.
For better or worse, we are stuck through 2050 with the technologies that are commercial today (like solar thermal electric) or that are very nearly commercial (like plug-in hybrids).
I have discussed most of this at length in previous posts (listed below), so I won’t repeat all the arguments here. Let me just focus on a few key points. A critical historical fact was explained by Royal Dutch/Shell, in their 2001 scenarios for how energy use is likely to evolve over the next five decades (even with a carbon constraint):
Typically it has taken 25 years after commercial introduction for a primary energy form to obtain a 1 percent share of the global market.
Note that this tiny toe-hold comes 25 years after commercial introduction. The first transition from scientific breakthrough to commercial introduction may itself take decades. We still haven’t seen commercial introduction of a hydrogen fuel cell car and have barely seen any commercial fuel cells — over 160 years after they were first invented.
This tells you two important things. First, new breakthrough energy technologies simply don’t enter the market fast enough to have a big impact in the time frame we care about. We are trying to get 5% to 10% shares — or more — of the global market for energy, which means massive deployment by 2050 (if not sooner).
Second, if you are in the kind of hurry we are all in, then you are going to have to take unusual measures to deploy technologies far more aggressively than has ever occurred historically. That is, speeding up the deployment side is much more important than generating new technologies. Why? Virtually every supply technology in history has a steadily declining cost curve, whereby greater volume leads to lower cost in a predictable fashion because of economies of scale and the manufacturing learning curve.
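That predictable cost decline can be sketched with the standard experience-curve formula, where cost falls by a fixed fraction with every doubling of cumulative production. The $4/W starting cost, 1 GW starting volume, and 20% learning rate below are illustrative assumptions, not figures from this post:

```python
import math

def experience_curve_cost(c0, cumulative, x0, learning_rate):
    """Cost after reaching `cumulative` units of production, starting from
    cost c0 at cumulative volume x0, with `learning_rate` cost reduction
    per doubling of cumulative production."""
    progress_ratio = 1.0 - learning_rate   # e.g. 0.8 for a 20% learning rate
    exponent = math.log2(progress_ratio)   # negative, so cost falls with volume
    return c0 * (cumulative / x0) ** exponent

# Illustrative run: $4/W at 1 GW cumulative, 20% learning rate (assumed)
for doublings in range(5):
    volume = 2 ** doublings
    cost = experience_curve_cost(4.0, volume, 1, 0.20)
    print(f"{volume:2d} GW cumulative -> ${cost:.2f}/W")
# prints 4.00, 3.20, 2.56, 2.05, 1.64 — each doubling cuts cost by 20%
```

The point of the sketch is that the cost trajectory is a function of cumulative volume, not of calendar time: only deployment moves you down the curve.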
WHY DEPLOYMENT NOW COMPLETELY TRUMPS RESEARCH
How do we achieve rapid innovation in existing technologies, as Gates suggests he wants?
A major 2000 report by the International Energy Agency, Experience Curves for Energy Technology Policy, has a whole bunch of experience curves for various energy technologies. Let me quote some key passages:
Wind power is an example of a technology which relies on technical components that have reached maturity in other technological fields…. Experience curves for the total process of producing electricity from wind are considerably steeper than for wind turbines. Such experience curves reflect the learning in choosing sites for wind power, tailoring the turbines to the site, maintenance, power management, etc., which all are new activities.
Or consider PV:
Existing data show that experience curves provide a rational and systematic methodology to describe the historical development and performance of technologies….
The experience curve shows the investment necessary to make a technology, such as PV, competitive, but it does not forecast when the technology will break-even. The time of break-even depends on deployment rates, which the decision-maker can influence through policy. With historical annual growth rates of 15%, photovoltaic modules will reach break-even point around the year 2025. Doubling the rate of growth will move the break-even point 10 years ahead to 2015.
Investments will be needed for the ride down the experience curve, that is for the learning efforts which will bring prices to the break-even point. An indicator for the resources required for learning is the difference between actual price and break-even price, i.e., the additional costs for the technology compared with the cost of the same service from technologies which the market presently considers cost-efficient. We will refer to these additional costs as learning investments, which means that they are investments in learning to make the technology cost-efficient, after which they will be recovered as the technology continues to improve.
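The break-even timing in that passage follows from simple compounding: learning depends on cumulative volume, so faster deployment growth reaches the break-even volume sooner. Here is a stylized sketch; the 30x volume ratio is my assumption, chosen only for illustration:

```python
import math

def years_to_breakeven(volume_ratio, annual_growth):
    """Years until cumulative production has grown by `volume_ratio`,
    assuming it compounds at `annual_growth` per year."""
    return math.log(volume_ratio) / math.log(1.0 + annual_growth)

# Suppose break-even requires cumulative PV volume to grow 30x (assumed).
RATIO = 30.0
t_slow = years_to_breakeven(RATIO, 0.15)  # historical ~15%/yr growth
t_fast = years_to_breakeven(RATIO, 0.30)  # growth rate doubled
print(f"15%/yr: {t_slow:.0f} years; 30%/yr: {t_fast:.0f} years")
# ~24 years vs ~13 years
```

Under these assumptions, doubling the growth rate pulls break-even forward by roughly a decade, which is the same order of effect as the IEA’s 2025-versus-2015 comparison.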
Here is a key conclusion:
… for major technologies such as photovoltaics, wind power, biomass, or heat pumps, resources provided through the market dominate the learning investments. Government deployment programmes may still be needed to stimulate these investments. The government expenditures for these programmes will be included in the learning investments.
Obviously government R&D, and especially first-of-a-kind demonstration programs, are critical before the technology can be introduced to the marketplace on a large scale — and I’m glad Obama has doubled spending in this area. But we “expect learning investments to become the dominant resource for later stages in technology development, where the objectives are to overcome cost barriers and make the technology commercial.”
We are really in a race to get technologies into the learning curve phase: “The experience effect leads to a competition between technologies to take advantage of opportunities for learning provided by the market. To exploit the opportunity, the emerging and still too expensive technology also has to compete for learning investments.”
In short, you need to get from first demonstration to commercial introduction as quickly as possible to be able to then take advantage of the learning curve before your competition does. Again, that’s why if you want mass deployment of the technology by 2050, we are mostly stuck with what we have today or very soon will have. Some breakthrough TILT in the year 2025 will find it exceedingly difficult to compete with technologies like CSP or wind that have had decades of such learning.
And that is why the analogy of a massive government Apollo program or Manhattan project is so flawed. Those programs were to create unique non-commercial products for a specialized customer with an unlimited budget. Throwing money at the problem was an obvious approach. To save a livable climate we need to create mass-market commercial products for lots of different customers who have limited budgets. That requires a completely different strategy.
The vast majority — if not all — of the wedge-sized solutions for 2050 will come from technologies that are now commercial or very soon will be. And federal policy must be designed with that understanding in mind. The IEA report concluded:
A general message to policy makers comes from the basic philosophy of the experience curve. Learning requires continuous action, and future opportunities are therefore strongly coupled to present activities. If we want cost-efficient, CO2-mitigation technologies available during the first decades of the new century, these technologies must be given the opportunity to learn in the current marketplace. Deferring decisions on deployment will risk lock-out of these technologies, i.e., lack of opportunities to learn will foreclose these options, making them unavailable to the energy system….
… the low-cost path to CO2-stabilisation requires large investments in technology learning over the next decades. The learning investments are provided through market deployment of technologies not yet commercial, in order to reduce the cost of these technologies and make them competitive with conventional fossil-fuel technologies. Governments can use several policy instruments to ensure that market actors make the large-scale learning investments in environment-friendly technologies. Measures to encourage niche markets for new technologies are one of the most efficient ways for governments to provide learning opportunities. The learning investments are recovered as the new technologies mature, illustrating the long-range financing component of cost-efficient policies to reduce CO2 emissions. The time horizon for learning stretches over several decades, which requires long-term, stable policies for energy technology.
Deployment, deployment, deployment, R&D, deployment, deployment, deployment.
IS GATES A HYPOCRITE?
After Gates put out his first piece dissing energy efficiency and action, I wrote a very critical analysis. Afterwards, a couple of technologists wrote to point out how hypocritical Gates was to push innovation-through-big-government-R&D, given that he has long been touting innovation-through-deployment for his own industry.
As recently as two (!) years ago in a Carnegie Mellon speech, Gates argued:
But Paul Allen and I thought, okay, we’ll do software. We’ll build a platform, and encourage other people to write software. Now, there was an assumption there that we could get millions of machines out, because, after all, if you want to make it economic to spend tens of millions developing software, and sell it for $100 or so, you’ve really got to get that base out there.
But because we made that bet, and we got that going, it became a virtuous cycle. That is, as more machines would sell, it created the market for a broader range of software, and that further drove the market for the machines, and in fact that volume allowed the price of the machine to come down. And that’s why from 1975 onward, the personal computer market actually not only became significant, it actually became the center of the entire computer industry.
The large machines we use today, and the big server farms, or corporate data servers, these are all based on the Windows PC architecture which, because of its volume, has come down in price, and improved in performance very, very dramatically. And so we have a large software industry.
One technologist (who wants to remain anonymous) wrote:
The man built his career on shipping “what we have now” and then improving it, using programmers paid out of the revenues gained from shipping not-quite-yet-ready product. Not one cent of Big Government R&D Breakthrough Command Economy money directly flowed to Microsoft. To be fair, big government R&D did lead to things like the integrated circuit and the Internet, both of which had something to do with enabling Bill’s fortune. His business strategy for his entire life was antithetical to the Lomborg nonsense “don’t do anything until the Big Research Lab In The Sky Makes It Perfect.”
We simply don’t have the time to wait for Energy Miracles, and Gates simply hasn’t proposed the best strategy to achieve his wish — dramatic improvement in performance and a sharp drop in price.
The time to act — to deploy — is now.
- Do we need a massive government program to generate breakthroughs to make solar energy cost-competitive?
- The decarbonization story and why a carbon price beats technology breakthroughs
- The Extreme (plug in) Hybrid — no breakthrough needed!
- Bush climate speech follows Luntz playbook: “Technology, technology, blah, blah, blah.”