Back in May, a major study, California’s Energy Future — the View to 2050, was released by an independent state science and technology advisory panel. It had two central findings:
- California can achieve emissions roughly 60% below 1990 levels with technology we largely know about today if such technology is rapidly deployed at rates that are aggressive but feasible.
- We could further reduce 2050 greenhouse gas emissions to 80% below 1990 levels with significant innovation and advancements in multiple technologies that eliminate emissions from fuels. All of these solutions would require intensive and sustained investment in new technologies plus innovation to bridge from the laboratory to reliable operating systems in relatively short timeframes.
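To make the two percentage targets concrete, here is a minimal arithmetic sketch. The 427 MMT CO2e figure used as the 1990 baseline is the California Air Resources Board's AB 32 estimate, assumed here purely as an illustrative input; the report's findings do not depend on this exact number.

```python
# Illustrative arithmetic: what "X% below 1990 levels" means in absolute terms.
# The 1990 baseline below is an assumed input (CARB's AB 32 estimate).
BASELINE_1990_MMT = 427.0  # million metric tons CO2e (assumed)

def target_emissions(baseline: float, pct_below: float) -> float:
    """Absolute emissions ceiling for a 'pct_below percent under baseline' target."""
    return baseline * (1 - pct_below / 100)

print(round(target_emissions(BASELINE_1990_MMT, 60), 1))  # ~170.8 MMT CO2e
print(round(target_emissions(BASELINE_1990_MMT, 80), 1))  # ~85.4 MMT CO2e
```

In other words, the 80% target leaves only about half the emissions budget of the 60% target, which is why the report says the last step requires eliminating emissions from fuels.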
This report is an incredibly strong endorsement of the “deploy, deploy, deploy, research & develop, deploy, deploy, deploy” strategy that I and others have been advocating. In fact, the report explicitly states that failing to adopt “Aggressive efficiency measures for buildings, industry and transportation” and “Aggressive electrification to avoid fossil fuel use” would “significantly increase the 2050 emissions.”
Amazingly, Revkin asserts the exact opposite in “A Reality Check on Ambitious Climate Targets.” Certainly, examples of misreporting on energy and climate in the NY Times are legion, as we’ve seen. But Andy Revkin’s latest head-exploding post easily wins the “Charlie Sheen” award.
A leading journalist and climate expert, Robert Collier, debunked Revkin’s “real spinning of the report” — see “Sticking the long knives into energy efficiency” (reposted below). It’s worth spending some time on this because the report’s actual conclusions and implications are very important to understand.
I have long asserted that it is not possible to make a positive contribution to the climate debate if you don’t spell out what your emissions or temperature target (or range) is. Revkin’s post proves that conclusively, as I will show.
Revkin claims in his post:
Given that California is a best-case scenario* compared to other states (and, of course, countries) far more dependent on coal, Long’s piece and the underlying report pose a strong challenge to those calling for a “deploy, deploy, deploy” approach to cutting climate risks.
This is a link to — and swipe at — me, needless to say.
Blunder number one is for Revkin to assert the report challenges the aggressive deployment strategy for meeting ambitious climate targets. Quite the reverse. The report makes clear that without aggressive deployment, the target can’t possibly be reached.
Revkin added the asterisk (*) because, buried way, way at the bottom of his post is this Postscript:
In a Twitter reaction, Alan Nogee, the former clean-energy program director for the Union of Concerned Scientists, noted that California’s lack of coal dependence makes it more a worst case than a best case, because it doesn’t have a lot of coal emissions that might be relatively easily displaced.
Duh. Rather than an asterisk, Revkin should simply remove his misleading error.
The fact is that California has been pushing efficiency and low-carbon electricity aggressively since the 1970s. It is considerably more efficient in its use of energy than almost every other state. For a long time now the CO2 intensity of its electricity (CO2/MWh) has been nearly half that of the rest of the nation. So obviously the rest of the country — which is far more coal-intensive and inefficient — has considerably more low-hanging fruit for emissions reductions.
That’s blunder two.
Blunder three is really the most amazing and amusing.
Revkin appears to be unaware that a 60% reduction vs. 1990 levels is the target that the IPCC believes the rich countries (Annex I) should adopt if the goal is to stabilize at 550 ppm CO2-eq. I discussed the science underlying this at length two years ago. The key chart is in the full Working Group III report (Box 13.7, page 776).
Now 550 ppm CO2-equivalent is about 450 ppm CO2 (because of the warming from the other greenhouse gases), and it means ultimately stabilizing at 3°C (5.4°F) above preindustrial levels using the “best estimate” of climate sensitivity — see the IPCC’s Synthesis Report “Summary for Policymakers” (Table SPM.6).
Of course, Revkin continues to this day to endorse only his vague R&D-focused “energy quest” and to criticize those of us (including the National Academy of Sciences) who push for strong emissions reductions starting now. Since Revkin refuses to tell us what level of concentrations he thinks the world should aim for, even a broad range, say 450 ppm to 550 ppm, he retains the luxury of attacking those who are willing to state their target. He maintains a faux high ground that they are being politically unrealistic, while pretending his essentially do-nothing do-little* strategy is scientifically or morally viable, which it ain’t.
That said, based on his new post, Revkin apparently thinks the target should be stronger than 550 ppm CO2-eq. After all, it’s quite clear from the California report, which he does not dispute, that we should be able to meet the 60%-below-1990-levels target by aggressively deploying existing technology. And yet Revkin says the report is a strong challenge to those of us who believe our climate strategy should be based on aggressive deployment. Apparently, then, that target is too weak for Revkin, since major technology advances are needed only for the stronger target.
On the other hand, it’s hard to believe that he supports the 450 ppm CO2-eq target, which is roughly stabilization at 2°C given the IPCC’s best estimate for climate sensitivity. He has spent so much time criticizing me and others who do lay out strategies to meet that target (and yes, those strategies include more R&D; everybody but the hard-core libertarians and fossil fuel types supports more clean energy R&D).
Moreover, if Revkin does believe in the stronger target, his post makes even less sense. He would be implying that, because we can only go most of the way with existing technology, we MUST NOT start aggressive deployment until we have every piece of technology available. Otherwise, why not start aggressive deployment now?
Obviously, the report he cites doesn’t take that absurd view, since it would mean a staggeringly greater amount of emissions in the near term — which means we would need even more breakthroughs and an incomprehensibly fast rate of deployment. There just is no logic underlying Revkin’s post or his critique of aggressive deployment.
The bottom line is that by failing to spell out what target or range he supports, Revkin’s critique of aggressive deployment implodes. Indeed, it backfires. It proves he cannot make a positive contribution to the debate until he spells out his climate target.
For the record, I do not know a single environmentalist who would not gladly agree to a bill requiring a nation-wide 2050 GHG target of a 60% reduction below 1990 levels — with aggressive deployment plus R&D and a reevaluation of the target every 10 years based on advances in science and technology.
Revkin seems painfully unaware that one of the best ways to get major technology advances — if not the best way — is deployment, not R&D (as I’ve explained many times; see “The breakthrough technology illusion”), and in any case the two aren’t mutually exclusive.
Finally, it bears repeating that, as we learned in 2009, “The world will have to spend an extra $500 billion to cut carbon emissions for each year it delays implementing a major assault on global warming, the International Energy Agency said on Tuesday.”
Aggressive deployment (along with more R&D) is the only cost-effective strategy if you want to avoid catastrophic global warming.
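A back-of-the-envelope sketch of what that IEA figure implies, treating the $500 billion as a flat per-year increment (a simplification; the real escalation need not be linear):

```python
# IEA's 2009 estimate: each year of delay adds ~$500 billion to the
# eventual cost of cutting emissions. Linear accumulation is assumed
# here purely for illustration.
EXTRA_COST_PER_YEAR_USD = 500e9

def cumulative_delay_cost(years_delayed: int) -> float:
    """Total extra cost, in dollars, of delaying action by the given number of years."""
    return EXTRA_COST_PER_YEAR_USD * years_delayed

print(f"${cumulative_delay_cost(10) / 1e12:.1f} trillion")  # a decade of delay: $5.0 trillion
```

Even under this crude linear assumption, a decade of delay swamps the near-term cost of aggressive deployment.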
Here is Collier’s must-read piece:
Sticking the long knives into energy efficiency
A new, authoritative study has concluded that California can reduce its total greenhouse gas emissions by 60 percent from 1990 levels by 2050 using technologies that already exist or are in demonstration. By nearly any measure, that’s good news. It shows that serious action on global warming is feasible right now and does not require futuristic technological breakthroughs that might never come to fruition.
But this message appears to have been bungled with a miserably bad communications strategy by the report’s managers. As a result, the report is being spun as bad news that undercuts California’s strategy of using energy efficiency regulations to reduce emissions.
The study, California’s Energy Future — the View to 2050, was released in May by the California Council on Science and Technology, an independent state advisory panel. The document received almost zero coverage by the mainstream and new media, and it sank like a stone from the public view.
This past week, however, the study’s co-author, Jane Long, a top official at Lawrence Livermore National Laboratory, wrote an article published in the scientific journal Nature that subtly changed her own report’s emphasis.
Long focused on the glass half empty — the fact that California’s global warming law mandates that 2050 emissions reflect a reduction of 80 percent, not 60 percent. She explained at length the perfectly valid point that reaching 80 percent would require major technological breakthroughs. Yet she twisted a knife into the report’s main strategy for getting to 60 percent:
Some say that we can radically reduce emissions with only a major emphasis on efficiency, or just by changing our behaviour. But what if it doesn’t add up?
One might reply that in the current and foreseeable political climate, simply reaching 60 percent would be a colossal achievement. Getting to 80 percent is a pipe dream. Starting now with the politically accessible, low-hanging fruit — regulation-driven efficiency improvements — is what’s needed.
A cynic also might say that Long’s spin dovetails with Livermore Lab’s deep vested interest in obtaining research funding for so-called “breakthrough technologies.” Unlike the nearby Lawrence Berkeley National Laboratory, with which it is often confused, Livermore Lab focuses most of its energy/climate work on next-generation technologies, such as carbon capture and storage and nuclear fusion. Berkeley Lab, in contrast, is the Vatican of energy efficiency work, and it has played a central role in the development of California’s global warming policies.
But the real spinning of the report was done by Andrew Revkin at the New York Times. Revkin is a prolific and often interesting blogger who also is a highly partisan advocate of the “breakthrough technologies” camp in climate politics.
Revkin devoted a long column Friday to Long’s article. He gave only a brief, dismissive mention to the 60 percent angle and made a not-so-sly dig at his longtime nemesis in the intramural climate wars, blogger and energy efficiency guru Joe Romm of the Center for American Progress:
Given that California is a best-case scenario compared to other states (and, of course, countries) far more dependent on coal, Long’s piece and the underlying report pose a strong challenge to those calling for a “deploy, deploy, deploy” approach to cutting climate risks.
But let’s look at the report itself. It says that getting to the 60 percent mark can be accomplished through several key strategies:
- Aggressive efficiency measures for buildings, industry and transportation.
- Every existing building will either be retrofitted to higher efficiency standards or will be replaced.
- Electrification of transportation and heat wherever technically feasible.
- Developing emission-free electricity production with some combination of renewable energy, nuclear power and fossil fuel accompanied by underground storage of the carbon dioxide emissions, while at the same time nearly doubling electricity production.
- Finding supplies of low-carbon fuel to supply transportation and heat use which cannot be electrified, such as for airplanes and heavy duty trucks, and high quality heat in industry.
- 60 percent of light-duty vehicles will use electricity, so that the average fuel economy will be roughly 70 miles per gallon.
- The electricity generating capacity of the state will be almost entirely replaced and then doubled, and all with near zero-emission technology.
- Infrastructure to produce biofuels — costing tens of billions of dollars — will have to be built.
Not easy, but not impossible. Certainly a good strategy for the short and medium terms. Why has this good news gone so unheralded?
The good news has gone unheralded because most reporters ignore reports like this, and the few who do write about them, like Revkin, devote so much space to misleading or erroneous spin.
* The asterisk of mine above is because science journalist John Rennie — in his critique of an earlier Revkin post — balked slightly at my describing Revkin’s “energy quest” strategy as “essentially do-nothing”: “I’ll differ from Joe in that I don’t consider Andy’s favored approach to be a do-nothing strategy: a quest for cleaner, more affordable energy would be scientifically and morally desirable for plenty of reasons, and it would almost certainly help to reduce future warming eventually. The problem is, there’s a very good chance it would do too little, too late.” Point taken. It is a “do-little” strategy.
Recall the words of Jigar Shah, a solar-industry rock star who founded the pioneering solar company SunEdison. In the first Climate Progress podcast, he candidly shared his views on why doubters of today’s renewable energy technologies are so wrong:
It depends on the person … but often they’re just too ignorant to know better. For some people, technology is not their sweet spot. They have other skills. And so when someone tells them, “technology is not ready,” they just eat up those words … hook, line and sinker and then decide that’s what their talking points are going to be. And with those people it’s just sad that they don’t read more.
Then there are actually people who are diabolical… This is by far the most interesting way to foil the progress of new technologies. That is, by saying that they’re not ready. You know, you see this with the big oil companies. They’ll say: “we need all of the above.” Or they say: “we are huge supporters of solar and wind if only their costs would come down by 20%. Then, you know, if there were big breakthroughs in the technology, we’d be huge supporters.”
No, that actually just means that they don’t love solar and wind. It actually means that they hate those technologies and that, in fact, they are trying to figure out, using white lies, how to undermine those technologies. So we just have to call their bluff, as opposed to saying: “oh my god, they’re our friends because they said something that seems to resonate with me.” They’re not your friend. They’re actually trying to figure out how to play a nice PR trick to marginalize you.
Jigar actually thinks we could reduce CO2 emissions about 50% cost-effectively with existing technologies, but that by the time we finished doing so in a couple of decades, we’d have another array of cost-effective strategies to take us down another 50%.
The time to act is now. Anyone who says otherwise doesn’t know what they are talking about — or is intentionally deceiving you.
Deploy, Deploy, Deploy, Research & Develop, Deploy, Deploy, Deploy.

NOTE: I realize that both Revkin and Dr. Jane Long, the study’s co-leader whom he cites, don’t understand Rob Socolow’s wedges and their relationship to these IPCC targets, so I am going to do a separate post spelling all this out.
- World’s Engineers: “The Technology Needed to Cut the World’s Greenhouse Gas Emissions by 85% by 2050 Already Exists”
- Breaking: Socolow reaffirms 2004 ‘wedges’ paper, urges aggressive low-carbon deployment ASAP