
Confirming The Human Fingerprint In Global Ocean Warming


Recent warming of just the top 2,300 feet of the ocean corresponds to the energy of more than one Hiroshima atomic bomb detonation every second over the past 40 years. A new analysis of all the recent data makes clear that this remarkable warming can only be explained by man-made greenhouse gas emissions — JR.

by Dana Nuccitelli, via Skeptical Science

Although over 90% of overall global warming goes into heating the oceans, ocean warming is often overlooked, particularly by those who try to deny that global warming is still happening.  Nature Climate Change has a new paper by some big names in the field of oceanography, including Domingues, Church, Ishii, and Santer (Gleckler et al. 2012).  The paper compares ocean heat content (OHC) simulations in climate models to some of the newest and best OHC observational data sets from Domingues (2008), Ishii (2009), and Levitus (2009), which contain important corrections for systematic instrumental biases in expendable bathythermograph (XBT) data.  The paper makes several important points.

  • The 0-700 meter layer of the oceans warmed on average 0.022°C to 0.028°C per decade since 1960.
  • Climate model simulations which include the most complete set of external forcings – natural (solar and volcanic) and anthropogenic (greenhouse gases and sulphate aerosols) – are consistent with the rate of warming observed over the past 40 years (see the 20th Century multimodel response including volcanic forcings [MMR VOL 20CEN] and multimodel response including volcanic forcings projected forward using the IPCC SRES Scenario A1B [MMR VOL SRESA1B] in Figure 1).
  • The ocean warming observed over the past 40 years cannot be explained without anthropogenic greenhouse gas emissions; it is a ‘fingerprint’ of human-caused global warming (Figure 3).

Figure 1: Recent observed ∆T estimates from Domingues et al. 2008 (blue), Ishii et al. 2009 (red) and Levitus et al. 2009 (green) compared with the 20th Century (20CEN) multimodel response (MMR) of phase 3 of the Coupled Model Intercomparison Project (CMIP3) for the subsets of models including volcanic (VOL, black) and no volcanic (NoV, gray) forcings. MMR results are also shown for the CMIP3 SRES A1B scenarios (dashed black and gray), constructed from the same VOL and NoV subsets defined by the 20CEN models. Figure 1c from Gleckler et al. 2012.

Gleckler et al. note that several previous studies have identified a human ‘fingerprint’ in ocean warming; they cite Barnett et al. 2001, Barnett et al. 2005, Pierce et al. 2006, and Palmer et al. 2009.  However, the Gleckler et al. results are more robust because theirs is the first and by far the most comprehensive study to provide an in-depth examination of data and modelling uncertainties, and to use three improved data sets with corrections for instrumental biases (e.g., in XBTs) along with a large number of model runs made available by 13 leading modelling centres from around the world (the CMIP3 model ensemble).

“Several studies have used well-established detection and attribution methods to demonstrate that the observed basin-scale temperature changes are consistent with model responses to anthropogenic forcing and inconsistent with model-based estimates of natural variability. These studies relied on a single observational data set and employed results from only one or two models.”

Accounting for Natural Variability

Before looking for a human-induced fingerprint, Gleckler et al. checked that the models have realistic natural variability (“noise”).

“Before conducting our [detection and attribution] analysis, it is important to verify that the models used here do not systematically underestimate natural variability, particularly on 10-year timescales relevant to the detection of a slowly evolving ocean warming signal”

The authors do find some indications that the models underestimate the observed spatio-temporal variability of 5- and 10-year trends; however, the models would have to underestimate this variability by more than a factor of two to negate the positive identification of an anthropogenic fingerprint in the observed ocean warming over the past 40 years.  Their analysis provides no evidence of a noise error of this magnitude.

The Human Fingerprint

In order to determine the cause of the observed ocean warming during the past 40 years, Gleckler et al. use the leading empirical orthogonal function (EOF), a well-known statistical tool in geophysics, to identify the simulated spatial pattern of response (the fingerprint) to external forcings over the period 1960-1999.

“The leading EOF primarily captures the pronounced change in mean state and exhibits warming in all ocean basins, with consistently larger warming in the Atlantic than in the Pacific.”

In the model runs using anthropogenic forcings (VOL and NoV), the leading EOF is positive in all ocean basins, as is the case for the observational data from all three data sets.  Effectively this means the models expect and the data observe warming in all ocean basins.  However, in the control runs, without any time-varying external forcings (e.g., no human-induced factors), the sign of the leading EOF flips in the different ocean basins, meaning that the models expect cooling (EOF loading < 0) in some basins and warming (EOF loading > 0) in others (Figure 2b).

 

Figure 2: EOF analysis of basin-average upper-ocean temperature changes. Basin-scale structure of externally forced fingerprints from 20CEN runs (a) and leading noise mode from pooled control runs (b). The leading EOF for each of the three different observational ∆T estimates is also shown in a. Observational results are for infilled data sets and all model results are for spatially complete data, with removal of model drift based on a cubic fit to control-run data, and the global mean included in all data. Figure 4 from Gleckler et al. 2012.

The authors then project the time series from the observations and model simulations (VOL, NoV, and control runs) onto the same externally forced VOL model fingerprint pattern (Figure 2a). By examining successive linear trends calculated from these projected time series (technically known as pseudo-principal-component time series), it is possible to detect when the externally forced signal rises above the noise and remains above a statistically significant level (Figure 3).

“Our aim is to search for time-increasing correspondence between the model-predicted ocean warming fingerprint and the observational data sets, and then to determine whether such correspondence could be due to natural variability alone.”

“By fitting overlapping trends of various values of L [trend length] to these pseudo-principal component time series, we can examine the behaviour of signal-to-noise (S/N) ratios as a function of timescale and determine the detection time – the time at which S/N rises above (and remains above) a stipulated (1% or 5%) significance level”
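The projection-and-trend procedure described in these quotes can be illustrated with a toy calculation. This is a sketch with synthetic data, not the authors' method or code: the number of basins, the noise levels, and the record length are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_basins = 40, 6

# Toy forced response: warming in every basin plus noise, standing in
# for the 20CEN multimodel-mean basin-average temperature anomalies.
trend = np.linspace(0, 0.5, n_years)[:, None] * rng.uniform(0.8, 1.2, n_basins)
forced = trend + 0.05 * rng.standard_normal((n_years, n_basins))

# Leading EOF of the forced runs = the spatial "fingerprint" pattern.
anom = forced - forced.mean(axis=0)
_, _, vt = np.linalg.svd(anom, full_matrices=False)
fingerprint = vt[0]  # one loading per basin

# Project "observations" (here: forced response plus extra noise)
# onto the fingerprint to get a pseudo-principal-component series.
obs = forced + 0.05 * rng.standard_normal(forced.shape)
pc = obs @ fingerprint

# Signal: linear trend of the projected series over the full record.
signal = np.polyfit(np.arange(n_years), pc, 1)[0]

# Noise: spread of same-length trends in unforced "control" projections.
control = 0.05 * rng.standard_normal((200, n_years, n_basins))
ctrl_trends = [np.polyfit(np.arange(n_years), c @ fingerprint, 1)[0]
               for c in control]
sn = abs(signal) / np.std(ctrl_trends)
print(f"S/N ratio: {sn:.1f}")  # well above a 1% Gaussian threshold (~2.6)
```

With a real forced trend present, the projected trend dwarfs the spread of control-run trends, which is exactly the "S/N rises above and remains above a stipulated significance level" behaviour the paper describes.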

Gleckler et al. find that the S/N ratio of the volcanic forcing model is consistent with that in the three observational data sets which have been corrected for XBT biases (Figure 3).

“For all three corrected estimates, the S/N is consistently above a 1% significance level, with ratios greater than four by 2003.”

 

Figure 3: S/N ratio as a function of increasing trend length L. A common VOL model noise estimate was used to calculate S/N. The 1% and 5% significance thresholds are shown (as horizontal black and grey lines respectively) and assume a Gaussian distribution of noise trends in the VOL control-run pseudo-principal components. All observational estimates are infilled, all model data are spatially complete and the global mean is included in all data. Figure 5c from Gleckler et al. 2012.

Summary

Frankly, it’s not at all surprising that the warming of the oceans can be primarily attributed to human greenhouse gas emissions.  A 0-700 meter ocean warming at 0.025°C per decade may not sound like a lot, but it corresponds to an energy content of about 2.4×10²² joules per decade, or more than one Little Boy atomic bomb detonation per second, every second over the past 40 years, just accumulating in the uppermost 700 meters of the world’s oceans.
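That comparison is straightforward arithmetic to verify, assuming a Little Boy yield of about 15 kilotonnes of TNT (roughly 6.3×10¹³ joules):

```python
# Ocean heat uptake quoted above: about 2.4e22 joules per decade (0-700 m).
joules_per_decade = 2.4e22
seconds_per_decade = 10 * 365.25 * 24 * 3600   # ~3.16e8 s

heating_rate = joules_per_decade / seconds_per_decade   # watts, ~7.6e13
little_boy_joules = 15 * 4.184e12   # assumed 15 kt TNT yield, ~6.3e13 J

bombs_per_second = heating_rate / little_boy_joules
print(f"{heating_rate:.1e} W ≈ {bombs_per_second:.1f} Little Boys per second")
```

The result is about 1.2 detonations' worth of energy per second, consistent with the "more than one per second" figure in the text.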

Although it’s a challenge for climate models to adequately simulate natural variability, that challenge becomes less of a roadblock when considering long timeframes (like 50 years), a large energy accumulation (like 10²³ joules), and a large area (like the global oceans), as in the Gleckler study.  That immense amount of energy has to come from somewhere, and as we know, human greenhouse gas emissions are the biggest culprit behind the global energy imbalance.  As Nathan Bindoff, an oceanography expert and one of the coordinating lead authors of the detection and attribution chapter of the IPCC AR5, said in reaction to Gleckler et al. and the human attribution of ocean warming,

“We did it. No matter how you look at it, we did it. That’s it.”

In addition to identifying the human ‘fingerprint’ in ocean warming, it’s also interesting that Gleckler et al. demonstrate that, as long as they incorporate volcanic influences, climate models’ OHC simulations are in overall good agreement with the most up-to-date observational data.  This model-data OHC comparison is one we recently discussed, but just for one NASA GISS model.  Gleckler et al., on the other hand, examine several different CMIP3 models and do not identify a notable model-data discrepancy, although it will be interesting to see a comparison using CMIP5 models with updated radiative forcing estimates.

The main takeaway points from Gleckler et al. are:

  • The results significantly improve the confidence levels in human-induced global ocean warming, from “likely” in the IPCC AR4 to “virtually certain” in the IPCC AR5.  One of the main reasons for lower confidence in AR4 was the concern that climate models were not able to simulate the observed decadal (10-year) variability. We now know that the models never simulated the “observed and significantly large decadal variability” because it was mostly caused by systematic errors in the XBTs, not known at that time.
  • Because a temperature increase expands the volume of the ocean and raises sea level, this implies that thermal expansion, a major contribution to the observed global mean sea-level rise over the past 40 years, is also largely human-induced.
  • Figure 3 shows that as we consider longer and longer trends, the human-induced signal becomes more and more evident (stronger relative to the noise). Previous OHC detection and attribution studies used a different method and were not able to show this time evolution of the signal trend.

This piece was originally published at Skeptical Science and was reprinted with permission.


11 Responses to Confirming The Human Fingerprint In Global Ocean Warming

  1. Jim Baird says:

    Since this warming leads to thermal expansion and icecap melting for a millennium, is it not the rational approach to convert as much of this accumulating energy to renewable energy as possible?

    Detractors claim the inefficiency of energy production when the delta T between warm and cold reservoirs is low.

    The same delta T produces one of the most powerful forces in Nature, the hurricane.

    • Mark E says:

      Other readers may be interested in reading the debate Jim recently fomented in this thread at Real Climate.

      I certainly don’t mind businesspeople advocating for their patents, Jim. Instead of coyly introducing your marketing ideas here, why not just be up front about it? What would your thing do again?

      • Jim Baird says:

        What would it do?

        Basic physics teaches that 1 Calorie is equivalent to the work done against Earth’s gravitational field by a mass of 427 kilograms falling a distance of 1 metre.

        Conversely, 1 kilogram falling a distance of 427 metres produces the same result, which is the energy needed to raise 1 kilogram of water by 1 degree centigrade.

        There are significant regions of the tropical oceans where the temperature difference between the surface and deep waters is as high as 21 degrees, which means in these regions there are effectively thermal dams as high as 8967 metres, about 120 metres greater than Mount Everest.

        The theoretical efficiency of a heat engine when the difference in temperature between the hot and cold reservoirs is 21 degrees is about 7 percent. Practically it is closer to 2.5 percent, so the effective height of the untapped ocean dam is reduced to roughly 217 metres, just slightly less than Mica Dam, which stands at 240 metres.
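The arithmetic in the last few paragraphs can be sanity-checked in a few lines; the 298 K (roughly 25 °C) warm-reservoir temperature is an assumed value for a tropical surface, and small differences from the figures above come from rounding:

```python
# Checking the "thermal dam" arithmetic (warm reservoir assumed ~298 K).
G = 9.81                              # gravitational acceleration, m/s^2
kgf_m_per_kcal = 4184 / G             # 1 kcal in kilogram-force metres, ~427

dam_height = 21 * kgf_m_per_kcal      # "dam" height for a 21 degree delta-T
carnot = 21 / 298                     # Carnot limit, ~7 percent
effective_head = dam_height * 0.025   # head at a practical ~2.5% efficiency

print(f"dam ≈ {dam_height:.0f} m, Carnot ≈ {carnot:.1%}, "
      f"effective head ≈ {effective_head:.0f} m")
```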

        Sticking with this analogy, the 0.022°C to 0.028°C temperature increase every decade is raising the head of the theoretical dam as well, to the point that it will shortly be overflowing into our coastal cities.

        The First Law of Thermodynamics notes that heat can be converted to work to reduce the temperature of the ocean, in the same way water flowing over a dam generates power.

        If you have an alternative to offer, Mark, I am all ears.

        • Mark E says:

          Sorry, I was asking about the specific patent and financial interests you have in OTEC, so we can assign appropriate weight to your remarks.

          You ask for my answer, but the first question is “what is the problem”?

          OTEC advocates seem to define the problem as the extra BTUs in the upper ocean, just like the microbe with the binoculars seems to think a lack of additional bottles is the problem.

          Certainly we need technology as we try to come to grips with the real problem, but the rapid and massive “disposal” of surface BTUs into the ocean depths – given our limited knowledge about the ocean – strikes me as a foolish ecological gamble that does not address the real problem.

        • Mark E says:

          PS the screwed up paragraph in my last comment should read….

          OTEC advocates seem to define the problem as the extra BTUs in the upper ocean, just like the microbe with the binoculars (video) seems to think a lack of additional bottles is the problem.

          • Jim Baird says:

            Mark, Joe Romm recently quoted from Martin Hoffert’s DotEarth piece, where he framed the problem in terms of sea levels a hundred meters greater and both poles de-glaciated.

            I side with him, that this would be a disaster of epic proportion.

            In a personal email, Dr. Hoffert stated he was a fan of OTEC but didn’t believe it could provide the amount of energy he was looking for – about 30 terawatts. The reason was the potential to eutrophy the water column by upwelling massive amounts of nutrient-rich cold water, and the potential to overturn the thermohaline circulation, as you point out, with a massive “disposal” of surface BTUs into the ocean depths.


            Gerard Nihous of the University of Hawaii commented at a recent conference on upwelling that the maximum possible number of 100 MW OTEC power plants capable of being supported by our oceans is half a million, which would mean up to 25 terawatts of power.


            Clifford Goudey, former Director of MIT’s Offshore Aquaculture Engineering Center, seems to concur that a deep-water condenser would overcome Hoffert’s and some of your concerns regarding environmental impacts, including the prospect of massive CO2 releases from cold-water upwelling.

            OTEC is in the public domain, and the deep-water condenser was proposed in 1972. I believe, however, that OTEC should be implemented using a heat pipe that would recapture the latent heat of condensation in a returning fluid, to eliminate the massive disposal of BTUs to the depths. With a heat pipe, the amount of heat extracted from the evaporator equals the amount of heat dumped at the condensing end, so 30 TWe could be produced by moving 60 TW from the ocean’s surface and dumping 30 TW to the depths. Since this is a closed system, I believe there would be little to no environmental impact other than a gradual cooling of the ocean, which would counteract sea level rise.

            I too, however, believe there is an element of the too-few-bottles aspect to sea level rise, which I have proposed could be addressed by converting liquid ocean volume to hydrogen by electrolysis – using the offshore power created by OTEC plants – and by capturing melting runoff, such as British Columbia is currently experiencing, and moving this water currency to the world’s deserts, which are the only bottles big enough to have any impact on sea level rise.

            I have a number of patent applications for these solutions as a consequence of 25 years of self-financed research, which I hope to recoup at some point, along with a reasonable return on my investment.

            I make no apology for this, and I firmly believe this approach is the only way sea level rise can be turned back and the consequences of Hoffert’s “holocaust of some as yet unknown horror” can be avoided.

          • Mark E says:

            Thank you for partial disclosure of your personal stake. We definitely need good ideas.

            In my view, it is typical human short-sightedness to claim the method is a closed system because what we are talking about here is the _climate system_. Using your numbers,

            A = B + C where

            A= 60 TW surface BTUs

            B= 30TW of garbage BTUs we bury in the depths

            C= 30TW of useful BTUs we borrow from the ocean

            The energy we thus borrow does not disappear. Most of it would just do work for us before returning to the climate system, of which the ocean is a part.

            Despite our foolish “out of sight, out of mind” thinking, the garbage energy we bury in the depths will still warm the ocean, just at a much greater depth. We are still charting the currents! It’s ecologically cavalier to assume there will be no massive backlash from large-scale, long-term deep-water warming at a much faster rate than natural mixing would provide.

            We agree that meters of sea level rise is unimaginably bad… shoot, even inches will have a massive impact on food-producing deltas, e.g. the Nile and Mekong, due to salt water affecting the crops. I’m not brushing aside the crisis we face one iota.

            What I _am_ saying is two-fold.

            First, that there is no “AWAY”. The internal combustion engine was predicated on an assumption that we could “throw away” combustion gases into the vast atmosphere… surely we could never mess up something so big, right? OTEC is premised on precisely the same attitude. The ocean heat sink is so big, surely short-circuiting the natural BTU mixing from the surface to the depths could never mess anything up, could it? Like many ecological issues, the only way to find out is to conduct an uncontrolled experiment. There is a long, long list of promised “silver bullets” that turned out to have undervalued known unknowns, and even dangerous unknown unknowns. When you promise OTEC is the sweet answer to the problems of energy demand and sea level rise, you are adding yet another promised magic bullet to that list, while seriously downplaying the potential consequences. You can’t tell us what the disposal of 30 TW to the depths would do, because you don’t know. You are just assuming it will “go away”, or at least that the result is less bad than dealing with the problem of surface warming some other way. Out of sight, out of mind. Will we e-v-e-r learn?

            Second, it would be more correct to claim that OTEC offers a stop-gap energy source which could itself become a big problem down the road. It is not a technology for a truly sustainable economy in the long term. For the sake of argument, pretend we could snap our fingers and instantly have pre-industrial levels of GHGs in the air and a zero-emissions society powered by OTEC. Should we keep relying on OTEC? NO! In that case we would be rapidly moving “natural” levels of surface energy to the ocean depths, with untold consequences.

            Certainly we have a very big problem. OTEC strikes me as yet another promised magic silver bullet because, quite simply, there is no “away” to which we can throw garbage BTUs. To say nothing of all the other risks, like industrial accidents venting ammonia in the depths, or the cost of unexpected biofouling of the deep-water condensers. The pain we face as we seek solutions – ones that respect the Limits to Growth – is quite simply the natural consequence of our society’s past choices. No teenager likes to confront the wall of personal responsibility, but that’s where we find ourselves.

            In my view, Joe’s past columns about wedges point the way forward, and OTEC is one item on that list with too high a potential downside to deploy in a big way.

  2. Carolyn says:

    I always wondered what the energy numbers were for the changes in oceanic temperature. It is truly staggering….

  3. Jim Baird says:

    Mark, the only problem with limiting growth is that it goes against human nature.

    Like Richard Smalley and Martin Hoffert, I think 30 to 60 TW worth of primary energy is in the cards, whether we like it or not.

    At least on the production side I think OTEC is a smart choice because that heat is already with us.

    • Mark E says:

      I’m all for borrowing BTUs from the ocean surface with other technologies… the ones that do not directly heat the deep ocean by short-circuiting natural ocean mixing.

      If you figure out how to dump the BTUs outside our atmosphere, I might want to talk about buying stock options.

      • Jim Baird says:

        Mark, Nature has beaten everyone to the punch on that score. It borrows BTUs from the ocean’s surface and radiates some into space from the top of the tropopause. In the process, however, it can produce a great deal of havoc.


        Ken Caldeira and Bill Gates have proposed short-circuiting natural ocean mixing to cool the seas and weaken hurricanes with ocean cooling pumps.

        I humbly suggest it is more rational to derive the same benefit by producing all of the renewable energy 10 billion people will need by mid-century.