Tuesday, Apr 16, 2013

An objective Bayesian estimate of climate sensitivity

This is a guest post by Nic Lewis.

Many readers will know that I have analysed the Forest et al., 2006, (F06) study in some depth. I'm pleased to report that my paper reanalysing F06 using an improved, objective Bayesian method was accepted by Journal of Climate last month, just before the IPCC deadline for papers to be cited in AR5 WG1, and has now been posted as an Early Online Release, here. The paper is long (8,400 words) and technical, with quite a lot of statistical mathematics, so in this article I'll just give a flavour of it and summarize its results.

The journey from initially looking into F06 to getting my paper accepted was fairly long and bumpy. I originally submitted the paper last July, fourteen months after first coming across some data that should have matched what was used in F06. The reason it took me that long was partly that I was feeling my way, learning exactly how F06 worked, how to undertake objective statistical inference correctly in its case and how to deal with other issues that I was unfamiliar with. It was also partly because after some months I obtained, from the lead author of a related study, another set of data that should have matched the data used in F06, but which was mostly different from the first set. And it was partly because I was unsuccessful in my attempts to obtain any data or code from Dr Forest. Fortunately, he released a full set of (semi-processed) data and code after I submitted the paper. Therefore, in a revised version of the paper submitted in December, following a first round of peer review, I was able properly to resolve the data issues and also to take advantage of the final six years of model simulation data, which had not been used in F06. I still faced difficulties with two reviewers – my response to one second review exceeded 9,500 words –  but fortunately the editor involved was very fair and helpful, and decided my re-revised paper did not require a further round of peer review.

Forest 2006

First, some details about F06, for those interested. F06 was a 'Bayesian' study that estimated climate sensitivity (ECS or Seq) jointly with effective ocean diffusivity (Kv) [1] and aerosol forcing (Faer). F06 used three 'diagnostics' (groups of variables whose observed values are compared to model simulations): surface temperature anomalies, global deep-ocean temperature trend, and upper-air temperature changes. The MIT 2D climate model, which has adjustable parameters calibrated in terms of Seq, Kv and Faer, was run several hundred times at different settings of those parameters, producing sets of model-simulated temperature changes. Comparison of these simulated temperature changes to observations provided estimates of how likely the observations were to have occurred at each set of parameter values (taking account of natural internal variability). Bayes' theorem could then be applied, uniform prior distributions for the three parameters being multiplied together, and the resulting uniform joint prior being multiplied by the likelihood function for each diagnostic in turn. The result was a joint posterior probability density function (PDF) for the parameters. The PDFs for each of the individual parameters were then readily derived by integration. These techniques are described in Appendix 9.B of AR4 WG1, here.
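For readers who find a toy example easier to follow than prose, the grid-based Bayesian update described above works roughly as sketched below. It is purely illustrative: the toy_model function, the parameter grids and the observation values are invented stand-ins, not the MIT 2D model or the real F06 diagnostics.

```python
import numpy as np

# Toy stand-in for the F06 approach: a grid over two "parameters"
# (F06 uses three: Seq, Kv and Faer), a uniform joint prior, and a
# Gaussian likelihood comparing model output with an observation.
seq_grid = np.linspace(0.5, 10.0, 200)   # climate sensitivity grid (K)
kv_grid  = np.linspace(0.1, 5.0, 100)    # ocean diffusivity grid (toy units)
S, K = np.meshgrid(seq_grid, kv_grid, indexing="ij")

def toy_model(seq, kv):
    """Trivial stand-in for model-simulated warming (not a real climate model)."""
    return 0.6 * seq / (1.0 + 0.3 * kv)

obs, obs_sd = 1.2, 0.25                  # invented observation and its uncertainty

# Uniform joint prior (constant over the grid) multiplied by the likelihood...
prior = np.ones_like(S)
likelihood = np.exp(-0.5 * ((toy_model(S, K) - obs) / obs_sd) ** 2)
posterior = prior * likelihood
posterior /= np.trapz(np.trapz(posterior, kv_grid, axis=1), seq_grid)  # normalise

# ...then marginal PDFs for each parameter, obtained by integrating out the other.
pdf_seq = np.trapz(posterior, kv_grid, axis=1)
pdf_kv  = np.trapz(posterior, seq_grid, axis=0)
print("Mode of toy Seq PDF:", seq_grid[np.argmax(pdf_seq)])
```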

Lewis 2013

As noted above, F06 used uniform priors in the parameters. However, the relationship between the parameters and the observations is highly nonlinear, so the use of a uniform parameter prior strongly influences the final PDF. In my paper, therefore, Bayes' theorem is applied to the data rather than to the parameters: a joint posterior PDF for the observations is obtained from a joint uniform prior in the observations and the likelihood functions. Because the observations have first been 'whitened' [2], this uniform prior is noninformative, meaning that the joint posterior PDF is objective and free of bias. Then, using a standard statistical formula, this posterior PDF in the whitened observations can be converted to an objective joint PDF for the climate parameters.
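By way of illustration only, whitening works roughly as in the sketch below. The covariance matrix and observation values are invented for the example, and this is not the code used in the paper.

```python
import numpy as np

# Whitening: transform correlated observations so that their errors become
# uncorrelated with unit variance (see footnote [2]). The covariance matrix
# and observations below are invented purely for illustration.
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])
obs = np.array([1.2, 0.15, 0.8])          # made-up observed diagnostic values

L = np.linalg.cholesky(cov)               # cov = L @ L.T
whiten = np.linalg.inv(L)                 # whitening transform
obs_white = whiten @ obs                  # whitened observations

# In the whitened space the error distribution is a standard multivariate
# normal, so a uniform joint prior on the "true" whitened data is
# noninformative and the resulting posterior matches classical confidence regions.
print(np.round(whiten @ cov @ whiten.T, 10))   # ~ identity matrix
```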

The F06 ECS PDF had a mode (most likely value) of 2.9 K (°C) and a 5–95% uncertainty range of 2.1 to 8.9 K. Using the same data, I estimate a climate sensitivity PDF with a mode of 2.4 K and a 5–95% uncertainty range of 2.0–3.6 K, the reduction being primarily due to use of an objective Bayesian approach. Upon incorporating six additional years of model-simulation data, previously unused, and improving diagnostic power by changing how the surface temperature data is used, the central estimate of climate sensitivity using the objective Bayesian method falls to 1.6 K (mode and median), with 5–95% bounds of 1.2–2.2 K. When uncertainties in non-aerosol forcings and in surface temperatures, ignored in F06, are allowed for, the 5–95% range widens to 1.0–3.0 K.

The 1.6 K mode for climate sensitivity I obtain is identical to the modes from Aldrin et al. (2012) and (using the same, HadCRUT4, observational dataset) Ring et al. (2012).  It is also the same as the best estimate I obtained in my December non-peer reviewed heat balance (energy budget) study using more recent data, here. In principle, the lack of warming over the last ten to fifteen years shouldn't really affect estimates of climate sensitivity, as a lower global surface temperature should be compensated for by more heat going into the ocean.

Footnotes

[1] Parameterised as its square root.
[2] Making them uncorrelated, with a radially symmetric joint probability density.


Reader Comments (42)

Nic, this is simply a warm thank-you for your impressive persistence and determination! Thanks for plowing though such important issues!

Apr 16, 2013 at 6:17 PM | Registered CommenterSkiphil

"I still faced difficulties with two reviewers – my response to one second review exceeded 9,500 words..."

"You pronounce this review of my work with greater fear than I receive it." (With apols to Giordano Bruno.)

Apr 16, 2013 at 7:06 PM | Unregistered CommenterJit

Even though most of the statistical logic remains over my head despite Nic's user-friendly summary, what is so refreshing is that here we seem to have an authoritative audit, producing a fairly tight-ranged, straightforward numerical result. No doubt it will attract the inevitable flak because it is not alarming enough. The post raises the profile of the Bishop's blog still further and is a welcome feather in his cap.

I still remain sceptical about UHI and adjustment bias in the surface temperature record, and also about the state of knowledge of processes influencing the deep ocean, which may partly be geothermal (volcanic/hydrothermal) in origin, especially along tectonically active spreading rift zones. This is, however, personal speculation and not evidence-based.

Apr 16, 2013 at 8:28 PM | Registered CommenterPharos

"In principle, the lack of warming over the last ten to fifteen years shouldn't really affect estimates of climate sensitivity, as a lower global surface temperature should be compensated for by more heat going into the ocean."

Why?

As a daft engineer I would have thought lower global surface temperatures would result in less heat going into the ocean.

Apr 16, 2013 at 8:57 PM | Unregistered CommenterNial

What does "objective Bayesian" mean? I was under the impression that Bayesian methods involved an assigned numerical value to expert judgment. Aren't such judgments necessarily subjective?

Apr 16, 2013 at 8:59 PM | Unregistered CommenterPat Frank

Dear Nic,

An impressive result. Fighting a battle against "settled science" on your own and then coming up with a piece like this is truly beyond imagination. We researchers stand on the shoulders of giants. With your work hitherto you are, IMHO, one of those figures on whose shoulders future (real) climate scientists will stand. Pun intended ;)

Keep up the good work; you have done an amazing thing.

Harry

Apr 16, 2013 at 9:35 PM | Unregistered CommenterHarry

Pat Frank
"I was under the impression that Bayesian methods involved an assigned numerical value to expert judgment."

That is the usual subjectivist Bayesian approach, and clearly involves subjective judgement.

Objective Bayesian methods seek to avoid subjective judgement about parameter values, although inevitably subjective judgement remains regarding, for example, experimental design and choice of data. The idea is to "let the data speak for themselves", so that, however weak (low precision) the data is, the data-derived likelihood function will dominate the prior. This is only fully achievable in certain cases, but with care it is possible to get close to it in many if not most other cases.

There is a large statistical literature relating to the objective Bayesian approach, much of it relatively recent. At present, objective Bayesian thinking has not by any means fully penetrated the Bayesian statistical community, which remains divided into different camps.

Typically, applying objective Bayesian methods will yield posterior PDFs whose characteristics at least approximately match confidence intervals from classical frequentist statistical methods (accurate probability coverage), where frequentist methods are able to derive confidence intervals.
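To make that probability-matching point concrete, here is a toy case where the match is exact: inference about a normal mean with known variance under a uniform (noninformative) prior. The numbers are arbitrary.

```python
import numpy as np
from scipy import stats

# For a normal mean with known sigma and a uniform (Jeffreys) prior, the
# Bayesian 5-95% credible interval equals the frequentist 5-95% confidence
# interval exactly -- an example of accurate probability coverage.
rng = np.random.default_rng(0)
sigma, n = 1.0, 25
data = rng.normal(loc=3.0, scale=sigma, size=n)   # arbitrary simulated data

xbar, se = data.mean(), sigma / np.sqrt(n)
z = stats.norm.ppf(0.95)

bayes_ci = stats.norm.ppf([0.05, 0.95], loc=xbar, scale=se)  # posterior quantiles
freq_ci  = (xbar - z * se, xbar + z * se)                    # classical CI
print(bayes_ci, freq_ci)                                     # identical here
```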

Apr 16, 2013 at 9:56 PM | Unregistered CommenterNic Lewis

Nial
"As a daft engineer I would have thought lower global surface temperatures would result in less heat going into the ocean."

Not daft at all. But the premise is that the causation is the other way around. The argument goes that with greenhouse gas concentrations and hence net forcing (warming influence) increasing over time, the only way that surface temperatures can remain stable is for more heat to be absorbed by the ocean. Since the ocean water below the mixed layer (typical depth 30 to 100 m or so) is much colder than that in the mixed layer, an increase in downwards transport of mixed-layer ocean water could account for that. Such an increase in downwards transport would tend to cool the mixed-layer water, offsetting the increased greenhouse gas etc warming effect. So the ocean surface temperature could stay constant at the same time as more heat went into the ocean. An increase in downwards transport of ocean water could be driven by internal climate system variability, for instance through changes in wind patterns.

That is not to deny that there may be other possible explanations for at least part of the standstill in global temperature, such as that cloud behaviour could have been changing in a way that tends to counteract increasing greenhouse gas forcing.

Apr 16, 2013 at 10:17 PM | Unregistered CommenterNic Lewis

Nic

Did you address Annan's concerns about the precious draft?

Apr 16, 2013 at 10:26 PM | Unregistered Commenterdiogenes

Hi Nic,

"That is not to deny that there may be other possible explanations for at least part of the standstill in global temperature, such as that cloud behaviour could have been changing in a way that tends to counteract increasing greenhouse gas forcing."

Isn't there some observational evidence of cloudiness having reduced through the latter part of the 20th century? By contrast, I'm not aware of any observational evidence of increased downward transport through the upper ocean. Have you tried using any of the observed cloud estimates in any of your work?

Apr 16, 2013 at 10:36 PM | Unregistered CommenterRob Burton

Nic, thanks for your reply. So you estimate a prior. Is it right that your data set is the spread of model realizations? Do you then take the likelihood function from the distribution of climate model realizations? The likelihood dominates the prior, which I take to mean the former has a much greater variance.

But that likelihood itself involves a judgment of correct dynamics within climate models. Doesn't this put a subjective trace into the likelihood function as well as into the prior?

Apr 16, 2013 at 10:41 PM | Unregistered CommenterPat Frank

diogenes
" Did you address Annan's concerns about the precious draft?"

I presume you meant "previous" draft?

I'm not sure what concerns you refer to – AFAIK the concerns that James Annan raised related to my December non-peer reviewed global heat balance based estimate of climate sensitivity, not to the Journal of Climate study. And those concerns were, I believe, largely misplaced.

Apr 16, 2013 at 10:43 PM | Unregistered CommenterNic Lewis

Nic, "such as that cloud behaviour could have been changing in a way that tends to counteract increasing greenhouse gas forcing."

Or that the increased energy goes elsewhere and is dissipated, such as through a more vigorous hydrology, and doesn't appear as sensible heat at all.

Apr 16, 2013 at 10:46 PM | Unregistered CommenterPat Frank

Last statement is nonsensical. Heat decides to stop going into the atmosphere and just feels like going to the ocean instead - for no apparent reason and totally unexpectedly. Simpler explanation is that the heat wasn't there in the first place. Occam's razor.

Apr 16, 2013 at 10:48 PM | Unregistered CommenterJamesG

What is the standard mean of any of your results?

Apr 16, 2013 at 10:49 PM | Unregistered CommenterTom

If heat is going into the deep ocean that is good news. It means the next ice age will be delayed. If the heat is not going into the deep ocean that is good news. It means the alarmists are wrong.

Apr 16, 2013 at 10:49 PM | Unregistered CommenternTropywins

The oceans are in equilibrium because of two factors. The first is the melting of ice in the polar summer, the cause of the thermohaline circulation. The second is the salinity gradients - the partial molar enthalpy of mixing of salt and water is quite negative. The latter data are easily accessible from the UNESCO Equation of State for water.

To claim heat is going into the ocean because it can't be handled by other 'adjustments' is to compound the fraud. There is no way that the thermal diffusivity can suddenly increase!

Apr 16, 2013 at 11:35 PM | Unregistered CommenterAlecM

Congratulations, Nic.

Do you have supplementary information for this & if so could you link it?

Unfortunately I can't view your paper yet without a fee, even though my university has a subscription to Journal of Climate.

Apr 17, 2013 at 12:01 AM | Unregistered CommenterCarrick

I think Nic's ECS value of 1.6C will be criticized as most of the TCR values from AR4 are greater than this. Even with Isaac Held's revised TCR of 1.4C (as per The Economist), there's not much heat left in the pipeline. This is before considering that the TCR value comes from a scenario where CO2 is growing by 1% per year, whereas the actual rate of growth is much less than that. So the actual response at a doubling will be somewhat greater than the TCR, as there will be more time for the semi-fast response to be realized.

I think the modelers will have a hard time believing that the amount of heat in the pipeline is less than a couple of tenths. Forest's values were much more palatable.

I'm not saying that Nic is wrong, only that he will be met with a large dose of skepticism. But he already knew that.

Apr 17, 2013 at 2:21 AM | Unregistered CommenterAJ

I agree with Nic that at Annan's there were not any persuasive objections. A post doc who is an aerosol specialist said that Nic's aerosol forcing should be higher. He only succeeded in persuading me that aerosol forcing numbers are very uncertain and patterns of regional temperature don't support the presumed regional forcing estimates. But surely, the range of Nic's estimates takes account of this uncertainty.

Apr 17, 2013 at 2:54 AM | Unregistered CommenterDavid Young

A little-noticed fact is that Annan and Hargreaves' recent paper on the LGM also estimates a linear sensitivity of 1.7C. This is a difference between two very different states. The mumbling about nonlinearity was not convincing to me. However, apparently, models differ on how much higher CO2 sensitivity is than this linear number. My take is that a finite difference over a big delta actually captures most of the nonlinearity.

Apr 17, 2013 at 3:00 AM | Unregistered CommenterDavid Young

How refreshing! A scientist/statistician writes a paper. It is published in a respected journal. He comes to a well-known blog, announces it and describes its content in a straightforward way. He then stays online and answers questions from all comers.

By contrast, compare that to the orchestrated media circus that accompanied the release of Marcott et al and Gergis et al, each of which was followed shortly thereafter by the disappearance and unavailability of the principal authors.

Apr 17, 2013 at 3:19 AM | Unregistered Commentertheduke

A sensitivity value near 1.6C per doubling keeps turning up. (http://rankexploits.com/musings/2011/a-simple-analysis-of-equilibrium-climate-sensitivity/)
Maybe it is more than coincidence.

Apr 17, 2013 at 4:17 AM | Unregistered CommenterSteve Fitzpatrick

Hmmm, if they haven't found the missing heat going into the oceans then perhaps that's because it's not going into the oceans, simply because it's not there in the first place?

Why do climate scientists have to construct great big theories when the simplest explanation will suffice?

Regards

Mailman

Apr 17, 2013 at 8:13 AM | Unregistered CommenterMailman

I think JamesG and Mailman are correct.
The 'explanation' that heat is somehow being whisked down to the ocean depths from the atmosphere, leaving no trace of its presence at the surface, is just nonsense. IR is absorbed near the surface and such surface warming would make the ocean more stably stratified, reducing ocean mixing not increasing it.

Apr 17, 2013 at 9:49 AM | Registered CommenterPaul Matthews

I did not have time to look into this paper, but since Nic Lewis is providing very interesting feedback, let's try to have this explained directly by the author at a really simplified level that seems a better fit for this blog. Does the paper use distinct parameters for how much of the feedbacks are realized after certain time intervals, or does it just assume that everything is 100% realized? If the first alternative is correct, what percentages and what intervals were used as priors, and what numbers resulted from the analysis?

Apr 17, 2013 at 11:00 AM | Unregistered Commentertheothernick

I admire the work, but I find all these studies to be a failure in Logic.
If you cannot establish the EXACT contribution of all the variables affecting what makes the temperature change you cannot possibly "Estimate" anything to do with Sensitivity, regardless of how much maths and statistics you use.
Garbage in Garbage out.

Apr 17, 2013 at 11:21 AM | Unregistered CommenterA C Osborn

If I understand this correctly (which is unlikely, I admit) then using data up to 2006, you find a mode CS of 2.4 K and a 5–95% bounds of 2.0–3.6 K.

By including data from 2006 up to 2012, you get a mode CS of 1.6 K, with 5–95% bounds of 1.2–2.2 K.

So the overlap between the two sets of results is only 2.0 - 2.2 K, within 90% bounds.

This seems to me to say that the 2006-2012 data was extremely unlikely data, given the known state in 2006, as it doesn't just lie in the low end of the previously calculated range, it lies almost entirely outside the expected range.

While this is obviously theoretically possible, as there is at least some overlap, is it really credible?

Apr 17, 2013 at 11:32 AM | Registered Commentersteve ta

Paul Matthews
"surface warming would make the ocean more stably stratified, reducing ocean mixing not increasing it."

What you say makes sense on a simple physical basis. However, Balmaseda et al., "Distinctive climate signals in reanalysis of global ocean heat content", 2013, GRL, argues (based on an ocean reanalysis involving a climate model) that surface wind variability is largely responsible for the changing ocean heat vertical distribution. I don't know if that is correct.

As I understand it, although mixing of the heat into the deeper ocean can be represented quite well by a diffusive process (plus upwelling) using an effective vertical diffusivity value, most of the downwards heat mixing is probably by transport along constant density (isopycnal) surfaces inclined at shallow angles to the horizontal, with some eddy diffusion across those surfaces. Density is affected by salinity as well as temperature, of course. I'm unsure to what extent the claims of greater heat mixing into the deeper ocean depend on increased isopycnal transport (and if so, why that should be) rather than on, e.g., wind changing the pattern of ocean circulations.

Apr 17, 2013 at 11:54 AM | Unregistered CommenterNic Lewis

steveta
"While this is obviously theoretically possible, as there is at least some overlap, is it really credible?"

Please see my response to a similar query by Zeke Hausfather, at http://wattsupwiththat.com/2013/04/16/an-objective-bayesian-estimate-of-climate-sensitivity/#comment-1277413

You have the end years wrong, BTW.

Apr 17, 2013 at 12:58 PM | Unregistered CommenterNic Lewis

Nic -
Apologies if this has been asked and answered already, but is there a non-paywalled version available?

Apr 17, 2013 at 2:09 PM | Registered CommenterHaroldW

Very nice work Nic. Thank you for your efforts which I hope will be recognized by the climate community.

Apr 17, 2013 at 3:20 PM | Unregistered CommenterJeff Condon

Thanks, Jeff

Apr 17, 2013 at 3:27 PM | Unregistered CommenterNic Lewis

[Please go away]

Apr 17, 2013 at 3:29 PM | Unregistered CommenterBBD

Nic --
Congratulations on getting this published!
I am puzzled, though, by your statement that you imposed a Bayesian prior on the data rather than on the parameters. The usual Bayesian approach takes the data as given, and then computes a posterior for the parameters from some prior for the parameters. While it is true that a uniform but bounded prior for the parameters will to some degree pre-determine the results, this can be avoided with an "uninformative" or diffuse unbounded uniform prior. This usually replicates classical confidence intervals in trivial linear and Gaussian cases, but allows an extension to complicated non-linear and/or non-Gaussian cases.
How does a prior on the data work?
(I should have studied the draft you sent me more closely, but maybe you could give the "Cliffnotes" version? Thanks!)

Apr 17, 2013 at 4:11 PM | Unregistered CommenterHu McCulloch

Please go away

that was one keyboard :-)

Apr 17, 2013 at 6:39 PM | Unregistered CommenterIbrahim

Hu

Thanks! I'm not sure what "Cliff notes" are, but I'll try to give a fairly brief explanation.

In the Bayesian paradigm one is free to impute a probability density to any variable (one- or multi-dimensional) that is considered to have a fixed but unknown value. Usually that variable is the parameter (vector), but there is nothing to stop one choosing instead the "true" data (vector), or some transformation of it – here the "true" whitened data. The true data is what would have been observed in the absence of error (climate noise, mainly). The actual observed value of the data is given, and its error distribution gives the likelihood function. Since by assumption the whitened data has a standard uncorrelated multinormal distribution, a uniform joint prior will be noninformative when applying Bayes' theorem to infer a joint posterior PDF for the true whitened data. That PDF will replicate classical confidence intervals - it will be objective.

It is easiest to think about the next stage if one accepts the premise in Forest 2006 that there are only three degrees of freedom in the true data, the same as the number of climate system parameters. In that case, a simple reparameterisation (change of variables) can be made from the true whitened data to the climate system parameters, using the standard Jacobian determinant formula to determine the factor, expressed as a function of the climate system parameters, to use to convert the joint PDF for the data into a PDF for the climate system parameters. The resulting joint PDF for those parameters will also be valid – reparameterisation doesn't affect the validity of a PDF. All that the conversion factor does is equate volume elements in parameter space to corresponding volume elements in data space.

It is easy to show that the PDF conversion factor effectively represents the joint parameter prior that has been used to infer a Bayesian posterior PDF for the parameter, and is the standard noninformative Jeffreys' prior. I could just have derived the Jeffreys' prior for the parameters and carried out the Bayesian inference for the parameters in the usual way. But my hope was that by undertaking the Bayesian step at the data stage, where it is easy to see that a joint uniform prior is noninformative, and then using standard statistical methods, more people would accept that the (highly non-uniform) Jeffreys' prior was in fact noninformative.
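A bare-bones numerical sketch of that reparameterisation step is below. The two-parameter mapping g is invented and stands in for the real whitened-data-to-parameter relationship; it is not the code from the paper. It simply shows how the Jacobian determinant acts as the conversion factor between the data-space PDF and the parameter-space PDF, i.e. as the implied (Jeffreys-type) parameter prior.

```python
import numpy as np

# Change of variables from "true whitened data" z to parameters theta, via an
# invented mapping z = g(theta). The |det J| factor converts a density in data
# space into a density in parameter space.
def g(theta):
    """Toy data-from-parameters mapping (illustrative, not the F06 model)."""
    a, b = theta
    return np.array([np.log(a) + 0.5 * b, a * b])

def jacobian(theta, eps=1e-6):
    """Numerical Jacobian dg/dtheta (2x2 here)."""
    theta = np.asarray(theta, dtype=float)
    J = np.zeros((2, 2))
    for j in range(2):
        dt = np.zeros(2); dt[j] = eps
        J[:, j] = (g(theta + dt) - g(theta - dt)) / (2 * eps)
    return J

def posterior_in_data_space(z):
    """Objective posterior for the true whitened data: standard bivariate normal."""
    return np.exp(-0.5 * np.dot(z, z)) / (2 * np.pi)

def posterior_in_parameter_space(theta):
    # Density transported to parameter space by the change-of-variables formula.
    return posterior_in_data_space(g(theta)) * abs(np.linalg.det(jacobian(theta)))

print(posterior_in_parameter_space([2.0, 0.5]))
```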

Apr 17, 2013 at 11:23 PM | Unregistered CommenterNic Lewis

"Since the ocean water below the mixed layer (typical depth 30 to 100 m or so) is much colder than that in the mixed layer, an increase in downwards transport of mixed-layer ocean water could account for that. Such an increase in downwards transport would tend to cool the mixed-layer water, offsetting the increased greenhouse gas etc warming effect. So the ocean surface temperature could stay constant at the same time as more heat went into the ocean. "

I'm still not buying it.

"an increase in downwards transport of mixed-layer ocean water could account for that"

What's the proposed mechanism that causes warm water to suddenly start traveling in the wrong direction?

BTW, thanks for answering the questions.

Apr 18, 2013 at 12:41 PM | Unregistered CommenterNial

Nial
"What's the proposed mechanism that causes warm water to suddenly start traveling in the wrong direction?"

My Apr 17, 2013 at 11:54 AM response to Paul Matthews addresses your query. I'm not saying that there has been a large increase in heat transport into the deep ocean, just rehearsing the arguments about what could cause such an increase.

Apr 18, 2013 at 4:42 PM | Unregistered CommenterNic Lewis

Nic,
Let me add my congratulations - for both the quality of the work and your persistence.

The paper is a major step forwards in tying CS estimates to observational history, notwithstanding that I still have some reservations about the full contribution of non-linear flux response to SAT.

Apr 19, 2013 at 7:19 AM | Unregistered CommenterPaul_K

Paul_K
Thanks!

Apr 19, 2013 at 7:31 AM | Unregistered CommenterNic Lewis

Nic -

The most striking thing about your result is the lowering of the uncertainty in climate sensitivity using the same base data as in F06.

Is it right to conclude that similar re-processing of other base data would also reduce the uncertainty of their estimates of climate sensitivity?

Thanks,

Apr 24, 2013 at 8:44 PM | Unregistered CommenterRERT
