Sunday, Aug 24, 2014

GCMs and public policy

In the thread beneath the posting about the Chen and Tung paper, Richard Betts left a comment that I thought was interesting and worthy of further thought.

Bish, as always I am slightly bemused over why you think GCMs are so central to climate policy.

Everyone* agrees that the greenhouse effect is real, and that CO2 is a greenhouse gas.
Everyone* agrees that CO2 rise is anthropogenic.
Everyone** agrees that we can't predict the long-term response of the climate to ongoing CO2 rise with great accuracy. It could be large, it could be small. We don't know. The old-style energy balance models got us this far. We can't be certain of large changes in future, but can't rule them out either.

So climate mitigation policy is a political judgement based on what policymakers think carries the greater risk in the future - decarbonising or not decarbonising.

A primary aim of developing GCMs these days is to improve forecasts of regional climate on nearer-term timescales (seasons, years and a couple of decades) in order to inform contingency planning and adaptation (and also simply to increase understanding of the climate system by seeing how well forecasts based on current understanding stack up against observations, and then further refining the models). Clearly, contingency planning and adaptation need to be done in the face of large uncertainty.

*OK so not quite everyone, but everyone who has thought about it to any reasonable extent
**Apart from a few who think that observations of a decade or three of small forcing can be extrapolated to indicate the response to long-term larger forcing with confidence.

So, let me try to explain why I think GCMs are so important to the policy debate.

Let us start by considering climate sensitivity. As readers here know, the official IPCC position on climate sensitivity is largely based on the GCMs. This time round we have had some minor concessions to observational estimates, but a significant proportion of the probability density of the observational studies remains outwith the IPCC's likely range of 1.5-4.5°C. Proponents of GCMs might counter that the upper end of the GCM range is ignored too, but I would suggest that it is hard to credit an ECS of 5-6°C in the light of the temperature history.

Estimates of climate sensitivity - and therefore in practice GCM estimates of climate sensitivity - directly inform estimates of the social cost of carbon. So when people like Chris Hope are arguing for a carbon tax of $100/tCO2, this is a function of GCMs. I recall, I hope correctly, that Chris suggested a figure of $18/tCO2 if one used an ECS of 1.6, in line with observational estimates. This matters of course, because the policy response, if any, to an $18 problem is significantly different to that for a $100 problem.

Wherever we look in the interactions between scientists and politicians on climate questions, we see an emphasis on catastrophe. We see no confessions of ignorance, only occasional references to uncertainties. Here are some notes of Tim Palmer addressing the All-Party Climate Change Group:

With the amount of carbon dioxide already in the atmosphere, future emissions will need to be reduced to half of historical emissions to limit global average temperature rise to 2°C. However, if emissions are not curbed (under the business-as-usual scenario), the amount of carbon dioxide in the atmosphere will be three times the historical emissions and temperatures might rise by up to 4°C.

And on the other hand they might not. This idea does not, however, seem to have been put forward for consideration.

Readers might also wonder what explanations were given to our political masters on the credibility of the GCMs. Here's what Palmer said:

Climate models are flawed only if the basic principles of physics are, but they can be improved. Many components of the climate system could be better quantified, allowing for greater parameterisation in the models to make them more accurate. Additionally, increasing the resolution of the models would allow them to represent processes at a finer scale, again increasing the accuracy of the results. However, advances in computing technologies would be needed to perform all the necessary calculations. So although the accuracy of predictions could be improved, the underlying processes of the models are accurate.

Apart from the transport of heat to the deep ocean, if Friday's paper from Chen and Tung is to be believed.

You can see that policymakers are getting a thoroughly biased picture of what GCMs can do and whether they are reliable or not. They are also getting a thoroughly biased picture of the cost of climate change based on the output of those GCMs. They are simply not being asked to consider the possibility that warming might be negligible or non-existent or that the models could be complete and utter junk. They are not told about the aerosol fudging or the GCMs' ongoing failures.

And this is just scratching the surface.

[BTW: Could commenters who like to amuse themselves by baiting Richard please refrain from so doing!]


Reader Comments (306)

I think the climate sensitivity is very unlikely to be constant, and the system's complexity ensures this. But the bigger problem is that diagnosing what causes local trends - forcing or fractal dynamics - is virtually impossible unless the forcing dominates over the fractal variability.

As an example, I generated a long sequence of fractal variability using Fourier methods. (Subsampling a long sequence works well with Fourier methods, as this is the easiest way of capturing some detail at the lowest frequencies.) So I generated a 1 million point sequence, from which I randomly selected four 1000 point sequences, and plotted them.

I can absolutely assure you these are unforced fractal fluctuations - I know this because I generated them without forcing. You can note that each plot has a different range of trends, jumps and other characteristics that people mistakenly think must have some kind of cause. But these examples have just one cause: the fractal dynamics of the underlying system generating the sequences.

I dropped the plot here, sorry about the slightly spammy website (probably best not to follow the link on a phone!)
http://i62.tinypic.com/16ivbs0.png

The problem is that people see this and want to "explain" the trends, shifts etc. Furthermore, if there were "real" forced jumps, shifts or trends in these, how would we know which come from fractal dynamics and which are forced?

Aug 30, 2014 at 12:36 PM | Unregistered CommenterSpence_UK

I thought I'd drop the code used to generate the example data. Unfortunately I use MATLAB, which costs money, rather than R, but the code here should work in Octave (a free program that interprets MATLAB code). It is possible that the blog software or HTML may mangle some of the characters, but hopefully this will copy and paste:

sampleRate = 1;
numPoints = 1048576;

Freqs = (0:numPoints/2)/(sampleRate*2);
Freqs = [Freqs fliplr(Freqs(2:(end-1)))];

% Hurst exponent
H = 1;
beta = 1-2*H;

% We cannot represent the 1/f between DC and the first frequency
% bin properly with Fourier methods, so we flatten spectrum below
% the first bin
Freqs(1) = Freqs(2);
FreqScale = (Freqs.^beta);

% Correction to yield unity standard deviation
Correction = sqrt(sum(FreqScale(1:end)));

Spectrum = (randn(1, numPoints) + 1i*randn(1, numPoints)).*sqrt(FreqScale);
FracNoise = real(ifft(Spectrum)*numPoints)/Correction;

To plot a random 1000 length sample in MATLAB (this may be a bit different in Octave, not sure):

figure;
rpoints = round(rand(1,1)*1000000);
ShowData = FracNoise(rpoints(loop):(rpoints(loop)+999));
plot(ShowData);

Aug 30, 2014 at 12:41 PM | Unregistered CommenterSpence_UK

Thanks Spence

the one thing that seems clear to me is that there is no way CO2 is the climate control knob

I would like to record once again my appreciation for your considerable input on this thread and to wish you all the best

Aug 30, 2014 at 1:09 PM | Unregistered CommenterH2O: the miracle molecule

Seconded. And thanks for the heads-up on Octave. I should have known about that.

Aug 30, 2014 at 1:39 PM | Registered CommenterRichard Drake

Always happy to contribute a few thoughts!

I did notice a mistake I made when copying and pasting the code above. My original code included a loop to produce the four subplots, but I wasn't sure if Octave supports subplots in the same way MATLAB does, so I removed the loop - but left a reference to the loop variable (tut!). So the plot code should read:

figure;
rpoints = round(rand(1,1)*1000000);
ShowData = FracNoise(rpoints:(rpoints+999));
plot(ShowData);

Aug 30, 2014 at 1:44 PM | Unregistered CommenterSpence_UK
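[For anyone who would rather avoid MATLAB and Octave altogether, below is a rough Python/NumPy port of the generating code - an editor's sketch following the MATLAB listing above, not Spence_UK's original. Variable names mirror the MATLAB version.]

```python
import numpy as np

def frac_noise(num_points=1_048_576, hurst=1.0, seed=None):
    """Generate power-law ('fractal') noise by shaping complex white
    noise in the frequency domain, following the MATLAB code above."""
    rng = np.random.default_rng(seed)
    sample_rate = 1.0

    # Frequency axis, mirrored for the negative frequencies
    freqs = np.arange(num_points // 2 + 1) / (sample_rate * 2)
    freqs = np.concatenate([freqs, freqs[1:-1][::-1]])

    beta = 1 - 2 * hurst  # Hurst exponent -> spectral slope

    # We cannot represent the 1/f behaviour between DC and the first
    # frequency bin properly with Fourier methods, so flatten the
    # spectrum below the first bin
    freqs[0] = freqs[1]
    freq_scale = freqs ** beta

    # Correction to yield (approximately) unity standard deviation
    correction = np.sqrt(freq_scale.sum())

    spectrum = (rng.standard_normal(num_points)
                + 1j * rng.standard_normal(num_points)) * np.sqrt(freq_scale)
    return np.real(np.fft.ifft(spectrum) * num_points) / correction

noise = frac_noise(seed=42)

# Pick a random 1000-point window, as in the plotting snippet above
start = np.random.default_rng(1).integers(0, len(noise) - 1000)
window = noise[start:start + 1000]
```

[Plotting `window` for a few different starting points reproduces the kind of unforced trends and shifts shown in the linked image.]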

A Google Scholar search for the exact phrase 'chaotic climate' yields the text snippets listed below. I think 'chaotic climate variability' or 'chaotic climate predictability' would return similar hits. I haven't filled in the details of the hits; the Google search is easy and quick.

The range of possibilities for future climate evolution (1, 2, 3) needs to be taken into account when planning climate change mitigation and adaptation strategies. This requires ensembles of multi-decadal simulations to assess both chaotic climate variability and model response uncertainty (4, 5,6, 7, 8, 9). [Literature citations that were superscripts are in ().]

The predictability of weather and climate forecasts is determined by the projection of uncertainties in both initial conditions and model formulation onto flow-dependent instabilities of the chaotic climate attractor.

The title of a September 2013 article in Journal of Climate: Separating Forced from Chaotic Climate Variability over the Past Millennium

A major reason for this is the presence within this chaotic system of discernible periodic cycles, varying from very low frequencies of a hundred thousand years or so to interannual variations such as El Niño and the Quasi-Biennial Oscillation.

Climate models are not perfect either. Errors evolve in climate simulations as a result of incomplete physical understanding and limited knowledge of past (or future) climate forcing. These errors must be considered along with the uncertainty related to climate chaos, which occurs because of nonlinear interactions in the global climate system. Climate chaos errors can be addressed through repeated runs of a climate model with the same forcing, but different starting conditions. These ensemble simulations can then be used to estimate the magnitude of the uncertainty introduced by a chaotic climate system (2).

[3] Because the climate system is chaotic, climate predictions must be predictions of distributions. Predictability concerns the degree to which a forecast distribution can differ from the climatological distribution and thus potentially provide information about the future. For systems at equilibrium predictability comes from initializing with a distribution of small perturbations about a specific initial state. As a prediction progresses, the distribution takes on the characteristics of the climatological distribution and predictability is eventually lost when the two distributions cannot be distinguished.

CHAOTIC BEHAVIOUR OF CLIMATE
Huard writes: “The natural variability of the climate system is largely chaotic” and thus “unpredictable”. Not only do we endorse this statement, and not only have we presented research results on this issue (Koutsoyiannis 2003, 2006, 2010, Koutsoyiannis et al. 2009, Christofides and Koutsoyiannis 2011), but we have also pointed to this problem in the second paragraph of the conclusions of our paper, the one that begins: “However, we think that the most important question is not whether GCMs can produce credible estimates of future climate, but whether climate is at all predictable in deterministic terms.” It is climate modelers who say or imply otherwise; for example Schmidt (2007, our emphasis):

"Weather is chaotic; imperceptible differences in the initial state of the atmosphere lead to radically different conditions in a week or so. Climate is instead a boundary value problem—a statistical description of the mean state and variability of a system, not an individual path through phase space. Current climate models yield stable and non-chaotic climates, which implies that questions regarding the sensitivity of climate to, say, an increase in greenhouse gases are well posed and can be justifiably asked of the models."

Therefore, again we are not the right recipients of Huard's warning that climate is chaotic.

Apparently, the climate system in the physical domain is chaotic, but climate in some, but clearly not all, computer domains is not. How does that work?

A brand new physical principle has been introduced in these discussions as follows:

Eli Rabett says: August 30, 2014 at 10:32 pm
The major boundary condition is conservation of energy, which pretty well stomps on the butterflies and sub unit climate sensitivity

Aug 31, 2014 at 9:24 PM | Unregistered CommenterDan Hughes
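[The ensemble idea described in the snippets quoted above - the same model and the same forcing, run repeatedly from perturbed starting conditions - can be illustrated with the classic Lorenz-63 system. This is an editor's sketch using a standard toy chaotic system, not a climate GCM.]

```python
import numpy as np

def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 equations."""
    x, y, z = state
    deriv = np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    return state + dt * deriv

def run(state, n_steps):
    """Integrate forward and record the trajectory."""
    traj = np.empty((n_steps, 3))
    for k in range(n_steps):
        state = lorenz_step(state)
        traj[k] = state
    return traj

# An "ensemble": identical dynamics (the analogue of fixed forcing),
# each member starting from a slightly perturbed initial state
rng = np.random.default_rng(0)
base = np.array([1.0, 1.0, 1.0])
ensemble = [run(base + 1e-6 * rng.standard_normal(3), 6000) for _ in range(10)]

# Ensemble spread in x, early (t = 1) and late (t = 30) in the run
spread_early = np.std([traj[199, 0] for traj in ensemble])
spread_late = np.std([traj[-1, 0] for traj in ensemble])
```

[Early on the members track each other closely; by the end of the run the spread is comparable to the system's own variability. That is the point at which, in the language of the quoted passage, the forecast distribution matches the climatological distribution and predictability is lost.]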
