Discussion > Where is the evidence

Rhoda, I agree with the thrust of your comment and most of what you say. Just a clarification about lapse rate.

It doesn't cause anything, it's merely the observation that air temperatures become cooler with altitude. The cause is gravity acting upon the air molecules, making the atmosphere denser at the bottom and thinner at the top. The ideal gas law says the higher pressure increases the kinetic energy of the molecules, and hence their temperature near the surface, and the opposite as pressures diminish. Note this is a function of the mass of air, not the composition of gases, IR active or otherwise.
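
To put a number on the gradient: the textbook dry adiabatic lapse rate follows from gravity and the heat capacity of air alone. A back-of-envelope sketch in Python (standard textbook values, not figures from this thread):

```python
# Back-of-envelope dry adiabatic lapse rate, gamma = g / c_p.
# Standard textbook values, not taken from the comment itself.
g = 9.81        # gravitational acceleration, m/s^2
c_p = 1004.0    # specific heat of dry air at constant pressure, J/(kg*K)

gamma = g / c_p                 # K per metre of ascent
print(f"dry adiabatic lapse rate ~ {gamma * 1000:.1f} K/km")   # ~9.8 K/km
```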

That dense air means that each CO2 molecule is closely packed with about 2,500 others (mostly N2 and O2), so any energy it absorbs is usually transferred by collision to an N2 or O2 molecule before it can be re-emitted. The parcel of air is more affected by convection, which, as you say, makes the air rise and cool as the pressure falls.
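
The 2,500 figure is just the reciprocal of the CO2 mixing ratio; a one-line check, assuming roughly 400 ppm:

```python
co2_ppm = 400                                   # assumed mixing ratio
print(f"1 CO2 molecule per {1_000_000 // co2_ppm} air molecules")   # 2500
```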

The emission altitude is not a fixed level, but is the place where the air is thin enough that CO2 and H2O can emit photons before collision transfers occur. It's complicated, because an IR-active molecule above could reabsorb, and of course re-emit. As NiV says, it is effectively the level where convection stops and radiation takes over as the form of energy dissipation. The actual altitude varies all over the place: close to the ground in the Arctic winter, and 10-12 km above the equator.

The claim is that more CO2 pushes the emission level higher, where it is colder, so less energy is lost. I am not convinced: more molecules emitting at a slightly lower temp, so no difference; also, measurements of optical depth don't seem to vary with more CO2; also, this seems to occur at the tropopause, where temperatures stop cooling with altitude.
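
For what the standard claim amounts to in numbers, here is a grey-body sketch (the temperatures are illustrative, not measurements):

```python
# The "raised emission height" claim in numbers: a layer radiating as a
# grey body at sigma*T^4 loses less energy if it sits at a colder level.
# Temperatures are illustrative.
sigma = 5.67e-8                     # Stefan-Boltzmann constant, W/(m^2*K^4)

for T in (255.0, 250.0):            # emission temperatures, K
    print(f"T = {T:.0f} K -> outgoing flux ~ {sigma * T**4:.0f} W/m^2")
# A 5 K colder emission level cuts the outgoing flux by ~18 W/m^2; that
# imbalance is what the rest of the column is said to warm to restore.
```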

Mar 4, 2015 at 8:37 PM | Unregistered CommenterRon C.

Dung,

"Do I have permission to respond? :P"

You can question what he means by "almost", and ask to see the evidence. Has anyone done a survey?

Rhoda,

"I don't buy the average height of emission to space as a major factor. Obviously the theory looks good, in some sort of theoretical calm atmosphere."

The concept doesn't work in a theoretical calm atmosphere, only a convective one. It's the rolling convection cells (horizontal and vertical) that drive air around the compression-expansion cycle that changes its temperature vertically. Compressing air makes it hot, expanding air cools it.
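
The compression-heating rule has a standard form; a sketch using the adiabatic relation, with illustrative numbers:

```python
# "Compressing air makes it hot, expanding air cools it": the adiabatic
# relation T2 = T1 * (P2/P1)**(R/c_p). Numbers are illustrative.
kappa = 0.286                         # R/c_p for dry air

T_aloft, P_aloft = 250.0, 500.0       # parcel high up: K, hPa
P_surface = 1000.0                    # surface pressure, hPa

T_surface = T_aloft * (P_surface / P_aloft) ** kappa
print(f"parcel carried down to the surface warms to ~{T_surface:.0f} K")
# ~305 K; the same parcel carried back up cools by the same amount.
```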

"But what I can't understand, having read NiVs patient explanation many times, is how the temperature at some height in the 30,000ft range sets surface temp. The surface is where the energy becomes sensible heat."

It's tricky to explain, because most of the everyday analogies are not an exact match, which leads to confusion, but the more precise analogies are unfamiliar and people's intuition doesn't work so well.

Everybody is probably familiar with the way a conservatory or greenhouse gets uncomfortably hot in the summer. You can buy a gizmo with a thermostat on it that opens the roof vents when it gets too hot, and shuts them when things cool down. The temperature inside stays at a constant, comfortable level. But how can a vent high up near the ceiling control the temperature down near the ground where you're sitting? The sunlight turns into sensible heat at the ground, not up at the vent, right?

Not only that, but the thermostat is set to a different temperature to the one you feel comfortable at. The temperature at the vent is different, generally a lot cooler. So how can a thermostat set at one temperature stabilise the ground at a completely different (and usually much higher) temperature?

The answer is that the temperatures in different parts of the conservatory are held in a fixed relationship to one another by convection and diffusion, but can rise and fall in unison as the conservatory as a whole gets hotter or cooler. If the vent opens, heat escapes from the top and the cooling propagates downwards by increased convection. If the vent closes, the convective cooling stops all the way down to the surface, and heat builds up.

It's the combination of two completely different mechanisms. The overall 'thermostat' of the atmosphere is at the top: radiation to outer space. The temperature everywhere else is rigidly linked to it by convection's ability to sharply turn on and off at a particular temperature gradient.
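
A minimal sketch of that rigid linkage, assuming a single fixed lapse rate ties the surface to the emission level (all numbers illustrative):

```python
# Minimal sketch of the "thermostat at the top" picture: the emission
# level radiates at a temperature set by the energy balance, and the
# surface sits warmer by (lapse rate x emission height). Illustrative.
T_emission = 255.0        # effective emission temperature, K
lapse = 6.5               # environmental lapse rate, K/km

for z_km in (5.0, 5.5):   # emission height, km
    print(f"emission at {z_km} km -> surface ~ {T_emission + lapse * z_km:.1f} K")
# Raising the emission level 0.5 km at the same emission temperature
# warms the surface by ~3 K: the whole profile shifts in unison.
```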

"Warm air will always rise in cool air."

Only so long as the rate at which it cools because of expansion is less than the rate at which the surrounding air cools with altitude.

Once the surrounding air cools with altitude more slowly than the expansion-cooling rate, convection stops, with cold air sitting stably on top of warm. Why else do you think the snow is on the tops of mountains rather than at their bottoms?
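
A toy version of that stopping rule, with illustrative cooling rates:

```python
# Toy stopping rule: a parcel keeps rising only while it is warmer than
# its surroundings. Cooling rates and temperatures are illustrative.
gamma_parcel = 9.8            # parcel's expansion-cooling rate, K/km
gamma_env = 6.5               # surrounding air's cooling rate, K/km
dz = 0.1                      # step size, km

T_parcel, T_env, z = 289.0, 288.0, 0.0    # parcel starts 1 K warmer
while T_parcel > T_env:
    z += dz
    T_parcel -= gamma_parcel * dz
    T_env -= gamma_env * dz
print(f"parcel stops rising after ~{z:.1f} km")
# The 1 K head start is eaten up at 3.3 K/km, so convection stops within
# half a kilometre; flip the two rates and the parcel would rise forever.
```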

Ron,


"The cause is gravity acting upon the air molecules, making the atmosphere denser at the bottom and thinner at the top."

Yes. A simpler way to put it is to say that the weight of all the air above presses down on a particular bit of air; the pressure is what holds the column of air up. The higher you go, the less weight of air above you there is to lift.
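
Integrating that balance (dP/dz = -rho*g) for a constant-temperature column gives the familiar exponential fall-off; a quick sketch:

```python
# "The weight of all the air above presses down": hydrostatic balance,
# dP/dz = -rho*g, integrated at constant temperature for simplicity.
import math

P0 = 1013.25      # surface pressure, hPa
H = 8000.0        # scale height R*T/(M*g) at ~273 K, metres (approx)

for z in (0, 5000, 10000):            # altitude, metres
    print(f"z = {z:5d} m -> P ~ {P0 * math.exp(-z / H):6.1f} hPa")
# Pressure (and with it density) drops by a factor e every ~8 km.
```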

"The ideal gas law says the higher pressure increases the kinetic energy of the molecules, and hence their temperature near the surface, and the opposite as pressures diminish."

Not quite. You can also get high-pressure air at low temperatures. If you compress air without letting heat in or out, the temperature will rise - this is called 'adiabatic' compression in thermodynamics. But you can also compress air while extracting the heat, getting a bigger volume change for your effort - what's called 'isothermal' compression.

The compression going on in the atmosphere's convection cycle is generally close to adiabatic, because all the air around a small packet of air is generally at roughly the same temperature, and it can't radiate efficiently. Heat leaks in and out far more slowly than the rapid timescales that convection works on.

" I am not convinced: more molecules emitting at a slightly lower temp, so no difference"

But in this simplified picture, it's only the topmost 'visible' layer that emits to space, and the number of molecules in that stays the same. The emission of the extra molecules underneath them is blocked by those above.

The direction of net energy flow is determined only by the difference in temperatures, not the amount of stuff. If you have a big body at a cold temperature next to a small body at a very hot temperature, the cold body might be emitting more heat overall because of its bigger surface area, but the net flow is still from the hot body to the cold. Most of the heat emitted from the big cold body doesn't hit the small body, because it's so small. Only the temperature matters.
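
To put that example in numbers, a toy Stefan-Boltzmann comparison, using the standard small-body-in-a-large-enclosure formula (areas and temperatures made up):

```python
# Big cold body vs small hot body: the cold one emits more in total
# (area wins), but the net exchange still runs hot -> cold. Toy numbers,
# using the small-convex-body-in-a-large-enclosure formula.
sigma = 5.67e-8                    # Stefan-Boltzmann constant, W/(m^2*K^4)

A_cold, T_cold = 100.0, 280.0      # big and cold: m^2, K
A_hot, T_hot = 1.0, 600.0          # small and hot

print(f"cold body emits {sigma * A_cold * T_cold**4 / 1e3:.1f} kW in total")
print(f"hot body emits  {sigma * A_hot * T_hot**4 / 1e3:.1f} kW in total")

# Exchange is limited by the small body's area; only the temperatures
# decide the sign of the net flow:
net = sigma * A_hot * (T_hot**4 - T_cold**4)
print(f"net flow between them: {net:.0f} W, hot -> cold")
```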

The way this is arranged varies depending on the configuration, but it always happens. People have had a lot of fun over the years trying to construct exotic arrangements of mirrors and radiators and insulators and heat engines to try to break the rule, but nobody has succeeded yet. The second law of thermodynamics is one of the most thoroughly challenged and tested of all the laws of physics. I do encourage people to try, though. The prize on offer is a perpetual motion machine to the lucky winner who defeats it!

You won't win the big prize - the game is rigged - but since the consolation prize to those runners-up who fail is an improved understanding of physics, you can't lose, either.

Mar 5, 2015 at 11:02 AM | Unregistered CommenterNullius in Verba

I congratulate NiV for managing what I can never do. I am not a good teacher; I tried it once many years ago, teaching disadvantaged kids mathematics, and wasn't very good at it. I think you have to be able to imagine what it's like not to understand something, and that's not a thing I can do.

Nice read.

Mar 5, 2015 at 12:58 PM | Unregistered CommenterTheBigYinJames

The earth in IR with opaque atmosphere

This shows that the "ground" is not where most IR photons come from - to a viewer in space, the IR appears to come from the atmosphere - and that clouds and convection bring the heat up to the top of the atmosphere, where it is dissipated radiatively.

Mar 5, 2015 at 1:02 PM | Unregistered CommenterTheBigYinJames

Thanks NiV for your comment(s). As usual, you write clearly and persuasively.

I recently came across a paper on this subject, and wondered if you would comment on it.

The authors use the logic of maximum entropy production to estimate the temperature profile of the atmosphere, and then estimate the sensitivity to a doubling of CO2. Interestingly, their sensitivity is much lower than that of the CMIP5 models.

The paper is Herbert et al 2013, Vertical Temperature Profiles at Maximum Entropy Production with a Net Exchange Radiative Formulation.
The abstract and key finding:

“Like any fluid heated from below, the atmosphere is subject to vertical instability which triggers convection. Convection occurs on small time and space scales, which makes it a challenging feature to include in climate models. Usually sub-grid parameterizations are required. Here, we develop an alternative view based on a global thermodynamic variational principle. We compute convective flux profiles and temperature profiles at steady-state in an implicit way, by maximizing the associated entropy production rate. Two settings are examined, corresponding respectively to the idealized case of a gray atmosphere, and a realistic case based on a Net Exchange Formulation radiative scheme. In the second case, we are also able to discuss the effect of variations of the atmospheric composition, like a doubling of the carbon dioxide concentration.”

“The response of the surface temperature to the variation of the carbon dioxide concentration — usually called climate sensitivity — ranges from 0.24 K (for the sub-arctic winter profile) to 0.66 K (for the tropical profile), as shown in table 3. To compare these values with the literature, we need to be careful about the feedbacks included in the model we wish to compare to. Indeed, if the overall climate sensitivity is still a subject of debate, this is mainly due to poorly understood feedbacks, like the cloud feedback (Stephens 2005), which are not accounted for in the present study.”

So there you have it: convection rules in the lower troposphere. Direct warming from CO2 is quite modest, way less than models project. Unless feedbacks are strongly positive (in which case we wouldn't be here to talk about this), I don't see cause for alarm.

http://arxiv.org/pdf/1301.1550.pdf

Mar 5, 2015 at 1:48 PM | Unregistered CommenterRon C.

"Not seeing a cause for alarm" is the thread that binds us all on this side. Even if you see some warming (as I do) then it needn't be something that we can't deal with. The idea that any change in the temperature spells disaster is a form of technological pessimism that comes straight out of the luddite enviro movement.

Mar 5, 2015 at 2:03 PM | Unregistered CommenterTheBigYinJames

"I recently came across a paper on this subject, and wondered if you would comment on it."

I've only skimmed through it, but I think the paper itself covers most of the obvious criticisms.

They propose using an underlying principle that they agree there's no known theoretical justification for. They get wrong answers in some well-known cases (the grey atmosphere model) and some not-quite-right-but-not-so-wildly-off answers using a more detailed but still heavily approximated band model. They appeared to be quite pleased that they got a result that was statically stable - that yielded sorta-realistic profiles without convective adjustment. But I'm not sure why this is good, since the real atmosphere *does* convect.

And their greenhouse model contains no feedbacks. Without feedbacks, even the mainstream models yield much lower sensitivities that are distinctly non-scary.

It's interesting in a "what-if-the-laws-of-physics-were-different?" sort of way, but I don't think it really provides any support for the maximum entropy principle, or tells us anything useful about the real atmosphere.

But that's a snap judgement based on about 15 minutes of reading. Make of it what you will.

Mar 5, 2015 at 3:01 PM | Unregistered CommenterNullius in Verba

I have had a lot of fun asking the libs I know to explain the greenhouse effect. Not the atmospheric one - the original, crop-growing greenhouse. Not one in a hundred knew what I was talking about. Then I ask them what the atmospheric effect is, and almost none can explain it coherently. When I ask for someone besides Gore or Obama who can explain it with substantial information, they flounder. Yet they call us deniers, because we are unscientific!

Mar 6, 2015 at 12:16 AM | Unregistered CommenterOld Grouch

It is totally unreasonable to expect libs to explain the Greenhouse effect when Davey himself does not understand it.

Mar 6, 2015 at 12:04 PM | Registered CommenterDung

NiV has the background to critique the Herbert et al paper, and I do not, so I take his criticisms on board.

However, Matthew Marler has pointed me to the back story concerning this paper and others like it. It seems that climate modelers are dealing with a quandary: How can we improve on the unsatisfactory results from climate modeling?

Shall we:
A. Continue tweaking models using classical maths, though they depend on climate being in quasi-equilibrium; or,
B. Start over from scratch, applying non-equilibrium maths to the turbulent climate, though this branch of maths is immature, with limited expertise.

In other words, we are confident in classical maths, but does the climate have features that disqualify it from their application? We are confident that non-equilibrium maths were developed for systems such as the climate, but are these maths robust enough to deal with such a complex reality?

It appears that some modelers are coming to grips with the turbulent quality of climate due to convection dominating heat transfer in the lower troposphere. Heretofore, models put in a parameter for energy loss through convection, and proceeded to model the system as a purely radiative dissipative system. Recently, it seems that some modelers are striking out in a new, possibly more fruitful direction. The Herbert et al paper is one example exploring the paradigm of non-equilibrium steady states (NESS). Such attempts are open to criticism from a classical position, as NiV has demonstrated.

That is my layman’s POV. Here is the issue stated by practitioners, more elegantly with bigger words:

“In particular, it is not obvious, as of today, whether it is more efficient to approach the problem of constructing a theory of climate dynamics starting from the framework of hamiltonian mechanics and quasi-equilibrium statistical mechanics or taking the point of view of dissipative chaotic dynamical systems, and of non-equilibrium statistical mechanics, and even the authors of this review disagree. The former approach can rely on much more powerful mathematical tools, while the latter is more realistic and epistemologically more correct, because, obviously, the climate is, indeed, a non-equilibrium system.” Lucarini et al 2014

Mar 7, 2015 at 4:24 PM | Unregistered CommenterRon C.

" It seems that climate modelers are dealing with a quandary: How can we improve on the unsatisfactory results from climate modeling?"

Excellent question!

Generally, when modelling a physical system, the first task is to identify all the relevant physics. This can be difficult - we don't know whether some features are emergent properties from physics we've already incorporated but cannot analyse, or whether it's something we haven't included.

The second job is to select validated models for each bit of the physics. By "validated model" I mean a model that has been demonstrated to be *sufficiently* accurate/reliable over the range of parameters to be studied. All models are wrong, but some are useful. The accuracy of models needs to be measured, and then compared against what's needed (which also needs to be determined). So, for example, there's no need to involve special relativity: all the velocities are small, and a Newtonian model is sufficient for our purposes. But we do have to invoke quantum mechanics to model absorption - a grey atmosphere model isn't sufficient.

The compatibility of the different models needs to be checked. Sometimes models make different, conflicting assumptions and these can lead to odd results.

The practicality of the model needs to be assessed. How much computation does it require? Do we have all the data we need? Have all the relevant parameters been measured, or estimated? There's no point in doing a lot of work on a scheme that you don't have the capability to carry out.

If we find our model is too difficult to work with, we may be able to make simplifications, approximations, pre-calculated look-up tables, produce ansatz models that work numerically even though they don't correspond to the physics, and so on. (That's where ideas like maximum entropy production might come in.) Keep going round the loop until we've got something we can do.

The combined model needs to be re-verified and re-validated. When we plug all the different models together, how do the individual inaccuracies in each of the component models combine? What are the error bars? How do we account for system uncertainty, where we're unsure if we've captured all the relevant physics? We design tests, to check to see if it's working, and demonstrate its validity.

We build the model and test it. We go round the debugging loop until it can be demonstrated to work.

We then test it formally using tests developed and data collected independently of any we used during development. This avoids the dangers of overfitting the test data. We also document the tested limits of its validity.

We start using it for real on low-value jobs, or as support to other methods, and build up a track record over a number of years. (It's amazing how using it in the real world reveals things you didn't think of!)

A model with a strong track record can then be used on more important jobs with higher stakes, and its predictions taken more seriously.

--

We're not in the final stages of that - nowhere near. So what can we do to get the models there?

Well, first, we need to keep working on basic mechanisms. Clouds are the big one, followed by ocean circulation and heat transfer, followed by biology (especially ocean biology, for the carbon cycle), followed by aerosols. Second, we need to collect better data. We need that to identify the physics, to provide the measured parameters and inputs, and to perform tests. We need to document the performance/accuracy of the current models and data - both what they get right and most especially what they get wrong - so we know exactly what goes wrong and understand as clearly as possible the problems we still need to solve.

We need to explore a broader range of options and ideas - at the moment, all the models look much the same, and are mostly related to one another. Making them modular and mutually compatible, so we can swap the best bits around, would seem like a good idea.

Validation and testing needs to be taken out of the hands of the developers and done independently and externally. Rival research groups would be one possibility, if they hadn't already demonstrated their inclinations towards pal review, so I'd suggest specialist scrutineers with a lot of sceptics on the staff. Their aim is to document the shortcomings, not to rubbish the models, so you need a mix from both sides of the debate. And you need to formalise the testing, auditing, and validation process, so that governments cannot use the models for any purpose they've not been demonstrated to be able to perform. If you don't have a certificate for your model testifying to its ability to predict sea ice or hurricane strength a century in advance, then you can't sell or use it for that purpose. That will then motivate and fund the development of demonstrably better models.

There's tons to do, and no shortage of ideas for what could usefully be done. I don't think that's what's getting in the way.

Mar 8, 2015 at 11:45 AM | Unregistered CommenterNullius in Verba

We then test it formally using tests developed and data collected independently of any we used during development. This avoids the dangers of overfitting the test data.
Mar 8, 2015 at 11:45 AM Nullius in Verba

Or, as sometimes expressed, testing on the training data. A fallacy recognised in the early days of attempts to develop methods for automatic pattern recognition, but not acknowledged by the Met Office when they state that the (alleged) ability of GCMs to reproduce past climate validates them for predicting future climate. A very clear and obvious case of testing on the training data.
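
A toy illustration of the fallacy - an over-flexible model tuned to one noisy sample and then scored on a fresh one (purely illustrative; no climate model involved):

```python
# "Testing on the training data" in miniature: an over-flexible model
# scores well on the data it was tuned to and badly on fresh data.
import numpy as np

rng = np.random.default_rng(0)

def sample(n=30):
    x = np.linspace(0.0, 1.0, n)
    return x, x + 0.3 * rng.standard_normal(n)    # true signal: y = x

x_train, y_train = sample()    # the "past climate" used for tuning
x_test, y_test = sample()      # independent data never seen in tuning

for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    for name, x, y in (("train", x_train, y_train), ("test", x_test, y_test)):
        rms = np.sqrt(np.mean((np.polyval(coeffs, x) - y) ** 2))
        print(f"degree {degree}, {name} RMS error: {rms:.2f}")
# The degree-9 fit beats the straight line on its own training data but
# typically loses on the held-out data: hindcast skill alone validates
# nothing.
```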

Mar 8, 2015 at 12:55 PM | Registered CommenterMartin A

Thanks for that NiV, very astute and informative.

I took the liberty of cross-posting your last part, with your attribution, at Climate Etc, where this is also being discussed, along with a link back to here.

http://judithcurry.com/2015/03/05/2-new-papers-on-the-pause/#comment-681717

Mar 8, 2015 at 3:17 PM | Unregistered CommenterRon C.

For the first time I find myself not agreeing with NiV, or at least not agreeing with the priorities going forward.

The problem with models is not the methods or mechanics but the simple fact that we do not know (or understand) all the factors that would be needed in order to make an accurate useable model. I do not think that knowledge is going to be available for a long time (personal opinion), and so, although people should keep attempting to improve models, it should be recognised that they can play no part in predicting anything right now.

Mar 9, 2015 at 2:19 PM | Registered CommenterDung

Dung, I think a significant part of the problem is that you get, say, radiative physicists who incorrectly think they know everything about how water and CO2 act in the earth's atmosphere; then you get a bunch of glaciologists who incorrectly think they know everything about how ice sheets respond to temperature and precipitation; plus a bunch of carbon cycle modelers who incorrectly think they know everything about chemistry and biology, etc etc.

Then a bunch of computer programmers, who know even less about anything, stick all these things together into a grand coupled GCM which, by implication, appears to contain the sum of a large fraction of the scientific knowledge and wisdom of the human race in one program. Even with good project management it was always likely to be complete bollocks, of course. A proverbial camel-designed-by-a-committee. The coupling together is no less important than the individual modules of the model.

Yet the supporters act as if they are the new scientific masters of the universe and anybody from a non [self-proclaimed] "Climate Science" background must bow down in deference to their science-of-everything. Too many people still in awe of what a computer model can do and not enough understanding of the limitations of the underlying principles and assumptions.

Mar 9, 2015 at 3:32 PM | Unregistered Commentermichael hart

"The problem with models is not the methods or mechanics but the simple fact that we do not know (or understand) all the factors that would be needed in order to make an accurate useable model."

Yep. -- "Generally, when modelling a physical system, the first task is to identify all the relevant physics. This can be difficult - we don't know whether some features are emergent properties from physics we've already incorporated but cannot analyse, or whether it's something we haven't included."

At the moment, we've not identified all the relevant physics. But that's a hard one to do anything about - we don't know where to start looking or even what we're looking for. The mechanisms are probably still in the domain of 'unknown unknowns'. We probably have to just keep on nibbling at the edges and wait for a lucky break.

And I'm not going to argue if someone prefers a different order of priority, or list of topics. I've chucked in a few ideas about what I might be looking at if I was a climate scientist, but of course it's a good idea to have lots of people approaching it from many different directions. Opinions should differ.

Mar 9, 2015 at 7:33 PM | Unregistered CommenterNullius in Verba

michael hart

I agree with you completely :)

There are so many facets of this global discussion that are pure fabrication, arrogance, ignorance and disinformation and it is all coming from the same side of the argument.
There is no such thing as a climate scientist and never will be; it is impossible for one person to possess and understand all the up-to-date information within all the scientific disciplines that touch upon climate change.

Mar 10, 2015 at 4:20 PM | Registered CommenterDung

Then a bunch of computer programmers, who know even less about anything, stick all these things together into a grand coupled GCM which, by implication, appears to contain the sum of a large fraction of the scientific knowledge and wisdom of the human race in one program. Even with good project management it was always likely to be complete bollocks, of course. (...)
Mar 9, 2015 at 3:32 PM michael hart

A composite of unvalidated models, with the bits too poorly understood to model at all 'parameterized', was always more than just 'likely' to be bollocks - it was a certainty.

There is a common pattern to huge programmes that attract government sponsorship: the grandiose goal catches the misinformed imagination of politicians, encouraged and egged on by career-building bureaucrats.

Even though the calibre of the physicists involved is probably far beyond that of the so-called climate scientists, the pursuit of power from nuclear fusion has a similar flavour - a huge programme, funded by taxpayers' money, built on a firm foundation of bullshit optimism and wishful thinking.

Achieving fusion power

Fusion is expected to become a major part of the energy mix during the second half of this century. With adequate funding, the first fusion power plant can be operating in the 2040s.

http://www.ccfe.ac.uk/Fusion_power.aspx

(My emphasis). Note the "can" - "could" might have been a more honest choice of word for something that has a long way to go and many hurdles to overcome. A frank assessment would be along the lines of: "fusion power has always been 50 years in the future, and it should be assumed it will remain there until the problems have been overcome".

Anybody remember the Alvey IT programme of the 1980s? £350M (in 1983 money) pissed away. Years afterwards, I heard two of its former managers asked, "What did the Alvey programme actually achieve?" After a couple of moments of nervous laughter, one of them replied, "Well, at least it got us these jobs".

Mar 11, 2015 at 11:18 AM | Registered CommenterMartin A

Indeed, Martin. A lot of people have read E. F. Schumacher's "Small is Beautiful", but a lot more people have misunderstood it.

Mar 12, 2015 at 2:10 PM | Unregistered Commentermichael hart