Reflections on questions concerning global warming

“Prediction is difficult, especially about the future”

———————


Richard Jones

Part 4: Where are we going?

Modelling the future climate

———————

If anyone is amused by that old joke about prediction, it may be thanks to the assumption that prediction, obviously, is about the future. However, people in the business of scientific prediction should, and do, test their crystal balls by predicting the past. We will return to that after looking at how the predictions of global warming are generated.

For this article, I need terms for the protagonists promoting or opposing the climate emergency proposition; for lack of fair, unbiased descriptions that aren’t cumbersome, I’ve settled on the somewhat disparaging labels each side uses against the other: climatists vs denialists. The latter is offensive because it exploits all the previous work badmouthing people who question official accounts of the Nazi atrocities to tar climate ‘denialists’ with the same brush. ‘Climatist’ is insulting to people who intend to present scientific results and believe that they are doing exactly that. Of course there are charlatans on both sides, and each has publicists who find it convenient to argue against the charlatans of the other side rather than its serious analysts. What’s new?

First, I’ll describe how scientific predictions about climate change are made. This, without any attempt to be comprehensive, is intended to convey firstly that genuine science is involved, and secondly the limitations of the modelling process. Then, after a brief account of the actual predictions, I’ll look at the political output: the way the results are presented to governments and the public by the climatists, and reactions from denialists.

In attempting to learn how scientists come up with the material used by climatists to convince the world of a climate emergency, I wanted sources not involved in the passionate argumentation, and to this end I went for textbooks. As I’ve mentioned, climate science as a discipline seems to be a product of the climate emergency, so it’s unlikely that a textbook from a climate scientist will pronounce that it’s all a lot of tosh. But that is not the point. To learn what climate scientists are actually doing, reasonably free of propaganda, look at how they are taught the job.

Of course, as with any branch of science, they have to learn about the science’s findings, and I used this previously to provide the underwhelming estimated historical global warming (a little out of date, as we shall see). Now, a different text gives a succinct description of how predictions of future climate are generated. I have deliberately selected a free textbook so that it is easy for any reader to access.

Andreas Schmittner in ‘Introduction to Climate Science’ explains the basics of climate modelling:

“Climate models solve budget equations numerically on a computer. The equations are based on the conservation of energy, momentum, and mass (air, water, carbon, and other relevant elements, substances, and tracers). Typically they are solved in separate boxes that represent specific regions of Earth’s climate system components (Fig. 1). Along their boundaries the boxes exchange energy, momentum and mass. Exchange with the flow of water or air from one box to another is called advection. Prognostic variables such as temperature, specific humidity in the atmosphere, or salinity in the ocean, and three velocity components (zonal, meridional, and vertical) are calculated in each box. The momentum equations, which are used to calculate the velocities, are based on Newton’s laws of motion and they include effects of the rotating Earth such as the Coriolis force. The temperature equations are based on the laws of thermodynamics.

Thus, climate models represent the fundamental laws of physics as applied to Earth’s climate system.” 

This is real science in action. Before elaborating on that, I’ll point out that Fig. 1 is taken straight from the Wikipedia article on climate modelling. The wiki article is probably harder to digest than the passage I quoted from the book, but the image may make the whole idea more accessible.

https://en.wikipedia.org/wiki/General_circulation_model
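To make the idea of budget equations concrete, here is a toy sketch of the simplest possible climate model: a zero-dimensional energy balance model, the single-box ancestor of the multi-box schemes Schmittner describes. Every number in it is a rough illustrative value I have chosen, not something taken from his book or from any real model.

```python
# Toy zero-dimensional energy balance model: one "box" for the whole
# planet, balancing absorbed sunlight against emitted thermal radiation.
# All parameter values are rough, illustrative choices.

SIGMA = 5.67e-8       # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR = 1361.0        # solar constant, W m^-2
ALBEDO = 0.3          # planetary albedo (fraction of sunlight reflected)
EMISSIVITY = 0.612    # effective emissivity; crudely stands in for the greenhouse effect
HEAT_CAP = 4.0e8      # effective heat capacity of the surface layer, J m^-2 K^-1

def step(temp_k, dt=86400.0):
    """Advance the global mean temperature by one time step (seconds)."""
    absorbed = SOLAR * (1 - ALBEDO) / 4          # incoming shortwave per m^2
    emitted = EMISSIVITY * SIGMA * temp_k ** 4   # outgoing longwave per m^2
    return temp_k + dt * (absorbed - emitted) / HEAT_CAP

temp = 250.0  # start well below equilibrium, in kelvin
for _ in range(365 * 50):   # integrate 50 years of daily steps
    temp = step(temp)
print(round(temp, 1))  # settles near the equilibrium, about 288 K
```

A real model does this budget arithmetic in thousands of boxes at once, for momentum and mass as well as energy, with the boxes exchanging along their boundaries; the single box shows only the bookkeeping principle.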

In physics, the first step in trying to understand something unfamiliar is to come up with a model. There is no expectation that the model is the ‘truth’ of the matter; it’s just a learning step. Around 1900 the atomic theory was simultaneously triumphant and overthrown: just as the ancient Greek idea that all matter is made from a small number of varieties of indivisible things called atoms was all but confirmed, smaller things were found that seemed to be parts of these atoms, and the composition of atoms itself became a question. The Bohr model of the atom, with electrons orbiting a positively charged nucleus like planets around the sun, was understood to be impossible, but it was a giant step forward, and its limitations provoked the line of enquiry that led to quantum mechanics.

If you want to understand what determines the physical properties of metals (such as thermal and electrical conductivity), you may well begin with a model that treats the metal as a set of electrons moving freely but influenced by electrical forces from the matrix of positive ions. It turns out that this simplest of models works fairly well for silver but is useless for most metallic elements, let alone alloys. Since that first model, the theory has been progressively refined.

The earth’s climate is a much more complex system to model than, say, a pure metal. The method described above mentions energy exchange only at the boundaries of the ‘boxes’, so it doesn’t directly mention the most contentious ingredient in the ‘climate emergency’: absorption of outgoing infrared radiation by carbon dioxide and other ‘greenhouse gases’. Nor does it mention change of state, which we can think of as the elephant in the room. Granted that this is an introductory text and the description of the modelling process will be simplified, it is still a worry that the most dynamic system of energy transfer in the atmosphere doesn’t rate a mention here, and that the promotion of the climate emergency theme focuses on the passive carbon dioxide component with scarcely any mention of the evaporation, condensation and sublimation components of the earth’s dynamic heat engine. Indeed, the closest we get to that is the increasing assertions about ‘extreme climate events’. Nearly all of these events involve the massive energy exchange of the heat pump in which water evaporates from the ocean surface and condenses to water or sublimates to ice high in the atmosphere.

Then there is albedo (the reflectiveness of surfaces to solar radiation). Intuitively (but not necessarily correctly) it seems that higher surface temperatures on water will lead to more evaporation, and then more condensation as cloud, which reflects a lot of sunlight back into space. In climatist publicity, albedo is mentioned in connection with the loss of ice cover. This and other factors in the earth’s albedo can be readily measured from satellites, and doubtless are. Albedo figures in climate models, but it doesn’t get much publicity compared to the ice caps and glaciers. Reflection of tropical sunlight is of course much more significant than reflection of polar sunlight: if climate change results in more cloud at high latitudes and less near the equator, then overall albedo is reduced, adding to the warming trend. Also, while clouds reflect solar radiation into space, they trap outgoing radiation too, something we experience directly through nights that are warmer under cloud cover than under clear skies.

These comments are not meant to disparage the modelling. Most of this, and much more, is accounted for in various models. Indeed, Schmittner describes how (as with other physical modelling, such as that of metal properties mentioned above) climate models have gone through a process of stage-by-stage refinement. The point is just to underline how difficult and complex a task it is. Anyone who has read a page or two about chaos theory will have come across the legendary butterfly flapping its wings and causing an extreme weather event at the other end of the world. Absurd as that is, it emphasises that a small error (including one of omission) in the data input to a model could cause a large error in the output. On the other hand, my comments can be fairly dismissed as armchair science. People actually doing the modelling (or reading all the reports in the journals) will have a good idea which of the complications and weaknesses are important; some may have only a trivial effect on prediction.
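The butterfly point can be made precise with a toy chaotic system. The sketch below uses the logistic map, a standard one-line example from chaos theory (not a climate model), to show how an input error of one part in ten billion grows to full scale:

```python
# The logistic map, a standard toy chaotic system. Two runs that start
# a mere 1e-10 apart end up completely decorrelated.

def logistic(x, r=3.9):
    # r = 3.9 puts the map well inside its chaotic regime
    return r * x * (1 - x)

a, b = 0.4, 0.4 + 1e-10
gaps = []
for _ in range(100):
    a, b = logistic(a), logistic(b)
    gaps.append(abs(a - b))

print(max(gaps[:10]))   # still tiny: the error grows only gradually at first
print(max(gaps[-30:]))  # no longer small: the trajectories have fully diverged
```

A climate model is not a logistic map, but the same arithmetic threatens any system with sensitive dependence on its inputs: small errors, including errors of omission, can be amplified until they dominate.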

It is not at all surprising to learn that, despite the best efforts of the scientists with their models, there is wide variation in their predictions. Indeed, Schmittner points out unavoidable errors in the modelling process, beyond obvious ones like inaccurate or insufficiently precise data.

“Due to the limited resolution of the models, processes at spatial scales below the grid box size cannot be directly simulated. For example, individual clouds or convective updrafts in the atmosphere are often only a few tens or hundred meters in size and therefore cannot be resolved in global atmospheric models.”

I won’t try to explain ‘parametrization’, but it is a mathematical approach to dealing with those obstacles, and “The parameter values are usually not precisely known but they influence the results of the climate model. Therefore, parametrization are a source of error and uncertainty in climate models.

Parameters in the cloud parametrization of a model, for example, will impact its cloud feedback and therefore its climate sensitivity.” I’m also led to wonder about the potential for parametrization to be a source of spurious accuracy. Models are tested by ‘hindcasting’: applying them to past climate and comparing the output with the historical record. There is plenty of scope for fine-tuning a model by adjusting the parameters. This can be a legitimate process, depending on many considerations, but it has to be recognised as a kludge, useful as a temporary expedient while developing a branch of science or a particular application. Its history is not altogether a happy one. Einstein found that his theory of gravitation, the General Theory of Relativity, predicted that the universe must be expanding or contracting, which as far as was known at the time was not the case. So he added to his fundamental equation for gravity a ‘cosmological constant’ that made a steady-state universe possible. Oops!
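The worry about fine-tuning can be illustrated with a deliberately crude sketch. Below, two hypothetical ‘models’ are fitted to the same invented noisy record: a two-parameter straight line, and a ten-parameter polynomial tuned to pass through every data point. The second ‘hindcasts’ perfectly, yet its forecast is nonsense. None of this is real climate data or a real climate model; it only shows how tuned parameters can buy spurious accuracy about the past.

```python
# Overfitting demo: more tuned parameters means a better hindcast and,
# potentially, a far worse forecast. All data here is synthetic.
import random
random.seed(1)

# Synthetic "historical record": a gentle linear trend plus noise.
years = list(range(10))
obs = [0.02 * t + random.gauss(0, 0.05) for t in years]

def fit_line(xs, ys):
    """Two-parameter model: ordinary least-squares straight line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return lambda x: my + slope * (x - mx)

def fit_interpolant(xs, ys):
    """Ten-parameter model: Lagrange polynomial through every point."""
    def poly(x):
        total = 0.0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            term = yi
            for j, xj in enumerate(xs):
                if j != i:
                    term *= (x - xj) / (xi - xj)
            total += term
        return total
    return poly

line = fit_line(years, obs)
interp = fit_interpolant(years, obs)

# Both hindcast well; the interpolant's hindcast error is exactly zero.
# But project ten steps past the end of the record:
print(line(20))    # a modest extrapolation of the trend
print(interp(20))  # the over-tuned model blows up
```

Real parametrization tuning is far more disciplined than this, but the sketch shows why a good fit to the past is not, by itself, evidence of forecasting skill.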

If the predictions of the various climate models differ so much, how can it be said that there is a “scientific consensus”? The predictions of the various models are averaged. The success claimed for climate modelling is based on comparing the resulting “model mean” from hindcasts with historical climate changes. The validity of the averaging really depends on the assumption that hindcasting is a reliable guide to a model’s success at forecasting. Random errors can be reduced by averaging many estimates, but systematic errors cannot.
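That last sentence is worth a numerical illustration. In the sketch below, a thousand imaginary ‘models’ each estimate the same true value; each has its own random error, but all share one systematic error. Averaging removes the first and leaves the second untouched. The numbers are arbitrary.

```python
# Averaging many estimates cancels random error but not shared
# systematic error (bias). Toy numbers, for illustration only.
import random
random.seed(0)

TRUE_VALUE = 1.0
BIAS = 0.5     # systematic error shared by every "model"
NOISE = 0.3    # independent random error in each "model"

estimates = [TRUE_VALUE + BIAS + random.gauss(0, NOISE) for _ in range(1000)]
mean = sum(estimates) / len(estimates)

print(mean)  # close to 1.5 (true value plus bias), not to 1.0:
             # the random scatter averaged away; the bias did not.
```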

My summary has probably been excessively negative in focusing on matters that denialists may call weaknesses and climatists call challenges. There is excellent information on the web. I recommend a couple of articles on Carbon Brief.

As far as I have seen, the denialists do not make climate predictions; they criticise or ridicule the predictions of the climatists, or accept the less dire predictions and contest the conclusions drawn from them.

There are scientists who disagree with climatist agendas, but to the best of my poor knowledge they have not run models and proffered dissenting predictions. So it is a bit of an anti-climax when we look at the current predictions for future climate change: they can be found assembled in one place with zero controversy. There are multiple predictions, but they differ in scenarios, mainly the rate at which humans add carbon dioxide to the atmosphere. Attention is directed to the scenarios as the basis for variation, and away from any differences arising from the modelling process itself, such as the use of estimates in place of unknown quantities.

The climate scientists who run the models and publish predictions work collaboratively. You can think ‘conspiracy’ if you wish, but this is how real science is done. (Much of what is called science does not deserve the name because it is done in secret, not collaboratively, and cannot be subjected to public scrutiny.) They also work closely with the Intergovernmental Panel on Climate Change (IPCC), a kind of climatist consortium set up as a joint project of the World Meteorological Organization and the UN Environment Programme. The IPCC periodically publishes reports that detail the predictions of the latest model means. The current predictions are found in the “Sixth Assessment Report (AR6)”. I accessed this through the “Synthesis Report (SYR)”, which “synthesizes and integrates materials contained within the three Working Groups Assessment Reports and the Special Reports contributing to the AR6. It addresses a broad range of policy-relevant but policy-neutral questions approved by the Panel.”

If it is policy-neutral, that must be in the same sense that US and Israeli policy on genocide is neutral as to whether to achieve it by bullets, phosphorus fragmentation bombs, starvation, etc. The IPCC doesn’t tell governments what measures to take but it is emphatic in telling them that drastic action is needed. I can see that anyone immersed in ‘climate emergency’ politics would see that as neutral. But I risk jumping ahead to the politics. That is not for this article. First let’s see the predictions.

Actually it’s hard to avoid the politics completely; in one respect it is everywhere. Leaving aside the typos and possibly unintentional ambiguities, there is constant Mandarin Newspeak: codewords and abbreviations that deter the uninitiated from attempting to understand, and lots of pretty graphs and diagrams to convince them to give up and just trust the experts.

I don’t suppose the reader is desperately anxious to see the predictions, so I will continue on that theme with a paragraph of Mandarin Newspeak on the subject of climate modelling:

“Modelled scenarios and pathways are used to explore future emissions, climate change, related impacts and risks, and possible mitigation and adaptation strategies and are based on a range of assumptions, including socio-economic variables and mitigation options. These are quantitative projections and are neither predictions nor forecasts. Global modelled emission pathways, including those based on cost effective approaches contain regionally differentiated assumptions and outcomes, and have to be assessed with the careful recognition of these assumptions. Most do not make explicit assumptions about global equity, environmental justice or intra-regional income distribution. IPCC is neutral with regard to the assumptions underlying the scenarios in the literature assessed in this report, which do not cover all possible futures.”

There is one nice thing in all that: someone has kindly chosen to educate the reader in the correct use of Mandarin Newspeak. Predictions or forecasts are to be called ‘quantitative projections’, which of course they are not. The simplest way to predict changes in temperature or sea level would be to take a graph of recent changes and extend it into the future, genuinely projecting the observed quantities. Sadly, the result of that would not be nearly as alarming as some of the ‘scenarios’.
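For the record, here is what projecting observed quantities would literally look like: fit a straight line to a recent record and extend it. The ‘record’ below is invented (a trend of 0.02 °C per year plus a deterministic wiggle), so the resulting figure illustrates only the method, not any actual forecast.

```python
# Naive projection: fit a straight line to a recent record and extend
# it into the future. The record is invented illustrative data.

years = list(range(1980, 2021))
# trend of 0.02 deg C/year plus a small repeating wiggle standing in for noise
anomaly = [0.02 * (y - 1980) + 0.03 * ((y * 7) % 5 - 2) for y in years]

n = len(years)
mean_y, mean_a = sum(years) / n, sum(anomaly) / n
slope = sum((y - mean_y) * (a - mean_a) for y, a in zip(years, anomaly)) / \
        sum((y - mean_y) ** 2 for y in years)

projection_2100 = mean_a + slope * (2100 - mean_y)
print(round(projection_2100, 2))  # about 2.4 deg C above the 1980 level, for this invented record
```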

So, at last, here are the ‘quantitative projections’:

“Global warming will continue to increase in the near term (2021–2040) mainly due to increased cumulative CO2 emissions in nearly all considered scenarios and modelled pathways.” … “The assessed climate response to GHG emissions scenarios results in a best estimate of warming for 2081–2100 that spans a range from 1.4°C for a very low GHG emissions scenario (SSP1-1.9) to 2.7°C for an intermediate GHG emissions scenario (SSP2-4.5) and 4.4°C for a very high GHG emissions scenario (SSP5-8.5), with narrower uncertainty ranges than for corresponding scenarios in AR5.”

(AR5 refers to the preceding IPCC report.)

The “warming for 2081–2100” is actually warming relative to the 1850–1900 baseline, i.e. over roughly two centuries ending with that period, rather than what it literally says. This kind of translation is required throughout, whenever the reader is confronted with the predictions or forecasts... oops, quantitative projections. These are spread over a significant part of the document, but the majority of it is about what is to be done. In a policy-neutral fashion, of course.

Discussion of the not-predictions, of course, involves the various ‘scenarios’, mostly about CO2.  So the variation in output of actual models is masked behind the variation based on different CO2 futures.

What are we to think of these projections? The greenhouse effect is real, and essential for life on earth. Increasing CO2 must increase temperatures in the absence of any counterbalance. However, the very complex system that is fancifully called ‘global climate’ has numerous feedbacks, some positive and some negative. (Negative feedback involves counterbalancing effects, encouraging stability. Positive feedback encourages runaway processes, although in general there are limits, and reaching a limit can trigger an opposite runaway process. In electronics, this is the basis for oscillators; in climate, it plays a part in the alternation of ice age and interglacial.)
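The feedback arithmetic can be sketched in a few lines. A direct warming dT provokes an extra f·dT, which provokes f²·dT, and so on: a geometric series that converges to dT/(1−f) when |f| < 1 and runs away when f ≥ 1. The numbers below are arbitrary illustrations.

```python
# Feedback in its simplest linear form: each round of warming dT feeds
# back an extra f * dT. Illustrative numbers only.

def equilibrium_warming(direct, f, rounds=200):
    """Sum the geometric series direct * (1 + f + f^2 + ...)."""
    total, term = 0.0, direct
    for _ in range(rounds):
        total += term
        term *= f
    return total

print(equilibrium_warming(1.0, -0.5))  # negative feedback damps: about 0.67
print(equilibrium_warming(1.0, 0.5))   # positive feedback amplifies: about 2.0
# With f >= 1 the series runs away; in reality some limit intervenes.
```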

Two things trouble me with all this.

The report glibly admits that not all possible scenarios are covered. This could be the understatement of the century. For instance: if you burn wood (or children) you add CO2 to the atmosphere, which has a warming effect, but you also add smoke particles, which have a cooling effect. How can we not wonder at the amount of grovelling that allows all this ‘quantitative projection’ to be based almost exclusively on how we generate power for civil use, ignoring questions like the balance, in the war against Russia, between the effect of fuel burning and aerosols from explosive munitions, let alone the aerosols from nuclear munitions in the context of the empire’s attempts to legitimate nuclear war.

The other issue is the need to suspend disbelief in the modelling. If you only have inaccurate rulers, you can (unless their errors are identical) improve accuracy by measuring a height with many of them and taking an average. On TV there are sometimes quiz games that involve guessing a number, and if you are last to guess, you will improve your odds by averaging the previous guesses. All this is simply a property of numbers. Model averages are, just by the nature of the mathematics, probably closer to reality than the model extremes.

Climate modelling is done in a similar manner to weather forecasting, which nowadays is often quite accurate but has limitations. My recent experience in Perth, Western Australia, is that the nightly minimum temperature is a challenge, as it is so sensitive to cloud cover. That failure suggests an inability to accurately model the water-based heat engine that I mentioned.

Other failures seem to involve fairly accurate prediction that falls foul of a little uncertainty in the timing or position of rain: if a region of heavy falls is slightly north or south of the prediction, there are locations much wetter or drier than forecast. This kind of error is probably inconsequential for climate modelling.

All in all, a given amount of computer resources can use much smaller spatial boxes for weather than for global climate, and smaller time intervals. On the other hand, a climate model can run over an extended time that would render a weather forecast pointless. Ultimately, however, no amount of computer power and time can overcome the basic problem of uncertainty in the input to a model. So the outputs vary, and the divergent results are averaged to get something that, with hindcasting, fits the observations tolerably well. (I haven’t checked this, and I know of one instance where denialists point to deliberate fraud in the fitting of model to the records.)
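The resolution trade-off can be quantified roughly. Halving the horizontal box size quadruples the number of boxes, and numerical stability (the CFL condition) generally forces the time step down in proportion, so compute cost grows roughly with the cube of horizontal resolution. A back-of-envelope sketch:

```python
# Rough cost scaling for grid models: refining the horizontal grid by a
# factor k multiplies the box count by k^2, and stability (the CFL
# condition) typically shrinks the time step by k, so cost grows ~k^3.
# This is a back-of-envelope rule, not a statement about any real model.

def relative_cost(refinement):
    boxes = refinement ** 2   # more horizontal boxes
    steps = refinement        # shorter time steps, so more of them
    return boxes * steps

print(relative_cost(2))  # halve the box edge: ~8x the compute
print(relative_cost(4))  # quarter it: ~64x
```

This is why weather forecasts, run for days, can afford much finer boxes than climate models, which must be run for centuries.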

Out of all this we have an underwhelming group of forecasts for a handful of scenarios that mask the disagreements between models by averaging them for each scenario. The term ‘scenario’ is a grandiose smokescreen for handling a single unknown, the net input of ‘carbon’ (actually, carbon dioxide) to the atmosphere, while evading the others. So if all the other unknowns, such as solar radiation, vulcanism and nuclear war, behave themselves (with negative and positive feedbacks contributing in a conveniently minor way), then we have some estimates of increasing average temperatures over the next century.

And we have some truly extravagant claims about the likely consequences, which I will address next, before taking another look at the measures advocated for dealing with the ‘climate emergency’.
