Archive for September, 2016

The never vanishing ozone hole!

September 19, 2016

There have been some discussions in the media about the success of the Montreal Protocol in eliminating the use of ozone-depleting substances, and about its effect on the Antarctic ozone hole.

A much more sober evaluation can be found in the EEA report “Production and consumption of ozone depleting substances”. I made an overlay of the two important graphs: the area of the Antarctic ozone hole and the world consumption of ozone-depleting substances (ODS) such as CFCs:

[Figure: consumption_and_area — overlay of the Antarctic ozone hole area and the world ODS consumption]

This objective plot shows that the ozone hole area has remained practically constant since 1992, whereas the consumption of ODS has fallen sharply to near-zero levels. This calls for a serious explanation: are the man-made ODS the sole cause of ozone depletion, or are other phenomena at work here? For 20 years the ODS consumption has been less than 20% of its value 30 years ago; should we really assume that a delay of 20 to 30 years is needed before seeing the effect of the ODS ban?
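For readers who want to build such an overlay themselves, here is a minimal sketch using matplotlib with two y-axes. The file and column names are hypothetical placeholders; the actual numbers would have to be digitized from the graphs in the EEA report.

# Minimal sketch of a two-axis overlay plot (hypothetical file and column names).
import pandas as pd
import matplotlib.pyplot as plt

hole = pd.read_csv("ozone_hole_area.csv")    # hypothetical columns: year, area_mkm2
ods = pd.read_csv("ods_consumption.csv")     # hypothetical columns: year, consumption_kt

fig, ax1 = plt.subplots()
ax1.plot(hole["year"], hole["area_mkm2"], color="tab:blue")
ax1.set_xlabel("Year")
ax1.set_ylabel("Ozone hole area (million km2)", color="tab:blue")

ax2 = ax1.twinx()                            # second y-axis sharing the same x-axis
ax2.plot(ods["year"], ods["consumption_kt"], color="tab:red")
ax2.set_ylabel("ODS consumption (kt ODP)", color="tab:red")

fig.tight_layout()
plt.show()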

Just for us in the Northern Hemisphere: here is the graph of the total ozone column (TOC) as measured in Uccle since 1972; the grey region corresponds to the [-2sigma, +2sigma] interval, which contains about 95% of all values.

[Figure: uccle_toc — total ozone column measured at Uccle, with the 2-sigma band]
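As a reminder of where that coverage figure comes from (assuming the values are roughly normally distributed):

P(\mu - 2\sigma \le X \le \mu + 2\sigma) \approx 95.4\%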

 

It is remarkable how fast the local TOC changes, yet the yearly cycle remains nicely sinusoidal; the average TOC values do not show any decrease to be afraid of (there is a first period of decrease followed by one of increase, and we are still in the latter): no ozone hole here!

[Figure: uccle_trend — long-term trend of the Uccle TOC]
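A simple way to check the “sinusoidal cycle plus no worrying trend” impression quantitatively is to fit a sine-plus-line model to the monthly TOC values. The sketch below uses scipy with a hypothetical data file and is only meant to illustrate the method, not to reproduce the Uccle analysis.

# Fit TOC(t) = mean + trend*t + annual sine wave (hypothetical data file).
import numpy as np
import pandas as pd
from scipy.optimize import curve_fit

df = pd.read_csv("uccle_toc_monthly.csv")   # hypothetical columns: year_decimal, toc_du

def model(t, a, b, c, d):
    # a = mean level (DU), b = linear trend (DU per year),
    # c, d = sine/cosine amplitudes of the annual cycle
    return a + b * t + c * np.sin(2 * np.pi * t) + d * np.cos(2 * np.pi * t)

t = df["year_decimal"].values
y = df["toc_du"].values
params, cov = curve_fit(model, t, y)
print(f"Linear trend: {params[1] * 10:.2f} DU per decade")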

No Certain Doom

September 4, 2016

Remark 5th Aug 2016: some minor revision and replacement of some figures by the originals after exchanges with Dr. Frank

——————————————————————

Dr. Pat Frank works at the SLAC National Accelerator Laboratory (the former Stanford Linear Accelerator Center). For quite some years he has studied the problem of uncertainty and reliability in climate models and measurements, and his conclusions are damning. In this comment I will try to condense the essentials of his July 2016 presentation “No Certain Doom: On the Accuracy of Projected Global Average Surface Air Temperatures” (link), given at the 34th Meeting of the Doctors for Disaster Preparedness. This is a very clear presentation, and I urge you to spend the 42 minutes to watch the video; for those who do not have the time, the essentials are condensed below.

1. The ubiquitous climate models

Dr. Frank starts with a beautifully worded observation: climate models are the beating heart of the modern frenzy about CO2. He recalls that the IPCC said “it is extremely likely that human influence has been the dominant cause of the observed warming since the mid-twentieth century”, a “consensus” conclusion that has been adopted by many scientific bodies and universities. Nevertheless, when it comes to predicting future warming, the picture does not look very nice:

[Figure: AR5_TS15 — projected warming under the RCP scenarios (IPCC AR5, Fig. TS.15)]

This figure from AR5 shows that, using the RCP 8.5 scenario, the uncertainty of the warming in 2300 (with respect to pre-industrial times) is probably more than 12°C; this enormous spread of the results given by a very great number of models is a telltale sign of their poor reliability. Moreover, this large spread does not include the errors of the individual models and their propagation when future temperatures are calculated step by step. What is really amazing is that no published study shows this propagation of errors through the GCM projections, which as a consequence must be considered unknown.

2. A model of models

Dr. Frank recalls that all models essentially assume a linear dependency of the warming on the change of radiative forcing during one calculation step. He introduces a simple “model of models” and compares its outcome with that of the super-computer models:

[Figure: linear_model_of_models — Dr. Frank’s linear emulation equation]

Later a supplementary term of 4 will be added before the closing right bracket. Comparing the outcome of this “Frank model” with the GCMs, one must be surprised how well it matches the big guys:

 

[Figure: comparison_2GCM_Frank_model — GCM projections versus the linear emulation]

The lines represent the outcome of the “Frank model”, the dots those of different GCM projections (the bottom panel is HadCRUT3). The extraordinarily good agreement allows Dr. Frank to say that “if you want to duplicate the projections running on super-computers, you can do it with a handheld calculator”.
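To make the idea of a linear emulation concrete, here is a minimal sketch of such a “model of models”. The coefficients (a CO2 fraction of 0.42, the 33 K natural greenhouse warming, a baseline greenhouse forcing of about 33.3 W/m2) are the values Dr. Frank uses in his published emulation equation as far as I know; treat them, and the invented forcing series, as assumptions for illustration, not as a transcription of the slide.

# Minimal sketch of a linear "model of models" emulator.
# ASSUMPTIONS: coefficients as described above; the forcing increments below are
# invented for illustration and are NOT an RCP forcing series.
F0 = 33.3          # assumed baseline greenhouse forcing, W/m2
FRACTION = 0.42    # assumed CO2 fraction of the greenhouse effect
GREENHOUSE_K = 33  # natural greenhouse warming, K

def step_warming(delta_forcing_wm2):
    """Warming of one projection step, linear in the forcing change."""
    return FRACTION * GREENHOUSE_K * delta_forcing_wm2 / F0

# Illustrative forcing increments: 0.04 W/m2 per year for 85 years (~3.4 W/m2 total).
yearly_delta_f = [0.04] * 85

warming = 0.0
for df in yearly_delta_f:
    warming += step_warming(df)    # step-by-step accumulation, as the GCMs do
print(f"Emulated warming after {len(yearly_delta_f)} years: {warming:.2f} K")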

3. Clouds, the elephant in the room

The influence of clouds on global temperature is enormous: globally their effect is assumed to be a cooling of about 25 W/m2, with an annual uncertainty in the GCMs of +/- 4 W/m2. This number must be compared to the consensus value of the forcing from human-produced greenhouse gases (GHG), about 2.8 W/m2 since 1900. So the cloud uncertainty simply swallows the alleged anthropogenic warming. Usually one assumes that the errors in the models are random, so that with some luck they could at least partially cancel each other. Dr. Frank shows that nothing could be farther from the truth: the errors of the models are heavily correlated (R > 0.95 for 12 models, and R > 0.5 for 46 models).
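To put those two numbers side by side (my own back-of-the-envelope arithmetic, not a slide from the talk):

\frac{\pm 4\ \text{W/m}^2}{2.8\ \text{W/m}^2} \approx \pm 1.4

In words: the cloud forcing uncertainty of a single model year is already some 40% larger than the whole anthropogenic forcing accumulated since 1900.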

How does the cloud uncertainty propagate through the calculation scheme of the GCMs?

The annual 4 W/m2 uncertainty propagates from step to step, and the resulting uncertainty is the square root of the sum of the squares of the individual step errors:

[Figure: cloud_errors_in_GCM — step-wise propagation of the cloud forcing error]
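Written out (my own notation, not taken from the slide), with u_i the uncertainty of step i and N the number of steps, and assuming every step carries the same uncertainty u:

\sigma_N = \sqrt{\sum_{i=1}^{N} u_i^{2}} = u\,\sqrt{N}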
The result is an ever-increasing uncertainty, which reaches about 13°C for the warming predicted for 2100:

[Figure: propagation_of_cloud_error — growth of the uncertainty envelope of the projected warming]

Clearly this warming range is “outside of any physical possibility”; the temperature projections of even the most advanced models become meaningless if the cloud uncertainty is properly applied.
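The root-sum-square propagation itself is easy to reproduce with the little emulator sketched above. The following is my own illustration, with the same assumed coefficients; it is not a reconstruction of the exact numbers behind the slide, which depend on the precise emulator coefficients and the start year.

# Sketch of the uncertainty propagation: each yearly step inherits the +/- 4 W/m2
# cloud forcing uncertainty, converted to a temperature uncertainty by the same
# (assumed) linear sensitivity, and the steps are combined in quadrature.
import math

F0 = 33.3                 # assumed baseline greenhouse forcing, W/m2
FRACTION = 0.42           # assumed CO2 fraction of the greenhouse effect
GREENHOUSE_K = 33         # natural greenhouse warming, K
CLOUD_UNCERT_WM2 = 4.0    # annual cloud forcing uncertainty, W/m2 (from the talk)

u_step = FRACTION * GREENHOUSE_K * CLOUD_UNCERT_WM2 / F0   # per-step uncertainty, K

for years in (10, 25, 50, 85):
    envelope = u_step * math.sqrt(years)   # square root of the sum of identical squared errors
    print(f"after {years:2d} years: +/- {envelope:.1f} K")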

4. Conclusion

What can the climate models deliver as prediction tools? The following slide is crystal clear:

[Figure: what_do_climate_models_reveal_newfigure — what the climate models can and cannot tell us]

Do our politicians, who fall over one another to sign the Paris treaty, have even the slightest understanding of this uncertainty problem?