Remark 5th Aug 2016: some minor revision and replacement of some figures by the originals after exchanges with Dr. Frank
Dr. Pat Frank works at the SLAC National Accelerator Laboratory (Stanford). For quite some years he has studied the problem of the uncertainty and reliability of climate models and measurements, and his conclusions are damning. In this comment I will try to condense the essentials of his July 2016 presentation “No Certain Doom: On the Accuracy of Projected Global Average Surface Air Temperatures” (link), given at the 34th Meeting of the Doctors for Disaster Preparedness. It is a very clear presentation, and I urge you to spend the 42 minutes to watch the video. For all those who do not have the time, the essentials follow below.
1. The ubiquitous climate models
Dr. Frank starts with a beautifully worded observation: “Climate models are the beating heart of the modern frenzy about CO2“. He recalls that the IPCC said “it is extremely likely that human influence has been the dominant cause of the observed warming since the mid-twentieth century”, a “consensus” conclusion that has been adopted by many scientific bodies and universities. Nevertheless, when it comes to predicting future warming, the picture is not pretty:
This figure from AR5 (AR5, TS.15) shows that under the RCP 8.5 scenario, the uncertainty of the warming in 2300 (with respect to pre-industrial times) is probably more than 12°C; this enormous spread in the results given by a large number of models is a telltale sign of their poor reliability. And this large uncertainty does not even include the errors of the individual models and their propagation when future temperatures are calculated step by step. What is really amazing is that no published study shows this propagation of errors through the GCM projections, which as a consequence must be considered unknown.
2. A model of models
Dr. Frank recalls that all models essentially assume a linear dependence of warming on the change in radiative forcing during each calculation step. He introduces a simple “model of models” and compares its outcome with that of the super-computer models:
Later a supplementary term of 4 will be added before the closing right bracket. Comparing the outcome of this “Frank-model” with that of the GCMs, one must be surprised at how well it matches the big guys:

The lines represent the outcome of the “Frank-model”, the dots those of different GCM predictions (bottom is Hadcrut3). The extraordinarily good result allows Dr. Frank to say that “if you want to duplicate the projections running on super-computers, you can do it with a handheld calculator”.
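The linear emulation described above can be sketched in a few lines of code. This is a minimal sketch, not Dr. Frank's exact calculation: the functional form (warming proportional to the accumulated forcing change) follows the linear emulation equation he describes, but the constants used here (a scaling fraction `f = 0.42`, a baseline greenhouse forcing `F0 = 33.95` W/m² and a 33 K total greenhouse effect) are illustrative assumptions taken for the sketch, not values quoted from the talk.

```python
# Minimal sketch of a linear "model of models" (illustrative constants,
# not the exact figures from the presentation).

def emulated_warming(delta_forcings, f=0.42, F0=33.95, greenhouse_K=33.0):
    """Warming anomaly (K) from a list of per-step forcing changes (W/m^2).

    Linear in the accumulated forcing change, which is the key point:
    each step's warming is simply proportional to that step's forcing.
    """
    return f * greenhouse_K * sum(delta_forcings) / F0
```

Because the model is strictly linear, doubling every forcing increment doubles the projected warming, which is why a handheld calculator suffices to reproduce the GCM projections.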
3. Clouds, the elephant in the room
The influence of clouds on global temperature is enormous: globally their effect is assumed to produce a net cooling of about 25 W/m², with an annual uncertainty in the GCMs of ±4 W/m². This number must be compared to the consensus value of the forcing from human-produced GHGs (greenhouse gases), 2.8 W/m² since 1900. So the cloud uncertainty simply swallows the alleged anthropogenic warming. Usually one assumes that the errors in the models are random, so that with some luck they might at least partially cancel each other. Dr. Frank shows that nothing could be farther from the truth: the errors of the models are heavily correlated (R > 0.95 for 12 models, and R > 0.5 for 46 models).
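Why does the correlation matter? A quick simulation makes the point: if model errors were independent, averaging many models would shrink the mean error like 1/√N, but if the errors share a common systematic bias, averaging does not help at all. The numbers below (40 models, a ±4 W/m² spread) are illustrative choices for the sketch.

```python
import random

random.seed(0)
N_MODELS, N_TRIALS, SIGMA = 40, 2000, 4.0

# Case 1: independent random errors. The multi-model mean error
# shrinks roughly like SIGMA / sqrt(N_MODELS).
indep_mean_err = sum(
    abs(sum(random.gauss(0, SIGMA) for _ in range(N_MODELS)) / N_MODELS)
    for _ in range(N_TRIALS)
) / N_TRIALS

# Case 2: fully correlated errors (one shared bias per trial).
# Averaging N identical errors leaves the error unchanged.
corr_mean_err = sum(abs(random.gauss(0, SIGMA)) for _ in range(N_TRIALS)) / N_TRIALS
```

Running this, the correlated case retains essentially the full ±4 W/m² error while the independent case shrinks it several-fold, which is why the high inter-model correlations rule out the hoped-for cancellation.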
How does the cloud uncertainty propagate through the calculation scheme of the GCMs?

The annual ±4 W/m² uncertainty propagates from step to step, and the resulting uncertainty is the square root of the sum of the squares of the individual errors:
Clearly this warming range is “outside of any physical possibility”; the temperature projections of even the most advanced models become meaningless if the cloud uncertainty is properly applied.
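The root-sum-square propagation rule mentioned above is easy to sketch. Note that the ±4 W/m² forcing uncertainty must first be converted into a per-step temperature uncertainty through the model's sensitivity; the ±1.8 K/step value in the usage example below is purely an illustrative assumption, not a figure from the talk.

```python
import math

def propagated_sigma(per_step_sigma, n_steps):
    """Root-sum-square of identical per-step uncertainties.

    With n equal per-step errors sigma, this reduces to sigma * sqrt(n):
    the uncertainty envelope grows without bound as the steps accumulate.
    """
    return math.sqrt(sum(per_step_sigma ** 2 for _ in range(n_steps)))

# Hypothetical usage: a +/-1.8 K per-step uncertainty over 100 annual
# steps yields +/-18 K, far beyond any physically meaningful projection.
```

The key feature is the √n growth: no matter how small the per-step uncertainty, a step-by-step projection carried far enough into the future accumulates an envelope that dwarfs the projected signal.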
What can the climate models deliver as prediction tools? The following slide is crystal clear:
Do our politicians, who trip over each other to sign the Paris treaty, have even the slightest understanding of this uncertainty problem?