No Certain Doom

Remark, 5th Aug 2016: some minor revisions and the replacement of some figures with the originals, after exchanges with Dr. Frank.

——————————————————————

Dr. Pat Frank works at the SLAC National Accelerator Laboratory at Stanford. For quite a few years he has studied the problem of the uncertainty and reliability of climate models and measurements, and his conclusions are damning. In this comment I will try to condense the essentials of his presentation "No Certain Doom: On the Accuracy of Projected Global Average Surface Air Temperatures" (link), given in July 2016 at the 34th Annual Meeting of the Doctors for Disaster Preparedness. It is a very clear presentation, and I urge you to spend the 42 minutes to watch the video. For all those who do not have the time, the essentials follow below.

1. The ubiquitous climate models

Dr. Frank starts with a beautifully worded observation: climate models are the beating heart of the modern frenzy about CO2. He recalls that the IPCC said "it is extremely likely that human influence has been the dominant cause of the observed warming since the mid-twentieth century", a "consensus" conclusion that has been adopted by many scientific bodies and universities. Nevertheless, when it comes to predicting future warming, the picture is not pretty:

[Figure: IPCC AR5, Technical Summary, Fig. TS.15]

This figure from AR5 (TS.15) shows that under the RCP 8.5 scenario, the uncertainty of the warming in 2300 (with respect to pre-industrial times) is probably more than 12°C; this enormous spread in the results given by a very great number of models is a telltale sign of their poor reliability. Note that this large spread does not even include the errors of the individual models and their propagation when future temperatures are calculated step by step. What is really amazing is that no published study shows this propagation of errors through the GCM projections, which should consequently be considered unknown.

2. A model of models

Dr. Frank recalls that all models essentially assume a linear dependency between warming and the change in radiative forcing during one calculation step. He introduces a simple "model of models" and compares its outcome with that of the super-computer models:

ΔT_i(K) = f_CO2 × 33 K × [ (F_0 + Σ ΔF_i) / F_0 ]

where f_CO2 is the CO2 fraction of the greenhouse effect, 33 K the total greenhouse warming of the surface, F_0 the baseline greenhouse forcing, and Σ ΔF_i the accumulated change in forcing up to step i.

Later, a supplementary term of ±4 W/m² (the annual cloud forcing uncertainty discussed in the next section) will be added before the closing right bracket. Comparing the outcome of this "Frank model" with the GCMs, one must be surprised how well it matches the big guys:


[Figure: comparison of two GCM projections with the Frank model]

The lines represent the outcome of the "Frank model", the dots the outcomes of different GCM predictions (the bottom series is HadCRUT3). The extraordinarily good agreement allows Dr. Frank to say that "if you want to duplicate the projections running on super-computers, you can do it with a handheld calculator".
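To make this concrete, here is a minimal Python sketch of such a linear emulator; this is my own illustration, not Dr. Frank's code. It implements the emulation equation above, with assumed values for f_CO2 and F_0 and a purely illustrative forcing scenario.

```python
# Minimal sketch of a linear "model of models" (emulator).
# Implements: dT_i = f_CO2 * 33 K * [(F_0 + sum(dF_1..dF_i)) / F_0],
# reported here as the warming anomaly relative to the baseline.
# f_CO2, F_0 and the forcing scenario below are assumed, illustrative values.

F0 = 33.30            # W/m^2, baseline greenhouse forcing (assumed)
F_CO2 = 0.42          # CO2 fraction of the greenhouse effect (assumed)
DT_GREENHOUSE = 33.0  # K, total greenhouse warming of the surface

def emulate(forcing_increments):
    """Return the cumulative warming anomaly after each forcing step."""
    total_dF = 0.0
    anomalies = []
    for dF in forcing_increments:
        total_dF += dF
        # Linear response: the warming anomaly scales with the fractional
        # change of total forcing relative to the baseline F0.
        anomalies.append(F_CO2 * DT_GREENHOUSE * (total_dF / F0))
    return anomalies

# Toy scenario: a constant +0.04 W/m^2 of extra forcing per year
# for 100 years (order of magnitude only, purely illustrative).
projection = emulate([0.04] * 100)
print(f"warming after 100 steps: {projection[-1]:.2f} K")
```

One multiplication per year is all there is, which is exactly the sense in which a handheld calculator can duplicate the super-computer projections.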

3. Clouds, the elephant in the room

The influence of clouds on global temperature is enormous: globally, their net effect is assumed to be a cooling of about 25 W/m², with an annual uncertainty in the GCMs of ±4 W/m². This number must be compared to the consensus value of the forcing from human-produced GHGs (greenhouse gases) since 1900, about 2.8 W/m². So the cloud uncertainty simply swallows the alleged anthropogenic warming. Usually one assumes that the errors in the models are random, so that with some luck they could at least partially cancel each other. Dr. Frank shows that nothing could be farther from the truth: the errors of the models are heavily correlated (R > 0.95 for 12 models, R > 0.5 for 46 models), as the sketch below illustrates.
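Here is a minimal numerical sketch of why that correlation matters; this is my own illustration, with the R = 0.95 and ±4 W/m² values taken from the talk and the random-draw construction assumed for demonstration purposes.

```python
import numpy as np

rng = np.random.default_rng(0)
n_models, n_draws = 12, 100_000
sigma = 4.0  # W/m^2, annual cloud forcing uncertainty per model

# Case 1: independent errors -- averaging over 12 models helps a lot.
indep = rng.normal(0.0, sigma, size=(n_draws, n_models))
print("independent errors, std of the model mean:", indep.mean(axis=1).std())

# Case 2: strongly correlated errors (R ~ 0.95, as reported for 12
# models) -- averaging over the models barely reduces the uncertainty.
r = 0.95
shared = rng.normal(0.0, sigma, size=(n_draws, 1))   # common error component
noise = rng.normal(0.0, sigma, size=(n_draws, n_models))
corr = np.sqrt(r) * shared + np.sqrt(1 - r) * noise  # pairwise correlation ~ r
print("correlated errors,  std of the model mean:", corr.mean(axis=1).std())
```

With independent errors the spread of the 12-model mean drops to about 4/√12 ≈ 1.2 W/m²; with R = 0.95 it stays near 3.9 W/m². Correlated errors do not cancel in a multi-model mean.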

How does the cloud uncertainty propagate through the calculation scheme of the GCMs?

The annual ±4 W/m² uncertainty propagates from step to step, and the resulting uncertainty is the square root of the sum of the squares of the individual errors: u_total = ±√(u₁² + u₂² + … + uₙ²).

[Figure: cloud forcing errors in the GCMs]
The result is a steadily increasing uncertainty, which amounts to about ±13°C on the warming predicted for 2100:

[Figure: propagation of the cloud error through the projection]

Clearly this warming range is "outside of any physical possibility"; the temperature projections of even the most advanced models become meaningless if the cloud uncertainty is properly applied.
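The arithmetic behind this growing envelope is easy to reproduce. In the following minimal sketch (my own illustration), the per-step uncertainty of ±1.4°C is an assumed value, chosen only so that roughly 85 annual steps reproduce the ≈13°C figure quoted above.

```python
import math

# Root-sum-square (RSS) propagation of a constant per-step uncertainty:
#   u_n = sqrt(u_1^2 + ... + u_n^2) = u_step * sqrt(n)
# u_step is illustrative: chosen so that ~85 annual steps
# (roughly 2015 -> 2100) give the ~13 degC envelope quoted above.
u_step = 1.4   # degC of uncertainty per annual calculation step (assumed)
n_steps = 85

envelope = [u_step * math.sqrt(n) for n in range(1, n_steps + 1)]
print(f"after  1 step : +/- {envelope[0]:.1f} degC")
print(f"after 25 steps: +/- {envelope[24]:.1f} degC")
print(f"after {n_steps} steps: +/- {envelope[-1]:.1f} degC")
```

The envelope grows as √n: modest over the first few steps, but inexorably swamping the projected warming signal over a century.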

4. Conclusion

What can the climate models deliver as prediction tools? The following slide is crystal clear:

[Slide: what do climate models reveal?]

Do our politicians, who fall over each other to sign the Paris treaty, have even the slightest understanding of this uncertainty problem?


3 Responses to “No Certain Doom”

  1. No Certain Doom – climate models cannot, for systematic reasons, compute the future. – EIKE – Europäisches Institut für Klima & Energie Says:

    […] Link: https://meteolcd.wordpress.com/2016/09/04/no-certain-doom/ […]

  2. Pat Frank Says:

    Dave Burton, your post at WUWT reveals no major error on my part.

  3. daveburton Says:

    (Francis, please just delete my slightly garbled 1st & 2nd attempts at this comment. Sorry about that, and thank you! -Dave)

    My comment link (above) does not work. I think it was probably wrong even when I posted it, or perhaps it broke due to the changes at WUWT, I’m not sure which. But, either way, it is wrong. Here’s a corrected link, which points to the right comment:

    https://wattsupwiththat.wordpress.com/2016/11/22/the-needle-in-the-haystack-pat-franks-devastating-expose-of-climate-model-error/#comment-2349884

    Dr. Frank subsequently got his work published, here:
    https://www.frontiersin.org/articles/10.3389/feart.2019.00223/full

    Dr. Roy Spencer examined it, and concluded, as I had, that its primary conclusion is incorrect. Here’s Dr. Spencer explaining it; his explanation is convincing:

    Part 1: http://www.drroyspencer.com/2019/09/critique-of-propagation-of-error-and-the-reliability-of-global-air-temperature-predictions/ (or https://wattsupwiththat.com/2019/09/11/critique-of-propagation-of-error-and-the-reliability-of-global-air-temperature-predictions/ )

    Part 2: http://www.drroyspencer.com/2019/09/additional-comments-on-the-frank-2019-propagation-of-error-paper/ (or https://wattsupwiththat.com/2019/09/12/additional-comments-on-the-frank-2019-propagation-of-error-paper/ )
