Archive for August, 2020

Global temperatures from historic documents (1/2)

August 20, 2020

1. Introduction

When we speak of global warming, the following picture is practically omnipresent:

It presents the global temperature anomaly (i.e. the difference between the actual yearly temperature and the 1961-1990 average) as given by the three best-known temperature reconstructions: GISS (NASA), HadCRUT4 (UK) and BERKELEY (the Berkeley BEST project, USA). These series more or less agree for the last 50 years, but nevertheless show visible differences for the preceding 50 to 70 years. The data used come from known weather stations, but also from proxies like tree rings, ice cores etc. What is rarely mentioned is that during the late 19th and early 20th century many famous scientists worked on the same problem: find global mean yearly temperatures by latitude (the so-called zonal temperatures) and/or find the global yearly isotherms, which were known not to coincide with the latitude circles. Many of these early researchers, like von Hann and von Bezold, were German and published in German. This may explain the limited interest shown in these papers by “modern” researchers.
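Just to illustrate what “anomaly” means here, a minimal Python sketch with made-up yearly means (not actual station data); the real GISS/HadCRUT4/BEST products of course combine many stations with gridding and area weighting:

```python
# Minimal sketch (hypothetical data): yearly anomalies relative to a
# 1961-1990 baseline. This only shows the principle, not the real procedure.

def anomalies(yearly_means, base_start=1961, base_end=1990):
    """Return each year's deviation from the base-period average."""
    baseline = [t for y, t in yearly_means.items() if base_start <= y <= base_end]
    base_mean = sum(baseline) / len(baseline)
    return {y: round(t - base_mean, 2) for y, t in yearly_means.items()}

# Made-up yearly mean temperatures (in °C), for illustration only:
demo = {1961: 9.1, 1975: 9.0, 1990: 9.4, 2005: 9.8, 2019: 10.2}
print(anomalies(demo))  # {1961: -0.07, 1975: -0.17, 1990: 0.23, 2005: 0.63, 2019: 1.03}
```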

This situation has some similarities with the reconstructions of global CO2 levels. Here also mostly ice cores or other proxies are used, and the papers of the 19th century scientists who made real CO2 measurements with chemical methods are often belittled. The late Ernst-Georg BECK (a German chemistry and biology teacher) made an outstanding effort to find and evaluate these old measurements, and found that the values were much more variable than told by “consensus” climatology. Together with Beck I wrote a paper, published in 2009 by Springer, on how to try to validate these old measurements, of which there were not many and whose focus was typically local (link).

2. The KRAMM et al. paper

Gerhard Kramm from Engineering Meteorological Consulting in Fairbanks and his co-authors (Martina Berger and Ralph Dlugi from the German Arbeitsgruppe Atmosphärische Prozesse, Munich, and Nicole Mölders, University of Alaska Fairbanks) have published in Natural Science, 2020 (link) a very important paper on how researchers of the old times calculated zonal, hemispheric and global annual temperatures. The very long title is “Meridional Distributions of Historical Zonal Averages and Their Use to Quantify the Global and Spheroidal Mean Near-Surface Temperature of the Terrestrial Atmosphere”, and this 45-page paper is a blockbuster. It contains its fair share of mathematics, and I had to read it several times to understand the finer points. I first stumbled on the paper through a discussion at the NoTricksZone blog (link), and you might do well to first read the comment of Kenneth Richard.

All four authors seem to be German speakers, which explains why many citations are given in their original language. They tell us that very famous scientists of the second half of the 19th and the start of the 20th century worked to find global average temperatures. One must remember that in 1887, for instance, 459 land-based meteorological stations (outside the USA and the polar regions) and about 600 vessels gathered meteorological data; the first Meteorological Congress, held in 1873 in Vienna, had standardized the equipment (for instance dry- and wet-bulb thermometers). The best-known authors of the big climate treatises written in the 1852-1913 time span are von Hann (Julius Ferdinand von Hann, 1839 – 1921) and von Bezold (Wilhelm von Bezold, 1837 – 1907), who referred to numerous other authors.

The Kramm paper tries to validate the results given by these early researchers, using papers from other authors and mathematical calculations.

Just to show how good the results of these authors were, look at the following extract of a graph from von Hann (1887) showing the zonal isotherms over the whole globe. I have added the text boxes:

The yellow dot shows the approximate location of Diekirch, slightly south of the 10°C isotherm. The yellow box shows that the mean temperature measured by meteoLCD was 10.6°C over the 21-year period 1998 – 2019, very close to von Hann’s isotherm of 1887.

The authors write that “obviously the results of well-known climate researchers … are notably higher than those derived from Hadcrut4, Berkeley and Nasa GISS”. So the question is: have these institutions (willingly or not) lowered the temperatures of the past and so amplified the apparent global warming?

(to be continued)

Colle Gnifetti ice core… a new European temperature reconstruction

August 5, 2020

[Figure: CG_drilling]

(picture from the PhD thesis of Licciulli, 2018)

When we want to know the temperatures of, say, the last 1000 years, we must use proxies like changes in the O18 isotope ratio, changes in leaf stomata, tree rings (for instance from the famous bristlecone pines) etc. The best-known proxies (besides tree rings) are ice cores, most coming from drillings in Antarctica or Greenland. Ice cores from European glaciers are few, so the paper by Bohleber et al. on ice cores from the Monte Rosa region is remarkable. The title is “Temperature and mineral dust variability recorded in two low-accumulation Alpine ice cores over the last millennium” (link), and it was published in “Climate of the Past”, a journal of the European Geosciences Union (EGU), in January 2018. I became aware of this paper through an excellent comment by Willis Eschenbach at WUWT (24-Jul-2020); I will come back to this later.

What makes the paper by Bohleber so special is that the location of the 2 ice cores is the Colle Gnifetti saddle (4450 m asl) in the Monte Rosa region (on the border between Italy and Switzerland), so really in our neighborhood when compared to Antarctica and Greenland. This glacier is not very thick (only about 140 m), as the prevailing winds remove a good part of the yearly snowfall. But the ca. 65 m deep drillings allow going back more than 1000 years. The researchers studied the dust layers found in the ice cores, especially the abundance of Ca2+ ions. These dust layers are very thin, so quite sophisticated laser technologies were used to investigate them. They found a good agreement between the observed temperature trends and those of the Ca2+ dust layers (mostly dust from the Sahara: warmer temperatures increase the advection of dust-rich air masses).
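Such an “agreement” between a dust proxy and temperature is the kind of relation one would typically quantify with a simple correlation. Here is a minimal Python sketch with invented numbers (not data from Bohleber et al.), just to illustrate the idea:

```python
# Minimal sketch: quantifying the agreement between a dust proxy (e.g. Ca2+
# concentration) and a temperature series with a Pearson correlation.
# All numbers are invented for illustration, not data from Bohleber et al.
from statistics import mean

def pearson(x, y):
    mx, my = mean(x), mean(y)
    cov  = sum((a - mx) * (b - my) for a, b in zip(x, y))
    varx = sum((a - mx) ** 2 for a in x)
    vary = sum((b - my) ** 2 for b in y)
    return cov / (varx * vary) ** 0.5

ca_proxy = [0.8, 1.1, 0.9, 1.4, 1.6, 1.3]    # arbitrary proxy values
temps    = [-0.3, 0.0, -0.2, 0.4, 0.6, 0.3]  # arbitrary temperature anomalies (°C)
print(round(pearson(ca_proxy, temps), 3))    # a value near 1 means strong agreement
```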

The IPCC’s view of the temperatures of the last 1000 years

In its first assessment report (FAR) of 1990, the IPCC gave a graph from Hubert Lamb showing (without any clear temperature scale) the existence of a warmer period (MWP) around the year 1000 and the later distinctive cooling of the Little Ice Age (LIA):

[Figure: Medieval_Warm_FAR]

With the infamous hockey-stick paper by Mann (1999), featured in the 3rd assessment report (TAR, 2001), the MWP disappeared, or was ignored (link to original paper):

[Figure: hockeystick_1999]

For political or activist reasons, this faulty graph from a junior PhD became a poster child in the global warming debate, and remained so for many years, despite the fact that it was shown to be wrong because of an incorrect application of statistical methods (PCA, principal component analysis) and an inadequate choice of tree rings.

Today there are many reconstructions of NH temperatures, and the figure below (blue arrow and highlights added by me) shows how different they are, and that at least one (Christiansen and Ljungqvist, 2012) gives strongly varying temperatures, with a very pronounced MWP nearly as warm as today (link):

[Figure: many_reconstructions_NH]

Now here follows the reconstruction by Bohleber et al., based, as seen above, on the study of dust layers, a factor that was not considered in the hockey-stick paper.

[Figure: CG_temp_reconstruction]

I have added the text boxes and the arrows to the original graph. First one should note that the temperatures are anomalies (= deviations) from the average temperature at CG during 1860 – 2000. The horizontal time axis is reversed, i.e. the most recent period is on the left, and the “calibration” period is the interval 1860 to 2000. The red curve shows an independent reconstruction by Luterbacher of mean European summer temperature anomalies. The black curve gives (if I understand this correctly) these same anomalies as measured by meteorological instruments over Europe (Western Europe?).

Willis Eschenbach made a linear regression against the BEST NH temperature reconstruction and adjusted the Ca2+ curve using the resulting function (y = 1.6*x – 0.2). The visual correlation for the last 250 years is excellent (except for a divergence over the last ~25 years):

[Figure: Eschenbach_BEST_NH]
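As an illustration of this kind of proxy calibration, a minimal Python sketch: fit a least-squares line of instrumental anomalies against the proxy over their overlap period, then apply the fitted line (same form as Eschenbach’s y = 1.6*x – 0.2) to the whole proxy record. All values below are placeholders, not the real BEST or Colle Gnifetti data:

```python
# Minimal sketch of the calibration step: least-squares fit of instrumental
# anomalies against the proxy over their overlap, then apply the fitted line
# to the full proxy record. Placeholder numbers only.
import numpy as np

proxy_overlap = np.array([0.2, 0.4, 0.5, 0.7, 0.9])    # proxy values, overlap years
temp_overlap  = np.array([0.1, 0.45, 0.6, 0.9, 1.25])  # instrumental anomalies (°C)

a, b = np.polyfit(proxy_overlap, temp_overlap, 1)       # slope and intercept
print(round(a, 2), round(b, 2))                         # here roughly 1.62 and -0.21

full_proxy = np.array([0.1, 0.3, 0.2, 0.6, 0.8, 1.0])   # whole proxy record
calibrated = a * full_proxy + b                          # proxy expressed as °C anomalies
```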

Applying the same regression to the whole CG record, and smoothing with a 15-year filter, makes the important details still more visible:

[Figure: Eschenbach_CG_linearadj]
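For readers who want to reproduce such a smoothing, a minimal sketch of a centered 15-year moving average (a simple boxcar filter; Eschenbach may well have used a different filter, and the series below is random placeholder data):

```python
# Minimal sketch of a 15-year smoothing: a centered moving average (boxcar
# filter) applied to a yearly series. Placeholder data only.
import numpy as np

def smooth(series, window=15):
    kernel = np.ones(window) / window             # equal weights over 15 years
    return np.convolve(series, kernel, mode="valid")

yearly   = np.random.default_rng(0).normal(0.0, 0.3, 120)  # fake 120-year record
smoothed = smooth(yearly)                          # 106 values, each a centered 15-year mean
print(len(yearly), len(smoothed))                  # 120 106
```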

We clearly see two warm periods: one around 850 AD and the other corresponding to the MWP, today called MCA (Medieval Climate Anomaly), because it seems inconvenient to “consensus climatology” that some low-CO2 medieval times were nearly as warm as today. So Bohleber et al. write in their conclusion: “the reconstruction reproduces the overall features of the LIA … and reveal an exceptional medieval period around AD 1100-1200”.

What can also clearly be seen in all these graphs is that the climate was never stable for very long: the normal situation is a changing climate!