Welcome to the meteoLCD blog

September 28, 2008


This blog was started on 28 September 2008 as a quick communication tool for exchanging information, comments, thoughts, admonitions, compliments etc. related to the questions of climate change, global warming and energy.

Sun variability and NH temperature

September 3, 2021

There is a new paper published in April 2021 in Research in Astronomy and Astrophysics titled “How much has the Sun influenced Northern Hemisphere temperature trends? An ongoing debate” (link). The authors are R. Connolly, W. Soon and M. Connolly together with 19 coauthors. All these people are from well-known universities or research facilities, and as such have impressive scientific backgrounds. The paper is quite long, more than 70 pages including a huge 536-item reference list. I recommend a careful reading of this paper, which is the best overview of scientific knowledge regarding the sun-climate question I know of. What makes this paper unique is that it presents many facets of the problems of TSI variability and of the NH surface temperature series. It honestly states that not all coauthors share the same conclusions, so it clearly is not a cherry-picking paper pushing an activist agenda. As this is such a large and diverse paper, I will just touch on a few aspects and try to give a short summary.

Its main conclusion is that the IPCC’s stand on the influence of the sun on global warming is at least open to discussion, and ignores a huge amount of scientific findings that conflict with its view of climate change as essentially human-caused.

The problems with knowing TSI

Everybody knows that the sun is the engine that drives Earth’s climate, and that the energy output of this big thermonuclear reactor is not constant. Best known are the 11-year Schwabe cycle of total solar irradiance and the 22-year Hale cycle of its magnetic activity. The TSI (irradiance in W/m2 on a surface perpendicular to the solar rays, measured at TOA, the top of the atmosphere) has really been measured directly and continuously only since the satellite era, starting in 1978 with the NIMBUS 7 satellite and its ERB (Earth Radiation Budget) mission. Previous data are more patchy, coming from soundings with balloons and rockets, or from indirect proxies like sunspots, changes of the solar magnetism measured at ground level, or even planetary (astronomical) causes.

43 years of satellite measurements covering nearly 4 Schwabe cycles should be enough to yield a definitive answer on TSI variability, but this is alas not the case. The satellite instruments degrade with time, and successive satellites have different biases and measurement problems:

The different TSI measurements series from 1978 on (link)

The figure shows that the series differ by about 10 W/m2, so simply stitching these series together is impossible (to put this number in perspective: the increased radiative forcing caused by the higher atmospheric CO2 concentration from 1750 to 2011 is about 1.82 W/m2, according to the IPCC AR5). The two best-known efforts to get a continuous “homogenized” series are those of the ACRIM (USA) and PMOD (Davos, CH) teams. They come to different conclusions: according to ACRIM there is a general increase in TSI, whereas PMOD finds that TSI remains more or less constant. Needless to say, the IPCC adopts the PMOD view, which conforms better to its position of anthropogenically caused climate warming, and ignores ACRIM.

If one includes the proxy series, as this paper does, there are 16 different TSI reconstructions that may be valid. So the least that can be said is that an honest scientific broker should examine, and not ignore, them all.

High- and low-variability TSI series

The 11-year cycle is not the only one influencing TSI; there are also many multidecadal/multicentennial/multimillennial cycles which can be found by spectral analysis or attributed to astronomical causes, like the Milankovitch cycles. If these longer-cycle variations are considered small w.r.t. the Schwabe cycle, the reconstruction is considered “low variability”, in opposition to “high variability”. The authors compare both types of reconstructions with the changes in the NH surface temperature, and they find that the high-variability series correspond better with the NH temperature changes since 1850.

What NH temperature series to use?

Clearly the vast majority of weather stations are located in the NH. A big problem is that ongoing urbanization introduces an urban heat island warming bias, which is still visible in many of the homogenized series like those of NASA-GISS. So the authors propose to use only the stations that were, and still are, rural since about 1850. The difference can be startling, as shown in the next figure, which takes a NH subset of 4 regions (Arctic, Ireland, USA, China):

The warming trend is 0.41 °C/century for the rural-only stations, whereas it is more than double, at 0.94 °C/century, for the combined rural+urban stations. Notice also the much greater variability (i.e. lower R2) of the rural-only series!
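For readers who want to reproduce such trend numbers: a °C/century figure is just the slope of an ordinary least-squares fit of the yearly anomalies against time, scaled by 100. A minimal sketch (with an invented anomaly series, not the paper’s data):

```python
import numpy as np

def trend_per_century(years, anomalies):
    """Slope of an ordinary least-squares fit, scaled to deg C per century."""
    slope, _intercept = np.polyfit(years, anomalies, 1)  # slope in deg C per year
    return slope * 100.0

# Invented rural-like anomaly series warming at 0.41 deg C/century:
years = np.arange(1850, 2019)
anoms = 0.0041 * (years - 1850)
print(round(trend_per_century(years, anoms), 2))  # prints 0.41
```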

This makes it clear that the choice of including all stations (with the risk of including an urban warming bias) or only the rural ones (with the handicap of having much fewer stations) will command the outcome of every sun-temperature research.

An example of the solar influence

The next figure is a subset of figure 16 of the paper; it shows how the trend of a linear temperature fit (blue box = fit of temperature w.r.t. time) can be compared to that of the solar influence (yellow box = fit of temperature w.r.t. TSI) and the anthropogenic greenhouse gas forcings (grey box = fit of temperature w.r.t. GHG forcings); the latter fit is made on the residuals left over from the (temperature, TSI) linear fit.

Using rural-only stations and a high-variability TSI reconstruction shows that the solar changes could explain 98% of the secular trend of the NH surface temperatures; using both urban and rural stations, the solar influence is still 57%, i.e. more than half of the warming can be explained by a solar cause.
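The two-step procedure described above (fit temperature against TSI, then fit the residuals against the GHG forcings, and compare the secular trends of the components) can be sketched as follows. All series here are invented stand-ins for illustration, not the paper’s data:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1850, 2019)
t = years - 1850

# Invented stand-ins for the real series (illustration only):
tsi = 1361 + 0.5 * np.sin(2 * np.pi * t / 11) + 0.002 * t   # W/m2
ghg = 2.0e-5 * t**2                                         # GHG forcing proxy
temp = 0.8 * (tsi - 1361) + 0.3 * ghg + rng.normal(0, 0.05, t.size)

# Step 1: fit temperature against TSI; the fitted part is the "solar" component
a, b = np.polyfit(tsi, temp, 1)
solar_fit = a * tsi + b
residuals = temp - solar_fit

# Step 2: fit the leftover residuals against the GHG forcing
c, d = np.polyfit(ghg, residuals, 1)

# Compare the secular (linear-in-time) trends of the components
trend_total = np.polyfit(years, temp, 1)[0]
trend_solar = np.polyfit(years, solar_fit, 1)[0]
print(f"solar share of the secular trend: {trend_solar / trend_total:.0%}")
```

With these made-up inputs the solar share comes out between roughly half and two-thirds; the point is the mechanics of the attribution, not the number.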


In this short comment I could only touch on some points of the paper. It has many more very interesting chapters, for instance on the temperature reconstructions from sea surface temperatures, glacier lengths, tree rings etc.

What remains is an overall picture of complexity, which is ignored by the IPCC, in AR5 as well as in the new AR6. The science on the influence of solar variability, be it in the visible or UV spectrum, is far from settled. The IPCC ignores datasets that conflict with its predefined political view. The recent warming is only unusual if calculated from the rural + urban data series, but largely unexceptional if the temperature data are restricted to the rural stations.

Radioactivity and precipitation

July 27, 2021

In the past I have written many times on the observational fact that, due to radon washout, the ambient gamma radiation sometimes shows impressive peaks (see here, here, here, here, here).

In this post I will show that the graphs of cumulative rainfall and gamma radiation might give a wrong picture, and that using the original time series yields a more correct insight.

Here is what the graphs of cumulative rainfall and ambient gamma radiation show for the week covering the 24th and 25th July 2021:

These graphs are not faulty, but they give a wrong picture: the two rainfall peaks cause two radiation peaks, with the second higher than the first, even though its “cause” (= the precipitation in mm per half-hour) is much smaller. This could be a sign of radon washout during the first peak, with radioactivity levels that had not yet recovered to their usual background. The third precipitation peak, during the 25th July, causes only a mild surge of the gamma radiation intensity.

Now let’s zoom on the half-hourly levels of precipitation and gamma radiation:

The picture becomes somewhat clearer: there are 2 precipitation peaks during the 24th July 2021, and the intensity of the second is close to double that of the first (the X scale represents multiples of half-hours, starting at 00:30). The second radiation peak is practically the same as the first: the gamma levels have not sufficiently recovered from the first washout during the approx. 7 hours in between to yield a proportionally higher peak.

The third event, during the 25th July, is more “smeared out”: the total rain volume falls over ca. 3.5 hours (7 half-hours) and is not concentrated in a single half-hour event. This does not cause a strong radiation increase, even though 20 hours had passed since the last rainfall peak, a time span probably long enough to compensate for the previous washout. I suggested in one of the previous posts a recovery period of approx. 1 day.

I always wonder why our “greens” have not yet discovered this natural phenomenon of radiation increase and jumped on this pattern, which should give a good scare. The second peak here is about 97 − 85 = 12 nSv/h above the usual background, i.e. 14% higher. What would Greenpeace say if the radioactivity from the Cattenom nuclear facility had increased by this amount?

The declining value of wind and solar electricity

July 23, 2021



Many scientists have predicted that the value of wind and PV electricity will decline above a certain level of penetration, where penetration means the percentage of installed wind and solar capacity w.r.t. the total installed electricity production capacity (which includes for instance fossil fuel and nuclear systems).

A new outstanding paper by Dev Millstein et al. published in JOULE (a CellPress Open Access publication) puts these predictions on solid data foundations. The title is “Solar and wind grid system value in the United States: The effect of transmission congestion, generation profiles and curtailment” (link). The authors analyzed data from 2100 US intermittent renewable electricity producers, and separated the influence of the production profile (e.g. the sun does not shine at night), transmission congestion (e.g. difficulties in transporting excess solar and wind electricity during favorable periods) and curtailment (i.e. the cost of shutting down solar and wind producers to avoid grid infrastructure problems).

This is a long 28-page paper, well worth reading several times to become familiar with the different technical concepts.

In this blog, I try to condense the essentials in a few lines.

  1. By how much do the prices for wind and solar electricity fall?

The short answer is that above 20% of wind/solar penetration, the produced electricity value falls by 30 to 40%.

This is an enormous amount, which may put a barrier to higher wind & solar penetration. This barrier is basically rooted in economic realities, not physics or engineering problems!

2. What is the parameter which has the most influence on value creep?

The authors find that the production profile, i.e. the timing of production over a day or a longer period, is the principal cause of the fall in electricity value:

Look at the highlighted CAISO numbers, which correspond to the situation in California. The solar penetration is large (19%), as is the value fall of 37% (this is the percentage of value decline w.r.t. the electricity prices that would have prevailed were there no intermittent wind & solar producers).

For most sites, the decline of electricity value follows a logistic-like curve (= exponential decline at the beginning which stabilizes at a horizontal asymptote). This is not the case for CAISO, where the decline is practically linear (see the yellow double line, highlighted by me):

The decline from 0 to 20% penetration is nearly 50%, from about 1.3 to 0.6, which is close to breathtaking!
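The two decline shapes mentioned above can be sketched as simple functions. The parameters below are invented to mimic the figure (value factor 1.3 at zero penetration, about 0.6 at 20% in the linear CAISO-like case); they are not fitted to the paper’s data:

```python
import math

def value_factor_logistic(p, v0=1.3, v_inf=0.6, k=15.0):
    """Steep early decline flattening toward a horizontal asymptote.
    p = penetration as a fraction (0.2 means 20%); parameters invented."""
    return v_inf + (v0 - v_inf) * math.exp(-k * p)

def value_factor_linear(p, v0=1.3, slope=3.5):
    """CAISO-like linear decline (invented slope chosen so 20% -> ~0.6)."""
    return v0 - slope * p

for p in (0.0, 0.1, 0.2):
    print(f"{p:.0%}: logistic {value_factor_logistic(p):.2f}, "
          f"linear {value_factor_linear(p):.2f}")
```

The practical difference: the logistic shape bottoms out, while the linear CAISO decline keeps eroding value with every extra percentage point of penetration.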

3. What do we have to expect?

Up to now, most of this decline was cancelled or obscured by the falling prices of wind and solar installations. But many factors suggest that the easy part of lowering the manufacturing costs of PV panels and wind turbines is behind us. There surely will be some further fall in prices, but not at the level previously seen. The scarcity of raw materials and rare earths, the low number of producing countries and regions, and the increased worldwide demand all point to an end of the spectacular price falls seen during the last years.

So in the absence of a breakthrough in storage technology which could change the production profile (remember: this is the main factor in the fall of electricity value!), some countries will rapidly hit the wall. Sure, politics and overt or hidden subsidies for wind & solar may obscure this value creep, but these will inflate electricity prices above levels that even the most green-inclined citizens are willing to pay. Knowing that their sacrifices will have no measurable influence on global CO2 levels will certainly become a barrier to accepting the ever greater lifestyle sacrifices demanded in the name of the wind & solar electricity that is supposed to save the planet.

4. Some conclusions of the authors

  • Some models indicate …. that value decline might soon get worse; our empirical values provide little solace on that front
  • Forward-looking models, which have been roughly correct to date, suggest that we will soon enter a regime of accelerating value decline

All this should somehow dampen the naïve and “politically correct” enthusiasm for an exclusively wind and solar driven world!

Radiation Amplification Factor RAF in April 2021

April 29, 2021

We had a period of several cloudless, blue-sky days at the end of April 2021. So it was time to redo a calculation of the Radiation Amplification Factor RAF. In short, we want to see how the variation of the Total Ozone Column (TOC) influences the effective UVB radiation at ground level. I have written several times on this, and usually we found an RAF of approx. 1.05 to 1.10.

First, here is a graph showing the variation of total solar irradiance (blue curve, unit W/m2) and the effective UVB (red curve, unit mMED/h):

First note that the peak solar irradiance was practically constant; the 24th April was a bit hazy, so it will be left out of the computations. The numbers in the turquoise boxes are the maximum values of the TOC, measured in DU (Dobson Units) with our Microtops II instrument (serial 5375). Let us first plot the UVBeff versus the TOC:

fig. 1 UVBeff versus maximum daily TOC (5 days: 23 and 25 to 28 April 2021)

Clearly the UVBeff values decrease with increasing TOC, as the thicker ozone column filters out more UVB radiation. The empirical relationship is practically linear, and suggests that a dip of 100 DU (a quite substantial thinning of the ozone layer) would cause an increase of effective UVB of about 0.6 MED/h or 1.7 UVI (as 1 MED/h = 25/9 UVI).
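A quick unit check of that claim, using the stated conversion 1 MED/h = 25/9 UVI:

```python
# The fitted line gives ~0.6 MED/h of extra effective UVB per 100 DU dip;
# convert to UV index units with 1 MED/h = 25/9 UVI.
dip_med_per_h = 0.6
print(round(dip_med_per_h * 25 / 9, 1))  # prints 1.7
```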

The numerically correct definition of the RAF is UVB = C * TOC**(-RAF), where ** means “to the power of”. Taking the natural logarithm gives ln(UVB) = ln(C) - RAF*ln(TOC), or RAF = [ln(C) - ln(UVB)]/ln(TOC).

If we have many measurement couples of UVB and TOC, it can be shown (see here) that

RAF = [-ln(UVBi/UVB0)]/[ln(TOCi/TOC0)]

where the index i corresponds to the i-th measurement couple, and 0 to the one taken as a reference (usually i=0). This is equivalent to saying that RAF is the slope of the linear regression line through the scatterplot of -ln(UVBi/UVB0) versus ln(TOCi/TOC0).
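A short numerical check of this recipe, with synthetic noise-free values generated from a known RAF of 1.05 (real inputs would be the measured daily maxima of UVBeff and TOC):

```python
import numpy as np

# Synthetic data generated with a known RAF; C and the TOC values are invented.
true_raf, C = 1.05, 2.0e3
toc = np.array([330.0, 345.0, 352.0, 360.0, 375.0])  # Dobson Units
uvb = C * toc ** (-true_raf)                          # mMED/h

x = np.log(toc / toc[0])
y = -np.log(uvb / uvb[0])
slope, intercept = np.polyfit(x, y, 1)                # slope = RAF
print(round(slope, 4))  # prints 1.05
```

With real measurements the points scatter around the line, and the slope of the fit (together with its R2) is the reported RAF.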

Here is that plot:

RAF computed from TOC

The slope is 1.0461, so the (erythemal) RAF computed from the 5 blue-sky days is RAF = 1.0461 ≈ 1.05.

This has to be compared to the value RAF = 1.08 in the referenced paper [ref. 1]. Note the excellent R2 = 0.96 of this linear fit.

There is some discussion whether TOC should be replaced by TOCslant = TOC/cos(SZA), where SZA is the solar zenith angle. If we do this, the RAF ≈ 1.10, close to the previous value; the R2 is somewhat lower at R2 = 0.91. The SZA is practically constant for the 5 days, with SZA ≈ 38°.

RAF computed from TOC slant = TOC/cos(SZA)

The RAF = 1.10 value is close to what Jay Herman published in GRL in figure 8 [ref. 2] (red lines added):

RAF from Erythemal UVB as a function of SZA


These 5 days of cloudless sky give practically the same results for the RAF as those found during previous investigations. As a very rough rule of thumb one could keep in mind that a dip of 100 DU yields an increase of at most 2 UVI. The following table summarizes the findings of this post and of references 1 to 5:

Table of erythemal RAF’s



[1] MASSEN, Francis, 2013: Computing the Radiation Amplification Factor RAF using a sudden
dip in Total Ozone Column measured at Diekirch, Luxembourg (link)

[2] HERMAN, Jay, 2010: Use of an improved radiation amplification factor to estimate
the effect of total ozone changes on action spectrum weighted irradiances and an instrument response function.
Journal of Geophysical Research, vol.115, 2010 (link)

[3] MASSEN, Francis, 2014 : RAF revisited (link)

[4] MASSEN, Francis, 2016: First Radiation Amplification Factor for 2016 (link)

[5] MASSEN, Francis, 2018: UVI and Total Ozone (link)

Greens for Nuclear Energy

April 8, 2021

We are so used to the absolute rejection of everything related to nuclear energy by the Greens we are familiar with that this new UK movement comes as a bit of a surprise.

Sure, it is their assessment that climate change is an existential threat that underlies their new appreciation of what nuclear, as a carbon-free energy, can do. I can live with that, even if in my opinion there is no climate emergency (read the Clintel declaration).

The Greens for Nuclear Energy home page has a short video that pushes the need for nuclear energy quite far: not only developing new technologies, but also keeping existing facilities running; this is something that would give the German Greens a heart attack!

With Michael Shellenberger, Bill Gates and other well-known Greens or former Greens (like Patrick Moore) saying clearly that nuclear energy is a must in a realistic energy mix, will the wind turn? And how will our EU Greens adapt? Will they change their opinion, or stick with their image of a movement that can only present a future “to save the planet” made of restrictions in every aspect of life, be it housing, mobility, eating or traveling…

You might read this very sober article by Gail H. Marcus in Physics World (April 2017), “How green is nuclear energy?“, which concludes that “nuclear energy is indeed green, and it offers several other advantages as well. It should, therefore, be considered in this light in decision-making on future energy-supply options”.


added 10-Apr-2021:

Read this comment on the upcoming (and partially leaked) JRC report for the EU commission, which also says that nuclear energy is sustainable.

Link to the full paper “An Assessment of the Sustainability of Nuclear Power for the EU Taxonomy Consultation 2019”

Global temperatures from historic documents (1/2)

August 20, 2020

1. Introduction

When we speak of global warming, the following picture is practically omnipresent:

It presents the global temperature anomaly (i.e. the difference of the actual yearly temperature from the 1961–1990 average) as given by the 3 best-known temperature reconstructions: GISS (= NASA), HADCRUT4 (England) and BERKELEY (Berkeley BEST project, USA). These series more or less agree for the last 50 years, but nevertheless show visible differences for the preceding 50 to 70 years. The data used are those from known weather stations, but also from proxies like tree rings, ice cores etc. What is rarely mentioned is that during the late 19th and early 20th century many famous scientists worked on the same problem: find the global mean yearly temperatures according to latitude (the so-called zonal temperatures) and/or find the global yearly isotherms, which were known not to coincide with the latitude circles. Many of these early researchers, like von Hann and von Bezold, were from Germany and published in German. This may explain the poor interest shown in these papers by “modern” researchers.

This situation has some similarities with the reconstructions of global CO2 levels. Here also mostly ice cores or other proxies are used, and the papers of the 19th-century scientists who made real CO2 measurements with chemical methods are often belittled. The late Ernst-Georg BECK (a German chemistry and biology teacher) made an outstanding effort to find and evaluate these old measurements, and found that the values were much more variable than claimed by “consensus” climatology. I wrote with Beck a paper, published in 2009 by Springer, on how to try to validate these old measurements, of which there were not many and whose focus was typically local (link).

2. The KRAMM et al. paper

Gerhard Kramm from Engineering Meteorological Consulting in Fairbanks and his co-authors (Martina Berger and Ralph Dlugi from the German Arbeitsgruppe Atmosphärische Prozesse, Munich, and Nicole Mölders, University of Alaska Fairbanks) have published in Natural Science, 2020 (link) a very important paper on how researchers of the old times calculated zonal, hemispheric and global annual temperatures. The very long title is “Meridional Distributions of Historical Zonal Averages and Their Use to Quantify the Global and Spheroidal Mean Near-Surface Temperature of the Terrestrial Atmosphere“, and this 45-page paper is a blockbuster. It contains its fair share of mathematics, and I had to read it several times to understand the finer points. I first stumbled on the paper in a discussion at the NoTricksZone blog (link), and you might well first read the comment by Kenneth Richard.

The four authors all seem to be German speakers, which explains why many citations are given in their original language. They tell us that very famous scientists of the second half of the 19th and the start of the 20th century worked to find global average temperatures. One must remember that in 1887, for instance, 459 land-based meteorological stations (outside the USA and the polar regions) and about 600 vessels gathered meteorological data; the first Meteorological Congress, held in 1873 in Vienna, had standardized the equipment (for instance dry and moist thermometers). The best-known authors of the big climate treatises written in the 1852–1913 time span are von Hann (Julius Ferdinand von Hann, 1839–1921) and von Bezold (Wilhelm von Bezold, 1837–1907), who referred to numerous other authors.

The Kramm paper tries to validate the results given by these authors, using papers from other authors and mathematical calculations.

Just to show how good the results of these authors were, look at the following extract of  a graph from von Hann (1887) showing the zonal isotherms over the whole globe. I have added the text boxes:

The yellow dot shows the approximate location of Diekirch, slightly south of the 10 °C isotherm. The yellow box shows that the mean temperature measured by meteoLCD was 10.6 °C over the 21-year period 1998 – 2019, very close to the von Hann isotherm of 1887.

The authors write that “obviously the results of well-known climate researchers ….are notably higher than those derived from Hadcrut4, Berkeley and Nasa GISS“. So the question is: have these institutions (willingly or not) lowered the temperatures of the past and thereby amplified the global warming?

(to be continued)

Colle Gnifetti ice core… a new European temperature reconstruction

August 5, 2020


(picture from the PhD thesis of Licciulli, 2018)

When we want to know the temperatures of, say, the last 1000 years, we must use proxies like changes in the O18 isotope, changes in leaf stomata or tree rings (for instance in the famous bristlecone pines) etc. The best-known proxies (besides tree rings) are ice cores, most coming from drillings in Antarctic or Greenland glaciers. Ice cores from European glaciers are few, so the paper by Bohleber et al. on ice cores from the Monte Rosa region is remarkable. The title is “Temperature and mineral dust variability recorded in two low-accumulation Alpine ice cores over the last millennium” (link), and it was published in the “Climate of the Past” series of the European Geosciences Union (EGU) in January 2018. I became aware of this paper through an excellent comment by Willis Eschenbach at WUWT (24-Jul-2020); I will come back to this later.

What makes the paper by Bohleber so special is that the location of the 2 ice cores is the Colle Gnifetti saddle (4450 m asl) in the Monte Rosa region (on the border between Italy and Switzerland), so really in our neighborhood when compared to Antarctica and Greenland. This glacier is not very thick (only about 140 m), as the prevailing winds remove a good part of the yearly snowfall. But the ca. 65 m deep drillings allow going back more than 1000 years. The researchers studied the dust layers found in the ice cores, especially the abundance of Ca2+ ions. These dust layers are very thin, so quite sophisticated laser technologies were used to investigate them. They found a good agreement between the observed temperature trends and those of the Ca2+ dust layers (mostly dust from the Sahara: warmer temperatures increase the advection of dust-rich air masses).

The IPCC’s view of the last 1000 years of temperatures

In its first assessment report (FAR) of 1990, the IPCC gave a graph from Hubert Lamb showing (without any clear temperature scale) the existence of a warmer period (MWP) around the year 1000 and the later distinctive cooling of the Little Ice Age (LIA):


With the infamous hockey-stick paper by Mann (1999) in the 3rd report (TAR), the MWP disappeared, or was ignored (link to original paper):


For political or activist reasons, this faulty graph by a junior researcher became a poster child of the global warming debate, and remained so for long years, despite the fact that it was shown to be wrong due to an incorrect application of statistical methods (PCA, principal component analysis) and an inadequate choice of tree rings.

Today there are many reconstructions of the NH temperatures, and the figure below (blue arrow and highlights added by me) shows how different they are, and that at least one (Christiansen and Ljungqvist, 2012) gives strongly varying temperatures, with a very pronounced MWP nearly as warm as today (link):


Now here follows the reconstruction by Bohleber et al., based as seen above on the study of dust layers, a factor that was not considered in the hockey-stick paper.


I have added the text boxes and the arrows to the original graph. First one should note that the temperatures are anomalies (= deviations) from the average temperature at Colle Gnifetti during 1860 – 2000. The horizontal time axis is reversed, i.e. the most recent period is at the left, and the “calibration” period is the interval 1860 to 2000. The red curve shows an independent reconstruction by Luterbacher of mean European summer temperature anomalies. The black curve gives (if I understand this correctly) these same anomalies as measured by meteorological instruments over Europe (Western Europe?).

Willis Eschenbach made a linear regression against the BEST NH temperature reconstruction and adjusted the Ca2+ curve using this function (y = 1.6*x − 0.2). The visual correlation for the last 250 years is excellent (except for a divergence over the last ~25 years):
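Eschenbach’s adjustment is a standard proxy calibration: regress the instrumental series on the proxy over the overlap period, then apply the fitted line to the whole proxy record. A sketch with invented numbers (the small “noise” is constructed so that the fit recovers exactly 1.6 and −0.2; the real work used the BEST NH series and the Ca2+ record):

```python
import numpy as np

# Invented overlap-period data; noise chosen orthogonal to the proxy
# so np.polyfit recovers exactly y = 1.6*x - 0.2.
proxy = np.array([0.10, 0.25, 0.40, 0.55, 0.70])   # Ca2+ proxy (arbitrary units)
noise = np.array([0.01, -0.02, 0.00, 0.02, -0.01])
temp = 1.6 * proxy - 0.2 + noise                   # instrumental anomalies, deg C

slope, intercept = np.polyfit(proxy, temp, 1)
calibrated = slope * proxy + intercept             # proxy expressed in deg C
print(round(slope, 2), round(intercept, 2))        # prints 1.6 -0.2
```

Once calibrated over the overlap, the same line can be applied to the pre-instrumental part of the proxy record, which is exactly what the next figure does.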


Applying the same regression to the whole CG data, and smoothing with a 15-year filter, makes the important details still more visible:


We clearly see two warm periods: one around 850 AD and the other corresponding to the MWP, today called the MCA (Medieval Climate Anomaly), because it seems inconvenient to “consensus climatology” that the low-CO2 medieval times were nearly as warm as today. So Bohleber et al. write in their conclusion: “the reconstruction reproduces the overall features of the LIA … and reveal an exceptional medieval period around AD 1100-1200”.

What can also clearly be seen in all these graphs is that the climate was never stable for very long: the normal situation is a changing climate!




Lindzen’s new paper: An oversimplified picture

June 23, 2020

MIT Prof. Richard Lindzen (retired) has published (19 May 2020) a very interesting new paper in The European Physical Journal Plus (Springer) titled “An oversimplified picture of the climate behavior based on a single process can lead to distorted conclusions“. The full article is paywalled (a shockingly high 45€ for 10 pages!), but it is easy to find an accessible version by googling.

The article is written in very easy terms, at least concerning the first 3 chapters and the conclusion in chapter 5. I read it carefully several times and will try to summarize it as best I can.

  1. Introduction

In the introduction Lindzen recalls that greenhouse warming is a recent element in the climate literature, and even if known and mentioned, it played a minor role in climate science before 1980. He also repeats a mostly ignored argument, i.e. that even if there is some global warming now (from whatever cause), the claim that this must be catastrophic should be seen with suspicion.

2. Chapter 2

Chapter 2 is titled “The climate system”, and in these less than 1.5 pages Lindzen excels in clarity. He writes nothing that could be controversial, but many of these facts are constantly ignored in the media: the uneven solar heating between the equator and the poles drives the motions of heat in the air and the oceans; in the latter there are changes on timescales ranging from years (e.g. El Niño, PDO and AMO) to millennia, and these changes would be present even if the composition of the atmosphere were unchanging.

The surface of the oceans is never in equilibrium with space, and the complicated air flow over geographic landscapes causes regional variations in climate (not well described by climate models). Not CO2, but water vapor and clouds are the two most important greenhouse substances; doubling the atmospheric CO2 content would increase the energy budget by less than 2%.

He writes that the political/scientific consensus is that changes in global radiative forcing are the sole cause of changes in global temperature, and that these changes are predominantly caused by increasing CO2 emissions. This simplified picture of one global cause (radiative forcing) and one global effect (global temperature) to describe the climate is mistaken.

It is water vapor that essentially blocks outgoing IR radiation, which causes the surface and adjacent air to warm and so triggers convection. Convection and radiative processes result in temperature decreasing with height, up to a level where there is so little water vapor left that radiation escapes unhindered to space. It is at this altitude that the radiative equilibrium between incoming solar energy and outgoing IR energy happens, and the temperature there is 255 K. As the temperature has decreased with height, level zero (i.e. the surface) must be warmer. Adding other greenhouse gases (like CO2) increases the equilibrium height, and as a consequence the temperature of the surface. The radiative budget is constantly changed by other factors, such as varying cloud cover and height, snow, ocean circulations etc. These changes have an effect that is comparable to that of doubling the CO2 content of the atmosphere. And most important: even if the solar forcing (i.e. the engine driving the climate) were constant, the climate would still vary, as the system has autonomous variability!

The problem with “consensus” climatology (IPCC and politics) is that it ignores the many variables at work and simplifies the perturbation of the energy budget of a complex system to the perturbing effect of a single variable (CO2).

3. History

In this short chapter Lindzen enumerates the many scientists who disagreed, up into the eighties, with the consensus view. But between 1988 and 1994, climate funding in the USA for example increased by a factor of 15! And all the “new” climate scientists understood very well that the reason for this extraordinary increase in funding was the global warming alarm, which became a self-fulfilling cause.

Let me repeat here, as an aside, what the German physicist Dr. Gerd Weber wrote in 1992 in his book “Der Treibhauseffekt” (“The Greenhouse Effect”):


4. Chapter 4

This is the longest chapter in Lindzen’s paper, and one that demands a few readings to be understood correctly. Lindzen wants to show that the thermal difference between the equatorial and polar regions has an influence on global temperature, and that this difference is independent of the CO2 content of the atmosphere. He recalls the Milankovitch cycles and the important message that variations in arctic (summer) insolation cause the fluctuations in ice cover. The arctic inversion (i.e. temperature increasing with height) makes the difference between equator and polar temperatures greater at the surface than it is at the polar tropopause (~6 km). So one does not have to introduce a mysterious “polar amplification” (as does the IPCC) to explain this temperature differential.

Lindzen establishes a very simple formula which gives the change in global temperature as the sum of the change in tropical temperature (mostly caused by greenhouse radiative forcing) and the change in the equator-to-pole temperature difference (which is independent of the greenhouse effect). This means that even in the absence of greenhouse gas forcings (which is the aim of current climate policies) there will be changes in global temperature.
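In symbols (my own schematic notation, not Lindzen’s exact formula), this decomposition could be sketched as:

```latex
\Delta T_{\mathrm{global}} \;\approx\; \Delta T_{\mathrm{tropics}}
  \;+\; \gamma\,\Delta\!\left(T_{\mathrm{equator}} - T_{\mathrm{pole}}\right)
```

where the first term responds mainly to radiative (greenhouse) forcing, the second term varies largely on its own (e.g. through changes in meridional heat transport), and γ is a weighting coefficient; the exact form and coefficients are given in the paper itself.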


5. Conclusion

The conclusion is that the basic premise of the conventional climate picture (consensus or not), namely that all changes in global (mean) temperature are due to radiative forcing, is mistaken.


My personal remarks:

Will this paper by one of the most important atmospheric scientists be read by the people deciding on extremely costly and radical climate policies? Will it be mentioned in the media?

I doubt it. The climate train, like the “Snowpiercer” of the Netflix series, is launched full steam ahead, and political decisions become more and more the result of quasi-religious emotions rather than of scientific reasoning. But reality and physics are stubborn… and just as the Snowpiercer is vulnerable to avalanches and rockfall, the current simplistic climate view could well change during the next decades, long before the predicted climate catastrophe of 2100.

COVID-19 Luxembourg, final remarks

June 6, 2020


by Francis Massen (francis.massen@education.lu)
06 June 2020


1. Introduction

The COVID-19 pandemic (or epidemic) started on 29 Feb 2020 in Luxembourg; this is day 0 or day 1, depending on how one counts (I start at day 0).

I wanted to follow the evolution using a model (better: a formula) developed about 200 years ago by Benjamin GOMPERTZ, an English autodidact and later member of the Royal Society. The GOMPERTZ formula belongs to the category of sigmoid functions, which describe a phenomenon starting slowly, going through a quasi-exponential development phase and then slowing down to zero progression. The formula, written in Excel notation, is:

y = a*exp(-b*exp(-c*x)), where exp(x) stands for e^x

It has only 3 parameters, and clearly a represents a horizontal asymptote: when x tends to infinity, exp(-c*x) tends to 0, so y(inf) = a*exp(-b*0) = a

The Gompertz function is much in use in population dynamics, but also in modelling the first phase of a developing epidemic.
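As a quick numerical check of this asymptote, here is a minimal Python sketch (the parameter values 4000, 8 and 0.08 are arbitrary illustrations, not fitted values):

```python
import math

def gompertz(x, a, b, c):
    """Gompertz sigmoid: y = a*exp(-b*exp(-c*x))."""
    return a * math.exp(-b * math.exp(-c * x))

# at x = 0 the curve starts low, at a*exp(-b)
print(gompertz(0, 4000, 8, 0.08))
# for large x, exp(-c*x) -> 0, so y -> a*exp(0) = a (the asymptote)
print(gompertz(1000, 4000, 8, 0.08))
```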

All data and graphs are in the subdirectory Archive.


2. The situation at the end (31-May-20)

Fig.1 shows the Gompertz function fitted to the 92 days of totals of infected in Luxembourg, starting 29-Feb-20 and ending 31-May-20.

Fitting means that the parameters a, b and c are chosen by a regression calculation (Levenberg-Marquardt algorithm) to give a best fit to the observations:

fig.1. Gompertz function fitted to the total infected after 92 days, extended to 100 days.
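For readers who want to reproduce this kind of fit, here is a sketch in Python using scipy’s curve_fit, whose default method for unconstrained problems is the Levenberg-Marquardt algorithm. The data below are synthetic stand-ins, not the real Luxembourg counts:

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(x, a, b, c):
    return a * np.exp(-b * np.exp(-c * x))

# synthetic stand-in for the 92 daily totals (NOT the real Luxembourg data)
days = np.arange(92.0)
rng = np.random.default_rng(42)
obs = gompertz(days, 4000.0, 8.0, 0.08) + rng.normal(0.0, 30.0, days.size)

# curve_fit defaults to Levenberg-Marquardt for unconstrained problems;
# p0 is a rough starting guess for the three parameters
popt, pcov = curve_fit(gompertz, days, obs, p0=[3000.0, 5.0, 0.1])
a_fit, b_fit, c_fit = popt
print(a_fit)  # estimated asymptote a, i.e. the predicted final total
```

The covariance matrix pcov returned alongside the parameters is what allows the significance and uncertainty statements discussed below.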

The next figure shows the deviations between the previous fit and the observations:

fig.2. Deviations from Gompertz fit


Despite the visible differences, all parameters are statistically significant and the goodness of fit is R2 = 0.998 (the maximum possible being R2 = 1).


3. How does the Gompertz function fit at the start?

A first question is “when is the fit statistically significant?”.

Let’s take as an example the situation on 16-Mar. None of the 3 parameters were statistically significant; the uncertainty range for parameter a was a ridiculous [-19621 … +22437].
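How such an uncertainty range is obtained can be sketched as follows: the covariance matrix returned by the fit yields the standard errors, from which Student-t confidence intervals follow. The data below are synthetic and the helper function is my own illustration, not the exact procedure used for the figures:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import t as student_t

def gompertz(x, a, b, c):
    return a * np.exp(-b * np.exp(-c * x))

def param_intervals(xdata, ydata, p0, level=0.95):
    """Confidence intervals for the Gompertz parameters a, b, c."""
    popt, pcov = curve_fit(gompertz, xdata, ydata, p0=p0, maxfev=20000)
    dof = max(len(xdata) - len(popt), 1)          # degrees of freedom
    tval = student_t.ppf(0.5 + level / 2.0, dof)  # two-sided t quantile
    se = np.sqrt(np.diag(pcov))                   # standard errors
    return [(p - tval * s, p + tval * s) for p, s in zip(popt, se)]

# with many points past the inflection the intervals are narrow; with
# only the first few points of the rise they explode, as seen on 16-Mar
days = np.arange(92.0)
obs = gompertz(days, 4000.0, 8.0, 0.08)  # illustrative noiseless series
ivals = param_intervals(days, obs, p0=[3000.0, 5.0, 0.1])
```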

Nevertheless the fit to the few data points was visually excellent:

fig.3 Gompertz fit to the 6 available data points from 29-Feb to 16-Mar


Had we taken this fit as a valid predictor for the future, we would have been in for a big surprise:

fig.4. The previous fit extended to 100 days (red line), compared to the observations (green dots).

The green curve shows what happened; the red is the predictor made on 16-Mar.

Conclusion #1: beware of predictions made too early in the development of the epidemic!


The 24th March (= day 24) was the first day where all parameters were statistically significant (at the 95% confidence level); if we use the Gompertz function calculated from the data available at that moment, the previous fig.4 changes (see fig.5):

fig.5. Gompertz fit made at day 24; the uncertainty range for a at that date, shown by the green box, was [1182 … 3098]


We see that the final number of total infected is about 1400 more than the prediction made on the 24th March; the final total is even about 1000 cases higher than the upper bound of the uncertainty range!


Conclusion #2: beware of predictions made too early in the development of the epidemic! (I repeat myself!)


4. Evolution of the number of deaths

Fig.6 shows the number of deaths on the first day where all parameters of the Gompertz fit are significant; the yellow rectangle corresponds to the borders of the uncertainty range, which is large, but not impossibly large.

Fig.6. Death numbers and Gompertz curve prediction, 05-Apr-20 (= 36th day)

The Gompertz curve does not fit very well, as the death numbers increase by jumps:

Fig.7. Death number and Gompertz fit on all observations until 5th Apr. 20


The uncertainty range does not narrow smoothly, but goes through some violent swings before narrowing continuously from about the 44th day (8 April) on, as shown by the next animation:

Fig. 8. Animation showing the evolution of death number and the uncertainty interval (upper and lower bounds by full and open squares)


Similar to what was concluded above: had one taken the prediction made on the 9th April, the number of predicted deaths would have been about 250, to be compared to the final 110.


Conclusion #3: beware of predictions made too early in the development of the epidemic! (I repeat myself!)


5. Final remarks

1. The 200 year-old Gompertz curve represents the ongoing epidemic well, independently of the political decisions to impose strict measures like shutting down schools and many economic actors, and imposing a relatively strict quarantine: the evolution of both the total infected and the total death numbers follows this simple curve well. The death numbers evolve more by pauses and jumps, so their deviation from the Gompertz curve is more apparent.

2. Even if all 3 parameters of the curve are found to be statistically significant, one cannot rely on the prediction for a correct estimate of the total numbers of infections and deaths. These predictions are only valid when enough data are present, which poses a serious decision problem, as this “enough” number is unknown a priori. In hindsight, one can see that after about 50 days the total number of infected can be reasonably well predicted; this is the moment where the exponential increase is over and the progression begins to slow down. For the number of deaths, the date where a valid prediction can be made is about 10 days later (i.e. 60 days after the start).

3. The sad conclusion is that this modeling, independently of its intellectual and scientific interest, is a poor instrument for making political decisions at the earliest moment, the time when they are most urgently needed. Models are very good at the final development stages of the pandemic; but alas, their usefulness is lowest early in the epidemic, precisely when guidance is most needed. One should not be fooled by an extremely good R2: all models follow the past in an excellent way, but this is in general no guarantee of how good they are at predicting the future.



Francis Massen, meteoLCD

06 June 2020





Frost Saints Days cooling since 1998 at meteoLCD

May 24, 2020


We all know since our earliest childhood that in May there is a period of 5 days that often brings back severe cooling, before temperatures climb again towards summer values. The “Frost Saints” (or “Ice Saints”) are called “Eisheilige” in German, and are known as Mamertus, Pankratius, Servatius, Bonifatius and “die kalte Sophie” (from 11 to 15 May). The climatological cause is cold polar air streaming over a still cold nightly soil, which then often causes frost on this soil. This frost may destroy or damage young seedlings, so it has always been something farmers were afraid of.

Josef Kowatsch has an article in “Die Kalte Sonne” (English translation and comments at NoTrickszone) titled “Warum werden die Eisheiligen seit 25 Jahren immer kälter?” (“Why have the Frost Saints been getting ever colder for 25 years?”). Kowatsch shows that at 5 chosen German weather stations the trend of the Frost Saints period is negative, i.e. these days are cooling. The station closest to meteoLCD is Bad Kreuznach, located 125 km east of Diekirch. Here is what Kowatsch found for this station, located at practically the same altitude (184 m asl versus 218 m asl):

The cooling over the 25-year period is -0.14 °C/year, or -1.42 °C/decade.

Now this pushed me to make the same analysis for meteoLCD, starting in 1998 (the “official” beginning of our data archive). Here is the result:

The same observation holds at Diekirch: since 1998 (23 years) the Ice Saints period is cooling; here the trend is -1.34 °C/decade, a quite impressive number. If one would foolishly extend this to 2100, we would expect Ice Saints days more than 10 °C cooler than today!

The plot shows that there are important variations with respect to the linear trend line, which explains the poor R2 of 0.09. A Fourier analysis suggests 2 main underlying periods of about 2 and 10 years; the latter may be compared to the last period of the NAO (North Atlantic Oscillation), which seems to be about 20 years… but this might be a coincidence.
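For the record, the decade trend and R2 come from an ordinary linear regression. The Python sketch below shows the calculation on an artificial series built to carry exactly the quoted slope (NOT the real meteoLCD observations, which scatter much more and give the poor R2 of 0.09):

```python
import numpy as np
from scipy.stats import linregress

# hypothetical Frost-Saints mean temperatures (°C), 1998-2020, constructed
# with a trend of -0.134 °C/year -- NOT the real Diekirch measurements
years = np.arange(1998, 2021)
temps = 14.0 - 0.134 * (years - 1998)

res = linregress(years, temps)
trend_per_decade = res.slope * 10.0   # convert °C/year to °C/decade
r_squared = res.rvalue ** 2           # goodness of fit
print(trend_per_decade)               # ≈ -1.34 °C/decade
```

On the noiseless series above R2 is of course 1; replacing temps with the real observations yields the scatter and the low R2 discussed in the text.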

The following graph from the excellent climate4you website shows that the heat content of the North Atlantic has changed markedly, with a cooling period following a warming:

The period of the oscillation in the green rectangle could be close to 20 years, the number discussed above for the NAO.


What remains to take home is that despite the deafening shouts of the climate alarmists, the NGOs, the politicians etc., there is some cooling going on: we are not in a situation of constant warming everywhere!

So it still might be prudent not to put all our eggs into the warming basket!