Greens for Nuclear Energy

April 8, 2021

We are so used to the absolute rejection of everything nuclear by the Greens we know that this new UK movement comes as a bit of a surprise.

Sure, it is their view that climate change is an existential threat which underlies their new appreciation of what nuclear power, as a carbon-free energy, can do. I can live with that, even if in my opinion there is no climate emergency (read the Clintel declaration).

The Greens for Nuclear Energy home page has a short video that pushes the need for nuclear energy quite far: not only developing new technologies, but also keeping existing facilities running; this is something that would give the German Greens a heart attack!

With Michael Shellenberger, Bill Gates and other well-known Greens or former Greens (like Patrick Moore) saying clearly that nuclear energy is a must in a realistic energy mix, will the wind turn? And how will our EU Greens adapt? Will they change their opinion, or stick with their image of a movement that can only present a future "to save the planet" made of restrictions in every aspect of life, be it housing, moving, eating or traveling…

You might read this very sober article by Gail H. Marcus in Physics World (April 2017), "How green is nuclear energy?", in which she concludes that "nuclear energy is indeed green, and it offers several other advantages as well. It should, therefore, be considered in this light in decision-making on future energy-supply options".

_____________________________

added 10-Apr-2021:

Read this comment on the upcoming (and partially leaked) JRC report for the EU Commission, which also says that nuclear energy is sustainable.

Link to the full paper "An Assessment of the Sustainability of Nuclear Power for the EU Taxonomy Consultation 2019".

Global temperatures from historic documents (1/2)

August 20, 2020

1. Introduction

When we speak of global warming, the following picture is practically omnipresent:

It presents the global temperature anomaly (i.e. the difference of the actual yearly temperature from the 1961-1990 average) as given by the three best-known temperature reconstructions: GISS (= NASA), HADCRUT4 (England) and BERKELEY (the Berkeley BEST project, USA). These series more or less agree for the last 50 years, but show visible differences for the preceding 50 to 70 years. The data used come from known weather stations, but also from proxies like tree rings, ice cores etc. What is rarely mentioned is that during the late 19th and early 20th century many famous scientists worked on the same problem: find global mean yearly temperatures according to latitude (the so-called zonal temperatures) and/or find the global yearly isotherms, which were known not to coincide with the latitude circles. Many of these early researchers, like von Hann and von Bezold, were from Germany and published in German. This may explain the poor interest shown in these papers by "modern" researchers.

This situation has some similarities with the reconstructions of global CO2 levels. Here also mostly ice cores or other proxies are used, and the papers of the 19th-century scientists who made real CO2 measurements with chemical methods are often belittled. The late Ernst-Georg BECK (a German chemistry and biology teacher) made an outstanding effort to find and evaluate these old measurements, and found that these values were much more variable than claimed by "consensus" climatology. I wrote with Beck a paper, published in 2009 by Springer, on how to try to validate these old measurements, of which there were not many and whose focus was typically local (link).

2. The KRAMM et al. paper

Gerhard Kramm from Engineering Meteorological Consulting in Fairbanks and his co-authors (Martina Berger and Ralph Dlugi from the German Arbeitsgruppe Atmosphärische Prozesse, Munich, and Nicole Mölders, University of Alaska Fairbanks) have published in Natural Science, 2020 (link) a very important paper on how researchers of the old times calculated zonal, hemispheric and global annual temperatures. The very long title is "Meridional Distributions of Historical Zonal Averages and Their Use to Quantify the Global and Spheroidal Mean Near-Surface Temperature of the Terrestrial Atmosphere", and this 45-page paper is a blockbuster. It contains its fair share of mathematics, and I had to read it several times to understand the finer points. I first stumbled on the paper in a discussion at the NoTricksZone blog (link), and you might first read the comment by Kenneth Richard there.

All four authors seem to be German speakers, which explains why many citations are given in their original language. They tell us that very famous scientists of the second half of the 19th and the start of the 20th century worked to find global average temperatures. One must remember that in 1887, for instance, 459 land-based meteorological stations (outside the USA and the polar regions) and about 600 vessels gathered meteorological data; the first Meteorological Congress, held in 1873 in Vienna, had standardized the equipment (for instance dry and moist thermometers). The best-known authors of the big climate treatises written in the 1852-1913 time span are von Hann (Julius Ferdinand von Hann, 1839-1921) and von Bezold (Wilhelm von Bezold, 1837-1907), who referred to numerous other authors.

The Kramm paper tries to validate the results given by these authors, using papers from other authors and mathematical calculations.

Just to show how good the results of these authors were, look at the following extract of a graph from von Hann (1887) showing the zonal isotherms over the whole globe. I have added the text boxes:

The yellow dot shows the approximate location of Diekirch, slightly south of the 10°C isotherm. The yellow box shows that the mean temperature measured by meteoLCD was 10.6°C over the 21-year period 1998-2019, very close to the von Hann isotherm of 1887.

The authors write that "obviously the results of well-known climate researchers … are notably higher than those derived from Hadcrut4, Berkeley and Nasa GISS". So the question is: have these institutions (willingly or not) lowered the temperatures of the past and thereby amplified the global warming?

(to be continued)

Colle Gnifetti ice core… a new European temperature reconstruction

August 5, 2020

[Figure: ice-core drilling at Colle Gnifetti (picture from the PhD thesis of Licciulli, 2018)]

When we want to know the temperatures of, say, the last 1000 years, we must use proxies like changes in the O18 isotope, changes in leaf stomata or tree rings (for instance in the famous bristlecone trees) etc. The best-known proxies (besides tree rings) are ice cores, most coming from drillings in Antarctic or Greenland glaciers. Ice cores from European glaciers are few, so the paper by Bohleber et al. on ice cores from the Monte Rosa region is remarkable. The title is "Temperature and mineral dust variability recorded in two low-accumulation Alpine ice cores over the last millennium" (link), and it was published in the "Climate of the Past" series of the European Geosciences Union (EGU) in January 2018. I became aware of this paper through an excellent comment by Willis Eschenbach at WUWT (24-Jul-2020); I will come back to this later.

What makes the paper by Bohleber so special is that the location of the two ice cores is the Colle Gnifetti saddle (4450 m asl) in the Monte Rosa region (on the border between Italy and Switzerland), so really in our neighborhood compared to Antarctica and Greenland. This glacier is not very thick (only about 140 m), as the prevailing winds remove a good part of the yearly snowfall. But the ca. 65 m deep drillings allow going back more than 1000 years. The researchers studied the dust layers found in the ice cores, especially the abundance of Ca2+ ions. These dust layers are very thin, so quite sophisticated laser technologies were used to investigate them. They found a good agreement between the observed temperature trends and those of the Ca2+ dust layers (mostly dust from the Sahara: warmer temperatures increase the advection of dust-rich air masses).

The IPCC's view of the last 1000 years' temperatures

In its first assessment report (FAR) of 1990, the IPCC gave a graph from Hubert Lamb showing (without any clear temperature scale) the existence of a warmer period (the MWP) around year 1000 and the later distinctive cooling of the Little Ice Age (LIA):

[Figure: IPCC FAR temperature sketch after Lamb, showing the MWP and the LIA]

With the infamous Hockey-Stick paper by Mann (1999), featured in the 3rd assessment report (TAR, 2001), the MWP disappeared, or was ignored (link to original paper):

[Figure: the 1999 hockey-stick reconstruction]

For political or activist reasons, this faulty graph by a junior PhD became a poster child of the global warming debate, and remained so for many years, despite the fact that it was shown to be wrong because of an incorrect application of statistical methods (PCA, principal component analysis) and an inadequate choice of tree rings.

Today there are many reconstructions of the NH temperatures, and the figure below (blue arrow and highlights added by me) shows how different they are, and that at least one (Christiansen and Ljungqvist, 2012) shows strongly varying temperatures, with a very pronounced MWP nearly as warm as today (link):

[Figure: multiple NH temperature reconstructions]

Now here follows the reconstruction by Bohleber et al., based as seen above on the study of dust layers, a factor that was not considered in the hockey-stick paper.

[Figure: Colle Gnifetti temperature reconstruction (Bohleber et al.)]

I have added the text boxes and the arrows to the original graph. First one should note that the temperatures are anomalies (= deviations) from the average temperature at CG during 1860-2000. The horizontal time axis is reversed, i.e. the most recent period is at the left, and the "calibration" period is the interval 1860 to 2000. The red curve shows an independent reconstruction by Luterbacher of mean European summer temperature anomalies. The black curve gives (if I understand this correctly) these same anomalies as measured by meteorological instruments over Europe (Western Europe?).

Willis Eschenbach made a linear regression against the BEST NH temperature reconstruction and adjusted the Ca2+ curve using this function (y = 1.6*x - 0.2). The visual correlation over the last 250 years is excellent (except for a divergence over the last ~25 years):

[Figure: Eschenbach's adjusted Ca2+ curve versus the BEST NH reconstruction]

Applying the same regression to the whole CG data set, and smoothing with a 15-year filter, makes the important details still more visible:

[Figure: linearly adjusted CG series, 15-year smoothed (Eschenbach)]
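As an aside, this adjust-and-smooth procedure is easy to reproduce. Here is a minimal Python sketch: the slope and intercept are those quoted above, but the centered moving average is just one simple choice of a 15-year filter (Eschenbach's exact filter may differ), and the proxy values are synthetic stand-ins, since the real CG Ca2+ series is not reproduced here.

import numpy as np

def linear_adjust(ca, slope=1.6, intercept=-0.2):
    # Eschenbach's linear mapping of the Ca2+ proxy onto temperature anomalies
    return slope * ca + intercept

def smooth(series, window=15):
    # centered moving average as one simple choice of a 15-year filter
    kernel = np.ones(window) / window
    return np.convolve(series, kernel, mode="valid")

# synthetic stand-in for the yearly Ca2+ proxy, AD 800-2000 (illustration only)
years = np.arange(800, 2001)
rng = np.random.default_rng(1)
ca_proxy = 0.3 * np.sin((years - 800) / 120.0) + rng.normal(0.0, 0.2, years.size)

anomaly = linear_adjust(ca_proxy)   # proxy -> temperature anomaly (degC)
anomaly_15yr = smooth(anomaly)      # 15-year smoothed curve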

We clearly see two warm periods: one around 850 AD and the other corresponding to the MWP, today called the MCA (= Medieval Climate Anomaly), because it seems inconvenient to "consensus climatology" that some low-CO2 medieval times were nearly as warm as today. So Bohleber et al. write in their conclusion that "the reconstruction reproduces the overall features of the LIA … and reveal an exceptional medieval period around AD 1100-1200".

What can also clearly be seen in all these graphs is that the climate was never stable for very long: the normal situation is a changing climate!


Lindzen’s new paper: An oversimplified picture

June 23, 2020

MIT Prof. Richard Lindzen (retired) published (19 May 2020) a very interesting new paper in The European Physical Journal Plus (Springer), titled "An oversimplified picture of the climate behavior based on a single process can lead to distorted conclusions". The full article is paywalled (a shockingly high 45€ for 10 pages!), but an accessible version is easy to find by googling.

The article is written in very accessible terms, at least in the first 3 chapters and the conclusion in chapter 5. I read it carefully several times and will try to summarize it as best I can.

1. Introduction

In the introduction Lindzen recalls that greenhouse warming is a recent element in the climate literature and, even if known and mentioned, played a minor role in climate science before 1980. He also repeats a mostly ignored argument, i.e. that even if there is some global warming now (from whatever cause), the claim that it must be catastrophic should be regarded with suspicion.

2. Chapter 2

Chapter 2 is titled "The climate system", and in these less than 1.5 pages Lindzen excels in clarity. He writes nothing that could be controversial, but many of these facts are constantly ignored in the media: the uneven solar heating between the equator and the poles drives the motions of heat in the air and the oceans; in the latter there are changes on timescales ranging from years (e.g. El Niño, PDO and AMO) to millennia, and these changes would be present even if the composition of the atmosphere were unchanging.

The surface of the oceans is never in equilibrium with space, and the complicated air flow over geographic landscapes causes regional variations in climate (not well described by climate models). Not CO2, but water vapor and clouds are the two most important greenhouse substances; doubling the atmospheric CO2 content would perturb the energy budget by less than 2%.

He writes that the political/scientific consensus is that changes in global radiative forcing are the unique cause of changes in global temperature, and that these changes are predominantly caused by increasing CO2 emissions. This simplified picture of one global cause (radiative forcing) and one global effect (global temperature) describing the climate is mistaken.

It is water vapor that essentially blocks outgoing IR radiation, which causes the surface and adjacent air to warm and so triggers convection. Convection and radiative processes result in temperature decreasing with height, up to the level where so little water vapor is left that radiation escapes unhindered to space. It is at this altitude that the radiative equilibrium between incoming solar energy and outgoing IR energy happens, and the temperature there is 255 K. As the temperature has decreased with height, level zero (i.e. the surface) must be warmer. Adding other greenhouse gases (like CO2) raises the equilibrium height, and as a consequence the temperature of the surface. The radiative budget is constantly changed by other factors, such as varying cloud cover and height, snow, ocean circulations etc. These changes have an effect comparable to that of doubling the CO2 content of the atmosphere. And most important: even if the solar forcing (i.e. the engine driving the climate) were constant, the climate would still vary, as the system has autonomous variability!
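To make the emission-height argument concrete, here is a back-of-envelope illustration in Python with standard textbook numbers (my illustrative values, not taken from Lindzen's paper):

# effective emission temperature seen from space, and a mean lapse rate
T_EMISSION = 255.0   # K
LAPSE_RATE = 6.5     # K per km, mean tropospheric value
h_emission = 5.1     # km, assumed effective emission height

# the surface sits below the emission level, so it must be warmer:
T_surface = T_EMISSION + LAPSE_RATE * h_emission
print(f"surface temperature: {T_surface:.1f} K")   # about 288 K, i.e. ~15 degC

# adding greenhouse gases lifts the emission level; e.g. a 150 m rise gives:
print(f"warming from +150 m emission height: {LAPSE_RATE * 0.150:.2f} K")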

The problem of "consensus" climatology (IPCC and politics) is that it ignores the many variables at work and reduces the perturbation of the energy budget of a complex system to the perturbing effect of a single variable (CO2).

3. History

In this short chapter Lindzen enumerates the many scientists who, well into the eighties, disagreed with the consensus view. But between 1988 and 1994, climate funding in the USA, for example, increased by a factor of 15! And all the "new" climate scientists understood very well that the reason for this extraordinary increase in funding was the global warming alarm, which became a self-fulfilling cause.

Let me repeat here, as an aside, what the German physicist Dr. Gerd Weber wrote in 1992 in his book "Der Treibhauseffekt":

[Weber quotation reproduced as an image in the original post]

4. Chapter 4

This is the longest chapter in Lindzen's paper, and one that demands a few readings to be understood correctly. Lindzen wants to show that the thermal difference between the equatorial and polar regions has an influence on global temperature, and that this difference is independent of the CO2 content of the atmosphere. He recalls the Milankovitch cycles and the important message that variations in arctic (summer) insolation cause the fluctuations in ice cover. The arctic inversion (i.e. temperature increasing with height) makes the surface difference between equator and polar temperatures greater than it is at the polar tropopause (~6 km). So one does not have to introduce a mysterious "polar amplification" (as does the IPCC) to explain this temperature differential.

Lindzen establishes a very simple formula which gives the change in global temperature as the sum of the change in tropical temperature (mostly caused by greenhouse radiative forcing) and the change in the equator-to-pole temperature difference (which is independent of the greenhouse effect). This means that even in the absence of greenhouse gas forcing (which is the aim of current climate policies) there will be changes in global temperature. A schematic version is given below.
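Written schematically (my notation, not necessarily the exact form used in the paper):

\[ \delta T_{\mathrm{global}} \approx \delta T_{\mathrm{tropics}} + \gamma \, \delta(\Delta T_{\mathrm{equator-pole}}) \]

where the first term responds mainly to greenhouse radiative forcing, while the second term, weighted by a geometric factor gamma, is independent of it.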

 

5. Conclusion

The conclusion is that the basic premise of the conventional (consensus or not) climate picture, namely that all changes in global (mean) temperature are due to radiative forcing, is mistaken.

 

My personal remarks:

Will this paper by one of the most important atmospheric scientists be read by the people deciding on extremely costly and radical climate policies? Will it be mentioned in the media?

I doubt it. The climate train, like the "Snowpiercer" of the Netflix series, has been launched full steam ahead, and political decisions become more and more the result of quasi-religious emotion rather than of scientific reasoning. But reality and physics are stubborn… and just as the Snowpiercer is vulnerable to avalanches and rockfall, the current simplistic climate view could well change during the next decades, long before the predicted climate catastrophe of 2100 occurs.

COVID-19 Luxembourg, final remarks

June 6, 2020

 

by Francis Massen (francis.massen@education.lu)
06 June 2020

 

1. Introduction

The COVID-19 pandemic (or epidemic) started on 29 Feb 2020 in Luxembourg; this is day 0 or day 1, depending on how one counts (I start at day 0).

I wanted to follow the evolution using a model (better: a formula) developed about 200 years ago by Benjamin GOMPERTZ, an English autodidact and later member of the Royal Society. The GOMPERTZ formula belongs to the category of sigmoid functions, which describe a phenomenon that starts slowly, goes through a quasi-exponential development phase and then slows down towards zero progression. The formula, written in Excel notation, is:

y = a*exp(-b*exp(-c*x)), where exp(x) = e^x

It has only 3 parameters, and clearly a represents a horizontal asymptote as x tends to infinity: y(inf) = a*exp(-b*0) = a

The Gompertz function is much in use in population dynamics, but also in the first phase of a developing epidemic.

All data and graphs are in the subdirectory Archive.

 

2. The situation at the end (31-May-20)

Fig.1 shows the Gompertz function fitted to the 92 days of total infected in Luxembourg, starting 29-Feb-20 and ending 31-May-20.

Modeling means that the parameters a, b and c are chosen mathematically by a regression calculation (Levenberg-Marquardt algorithm) to give a best fit to the observations:

fig.1. Gompertz function fitted to the total infected after 92 days, extended to 100 days.

The next figure shows the previous fit and the observations:

fig.2. Deviations from Gompertz fit

 

Despite the visible differences, statistically all parameters are significant and the goodness of fit is R2 = 0.998 (the maximum possible being R2 = 1).
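For readers who want to reproduce this kind of fit, here is a minimal Python sketch using SciPy's Levenberg-Marquardt implementation. The case numbers below are synthetic stand-ins (the real series is not reproduced here), and the 95% intervals use a simple normal approximation:

import numpy as np
from scipy.optimize import curve_fit

def gompertz(x, a, b, c):
    # y = a*exp(-b*exp(-c*x)); a is the final plateau (horizontal asymptote)
    return a * np.exp(-b * np.exp(-c * x))

# days 0..91 since 29-Feb-20 and synthetic cumulative totals (stand-in data)
x = np.arange(92, dtype=float)
rng = np.random.default_rng(0)
y = gompertz(x, 4000.0, 8.0, 0.08) + rng.normal(0.0, 30.0, x.size)

# Levenberg-Marquardt fit (method='lm' is SciPy's L-M implementation)
popt, pcov = curve_fit(gompertz, x, y, p0=(3000.0, 5.0, 0.1), method="lm")
se = np.sqrt(np.diag(pcov))                       # standard errors of a, b, c
ci95 = [(p - 1.96 * s, p + 1.96 * s) for p, s in zip(popt, se)]

residuals = y - gompertz(x, *popt)
r2 = 1.0 - np.sum(residuals**2) / np.sum((y - np.mean(y))**2)
print(f"a (plateau) = {popt[0]:.0f}, 95% CI = {ci95[0]}, R2 = {r2:.3f}")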

 

3. How does the Gompertz function fit at the start?

A first question is “when is the fit statistically significant?”.

Let's take as an example the situation on 16-Mar. None of the 3 parameters was statistically significant; the uncertainty range for parameter a was a ridiculous [-19621 … +22437].

Nevertheless the fit to the few data points was visually excellent:

fig.3. Gompertz fit to the 6 available data points from 29-Feb to 16-Mar

 

Had we taken this fit as a valid predictor for the future, we would have been in for a big surprise:

fig.4. The previous fit extended to 100 days (red line), compared to the observations (green dots).

The green curve shows what happened; the red one is the prediction made on 16-Mar.

Conclusion #1: beware of predictions made too early in the development of the epidemic!

 

The 24th March (= day 24) was the first day on which all parameters were statistically significant (at the 95% confidence level); if we use the Gompertz function calculated from the data available at that moment, the previous fig.4 changes (see fig.5):

fig.5. Gompertz fit made at day 24; the uncertainty range for a at that date shown by the green box was [1182 .. 3098]

 

We see that the final number of total infected is about 1400 more than the prediction made on 24 March; the final total is even about 1000 cases more than the upper bound of the uncertainty range!

 

Conclusion #2: beware of predictions made too early in the development of the epidemic! (I repeat myself!)

 

4. Evolution of the number of deaths

Fig.6 shows the number of deaths on the first day where all parameters of the Gompertz fit are significant; the yellow rectangle corresponds to the borders of the uncertainty range, which is large, but not impossibly so.

Fig.6. Death numbers and Gompertz curve prediction, 05-Apr-20 (= day 36)

The Gompertz curve does not fit very well, as the death numbers increase in jumps:

Fig.7. Death number and Gompertz fit on all observations until 5th Apr. 20

 

The uncertainty range does not narrow smoothly, but goes through some violent swings before narrowing continuously from about day 44 (8 April) on, as shown by the next animation:

Fig. 8. Animation showing the evolution of death number and the uncertainty interval (upper and lower bounds by full and open squares)

 

Similar to what was concluded above: had one taken the prediction made on 9 April, the number of predicted deaths would have been about 250, compared to the final 110.

 

Conclusion #3: beware of predictions made too early in the development of the epidemic! (I repeat myself!)

 

5. Final remarks

1. The 200-year-old Gompertz curve represents the ongoing epidemic well, independently of the political decisions to impose strict measures (shutting down schools and many economic actors, imposing a relatively strict quarantine): the evolution of both the total infected and the total death numbers follows this simple curve well. The death numbers vary more, in pauses and jumps, so their deviation from the Gompertz curve is more apparent.

2. Even if all 3 parameters of the curve are found to be statistically significant, one cannot rely on the prediction for a correct estimate of the total numbers of infections and deaths. These predictions are only valid when enough data are available, which poses a serious decision problem, as this "enough" is unknown a priori. In hindsight, one can see that after about 50 days the total number of infected can be reasonably well predicted; this is the moment when the exponential increase is over and the progression begins to slow down. For the number of deaths, the date when a valid prediction can be made comes about 10 days later (i.e. 60 days after the start).

3. The sad conclusion is that this modeling, its intellectual and scientific interest aside, is a poor instrument for supporting political decisions at the earliest moment, the time when they are most urgently needed. Models are very good at describing the final development stages of the pandemic; but alas, their predictive usefulness is lowest at the start of the epidemic, exactly when guidance is needed most. One should not be fooled by an extremely good R2: all models follow the past in an excellent way, but this is in general no guarantee of how good they are at predicting the future.

 

 

Francis Massen, meteoLCD

06 June 2020


Frost Saints Days cooling since 1998 at meteoLCD

May 24, 2020

[Figure: the five Frost Saints (Eisheilige)]

We all know since our earliest childhood days that in May there is a period of 5 days that often brings back severe cooling before temperatures climb again towards summer values. The "Frost Saints" (or "Ice Saints") are called "Eisheilige" in German, and are known as Mamertus, Pankratius, Servatius, Bonifatius and "die kalte Sophie" (from 11 to 15 May). The climatological cause is cold polar air streaming over a soil still cold at night, which then often causes ground frost. This frost may destroy or damage young seedlings, so it has always been something farmers were afraid of.

Josef Kowatsch has an article in "Die kalte Sonne" (English translation and comments at NoTrickszone) titled "Warum werden die Eisheiligen seit 25 Jahren immer kälter?" ("Why have the Frost Saints been getting ever colder for 25 years?"). Kowatsch shows that at 5 selected German weather stations the trend over the Frost Saints period is negative, i.e. these days are cooling. The station closest to meteoLCD is Bad Kreuznach, located 125 km east of Diekirch at practically the same altitude (184 m asl versus 218 m asl). Here is what Kowatsch found for this station:

The cooling over the 25-year period is -0.14 °C/year, or -1.42 °C/decade.

Now this pushed me to do the same analysis for meteoLCD, starting in 1998 (the "official" beginning of our data archive). Here is the result:

The same observation holds at Diekirch: since 1998 (23 years) the Ice Saints period has been cooling; here the trend is -1.34 °C/decade, a quite impressive number. If one foolishly extended this trend to 2100, one would expect Ice Saints days more than 10 °C cooler than today!

The plot shows important variations with respect to the linear trend line, which explains the poor R2 of 0.09. A Fourier analysis suggests 2 main underlying periods of about 2 and 10 years; the latter may be compared to the period of the NAO (North Atlantic Oscillation), which seems to be about 20 years… but this might be a coincidence. A sketch of both calculations follows.
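Here is how such a trend and a crude Fourier check can be done in Python; the series below is a synthetic stand-in, since the real meteoLCD values are not reproduced in this post:

import numpy as np

# synthetic yearly Frost-Saints mean temperatures, 1998-2020 (stand-in data)
years = np.arange(1998, 2021)
rng = np.random.default_rng(42)
temps = 11.0 - 0.134 * (years - 1998) + rng.normal(0.0, 1.5, years.size)

# linear trend, reported in degC per decade
slope = np.polyfit(years, temps, 1)[0]
print(f"trend: {10.0 * slope:+.2f} degC/decade")

# crude periodogram of the detrended series
detrended = temps - np.polyval(np.polyfit(years, temps, 1), years)
power = np.abs(np.fft.rfft(detrended)) ** 2
freqs = np.fft.rfftfreq(years.size, d=1.0)   # cycles per year
dominant = 1.0 / freqs[1:][np.argmax(power[1:])]
print(f"strongest period: about {dominant:.1f} years")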

The following graph from the excellent climate4you website shows that the heat content of the North Atlantic has changed markedly, with a cooling period following a warming:

The period of the oscillation in the green rectangle could be close to 20 years, the number discussed above for the NAO.

Conclusion

What remains to take home is that despite the deafening shouts of the climate alarmists, the NGOs, the politicians etc., there is some cooling going on: we are not in a situation of constant warming everywhere!

So it still might be prudent not to put all our eggs into the warming basket!

Wood burning… real numbers for a green-hyped energy

May 3, 2020

Prof. Fritz Vahrenholt's and Dr. Sebastian Lüning's blog "Die kalte Sonne" has a discussion of the US movie "Burned: are trees the new coal?" (streaming here), which shows what happens in the American forests that are the source for European wood-burning installations (power stations, heating…). Since the EU made the, in my opinion, completely wrong decision to hype wood burning as "green" and "renewable", massive quantities of wood are transported from the eastern US to the EU, where converted power stations like the UK's Drax burn every year wood felled from an area of 830 km2, corresponding to 1/3 of the surface of Luxembourg (link). "Die kalte Sonne" points to a very instructive document from the Swiss Bafu (Bundesamt für Umwelt) which shows the emissions of wood-burning installations compared to natural gas and light-oil (HEL) facilities. You may find the document here.

Concerning fine-particle and dust emissions ("Staub" = dust), just compare the numbers I highlighted in turquoise (the unit is mg/MJ, i.e. a mass per unit of energy produced, not a mass per m3 of air!). A "normal" household heating system has a power < 50 kW; comparing these, we see that wood burning has dust emissions per energy unit that are between 250 and 1000 times higher than those of corresponding oil or gas installations. That says it all!

PS: You may read the research paper "The Burning Question: does Forest Bioenergy Reduce Carbon Emissions?" (link)

On wind, CO2 and other gases (3/3)

May 2, 2020

4. A quick recap.

This is the third and last part of my comments and observations on the influence of wind speed on near-ground CO2 concentrations (mixing ratios). Let me summarize what we saw and discussed in the first two parts:

1. Observation shows that under low-wind conditions the daily CO2 concentration swings heavily from an early-morning low to an afternoon high; the amplitude can reach values of 130 ppm.
2. When wind blows, and all other conditions such as solar irradiance and air temperature stay more or less the same, this daily swing is dramatically dampened: the peak values can be clipped by more than 100 ppm, the minimum values remain more or less unchanged, and the amplitude of the daily swing drops to about 20 ppm.
3. A similar pattern can be observed in the NO2 concentrations, as measured for instance at the Beckerich station.
4. Ground-ozone concentrations do NOT follow this pattern of peak-clipping at all: the maxima remain more or less unchanged, but the minima are drastically higher. Air temperature plays a minor role, as shown by the next figure, which shows the temperature and ozone data at meteoLCD for the 7 days ending Saturday 02 May, afternoon:

Box A clearly shows constant O3 minima in spite of rising air-temperature minima; box B shows that the O3 maxima are lower: the air-temperature maxima are also lower by about 5°C (25%), whereas the wind speed is much higher: the peak on 26/04 is 1.8 m/s, while it exceeds 6 m/s (>300%) during the B-box days! So wind speed is possibly the main factor raising the daily O3 minima.

5. How can we explain the different O3 pattern?

Luckily, I found a recent paper by Thomas Trickl et al., published in January 2020 in "Atmospheric Chemistry and Physics" and titled "Very high stratospheric influence observed in the free troposphere over the northern Alps – just a local phenomenon?" (link). This paper shows that incursions of stratospheric ozone are much more important than the consensus science says. Usually it is assumed that these incursions (known for many years) do not raise local concentrations by much more than 10 ppb (20 ug/m3). The paper shows from observations that this is not the case: the incursions can be much higher, something that activist scientists, who see high ozone levels as caused exclusively by human activity, are eager to ignore. The very high O3 concentrations in the stratosphere can increase the O3 concentrations at mid-troposphere heights (and possibly lower) by quite a lot. Balloon observations and LIDAR soundings (by laser) give us a good picture of how the O3 concentration varies with altitude. The next picture (from the paper) shows the situation at Garmisch-Partenkirchen:

For us, the lower part in the fuchsia-colored box is important: we see that the O3 concentration does vary with the time of day, being highest during the late afternoon (grey curve), as we know well; but most importantly, the overall O3 values are more or less constant! This means that higher wind speeds do increase the mixing of air layers, but as the O3 concentration is about the same at all levels, that mixing does not cause a dilution! The low morning values during wind-poor days are the result of O3 destruction by NO (we may assume that NO concentrations are well correlated with NO2, the only one of these gases for which we have observational values).

The following picture shows an extreme situation at meteoLCD on 2-Feb-2000; this was a period when our NO and NO2 sensors (by Environnement SA) were still in action. In the year 2000 parts of our buildings were being reconstructed, and heavy machinery such as compressors and excavators was often operating at ground level. This day they started after noon, and the extraordinarily high NO levels are mainly caused by the starting up of the Diesel engines. The result is a near-complete destruction of O3 (through the titration reaction NO + O3 -> NO2 + O2), at a time of day when O3 concentrations are normally rising:

 

6. In conclusion.

Once more we have shown that near-ground CO2 concentrations are heavily influenced by wind speed; this CO2-lowering influence certainly is much more important than that of photosynthesis, which works in the same direction. Ground-ozone concentrations are not impacted in a similar manner, whereas those of NO2 are. So, and this may come as a surprise, the ground-level concentrations of different atmospheric gases do not respond to increasing wind speeds in the same manner!

On wind, CO2 and other gases (2/3)

April 26, 2020

In this 2nd part we will take a deeper look at CO2 variations, wind speed, ground ozone and solar irradiance. I will settle on the 11-day period from 14 to 24 April 2020 inclusive. The data are fed into STATISTICA, and the graphs are made with that software.

 

2. CO2 and wind speed over the 11 days, and a zoom on two days

This plot shows the dampening of the daily CO2 peaks during the 3-day period with higher wind speed around the clock: the CO2 peaks plunge from about 540 to less than 430 ppm, practically by one fifth! This windy period stops on the 23rd of April, and on the 24th we are back to a 520 ppm morning peak. The 3-day low-CO2 period covers Monday, Tuesday and Wednesday, i.e. days which, despite the lockdown, certainly have more traffic than the weekend days (Saturday and Sunday, 18 & 19 April).

Let us take a look at two days: Friday 17th April, belonging to the low-wind period, and Monday 20th April, when the wind blew strong.

I chose the same scales for both figures, which allows a better visual comparison: compared to the upper graph, the CO2 levels seem practically flat during the windy period.

Conclusion #1: CO2 levels at Diekirch are mostly lowered by air movements, which dilute the CO2 concentration through a higher mixing of the near-ground air.

Attention: this is a preliminary conclusion; we know that photosynthesis is the big CO2 killer, using it as the building block of plant matter. So we must also look at solar activity (irradiance, energy per day…) to check whether our CO2 slump is not simply the result of stronger photosynthesis.

 

3. CO2, ground ozone and solar irradiance

But let us first look at how another gas that we measure at meteoLCD varies during the same period: ground O3. The next combined graph shows how CO2 and O3 vary together, and how CO2 and solar irradiance change.

3.1 First the O3 problem:

During the "no-wind" periods the O3 levels plunge to near zero in the morning hours, whereas they remain relatively high during the windy days: exactly the opposite of what we see with CO2! The 2 turquoise lines show that the CO2 minima and O3 maxima happen at the same time, in the late afternoon.

Now we know that nightly O3 patterns differ markedly between city and rural locations. In a city with its round-the-clock traffic, O3-destroying emissions of NO remain effective and, in the absence of the UVB radiation which creates O3, rapidly lower the O3 concentration. The opposite happens at rural stations: no night traffic, no NO emissions, no important destruction of O3; especially when night temperatures remain high, the O3 minima do not fall back to, say, 10 ug/m3, but stay at 60-80 ug/m3.

Let us check this using the measurements made at Beckerich, a semi-rural village similar to Diekirch regarding traffic: 2 peaks per day of in- and outgoing traffic from the cross-border commuters working in Luxembourg, and a moderate traffic level during the rest of the day. This is nicely shown in the following picture:

The upper plot shows the O3 concentration, and the lower one the NO2, which we will use as a proxy for NO. The two red vertical lines show the morning NO2 peaks and the corresponding O3 lows. If one looks carefully, two NO2 peaks per day can be seen. The highlighted section corresponds to the windy days (remember that these are working days!): the NO2 levels remain very low, and so do not bring the O3 concentration down to near zero!

Conclusion #2: NO2 levels respond to wind speed as does CO2; more wind means more dilution and lower levels. O3 concentrations do not! Not all atmospheric gases are impacted by higher wind speed in the same manner!

 

3.2. And now solar irradiance and temperature in more detail:

Can we see the influence of solar irradiance (and air temperature) during low-wind days? Intuitively one would expect stronger air movements by convection when solar irradiance (and air temperature) is higher. So the morning CO2 minima should be lower on sunnier and warmer days.

First, a plot showing CO2, solar irradiance and air temperature for the two "wind-poor" days of 15th and 23rd April.

Clearly solar irradiance and air temperature are higher on the second day, so on this 23rd April air convection as well as photosynthesis will both be stronger. To get a clearer picture, let us retain only CO2 and air temperature:

The CO2 minimum is 8 ppm lower on the warmer and sunnier day. We cannot dissociate the two CO2-lowering causes, convection and photosynthesis; suffice it to say that this lowering of 8 ppm has to be compared with the peaks, which are more than 100 ppm lower on windy compared to wind-still days… a difference of one order of magnitude!

Conclusion #3: The huge CO2 lowering effect during windy days is caused by increased air movements. The variations caused by photosynthesis and air convection are much smaller.

 

(to be continued with the 3rd and last part)

On wind, CO2 and other gases (1/3)

April 24, 2020


1. Introduction

meteoLCD is one of the few stations that publish CO2 measurements at near-ground level; our readings give the real situation as it is. If you look at the CO2 plots (here), you may be surprised by the often huge daily swings in concentration, whereas the media usually suggest a smooth and continuous rise. By the way, one should say "mixing ratio", but "concentration" is so much easier on the mind…

One of the main factors influencing the CO2 levels is wind speed: no wind means no turbulent mixing of the boundary layer, and especially in the cool morning hours CO2 will remain trapped under the inversion and levels reach high values. During the day, when the sun warms the air, convection springs into action and dilutes the CO2 concentration. Now add some more or less heavy wind, and that dilution will reach "extreme" levels: the concentration becomes comparable to the numbers published by stations located at the beach front, on small islands or on top of a high volcano, as is the case for the famous Mauna Loa station. But even Mauna Loa must watch its steps: from time to time the volcanoes belch out a CO2 plume, and these moments must be "edited out" of the data series. Other continental stations keep only the measurements made in the afternoon, to get what is called "the background level".

A long time ago (2009) I published a paper with the late Ernst-Georg Beck on how to calculate that background if one has both the wind-speed data and the CO2 mixing ratios. Beck was a specialist on vintage CO2 measurements made with chemical methods, and so we tried to extract a possible background from these historic measurements. You may look at the paper (published by Springer) here. In our TRENDS section I redo this exercise every year, and this is the situation for 2019:

The number 411 ppmV represents the horizontal asymptote and is practically the same as the Mauna Loa value for 2019 (411.4 ppmV), so this might be a lucky year!
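The idea behind this yearly exercise can be sketched in Python as follows; the exponential-decay form and the numbers are assumptions for illustration (the functional form actually used in the Massen & Beck paper may differ):

import numpy as np
from scipy.optimize import curve_fit

def co2_vs_wind(v, background, amplitude, decay):
    # asymptotic model: CO2 tends to 'background' as wind speed grows
    return background + amplitude * np.exp(-decay * v)

# synthetic half-hourly observations: wind speed (m/s) and CO2 (ppmV)
rng = np.random.default_rng(7)
wind = rng.uniform(0.2, 8.0, 500)
co2 = co2_vs_wind(wind, 411.0, 90.0, 1.2) + rng.normal(0.0, 8.0, wind.size)

popt, _ = curve_fit(co2_vs_wind, wind, co2, p0=(400.0, 80.0, 1.0))
print(f"estimated CO2 background (horizontal asymptote): {popt[0]:.1f} ppmV")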

These last days we had sunny weather: several days with high wind speed, then 3 days with poor wind, and again a couple of windier days. Look how the CO2 concentration was influenced by this wind pattern (the graph is from today, Friday 24th April 2020):

 

Do you see the difference?

Well, in the next part, which I will try to write during the weekend, we will look more into the details and see how other atmospheric gases like ozone and NO2 behaved during that same period.

 

(to be continued)