Welcome to the meteoLCD blog

September 28, 2008

This blog started on 28th September 2008 as a quick communication tool for exchanging information, comments, thoughts, admonitions, compliments etc. related to http://meteo.lcd.lu, the Global Warming Sceptic pages and environmental policy subjects.

Klima macht Geschichte: a turning point in German climate hysteria?

January 26, 2015

[Figure: ZDF “Klima macht Geschichte” title image]

The German national television ZDF broadcast a (mostly!) remarkable two-part series (Part I, Part II) on the impact of climate change on human history and development. Part I starts with the Neanderthals and ends with the beginning of the Roman Empire. Part II (which I prefer) starts with the warm Roman period and ends with the modern warm period. What is remarkable is that throughout the series, the authors insist that the recurring great climate changes are due to natural phenomena, mostly Milankovitch cycles with their changing solar irradiance, and volcanic activity. For instance it is said that “die Sonne ist der Hauptakteur” (“the sun is the main actor”), or “jede Klimaveränderung wird vom Weltall aus gesteuert” (“every climate change is steered from outer space”). A recurrent leitmotiv is that warmer (and more humid) periods are good for development, colder ones are bad.

The most interesting second part tells the stories of the last three warm periods (Roman, Medieval and today) and shows how a warmer climate fostered cultural, scientific and political development. And vastly increasing populations are not described as a parasitic illness destroying the planet, but as a welcome and “natural” development thriving in good climatic conditions.
Why did I start with the qualifier “mostly”? Because the excellence of the two 43-minute parts is spoiled by the last 60 seconds, where Mark Maslin (University College London) closes with this sentence: “We are at a point where we can decide how the future climate will be.” This is blatant silliness, probably forced upon the professor to include at least one sentence seen as politically correct and Zeitgeist-aware. This conclusion is all the more silly as all the previous examples clearly showed that the great changes of the climate were not caused by human activity. And today, whatever our technological achievements, we are still unable to change the tilt of the Earth's axis, modify solar activity or put a lid on volcanoes to prevent their eruptions.

Nevertheless, this broadcast makes me more optimistic: could it be that the fashionable hysteria regarding an anthropogenic climate change is losing steam, and starting to go the way of all fads and Zeitgeist exaggerations, i.e. dissolving into oblivion?

Read also the excellent comments of Pierre Gosselin in his blog.

CO2 and temperature : “Da stelle mer uns janz dumm”

January 10, 2015

One of my favorite sentences from the great author Heinrich Spoerl's book “Die Feuerzangenbowle” (1933) is spoken by the physics professor Bömmel. Explaining the working of the steam engine, he begins with “Da stelle mer uns janz dumm” (approximate translation: “Let's start by assuming we are completely stupid”).
The primordial question about a possible anthropogenically caused global warming is the temperature increase following rising atmospheric CO2 concentrations (or more correctly “mixing ratios”). The IPCC assumes that AGH (anthropogenic greenhouse gases), and predominantly CO2, are the principal driver of the last 150 years' warming of about 0.7°C. We know from laboratory experiments that the radiative forcing F (in W/m²) produced by CO2 is proportional to the (natural) logarithm of its concentration. Generally it is assumed that dF = 5.35*ln(CO2new/CO2old), and that the global temperature increase dT is proportional to dF: dT = lambda*dF, where lambda is the much debated (equilibrium) climate sensitivity. Lambda is not only much debated, but the last 30 years of multi-billion, officially fostered climate research have not narrowed the “consensus” estimate range (about 1.5 to 4.5 °C). Now politicians eager to avoid planetary Armageddon and to show their environmental care have only one simple question: “by how much will the globe warm when our emissions increase atmospheric CO2 by so and so much?”. Alas, their only acceptable counselor (the IPCC) can only give an answer based on models (an answer the IPCC calls a scenario, to avoid that satanic word “prediction”).
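
To make the orders of magnitude concrete, here is a minimal Python sketch of the relation just quoted. The only inputs are the 5.35 coefficient and the 1.5-4.5 °C sensitivity range mentioned above; converting the ECS range into lambda assumes, as is usual, the forcing of a CO2 doubling.

```python
import math

def delta_F(co2_new, co2_old):
    """Radiative forcing change in W/m2: dF = 5.35 * ln(CO2_new / CO2_old)."""
    return 5.35 * math.log(co2_new / co2_old)

dF_doubling = delta_F(2.0, 1.0)   # ~3.71 W/m2 for a CO2 doubling

# The "consensus" ECS range of about 1.5 to 4.5 degC per doubling
# corresponds to lambda = ECS / dF_doubling:
for ecs in (1.5, 4.5):
    print(f"ECS {ecs} degC  ->  lambda = {ecs / dF_doubling:.2f} degC per W/m2")
```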

So we will follow professor Bömmel's dictum and “da stelle mer uns janz dumm”. Let us look at the past data since 1850, using the Hadcrut4 data set for global temperatures (the dataset used by the IPCC) and an officially accepted CO2 data set (the Mauna Loa series extended backwards to 1850). I will use Prof. Humlum's excellent website www.climate4you.com (a website that shines like a rare true scientific beacon amidst global activism and hysterical enviro-angst). We will use two periods with very different CO2 increases: 1850 to 1945, where CO2 increases from 290 to 310 ppmV (let's not quibble over these numbers, even if they are still open to debate), and 1945 to 2013, where the CO2 concentration goes from 310 to 397 ppmV. The global temperature increases by 0.3 K (or °C) during the first period, and by 0.4 K during the second.

Now let us take the simplistic model dT = k*ln(dCO2), where d means a delta: dT = temperature difference, and likewise for dCO2. I know, this model differs from the expression given above; it is a “janz dumme” hypothesis which will be subjected to observational verification.

The first period allows us to calculate the factor k: k = dT/ln(dCO2) = 0.3/ln(20) = 0.1.

Let us take this result to calculate the temperature increase that a CO2 swing from 310 to 397 ppmV would yield: dTcalc = 0.1*ln(87) = 0.45 K. The observational data give dT = 0.40 K, very, very close to the simplistic calculation. Our zero-dollar model is more or less verified by the second period of nearly 70 years.

[Figure: CO2 and global temperature, 1850-2013]

fig. Global temperature anomalies (Hadcrut4 series) for 1850 to present, together with the CO2 increase.

Now we will look into the crystal ball and predict how much warming a CO2 doubling from the pre-industrial (1850) baseline will cause (if everything continues as it has during the last 163 years):

dT = 0.1*ln(580-397) = 0.52 K. This will be the warming from today on if CO2 levels rise to 580 ppmV. The IPCC's “business as usual” scenario A1FI predicts this mixing ratio for 2050, and about 1000 ppmV for 2100. Now 1000 ppmV seems ***BIG***, but our model shows that it would correspond to a warming of only dT = 0.1*ln(1000-397) = 0.64 K from today's situation. This means a total warming of 0.7 + 0.64 = 1.34 °C with respect to the cold, end-of-Little-Ice-Age pre-industrial times if we do nothing. Hardly anything to shudder and scream about!
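
The whole “janz dumm” exercise fits into a few lines of Python (all numbers as above; the only liberty taken is carrying k at full precision instead of the rounded 0.1):

```python
import math

# Calibration on 1850-1945: CO2 290 -> 310 ppmV, observed dT = 0.3 K
k = 0.3 / math.log(310 - 290)           # ~0.100

# Verification on 1945-2013: CO2 310 -> 397 ppmV, observed dT = 0.4 K
print(k * math.log(397 - 310))          # ~0.45 K (vs. 0.40 K observed)

# Crystal ball: further warming for CO2 rising from today's 397 ppmV
print(k * math.log(580 - 397))          # ~0.52 K at 580 ppmV
print(k * math.log(1000 - 397))         # ~0.64 K at 1000 ppmV
print(0.7 + k * math.log(1000 - 397))   # ~1.34 degC above pre-industrial
```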

CO2 avoidance by wind power

December 7, 2014

The question of how much wind turbines for electricity production decrease CO2 emissions when they replace existing fossil fuel power stations is a crucial one. Underlying the political decision to push wind power electricity is usually the assumption that climate-hurting CO2 emissions will be diminished. Hubert Inhaber has published a paper in the Elsevier journal “Renewable and Sustainable Energy Reviews”, vol. 15, issue 6 (August 2011), with the title “Why wind power does not deliver the expected emissions reductions”. This paper is a meta-analysis of other papers on that subject, and also an analysis of existing wind power installations; here Denmark and Germany certainly are the principal players (in 2013 Danish wind power delivered 35% of the country's total electricity production, and the German percentage was 8%). Actually the findings seem valid for all intermittent electricity producers, including solar PV.
The media and lobbies usually claim that x kWh produced by a wind turbine correspond to an avoidance of the same number times the CO2 intensity, e.g. x*850 g CO2 when the electricity would otherwise be produced by a classical coal power station. This naive assumption neglects the intermittency, clearly the Achilles heel of wind and solar power. Weather conditions may plunge these intermittent producers close to zero, as shown in the following picture (link).

[Figure: German wind power output (Windleistung), 2 December 2014]

The power needed at the end of the day was greater than 50000 MW, which means that fossil fuel generators (probably not nuclear ones) had to be ramped up to ensure grid reliability. These non-intermittent backups completely change the picture, as they must increase with a higher penetration of wind and solar power stations. Inhaber shows a nice curve (attention: the left scale is logarithmic!) based on German and other countries' data which gives the percentage of CO2 avoidance versus the penetration of wind power (penetration = percentage of wind power electricity with respect to total electricity produced), assuming grid stability must be maintained.

[Figure: CO2 avoidance versus wind power penetration, Germany 2013]

Germany's wind turbines produced 8% of a total of 634 TWh in 2013, i.e. 50.7 TWh. Let us assume that the electricity mix emits 552 g CO2 per kWh (see here). If no wind turbines had been installed, the 50.7 TWh produced by classical means would have emitted 50.7*1E9 [kWh] * 0.552*1E-3 [ton/kWh] = 28 million metric tons of CO2. With the installed wind turbines, these emissions become 24 million tons of CO2, an unspectacular avoidance of only 4 million tons (all numbers rounded). Should intermittent wind (and solar) power reach 20% penetration, the CO2 avoidance falls to an abysmal 2%!

Inhaber suggests (with caveats!) the following equation for the above curve:

CO2_avoidance% = 200/(1 + exp(0.2*penetration%))

which tends asymptotically to zero with increasing penetration.
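
As a small sketch, here is the curve evaluated at a few penetrations, together with the German 2013 baseline arithmetic from above. Note that the equation is only an approximate fit to Inhaber's curve, so its values differ somewhat from numbers read directly off the graph.

```python
import math

def co2_avoidance_pct(penetration_pct):
    # Inhaber's suggested fit (offered with caveats) for the CO2 avoidance
    # percentage as a function of wind power penetration in percent.
    return 200.0 / (1.0 + math.exp(0.2 * penetration_pct))

# Germany 2013: 8% of 634 TWh = 50.7 TWh of wind electricity.
# Produced classically at 552 g CO2/kWh, this would have emitted:
baseline_Mt = 50.7e9 * 552 / 1e12   # kWh * g/kWh = g; 1 Mt = 1e12 g -> ~28 Mt
print(f"baseline emissions: {baseline_Mt:.0f} Mt CO2")

# The avoidance percentage tends asymptotically to zero:
for p in (5, 10, 20, 50):
    print(f"penetration {p:2d}%  ->  avoidance {co2_avoidance_pct(p):7.3f}%")
```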

A second effect of increasing intermittent renewables is a dramatic plunge in the capacity factors, as shown in this picture adapted from the Power Magazine:

[Figure: installed wind and solar capacity and capacity factors for 10 countries (adapted from Power Magazine)]

Germany has an impressive 45% of combined wind and solar capacity installed, but has the lowest capacity factor (= percentage of delivered versus theoretical maximum power) of the 10 countries (the capacity factors are given in the red boxes). These numbers simply tell us that if a certain volume of electricity must be produced, the installed name-plate capacity of renewable power stations must exceed the actually needed power by an ever larger factor.

These are truly gruesome numbers, which make a mockery of the (naive?) enthusiasm of those who think that intermittent, emission-free producers like wind and solar may de-carbonize a future non-nuclear electricity production.
A discussion from Australia on this problem can be found here, and a very interesting report in Biospherica here. Read also this paper by Fred Udo, which uses EirGrid production data (not models!) to show that “the introduction of wind energy without buffer storage leads to increased fossil fuel use and CO2 emissions and is a non-sustainable practice”.
The EirGrid engineers wrote in a 2004 report, Impact of Wind Power Generation, that “It is not sufficient to estimate the amount of energy which can be obtained from a given capacity of WPG, and to assume that the equivalent percentage of fossil fuel and therefore CO2 can be avoided. This ignores the impact of the increasing number of start-ups and lower capacity factor as WPG increases.” They give two scenarios: when wind capacity is 1500 MW, the CO2 reduction is 470 kg/MWh; with 3500 MW capacity, the reduction is only 331 kg/MWh. There is a diminishing return of CO2 avoidance with increasing wind power penetration.


_______________________________________________

History:

07 Dec 2014: original version
08 Dec 2014: added lines with links to the Fred Udo paper and EirGrid report.

Wind Report Germany 2013

September 24, 2014

(This blog post has now been updated and finished, 25 Sep 2014)

The Fraunhofer Institut IWES has published a slick report on the state of wind energy in Germany, with statistics going from 2000 to 2013. Sure, the IWES people are addicted to renewable energy, so do not expect deep musings on intermittency and power grid stability. Nevertheless the report is well done, and it is worth the time to read.

1. The crux with the capacity factor

Since 2000, the installed wind capacity (i.e. total theoretical power installed) has increased dramatically, and the newest turbines are giants of 3-4 MW (with more to come). The following tables will be our starting point to calculate the yearly capacity factor of the combined ON- and OFFshore systems. Offshore is still negligible, delivering in 2013 only about 0.9 TWh of a total of 47 TWh produced.

[Figure: yearly wind energy yields and installed capacity (Erträge und installierte Leistung)]

Let us take the last decade 2003 to 2013 and calculate the capacity factor using both data series:

[Figure: capacity factors of the combined on- and offshore wind installations]

In 2004 the total installed power was about 16.5 GW, and more than double that ten years later. One has to keep in mind that some of this new power came from repowering, i.e. replacing old wind turbines by more modern ones, and as a general rule the technical sophistication of the turbines increased continuously during that decade. As a consequence one would have expected increasing capacity factors (CF's). Alas, the data show a visible negative trend and a large spread between 0.160 and 0.205. This decade suggests a periodicity of about 4 years and a possible ongoing down trend (largely caused by lower wind speeds).

The conclusion is that increasing the number of onshore wind turbines, even by a vast amount, and aggressive repowering will not increase the CF (here the decadal mean is 0.179). Even a very optimistic fan of wind power should not expect more than 0.20 for onshore wind parks. Shall I recall that classic power stations (fossil, nuclear…) have CF's between 0.80 and 0.90!

Offshore wind parks surely do much better, but the data available in Germany are still sparse.

The next figure from the report gives the “Volllaststunden” of offshore wind parks for the decade 2002 to 2011; to get the CF, divide this number by 8760 (the Germans mostly use the “Volllaststunden”, which correspond to the virtual yearly working hours at an output equal to the name-plate capacity).

[Figure: Volllaststunden (full-load hours) of offshore wind parks, 2002-2011]

The red line is the estimated average (by eyeballing) of ~3300 hours, which gives a CF = 0.377, about two times that of the onshore parks. Note the big differences between the wind parks during the last years!
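
Both capacity factor computations are one-liners; here is a small sketch. The ~34 GW installed in 2013 is my rough estimate from the report's tables (the exact figure is there), and strictly one should use the average capacity over the year rather than the year-end value:

```python
HOURS_PER_YEAR = 8760

def cf_from_energy(energy_TWh, capacity_GW):
    # CF = energy actually produced / energy at continuous name-plate output
    return energy_TWh * 1e6 / (capacity_GW * 1e3 * HOURS_PER_YEAR)

def cf_from_volllaststunden(full_load_hours):
    # German reports quote "Volllaststunden"; dividing by 8760 gives the CF
    return full_load_hours / HOURS_PER_YEAR

print(cf_from_energy(47.0, 34.0))       # 2013 on+offshore: ~0.16
print(cf_from_volllaststunden(3300))    # offshore average: ~0.377
```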

Conclusion: should the onshore turbines be scrapped and only offshore installations be built? One big unknown is the reliability of the offshore turbines, which work in much harsher conditions, might have life spans much shorter than the usual 20 years assumed for the land-based turbines, and possibly need much more expensive maintenance.


2. The recycling problem

The report has an interesting chapter on what to do with the big structures when their end of life is reached. Steel, concrete and copper cabling do not pose great problems, but nobody knows how to recycle the fiberglass rotors or recover the rare earths used in the magnets. For the moment, there is no reliable information on what happens to the turbines which have been dismantled. It is suggested that the fiberglass rotors be burned in cement factories. The authors write that “zu den Fragen (Aufgaben, Verantworlichkeiten…) halten sich die Betreiber im Moment noch bedeckt” (“on these questions (tasks, responsibilities…) the operators are still keeping a low profile for the moment”).


3. Running future offshore wind parks.

Offshore wind parks have a large number of wind turbines; it is known that wind turbines work less efficiently in more turbulent air. This means that the second, third etc. turbines in a row (or a matrix) suffer from the turbulence and air velocity reduction created by the up-wind systems. IWES found that slowing down the first turbines hit by the wind markedly increases overall performance, as shown in the next graph:

[Figure: effect of regulating the first turbines of an offshore row]

Putting a restriction on the first of 9 turbines in a row increases total power by about 25%. This shows that an intelligent regulation system is mandatory for offshore wind parks.


4. “All electric” future.

The IWES sees the German energy future as a totally electric one, with electricity produced exclusively from “renewable” sources: hydro, biomass (biogas), wind, solar and geothermal. Note that CO2-free nuclear power is completely absent from this scenario. Fossil sources like oil will practically only be permitted as a feedstock for chemistry. The next figure, probably from a simulation run (text boxes added by me), gives the total yearly energy consumed in 2013 for the different sectors of activity and life:

[Figure: Primärenergiebedarf (primary energy demand) by sector]

In 2011 Germany used 3772 - 285 = 3487 TWh of energy for these 3 activities; the 285 TWh were needed for other activities. If the future is “all-electric” and “renewable”, this is the total which must be delivered by hydro, wind, biomass, solar and, possibly for a very small fraction, geothermal sources. In 2013, these renewables delivered about 147 TWh, which means that without major savings through increased efficiency the renewables must be upped by a factor of about 25! Let us make the very, very optimistic hypothesis that heating requirements will be half of what they are today and that the better efficiency of electrical cars and transportation systems will cut the traffic (= transportation) requirements also by 50%. This still leaves about 2509 TWh to be produced by renewables. In 2013 wind energy delivered about 47 TWh, biomass ~29 TWh, solar PV ~19 TWh and hydro ~14 TWh. Hydro power is nearly at its maximum; biomass cannot be upped tremendously either, as no supplementary soil can be used for growing energy plants. This means that wind and solar PV will have to take on the burden of delivering at least 2400 TWh. Without any revolution in electricity storage, and even with smart grid technology, at least two times these 2400 TWh must be installed, to be sure that a minimum baseload is always available. These numbers make me dizzy, and I wonder where the surfaces to install the new systems will be found.
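
The scale-up arithmetic condensed (all numbers are those of the paragraph above):

```python
total_TWh       = 3772 - 285   # = 3487 TWh for the three sectors
renewables_2013 = 147          # TWh delivered by all renewables in 2013
print(total_TWh / renewables_2013)       # ~24: "a factor of about 25"

remaining = 2509               # TWh left after halving heating and traffic
hydro, biomass = 14, 29        # TWh, both nearly maxed out
print(remaining - hydro - biomass)       # ~2466: "at least 2400 TWh"
```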

Now let us assume that by a twist of German public mood nuclear power made a comeback. A modern nuclear facility of say 4000 MW has a capacity factor of about 80%, so it will deliver a baseload of 28 TWh per year. Dividing 3610 by 28 gives 129 nuclear installations, each needing about 2-3 km², which gives a total of less than 400 km² of occupied land or sea surface. A report from the NREL gives the needed permanent area for wind parks as 0.3 hectare/MW, probably an unrealistically low number, as new environmental restrictions (the “10H Regel”) impose larger and larger distances from dwellings. Now let us assume that 1200 TWh must be delivered by wind (the other half coming from solar PV on roofs or other sources with minimal land area usage). Assuming an overall capacity factor of 0.3 (which would correspond to a huge increase in offshore installations), the needed name-plate power will be (1200*1E6/8760)/0.3 = 456620 MW, and the occupied land/sea area ~137000 hectares = 1370 km². The same energy delivered by nuclear facilities would need less than 200 km², more than 6 times less, and would be non-intermittent and reliable.
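
The land-area comparison, step by step (same inputs as above; the 3 km² per nuclear plant is the upper end of the quoted 2-3 km² range):

```python
HOURS = 8760

# Nuclear: 4000 MW at a capacity factor of 0.8
TWh_per_plant = 4000 * HOURS * 0.8 / 1e6     # ~28 TWh per year
print(3610 / TWh_per_plant)                  # ~129 plants
print(3610 / TWh_per_plant * 3)              # ~387 km2 (< 400 km2)

# Wind: 1200 TWh at an overall capacity factor of 0.3
name_plate_MW = (1200 * 1e6 / HOURS) / 0.3   # ~456620 MW
print(name_plate_MW * 0.3 / 100)             # 0.3 ha/MW; 100 ha = 1 km2 -> ~1370 km2

# The same 1200 TWh delivered by nuclear plants instead:
print(1200 / TWh_per_plant * 3)              # ~128 km2 (< 200 km2)
```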

IWES assumes that the all-electricity scenario will need only 1000 TWh (with big electric lorries fed by a system of overhead trolley wires!). In my opinion, a future with more and more energy restrictions will not be tolerated by the public (and rightly so!). Progress which would turn us back to a permanent type of post-WWII rationing might be palatable to the Greens who hold political power today, but not to their children and children's children.

16th September Ozone Layer Preservation Day

September 18, 2014

[Figure: International Day for the Preservation of the Ozone Layer banner]

The 16th September is the International Day for the Preservation of the Ozone Layer!

This 16th September commemorates the 1987 Montreal protocol banning human-made ozone-depleting gases, like chlorofluorocarbons. Read here a longer explanation of the chemical reactions, reflecting the traditional view.
This NASA graph shows that a few years after Montreal, the yearly minimum of ozone layer thickness at the South Pole stabilized (but did not markedly recover!). The scientific reasons for the ban, which seemed complete and rock-solid at that time, are becoming gradually more uncertain, as more natural sources of ozone-destroying gases have been found (e.g. bromides emitted by plankton) and a new mechanism in the South Seas has been detected. Also, new gases that seem to destroy ozone have unexpectedly been found recently.
Caring for the ozone layer is noble, but as so often with environmental problems, much hype and hysteria have muddled the issue; above all, the science really is not settled!
MeteoLCD has measured the thickness of the ozone layer above Luxembourg since 1998, and the linear trend does not show an ongoing thinning; neither are the UVB irradiation levels increasing (look at the trends page). Look also here for the results of the Uccle station, which is one of the longest-running measuring stations in the world.

Conclusion:
1. The data show no continuing thinning of the ozone layer.
2. The naturally occurring ozone hole above Antarctica has stabilized.
3. In our regions, the trend is rather toward an increase of the layer; the UVB irradiation at ground level is not increasing.
4. In our region the thickness of the ozone layer often varies tremendously over extremely short periods. To expect a constant thickness of the ozone layer is foolish.

PS1: you may read Matt Ridley's comment in The Times: The Ozone Hole Isn't Fixed. But That's No Worry (go to chapter 6 here)

PS2: here is the report of the EEA on the European emissions of ozone depleting substances

The 6th July 2014 storm… and radioactivity peak.

July 7, 2014

See new additions at the end of the text!

Some parts of Luxembourg were hit by a severe storm during the late afternoon of the 6th July 2014. Wind gusts were very high: we measured a half-hour maximum of 15 m/s (54 km/h, i.e. a maximum of the average over 30 minutes), which is considerable for a location at the bottom of a valley. Our backup Vantage Pro station measured a high wind maximum of 25 m/s (90 km/h; this is an instantaneous maximum, not an average over some time interval), so no wonder that quite a lot of trees went down or lost branches. The atmospheric pressure dip (average over 30 minutes) was not spectacular, about -4 hPa (mbar), but this seems sufficient to cause very strong winds. Our lightning sensor was down, which is a pity, for we failed to record a nice storm activity. Precipitation (rainfall) peaked at about 20 mm in half an hour, and the corresponding atmospheric radioactivity surge due to radon washout was 27 nSv/h.

[Figure: radioactivity and rainfall at meteoLCD, 6 July 2014]

So let's go back to our previous discussions of radon washout (here and here) and update the graph relating radioactivity peak to precipitation pulse.

[Figure: radiation peak versus rain pulse, linear and affine fits]

A linear fit forced through the origin is impossible: R² = 0! The affine fit Rad_peak = a + b*Rain_pulse gives a slope of 0.45, i.e. every mm of precipitation pulse would increase the ambient radioactivity dose-intensity through radon washout by 0.45 nSv/h. This very low value (to be compared to the 7 found in the previous discussion) points to a sort of saturation effect. Let us imagine that a rain pulse of infinite intensity (hm!) would wash out all the radioactive radon daughters contained in a surrounding volume (a column of a certain height and a certain diameter, for instance). Then a better model would be a logarithmic one, something like radon_peak = a*log(b*rain_pulse) + c, where log is the natural logarithm (a still better model would have a horizontal asymptote). This gives indeed a better picture, with a goodness of fit of R² = 0.29:

[Figure: logarithmic model fit]

In this discussion, only measurements where there is an interval of at least 3 days between rain pulses have been retained. The 8 Sep 2013 data point seems rather odd: maybe it is the outlier spoiling a nice model!

Let’s close with a very simple model using a rational function, which has a horizontal asymptote but will not be forced through the origin:

[Figure: rational-function model fit]

Hurrah: the goodness of the fit now jumps to R² = 0.33. If the rain pulse x tends to infinity, the radiation peak will reach the asymptotic value of 30.26 nSv/h. When the rain pulse is zero, we should expect a radon peak of zero as well: we are close, with 1.1 nSv/h.
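
The post does not spell out the exact rational function used; a plausible candidate with a horizontal asymptote, fitted with scipy, could look like the following sketch. The data pairs below are illustrative placeholders, not the actual meteoLCD observations:

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative (rain pulse [mm], radiation peak [nSv/h]) pairs -- placeholders,
# NOT the real meteoLCD data points.
rain = np.array([0.8, 1.2, 2.0, 2.2, 4.0, 20.0])
peak = np.array([5.0, 14.0, 20.0, 15.0, 35.0, 27.0])

def rational_model(x, a, b, c):
    # a*x/(x + b) + c: value c at x = 0, horizontal asymptote a + c for x -> inf
    # (the post finds an asymptote of ~30.3 nSv/h and ~1.1 nSv/h at zero rain).
    return a * x / (x + b) + c

popt, _ = curve_fit(rational_model, rain, peak, p0=(30.0, 1.0, 1.0))
residuals = peak - rational_model(rain, *popt)
r2 = 1.0 - np.sum(residuals**2) / np.sum((peak - peak.mean())**2)
print("a, b, c =", popt, " R2 =", round(r2, 2))
```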

Time to leave the playground, but this amusing topic will be continued…

__________________________

Additional comments (08 July 2014):

Patrick Breuskin from the Division de la Radioprotection, an occasional meteoLCD collaborator, sent me his measurements made at 3 different locations with AGS421 gamma counters (sampling interval = 10 minutes): AGS421_7235 is installed on the deck of meteoLCD, AGS421_7288 at the Findel airport and AGS421_7199 measures at Esch-Alzette. I annotated his graphs, which are shown here in the same order:

[Figures: annotated AGS421 records for Diekirch, Findel airport and Esch-Alzette, 6 July 2014]

Obviously all instruments show a radiation peak coincident with the precipitation pulse. Expressed as percentages above the previous background levels, the radiation peaks are:

Diekirch meteoLCD: 33% (background 83 nSv/h)

Diekirch AGS421: 47% (54%) (background 83 nSv/h)

Findel airport: 83% (background 113 nSv/h)

Esch-Alzette: 37% (background 132 nSv/h)

The two Diekirch values and the Esch-Alzette peak are relatively close. As the meteoLCD reading is an average over 30 minutes, one should expect lower values than from the 10-minute sampled AGS421_7235 located on the same deck. The high value of 83% at Findel airport is a bit of a surprise: this airport is located at about 360 m asl and is much more exposed to wind than the other stations. Could it be that higher wind speeds push the radiation peaks up?
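
For clarity, these percentages are simply the washout surge divided by the preceding background; e.g. for the meteoLCD reading quoted in the storm description above:

```python
def relative_peak_pct(surge_nSv_h, background_nSv_h):
    # Radiation peak expressed as a percentage above the previous background
    return 100.0 * surge_nSv_h / background_nSv_h

# Diekirch meteoLCD: the 27 nSv/h washout surge over an 83 nSv/h background
print(round(relative_peak_pct(27.0, 83.0)))   # ~33 %
```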

Here are the wind speed peaks rounded to the nearest integer:

Diekirch meteoLCD:  15 m/s

Findel airport:             13 m/s

Esch-Alzette:     not available

These wind velocities are comparable, so wind speed does not seem to influence the relative radiation peak; neither does the background level, which is highest at Esch-Alzette without yielding a higher relative radiation peak. Possibly the magnitude of the precipitation pulse is the main factor determining the intensity of the radon washout. The Findel precipitation measurements are not yet available; the closest ASTA station is that of Merl, which shows only 14.2 mm.

Well, you know the tune “We need more data!”, so let's be patient.


Radon washout (2)

April 26, 2014

In August 2013 I wrote a small comment on ambient air radioactivity peaks coincident with a sharp rainfall pulse. Several specialists like A. Kies and M. Severijnen confirmed that these observations show a washout of the radioactive daughters of the ubiquitous radon gas. This year a similar event happened on the 21st April 2014:

[Figure: radioactivity and rainfall, 21 April 2014]

Note that the 2 mm rainfall pulse triggered a radiation rise of about 20 nSv/h above the base level. One also sees that the much lower rainfall pulse (0.8 mm) which happened the next day produced only a minor radiation peak.

Several interesting questions can be asked:

1. Is there a minimum time between two rainfall pulses needed to cause strong radiation peaks?

Or in other words: how long does it take for the atmospheric aerosol content to recover after a first washout?

The observations made in August 2013 suggest that 1 day could be enough:

[Figure: radioactivity and rainfall, August 2013]

The first rain pulse (1.2 mm) causes a radiation peak of about 14 nSv/h; the second, much stronger rain pulse (2.2 mm) triggers a radiation peak of comparable amplitude: this could be a hint that the atmospheric aerosol load had not quite recovered to the previous level (which would be the baseline after 3 dry days).

A week in September 2013 gives a similar picture:

[Figure: radioactivity and rainfall, September 2013]

Here we see two strong, similar rain pulses of about 4 mm separated by about 48 hours: the first triggers a radiation peak of about 35 nSv/h, the second of only 15 nSv/h. A closer inspection shows that this second rain pulse is actually double: a first pulse of 2.8 mm followed a couple of hours later by a second of 4 mm. The radiation curve also shows these two peaks, with again the second (corresponding to a higher rainfall) lower than the first.
The conclusion is that a simple relationship of the form radiation_peak = f(rain_pulse) cannot be established without respecting this atmospheric recovery time lapse.

2. Is the radiation peak proportional to the rainfall pulse?

Using only our minuscule set of observations from 2013 and 2014, let us keep just the (rain pulse, radiation peak) points separated by a minimum of 3 dry days:

[Figure: radiation peak versus rain pulse, linear regression]

The plot shows that the slope of the regression line is 7.09, i.e. a first rainfall pulse of 1 mm causes on average a radiation peak of about 7 nSv/h. The goodness of the fit is rather poor (R² = 0.28), and this analysis cries out for more data!
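
For the record, such a regression is a two-liner with scipy; a sketch with placeholder points (the real values have to be read from the meteoLCD data files):

```python
import numpy as np
from scipy.stats import linregress

# (rain pulse [mm], radiation peak [nSv/h]) pairs separated by >= 3 dry days;
# placeholder values, not the actual observations.
rain = np.array([1.2, 2.0, 2.2, 4.0])
peak = np.array([14.0, 20.0, 15.0, 35.0])

fit = linregress(rain, peak)
print(f"slope = {fit.slope:.2f} (nSv/h)/mm, R2 = {fit.rvalue**2:.2f}")
```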


Conclusion:

An analysis of the relationship between rain pulse and radiation peak intensities must respect the time lapse needed for the recovery of the atmospheric aerosol load, possibly 3 consecutive dry days. This first short study suggests a possible linear relationship with a slope of 7 (nSv/h)/mm.

RAF revisited (total ozone column and UVI)

March 8, 2014

In April 2013 I used a period where the total ozone column (TOC) made a spectacular plunge to calculate the RAF (Radiation Amplification Factor), which tells us by how much the UVI (or biologically effective UVB) will increase when the total ozone column becomes smaller. This month (March 2014) we had a relatively low TOC of 276.5 DU on the 7th March, followed the next day by a TOC of 328.1 DU. Sky conditions, total solar irradiance and solar angle were practically the same, so that TOC is the only factor influencing the UVI. The calculation gives an RAF = 1.17, similar to the value found last year. Broadly speaking, if the TOC diminishes by 20%, the UVI increases by 20% (here -18.7% for the TOC and +18.2% for the UVI).

See also this paper from last year.

[Figure: total ozone column and erythemal UVB, 7-8 March 2014]

The values used to compute the RAF are the UVI's, which are proportional to the mMED/h given as the red curve in the graph (1000 mMED/h = 25/9 UVI, see here).
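
For readers who want to redo the computation: a common way to define the RAF is via the power law UVI ∝ TOC^(-RAF). The post does not list the two UVI values it used, and definitions vary slightly (ratio of relative changes versus power-law exponent), so the sketch below is only illustrative: the helper implements the 1000 mMED/h = 25/9 UVI conversion quoted above, and the RAF call uses the post's TOC values with a hypothetical UVI pair.

```python
import math

def uvi_from_mmed_per_hour(mmed_h):
    # meteoLCD conversion: 1000 mMED/h = 25/9 UVI
    return mmed_h * (25.0 / 9.0) / 1000.0

def raf_power_law(toc_1, toc_2, uvi_1, uvi_2):
    # Assumes UVI proportional to TOC**(-RAF); for changes as large as ~20%
    # this differs slightly from the simple ratio of relative changes.
    return -math.log(uvi_2 / uvi_1) / math.log(toc_2 / toc_1)

# TOC values from the post (7th and 8th March 2014); the UVI pair is hypothetical:
print(raf_power_law(276.5, 328.1,
                    uvi_from_mmed_per_hour(1500), uvi_from_mmed_per_hour(1250)))
```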

2013 in review

December 31, 2013

The WordPress.com stats helper monkeys prepared a 2013 annual report for this blog.

Here’s an excerpt:

A New York City subway train holds 1,200 people. This blog was viewed about 4,200 times in 2013. If it were a NYC subway train, it would take about 4 trips to carry that many people.

Click here to see the complete report.

The winters they are cooling!

November 25, 2013

The EU wants to spend 20% of its budget to avoid “dangerous global warming” (now called “climate change”), but it seems unaware that winters in the EU zone have been steadily cooling for quite a long time. For a couple of years I have pointed to our meteoLCD data showing this trend, and our latest trend graph for the period 1992 to 2012 shows this:

[Figure: DJF winter air temperature trends, meteoLCD 1992-2012]

The overall trend for Diekirch is -0.42 °C/decade (that of our national meteo station at Findel airport is even -0.67 °C/decade). Ed Caryl has a comment at the NoTricksZone blog of Pierre Gosselin, where he examines winter trends from the GISS data. The global figure shows this:

[Figure: GISS winter temperature trends 1995-2012 (annotated)]

Luxembourg belongs to the light-blue region, which means that the trend for the 18-year period 1995-2012 is between -0.5 and -1.0 °C, which gives a decadal trend of -0.28 to -0.56 °C/decade (with a mean value of -0.42 °C/decade, really close to the meteoLCD trend).
The analysis of the Diekirch data shows that the winter temperatures correlate very well with the NAO index of the December to March months. The coming years will tell us if this not so surprising correlation remains stable.
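
The conversion from the map's 18-year changes to decadal trends is straightforward:

```python
years = 18                          # the 1995-2012 period of the GISS map
for change in (-0.5, -0.75, -1.0):  # degC over the whole period
    print(f"{change:+.2f} degC / {years} yr = {change / years * 10:+.2f} degC/decade")
```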

It astonishes me that absolutely NONE of our climate-anxious politicians and NGO's seems to recognize this situation, which is an observable fact and not a prediction of some climate model ensemble. Colder winters will be problematic, as they increase the energy needed for heating, and collide with the ambition to continuously lower energy usage.

The Austrian meteorologist Dominik Jung found the same winter cooling throughout the Alps, which should comfort the managers of the various ski resorts who have continuously been told by the climate alarmists that ski resorts have no future due to climate warming.

