Welcome to the meteoLCD blog

September 28, 2008

This blog started on 28 September 2008 as a quick communication tool for exchanging information, comments, thoughts, admonitions, compliments etc. related to http://meteo.lcd.lu , the Global Warming Sceptic pages and environmental policy subjects.

Subsidies for renewable and classic electricity

November 22, 2015

In the debate about wind and solar electricity, the amount of subsidies granted by political decision is a hot topic. Usually the pushers of this type of electricity (correctly) insist that non-renewable electricity production is also subsidized, and that objecting to subsidies for renewables is therefore a moot point.

In this blog I will use concrete data available from the EIA, as well as a few illustrations from the “At the crossroads: Climate and Energy” meeting organized by the Texas Public Policy Foundation. All data refer to the USA, 2013.

Let us start with the EIA table which gives the different subsidies for the fiscal year 2013, in millions of (2013) US$.


The “classical” electricity production from coal, natural gas, petroleum and nuclear receives 5081 million $, the renewables 15043 million $.

The next slide of a presentation by James M. Taylor from the Heartland Institute gives a good textual overview:

The last point is especially instructive: coal, NG, petroleum and nuclear receive 5081 million $ in subsidies but produce 86.6% of the total electricity; wind and solar receive 11264 million $ in subsidies and produce 4.4%! This means that wind & solar combined receive 43.6 times more subsidy per unit of electricity produced than the traditional producers!
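That factor of 43.6 can be verified directly from the EIA numbers; the absolute generation totals cancel out, so only the subsidy amounts and the generation shares are needed. A quick Python check:

```python
# Subsidy per unit of electricity: wind & solar vs. classical producers.
# Figures from the EIA table quoted above (fiscal year 2013).
subsidy_classic = 5081.0  # million $ for coal, NG, petroleum and nuclear
share_classic = 86.6      # % of total US electricity production
subsidy_ws = 11264.0      # million $ for wind and solar
share_ws = 4.4            # % of total US electricity production

# The absolute TWh cancel out, so the shares are sufficient.
ratio = (subsidy_ws / share_ws) / (subsidy_classic / share_classic)
print(f"Wind & solar get {ratio:.1f} times more subsidy per MWh")  # -> 43.6
```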

If you picture the solar subsidies per 1000 MWh produced next to those of the traditional producers, you get this telling pie-diagram (by Taylor):

Per unit of electrical energy, the subsidies to traditional producers become nearly invisible compared to those for solar electricity!

The official EIA numbers in the table should close the debate: yes, renewables get really high subsidies for their very low overall contribution! The situation in Europe is probably similar to that in the USA, with renewable subsidies possibly even higher.

German PV producers: you will pay more for self-used solar electricity!

November 2, 2015

The Bundesnetzagentur (BNA) has published a guide to explain the future taxes that must be paid by individual producers and consumers of electrical energy. This provisional “Leitfaden zur Eigenversorgung” is devilishly complicated, as many combinations of production, consumption and storage are possible.

1. The big principle

In the future every consumer will have to pay an EEG tax, even if he consumes the electricity that he produces with his own roof-mounted PV panels. Until now he paid about 20 (Euro) cents per kWh for his self-produced electricity, to be compared with the about 30 cents that the ordinary consumer has to pay. This will change in the future, in three increasing steps: up to the end of 2015, during 2016, and from 2017 on. The following diagram gives the situation:

The red circle shows that (starting 2017) the producers of solar, wind or biomass electricity (EE) and those of highly efficient combined heat and power installations (KWK) will only have to pay 40% of the EEG tax; all others pay the full amount (100%).
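As a back-of-envelope illustration of what the 40% rule means in cents per kWh, here is a minimal Python sketch; the EEG surcharge of about 6.17 ct/kWh (the 2015 level) and the 20 ct/kWh generation cost quoted above are assumptions for the example, not numbers from the Leitfaden:

```python
# Rough cost of self-consumed PV electricity under different EEG tax shares.
# ASSUMPTIONS: EEG surcharge ~6.17 ct/kWh (2015 level); own generation ~20 ct/kWh.
EEG_SURCHARGE = 6.17  # euro-cent per kWh
OWN_COST = 20.0       # euro-cent per kWh for self-produced PV electricity

for share in (0.0, 0.40, 1.00):  # exempt / renewable self-consumer / full rate
    total = OWN_COST + share * EEG_SURCHARGE
    print(f"EEG share {share:4.0%}: {total:5.2f} ct/kWh")
# EEG share   0%: 20.00 ct/kWh
# EEG share  40%: 22.47 ct/kWh
# EEG share 100%: 26.17 ct/kWh
```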

2. The difficult problem of electricity storage

The new “Leitfaden” considers electricity storage installations as both consumers and producers: as consumers they must pay the tax, and as producers they must ask their clients to do so.

Let us just consider the situation where a PV owner uses his own solar electricity and adds battery storage to smooth out the variable production:


Normally he would have to pay twice: first to store, and then to use! The Leitfaden remarks that this would be rather idiotic, so one of the two taxes vanishes. The 100% amount changes to 40% if the electricity is produced from renewables; this is the minimum amount that must always be paid!

3. The zero-tax exceptions

A couple of exceptions exist: the first is the “island” situation shown in the next figure:


Here you have a consumer whose house is completely cut off from the grid, and must never (even for 15 minutes!) be connected back to that grid. His electricity is exclusively renewable, and he may not top it up by other means (such as a diesel generator) when the quantity is insufficient. This consumer will pay no tax. But remember: a single kWh coming from outside during the year means he will be put back at the 40% tax level for the full year!

If this user has a one-way connection to the grid, so that he may deliver any excess renewable electricity to the grid (but never draw from it the other way round!) without asking for the usual renewable subsidy, the zero tax also applies.

Very small renewable installations (de-minimis installations), with less than 10 kW capacity and less than 10 MWh annual energy production, will also be exempt from the tax.
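The de-minimis exemption boils down to a simple two-condition threshold test; a minimal Python sketch (the helper name and structure are mine, purely for illustration):

```python
def is_de_minimis(capacity_kw: float, annual_mwh: float) -> bool:
    """De-minimis exemption: less than 10 kW capacity AND
    less than 10 MWh of energy produced per year."""
    return capacity_kw < 10.0 and annual_mwh < 10.0

print(is_de_minimis(5.0, 4.5))   # True  -> exempt from the EEG tax
print(is_de_minimis(12.0, 9.0))  # False -> capacity threshold exceeded
```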

4. Storage

The only storage system that will be treated lightly is the pass-through storage system: the owner takes electricity from the grid, stores it, and delivers the same quantity back later on. Actually he puts his storage batteries in the hands of an external provider who manages storage and retrieval. This owner will not have to pay the tax on the electricity that transits through his batteries (difficult to see it otherwise!).

5. Conclusion

As said at the start, the “Leitfaden” shows a devilishly difficult future world, and one where tricks and poor morality will blossom, as control and management will also be far from easy (even with smart meters). I guess many PV owners will not take it lightly that they will have to pay higher taxes on the home-grown electricity they consume!

The never vanishing ozone hole

October 27, 2015

30 October 2015: added link to paper (see at the end)
I just read a short article by Steve Goreham titled “Did we really save the ozone layer?” (link).

After the Montreal Protocol most nations phased out their use of ozone depleting substances (ODS, like freon), and for about 20 years there have been practically no human emissions left. Remember that these evil fluorocarbon molecules are seen as the culprits causing the yearly thinning of the ozone layer above Antarctica (the Ozone Hole). During October and November the total ozone column may plunge to 100 DU in that hole.

Has this phasing out of ODS fostered a closing of the ozone hole? Not really, as shown by this graph:

Since 1996, the area of the ozone hole has remained more or less at 20 to 25 million km2, while the ODS consumption and emission (red curve) fell to zero. Could it be that the whole theory (which gave Molina and Rowland their 1995 Nobel Prize) is either bogus or at least incomplete? The ozone destroying chemical reactions found by the two Nobelists certainly exist; there remains the nagging suspicion that other, possibly more important ozone munching phenomena than the human ODS emissions might be at work.

How inconvenient that Nature so often refuses to obey our glorious models and the political decisions based on them!


30 October 2015:

See this paper by Gribble (paywalled):

The diversity of naturally produced organohalogens.

“More than 3800 organohalogen compounds, mainly containing chlorine or bromine but a few with iodine and fluorine, are produced by living organisms or are formed during natural abiogenic processes, such as volcanoes, forest fires, and other geothermal processes. The oceans are the single largest source of biogenic organohalogens, which are biosynthesized by myriad seaweeds, sponges, corals, tunicates, bacteria, and other marine life. Terrestrial plants, fungi, lichen, bacteria, insects, some higher animals, and even humans also account for a diverse collection of organohalogens.”

Ozone: change is the norm!

October 26, 2015

In the discussions about climate change, one often gets the impression that the media and the politicians aim for a stable climate (actually they mean “weather”), something like a paradise-like smooth state without any nasty, big and extreme variations and perturbations. The chaotic nature of our atmosphere should tell them that this can never be the case, and that battling to stop climate change is and will remain futile. Here I will write some words on the variability of the total ozone column (TOC). The atmosphere has a variable concentration of ozone (O3), that delicate gas that comes and goes according to solar irradiance, temperature and the presence of precursor gases.

1. Ground ozone.

We find ozone everywhere, but two regions are the most important: the ground layer where we live and breathe can have O3 concentrations that climb up to 200 ug/m3 (about 100 ppb) during good (pre-)summer days. Many factors impact this concentration, the most important being (besides temperature) the availability of UVB radiation and precursor gases. Here the most important of these are the natural isoprenes emitted by trees and plants, and the NO2 which mostly comes from traffic-related emissions (other gases, like industrial VOCs, have a smaller impact). As vehicles also emit NO, which destroys ozone, we see very different nightly profiles in clean-air rural and high-traffic urban places. Look at the following picture, which gives the ground ozone levels measured at Bonnevoie (= Luxembourg-City) and Diekirch (= semi-rural) during the week ending the 26th October 2015.


The blue rectangle shows the situation during the night at the start of the 22nd October 2015: in the same interval of about 6 hours, the ozone concentration at the city location (top) diminishes by 27 ug/m3 (about 13.5 ppb), whereas at Diekirch, with little nightly traffic, the fall is only 8 ug/m3 (4 ppb). The rapid decrease in Luxembourg-City is due to the emitted NO, which destroys the existing ground ozone; this removal is much slower in Diekirch, where there is not much nocturnally emitted NO!

So looking at a daily ozone pattern tells you immediately whether the location is urban or rural.
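The ug/m3-to-ppb conversions quoted above follow from the molar volume of air; a minimal Python sketch (assuming a molar volume of 24.45 L/mol, i.e. 25°C and 1013 hPa, and the O3 molar mass of 48 g/mol):

```python
# Convert an O3 mass concentration (ug/m3) into a volume mixing ratio (ppb).
# ASSUMPTION: molar volume 24.45 L/mol (25 degC, 1013 hPa); M(O3) = 48 g/mol.
def o3_ugm3_to_ppb(ug_m3: float) -> float:
    return ug_m3 * 24.45 / 48.0

print(round(o3_ugm3_to_ppb(27.0), 1))  # ~13.8 ppb (the urban nightly drop)
print(round(o3_ugm3_to_ppb(8.0), 1))   # ~4.1 ppb (the rural nightly drop)
```

The text rounds these values to 13.5 and 4 ppb, i.e. it uses the common rule of thumb of 0.5 ppb per ug/m3.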

Every May/June, when ground ozone levels are on the rise, the Luxembourg environmental agency issues warnings, as it wrongly and stubbornly takes as a reference the O3 levels at the Mont Saint Nicolas near Vianden, a very rural location with only minimal traffic but a very rich tree cover. The natural isoprenes, together with clear and non-polluted air (i.e. rich UVB irradiance), make the ozone levels at this location the highest in Luxembourg. This has nothing to do with noxious human activities, but is an entirely natural phenomenon.

As this blog is about variability, just look at the extreme swings in the ground ozone concentration: the O3 levels are never constant, but vary from close to zero in the morning to their late-afternoon peak.

2. Total ozone column and UVB  irradiance.

The major part of the ozone is located in the stratosphere, between 15 and 50 km altitude, with a maximum around 25 km; the concentration there is about 6 times higher than at ground level. This “good ozone” layer absorbs the short-wave and dangerous UV-C radiation (with wavelengths below 280 nm) completely, and also part of the UVB radiation (280 to 320 nm). The total ozone column is measured in Dobson Units (DU): if all the ozone contained in a vertical column were compressed to normal atmospheric pressure, the height of that column would be about 3 mm or 300 DU. Usual numbers in our region vary from 250 to over 400 DU. About 10 to 20% corresponds to ozone in the ground layer (the “bad ozone”); the major part is stratospheric ozone (the “good ozone”).

The influence of the thickness of the ozone layer on the UVB irradiance can be shown from our measurements at meteoLCD (see this paper). The next figure from the cited paper documents that a thinning ozone layer will increase ground UVB irradiance.


On 22 April 2013 the TOC (total ozone column) was 381.8 DU and the effective UVB irradiance about 1.5 MED (MED = minimal erythemal dose); the next day, with the same meteorological conditions, the TOC fell to 265.8 DU and the effective UVB irradiance increased by 0.68 MED, about 2 UVI (UV index). A dip of 100 DU would thus correspond to an increase of about 1.7 UVI.
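The 1.7 UVI per 100 DU sensitivity is just a linear interpolation between those two days; a quick Python check:

```python
# UVB sensitivity to the total ozone column, from the two meteoLCD days above.
toc_day1, toc_day2 = 381.8, 265.8  # Dobson Units
uvi_increase = 2.0                 # observed UVI increase (about 0.68 MED)

sensitivity = uvi_increase / (toc_day1 - toc_day2) * 100.0
print(f"UVI increase per 100 DU thinning: {sensitivity:.1f}")  # -> 1.7
```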

3. The extremely variable total ozone column

Many factors influence the thickness of the total ozone column, which often varies in a spectacular manner. Look at the next figure, which shows the TOC measured at Uccle (near Brussels, Belgium) for this year 2015:

The red line represents the DU readings for this year 2015 up to the 25th October; the grey part of the plot shows the readings from last year. Uccle has one of the longest DU series in Europe, starting in 1979. The sine-wave represents the average of all measurements from 1979 to today. Clearly the TOC is highest in spring and lowest in autumn. The next 3D diagram shows this in a more beautiful way for the global region between 45° and 50° latitude North.


So we have a smooth average sine-curve over the year, but the actual measurements present a totally different pattern. Look at the extreme variations in the Uccle plot, where the ozone column can plunge from a 500 DU peak to a 300 DU low in a few days; often peaks and troughs follow each other within a very short time, a day or even less.


These are the measurements at meteoLCD for 2015, up to the 26th October. Look at what happened around the 10th April: in two days the thickness of the ozone column increased from 300 to 450 DU, and fell back to 363 DU the next day.

What all these measurements show is that our atmosphere is a very dynamic beast; change is the norm, and no change the exception! I remember that in the past, when the TOC fell rapidly, the media were quick to publish alarmist articles about a vanishing ozone layer (evidently caused by human activity!) and our eradication by skin cancer. Had the authors waited a couple of days, and had they not been ignoramuses of natural variation, these silly articles would not have been written.

There is no cause for alarm, as the total ozone layer has not been thinning for many years. The last figure shows the trend from our meteoLCD measurements (meteoLCD is still the only station measuring the TOC in Luxembourg).


The general trend from 1998 to 2014 is positive, and that of the last 14 years practically flat. So no cause for alarm here!

4. Conclusion

The measurements of ground ozone and of the total ozone thickness document an extremely variable situation. The ozone concentration is not evenly spread out, there is no well-mixed situation, but a breathtaking variability. The atmosphere is a turbulent beast, not a smooth pudding!

PS: Do not forget to look from time to time at our ozone data by clicking on the “DOBSON (total O3)” link at http://meteo.lcd.lu

OECD climate report: a big swindle concerning Luxembourg ?

October 20, 2015


Many media write in ecstasy that this new report warns that the CO2 mitigation schemes of most countries do not suffice to keep global warming below 2°C in 2100! Actually the main point of the report is that if all countries do not reduce their GHG emissions by 40-70% below their 2010 levels, and by at least 100% by 2100, this 2°C target will be missed. In this blog I will not muse on the absurdity of this 2°C level, which is nearly complete guesswork derived from non-verified climate models. I will just take up some items relating to Luxembourg, and will show that this glossy report gives numbers concerning Luxembourg that come close to swindle.

1. Carbon intensity of electricity generation

Page 79 contains a graph showing the carbon intensity in g CO2 per kWh electricity produced for many countries: here Luxembourg’s CO2 intensity goes through the roof:


The figure tells us that Luxembourg produced its electricity in 1990 emitting 2552 g CO2 per kWh, which would have been a cosmic record! In 1990, Luxembourg consumed about 1250 GWh (here), of which about 1000 GWh from coal. In 2013, the ratio of total thermal electricity consumption to in-country thermal production was 2.7. This suggests an indigenous Luxembourg thermal production of 1000/2.7 ≈ 370 GWh. There is absolutely no reason which would explain the abysmally low efficiency of Luxembourg's own power plants in producing this tiny amount of electricity. This statistic is clearly nonsensical, and it is very telling about the care the authors took in checking their numbers.

The recent data are fully available at the web site of the ILR (Institut Luxembourgeois de Régulation). I use the data for 2013, which can be summarized as follows:

Global electricity production = 1838 GWh

Production of thermal origin = 1575 GWh

Included in this category are the production from co-generation (“Blockheizkraftwerke”), from burning wood, bio-gas and landfill gas, and the production of the TwinErg combined-cycle gas turbine plant.
The balance of 263 GWh comes from wind, solar and hydro production.

Now let us assume that the thermal generation emits 500 g CO2 per kWh, which is about the average for natural gas power plants; notice that we include all bio-gas and wood burners here, although at least the latter could be considered carbon neutral.

The 1575 GWh then correspond to 7.88*10^11 g CO2 emitted in 2013. Dividing this number by 1838*10^6 kWh gives us a (maximum!) CO2 intensity of 429 g/kWh, well in line with the grey bar showing about 350 g/kWh.
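Restated as a short Python sketch:

```python
# Upper-bound CO2 intensity of Luxembourg's 2013 electricity mix (ILR data).
thermal_gwh = 1575.0   # thermal production (co-generation, wood, bio-gas, TwinErg)
total_gwh = 1838.0     # total production, incl. 263 GWh wind/solar/hydro
g_per_kwh_th = 500.0   # assumed emission factor of the thermal mix (g CO2/kWh)

total_co2_g = thermal_gwh * 1e6 * g_per_kwh_th  # GWh -> kWh, then grams of CO2
intensity = total_co2_g / (total_gwh * 1e6)     # g CO2 per kWh of total production
print(f"{intensity:.0f} g CO2/kWh")             # -> 428, i.e. the ~429 quoted above
```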

The graph should have omitted the 1990 bar for Luxembourg; that it found its way into the figure shows that these graphs are milked from databases without any intelligent thinking.

2. Emissions per sector

On page 21 we find this figure:

Look at the distance between the two red lines: Luxembourg's emissions from transport are close to 55% of the total! This is a world maximum! Do the Luxembourg people drive like crazy 24 hours a day on their 2967 km total length of roads (including 147 km of motorways)? Or do they drive gas guzzlers that use several hundred liters per 100 km? Definitely not; the authors of the OECD report forgot to mention that close to 80% of all fuel sold at Luxembourg's pump stations goes into foreign cars, and so does not stay in Luxembourg and cannot be counted as internal Luxembourg consumption. The real share of the transport sector emissions inside Luxembourg is closer to 55*0.2 = 11%, similar to what can be found in Germany or Belgium. All statistics ignoring this basic fact are swindle, as are the monstrous 20.9 tons of CO2 per capita emissions for Luxembourg computed using these wrong numbers and found in many databases (see here). A calculation method that is OK for large countries, where fuel export is a tiny part of the total volume sold at the pumps, cannot and must not be used for small countries where the major part of the pumped fuel goes into foreign car tanks.
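The back-of-envelope correction above, as a Python one-liner using the two quoted figures:

```python
# Correct Luxembourg's transport emission share for "fuel tourism".
transport_share = 0.55    # OECD figure: transport = 55% of national emissions
domestic_fraction = 0.20  # only ~20% of pump fuel is burned inside the country

corrected = transport_share * domestic_fraction
print(f"Domestic transport share: {corrected:.0%}")  # -> 11%
```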

3. CO2 efficiency

Now let us conclude with a more positive remark. Figure 1.2 shows the trend in GHG emissions versus GDP:

During the 23-year interval 1990-2012, Luxembourg's efficiency increased dramatically. The GHG emissions per unit of GDP fell by close to 60%, bettering Germany and most other European countries. Alas, that the Russian Federation is the champion on this graph makes me think twice, damping the good patriotic vibrations.

4. Conclusion

This OECD report makes for pleasant reading, but leaves one with the impression of a quick copy/paste from various databases and a disturbing absence of critical thinking. The IPCC gospel is taken as divine truth, and finally the laments reported in the media are not much more than the result of a primitive extrapolation of current CO2 emissions. One more of these sloppy reports that beat the drums in the run-up to COP21!



03 Nov 2015:

Just as another illustration of how Luxembourg is seen in UNEP statistics (graph from UNEP): the red arrow points to Luxembourg, and the size of the disk is proportional to the total CO2 emissions:


Taking the plane? Say hello to radiation!

October 15, 2015

Radiation is one of the natural phenomena that many people are afraid of: we do not see these mysterious nuclear rays, and rarely do the media talk about radiation without pushing the scare level to the max. But radiation is “natural”: the whole universe, our planet and ourselves constantly bathe in a continuous flow of charged particles, energetic photons, fast neutrons and mysterious neutrinos, which zip through us every second.

1. Cosmic and solar rays

Galactic cosmic rays originate from outside our solar system; they usually interact with nitrogen and oxygen in the atmosphere and produce a shower of secondary particles (electrons, muons etc.). Solar rays represent the “solar wind”, mainly formed of protons ejected by the huge solar fusion reactor. The solar wind is more intense when a “magnetic hole” (a coronal hole) opens at the sun's surface. The charged protons are then free to be ejected into space; if the hole is directed towards the Earth, a more or less vigorous stream hits the atmosphere. Such a situation is happening now, as shown by this picture taken from the excellent web-site spaceweather.com (the next figures are all from this site).


The turquoise circle shows a coronal hole; the white lines and arrows indicate the magnetic field lines which usually trap the protons. The number of these solar protons is continuously monitored by the geo-stationary GOES satellites and the ACE satellite positioned at the Lagrangian L1 point between Earth and Sun. Here are the data for today, 15 October 2015:

As you can see, 2.4 protons per cm3 is not negligible: as your body volume is approx. 75 liters (= 75000 cm3), this means that at this point of observation some 180000 solar-wind protons occupy your body volume at any moment!
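The arithmetic behind that number, in Python:

```python
# Number of solar-wind protons inside a human body volume at the measured density.
density_per_cm3 = 2.4       # protons per cm3 (GOES/ACE reading, 15 October 2015)
body_volume_cm3 = 75_000.0  # approx. 75 liters

print(f"{density_per_cm3 * body_volume_cm3:.0f} protons")  # -> 180000
```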

One understands that this type of radiation (and especially its rapid surges) can pose serious problems to astronauts on board the ISS or on future planetary travels.

2. Transatlantic travel by plane

It seems obvious that cosmic radiation and solar proton intensity increase with altitude, as the filtering air layer becomes thinner and thinner. Here is a graph showing the relative increase in radiation dose with altitude; normal transatlantic flight is at about 40000 feet.

This plot, made from real measurements, shows that the dose rate at that altitude is about 50 times higher than at sea level. The next plot shows comparable data, this time in the usual unit for dose rate (nSv/h):

Here in Diekirch we measure a background dose rate of about 80 to 84 nSv/h:


A transatlantic flight altitude would thus correspond to about a 26-fold increase.

Now suppose you are a pilot or a steward(ess) and make 10 trips per month, which amounts to about 10*2*8 = 160 hours (assuming 8 hours at high altitude for one flight). With a working year of 10 months this would amount to a supplementary radiation dose of 10*160*2200/1000000 = 3.52 mSv/year (the division by 1000000 transforms nSv into mSv). The Health Physics Society gives slightly lower exposures, for instance 2.19 mSv/y.
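The same computation in Python (the 2200 nSv/h dose rate at cruise altitude is read off the plot above):

```python
# Supplementary annual dose for flight crew at transatlantic cruise altitude.
trips_per_month = 10
hours_per_trip = 2 * 8    # out and back, ~8 h each at high altitude
months_per_year = 10
dose_rate_nsv_h = 2200.0  # nSv/h at about 40000 feet

hours_per_year = trips_per_month * hours_per_trip * months_per_year  # 1600 h
dose_msv = hours_per_year * dose_rate_nsv_h / 1e6  # nSv -> mSv
print(f"{dose_msv:.2f} mSv/year")                  # -> 3.52
```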

3. Conclusion

Even if the dose rates at high altitudes seem impressive, the supplementary dose from one transatlantic trip is tiny. The usual background dose at many locations on Earth is about 3-4 mSv/y (with some outliers going up to 260 mSv/y, as in Ramsar, Iran), so even pilots and flight crews do not accumulate a dangerous radiation dose. The not-so-frequent flyer, be it for tourism or business, shouldn't be scared. But if you intend to spend your next vacation on the Moon or on Mars, things will be different!

Wood and pellets: a “burning” fine particulate problem.

September 26, 2015

28 Oct 2015: new link added at the end of the blog

The heating season is about to start here in Luxembourg. I heat my home with an oil-fired central heating; one of my neighbors only burns wood (in cords). The quantities of wood he uses are breathtaking, but he probably chose wood burning as “climate friendly”. Indeed, the carbon dioxide released has been gobbled up from the atmosphere by the tree during its 30 to 60 year life, and returning it to the air will, at least at first glance, be “carbon neutral”. On second thought, the problem is more complicated: his CO2 release is a spike that would not have occurred if the wood had been left to rot (that release would have been spread over many years), so that at least over a short period there is not much gain in switching, say, from gas to wood.

There has been much talk in recent years about the dangers of fine particles (the smaller than 2.5 micron PM2.5), be they released by Diesel engines or other energy providers. In Europe all new Diesels have particulate filters, which should solve this problem. Burning wood is a very big PM2.5 emitter, and I will discuss this in the next chapters.

1. CO2 emissions

This figure shows that the CO2 emissions per kWh of energy from burning wood are about double those from natural gas. So if a state were to install a CO2 emission measuring system (perhaps using a satellite like OCO-2), wood burners would be in a delicate position.

If we look at the composition of the exhaust, we have this:


There is a 1% by weight emission of NOx, which is about the same as for gas or oil; there are non-negligible VOC (volatile organic compound) and particle emissions. Clearly the exhaust from a wood stove is very different from clean air!

2. The PM problem

Fine particles are the crux of burning wood:

This figure from the www.treehugger.com web site shows the tremendous difference between an uncertified wood-stove and a usual gas furnace. Things become much better if you use a pellet system, but the emissions nevertheless remain 162 times higher than gas (an uncertified wood-stove emits 1464 times more than gas!).

Now very often the discussion on particulate emissions also puts the blame on agriculture. But the picture is quite different, as agriculture does not emit the same percentage of very small particles (the PM2.5), which are thought to be the most dangerous, being able to transit to the lungs, the heart and even the brain. The next figure compares the two emission sources:


3. Hourly emissions

The next figure shows the emissions of particles in g/h for burning oak (the 3rd most frequent wood in Luxembourg): note that the emissions of the larger PM10 are about the same as those of the dangerous PM2.5; fire logs (possibly wax-wood mixtures) have about 4 times lower emissions.


These numbers apply to a burning rate of about 3 kg of dry wood per hour (see here).

4. Conclusion

If you burn wood (cord or pellets), your environmental impact may not be what you intend: your immediate CO2 emissions are comparable to those of other fossil fuels, and your particulate pollution is much, much worse! That is the reason why, for instance, the city of Paris forbids burning wood in open fires. Switching to gas (or nuclear-powered electrical heating!) would be more environmentally friendly.



28 Oct 2015: see also this article by Scilogs: “Unterschätzte Gesundheitsgefahr durch Holzrauch“.

Electricity generation: very different capacity factors!

September 21, 2015

The US Energy Information Administration (EIA) has an interesting post on the huge differences in electricity generation efficiency, or more precisely in capacity factors, between countries and types of generation.

1. Definition of the capacity factor and the “Volllaststunden”.

Let me recall that the capacity factor is simply the yearly energy produced divided by the hypothetical maximum which would have been produced if the generator had functioned 8760 hours at its name-plate capacity. An example: suppose a wind turbine has a name-plate capacity of 2.5 MW; if it delivered this power during the whole year (which is clearly impossible!), it would produce an energy of 2.5*8760 = 21900 MWh. Now suppose its real production has been only 4380 MWh. The capacity factor then is:

CF = 4380/21900 = 0.20, which is often given as a percentage by multiplying by 100, i.e. here CF% = 20%.

In Germany one mostly uses the term “Volllaststunden” (yes, you may write this with 3 letters l). In our example VS = (CF%)*8760/100, i.e. 1752 hours.
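Both quantities in a minimal Python sketch:

```python
# Capacity factor (CF) and German "Volllaststunden" (VS) for the wind example.
nameplate_mw = 2.5
produced_mwh = 4380.0
HOURS_PER_YEAR = 8760

cf = produced_mwh / (nameplate_mw * HOURS_PER_YEAR)  # 4380 / 21900
vs = cf * HOURS_PER_YEAR                             # full-load hours

print(f"CF = {cf:.2f} ({cf:.0%}), Volllaststunden = {vs:.0f} h")
# CF = 0.20 (20%), Volllaststunden = 1752 h
```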

2. Capacity factors vary with type of electricity generation and country

The EIA report has several interesting statistics which give the capacity factors of different countries and regions for the period 2008 to 2012. I modified the first table by discarding 4 countries or regions: Russia, because of its quasi-nonexistent wind/solar production; Japan, because of the shutdown of all its nuclear reactors after the Fukushima accident; the Middle East, because it has only negligible nuclear electricity production; and Australia/New Zealand for the same reason. That leaves 12 countries or regions with the following statistics:


The vertical red lines give the average capacity factors of the different types of production: nuclear is the absolute champion with 79.8%, fossil and hydro are close at 45.9% and 41.9%, and solar/wind come out very low at 21.9%. If we call “renewables” the last two categories, clearly hydro is the only one delivering acceptable capacity factors. Now let us separate the last category into solar and wind. This time one can keep 13 countries or regions, omitting only Brazil, Central/South America and Russia for not having (or not having communicated) any solar production.


Solar comes out at an abysmally low CF of 11.9%, whereas wind practically doubles that with CF = 23.6%.

Our first conclusion comes as no surprise: nuclear really shines when it comes to availability and stability; both solar and wind cannot (at least for the moment) deliver a reliable electricity production!

3. Why the big differences?

In most countries, solar and wind have absolute priority in delivering their electricity into the grid, penalizing the non-renewables, which must be turned down to adapt production to demand. If that political decision did not exist and free market rules applied, solar and wind production would have still lower capacity factors. Regarding hydro, one clearly sees that countries like Canada and Brazil have a clear advantage in their enormous hydro potential, which may however have reached its peak for many reasons. The OECD hydro electricity production is practically at its maximum, so its CF of about 40% will be impossible to increase in the future.

Fossil producers suffer the most from the prioritizing of solar and wind: nuclear facilities are difficult to ramp down or up rapidly, but gas turbines (and even some of the latest coal power stations) can do this quite easily, and so are often used to deliver peak load. Often a certain type of generation is put on hold for commercial reasons, so the capacity factors must be taken with a grain of salt: they reflect not only technical deficiencies or, for instance, a lower wind resource, but also ramp-down/up decisions taken at the big electricity exchanges (such as the EEX in Leipzig) for monetary reasons.

Now the 100 billion dollar question: if you want carbon free electricity, which type of generation would you choose?


added 14 Oct 2015:

Read also this comment on declining wind capacity factors on the US West Coast

Cosmic Theories, Greenhouse Gases, Global Warming

August 27, 2015

Antero Ollila from Aalto University (Finland) has published in the Journal of Earth Sciences and Geotechnical Engineering a very interesting paper titled “Cosmic Theories and Greenhouse Gases as Explanations of Global Warming” (link to PDF). His study concludes that “the greenhouse gases cannot explain the up's and down's of the Earth's temperature trend since 1750 and the temperature pause since 1998”. I will comment briefly on this rather easy-to-read paper, which alas would have benefited from more thorough proof-reading, as there are quite a few spelling errors and/or typos.

1. IPCC and competing theories.

The IPCC concludes in its ARs that practically all observed warming since the start of the industrial age comes from human emissions of greenhouse gases; the cause of GW (global warming) clearly lies inside the Earth/atmosphere system. Competing theories see (possibly exclusively) outside causes at work: solar irradiance, galactic cosmic rays (GCR), space dust, planetary positions… As the temperatures calculated by the IPCC climate models (or better, the mean of numerous GCMs) now deviate markedly from observations, Ollila writes that the “dependence of the surface temperature solely on the GH gas concentration is not any more justified”.

In this figure (fig. 1 of the paper) the blue dots represent the temperature anomaly calculated using the IPCC climate sensitivity parameter, and the blue line the CO2 induced warming postulated by the Myhre et al. paper. The red wiggly curve shows the observed temperatures (temperature anomalies): the huge difference from the IPCC dot in 2010 is eye-watering!

2. The outside, cosmic  models.

Ollila studies 4 cosmic models (which he blends into 3 combinations): variations of TSI and the solar magnetic field, GCR, space dust, and the astronomical harmonics proposed by Nicola Scafetta. What many of these causes have in common is that they could influence cloud coverage: the variation of the cloud percentage is the elephant in the room! A one percent variation in cloud cover is assumed to cause a 0.1°C temperature change. Satellites show that cloud coverage has varied by up to 6% since 1983, which would explain a 0.6°C warming.

Combining space dust, solar variations and greenhouse gases together, he finds the following figure, extending to 2050 (fig.8 of the paper):

Here the red dot shows the average warming in 2010 given by the mean of 102 IPCC climate models; the black curve represents Ollila's calculation. This figure shows, as many other authors also predict, a (slight) cooling up to 2020, and then a 30-year period of practically no warming.

In another try, Ollila left out the putative influence of the increasing GH gas concentration. His justification is the famous papers by Dr. Ferenc Miskolczi, a former NASA physicist, in which this author proposes the theory that the impact of an increase in anthropogenic greenhouse gases will be cancelled out by a drying of the atmosphere (i.e. a decrease of the absolute water vapour content). Miskolczi is able to reconstruct the past temperature variations beautifully, so this “outlandish” theory of a saturated greenhouse effect should not simply be discarded or ignored (read comments here and here).

This gives the following figure (fig.9 in the paper), with the black curve corresponding to the output of the calculations including only the SDI (star dust index) and TSI (total solar irradiance).


Now look at this: Ollila's prediction of a coming, longer-lasting cooling period is nearly identical to the predictions based on the current (and next) very weak solar cycles!!!

3. The crucial role of water vapour

This whole paper stresses again and again the importance of getting the water vapour content of a future climate right: the IPCC still assumes a constant relative humidity, i.e. an increasing water content with rising temperatures, and as a consequence a positive feedback on the CO2 induced warming. Observations show that this has not been the case: the total water content of the atmosphere has not increased, as shown on this graph from http://www.climate4you.com (upper blue curve):


4. Conclusion

This is a paper I urge you to read. It clearly shows that climate science is far from settled, and that the naive, drastic and hurtful climate policies proposed by EOL (end-of-life) presidents or advocacy groups may well try to influence a parameter (CO2) which has only a minor influence: this means much pain for very little or no gain!

Your smartphone is radioactive!

August 17, 2015

Nuclear energy and all things related to radioactivity have a bad press in Europe nowadays; few people remember their high-school physics with experiments on radioactive decay, and hopefully some information on the ubiquitous radioactive radiation that has been a part of nature since the beginning of our planet. Decades of scare stories, half-truths and abysmal lies have fostered a generation in Germany that thinks emission-free nuclear energy is outdated, and that radioactivity, where it exists, must be avoided like hell (or forbidden by the government :-))

It may come as a surprise that your humble smartphone that you use so frequently is a radioactive gadget. I learned this after reading an excellent article by David Jones at the website Brave New Climate.

1. The ITO touchscreen

All smartphones and tablets use touchscreens, which are one of the principal reasons for their ease of use.

This picture (adapted from www.ti.com/lit/ah/slyt513/slyt513.pdf) shows that the principal elements are the two ITO (indium tin oxide) sheets: these are transparent foils covered with a very thin layer of indium tin oxide. Indium is element number 49, and the isotope used here is the most abundant one, In115. This isotope is a beta- emitter (it emits electrons from its nucleus and converts to Sn115, which is tin).

The energy of these electrons is rather small (495 keV), and the half-life of the indium is huge: 4.41*10^14 years! Indium is the 65th most frequent metal in the Earth's crust, where it is found at a very small concentration of about 160 ppb (parts per billion). The minable world reserve is estimated at 6000 tons. With steadily increasing use in electronic devices and wind-mills, it may become a bottleneck for further development.

2. Measuring the radioactivity of an Iphone 4

Can the radioactivity of the Iphone touchscreen be detected? To answer this question, I set up a quick experiment, using a semi-professional Geiger counter, the INSPECTOR from S.E. International. This instrument has a very large pancake Geiger tube of about 48 mm diameter, which makes it very sensitive even to small radioactivity levels. The picture shows the back side of the INSPECTOR, with the wire mesh protecting the counter tube.


The experiment was done in two steps, each taking 10 minutes: first I ran the counter positioned to the left of the Iphone (which was switched on during the whole experiment), and noted the minimum and maximum of the readings (there is about one reading every 2 seconds). Here is a picture, the counter showing a reading of 0.161 uSv/h, close to the maximum.


Secondly, I put the counter on top of the Iphone, so that the pancake Geiger tube covered the screen. The next picture shows a reading, again close to the maximum of that series.


Here are the results (all in uSv/h):

Background:        minimum = 0.065   Maximum = 0.167

On top of screen: minimum = 0.161  Maximum = 0.275

We note that the second range begins practically where the first one stops: the minimum radiation on the screen is equal to the maximum of the background, and the maximum on the screen exceeds the background maximum by 65%. If we use the mid-points between minima and maxima (116 and 218 nSv/h) as relevant indicators, we see that the touchscreen increases ambient radioactivity levels by 88%!
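Both percentages follow directly from the four readings; a quick Python check:

```python
# Compare the two Geiger-counter series (all values in uSv/h).
bg_min, bg_max = 0.065, 0.167          # background, counter left of the Iphone
screen_min, screen_max = 0.161, 0.275  # pancake tube on top of the touchscreen

bg_mid = (bg_min + bg_max) / 2              # 0.116 uSv/h = 116 nSv/h
screen_mid = (screen_min + screen_max) / 2  # 0.218 uSv/h = 218 nSv/h

print(f"maximum increase:   {screen_max / bg_max - 1:.0%}")   # -> 65%
print(f"mid-point increase: {screen_mid / bg_mid - 1:.0%}")   # -> 88%
```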

This means that the tip of your finger is exposed to about twice the normal background radiation in Luxembourg.

Should you be afraid? Yes if you have been brainwashed to believe that all  radioactivity is dangerous! No if you remember your physics teacher and have kept a modicum of common sense!

PS1: When the Geiger counter is put on the back side of the Iphone, the readings are similar to the background: the beta radiation does not cross the phone's case. You may want to put your phone in your shirt pocket with the screen facing out :-))

PS2: There are alternatives to the use of indium in the laboratories: graphene, carbon nanotubes etc. are some potential candidates. They will most certainly be used in the future, when demand makes indium (now at ~800 US$/kg) too expensive. So, when will Apple launch, with great fanfare, the non-radioactive Iphone model?

