Welcome to the meteoLCD blog

September 28, 2008

This blog started on 28 September 2008 as a quick communication tool for exchanging information, comments, thoughts, admonitions, compliments etc… related to http://meteo.lcd.lu , the Global Warming Sceptic pages and environmental policy subjects.

Heat stress: when warm is too hot!

June 30, 2015

As a heat wave is expanding through Western Europe (and Luxembourg), I would like to give some comments on the heat stress that has been measured at meteoLCD since May 2001 (see the paper by Charles Bonert and Francis Massen here). We are still the one and only station showing this important parameter in real time on the web.

1. Some technical details on the apparatus.

The ISO standard defines how the heat stress is measured:

WBGT_instruments (figure from here)

The globe temperature is the highest under usual sunshine conditions, the dry air thermometer gives an intermediate temperature, and the wet bulb thermometer shows distinctly lower values. The next figure gives the situation today (30 June 2015, 14:00 UTC) at Diekirch:

WBGT_30Jun2015

If we take the last (rightmost) values, we have GlobeT = 40 °C, DryT = 30 °C and WetT = 22 °C. Using the formula given in the figure we get WBGT = 0.7*22 + 0.2*40 + 0.1*30 = 26.4 °C, shown by the last red point.
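As a quick illustration, here is this computation in a few lines of Python (a minimal sketch; the function name and the rounding are mine):

```python
def wbgt_outdoor(wet_t, globe_t, dry_t):
    """Outdoor (solar load) WBGT in °C from the three thermometer readings."""
    return 0.7 * wet_t + 0.2 * globe_t + 0.1 * dry_t

# readings of 30 June 2015, 14:00 UTC at Diekirch
print(round(wbgt_outdoor(22, 40, 30), 1))   # -> 26.4
```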

The differences between the three thermometer readings are enormous, and the very low wet bulb temperature shows how efficient evaporation is at cooling. The next picture shows the wet bulb device used at meteoLCD: it is essentially a Pt100 sensor covered by a cotton wick whose other end plunges into a pot of distilled water. Twice a day this water reservoir is refilled by a peristaltic pump. We use distilled water to avoid hardening of the wick by dissolved lime, which is abundant in tap water. A wire mesh (not shown here) is needed to keep thirsty or curious birds from picking at or stealing the wick (yes, this has happened several times).

Wetbulb_meteoLCD

2. When is warm too hot?

Normally the body temperature (measured in the rectum) should not exceed 37 °C, with 38 °C set as the upper limit. Too hot is a situation where the WBGT pushes the internal body temperature above this limit. Depending on physical activity (and clothing), this limit is reached sooner or later. A heavy worker, or a soldier making strong physical efforts (in heavy clothing), will reach this dangerous situation much earlier than a tourist resting on the sea-shore. The metabolic rate gives the heating power (in W) produced by the working body. For a body at rest it is about 65 W; for hard work it can exceed 300 W. So compared to the man at rest, a worker will reach the WBGT limit much sooner, as shown by the following figure (same ref. as above):

WBGT_reference_values

A difference is also made between a person acclimatized to the warm situation and one who is not.

3. An example

Now lets take a person riding a bicycle at about 38 km/h. The corresponding MET (metabolic rate) is according to here about 5, which corresponds to 5*65 = 330 W. Using the above diagram for the unacclimatized person (remember: this is the first day of a heat wave!) we see that the heat stress limit is about 24°C: so a bicycler starting at 14:00 UTC (16:00 local time) exceeds the limit (as the WBGT is 26.4) and puts himself at the risk of a heat stroke.
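For readers who like to check such numbers, here is the arithmetic of this example in a minimal Python sketch (the variable names are mine; the 24 °C threshold is simply read off the reference chart above):

```python
met_cycling = 5             # MET for cycling at ~38 km/h (from the cited table)
basal_power = 65            # W, metabolic rate of a body at rest

power = met_cycling * basal_power      # 325 W
wbgt_now, wbgt_limit = 26.4, 24.0      # measured WBGT vs. chart limit (°C)

if wbgt_now > wbgt_limit:
    print(f"At {power} W, a WBGT of {wbgt_now} °C exceeds the "
          f"{wbgt_limit} °C limit: risk of heat stroke")
```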

May I suggest that he do his cycling much earlier, for instance starting at 06:00 UTC (08:00 local time) and finishing by 10:00 local time at the latest.

The fact is that more people die from cold than from heat, but nevertheless heat can be an insidious danger. Usually normal common sense is all that is needed to remain safe, and maybe this blog comment will be of some help!

The solar influence on climate

June 28, 2015

1. The “Consensus Science” ignores or belittles the solar influence.

A furious debate has been going on for many, many years about the influence of solar variations on global and regional climate. We all know that solar irradiance varies periodically over 11 years and its magnetic field over 22 years, and that many other periodic variations can be found (for instance the Suess cycle of 211 years, the millennium cycle explaining the Minoan, Roman and Medieval warm periods, and the very long Milankovitch cycles which pace the great ice ages).
The total irradiance (the power sent by the sun to the earth) varies little over one cycle: about 1 W/m2 (at the top of the atmosphere, for a surface perpendicular to the rays) from maximum to minimum. This must be compared to the ~1366 W/m2 mean irradiance. The “consensus” climatologists and the IPCC insist that this is too little to explain, for instance, the 0.8 °C warming observed during the last 100 years, and that it is completely swamped by the radiative forcing of our greenhouse gas emissions (estimated at ~2 W/m2 for CO2 alone when the year 1750 is taken as zero). What this consensus science ignores is that the UV irradiance varies much more during a solar cycle, and has many indirect and possibly amplified consequences due to ozone heating and its influence on the great oceanic oscillations (see here).
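To see why the consensus side calls ~1 W/m2 small: the standard back-of-the-envelope conversion from a change in TSI (measured on a surface perpendicular to the rays) to a globally averaged radiative forcing divides by 4 (sphere vs. disk geometry) and multiplies by (1 − albedo). A minimal Python sketch, assuming the usual planetary albedo of ~0.3:

```python
delta_tsi = 1.0                           # W/m2, max-to-min over a solar cycle
albedo = 0.3                              # planetary albedo (usual value)
forcing = delta_tsi * (1 - albedo) / 4    # geometric factor: sphere vs. disk
print(f"{forcing:.2f} W/m2")              # ~0.18 W/m2, vs ~2 W/m2 for CO2
```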

Also ignored is what we know from history: periods of low solar activity (such as the Maunder minimum during the first part of the 17th century, or the Dalton minimum around 1820) were much colder (by 0.2 to 0.4 °C) and are described by historians as periods of famine and social unrest due to poor agricultural productivity.

2. The new Ineson et al. paper

Sarah Ineson et al. published on 23 June 2015 an interesting paper titled “Regional climate impacts of a possible future grand solar minimum” (Nature Communications, open access). This paper is interesting not for its usage of climate models (we all know how unreliable these can be), but for the steps made in acknowledging what climate realists have been saying for many years:

a. the current decline in solar activity is faster than any other such decline in the past 9300 years

b. this decline may lead to Maunder Minimum-like conditions with a probability of up to 20%

c. recent satellite data show that the variability of UV irradiance could be considerably larger than given by previous estimates

Their modeling exercise (EXPT-B) suggests a regional winter-time cooling for Northern Europe and the eastern USA of about 0.4 °C:

Ineson_fig2_EXPT_B

This figure of the annual mean temperatures shows the solar-induced cooling touching nearly every part of the globe (under the hypothesis that the ongoing quiet solar situation will have the same effects as it had during the Maunder minimum).

This cooling could begin around 2030 and continue up to 2100! If you think of the coming Paris COP in December 2015, where all countries will be coaxed into binding warming-mitigation policies, this new paper should make ripples in the naive enthusiasm of the anti-warming advocates.

Could it be that our rising CO2 emissions, besides their positive influence on plant productivity and planetary greening, will be our best insurance against a fall back into a Little Ice Age?

____________________________

Read also these contributions in Climate Dialogue about what will happen during a new Maunder Minimum.

SMARTMETERS

June 23, 2015

A couple of weeks ago I sent the Luxemburger Wort a letter with some comments on the introduction of smart meters (intelligent electricity meters), which will shortly begin here in Luxembourg. The customer has no choice but to accept!
This letter was published in abridged form last Saturday (20 June 2015) in the Luxemburger Wort on page 24. Here is the full text I wrote on this subject, translated from the German original:

_____________________________________________________________________

Smart meters: naive enthusiasm?

Francis Massen

The forced introduction of the intelligent electricity meter (smart meter) is being praised in the media in the highest tones and without any criticism. Of course it is useful and convenient when the electricity supplier can read the meter remotely, and when a digital display gives a more precise indication of the instantaneous consumption than the rotating disc of the old “Ferraris” meter. However, several serious consequences of this new meter are almost systematically kept quiet:

  1. The smart meter (known in France as the “compteur mouchard”, i.e. snitch meter) delivers a gapless picture of our living habits, since polling the meter almost every minute can establish an extremely sharp temporal profile of the electricity consumption.
  2. The traditional, simple and easily understandable tariff scheme will most probably very soon be replaced by strongly fluctuating price periods (possibly at 10-minute intervals), which are meant to force the customer into a consumption pattern that corresponds not to his own wishes but to those of the supplier.
  3. The smart meter is the indispensable gateway to the “intelligent grid” with its DSM (Demand Side Management). Gone are the times when the maximum power, fixed at the fuse box, was available at any moment without ifs and buts: now household appliances are to be switched over the grid whenever the provider wishes (the pill sweetened with a cheaper tariff), and/or the available power will be temporarily throttled.

The extravagant energy savings, which are often the main selling point of smart meters, have proved to be a fallacy in the countries where these devices have been in use for longer or were introduced as a test: a 2011 study by the Fraunhofer Institut on German and Austrian households shows that the electricity savings amount on average to only 3.7%.

These “guilt meters”, as Prof. Woudhuyzen called them in an article in the online magazine WIRED, can of course be hacked (as demonstrated by the Chaos Computer Club) and allow an intrusion into the private sphere which a report of Ryerson University (Canada) puts as follows: “Smart appliances offer utilities the opportunity to control areas of life that Courts have considered to be private and intimate”.

Since the big consulting firms now have open access to all our levels of decision-making, let me close with a study by Ernst & Young. Their 2013 report “Kosten-Nutzen-Analyse für einen flächendeckenden Einsatz intelligenter Zähler” (cost-benefit analysis of a blanket rollout of intelligent meters) states very clearly: “The rollout quota of 80% by 2022 targeted by the EU through a general installation obligation … is economically unreasonable for the majority of customer groups.”

PS: A more detailed discussion (with all references) was published by the author in APESS Récré No. 28 (2014) under the title “Compteurs et réseaux intelligents, clients impuissants” (intelligent meters and grids, powerless customers).

____________________________________________________________________

If you have a comment on this subject, please feel free to give it here! If you wish a copy of the APESS text, please ask for it.

Is there an upper limit for wind and solar grid penetration?

June 7, 2015

windwahn

1. Germany’s negative electricity prices.

We often read in the press that German wind and solar electricity exceeds demand and must be exported at very low or even negative prices. Agora Energiewende wrote in June 2014 that negative prices are becoming more frequent, as the conventional fossil and nuclear producers cannot reduce their minimum production (the “spinning reserve”, often also called the “must-run minimum production”) below 20 to 25 GW. If all things remain as they are, Agora predicts more than 1000 hours per year of negative electricity prices for 2022! But they also say that up until now, renewables have never been able to produce more than 65% of demand, even during peak production periods. The excess production leading to negative prices is rooted in the inability of intermittent renewables to guarantee the needed electrical power at every moment of the year. Conclusion: wind and solar (the main renewables) are unsteady (and often unpredictable) producers which absolutely need a spinning reserve to ensure year-round electricity availability.


2. Is there an upper limit for renewable producers in a stable electrical grid?

This question has been researched by Jesse Jenkins (MIT) and Alex Trembath (from the Breakthrough Institute). They came to an easy-to-memorize rule of thumb: “When wind and solar work without subsidies (as do other producers), the maximum share they can reach in the total power production of the grid is equal to their capacity factor”.

They give the following graph from another publication showing the decline in the value factor of Wind and Solar electricity with increasing market share in Germany (boxes added by me):

value_factor_wind_solar_Germany

The value factor = (market price from wind & solar generation)/(average market price). The negative trend for solar electricity is especially sobering.
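In code, the definition is a one-liner; the prices below are purely illustrative, not data from the graph:

```python
def value_factor(avg_price_renewable, avg_price_market):
    """Average price fetched by a producer relative to the market average."""
    return avg_price_renewable / avg_price_market

print(value_factor(32.0, 40.0))   # 0.8: each MWh earns 20% less than average
```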

Conclusion: Economics, and not so much technology, imposes an upper limit on the integration of wind & solar electricity.

Let me give two examples:

A. Rheinland-Pfalz (the German Land bordering Luxembourg), 2013 (see link):

Production from wind and solar:  3041916 MWh and 1369808 MWh

Capacity factors: 0.151 for wind and 0.092 for solar

Fraction of the total power produced: 0.203 for wind and 0.091 for solar

Conclusion: Wind production exceeds the limit given by the rule of thumb, and solar production has practically reached its maximum.

B. Luxembourg, 2013 (see link1 and link2):

Production from wind and solar: 83027 MWh and 73738 MWh

Capacity factors: 0.162 for wind and 0.089 for solar

Fraction of total power produced: 0.029 for wind and 0.026 for solar

Conclusion: Wind and solar productions are both well below the rule of thumb limit.
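To make the rule-of-thumb check explicit, here is a minimal Python sketch using the numbers of the two examples above (the helper function is hypothetical):

```python
def check_rule_of_thumb(region, sources):
    """sources maps a producer name to (grid_fraction, capacity_factor)."""
    for name, (fraction, cap_factor) in sources.items():
        verdict = "exceeds" if fraction > cap_factor else "is below"
        print(f"{region}, {name}: share {fraction:.3f} {verdict} "
              f"the capacity-factor limit of {cap_factor:.3f}")

check_rule_of_thumb("Rheinland-Pfalz 2013",
                    {"wind": (0.203, 0.151), "solar": (0.091, 0.092)})
check_rule_of_thumb("Luxembourg 2013",
                    {"wind": (0.029, 0.162), "solar": (0.026, 0.089)})
```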


3. Two other papers/comments on this problem

Proteos writes in his French blog: “Les subventions à l’éolienne et au solaire sont parties pour durer” (subsidies for wind and solar are here to stay):

He writes that wind and solar have a marginal cost of zero (as the wind and the sunshine themselves cost nothing), so on average they fetch a price lower than the average market price, and with increased renewable penetration this price falls further (the picture shows the German situation):

price_electricity_versus_wind_production_Germany_2013

Without subsidies these renewable producers will go out of business, as will the conventional producers which deliver the base load and the spinning reserve. In the end, all power producers must be subsidized to maintain a working electrical infrastructure and power supply.

J.M. Korhonen writes in his blog that about 20% of wind and 74% of solar production are worthless, and that under a free market the renewables revolution will stop dead in its tracks once peak production reaches demand. He is also skeptical of the much-touted “demand side management” (to be introduced with the smart meters), which will not prevent PV from hitting a wall.

4. A limit imposed by material requirements.

Korhonen cites a 2013 paper by Vidal et al., published in Nature Geoscience, with estimates of the extraordinarily huge amounts of concrete, aluminium, copper and glass needed by wind and solar if a world production of 25000 TWh were to be reached in 2050. From 2035 on, these requirements would outstrip the total world production of the year 2010. The next figure is self-explanatory:

mining_requirements_renewables

Note that despite all the hullabaloo about uranium mining, nuclear power comes out best!

5. Overall conclusion

Let me give this in the words of Korhonen:

“In conclusion, we may very well have too much of a good thing. And this is something that bears remembering the next time someone tells you that renewable overproduction is not a problem, or that renewables are reducing electricity prices and making existing plants uncompetitive. Or applauds, when 50% (or some other figure) of daily electricity production is met from renewable sources.”

As so often, reality bites harder than the teeth of naive greenies!

Does last century warming exceed natural variations?

May 23, 2015

Dr. Philip Loyd from South Africa has published in Energy & Environment a short but easily understandable paper trying to answer this question (link to abstract). He took several reconstructed temperature series reaching back over 8000 years, i.e. covering the Holocene period. An example, shown in the figure below, is the GISP2 reconstruction from a Greenland ice core, where temperatures have been deduced from an analysis of the Ar (argon) and N2 (nitrogen) isotopes.

Loyd_GIPS2_temperatures

First he de-trended the data by subtracting the best polynomial fit (which was a linear one in this case); then he calculated the differences between temperatures 100 years apart, and finally computed the statistical standard deviation of this ensemble. The following figure gives the result for the 4 series that he used:

Loyd_standard_deviations

Taking these 4 results and computing the average gives 0.98 °C (rounded to 2 decimals) with an uncertainty of 0.27 °C.
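For the curious, here is the three-step procedure in a minimal Python sketch (hypothetical names; it assumes a regularly sampled series, whereas real ice-core reconstructions are irregular and would need interpolation first):

```python
import numpy as np

def centennial_std(years, temps, lag=100):
    """Standard deviation of de-trended temperature differences `lag` years apart."""
    years, temps = np.asarray(years), np.asarray(temps)
    # 1. de-trend by subtracting the best polynomial fit (here: linear)
    residuals = temps - np.polyval(np.polyfit(years, temps, deg=1), years)
    # 2. differences between temperatures ~100 years apart
    n = int(round(lag / (years[1] - years[0])))   # samples per lag
    diffs = residuals[n:] - residuals[:-n]
    # 3. standard deviation of these centennial differences
    return np.std(diffs)
```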

So the historical data show that temperatures fluctuate naturally by ~0.98 °C over one century; as the Hadcrut3 series of global temperatures from 1900 to 1999 gives a warming of 0.7 °C, this number is smaller than the natural centennial variability. The author concludes prudently that “the signal of anthropogenic global warming may not yet have emerged from the natural background”, which means in very simple words that human-caused centennial warming (by greenhouse gas emissions, for instance), if it exists, does not for the moment exceed natural temperature variability.

The most controversial points in climate science

May 7, 2015

Prof. Judith Curry (Georgia Tech) has a very interesting comment in her outstanding blog Climate Etc.

A journalist asked her: “What are the most controversial points in climate science related to AGW (anthropogenic global warming)?”
Prof. Curry gives two very simple, easily understandable answers:

1. Has the warming since 1950 been dominated by human causes?
2. How much will the planet warm during the 21st century?

The IPCC continues to ignore the importance of natural causes of climate change (and global warming), positing as an act of faith that nearly all changes are caused by human activity. Some dubious graphs which appeared in the ARs and elsewhere suggest that only when climate models take account of greenhouse gas emissions (actually of the atmospheric CO2 mixing ratio) do observations and models become compatible:

Climate_Change_Attribution (“Climate Change Attribution” by Robert A. Rohde)

This figure ignores the fudge factors (i.e. parameters) introduced into the models until their results become similar to the observations. This technique is closer to curve fitting than to an understanding of all the physical causes of the observed temperature variation.

The insufficient understanding of the natural causes of climate change remains the major obstacle to believing the IPCC's so-called “consensus”. We are still largely ignorant about the magnitude (and possibly even the sign) of the influence of factors such as cloud cover and solar variation (total irradiance, UV changes, possible amplification of small natural changes). What we are sure of is that the climate models enormously overstate the observed global temperature increase of the last 15 to 18 years (read this comment by Fyfe et al.):

cmip5-90-models-global-tsfc-vs-obs1

This figure (link) documents clearly that climate models should not be used as a basis for political decisions: the green curve represents the global temperature anomaly according to Hadcrut4 (land- and ocean-based weather stations) and UAH (satellite data) up to 2013. It is remarkable how far the different models deviate from one another: the enormity of the differences makes the suggestion that the average (the black line) somehow represents the “truth” absolutely questionable. Are our political decision-makers aware of this?

Global temperature trends according to UAH

April 29, 2015

The researchers of the University of Alabama in Huntsville (UAH) (mainly John Christy and Roy Spencer) form one of the two teams that analyze the MSU (Microwave Sounding Unit) data delivered by about 16 successive or simultaneous satellites orbiting the earth since December 1978 (the other group being the RSS team). The UAH people have now published version 6.0 of their dataset, which differs in many points from the previous version 5.6, as explained by Dr. Roy Spencer in this report.
The report contains an interesting figure which gives the decadal temperature trends (in °C/decade) for different global regions, and also makes a distinction between land and ocean regions.

UAHNCDC_trends_v6-vs-v5.6-thru-Mar-2015

We see that the Antarctic has not warmed at all, and that the global oceans (which might represent the best estimate for global warming) have warmed by about 0.08 °C/decade, which extrapolated would mean a warming of about 0.8 °C by 2080; clearly nothing to be afraid of!
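The extrapolation is trivial but worth writing down, with the obvious caveat that it assumes the trend stays constant for a century:

```python
trend_per_decade = 0.08              # °C/decade, UAH v6 global oceans
decades = (2080 - 1979) / 10         # satellite record starts end of 1978
print(f"~{trend_per_decade * decades:.1f} °C warming by 2080")   # ~0.8 °C
```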

The Arctic oceans show the greatest trend (about 0.27 °C/decade), and it should be noted that the new reanalysis has cut the previous trend nearly in half!

Also interesting are the new data for the Northern and Southern hemisphere oceans (points Nhemis_O and Shemis_O): both trends are astonishingly close (approx. 0.09 and 0.075 °C/decade).

This figure does not inform us about the global temperature hiatus seen since about 1998; we have to wait until the UAH people publish the relevant data.

Floods and droughts

April 18, 2015

I continue the discussion on natural disasters using the handy graphing feature of the em-dat website of the UCL (Université Catholique de Louvain).

First the graph of floods (the left axis shows the yearly numbers):

em_dat_floods

There clearly is a rise over the full time span; but it is also clear that the period 2000-2014 has not seen a continuous increase, but rather a slight rise followed by a return in 2014 to below the 2000 level.

If we look at the total number of deaths caused by these floods, there clearly is a case for optimism (left axis = deaths in thousands):

em_dat_floods_deaths

The post-2000 casualties are not higher than those of 1960, when the reported number of floods was much lower!

The drought statistics show much more variability:

em_dat_droughts

Here one cannot see a spectacular rise, but a lot of inter-annual variability. The exceptionally high peak corresponds to 1981, the last maximum from the right to the year 2000. Since that year the global number of droughts has been decreasing (which does not mean that several regions, such as parts of California, do not suffer from an ongoing severe drought).

The numbers of deaths are astonishingly small, and show no tendency:

em_dat_droughts_deaths

What is the conclusion of this little exercise? It is not correct to state that natural catastrophes like floods and droughts are continuously increasing due to ongoing global warming. This litany is dear to many environmentalists and politicians whose agenda is impervious to real data.

EM-DAT, a database of disasters

April 17, 2015

em-dat

Discussions on climate change always come around to the argument that natural disasters like floods, droughts, heat waves etc. are on the rise due to (anthropogenic) climate change. It is often difficult to have correct numbers at hand, so the Belgian EM-DAT (the International Disaster Database, a work of the Université Catholique de Louvain, UCL) is a tremendous help in delivering correct data. EM-DAT considers natural as well as technological disasters. The former are those of interest here.

As the discussion on climate mostly considers the global impact, let us just look at how floods, droughts, heat- and cold-waves have changed since 1960. The Annual Disaster Statistical Review 2013 begins its summary with: “In 2013, 330 natural triggered disasters were registered. This was both less than the average annual disaster frequency observed from 2003 to 2012 (388) and represented a decrease in the associated human impacts of disasters, which were, in 2013, at their lowest level since 16 years.”

I added to fig. 1 of this report the trend line (in red), which shows an average decrease of 5.3 occurrences per year since 2000.

em_dat_stat2013_fig1

Climatological disasters (extreme temperatures, droughts and wildfires) went down from an annual average share of 15.5% (2003-2012) to 10% in 2013.

I will continue these comments in the coming days, time permitting. Meanwhile, go to this excellent website (http://www.emdat.be) and look for yourself at the trends under the “Disaster Trends” label.

Serge Galam for the dummies

April 3, 2015

serge_galam

In my previous blog post “Climate modelling nonsense” I urged the (hopefully existent) visitor to read the excellent article by Serge Galam, “Global Warming: the Sacrificial Temptation”, which was published in 2007 and is available at arXiv.

First, Serge Galam is a physicist, with two PhDs received in 1975 and 1981 from the Université Pierre et Marie Curie in Paris and the University of Tel Aviv. After some time spent in New York and at the French CNRS, he joined the CREA of the famous Ecole Polytechnique. CREA stands for “Centre de Recherche en Epistémologie Appliquée”, which means that Serge Galam has moved over to more philosophical problems and may now correctly be called a philosopher. He firmly opposes the scientists who, having become politicized, have abandoned the scientific method (in his book “Les Scientifiques ont perdu le Nord”, Plon, 2008), and remains very skeptical about man-made global warming or climate change.

So for those of you who never have time to read an article from start to end, let me just give here the seven sentences I find most remarkable in the cited article.

1. The debate about global warming has taken emotional tones driven by passion and irrationality while it should be a scientific debate.

2. In the past of human history, the identification of a single responsible of all the difficulties and hardships of a society has always produced huge human destructions.

3. The unanimity exhibited everywhere is indeed obtained by the exclusion of any person who dares to cast a doubt about the man guilt truth.

4. … science has nothing to do with neither unanimity nor the number of voters.

5. It is not the duty of the sceptics to have to bring a proof of whatever it is about which they are sceptical… Rather, it is up to the scientists making the new assertion who must bring the corresponding proof, in this case of human guilt.

6. In case the current climate changes have natural causes, focusing our entire efforts on a drastic reduction of anthropogenic carbon dioxide emissions, implying a suppression of advanced technologies, could leave us defenceless in the face of a newly hostile nature.

7. Most caution should be taken to prevent opportunistic politicians, more and more numerous, to subscribe to the proposed temptation of a sacrifice frame in order to reinforce their power by canalizing these archaic fears that are reemerging.

What a marvelous last sentence! The German physicist Dr. Gert Weber from the Max Planck Institut came to a similar conclusion in his 1993 book “Der Treibhauseffekt” (translated from the German): “Today research money is distributed and reports are written in such a way that a positive feedback loop forms which yields profit to all participants. The scientists get more research money, the media get new outrage stories…, and the politicians tap their voter potential.”

This said, I wish you happy, sunny and warm Easter holidays!

