Welcome to the meteoLCD blog

September 28, 2008


This blog started on 28 September 2008 as a quick communication tool for exchanging information, comments, thoughts, admonitions, compliments etc. related to the questions of climate change, global warming, energy and the like.


Fatal bio-energy decisions

January 22, 2018

Several times in the past I wrote on this blog about the problems of burning wood and of presenting this type of energy use as environmentally friendly and moral. The very EU-friendly website Euractiv has a damning article in its 8 January 2018 edition titled “The EU’s bioenergy policy isn’t just damaging the climate and forests, it’s killing people”.

Here is a picture showing the fine-particle emissions from different types of wood burning (link):

Compared to oil and gas burning, wood is a dirty, often even extremely dirty energy. An important part of the EU’s bioenergy is wood, and the political Zeitgeist was to shift fossil-fuel-burning power stations to wood, like the infamous UK Drax power station, which burned 6 million tons of wood pellets in 2015. Estimates suggest that by burning wood this station emits 12% more CO2 than if it had burned coal (link to picture):

The Euractiv article cites a study by Sigsgaard that estimates that biomass pollution causes 40000 premature deaths every year in the EU. Well, I am not a fan of these statistics, but it is clear that fine-particle emissions from open wood fires are huge compared to those of the lamented Diesel cars. Curiously, a group of scientists published an article on 15 January 2018 in Euractiv pushing the need to increase biomass burning. Titled “Bioenergy at the centre of EU renewable energy policy”, the 63 authors (yes: 63!) candidly write that “With regards to air quality, it is very difficult to identify the impacts of bioenergy combustion in isolation”. This is absolute nonsense: fine-particle emissions from wood burning can be measured and compared to those of other sources, as has been done in many research papers.

The irony of the whole biomass problem is that bio-energy has been promoted by the “Über-greens” as one of the climate-saving policies; ill-considered and hastily promoted, this bioenergy now raises its ugly head and makes the ordinary citizen wonder whether “expert advice” (like that of the 63 authors) should really be relied upon…

 

 

Extreme weather more frequent in first than in second half of last century

December 26, 2017

Ocean storm. Credit: Shutterstock

There is an interesting and very easy to read paper by M. J. Kelly (University of Cambridge, Department of Engineering) titled “Trends in Extreme Weather Events since 1900”, published in February 2016 in the Journal of Geography & Natural Disasters.

The essence of this study is the following: “A survey of official weather sites and the scientific literature provides strong evidence that the first half of the 20th century had more extreme weather than the second half, when anthropogenic global warming is claimed to have been mainly responsible for observed climate change”. This is a big problem for all those who tirelessly write about increasing extreme weather events and promote immediate decarbonization of our societies to avoid the impending climate disasters. It also is a stumbling block for the poor structural engineer who plans dams, dikes, bridges and other big constructions: should he devise stronger (and much more expensive) structures, as suggested by the climate-alarmist community, or should he rely on the lessons of past history?

Kelly politely writes about “the disconnect between real-world historical data on the 100 years’ time scale and the current predictions”, when in effect the disconnect shows that the numerous mentions of “more extreme weather” caused by global warming correspond perhaps to some wishful thinking, but are not grounded in reality.

Very telling is the next figure (taken from an I. Goklany paper), which gives the number of deaths and the death rate (per million people per year) from extreme weather events:

Clearly, if one were planning a trip back into the past, the period 1950 to 2000 would be the much less dangerous choice! Kelly adds more graphs (all taken from official data sources) which confirm this situation.

He then reflects on a problem that is continuously debated in climate-realist circles: we should not naively trust the official data series, as many have been “doctored” and “re-doctored” over time (the scientifically correct expression is “homogenized”), practically always in a way that inflates the current warming. A very good illustration is the comparison of a graph from a paper by Hansen (1984) with a modern global temperature series (NASA-GISS, Hansen’s employer):

Whereas in the first paper 1980 was colder than 1940, in the modern series the situation has flipped to the opposite!

Conclusion:

The conclusion is best given by the words of the author: “The lack of clarity about future extreme weather, after 20 years of intensive analysis of future climates is deeply worrying. There is nothing that emerges from references [1,2] that would require a significant refinement of the margins that have applied over the last half-century, and hyperbole is no substitute for hard facts in the engineering of the physical infrastructure.”

Air pollution: numbers and doubt

December 20, 2017


Much has been written on air pollution and its health impacts, especially following the Volkswagen Diesel scandal and the new trendy enviro fashion of Diesel bashing. In this blog post I will make a few comments on the contribution of household wood burning, on the shifting nature of reference values, and on the extraordinary statistics of air-pollution-caused mortality. As a long-time asthma sufferer I appreciate clean air above all; but this does not give me license to board the hysteria train and throw away all principles of scientific thinking.

  1. The climate-friendly wood burning

In a previous comment I showed that residential wood burning is a big contributor to PMx fine dust particles (PM10 and PM2.5) pollution. The 2016 air quality report of the EEA confirms this with the following picture:

[Figure: household wood burning]

Clearly the contribution of road transport is minuscule compared to that from wood burning; curiously it is the former that is mentioned as the culprit in nearly all media articles, and the latter that is most often conveniently ignored. The percentage of 56% may well be too low, as the EEA report mentions a new study that finds “the residential wood combustion emissions (are) higher than those in previous inventories by a factor of two to three…”

Residential wood burning has been pushed by the “climate-friendly” agenda without any pause for clear thinking. As is so often the case when feelings dominate over intellect, the unintended consequences are spectacular!

2. The crux of the reference values

When you qualify a gas as a pollutant, it is important to know what the natural background values are. These values have been christened “counterfactual concentrations C0” in the new EEA report. Curiously, this natural background has often been ignored, so that health-related effects were computed down to an impossible concentration of zero. This was the case for PM2.5, until the 2017 (preliminary) air quality report conceded that nothing (or not much) is known about the danger of levels lower than 2.5 ug/m3, so that this new level diminished the attributed mortality by a whopping 18%.
The same problem exists for many other pollutants: natural ozone concentrations may locally be much higher, and introducing a single EU-wide lower threshold automatically pushes the sunny southern countries into the club of sinners. The report candidly acknowledges this: “O3 concentrations show a clear increase as one moves from the northern parts to the southern parts of Europe, with the highest concentrations in some Mediterranean countries and in summer time”.
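To see how such a counterfactual concentration C0 changes the outcome, here is a minimal, purely schematic sketch of the kind of log-linear concentration-response calculation typically used in such assessments. The function, the 6.2% risk coefficient and all the numbers are illustrative assumptions of mine, not values taken from the EEA report:

```python
def attributable_deaths(baseline_deaths, concentration, c0, rr_increase_per_10):
    """Schematic log-linear concentration-response calculation.

    baseline_deaths     -- yearly deaths in the exposed population
    concentration       -- annual mean pollutant level (ug/m3)
    c0                  -- counterfactual (natural background) level (ug/m3)
    rr_increase_per_10  -- relative-risk increase per 10 ug/m3 (e.g. 0.062 = 6.2 %)
    """
    excess = max(0.0, concentration - c0)               # only the part above background counts
    rr = (1.0 + rr_increase_per_10) ** (excess / 10.0)  # relative risk at this exposure
    attributable_fraction = 1.0 - 1.0 / rr
    return baseline_deaths * attributable_fraction

# Purely illustrative numbers: raising the counterfactual from 0 to 2.5 ug/m3
# lowers the attributed mortality noticeably.
print(round(attributable_deaths(100000, 12.0, 0.0, 0.062)))  # ~6970 attributed deaths
print(round(attributable_deaths(100000, 12.0, 2.5, 0.062)))  # ~5550 attributed deaths
```

In this toy example the attributed mortality drops by roughly a fifth once the 2.5 ug/m3 background is taken into account, i.e. a reduction of the same order as the 18% quoted above.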

3. The extravagant mortality numbers

400000 people killed by air pollution in Europe, 40000 deaths in the UK… These are numbers repeated ad nauseam by the copy/paste media without any clear reflection on their validity.

By digging deep, one nevertheless can find some clearer thinking. Let us start with a “factcheck” article by Greenpeace (yes!) titled “Are there really 40000 air pollution deaths a year?”. The article recalls, for instance, that in the cities where studies on the danger of PMs were made, measured values were never lower than 7 ug/m3, and the data do not show any danger from values below that level. Anthony Frew, professor of respiratory medicine, says that “the basic data does not say that 40,000 people have died”.

Another comment comes from the Winton Centre for Risk and Evidence Communication: “Does air pollution kill 40000 people each year in the UK?”. Here we find the mind-blowing statement that “COMEAP point out, for example, that 60% of the estimated harm in the UK is estimated to occur at levels less than 7 ug/m3 PM, and yet the US data provide no direct evidence of harm at this level”. (COMEAP = Committee on the Medical Effects of Air Pollutants.) The comment shows this table from COMEAP:

[Table: COMEAP measures of effect]

Note the huge plausible interval for the life-years lost: the interval is abyssal, so that an estimate of 340000 borders on the nonsensical. The Winton comment wisely concludes: “There are huge uncertainties surrounding all the measures of impacts of air pollution, with inadequate knowledge replaced by substantial doses of expert judgement. These uncertainties should be better reflected in the public debates. In addition, the situation in the UK is not what we would usually think of as a ‘crisis’.”

4. What a change in a year!

As a last reflection I suggest the very good phys.org article “NO2 – not as bad as we thought?” The article discusses a new technical report concerning the planned Clean Air Zones in the UK. This report finds that the damage caused by NO2 to public health is 80% lower than estimated in a previous report. That previous report assumed that for every 10 ug/m3 of NO2 the mortality risk would increase by 2.5%; now this risk factor is down to 0.92%. When a new report changes the danger level given in a previous one by such an enormous percentage, our politicians would be well advised not to rush into hasty action and to wait wisely for things to settle down.

The dubious 2°C limit

December 16, 2017

No comment on the Paris (and the more recent Bonn) climate meetings is written without raising the warning finger that everything should be done to limit global warming to +2°C “above pre-industrial levels”. Sebastian Lüning and Fritz Vahrenholt have published a new paper in “Frontiers in Earth Science” that tries to trace the origin of this number and to assess how scientifically sound the baseline yielding this maximum permitted warming actually is.

This paper is very readable; they are not the first to point to the astounding vagueness of the foundations of this internationally agreed limit, but their paper has the big merit of being short and easy to understand. I will discuss just some major findings and suggest a careful reading of the full paper.

  1. The sea surface temperatures (SST) of the last 10000 years.

The last 10k years correspond to the ongoing interglacial period. The (annotated) figure shows how the SST changed during that period:

If we take as a baseline the average SST of the last 10k years (up to 1850, the year commonly taken as the start of industrialization), we see that the current global temperature is just +0.4°C higher. The blue box shows that the baseline period for the Paris aim is about 0.35°C lower than this average (which corresponds to a period with only natural climate changes).

Taking the Holocene average SST as a baseline, +2°C would allow a supplementary warming (compared to today) of about 1.6°C!
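The arithmetic implied by the figure is simple; using the approximate values quoted here and in the conclusions below (+0.8°C of warming since 1850-1900, a baseline about 0.35°C below the Holocene mean), a rough check reads:

\[
+0.8\,^{\circ}\mathrm{C} \;-\; 0.35\,^{\circ}\mathrm{C} \;\approx\; +0.4\,^{\circ}\mathrm{C}\ \text{above the Holocene mean today},
\]
\[
2.0\,^{\circ}\mathrm{C} \;-\; 0.4\,^{\circ}\mathrm{C} \;\approx\; 1.6\,^{\circ}\mathrm{C}\ \text{of additional warming before a Holocene-referenced +2°C limit would be reached}.
\]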

2. The land + sea global temperatures of the last 2000 years.

The next figure uses both land and sea temperature reconstructions:

It is obvious that taking 1850 as a baseline is extremely dishonest, even silly: the end of the Little Ice Age corresponds to the coldest period of the last 10000 years. How could any intelligent person choose a baseline that corresponds to an extreme value and, scientifically speaking, is at least 0.6°C too low?

3. Conclusions

1. The Paris convention suggests that the pre-industrial climate was more or less constant and that it is mostly human activity that has caused the observed warming since 1850. This is blatantly false: during the Holocene the climate was highly variable and has reached or even exceeded today’s global temperatures.
The following picture shows how the Greenland temperature changed during that period:

[Figure: Holocene Greenland temperature]

2. A big chunk of the observed +0.8°C warming (with respect to the 1850-1900 period) is of natural origin.
3. Using a 2°C target could be valid, but choosing and imposing the coldest period of the last 10000 years as a baseline is silly, dishonest and unscientific.

 

_________________________

History:

  • added Greenland graph (16 Dec 2017, 15:26 UTC)

Decarbonized France

November 10, 2017

German Nuclear-Angst regularly ignores the real decarbonization of electrical power that France has managed to achieve. Look at the figure from RTE giving the shares of the different sources in French electricity generation in 2016:

So 91% of French electricity was carbon-free in 2016.

Now look at Germany, the poster-child in climate-scare and “Energiewende”:

Carbon-free German electricity was just 42.6% in 2016, after 27 years of “Energiewende” (and a cost of ~150 billion Euro from 2000 to 2016, with a total estimated cost of 520 billion Euro up to 2025):

So if you assume that the world really has to be de-carbonized as fast as possible, what technology would you choose for your electricity production?

The greening planet

October 13, 2017

[Figure: the greening planet]

(link to picture)

Reading breathtaking horror stories about rising atmospheric CO2 levels would make mother Earth lol (laugh out loud), if this were possible. Far from being a catastrophe, increasing CO2 has a demonstrably positive influence on the biosphere. A recent paper by Cheng et al. in Nature Communications shows this again, using a new method. It is titled “Recent increases in terrestrial carbon uptake at little cost to the water cycle” (link) and was published in June 2017.

  1. GPP and WUE

Two parameters, at a minimum, are needed to describe the state of the plant biosphere: GPP and WUE. GPP = Gross Primary Production represents the plant-mass growth; it is measured in gC/(m2*year) = gram carbon per square meter and per year. Now all plant life needs water; normally the availability of water and, more importantly, the efficiency of its use are limiting factors. WUE = Water-Use Efficiency quantifies this; the unit is gC/(mm H2O) = carbon uptake per unit of water loss (for instance gram carbon produced per mm of rainfall).
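Since WUE is just the carbon gained per unit of water lost, the bookkeeping is trivial. Here is a minimal sketch with invented numbers (they are not taken from the Cheng et al. dataset) that also shows why a rising GPP at constant ecosystem water use E automatically means a rising WUE:

```python
def water_use_efficiency(gpp, water_loss):
    """WUE = carbon uptake per unit of water lost, in gC per mm H2O."""
    return gpp / water_loss

# Illustrative values only:
gpp = 1200.0          # gross primary production, gC/(m2*year)
e   = 600.0           # ecosystem water use, mm/year
print(water_use_efficiency(gpp, e))          # -> 2.0 gC/mm

# If rising CO2 lifts GPP by 10 % while E stays constant, WUE rises by the same 10 %:
print(water_use_efficiency(gpp * 1.10, e))   # -> 2.2 gC/mm
```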

We have known for many years that higher CO2 levels reduce the opening of the leaf stomata and, as a consequence, the water loss by transpiration. So it really does not come as a surprise that WUE has risen during the period 1982-2011 (a period of increasing atmospheric CO2) on which the Cheng et al. paper is based. The next figure documents this increase, as found by observations (red line) or calculated from a model used by the authors.

2. Trends in both GPP and WUE

GPP is not a consequence of WUE, but the next figure shows that both parameters (here given as anomalies) increase in step:

The common cause of these increases is atmospheric CO2, and the positive effect is nearly planet-wide, with very few exceptions:

Negative trends correspond to the yellow-red regions, clearly an absolute minority!

3. Conclusion

Rising atmospheric CO2 levels have increased plant production i.e. made the planet greener. On top of this first positive effect, the CO2 gas (which some imbeciles describe as “poisonous”) made the plants more efficient in their water usage: they grow better with less water, the overall ecosystem water use (E) remaining nearly constant!

The authors conclude that “Our results show that GPP has increased significantly and is primarily associated with an increase in WUE”. How is it that these positive aspects of changing CO2 levels are still passed over in silence by the media and the political climate-change Zeitgeist?

Recent methane rise mostly due to biogenic sources

September 16, 2017

There is an interesting new paper by Nisbet et al. in the AGU publication Global Biogeochemical Cycles titled “Rising atmospheric methane: 2004-2014 growth and isotopic shift”. The fossil fuel industry (oil extraction, fracking…) is often blamed for rising methane emissions, and this argument went somewhat into limbo as the atmospheric mixing ratio, after a period of clear increase, was stable for many years:

This picture documents the rise from 1984 to about 1999, the following plateau and finally a renewed, slower rise from 2005 to 2015 (the lower plot shows the derivative, i.e. the change in mixing ratio per year).

One fingerprint for detecting the origin of the methane (from fossil fuels or from biogenic sources) is its isotopic composition: biogenic methane is more depleted in the 13C (carbon-13) isotope, i.e. has a lower 13C content, than methane from fossil sources. The isotopic fingerprint is usually given as delta_13C/12C in per mil (‰); the exact definition (as given e.g. on Wikipedia) is reproduced below.
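In standard notation (the reference standard being VPDB, Vienna Pee Dee Belemnite), the definition reads:

\[
\delta^{13}\mathrm{C} \;=\; \left(\frac{\left({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\right)_{\text{sample}}}{\left({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\right)_{\text{standard}}} \;-\; 1\right) \times 1000\ \text{‰}
\]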

 

More negative values point to dominant biogenic sources, less negative values to fossil methane. For instance, this paper gives a delta_13C/12C of about -60 ‰ for methane from ruminants (cattle) and marsh gas (wetlands). The next table (right column) gives an overview of different sources:

Clearly methane from landfills or natural gas leaks and vents has a less negative delta_13C/12C.
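These delta_13C fingerprints can be turned into a rough source apportionment with a simple two-end-member mixing model. The sketch below uses round end-member values of my own choosing (-60 ‰ biogenic, -40 ‰ fossil), not the values used by Nisbet et al.:

```python
def biogenic_fraction(delta_mix, delta_bio=-60.0, delta_fossil=-40.0):
    """Two-end-member mixing: fraction of a methane increment that is biogenic,
    given the delta-13C of the increment and assumed end-member signatures (permil)."""
    return (delta_mix - delta_fossil) / (delta_bio - delta_fossil)

# If the extra methane behind the renewed rise carried a signature of, say, -55 permil,
# this toy model would attribute 75 % of it to biogenic sources:
print(biogenic_fraction(-55.0))   # -> 0.75
```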

The following picture from the Nisbet paper shows how this delta_13C/12C has evolved during the last 18 years:

 

CDIAC gives the series for 4 measurement stations (from north (Barrow) to south), which are consistent with the previous plots:

 

Clearly (in the 3 given regions) there was a general plateau until 2005, followed by a marked decrease. Nisbet et al. conclude that the dominant cause of this decrease was biogenic: greater rainfall in the tropics increased the wetland areas and also helped increase agricultural surfaces and livestock. But the contribution of the latter is estimated to be more gradual and smaller, so the main cause seems to be a meteorologically driven increase in tropical wetland emissions.

 

 

So what happened to the science?

August 11, 2017

There is an excellent guest comment today at the WUWT blog by John Ridgway. There are no graphs, but essentially very sound reflections on the social impacts on a science that gets politicized, as climate science has become.

Let me just cite a few of what I think are the best remarks in this very readable article:

  1. I suspect the problem is that climatologists are making predictions that cannot be readily falsified through appropriate experimentation
  2. The lack of falsifiability sets the scene for the achievement of consensus by other means, resulting in a certitude that cannot be taken at face value
  3. it is a logical non sequitur to suggest that a model that performs well for the purposes of hindsight will necessarily be reliable for the purposes of making predictions.
  4. With the Hockey Stick to hand, the IPCC no longer needed to bury the uncertainty in the body of its reports, since the graph proved that the uncertainty simply didn’t exist…it is difficult to avoid the conclusion that the data had been mercilessly tortured for a confession
  5. the consensus, rather than being a result of minds being changed during debate and inquiry, instead emerges following a form of sociological natural selection
  6. in climatology the situation is worsened by a politically motivated denial of uncertainties and a lack of commitment towards openness and the reproducibility of results

Please read this comment with an open mind!

PS: here, for your information, is how Mann’s hockey-stick reconstruction differs from that of Loehle (source):

 

Wind and Solar: Intermittency, backup and storage (part 2)

August 9, 2017

In the first part of this comment I wrote that the study by F. Wagner on the 100% renewables aim of Germany’s Energiewende showed that this would require a massive expansion of the electrical power infrastructure (about 4 times more capacity than the needed load has to be installed) and an unavoidable production of surplus electricity, which might become more of a liability than an asset.
In this second part I will discuss some points of the second study, by Linnemann & Vallana, “Windenergie in Deutschland und Europa, Teil 1” (“Wind energy in Germany and Europe, part 1”), published in the June 2017 issue of the VGB Powertech Journal (link1 to paper, link2 to additional slides). This interesting study looks only at wind power; it focuses on what is needed to guarantee reliability and on what has been achieved during the tremendous increase in German wind power from 2010 to 2016.

  1. Big increase in installed capacity, none in minimum.

The following slide shows the big increase in installed capacity during one decade: from approx. 27 GW to 50 GW:

Two facts are sobering:

a. the maximum produced power (reached only during a short period each year) decreased from 81% to 68% of installed capacity, in spite of the nearly doubled installed capacity and the more modern wind turbines (see blue curves).

b. the minimum power delivered remains constant at less than 1% of the installed capacity. In other words the guaranteed power available at every moment of the year is less than 1% of the installed capacity.

If one makes a statistical analysis, the distribution of the delivered power is very asymmetric and far from normal:

This histogram gives the relative frequency of each delivered power level during 2016; the mean u is about 8.7 GW. The sum of all relative frequencies to the left of that mean (i.e. the blue area left of the vertical line at u) is large and corresponds to a probability of about 60% that the delivered power lies below the mean.
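That 60% figure is nothing more than the fraction of time intervals in which the delivered wind power lay below the annual mean. A minimal sketch of the calculation, using an invented skewed time series as a stand-in for the real German feed-in data:

```python
import numpy as np

rng = np.random.default_rng(0)
wind_gw = rng.gamma(shape=1.5, scale=6.0, size=8760)   # invented hourly feed-in, GW; skewed, far from normal

mean_gw = wind_gw.mean()
p_below_mean = (wind_gw < mean_gw).mean()              # fraction of hours below the mean

print(f"mean = {mean_gw:.1f} GW, P(power < mean) = {p_below_mean:.0%}")
```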

2. The capacity factor of the installed wind turbines.

Despite the doubling in installed capacity, the change to more modern and powerful wind turbines, and the increase in offshore wind parks (which in 2016 delivered 12 TWh out of a total of 77 TWh), the overall capacity factor remains depressingly low, and also low when compared to other European countries:

The large variations are essentially due to changing weather patterns: 2010 was a very low-wind year for most of Europe. Compared with the long-term mean CF for 1990 to 2016, the mean for the 2010-2016 period is higher by no more than about 1 percentage point.

Compared to other European countries, Germany’s wind turbines do a disappointing job (note that CF% = [Ausnutzung (full-load hours)/8760]*100); the German CF always lies close to the lower end of the range.
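The order of magnitude of the German CF can be checked directly from the numbers quoted above (77 TWh produced in 2016 from roughly 50 GW installed); the result is only approximate, since capacity grew during the year:

```python
annual_energy_twh = 77.0    # German wind production in 2016 (TWh), as quoted above
installed_gw      = 50.0    # installed wind capacity (GW), approximate end-2016 value
hours_per_year    = 8760

capacity_factor = annual_energy_twh * 1000.0 / (installed_gw * hours_per_year)
print(f"CF = {capacity_factor:.1%}")   # -> about 17.6 %
```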

3. How much backup power?

The VGB report is very clear: you need 100% of the installed wind capacity as backup. The reason is that the minimum guaranteed power (“gesicherte Leistung”, guaranteed capacity) is close to zero; the next table shows that this is the case for practically all countries, even geographically privileged ones like Ireland or Denmark:

As the best windy sites are mostly already in use, an improvement could theoretically be reached by modernization (“re-powering”) and by smoothing, i.e. including all European producers in one big grid. The real data suggest that neither of these solutions is very effective: low-wind situations often extend over a large part of Europe (wind output is strongly correlated over much of the continent).

4. Conclusion

The minimum delivered power during the 2010-2016 period is about 0.15 GW, and this is all the conventional (fossil/nuclear) capacity that the newly installed wind turbines can displace. In other words, the doubling to 50 GW of installed wind capacity has made only a ridiculously low 0.15 GW of conventional generating capacity superfluous!

Wind and Solar: Intermittency, storage and backup (part 1).

August 7, 2017

A couple of recent papers/studies make a thorough analysis of the German Energiewende and the new problems caused by relying more and more on intermittent producers. The first paper is by Friedrich Wagner from the Max-Planck-Institut für Plasmaphysik. Titled “Surplus from and storage of electricity generated by intermittent sources”, it was published in December 2016 in the European Physical Journal Plus (link). The second is part 1 of a two-part study published by VGB Powertech in June 2017. The authors are Thomas Linnemann and Guido S. Vallana, and the title is “Windenergie in Deutschland. Teil 1: Entwicklungen in Deutschland seit dem Jahr 2010” (“Wind energy in Germany. Part 1: developments in Germany since 2010”) (link). The link points to the original version in German, but an English translation will be available soon.

I wrote a comment on this latter paper titled “Vom Winde verweht” (“Gone with the wind”), adding some reflections concerning the situation in Luxembourg; it was published in the Saturday 5 August 2017 edition of the Luxemburger Wort, Luxembourg’s largest newspaper.

  1. A switch from demand-driven to supply-driven.

One of the most important aspects of the rush to decarbonize the electricity sector is that a fundamental change is planned to enable the functioning of intermittent suppliers like wind and solar. The traditional electricity market is demand-driven: the client has a certain maximum current hardwired in his meter or fuse box (say 32 A or 64 A per phase, usually on 3 phases); he can rely on this maximum being available at all times (though possibly at a different price according to predefined peak and off-peak periods of the day); the electricity supplier must do his best to deliver the power asked for at the correct voltage and frequency.
The newly planned situation will see a swapping of roles: the supplier decides what electrical energy he will deliver and at what price, and the consumer must accept. Smart meters allow very fine-grained tariff changes, which can be applied for very short time segments; the maximum power can be throttled down if needed. All this is called DSM (demand-side management), and it practically robs consumers of their freedom of lifestyle and consumption pattern. All this because the extremely variable iRES (intermittent Renewable Energy Sources) cannot guarantee a previously defined standard base load. This supply-driven “market” may remind the older among us of life in the former East European communist states (like the DDR), where complicated 5-year state plans ruled the economy; it seems ironic that such a lifestyle is now hyped as progressive by the iRES lobbyists.

2. Unavoidable surplus and huge backup

F. Wagner’s paper gives some very interesting numbers concerning an electricity market based on wind and solar alone. He assumes that fuels from biomass will be used for aviation, heavy machinery and heavy lorry transport, so biomass is not included in the future zero-carbon electricity mix. Neither is hydroelectricity, which has practically reached its limit in Germany. He assumes that future electricity consumption will be 500 TWh per year, which is a rather conservative (low) number if one thinks of the political push for electric cars. A first conclusion is that in the wind/solar mix, solar PV electricity must not exceed about 20-25%, so there will be no equal sharing between wind and solar. The next figure (fig. 2 of the report, with added text boxes) shows how this scenario, had it been applied, would have worked out in 2012:

To guarantee supply, a huge backup infrastructure delivering 131 TWh must be installed (which corresponds to 73 GW of installed capacity), and on top of that a yearly surplus of 131 TWh will be unavoidable. The calculations show that as long as iRES sources contribute less than 25%, no surplus is generated. In the 100% scenario the unavoidable and strongly variable surplus will quickly become a liability, as there will be no consumer available to pay for it (note that periods of negative prices are becoming more and more frequent at the EEX exchange); it also means that onshore wind turbines must be shut down every year for a total period of about one month!
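The surplus and backup figures follow from a simple hour-by-hour bookkeeping: whenever iRES generation exceeds the load, the difference is surplus; whenever it falls short, the difference must be supplied by backup. The sketch below only illustrates this accounting with invented time series (a flat-ish 500 TWh/year load and a highly variable generation profile); it does not reproduce Wagner’s 131 TWh numbers:

```python
import numpy as np

rng = np.random.default_rng(1)
hours = 8760
load_gw = 57.0 + 10.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, hours))   # invented load, ~500 TWh/year
ires_gw = rng.gamma(shape=1.2, scale=50.0, size=hours)                 # invented, highly variable wind+solar

surplus_twh = np.clip(ires_gw - load_gw, 0.0, None).sum() / 1000.0     # energy nobody asked for
backup_twh  = np.clip(load_gw - ires_gw, 0.0, None).sum() / 1000.0     # energy backup plants must deliver

print(f"surplus = {surplus_twh:.0f} TWh/year, backup = {backup_twh:.0f} TWh/year")
```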

Speaking of surplus, the first answer of iRES advocates is electricity storage (in batteries, pumped hydro or through chemical transformations). Wagner’s analysis covers both short-term day storage and long-term seasonal storage, which will both be needed. In winter, the surplus is strong during both day- and night-time, so one-day storage will not be enough. In spring, a daily storage solution would show a capacity factor (= efficiency of use) of 3%, which is ridiculously low. A seasonal storage solution that would avoid any backup infrastructure would demand an enormous capacity of at least 100 GW. Nothing of the sort exists, and no miracle technology for such storage is in the pipeline.

The conclusions of F. Wagner’s report:

  • a 100% iRES electricity production must install a capacity 4 times greater than the demand
  • if storage is to be provided by pumped-water storage, the existing capacity must be increased by at least a factor of 70
  • the integral capacity factor of the iRES system will be 18%, and a backup system of about 89% of peak load must be installed
  • to nominally replace 20 GW of nuclear power, 100 GW of iRES power will be needed
  • an extremely oversized power supply system must be created
  • the Energiewende cannot reach its goal of CO2 avoidance (“there is a clear conflict between political goal and applied method…overproduction by iRES may not be the right way to go”)

__________________________________________

to be followed by part 2, which discusses the VGB paper