This blog was started on 28th September 2008 as a quick communication tool for exchanging information, comments, thoughts, admonitions, compliments etc. related to http://meteo.lcd.lu , the Global Warming Sceptic pages, and environmental policy subjects.
(link to picture)
Many (or rather practically all) "climate politics" are based on the outcomes of climate models; as these models predict nearly unanimously a future warming due to the expected rise of atmospheric CO2 concentration, their reliability is of paramount importance. Naively, many politicians and environmental lobbies see these models as objective products of hard science, comparable for instance to the structural physics of skyscrapers, whose correctness has been proven over many years.
Alas, this is not the case: as the climate system is devilishly complex and chaotic, building a General Circulation Model (GCM) starting from basic physics laws is a daunting task; during its development, each model must make choices for certain parameters (their values, their possible range), a process which is part of what one usually calls "tuning". The choices made in tuning are not cast in stone but change with the model's creators, with time, and with cultural/ideological preferences.
Frédéric Hourdin from the "Laboratoire de Météorologie Dynamique" in Paris published in 2016, together with 15 co-authors, an extremely interesting and candid article in the "Bulletin of the American Meteorological Society" titled "The art and science of climate model tuning" (link) on this problem. I will discuss some of the main arguments given in this excellent paper.
1. Observations and models
Hourdin gives in Fig. 3 a very telling example of how the ensemble of the CMIP5 models (used by the IPCC in AR5) differ in their evaluation of global temperature change from 1850 to 2010 (the temperature anomalies are given with respect to the 1850-1899 average):
I have added the arrows and text box: the spread among the different models (shown by the gray arrows) is staggering, larger than the observed warming (given by the very warming-friendly Hadcrut4 series); even the "adjusted" = tuned variant (the red curve) gives a warming in 2010 that is 0.5°C higher than the observations. We are far, far away from a scientific consensus, and decisions that ignore this deserve, at best, to be called "naive".
2. Where are the most difficult/uncertain parts in climate models?
Climate models are huge constructs which are built up by different teams over the years; they contain numerous “sub-parts” (or sub-models) with uncertain parameters. One of the most uncertain ones is cloud cover. Just to show the importance, look at these numbers:
- the forcing (cooling) effect of clouds is estimated at -20 W/m2
- the uncertainty of this parameter is at least 5 W/m2
- the forcing thought to be responsible for the post-1850 warming of about 1°C is estimated at 1.7 W/m2

Conclusion: the uncertainty of the cloud cover effect is about 3 times larger than the forcing held responsible for the observed warming!
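That conclusion is simple arithmetic and can be checked in a couple of lines (numbers as quoted above; variable names are mine):

```python
# All values in W/m2, as quoted in the text
cloud_forcing = -20.0     # estimated net cooling effect of clouds
cloud_uncertainty = 5.0   # stated uncertainty, "at least 5 W/m2"
warming_forcing = 1.7     # forcing credited with the ~1°C post-1850 warming

ratio = cloud_uncertainty / warming_forcing
print(f"uncertainty / warming forcing = {ratio:.2f}")  # about 3
```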
Hourdin asked many modelers what they consider to be the most important causes of model bias, and they correctly include cloud physics and atmospheric convection, as shown in Fig. S6 of the supplement to the paper (highlights and red border added):
3. Are the differences among the models only due to scientific choices?
The answer is no! Many factors guide the choices made in tuning; Hourdin writes that "there is some diversity and subjectivity in the tuning process" and that "different models may be optimized to perform better on a particular metric, related to specific goals, expertise or cultural identity of a given modeling center". So, as in many other academic domains, group-think and group pressure certainly play a strong role, producing a consensus that may owe more to job security or tenure than to objective facts.
This Hourdin et al. paper is important, as it is one of the first where a major group of "main-stream" researchers puts the finger on a situation that would be unacceptable in other scientific domains: models should not be black boxes whose outcomes demand a quasi-religious acceptance. Laying open the algorithms and the unavoidable tuning parameters ("because of the approximate nature of the models") should be a mandatory premise. It would then be possible to check whether some "models have been inadvertently or intentionally tuned to the 20th century warming" and possibly to correct/modify/adapt/abolish some hastily taken political decisions based on them.
A new paper from February 2017 has been published by Y.I. Stozhkov et al. in the Bulletin of the Russian Academy of Sciences. Here is a link to the abstract (at Springerlink); the complete version is regrettably paywalled, but I was able to access it through the Luxembourg Bibliothèque Nationale.
The paper is very short (3 pages only), contains no complicated maths or statistics, and is a pleasure to read. The authors predict, as many others have done before, a coming cooling period; their prediction is based on two independent methods of assessment: a spectral analysis of past global temperature anomalies, and the observed relationship between global temperatures and the intensity of the flux of charged particles in the lower atmosphere.
1. Spectral analysis of the 1880-2016 global temperature anomalies
The paper uses the global temperature anomaly series from NOAA and CRU, computed as the difference with the global average near-surface temperatures between 1901 and 2000. Their spectral analysis suggests that only 4 sine waves are important:
The general form is: wave = amplitude*sin[(2*pi/period)*time + phase], with the period and time in years and the phase in radians; the authors give the phase in years, so it must be multiplied by 2*pi/period to obtain the phase in radians.
- series #1: amplitude=0.406 period=204.57 years phase=125.81*2pi/period (radians)
- series #2: amplitude=0.218 period= 69.30 years phase= 31.02*2pi/period
- series #3: amplitude=0.079 period= 34.58 years phase= 17.14*2pi/period
- series #4: amplitude=0.088 period= 22.61 years phase= 10.48*2pi/period
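The sum of the four components can be sketched in a few lines, assuming the coefficients are exactly those listed above (variable and function names are mine):

```python
import math

# (amplitude in °C, period in years, phase in years), as listed in the paper
components = [
    (0.406, 204.57, 125.81),
    (0.218,  69.30,  31.02),
    (0.079,  34.58,  17.14),
    (0.088,  22.61,  10.48),
]

def anomaly(year):
    """Sum of the 4 sine waves at a given calendar year (°C)."""
    total = 0.0
    for amp, period, phase_years in components:
        omega = 2 * math.pi / period           # angular frequency (rad/year)
        # the phase given in years is converted to radians via 2*pi/period
        total += amp * math.sin(omega * year + phase_years * omega)
    return total

for y in (1880, 1940, 2000, 2016, 2030):
    print(y, round(anomaly(y), 3))
```

Note that the sum can never exceed the total amplitude 0.406 + 0.218 + 0.079 + 0.088 = 0.791 °C, which is roughly the size of the observed anomaly range.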
I computed the sum of these 4 series and merged the graph with the global land-ocean temperature anomalies from GISS; the problem is that GISStemp calculates its anomalies from the mean of the 1951-1980 period, so the agreement will suffer from an offset.
The authors write that spectral periods of less than 20 years do not play an important role: this means that El Niños (with a period of roughly 4 years) are ignored, as are important non-periodic forcing phenomena like volcanic eruptions. The following graph shows my calculation of the sum of the 4 spectral components (in light blue) together with the official GISStemp series in red:
The fit is not too bad but, as was to be expected, it misses the very strong warming caused by the 2016 El Niño.
The authors are not the first to do a spectral analysis of the temperature series. N. Scafetta, in his 2012 paper "Testing an astronomically based decadal-scale empirical harmonic climate model vs. the IPCC (2007) general circulation climate models" (link), gives the following figure of a good fit using 4 short-period sine waves (so he does not seem to agree with Stozhkov et al. on the unimportance of short periods):
Note that both models predict a cooling for the 2000-2050 period.
2. Cosmic rays and global temperature
We are now in solar cycle 24, one of the weakest cycles in ~200 years, as shown by the next figure (link):
A situation similar to the Dalton minimum, the cold period of the first decades of the 19th century, seems to be unfolding and, all other things being equal, would suggest a return to colder than "normal" temperatures. But as Henrik Svensmark first suggested, the sun's activity acts as a modulator of the flux of charged cosmic particles, which create in the lower atmosphere the nucleation particles for condensing water, i.e. cooling low-atmosphere clouds. In this paper the authors compare the flux N (in particles per minute) measured in the lower atmosphere (0.3-2.2 km) at middle northern latitudes with the global temperature anomalies dT: the measurements clearly show that an increase in N correlates with a decrease in dT. This is an observational justification of Svensmark's hypothesis:
As this and the next solar cycle are predicted to have very low activity, this observation is a second, independent prognosis of a coming cooling (you may want to look at my older presentation on this problem here).
I like this paper because it is so short and does not try to impress the reader with an avalanche of complicated and futile mathematics and/or statistics. The reasoning is crystal clear: both the spectral analysis and the expected rise in the flux of charged particles suggest global cooling over the next ~30 years!
In the first part of this comment on the Energiewende I showed that its primary goal of restricting CO2 emissions has not been attained.
In this second and last part I will concentrate on the costs of the Energiewende.
2. The costs of the Energiewende
Let us remember that the financial side of the Energiewende is a system of subsidies going in many directions: those who install solar PV or wind turbines (for instance) receive a subsidy for the installation costs; they are granted priority in feeding electricity into the grid, and they are paid for this feed-in a tariff largely in excess of the market price. The McKinsey report writes: "Die aktuell vorliegenden Zahlen belegen, dass die bisherigen Erfolge der Energiewende überwiegend durch teure Subventionen erkauft worden sind" (the currently available figures prove that the successes of the Energiewende so far have mostly been bought through expensive subsidies).
The costs for the individual household rise continuously, as shown by the next graph:
The increase with respect to the 2010 situation is a mind-blowing factor of 3.35; as the kWh price will probably reach or exceed 0.30 Euro in 2017, most experts agree that the yearly supplementary cost per 4-person household will be higher than 1400 Euro (which has to be compared to the price of one 1 € ice cone per month/person that minister Jürgen Trittin announced in 2004!).
The subsidization has transformed a free market into a planned economy, with many unintended nefarious consequences:
At certain times the combined solar + wind production is excessive and leads to negative prices: the big electricity companies must pay their (foreign) clients to accept the surplus electricity:
The "redispatch" interventions to stabilize the grid and avoid its collapse rose by a factor of 10 from 2010 to 2015 (link); during 2013-2015 the costs more than doubled, beating even the doubling rate of "Moore's law" (link):
Actually, if one includes not only the costs of unneeded electricity, but also those of the redispatch (changing the provenance of the electrical energy) and the mandatory reserve capacity, we are close to a doubling over the years 2011, 2013, 2014 and 2015, as shown by the "Insgesamt" (overall) total in million Euro (link):
The McKinsey report sees grid management costs quadrupling during the coming years, rising to over 4 billion Euros (4*10^9) per year. A recent article in The Economist is titled "Wind and solar power are disrupting electricity systems". Three main problems are cited there: the subsidies, the intermittency of wind and solar, and finally their very low marginal production costs, which make traditional power stations (urgently needed for base load, backup and grid stabilization) uneconomic: without state subsidies nobody will build these power stations, so the circle of state planning (as we know it from Soviet times) is closed.
3. The job problem
Renewables have always been hyped for their job potential, but the reality in Germany is quite different: 2016 was the fourth consecutive year of falling job numbers in the renewables industry, and if this trend continues the aim of 322000 "green" jobs will not be attainable by 2020. Equally disquieting is that 2016 is the first year showing a decline in jobs in the electricity-hungry industries. An older (2011) AEI report concludes that green jobs only displace traditional ones, and that in Spain each green megawatt installed destroyed 5.28 jobs. The whole Energiewende seems to depend on its foundation of big subsidies (direct or indirect) and state planning and steering. In a free market, the rise of "renewable" electricity would not be nil, but it would be much slower. The subsidies have spoiled huge parts of the industry, which now sees these subsidies, paid by all citizens, as its due.
Fritz Vahrenholt has published a paper at the GWPF titled "Germany's Energiewende: a disaster in the making". He may well be right.
A new report from McKinsey on Germany's Energiewende (= energy transition policy) has been published in the series "Energiewende-Index". This very transparent and unemotional report makes for good reading: the main lesson is that the costs of the Energiewende (which has driven German household electricity prices 47.3% above the EU average) will continue to rise, and that the political decision-makers seem to ignore the future financial burden.
In this blog, I will comment using only numbers from well-known institutions (such as the Dutch PBL report "Trends in global CO2 emissions 2016", Fraunhofer ISE, Agora Energiewende etc.), and let these numbers speak. Let me just give my personal position on renewable energies: in my opinion, every country should diversify its energy sources as much as possible, which means that wind and solar should not be brushed aside. But the importance of having reliable and affordable continuous electricity cannot be ignored: intermittent sources such as solar and wind should not be presented as the sole environmentally acceptable providers, as the last dozen years have clearly shown that this intermittency and the absence of realistic electricity storage are at the root of many tough problems. The German green Zeitgeist (which seems to drive many EU regulations) is clearly blind in both eyes concerning these problems; condemning nuclear energy in all its current and upcoming forms as unacceptable dramatically increases the problems.
1. The avoidance of CO2 emissions
The Energiewende was first positioned as a measure to avoid and diminish the CO2 emissions caused by producing electricity from fossil fuels, transportation and industrial manufacturing. After the Fukushima tsunami (March 2011), the "Atomausstieg" (nuclear exit) was added to this political foundation. Heavy subsidies have been poured on solar PV and wind energy facilities, pushing the installed capacities of these 2 providers up to 91 GW out of a total installed generation capacity of 196 GW (numbers rounded), as shown in this edited plot from Fraunhofer ISE:
Intermittent sources thus represent 91/196*100 ≈ 46% of the installed capacity in 2016; in January they delivered 23%, in August 25%, of the total installed generating capacity. So we can conclude that, summed together, these subsidized sources with feed-in priority contribute at about half of their installed capacity. The problem lies in the word "summed": from the standpoint of emissions the sum may be a useful metric, but in real life it is the instantaneously available power that counts. The two following graphs from the Agora Energiewende report 2016 show the situation during the first and third quarters; I highlight the days with minimum and maximum (solar+wind) contribution with yellow rectangles.
Without the base load of CO2 emitters like biomass and coal, the lights would have been out many times!
Let us now look at the CO2 (or better the equivalent CO2 (CO2eq)) balance for the last years, compare several countries with Germany, and see if the Energiewende has been a successful CO2 lowering policy.
Our next graph shows how the CO2 emissions varied from 1990 to 2015 (I added the zoomed insets):
The most interesting conclusion from this graph is that Germany's total CO2 output did not diminish much between 2005 and 2015 (the Energiewende started in 2001), contrary to the USA, which had no comparable policy. The same picture shows up in the "per capita" emissions:
Compared to the non-"Energiewende" countries France and the USA, Germany again fares very poorly. The next graph shows the trends between 2002 and 2015 more precisely:
I computed the trend lines for Germany (magenta) and France (black): the equations show that France is twice as successful as Germany in lowering its CO2 emissions, without any comparable and extremely costly Energiewende policy. Agora concedes this in its report, writing that "… Germany's total greenhouse emissions have risen once again"!
And the following graph shows that the share of fossil fuels has remained constant since 2000:
Conclusion: the Energiewende has not achieved its primary goal of greatly lowering CO2 emissions!
(to be followed by part 2)
There is a very good comment by Donald Kasper at the Wattsupwiththat climate blog (15th Feb 2017). He writes that all social issues have a peak of popularity, but that the time of the rise might not equal the time of the decline. Climate and global warming alarm has now been with us for at least 30 years, and the continuous rise in attention and funding that this problem receives differs quite a bit between regions of the world. In the USA, the climate problem is clearly not the most burning one for the general population, but in Europe the climate-angst train does not yet seem to slow down.
I remember at least 3 big environmental scares that were very popular in the past and initially seemed destined to become eternal. The pilfering and exhaustion of the Earth's resources and over-population (the Club of Rome, the "Population Bomb" book published by the Ehrlich couple in 1968) had, in hindsight, an attention-grabbing duration of possibly 10-15 years. Look here for a good New York Times article and video on "The Unrealized Horrors of Population Explosion". As none of these prophecies, neither the material exhaustion predicted by the Club of Rome nor the rapid famines predicted by the two Ehrlichs, came true, the time was ripe for another scare.
During the second half of the 80's, the danger of ambient radon, the ubiquitous natural radioactive gas, was pushed to new heights. Many profited from this new angst, mostly research labs and companies that were quick to sell radon mitigation appliances to worried house owners (usually a simple fan plus some sealing of the cellar floor). A gas that in some rare instances could be a problem was pushed by the media and politicians (as always wanting to show that they care about their voters) into a permanent and extreme danger, allegedly causing a high percentage of lung cancers (a conclusion extrapolated from extremely high radon situations to very low ones, according to the probably wrong Linear No-Threshold (LNT) theory still fashionable among many anti-nuclear activists today). New legal maximum concentrations were defined (for instance 300 Bq/m3 in Luxembourg); in the USA a radon certificate had to be added when a house was sold; and then the problem vanished from the media and from overall attention.
Why did the radon angst disappear? Because the new danger of global warming, caused by another "pernicious gas", CO2, was ramped up. The avoidance (mitigation) of high radon levels was not too difficult a task; but avoiding CO2, a natural constituent of the atmosphere and an inevitable by-product of fossil energy use, is quite a different beast. No wonder that climate change (which replaced global warming when it became clear that there had not been much warming for the last 20 years) rapidly became everyone's poster child: as with radon, the new danger assured heavy funding of university research, the possibility to produce electricity by non-carbon-emitting procedures pushed many parts of industry into renewable wind and solar devices, and on top of that the very influential environmental movements had a topic that would predictably have a much longer life-span than the previous scares. As an additional pusher we can see the disappearance of the Cold War worries and the slow-down of traditional religious feelings, which were, at least in many parts of the Western World, replaced by the new "quasi-religion" of environmentalism.
All these scares have some solid foundations: a future world population of 11 billion would be unmanageable if technology and science stood still, so the 1968 angst (like the much earlier prophecies of Malthus) seems quite reasonable in an unchanging world. But this has not happened: the green revolution (which owes so much to Norman Borlaug) increased agricultural yields tremendously without destroying the soils and "nature"; in spite of many ongoing (civil) wars, political unrest, deep corruption etc., poverty has decreased and access to education has made quite a jump. When the "Population Bomb" was written (1968) the world population was about 3.6 billion; today, close to 50 years later, it has doubled to 7.2 billion.
The big environmental scares all ignore humanity's tremendous potential for innovation. Despite the horrors of wars, environmental damage and political unrest in many parts of the world, the overall picture of the past 50 years commands an optimistic point of view, not one of fear and depression. Will climate angst follow the past pattern? What makes climate change different is that, depending on your view, it is essentially a consequence of human evolution and progress, both of which were and are heavily tied to energy availability and usage. All the previous scares found at least a partial solution through human progress (remember that, as a general rule, the most industrialized countries are also the most eco-conscious ones!), but this one demands a big change in thinking. If we want to avoid pumping more and more CO2 into the atmosphere (my personal opinion remains that the dangerous consequences will be small), and we have installed solar panels and wind turbines everywhere without seriously solving the intermittency problem of these renewable energies, why do we not see the elephant in the room: nuclear energy has all the potential needed for abundant and cheap carbon-free energy, and many ways exist, different from those used in the past, to use nuclear (or fusion) energy in a low-danger manner, without a legacy of extremely long-lived radioactive waste.
In my comments on the TIR Jeremy Rifkin report I repeated many times that, in my opinion, this report suggests a devilishly complex future, with millions of digital gadgets interconnected and working to control and manage nearly every aspect of our lives, among them the electrical grid. One of the most obvious problems is the vulnerability of the coming "smart" electrical grid and its feeders to malicious attacks. In recent years we have seen big attacks deployed rather successfully: in December 2015, the Ukrainian power grid was brought down (read this report), possibly by Russian hackers; USA Today reported (link) that the USA grid is under nearly continuous attack. The website of "Transmission & Distribution World" wrote in April 2016 of a dramatic rise in successful cyber attacks (link). 86% of the security experts at an RSA conference said that cyber attacks could cause physical damage to the infrastructure.
A group of 4 senators introduced a bill in January 2016 suggesting going "retro" in selected components of the electrical grid to isolate it from malicious attacks (link).
The Center for Strategic and International Studies (CSIS) published in October an extremely interesting article by Michael Assante et al. titled "The Case for Simplicity in Energy Infrastructure". The text clashes head-on with the naive all-digital optimism of the Rifkin paper. Let me just cite a few sentences:
- "Mix in a whorl of oversight organizations, legislation, regulatory frameworks, standards, and continually changing standards, and we've baked ourselves a layer cake of complexity and abstraction that no one in their right mind would want"
- “Complexity is not a desirable attribute”
- “There is a point of diminishing returns where more energy is required to sustain the complexity than the complex system provides in benefits”
They do not suggest un-digitizing everything (which clearly is not feasible), but rather introducing "attack surface interruption zones" which use non-digital, analog technologies to block a cyber attack. So instead of infiltrating every component of the grid with digitalization, well-chosen islands using "retro" technologies (such as analog relays) and human operators would avoid a breakdown of the whole attacked grid.
The best strategy in the search for resilience and stability of the electrical grid against attacks might lie in this sentence: "Don't over-digitize!"
This last part holds some musings on Ewringmann's Diesel comments, and gives a general conclusion on the report.
8. Diesel bashing
The green movements have taken delight in a new fad: Diesel bashing. All evils, all bad pollution problems are blamed on the Diesel engines that foolish politicians favor (at least in the EU) with lower taxes. These people do not remember or consider three important facts:
- Diesel engines have a much higher efficiency than gasoline engines (about 35% versus 25%), so for a given amount of work they consume less fuel and emit less CO2
- it was green policy to push the Diesel engine as a solution for lower CO2 emissions
- high-temperature/high-pressure thermal engines emit more NOx and more nano-particles than lower-pressure ones; as the newest gasoline direct injection (GDI) engines also operate at higher pressures, they have the same problems with these emissions as the Diesels (plus some 20 times higher CO emissions), but they also have higher power and better mileage than the traditional naturally aspirated models. See this figure from Delphi:
There are no big differences in emissions between such a direct-injection gasoline car and an EU-6 Diesel car; GDI cars may even exceed the EU6 norm (link of picture). If efficiency is what counts, the Diesel engine remains the king, as it has been since its invention by Rudolf Diesel. For a more sober comment, read this comment "Do diesels have a future?".
Now Ewringmann writes that about 85% of the external costs of the sold fuel come from Diesel cars or trucks. As Diesel fuel accounts for 83% (inland) and 84% (export) of the quantities of fuel sold, this is not a rocket-science conclusion, but self-evident. If all Diesels were forcibly changed to gasoline, 100% of the cost would come from gasoline! When the author asks for an "Überdenken der Dieselpolitik" (a rethink of Diesel policy), Diesel bashing won't help. He should say clearly what he thinks: lower (strangle?) private and commercial road transport through political regulation!
9. A short conclusion
I ended part 1 of this discussion with many citations where, in my opinion, Ewringmann was right. All these citations share one common argument: even if Luxembourg makes its fuels much more expensive so that the exported quantities are drastically lowered, the environmental impact outside Luxembourg will remain practically the same. Luxembourg is not the culprit for the external costs arising abroad!
Ewringmann seems to be quite sincere in evaluating the benefits, but he really jumps the rails when calculating the costs. Applying Luxembourgish cost factors to fuel burnt outside the country is wrong, and this single "error" inflates the costs. Attaching fantasy prices to CO2 emissions is equally wrong, even if some German institutes and environmental organizations favour extremely inflated numbers. CO2 has had a clear price in Europe for many years; that price is low and is not increasing as predicted. So this price of 10 €/ton should be used, and not an amount 10 times higher!
The report is easy to read, but it shows signs of piece-wise writing (and possibly multiple authors), as Ewringmann acknowledges in his foreword. Childish errors like the rounding problems creeping up in some tables should be taboo in a publication from a serious institute.
Does the report help in making political decisions? I remain dubious, as the main lesson I take away after several careful readings is that the best option is to let things evolve without interference. As the last 3 years show a "natural" lowering trend in fuel exports, let the economy decide. A "natural" weaning will probably compensate for the tax losses and avoid killing numerous jobs and economically devastating the commercially successful border regions through ill-conceived, ideology-driven politics.
(end of part 4, the last part)
In this 3rd part I will make some comments on rounding errors and recalculate the external costs that the author puts at 3.5 billion Euros.
6. Rounding errors
A really annoying problem is that the author struggles with rounding and sums of rounded numbers. In many places the sums given in a table differ from the correct number by 1. The problem probably comes from summing unrounded numbers in an Excel sheet and then giving the rounded numbers in the tables, without checking whether the sum of these rounded numbers equals the rounded sum in the Excel sheet. This is a very basic error that should not appear in a (probably expensive!) report written by a well-known institute.
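The mechanism is easy to reproduce: rounding after summing does not in general give the same result as summing the rounded values. A hypothetical example (not the report's actual figures):

```python
# Hypothetical unrounded column values, as they might sit in an Excel sheet
values = [12.4, 15.4, 13.4]

rounded_sum = round(sum(values))                 # 12.4 + 15.4 + 13.4 = 41.2 -> 41
sum_of_rounded = sum(round(v) for v in values)   # 12 + 15 + 13 = 40

print(rounded_sum, sum_of_rounded)               # off by one, exactly the report's symptom
```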
As an example, let us look at Table 4 on page 26: all the sums with a blue strike-out are wrong; the sum "440" in the line corresponding to 2000 is wrong in two ways: adding the numbers as printed gives 341 (instead of 440), but the number 96 for the "Benzin" (gasoline) inland consumption should probably be 196 (which makes the correct sum 441):
7. Re-checking the scary 3.5 billion cost number
On page 49 we find the sentence repeated by all the news articles I have read: "ist mit externen Umwelt- und Gesundheitskosten von insgesamt 3.5 Mrd. Euro pro Jahr verbunden" (is associated with external environmental and health costs of 3.5 billion Euro per year). In other words, the author says that the fuel sold in Luxembourg in 2012 (the year used for the calculations) causes external costs for the environment and health of 3.5 billion Euro. The positive impact on the GNP is 1.8 + 0.26 = 2.06 billion €.
The costs attributable to the fuel sold are essentially the costs related to the emissions of pollutants and those from CO2 emissions. Road accidents will still happen when all vehicles run on electricity, so they should not appear in this calculation!
The numbers for the emission costs relate to 2008 (there is real confusion in the report regarding the years corresponding to the given numbers: they vary between 2008 and 2010, with 2012 finally being used in the status-quo discussion). The inland fuel usages (in kt) in 2008 and 2012 were 573 and 574, and the exported quantities 1610 and 1586. As these quantities do not differ by more than 5%, we will use the emission costs given in part 2 for the calculation of the total costs in 2012. The price of 1 ton of CO2 is taken as 10 €, a number that many experts estimate will remain the EU emission price until 2020 (see for instance here). The up to 50 times higher number given by Ewringmann on page 32 should be considered green fantasy.
The following picture shows the costs for inland, exports and the grand total:
So even if we accept that all costs (inland and export) should be summed (I repeat: I do not agree!), the range goes from 264 to 661 million Euro: with respect to the scary 3.5 billion amount this is a staggering difference of nearly one order of magnitude for the lower value, and of a factor of 5 for the higher value.
The 3.5 billion Euro number is pure and extreme guesswork, a fantasy number rooted in unrealistic, extreme CO2 costs and in a faulty calculation of emission costs.
(end of part 3)
(to be continued with last part 4 )
In this second part I will write about the different costs calculated in the report (which usually speaks of "external costs"). I will start with the costs associated with traffic inside Luxembourg's borders, then comment on the so-called global "climate" costs, and finally look carefully at the total costs caused by the emissions (= pollutants) attributed to the fuel bought at Luxembourg's gas stations.
3. Costs from the traffic inside Luxembourg
The following table (an edited version of the original Tabelle 3, p. 23) shows that the average external cost per km driven is about 0.11 €:
The cost factors in column 3 are Umweltbundesamt (UBA) numbers, and include emissions of GHG and pollutants, procuring the fuel (but not the price of the fuel itself), damage to the environment and health, and the costs of accidents. I will not discuss the validity of these numbers (which could be exaggerated, knowing UBA’s tendency toward dramatization). A 2008 report from the University of San Francisco comes to a similar conclusion, giving for instance a cost of 0.1146 US$ per mile driven in a Honda Accord, which is close to 0.0703 €/km, the number the UBA uses for gasoline cars (lines “Benzinfz.” in the table). The conclusion is that 7.413 billion km of traffic have external costs of 0.784 billion € (I use the term “billion” in the US sense: 1 billion = 1 Milliarde = 10^9). In my opinion, calculating the costs from the actually driven kilometers is the correct way to do it. The costs per km are IMHO surprisingly low, similar to the fuel costs per km of a mid-size car.
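As a cross-check, dividing the total external costs by the kilometers driven recovers the per-km average quoted above:

```python
# Back-of-envelope check of the report's in-border external-cost figures.
km_driven = 7.413e9          # total km driven inside Luxembourg (Tabelle 3)
external_costs = 0.784e9     # total external costs in €
avg_cost_per_km = external_costs / km_driven
print(f"{avg_cost_per_km:.3f} €/km")  # 0.106 €/km, i.e. the ~0.11 €/km quoted above
```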
This result could be the final word of the report: on the positive side of the balance are 2440 jobs and a contribution to Luxembourg’s GNP (gross national product) of approx. 1600 million € in 2012 (p. 47); on the negative side are costs of 784 million € plus 28 million € that must be paid into the Kyoto fund. This gives a positive balance of 788 million € plus 2440 jobs.
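In numbers, this simple balance reads (all in million €, figures as quoted from the report):

```python
gnp_contribution = 1600   # contribution to GNP in 2012 (p. 47)
external_costs = 784      # in-border external traffic costs
kyoto_fund = 28           # payment into the Kyoto fund
balance = gnp_contribution - external_costs - kyoto_fund
print(f"balance: +{balance} million €, plus 2440 jobs")  # +788 million €
```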
Had the report respected its mandate, it should have concluded with the advice to leave everything as it is and not to try to destroy a very positive feature of Luxembourg’s economy through so-called “environmental” political decisions. But that is not the case, as will be shown in the next two discussion points.
4. Costs to the climate (“Klimafolgenkosten”)
In chapter 3.1 the author tries to evaluate the climate-relevant costs of all the fuel quantities sold. This is a clear example of shifting the focus from examining in-border costs to “global” costs, of which only a tiny part could eventually be attributed to Luxembourg. Actually, even ignoring this questionable shift of focus, this whole chapter should be scrapped, as the uncertainties in putting numbers on the “climate” or “CO2” costs are so enormous as to make any calculated result absolutely meaningless. On page 34, Ewringmann acknowledges that these attributed climate costs lie, depending on the methodology used, anywhere between 16 and 3000 million Euro (yes: 3000 million, this is not a typo!). The only lesson taught by such an extreme range is that the “science” used to evaluate these climate costs is unusable, and should be considered totally immature and unsettled.
So I will not write more on chapter 3.1, but pass on to the next chapter 3.2 on the costs of emitted pollutants.
5. Costs of traffic induced emissions.
Here the author again does not refrain from calculating the emission costs from both the in-border fuel and the exported fuel, a decision I strongly disagree with. In my opinion he should have restricted his calculations to the first category and ignored the costs of pollution happening outside the country.
The author first gives a table showing that the cost of a pollutant is not a fixed number, but varies with the country where the pollution occurs. As these costs are modulated by population density, mean income etc., they are nearly twice as high for Luxembourg as for the EU average:
This table of the costs in €/ton (metric ton) does not include PM2.5 fine particles, whose costs are assumed to lie anywhere between 81400 and 392600 €/t. The author gives a total of 308 tons of PM2.5 emitted from all the fuel sold; with 75% exported, this amounts to an inland cost in the range of [6 – 30 million €] and an out-of-border cost in the range of [18 – 90 million €].
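These ranges follow directly from the 308 t total, the 75% export share and the UBA cost bounds:

```python
total_pm25_t = 308                      # t of PM2.5 from all fuel sold
export_share = 0.75
cost_low, cost_high = 81_400, 392_600   # UBA unit-cost bounds in €/t

inland_t = total_pm25_t * (1 - export_share)   # 77 t stay inland
export_t = total_pm25_t * export_share         # 231 t go abroad
print(f"inland: {inland_t * cost_low / 1e6:.1f} - {inland_t * cost_high / 1e6:.1f} million €")  # 6.3 - 30.2
print(f"export: {export_t * cost_low / 1e6:.1f} - {export_t * cost_high / 1e6:.1f} million €")  # 18.8 - 90.7
```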
Now comes what I consider a serious error: in calculating the total costs, the author simply multiplies the total pollutant quantities by the unit costs applicable to Luxembourg and finds a range of [295 – 817 million €], with rounded numbers:
What he should have done is to multiply the in-border emissions by the Luxembourg costs and the out-of-border emissions by the EU27 average (as precise numbers and quantities for the 3 neighbors Germany, Belgium and France are probably not available); this brings the costs down to a range of [199 – 596 million €]. The next scheme shows the correct calculation; the two arrows per pollutant give the low and high ends of the cost range:
The number of 817 million is what the media focused on; as shown, there are at least two good reasons why this number is wrong:
- the calculation method is wrong
- the range of the UBA unit costs for PM2.5 (the lower costs are for out-of-town emissions) is unbelievably large
And finally, once again, I completely disagree with adding the costs related to exported emissions to a total which is meaningless in the frame of the mandated report (and should be given as a curiosity at most).
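For clarity, here is a minimal sketch of the calculation method I consider correct. The function is mine, and the EU27 unit cost in the example is an assumed placeholder, not a number from the report (its unit-cost table is not reproduced here):

```python
def corrected_cost(total_emissions_t, lux_cost_per_t, eu27_cost_per_t,
                   export_share=0.75):
    """Price the in-border share of the emissions at the Luxembourg unit
    costs and the exported share at the (lower) EU27 average, instead of
    pricing everything at the Luxembourg costs."""
    inland = total_emissions_t * (1 - export_share) * lux_cost_per_t
    abroad = total_emissions_t * export_share * eu27_cost_per_t
    return inland + abroad

# Illustration with the PM2.5 quantity from above and an ASSUMED EU27
# unit cost of 45000 €/t:
print(f"{corrected_cost(308, 81_400, 45_000) / 1e6:.1f} million €")
```

Summing such per-pollutant results, instead of pricing everything at the Luxembourg costs, is what brings the total down from [295 – 817] to [199 – 596] million €.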
After correctly writing (p. 40) that “…ist kein Anlass, Luxemburg pauschal als Verursacher dieser Kosten anzusehen” [there is no reason to regard Luxembourg wholesale as the originator of these costs], the author by a twist of logic concludes that “Dennoch ist es durchaus plausibel und gerechtfertigt, Luxemburg…die Bilanz der Gesamtexternalitäten vorzuhalten” [nevertheless it is quite plausible and justified to confront Luxembourg with…the balance of the total externalities]. No, it is not!
If we include the costs of the inland emissions in the previous balance, we still find a largely positive balance of [582 … 718] million € plus 2440 jobs. But this calculation is moot, as the report’s mission was to focus on the costs of the pump tourism (in the larger sense of the total exported fuel); there is absolutely no reason to include the inland pollutant costs in this analysis, as these costs are almost independent of, and not caused by, the pump tourism.
(end of part 2)
(to be continued with part 3)
A remark added the 23-Dec-2016:
a. The WHO (World Health Organization) sets the standards (or guidelines) for air pollution. Here are the guidelines for the annual or other time-interval mean concentrations:
NO2: 40 ug/m3 as the annual mean
SO2: 20 ug/m3 as the maximum 24h mean
O3 : 100 ug/m3 as the maximum 8h mean
It should be noted that in 2014, 92% of the world population lived in places that did not meet the WHO standards. This is a clear sign that at least some of these guidelines might be over the top. An example can be found in this report on O3 and PM2.5 pollution in the industry-free Great Smoky Mountains National Park: in the 7 years from 2008 to 2014, the yearly mean of the 8h averages exceeded 40 ppbV (= 86 ug/m3) during 3 years. The EPA wants to limit O3 exposure to 60-70 ppbV, close to the natural background in the GSMNP. In 2016, the O3 limit was exceeded on several days in 8 of the 24 US National Parks; Sequoia NP holds the record with 92 days of exceedance during the 2016 open season (link).
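As an aside, the ppbV to ug/m3 conversion used above assumes the ideal-gas molar volume at 0 °C and 1 atm (22.414 L/mol); at 25 °C the divisor would be 24.45 L/mol, and 40 ppbV of ozone would correspond to only about 79 ug/m3:

```python
# Convert a mixing ratio in ppbV to a mass concentration in ug/m3
# (ideal gas; molar volume defaults to 22.414 L/mol at 0 °C, 1 atm).
def ppb_to_ugm3(ppb, molar_mass_g_per_mol, molar_volume_l=22.414):
    return ppb * molar_mass_g_per_mol / molar_volume_l

print(f"{ppb_to_ugm3(40, 48.0):.0f} ug/m3")  # 40 ppbV O3 (M = 48 g/mol) -> 86 ug/m3
```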
b. NO2 annual levels:
The following map from the EEA shows the 2013 mean annual NO2 levels: clearly, large cities and industrial regions usually exceed the 40 ug/m3 limit (red and brown points).
Dr. Dieter Ewringmann from the green-leaning FiFo Institute has delivered his long-awaited report on what the Luxembourgers call “pump tourism”. Luxembourg sits at the crossroads of important North-South and, to a lesser degree, East-West roads, and usually has lower fuel prices than its neighbors. So, not astonishingly, trucks and cars passing through Luxembourg fill their tanks here; on top of that, about 150000 people commute each day from abroad to their Luxembourg workplace and, as is to be expected, fill up here. Finally, a relatively small number of people living abroad in the border region drive to Luxembourg’s fuel stations to buy fuel (and cigarettes, coffee and alcoholic beverages). All this means that the major part of the fuel sold in Luxembourg is exported, as shown in the following picture:
The expression “pump tourism” is usually applied to the 75% of the fuel that is exported, even if strictly speaking it should apply only to the 11%. The problem with this situation is the accounting scheme adopted in Brussels and Kyoto, where the CO2 emissions of a country are calculated from the fuel quantities sold in that country, independently of the fact that a more or less large part is directly exported. I have always held that this is an idiotic accounting convention, one that is not applied to other situations of trans-border commerce. For instance, the VAT on e-commerce is now (rightfully, in my opinion) calculated and paid at the buyer’s location, a change that hurt Luxembourg badly, as it lost, for instance, the taxes formerly paid by Amazon in Luxembourg.
Our Ministry of Environment and Sustainable Development mandated Ewringmann to quantify the costs of this fuel situation, and to give the costs and benefits of changing it through political decisions (which in practice means decreasing the price differential with Luxembourg’s neighbors).
The Ewringmann report stands and falls with the notion of “external costs”. He argues that all costs from selling fuel should be accounted for by the seller, a standpoint I cannot accept. In my opinion, the costs should be applied and accounted for at the place where they are caused by the consumption of the goods. As an example, nobody would think of attributing to Germany the diabetes-risk costs of sugary pastries, or the costs implicit in textile articles, when German shops in Trier sell these items to the many Luxembourg customers taking them back home. So, as was to be expected, his external costs are huge, huge enough to dwarf the benefits. Our newspapers have reported mostly on this, as it makes for some good goose-bump emotions.
I read the full report carefully several times. The style and presentation are clear and easy to understand; I found what I think is an important error in the calculation of the cost of pollutants. As the report was mandated by a green ministry, there are many pages that are superfluous IMHO. Nevertheless, I would suggest reading the full report, and not the short version, which hides many important reflections.
I will give full citations as they appear in the report, as most readers will probably be fluent in German.
2. Shutting down? Examples where Ewringmann is right.
The report contains many places where the author insists on the consequences of shutting down the fuel exports.
- wenn Luxemburg seine Grenzen für ausländische Fahrer schliessen… würde, so träte eine echte Verringerung der externen Gesamteffekte… nur in dem Masse ein, in dem die bisher in Luxemburg tankenden Autofahrer künftig absolut weniger tanken und weniger Kilometer zurücklegen würden (p. 9). [If Luxembourg closed its borders to foreign drivers…, a real reduction of the total external effects… would only occur to the extent that the drivers who currently fill up in Luxembourg would in future buy absolutely less fuel and drive fewer kilometers.]
- …die gewonnenen Ergebnisse sind vorsichtig zu interpretieren. Luxemburg kann nicht als Verursacher dieser negativen Gesamteffekte angesehen werden (p. 20). [The results obtained must be interpreted with caution. Luxembourg cannot be regarded as the originator of these negative total effects.]
- Wenn dieselbe Treibstoffmenge im Ausland getankt würde, liesse sich weitgehend dieselbe Summe an externen Kosten berechnen (p. 28). [If the same quantity of fuel were bought abroad, largely the same sum of external costs could be calculated.]
- Die im Ausland anfallenden Kosten des Luxemburger Treibstoffverkaufs entstehen zwar durch Autofahrer, die in Luxemburg tanken, sie würden aber zu einem grossen Teil auch dann entstehen, wenn Luxemburg seine Tankstellen schliessen, für Ausländer sperren oder durch extrem hohe Steuersätze für Ausländer…unattraktiv machen würde (p. 40). [The costs of Luxembourg’s fuel sales arising abroad are indeed caused by drivers who fill up in Luxembourg, but a large part of them would also arise if Luxembourg closed its fuel stations, barred them to foreigners, or made them unattractive for foreigners…through extremely high tax rates.]
- Ebenso falsch ist es, den totalen Ausstieg aus dem Treibstoffexport zu fordern und damit die Erwartung zu verbinden, alle am Export hängenden externen Kosten “vernichtet” zu haben…Solange (die ausländischen Fahrer) ihren Gesamtverbrauch nicht einschränken, spielt der Ort des Tankvorgangs keine Rolle (p. 78). [It is equally wrong to demand a total exit from fuel exports and to attach to it the expectation of having “annihilated” all the external costs tied to the exports…As long as (the foreign drivers) do not reduce their total consumption, the place where they fill up plays no role.]
- …ein im Regierungsprogramm erwähnter Ausstieg aus dem reinen Tanktourismus (führt) schon rechnerisch nicht zu einem wirklich relevanten Abbau der negativen Umwelt- und Gesundheitseffekten…(p. 79). […an exit from pure pump tourism, as mentioned in the government program, does not lead, even arithmetically, to a really relevant reduction of the negative environmental and health effects…]
- Beim Transitverkehr per LKW ist dagegen zu erwarten, dass trotz der Verlagerung der Tankvorgänge ins Ausland in starkem Masse die Luxemburger Autobahnen und Strassen weiterhin benutzt…werden (p. 79). [For transit traffic by truck, in contrast, it is to be expected that Luxembourg’s motorways and roads will continue to be used to a large degree, despite the shift of refueling abroad.]
- …die im Inland anfallenden externen Kosten (würden) zu 58% erhalten (bleiben) (p. 80). […58% of the external costs arising inland would remain.]
These are clear, intelligent and refreshingly sincere remarks that seem to be ignored by agenda-driven commentators.
In the next part 2, I will comment on the so-called “climate costs” and on the pollution-attributed costs.
(end of part 1)