This blog started on 28 September 2008 as a quick communication tool for exchanging information, comments, thoughts, admonitions, compliments etc. related to http://meteo.lcd.lu, the Global Warming Sceptic pages, and environmental policy subjects.
A couple of days ago I finished the annual computation of climate trends from the measurements at meteoLCD, Diekirch (Luxembourg). As usual, the numbers show a much less spectacular evolution than the emotional media reports suggest.
Let's start with the ambient air temperature:
The thermometers have not been moved since 2002: the calculated blue regression line for 2002 to 2015 shows no warming, but rather a very small cooling!
A very similar picture is given by the temperature data of our national meteorological station at the Findel airport. The next graph was made using the homogenized data of NASA’s Gistemp:
Here the cooling rate for the 2002 to 2015 period is -0.0058 °C/year, quite negligible!
Have you heard the refrain "there are no more winters"? Indeed, this ongoing 2016 winter really seems absent, but the overall picture for 1998-2015 is a remarkable cooling:
This plot of the December-January-February winter periods shows a visible cooling of 0.6°C per decade at the Findel airport. If we restrict our analysis to the 2002-2015 period, there still is no serious warming to be seen at the Findel: just a meager +0.08°C warming per decade, very close to zero.
Our Diekirch data for the 2002-2015 winters also show only a modest winter warming of 0.5°C/decade and a good correlation with the NAO (North Atlantic Oscillation), a natural phenomenon which has a big influence on European climate: note how the trend lines of Diekirch, the Findel, Germany (DE) and the NAO index are very similar.
You might compare this with the January trend of the German weather stations given at the NoTricksZone blog!
Finally, let us finish this first part with a look at the DTR = daily temperature range = daily Tmax − daily Tmin. Global warming advocates point to this measure as a sign of an ongoing warming caused by human activity: such a warming should decrease the DTR, because it would warm the nights more than the afternoons. Here are our Diekirch data:
All trends are practically zero: +0.2 °C/decade from 1998 to 2015 and -0.1 °C/decade for 2002-2015.
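For readers who want to reproduce such decadal trend figures, here is a minimal sketch of the ordinary least-squares slope computation. The yearly temperatures below are invented placeholder values for illustration, not the actual meteoLCD or Findel series.

```python
# Sketch: computing a temperature trend in degC/decade by ordinary least squares.
# The annual means below are made-up illustrative numbers, NOT real station data.

def trend_per_decade(years, temps):
    """Slope of the least-squares line through (year, temp), in degC/decade."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(temps) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, temps))
    sxx = sum((x - mean_x) ** 2 for x in years)
    return 10.0 * sxy / sxx  # slope per year -> per decade

years = list(range(2002, 2016))            # 2002..2015
temps = [9.4, 9.6, 9.3, 9.5, 9.7, 9.5,     # hypothetical annual means, degC
         9.4, 9.6, 9.2, 9.8, 9.5, 9.3,
         9.6, 9.9]
print(f"trend: {trend_per_decade(years, temps):+.2f} degC/decade")
```

A trend near zero, as in the plots above, simply means the fitted slope is small compared with the year-to-year scatter.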
So let's finish this first part with a first conclusion: no big warming has been seen here in Luxembourg for at least 14 years!
to be continued…..
The current "consensus" theory on the impacts of higher atmospheric CO2 is that present ocean pH levels (about 7.8 to 8.1, with ample variations) will be lowered by the dissolved CO2, and that the oceans will "acidify" (a misnomer, as the ocean waters will still remain basic). The feared result of this "acidification" would be problems for shell-making creatures, as more "acid" water would dissolve their calcium carbonate shells.
As so often in climate debates, these simplistic and popular alarmist theories do not survive serious scientific research. The latest issue of SCIENCE Magazine (18 Dec. 2015, volume 350, issue 6267) has a research report by Sara Rivero-Calle from Johns Hopkins University and co-authors titled "Multidecadal increase in North Atlantic coccolithophores and the potential role of rising CO2". Coccolithophores are the main calcium-forming phytoplankton, unicellular algae surrounded by calcium plates (link to picture).
According to the authors, coccolithophores are a major source of the ocean's inorganic carbon and help to sink aggregates, thus increasing the storage of atmospheric carbon. The study uses 45 years of data, over which the method of collecting the phytoplankton with silk sieves has not changed. The results show that the coccolithophores profit from rising atmospheric CO2 levels. The next figure shows that the percentage of coccolithophores in the collected samples increases dramatically, from about 2 to 20%:
Their statistical analysis suggests that the causes of this increase are atmospheric CO2 levels and the AMO (Atlantic Multidecadal Oscillation):
The upper plot gives the above-mentioned probabilities of finding coccolithophores in the sample, the second the global atmospheric CO2 mixing ratio in ppmV, and the lower plot the AMO index. They conclude that their North Atlantic results may well represent a global trend, and that "contrary to the generalized assumption of negative effects of ocean acidification on calcifiers, coccolithophores may be capable of adapting to a high CO2 world".
Three professors from the Physikalisches Institut of the Universität Heidelberg published a very short article in February 2015, titled "Findet eine Energiewende statt?" ("Is an energy transition actually taking place?"). Contrary to what one usually reads (the Energiewende is usually discussed as if it concerned only electricity production), they examine how the share of fossil energies in total German energy consumption changed between 2000 and 2013. During that period, a solar photovoltaic capacity of nearly 39 GW and a wind-turbine capacity of approx. 34 GW were installed, quite impressive numbers (a large nuclear reactor has a capacity of about 1.5 GW)!
The cost of the PV installations alone installed from 2000 to 2012 is estimated at 108 billion Euro (including the amounts still to be paid for future feed-in). The total costs of the EEG (Erneuerbare-Energien-Gesetz) are staggering, and mostly unknown. One estimate by Hermann (2011), actually seen as much too low, gives a total of 350 billion Euro up to 2030; 500 billion will probably be exceeded.
What is the effect of this huge effort? The figure below shows the total energy consumption in Germany from 2000 to 2013. The left scale shows the percentages, with the situation in 2000 taken as 100%. I added the boxes and arrows.
We see that fossil fuels supplied about 84% of total energy in 2000 and still at least 80% in 2013 relative to the 2000 reference; measured against the slightly lower total consumption of 2013, the share is actually 90%. So in 14 years of extraordinary expansion of renewables there is at best a minuscule reduction of the total amount of fossil fuels used for heating, driving, industrial processes and electricity production (and even an increase in the 2013 percentage).
The professors rightly conclude (translated from the German): "The expansion of wind and solar energy to date is conspicuous, but what has been achieved so far is very modest, measured against the overall goal of an energy supply for our country that is largely independent of fossil fuels."
In the debate about wind and solar electricity, the amount of subsidies granted by political decision is a hot topic. Usually the promoters of this type of electricity (correctly) insist that non-renewable electricity production is also subsidized, and that objecting to subsidies for renewables is therefore a moot point.
In this blog I will use concrete data available from the EIA, as well as a few illustrations from the “At the crossroads: Climate and Energy” meeting by the Texas Policy Foundation. All data refer to the USA, 2013.
Let us start with the EIA table which gives the different subsidies for the fiscal year 2013, in millions of (2013) US$.
The "classical" electricity producers (coal, natural gas, petroleum and nuclear) receive 5081 million $, the renewables 15043 million $.
The next slide of a presentation by James M. Taylor from the Heartland Institute gives a good textual overview:
The last point is especially instructive: coal, NG, petroleum and nuclear receive 5081 million $ in subsidies but produce 86.6% of the total electricity; wind and solar receive 11264 million $ in subsidies and produce 4.4%! This means that wind & solar combined receive 43.6 times more subsidy per unit of electricity produced than the traditional producers!
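The 43.6 ratio can be re-derived from the quoted figures alone. The total US generation used below (about 4066 TWh net generation in 2013) is my own assumption for illustration; note that it cancels out of the ratio anyway.

```python
# Subsidy per unit of electricity, from the EIA/Heartland figures quoted above.
# total_twh (~4066 TWh net US generation in 2013) is an assumed value and
# cancels in the final ratio.

total_twh = 4066.0

conventional = 5081 / (0.866 * total_twh)    # million $ per TWh (coal/NG/oil/nuclear)
wind_solar   = 11264 / (0.044 * total_twh)   # million $ per TWh (wind + solar)

print(f"conventional: {conventional:.2f} M$/TWh")
print(f"wind+solar:   {wind_solar:.2f} M$/TWh")
print(f"ratio:        {wind_solar / conventional:.1f}x")
```

The printed ratio reproduces the 43.6 factor stated in the text.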
If you compare solar subsidies per 1000 MWh produced with those of the traditional producers, you get this telling pie chart (by Taylor):
The official EIA numbers in the table should close the debate: yes, renewables get really high subsidies for their very low overall contribution! The situation in Europe is probably similar to that in the USA, with renewable subsidies possibly even higher.
The Bundesnetzagentur (BNA) has published a guide explaining the future taxes that must be paid by individual producers and consumers of electrical energy. This provisional "Leitfaden zur Eigenversorgung" (guide to self-supply) is devilishly complicated, as many combinations of production, consumption and storage are possible.
1. The big principle
In the future every consumer will have to pay an EEG tax, even if he consumes the electricity that he produces with his own roof-mounted PV panels. Until now he paid about 20 (Euro) cents per kWh for his own electricity, compared with the roughly 30 cents that the ordinary consumer has to pay. This will change in the future, in three increasing steps: up to the end of 2015, in 2016, and from 2017 on. The following diagram gives the situation:
The red circle shows that (starting 2017) the producers of solar, wind or biomass electricity (EE) and those of high-efficiency combined heat and power installations (KWK) will only have to pay 40% of the EEG tax, all others the full amount (100%).
2. The difficult problem of electricity storage
The new "Leitfaden" considers electricity storage installations both as consumers and as producers: as consumers they must pay the tax, and as producers they must pass it on to their clients.
Let us just consider the situation where a PV owner uses his own solar electricity, and adds battery storage to smooth-out the variable production:
Normally he would have to pay twice: first to store, and then to use! The Leitfaden remarks that this would be rather idiotic, so one of the two taxes vanishes. The 100% amount changes to 40% if the electricity is produced by renewables. This is the minimum amount that must always be paid!
3. The zero-tax exceptions
A couple of exceptions exist: the first is the “island” situation shown in the next figure:
Here you have a consumer whose house is completely cut off from the grid, and which must never (even for 15 minutes!) be reconnected to that grid. His electricity is exclusively renewable, and he may not top it up by other means (such as a diesel generator) when the quantity is insufficient. This consumer will pay no tax. But remember: a single kWh coming from outside during the year means he will be put back at the 40% tax level for the full year!
If this user has a one-way connection to the grid, so that he may deliver any excess renewable electricity to the grid (but never the other way round!) and does not ask for the usual renewable subsidy, the zero tax also applies.
Very small renewable installations ("de minimis" installations), with less than 10 kW capacity and less than 10 MWh annual energy production, will also be exempt from the tax.
The only storage system that will be treated leniently is the pass-through storage system: the owner takes electricity from the grid, stores it, and delivers the same quantity back later on. Actually he puts his storage batteries in the hands of an external provider who manages storage and retrieval. This owner will not have to pay for the electricity that merely transits through his batteries (difficult to see it otherwise!).
As said at the start, the "Leitfaden" depicts a devilishly complicated future world, and one where tricks and poor morality will blossom, as control and enforcement will also be far from easy (even with smart meters). I guess many PV owners will not take lightly having to pay higher taxes on the home-grown electricity they consume!
30 October 2015: added link to paper (see at the end)
I just read a short article by Steve Goreham titled “Did we really save the ozone layer?” (link).
After the Montreal Protocol most nations phased out their use of ozone-depleting substances (ODS, like freon), and for about 20 years there have been practically no human emissions left. Remember that these evil fluorocarbon molecules are seen as the culprits causing the yearly thinning of the ozone layer above Antarctica (the Ozone Hole). During October and November the total ozone column may plunge to 100 DU in that hole.
Has this phasing out of ODS fostered a closing of the ozone hole? Apparently not, as shown by this graph:
Since 1996, the area of the ozone hole has remained more or less at 20 to 25 million km2, while ODS consumption and emissions (red curve) fell to zero. Could it be that the whole theory (which earned Molina and Rowland their 1995 Nobel prize) is either bogus or at least incomplete? The ozone-destroying chemical reactions found by the two Nobelists certainly exist; there remains the nagging suspicion that other, possibly more important ozone-munching phenomena than the human ODS emissions might be at work.
How inconvenient that Nature so often refuses to obey our glorious models and the political decisions based on them!
30 October 2015:
See this paper by Gribble: (paywalled):
The diversity of naturally produced organohalogens.
“More than 3800 organohalogen compounds, mainly containing chlorine or bromine but a few with iodine and fluorine, are produced by living organisms or are formed during natural abiogenic processes, such as volcanoes, forest fires, and other geothermal processes. The oceans are the single largest source of biogenic organohalogens, which are biosynthesized by myriad seaweeds, sponges, corals, tunicates, bacteria, and other marine life. Terrestrial plants, fungi, lichen, bacteria, insects, some higher animals, and even humans also account for a diverse collection of organohalogens.”
In the discussions about climate change, one often gets the impression that the media and politicians aim for a stable climate (actually they mean "weather"), something like a paradise-like smooth state without any nasty, big and extreme variations and perturbations. The chaotic nature of our atmosphere should tell them that this can never be the case, and that battling to stop climate change is and will remain futile. Here I will write some words on the variability of the total ozone column (TOC). The atmosphere has a variable concentration of ozone (O3), that delicate gas that comes and goes according to solar conditions, temperature and the presence of precursor gases.
1. Ground ozone.
We find ozone everywhere, but two regions are the most important. The ground layer where we live and breathe can have O3 concentrations that climb up to 200 µg/m3 (about 100 ppb) during good (pre-)summer days. Many factors influence this concentration, the most important being (besides temperature) the availability of UVB radiation and of precursor gases. The most important precursors are the natural isoprenes emitted by trees and plants, and the NO2 which mostly comes from traffic-related emissions (other gases like industrial VOCs have a smaller impact). As vehicles also emit NO, which destroys ozone, we find very different nightly profiles in clean-air rural and high-traffic urban places. Look at the following picture, which gives the ground ozone levels measured at Bonnevoie (= Luxembourg-City) and Diekirch (= semi-rural) during the week ending 26th October 2015.
The blue rectangle shows the situation in the night at the start of the 22nd October 2015: over the same interval of about 6 hours, the ozone concentration at the city location (top) diminishes by 27 µg/m3 (about 13.5 ppb), whereas at Diekirch, with little nightly traffic, the fall is only 8 µg/m3 (4 ppb). The rapid decrease in Luxembourg-City is due to the emitted NO, which destroys the existing ground ozone, a removal that is much slower in Diekirch where little NO is emitted at night!
So looking at a daily ozone pattern tells you immediately if the location was urban or rural.
Every May/June, when ground ozone levels are on the rise, the Luxembourg environmental agency issues warnings, as it wrongly and stubbornly takes as reference the O3 levels at the Mont Saint Nicolas near Vianden, a very rural location with only minimal traffic but a very rich tree cover. The natural isoprenes, together with clear and non-polluted air (i.e. rich UVB irradiance), make the ozone levels at this location the highest in Luxembourg. This has nothing to do with noxious human activities, but is an entirely natural phenomenon.
As this blog is about variability, just look at the extreme swings in the ground ozone concentration: the O3 levels are never constant, but vary from close to zero in the morning to their late-afternoon peak.
2. Total ozone column and UVB irradiance.
The major part of the ozone is located in the stratosphere, between 15 and 50 km altitude with a maximum around 25 km; the concentration there is about 6 times higher than at ground level. This "good ozone" layer completely absorbs the short-wave and dangerous UV-C radiation (wavelengths below 280 nm), and also part of the UVB radiation (280 to 320 nm). The total ozone column is measured in Dobson Units (DU): if all the ozone contained in a vertical column were compressed to normal atmospheric pressure, the height of that column would be about 3 mm, i.e. 300 DU. Usual numbers in our region vary from 250 to over 400 DU. About 10 to 20% corresponds to the ozone of the ground layer (the "bad ozone"); the major part is stratospheric ozone (the "good ozone").
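The Dobson Unit definition above reduces to a one-line conversion: 1 DU corresponds to a 0.01 mm layer of pure ozone compressed to standard pressure.

```python
# Dobson Unit to compressed-column thickness: 1 DU = 0.01 mm of pure ozone
# at standard pressure, so 300 DU corresponds to the 3 mm mentioned above.

def column_thickness_mm(dobson_units):
    """Thickness of the compressed ozone column in millimetres."""
    return dobson_units * 0.01

for du in (250, 300, 400):
    print(f"{du} DU -> {column_thickness_mm(du):.1f} mm")
```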
The influence of the thickness of the ozone layer on the UVB irradiance can be shown from our measurements at meteoLCD (see this paper). The next figure from the cited paper documents that a thinning ozone layer will increase ground UVB irradiance.
On 22 April 2013 the TOC (total ozone column) was 381.8 DU, and the effective UVB irradiance about 1.5 MED (MED = minimal erythemal dose); the next day, with the same meteorological conditions, the TOC fell to 265.8 DU and the effective UVB irradiance increased by 0.68 MED, about 2 UVI (UV index). A dip of 100 DU would thus correspond to an increase of about 1.7 UVI.
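The 1.7 UVI figure follows from a simple linear scaling of the observed day-to-day change; this assumes the UVB response is roughly linear over this range of ozone columns, which is only a rough local approximation.

```python
# Back-of-envelope sensitivity from the two days quoted above: a drop of
# 381.8 - 265.8 = 116 DU raised the UV index by about 2, so scale linearly
# to a 100 DU dip. Linearity is a rough local approximation only.

toc_drop_du = 381.8 - 265.8   # observed ozone column decrease, DU
uvi_rise = 2.0                # observed UV-index increase

sensitivity = uvi_rise / toc_drop_du        # UVI per DU
print(f"{100 * sensitivity:.1f} UVI increase per 100 DU dip")
```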
3. The extremely variable total ozone column
Many factors influence the thickness of the total ozone column, which varies often in a spectacular manner. Look at the next figure which shows the TOC measured at Uccle (near Brussels, Belgium) for this year 2015:
The red line represents the DU readings for 2015 up to the 25th October; the grey part of the plot shows the readings from last year. Uccle has one of the longest DU series in Europe, starting in 1979. The sine-like curve represents the average of all measurements from 1979 to today. Clearly the TOC is highest in spring and lowest in autumn. The next 3D diagram shows this in a more beautiful way for the global latitude band between 45° and 50° North.
So we have a smooth average sine-like curve over the year, but the actual measurements present a totally different pattern. Look at the extreme variations in the Uccle plot, where the ozone column can plunge from a 500 DU peak to a 300 DU low in a few days; often the peaks and troughs follow each other within a very short time, a day or even less.
These are the measurements at meteoLCD for 2015, up to the 26th October. Look at what happened around the 10th April: in two days the thickness of the ozone column increased from 300 to 450 DU, and fell back to 363 DU the next day.
What all these measurements show is that our atmosphere is a very dynamic beast: change is the norm, and no change the exception! I remember that in the past, when the TOC fell rapidly, the media were quick to publish alarmist articles about a vanishing ozone layer (evidently caused by human activity!) and our eradication by skin cancer. Had the authors waited a couple of days, and had they not been ignorant of natural variations, these silly articles would not have been written.
There is no cause for alarm, as the total ozone layer has not been thinning for many years. The last figure shows the trend from our meteoLCD measurements (meteoLCD is still the only station measuring the TOC in Luxembourg).
The general trend from 1998 to 2014 is positive, and that of the last 14 years practically flat. So no cause for alarm here!
The measurements of ground ozone and of the total ozone thickness document an extremely variable situation. There is no even spread of the ozone concentration, no well-mixed situation, but a breathtaking variability. The atmosphere is a turbulent beast, not a smooth pudding!
PS: Do not forget to look from time to time at our ozone data by clicking on the "DOBSON (total O3)" link at http://meteo.lcd.lu
Many media write in ecstasy that this new report warns that the CO2 mitigation schemes of most countries will not keep global warming below 2°C in 2100! Actually the main point of the report is that if all countries do not reduce their GHG emissions by 40-70% below their 2010 levels, and by at least 100% by 2100, this 2°C target will be missed. In this blog I will not muse on the absurdity of this 2°C level, which is nearly complete guesswork derived from non-verified climate models. I will just take some items relating to Luxembourg, and will show that this glossy report gives numbers concerning Luxembourg that are close to swindle.
1. Carbon intensity of electricity generation
Page 79 contains a graph showing the carbon intensity in g CO2 per kWh electricity produced for many countries: here Luxembourg’s CO2 intensity goes through the roof:
The figure tells us that in 1990 Luxembourg produced its electricity using 2552 g CO2 per kWh, which would have been a cosmic record! In 1990, Luxembourg consumed about 1250 GWh (here), of which about 1000 GWh from coal. In 2013, the ratio between total thermal electricity consumption and in-country thermal production was 2.7. This suggests an indigenous Luxembourg thermal production of about 1000/2.7 ≈ 370 GWh. There is absolutely no reason which would explain the abysmally low efficiency of Luxembourg's own power plants in producing this tiny amount of electricity. This statistic is clearly nonsensical, and very telling about the care the authors took in checking their numbers.
The recent data are fully available at the web site of the ILR (Institut Luxembourgeois de Régulation). I use the data for 2013, which can be summarized as follows:
Global electricity production = 1838 GWh
Production of thermal origin = 1575 GWh
Included in this category is the production of co-generation (“Blockheizkraftwerke”), of burning wood, bio-gas, landfill gas and the production of the TwinErg combined gas-turbine plant.
The balance of 263 GWh comes from wind, solar and hydro production.
Now let us assume that the thermal generation emits 500 g CO2 per kWh, which is about the average for natural-gas power plants; notice that we include all bio-gas and wood burners here, of which at least the latter could be considered carbon neutral.
The 1575 GWh correspond to 7.88×10^11 g CO2 emitted in 2013. Dividing this number by the total production of 1838×10^6 kWh gives a (maximum!) CO2 intensity of about 429 g/kWh, well in line with the grey bar showing about 350 g/kWh.
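The intensity estimate above can be written out step by step, using the ILR 2013 figures and the assumed 500 g CO2/kWh emission factor for the thermal mix:

```python
# CO2 intensity of Luxembourg's 2013 electricity production, using the ILR
# figures quoted above and the ASSUMED 500 g CO2/kWh factor for thermal plants.

thermal_kwh = 1575e6          # 1575 GWh thermal production, in kWh
total_kwh   = 1838e6          # 1838 GWh total production, in kWh
g_per_kwh_thermal = 500.0     # assumed emission factor for the thermal mix

emitted_g = thermal_kwh * g_per_kwh_thermal   # total grams of CO2 emitted
intensity = emitted_g / total_kwh             # averaged over ALL kWh produced

print(f"emitted: {emitted_g:.3g} g CO2")
print(f"intensity: {intensity:.0f} g CO2/kWh")  # close to the ~429 quoted
```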
The graph should have omitted the 1990 bar for Luxembourg; that it found its way into the figure shows that these graphs are milked from databases without any intelligent thinking.
2. Emissions per sector
At page 21 we find this figure:
Look at the distance between the two red lines: Luxembourg's emissions from transport are close to 55% of the total, a world maximum! Do the Luxembourg people drive like crazy 24 hours a day on their 2967 km of roads (including 147 km of motorways)? Or do they drive gas guzzlers burning hundreds of liters per 100 km? Definitely not: the authors of the OECD report forgot to mention that close to 80% of all fuel sold at Luxembourg's pump stations goes into foreign cars, does not stay in Luxembourg, and cannot be counted as internal Luxembourg consumption. The real share of the transport-sector emissions inside Luxembourg is closer to 55 × 0.2 = 11%, similar to what can be found in Germany or Belgium. All statistics ignoring this basic fact are swindle, as are the monstrous 20.9 tons of CO2 per capita emissions for Luxembourg computed from these wrong numbers and found in many databases (see here). A calculation method that is fine for large countries, where fuel export is a tiny part of the total volume sold at the pumps, cannot and must not be used for small countries where the major part of the pumped fuel goes into foreign tanks.
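The correction argued above is a one-line computation: if ~80% of the fuel sold in Luxembourg leaves the country in foreign tanks, only ~20% of the reported "transport" share is actually domestic.

```python
# Correcting the reported transport share for fuel exported in foreign tanks,
# with the figures quoted in the text (55% reported share, ~20% staying domestic).

reported_transport_share = 0.55   # transport share of national emissions per the report
domestic_fuel_fraction = 0.20     # fraction of pumped fuel that stays in Luxembourg

corrected = reported_transport_share * domestic_fuel_fraction
print(f"corrected transport share: {corrected:.0%}")
```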
3. CO2 efficiency
Now let's conclude with a more positive remark. Figure 1.2 shows the trend in GHG emissions versus GDP:
During the 23-year interval 1990-2012, Luxembourg's efficiency increased dramatically: the GHG emissions per unit of GDP fell by close to 60%, bettering Germany and most other European countries. Alas, that the Russian Federation is the champion on this graph makes me think twice, damping the good patriotic vibrations.
This OECD report makes for pleasant reading, but leaves one with the impression of a quick copy/paste from various databases and a disturbing absence of critical thinking. The IPCC gospel is taken as divine truth, and the laments reported in the media are not much more than the result of a primitive extrapolation of current CO2 emissions. One more of these sloppy reports that beat the drums waiting for COP21!
03 Nov 2015:
Just as another illustration of how Luxembourg appears in UNEP statistics (graph from UNEP): the red arrow points to Luxembourg, and the size of the disk is proportional to the total CO2 emissions:
Radiation is one of the natural phenomena that many people are afraid of: we do not see these mysterious nuclear rays, and rarely do the media talk about radiation without pushing the scare level to the max. But radiation is "natural": the whole universe, our planet, and we ourselves constantly bathe in a continuous flow of charged particles, energetic photons, fast neutrons and mysterious neutrinos, which zip through us every second.
1. Cosmic and solar rays
Cosmic galactic rays originate from outside our solar system; they usually interact with nitrogen and oxygen in the atmosphere and produce a shower of electrons, muons etc. Solar rays constitute the "solar wind", mainly formed of protons ejected by the huge solar fusion reactor. The solar wind is more intense when a "magnetic hole" (coronal hole) opens at the Sun's surface. The charged protons are then free to be ejected into space; if the hole is directed toward the Earth, a more or less vigorous stream hits the atmosphere. Such a situation is happening now, as shown by this picture taken from the excellent web site spaceweather.com (the next figures are all from this site).
The turquoise circle shows a coronal hole; the white lines and arrows indicate the magnetic field lines which usually trap the protons. The number of these solar protons is continuously monitored by the geostationary GOES satellites and by the ACE satellite positioned at the Lagrangian L1 point between Earth and Sun. Here are the data for today, 15 October 2015:
As you can see, 2.4 protons per cm3 is not negligible: as your body volume is approx. 75 liters (= 75000 cm3), this means that at any instant about 180000 solar-wind protons are inside (and zipping through) your body!
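The proton count is just density times volume, with the 75-liter body volume being the rough estimate used above:

```python
# Protons inside a human body at a solar-wind density of 2.4 protons/cm3,
# assuming the rough 75-litre body volume used in the text.

density_per_cm3 = 2.4
body_volume_cm3 = 75 * 1000        # 75 litres = 75000 cm3

protons = density_per_cm3 * body_volume_cm3
print(f"{protons:.0f} protons inside the body at any instant")
```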
One understands that this type of radiation (and especially its rapid surges) can pose serious problems to astronauts on board the ISS or on future planetary travels.
2. Transatlantic travel by plane
It seems obvious that cosmic radiation and solar protons increase with altitude, as the filtering air layer becomes thinner and thinner. Here is a graph showing the relative increase in radiation dose with altitude; a normal transatlantic flight cruises at about 40000 feet.
A transatlantic flight thus corresponds to about a 26-fold increase in dose rate.
Now suppose you are a pilot or a steward(ess) making 10 round trips per month, which amounts to about 10 × 2 × 8 = 160 hours (assuming 8 hours per flight at high altitude). With a working year of 10 months, and reading a dose rate of 2200 nSv/h from the graph, this amounts to a supplementary radiation dose of 10 × 160 × 2200/1000000 = 3.52 mSv/year (the division by 1000000 converts nSv to mSv). The Health Physics Society gives slightly lower exposures, for instance 2.19 mSv/y.
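The aircrew estimate above can be spelled out with its assumptions made explicit; the 2200 nSv/h cruise-altitude dose rate is the value read off the graph, not a measured constant.

```python
# Annual supplementary dose for aircrew, with the assumptions from the text:
# 10 round trips/month, 2 flights of 8 h each, 10 working months,
# and an assumed 2200 nSv/h dose rate at ~40000 ft (read off the graph).

trips_per_month = 10
flights_per_trip = 2
hours_per_flight = 8
working_months = 10
dose_rate_nsv_per_h = 2200

hours_per_year = trips_per_month * flights_per_trip * hours_per_flight * working_months
dose_msv = hours_per_year * dose_rate_nsv_per_h / 1e6   # nSv -> mSv

print(f"{hours_per_year} h/year at altitude -> {dose_msv:.2f} mSv/year")
```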
Even if the dose rates at high altitudes seem impressive, the supplementary dose from one transatlantic travel is tiny. The usual background dose at many locations on Earth is about 3-4 mSv (with some outliers going up to 260 mSv/y as in Ramsar, Iran), so even pilots and flight crew do not accumulate a dangerous radiation dose. The not so frequent flyer, be it for tourism or business, shouldn’t be scared. But if you intend to spend your next vacation on the Moon or on Mars, things will be different!
28 Oct 2015: new link added at the end of the blog
The heating season is about to start here in Luxembourg. I heat my home with oil-fired central heating; one of my neighbors burns only wood (in cords). The quantities of wood he uses are breathtaking, but he probably chose wood burning as "climate friendly". Indeed, the carbon dioxide released was gobbled up from the atmosphere by the tree during its 30- to 60-year life, and returning it to the air will, at least at first glance, be "carbon neutral". On second thought, the problem is more complicated: his CO2 release is a spike that would not have occurred if the wood had been left to rot (that release would have been spread over many years), so that at least over a short period there is not much gain in switching, say, from gas to wood.
There has been much talk in recent years about the dangers of fine particles (the smaller-than-2.5-micron PM2.5), be they released by Diesel engines or other energy providers. In Europe all new Diesels have particulate filters, which should solve this problem. Burning wood is a very big PM2.5 emitter, and I will discuss this in the next chapters.
1. CO2 emissions
This figure shows that the CO2 emissions per kWh of energy from burning wood are about double those from natural gas. So if a state installs a CO2 emission measuring system (using perhaps a satellite like OCO-2), wood burners would be in a delicate position.
If we look at the composition of the exhaust, we have this:
There is about a 1% by weight emission of NOx, which is roughly the same as for gas or oil; there are non-negligible VOC (volatile organic compound) and particle emissions. Clearly the exhaust from a wood stove is very different from clean air!
2. The PM problem
Fine particles are the crux of burning wood:
This figure from the www.treehugger.com web site shows the tremendous difference between an uncertified wood stove and a usual gas furnace. Things become much better if you use a pellet system, but its emissions nevertheless remain 162 times higher than gas (the uncertified wood stove emits 1464 times more than gas!).
Very often the discussion on particulate emissions also puts the blame on agriculture. But the picture is much different, as agriculture does not emit the same percentage of very small particles (the PM2.5), which are thought to be the most dangerous, being able to pass into the lungs, the heart and even the brain. The next figure compares the two emission sources:
3. Hourly emissions
The next figure shows the particle emissions in g/h for burning oak (the 3rd most frequent wood in Luxembourg). Note that the emissions of the larger PM10 are about the same as those of the dangerous PM2.5; fire logs are possibly wax-wood mixtures, and so have about 4 times lower emissions.
These numbers apply to a burning rate of about 3 kg of dry wood per hour (see here).
If you burn wood (cord or pellets), your environmental impact may not be what you intend: your immediate CO2 emissions are comparable to those of other fossil fuels, and your particulate pollution is much, much worse! That is the reason why, for instance, the city of Paris forbids burning wood in open fires. Switching to gas (or nuclear-powered electrical heating!) would be more environmentally friendly.
28 Oct 2015: see also this article by Scilogs: “Unterschätzte Gesundheitsgefahr durch Holzrauch“.