Archive for February, 2011

NAP book (3)

February 24, 2011

In this third comment I will offer some reflections on the equivalent CO2 concentration and the CO2 residence time.

Let us start with a graph given on page 60 which shows the added anthropogenic equivalent CO2 contributions: these include all GHG whose radiative forcing is estimated to be greater than 0.15 W/m²: CO2, CH4, ground-level (= tropospheric) O3, N2O (laughing gas) and aerosols:


I have added arrows pointing to the maximum and minimum levels, which correspond to the uncertainty range: this range is huge, and the question of the importance of human-emitted aerosols remains largely open. Climate alarmists usually have a quick and simple explanation for the temperature dip observed from 1940 to 1970: these lower temperatures were caused by aerosols (emitted mostly by burning dirty, SO2-emitting fuels, with SO2 particulates being effective in blocking sunlight). Another, quite different explanation is that this period coincides with the negative phase of the AMO and PDO, and as such the lower temperatures are the result of natural cycles.

This is the graph given by meteorologist Joe d’Aleo (link) from Icecap. It shows that the continental US temperatures, as given by the US Historical Climatology Network (USHCN), follow the combined PDO+AMO cycle very nicely, and that this natural relationship is all that is needed to explain the post-WWII cooling and the warming at the end of the 20th century.

The big aerosol uncertainties are one of the major problems in climate models. Let us look at the right part of NAP figure 2.1, which shows the total equivalent CO2 concentration. I added the baseline corresponding to the pre-industrial situation (CO2 approx. 280 ppmv):

This variation from what is assumed to be the pre-industrial situation (that level may well be too low!) is not so spectacular anymore if you extend the left axis down to zero. The lower limit, corresponding to the greatest aerosol influence, is even astonishingly close to that situation (even though CO2 levels possibly increased by about 100 ppmv).
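Just to make the notion of “equivalent CO2” concrete: the usual simplified expression for CO2 forcing is dF ≈ 5.35·ln(C/C0) W/m², and inverting it converts a total forcing into an equivalent mixing ratio. Here is a minimal Python sketch; the three forcing values are illustrative placeholders and are not read from the NAP figure:

# Minimal sketch: convert a total radiative forcing into an "equivalent CO2"
# mixing ratio, using the common simplified expression dF = 5.35*ln(C/C0) W/m^2.
# The forcing values below are illustrative placeholders, not NAP figures.
import math

C0 = 280.0  # pre-industrial CO2 baseline, ppmv (as used in the text)

def co2_equivalent(total_forcing_wm2, c0=C0):
    """Invert dF = 5.35*ln(C/C0) to get the equivalent CO2 mixing ratio."""
    return c0 * math.exp(total_forcing_wm2 / 5.35)

# Example: a central forcing estimate with a wide aerosol-driven uncertainty band
for label, forcing in [("low (strong aerosol offset)", 1.0),
                       ("central", 2.3),
                       ("high (weak aerosol offset)", 3.0)]:
    print(f"{label:28s}: {co2_equivalent(forcing):6.0f} ppmv CO2-equivalent")

The wider the assumed aerosol offset, the closer the lower bound falls to the pre-industrial baseline, which is exactly what the figure shows.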

Most industrial countries will certainly try hard to diminish their reliance on imported fossil fuels. Low-carbon energies will certainly play a bigger part in the energy mix of the future, even if the bulk still comes from fossil fuels and nuclear. But it could well be that these fossil fuels will rely more and more on relatively clean-burning gas (shale gas or methane clathrates).

The atmospheric residence time of the released CO2 is another important parameter: if it is short, even large increases in emitted CO2 will have only a passing effect at most; if it is long, the equilibrium, long-term warming might be much more important. The question of this RT (residence time) is still open. The IPCC consensus, also adopted in the NAP report, is that emitted CO2 remains for hundreds of years in the atmosphere. The reasoning goes like this: even if an individual CO2 molecule remains about 5 years at most in the air before being absorbed by the ocean, the concentration (better: mixing ratio) in the atmosphere will stay higher for a long time (hundreds to thousands of years), because every molecule absorbed will be replaced by a molecule released. The (alarmist) blog skepticalscience writes: “However, in most cases when a molecule of CO2 leaves the atmosphere it is simply swapping places with one in the ocean” (see also this short paper by H. Lam from Princeton University). This cannot be true, as for instance we have the famous 40% “missing carbon” quantity: about 40% of the carbon emitted from burning fossil fuel disappears from the atmosphere, and this percentage has remained practically constant throughout the industrial period (link).
Many researchers question the IPCC consensus of a long residence time.
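To make the distinction discussed above explicit, here is a minimal one-box sketch in Python. The reservoir and flux numbers are round, assumed values chosen only to show how the two time scales are computed; they are not taken from the NAP report or from the papers cited below:

# One-box sketch of the two time scales discussed above (illustrative round
# numbers, not values taken from the NAP report or the cited papers).
M_ATM = 800.0    # atmospheric carbon stock, GtC (assumed round number)
F_GROSS = 150.0  # gross annual uptake by ocean + biosphere, GtC/yr (assumed)

# "Residence time" of an individual molecule: stock divided by gross flux.
tau_residence = M_ATM / F_GROSS
print(f"single-molecule residence time ~ {tau_residence:.1f} years")

# "Adjustment time": how fast an excess above the baseline decays when only the
# NET uptake is proportional to the excess (the "swapping places" argument means
# the net flux is much smaller than the gross flux).
F_NET = 4.0      # assumed net uptake for the assumed excess, GtC/yr
excess0 = 200.0  # assumed excess above the pre-industrial stock, GtC
tau_adjust = excess0 / F_NET
print(f"excess-decay (adjustment) time ~ {tau_adjust:.0f} years")

With these assumed numbers the single-molecule residence time comes out near 5 years, while the excess decays on a multi-decadal scale; the whole dispute is about which of these time scales matters and how long the second one really is.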

The most recent paper comes from Robert Essenhigh (abstract), who finds an RT of about 5 years, in good agreement with earlier results from Tom Segalstad (link1, link2).
The analysis of C14 variations (C14 was created during the atmospheric atomic bomb tests: the neutrons released turn nitrogen atoms into carbon-14 atoms) confirms this time span of 5 to 7 years (link to comment):

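For readers who want to redo this kind of estimate: the e-folding time is obtained by fitting an exponential decay to the excess Δ14C curve after the 1963 test-ban peak. The following Python sketch uses invented placeholder values, not the measured record, simply to show the method:

# Minimal sketch: fit an exponential decay to excess atmospheric 14C after the
# bomb-test peak to extract an e-folding time. The yearly values below are
# invented placeholders illustrating the method, not the measured record.
import numpy as np

years = np.arange(1964, 1994)              # years after the 1963 test-ban peak
true_tau = 6.0                              # placeholder e-folding time, years
excess_d14c = 700.0 * np.exp(-(years - 1964) / true_tau)  # synthetic "data"
excess_d14c *= 1.0 + 0.05 * np.random.default_rng(0).standard_normal(len(years))

# Linear least-squares fit in log space: ln(excess) = ln(A) - (t - t0)/tau
slope, intercept = np.polyfit(years - 1964, np.log(excess_d14c), 1)
tau_fit = -1.0 / slope
print(f"fitted e-folding time ~ {tau_fit:.1f} years")

Applied to the real measurements, this same kind of fit is what yields the 5 to 7 year figure quoted above.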
At least one cannot but agree with H. Lam, who concludes his paper with: “If indeed our policymakers are betting the future of our world on the consensus IPCC value of τL (= RT) being wrong (too big by a large factor), the general public ought to be told”.

So we have here again a very important climate parameter on which there is no agreement. If we reflect on all the uncertainties and all the parameters in dispute, the necessity of immediate and drastic action, such as the 80% reduction in CO2 emissions by 2050 repeatedly presented as mandatory in the NAP report, may well be exaggerated.

NAP book (2)

February 22, 2011

In my previous comment on the new NAP book “Climate Stabilization Targets” I wrote that the main metrics used to quantify climate change are cumulative carbon emissions and global temperature change. The first parameter was proposed by Matthews et al. in 2009 (“The proportionality of global warming to cumulative carbon emissions”, Nature 459). Emissions of carbon include those related to the burning of fossil fuels, cement production and land use. Up to 2010 these emissions are estimated at a total of approx. 0.52×10¹² metric tons of carbon (i.e. 0.52 teratons). I assume that this total is calculated starting in 1750, the date usually taken as the start date for the calculation of radiative forcings (i.e. the start of the industrial age).

The graph given confuses independent and dependent variables: we want to know “what will the global temperature change dT be when the cumulative carbon emissions CC reach a certain level?”. I made a new plot, using the numbers picked from the NAP graph, and plotted dT versus CC. The fit lines were calculated by linear regression. I also added the upper and lower limits of the uncertainties, which are obtained by multiplying dT by 1.4 (upper limit) and by 0.7 (lower limit). Here is this graph:

We see that dT = -0.04 + 1.77*CC, with dT in °C and CC in trillion tons of carbon. Obviously this model gives practically no temperature change if the emissions had been zero. The formula implicitly says that if there had been no carbon emissions (neither from fossil fuels nor from land use), global temperature would have remained stable at the cool level of the LIA. Yet a planet leaving a little ice age warms in a natural way; this process seems to be ignored in the model applied.

Global temperatures at the end of the LIA were at least 0.4 °C lower than in the reference period 1950-1980 (link). Adding the temperature change caused by the 0.52 teratons of carbon emitted should therefore give a global temperature change of -0.4 + 0.88 = 0.48 °C for today. We know this to be incorrect, as the most probable value is 0.7 to 0.8 °C. The difference can only come from the natural warming following the end of the LIA. This suggests that natural warming could be of approximately the same order as that caused by human emissions.
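This arithmetic can be checked in a few lines of Python, using only the fit coefficients, the ×1.4 / ×0.7 uncertainty factors and the -0.4 °C starting level quoted above:

# Check of the arithmetic above: evaluate the fitted relation dT = -0.04 + 1.77*CC
# at today's cumulative emissions and add the assumed -0.4 degC LIA starting point.
def dT_model(cc_teratons):
    """Temperature change (degC) from the linear fit quoted above."""
    return -0.04 + 1.77 * cc_teratons

CC_TODAY = 0.52                    # cumulative emissions to 2010, teratons C
dt_central = dT_model(CC_TODAY)    # ~0.88 degC attributed to emissions
dt_low, dt_high = 0.7 * dt_central, 1.4 * dt_central   # quoted uncertainty factors

lia_offset = -0.4                  # assumed end-of-LIA anomaly, degC
print(f"model warming from emissions : {dt_central:.2f} degC "
      f"({dt_low:.2f} to {dt_high:.2f})")
print(f"implied present-day anomaly  : {lia_offset + dt_central:.2f} degC")
print("observed anomaly (text)      : about 0.7 to 0.8 degC")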

But there is another possible explanation of the observed warming. Akasofu has written several papers on the Earth recovering from the LIA, and he gives this figure in his 2009 paper “Two Natural Components of the Recent Climate Change” (link):

In fact, he says that the actual warming can be seen as a superposition of two natural trends: an ongoing warming due to the recovery from the LIA and a periodic warming/cooling from the various oceanic oscillations (PDO, AMO, …). Carbon emissions do not even need to be taken into account!

As a conclusion:

The cumulative emission formula does not allow one to predict the actual global temperature change of the future. It can only speculate about the additional warming that carbon emissions may cause. As it ignores natural causes of warming/cooling, the formula seems suspect. The model contains only one single parameter (cumulative carbon emissions) as a cause of global warming; we know this is not true, reality being much more complex. The predictive value of the formula is low.

Just to show how wildly the different climate models differ in their predictions, look at fig. 3.5 from the report:

The NAP report correctly writes (p.15): “Uncertainty in the carbon dioxide emissions and concentrations corresponding to a given temperature target is large”, and this will be the last word of my comment #2.

New book from NAP: Climate Stabilization Targets (1)

February 21, 2011

The National Academies Press (NAP) has published a new 299-page book on climate: Climate Stabilization Targets: Emissions, Concentrations, and Impacts over Decades to Millennia (free download page here).
It is clearly an alarmist book, written by the usual “suspects” like Susan Solomon, Raymond Pierrehumbert etc. A lot of money must have been poured into this compilation: the graphics are first class, the layout very professional. I must also confess that the writing style is easy to read, and as a whole reading the report is not a waste of time.

What is regrettable is that when future evolutions are shown, the most negative ones are systematically stretched, while possible positive consequences are practically passed over as non-existent.

I have just taken a quick glance at the book, read the synopsis and summary carefully (43 pages), and will give some first impressions here, starting with the end of the report.

1. The references.

The reference list at the end of the book is extremely long (34 pages, probably over 500 citations), and I doubt that the contributors have read everything. What is most interesting in this list is:

a. the committee members cite themselves without restraint (Hayhoe, Matthews and Pierrehumbert each have 4 citations)
b. There is not one single reference to a publication by a climate realist. You will not find any of these authors: Akasofu, Carter, Christy, Courtillot, Dyson, Lindzen, Michaels, Pallé, Pielke, Reiter, Scafetta, Singer, Soon, Spencer, Shaviv. It is as if these scientists had never published anything touching climate science!
How can this consensus report wave the banner of objective science? This is exactly the unscientific, politicized behaviour that has been condemned again and again following Climategate. Apparently no lessons have been learnt.

2. The metrics used

To quantify climate change, the report uses two metrics:

– global temperature change
– cumulative anthropogenic carbon emissions

Using the first metric is problematic, as many, many examples have shown that global temperature and temperature changes are difficult to quantify correctly (think of the urban heat island effect, of quality problems with meteorological stations etc.). Roger Pielke Sr. constantly stresses the importance of the OHC (ocean heat content) as a much more reliable metric to quantify global warming. OHC is not even mentioned in the summary.

CO2 emissions seem to be the single cause of warming (“Emissions of carbon dioxide from the burning of fossil fuels have ushered in a new epoch where human activities will largely determine the evolution of Earth’s climate”); natural causes are non-existent in the summary; in typical IPCC fashion it is suggested that without human emissions the global climate would remain stable in an idyllic state.

Let me close this first contribution with a telling sentence given in the acknowledgments section. Ten individuals were reviewers of the report but “…they were not asked to endorse the conclusions or recommendations nor did they see the final draft of the report…”

Was the review process just an alibi?

EU back to energy realism?

February 13, 2011

added 18Feb11:

Article by Maxeiner & Miersch in “Die Welt”: Unsere Energieträumer verbrennen Geld anstatt Gas (“Our energy dreamers burn money instead of gas”)

_______________

Remember that official EU energy policy aims to increase the input from renewable sources to 20% of total energy consumption by 2020 (some even aimed at 30%); renewable should not be confused with carbon-free, as for instance nuclear energy does not belong to that category! Well, last winter was a further lesson in reality versus politics in the UK. From 3-9 January 2010 wind turbines rated at a nominal 2.5 GW produced only 0.1 GW, and on 2 December 2010 their output was between 0.7 and 1.2% of the total 56 GW needed, that is about 4 times less than planned; this despite wind energy having the highest priority: wind turbines are the last to be shut down when production would exceed demand (read here a Britain Watch comment). It is very difficult to get reliable numbers on the real capacity factor of wind turbines and photovoltaics. For instance, the UK’s Ofgem could not deliver production data for the sources subsidized by heavy feed-in tariffs (read this telling story from the Renewable Energy Foundation).
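The load-factor arithmetic behind these statements is easy to redo; the following Python snippet uses only the numbers quoted above:

# Arithmetic on the numbers quoted above (no additional data assumed).
installed_gw = 2.5          # nominal wind capacity cited for 3-9 January 2010
output_gw = 0.1             # actual output cited for that week
demand_gw = 56.0            # total demand cited for the December 2010 episode
december_share = (0.007, 0.012)  # output as a fraction of demand, as quoted

print(f"January capacity factor : {output_gw / installed_gw:.0%}")
print(f"December output         : {december_share[0]*demand_gw:.1f} to "
      f"{december_share[1]*demand_gw:.1f} GW out of {demand_gw:.0f} GW demand")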

The over-generous feed-in tariffs (FiTs) are clearly unsustainable, as the financial crisis bites painfully into public budgets (as shown by Spain and the Netherlands). More and more ordinary citizens are asking themselves why they should pay big taxes to profiteers who claim to save the planet just to reap large rewards for their investments in wind and solar energy. The German experience with photovoltaics showed that these high FiTs were a hindrance to lowering the price of solar equipment, and that a big part of the money went directly to Chinese factories and not to the German solar cell producers.

The Dutch, having a new centre-right government, made quite a splash these last days when they decided to stop subsidizing the very expensive offshore wind farms and the still more expensive photovoltaics (read the article in German in the Financial Times Deutschland). To the horror of the German green movements and party, they plan to build a new nuclear reactor in Zeeland, and as such to make a comeback to carbon-free atomic energy.

On another front, official EU policy seems to be awakening to shale gas, after stubbornly ignoring this energy revolution, started a couple of years ago in the USA and made possible by new horizontal drilling techniques (see the excellent YouTube video). Shale gas may well make many grandiose plans redundant for several decades: new pipelines from Russia to Europe, the gigantic Desertec plan to install solar thermal facilities in the North African region and the monumental hypergrid said to be mandatory to transport intermittent wind electricity throughout Europe could soon become superfluous. Shale gas also means the welcome killing of the baroque CCS (carbon capture and sequestration) projects, and not many tears will be shed over the burial of this dangerous and expensive technique.

The official EU did not see the Jasmine revolution brewing in Tunisia, nor the one going on in Egypt and probably soon spreading to the whole Arab world. Neither was it open to energy realism, slavishly following the various green movements which dictated EU energy politics. Now serious cracks are starting to appear in the once seemingly unanimous energy front. Probably the Eastern countries like Poland and the Baltics will follow the Dutch. France probably just waits for the right moment to leave the world of carbon certificates and Kyoto-style binding treaties without losing face. It could well be that Germany will be the last of the Mohicans, planning to rely exclusively on renewable energies, mostly wind and solar, whatever the costs and the benefits.