This blog started on 28 September 2008 as a quick communication tool for exchanging information, comments, thoughts, admonitions, compliments, etc. related to http://meteo.lcd.lu, the Global Warming Sceptic pages and environmental policy subjects.
In a previous blog post, “CO2 and temperatures: da stelle mer uns janz dumm”, I presented a “zero-dollar” model using past global temperatures and CO2 data to estimate the climate sensitivity from the 1850-1945 and 1945-2013 periods. I found the effective climate sensitivity (which should be close to the equilibrium climate sensitivity, ECS) to be about 1.34°C. This means that a doubling of the atmospheric CO2 mixing ratio with respect to the pre-industrial level would cause a global warming of at most 1.34°C. This number is much lower than the “consensus” values of the IPCC, which suggest a most probable warming of 3°C (range 1.5 to 4.5°C).

Many authors disagree with the IPCC estimate. Lewis and Curry, for instance, find in their recent paper “The implications for climate sensitivity of AR5 forcing and heat uptake estimates” a value of 1.33°C for the transient climate response (TCR) and a range of 1.25 to 2.45°C (central value 1.85°C) for the equilibrium climate sensitivity (ECS). Recall that the lower TCR gives the warming due to a CO2 doubling at the moment this doubling occurs (assuming the doubling takes 70 years to realize), whereas the ECS gives the definitive warming, lying far in the future, once all feedbacks and readjustments have run their course.

Dr. Roy Spencer from the UAH presented in his blog yesterday a new calculation using the 15 years of data from the CERES instruments carried by successive NASA satellites. CERES measures outgoing and incoming radiation fluxes (in W/m2), and is the best (and practically only) source for these extremely important data. Dr. Spencer found in several previous papers that when the globe changes its surface temperature, the atmosphere reacts with a delay of about 4 months in its changes of radiative flux.
So he took the available 15 years of CERES data, computed yearly means and plotted them against the 4-month time-shifted global temperatures of the HadCRUT4 series, with a linear regression flux(t) = a*T(t-4) + b (t = time in months). This gives the following figure:

The linear regression tells us that dF = 2.85*dT: a change of global temperature of 1°C (or 1 K) corresponds to a forcing of 2.85 W/m2, and vice versa. The parameter 2.85 represents the climate feedback parameter lambda. Now the effective climate sensitivity ECS is defined as ECS = F2xCO2/lambda, where F2xCO2 is the radiative forcing caused by a doubling of atmospheric CO2, and lambda = feedback factor. Let us accept that F2xCO2 = 5.35*ln(2) = 3.71 W/m2; the number 5.35 is the “consensus” value, which remains subject to discussion, but is more or less accepted by both climate alarmists and realists. So we have:

ECS = 3.71/2.85 = 1.3°C

When the CO2 mixing ratio reaches 560 ppmV (i.e. double the concentration at pre-industrial time, about 1850), we should have a total warming of at most 1.3°C. As the globe has warmed by about 0.8°C since that time, there would be at most 0.5°C of warming still in the pipeline.

Conclusion: All these ECS values derived from observations (and not from climate models!) are rather low. Dr. Spencer says that the 1.3°C should be taken as a maximum, and that the real ECS could possibly be much lower (Prof. Lindzen suggested 0.7 to 1°C). Should we worry? No! And should we desperately try to avoid any CO2 emissions? Neither!
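The arithmetic above fits into a few lines of code; a minimal sketch, where the function names are mine and only the regression slope (2.85 W/m2 per K) and the consensus forcing coefficient (5.35) come from the text:

```python
import math

def co2_forcing(c_ratio, k=5.35):
    """Radiative forcing (W/m2) for a CO2 concentration ratio: F = k*ln(C/C0)."""
    return k * math.log(c_ratio)

def effective_climate_sensitivity(feedback_lambda, k=5.35):
    """ECS (deg C) = forcing of a CO2 doubling divided by the feedback parameter."""
    return co2_forcing(2.0, k) / feedback_lambda

f2x = co2_forcing(2.0)                       # ~3.71 W/m2 for a doubling
ecs = effective_climate_sensitivity(2.85)    # ~1.3 deg C
warming_in_pipeline = ecs - 0.8              # ~0.5 deg C still to come at 560 ppmV
```

Changing the feedback parameter to the IPCC central value (about 1.24 W/m2/K for 3°C) shows immediately how sensitive the result is to lambda.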
There is a very interesting new paper by A. Fisher et al. from UC Santa Cruz on the problem of the geothermal heat flux at the base of the West Antarctic ice sheet. Fisher and his collaborators drilled a 25 cm diameter hole through the ice sheet down to the underlying Lake Whillans. The red square shows the location of the drilling.
They lowered a heavy 200 kg probe with thermistor sensors through the bore hole and the lake water into the mud at the bottom of the lake. The probe partially entered the mud, so that one thermistor (TS1) was located about 0.8 m deep in the mud, and the other (TBW) sat just at the interface between the sediment and the bottom of the lake. The next table shows the data for two measurements (GT-1 and GT-2):
The important quantity is the heat flux q, which is about 280 mW/m2. One part of this flux goes up through the ice sheet, and another part (180 mW/m2) essentially melts the ice at the base. This number of 180 mW/m2 seems low, but it corresponds to a melt of approx. 10% of the ice created by snow fall!
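The q value follows from Fourier's law of heat conduction. The sketch below is illustrative, not the paper's calculation: the 0.8 m sensor depth comes from the text, while the sediment thermal conductivity and the two temperatures are assumptions, chosen so that the result lands near the reported ~280 mW/m2:

```python
def fourier_heat_flux(t_deep, t_interface, depth_m, k):
    """Conductive heat flux q = k * dT/dz in W/m2, positive upward
    when the deeper sensor is warmer than the interface."""
    return k * (t_deep - t_interface) / depth_m

# Assumed values: k ~ 1.26 W/(m K) for water-saturated sediment,
# temperatures in deg C; only the 0.8 m spacing is from the text.
q = fourier_heat_flux(t_deep=-0.372, t_interface=-0.55, depth_m=0.8, k=1.26)
q_mw = q * 1000.0   # convert W/m2 to mW/m2
```

The point of the sketch is the sensitivity: with only ~0.2°C of temperature difference across 0.8 m of mud, accurate thermistors are essential to pin down the flux.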
The large measured heat flow comes as a surprise, for the usually accepted values for Antarctica (which were derived from various models) are closer to 50 mW/m2.
So we have here again a nice argument not to neglect measurements, and not to rely exclusively on theoretical modelling. This new paper shows a natural phenomenon contributing to the melt of the WAIS (West Antarctic Ice Sheet), and not the usual suspected culprit of (anthropogenic) global warming. It remains to be seen whether other measurements at other locations deliver results pointing in the same direction.
This paper follows one by Amanda Lough showing that large volcanoes exist below the WAIS, and that this volcanic activity may also contribute to the increasing ice melt.
To conclude, here is a figure from Wikipedia showing the heat fluxes over the full globe: note that this flux is highest at the ocean ridges, as should be expected!
The total heat power streaming from the interior to the surface of the earth is estimated to be about 46 TW; this has to be compared to 17 TW power released by human activity (see the meteoLCD energy widgets).
As a heat wave spreads through Western Europe (and Luxembourg), I would like to give some comments on the heat stress that has been measured at meteoLCD since May 2001 (see the paper by Charles Bonert and Francis Massen here). We are still the one and only station showing this important parameter in real time on the web.
1. Some technical details on the apparatus.
The ISO standard defines how the heat stress is measured:
(figure from here)
Under usual sunshine conditions the globe temperature is the highest, the dry-air thermometer gives an intermediate temperature, and the wet-bulb thermometer shows distinctly lower values. The next figure gives the situation today (30 June 2015, 14:00 UTC) at Diekirch:
If we take the last (rightmost) values, we have GlobeT = 40°C, DryT = 30°C and WetT = 22°C. Using the formula given in the figure we get WBGT = 0.7*22 + 0.2*40 + 0.1*30 = 26.4°C, shown by the last red point.
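The computation is a simple weighted sum; a minimal sketch using the standard outdoor weighting (0.7 wet bulb, 0.2 globe, 0.1 dry air) and today's readings:

```python
def wbgt_outdoor(wet_t, globe_t, dry_t):
    """Outdoor wet-bulb globe temperature (deg C) with the
    standard weighting: 0.7*Twet + 0.2*Tglobe + 0.1*Tdry."""
    return 0.7 * wet_t + 0.2 * globe_t + 0.1 * dry_t

# Readings at Diekirch, 30 June 2015, 14:00 UTC
wbgt = wbgt_outdoor(wet_t=22.0, globe_t=40.0, dry_t=30.0)   # 26.4 deg C
```

Note how the wet-bulb reading dominates the result: evaporation (and hence humidity) matters far more for heat stress than the air temperature alone.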
The differences between the three thermometer readings are enormous, and the very low wet-bulb temperature shows how efficient evaporation is for cooling. The next picture shows the wet-bulb device as used at meteoLCD: it is essentially a Pt100 sensor covered by a cotton wick whose other end plunges into a pot of distilled water. Twice a day this water reservoir is refilled by a peristaltic pump. We use distilled water to avoid hardening of the wick by dissolved lime, which is abundant in tap water. A wire mesh (not shown here) is needed to keep thirsty or curious birds from picking at or stealing the wick (yes, this has happened several times).
2. When is warm too hot?
Normally the body temperature (measured in the rectum) should not exceed 37°C, with 38°C set as the upper limit. “Too hot” is a situation where the WBGT pushes the internal body temperature above this limit. Depending on physical activity (and clothing), this limit is reached sooner or later. A heavy worker, or a soldier making strong physical efforts in heavy clothing, will reach this dangerous situation much earlier than a tourist resting on the sea-shore. The metabolic rate gives, in W, the heating power produced by the working body: for a body at rest it is about 65 W, for hard work it can exceed 300 W. So compared to the man at rest, a worker will reach the WBGT limit much sooner, as shown by the following figure (same ref. as above):
A distinction is also made between a person acclimatized to the warm conditions and one who is not.
3. An example
Now let's take a person riding a bicycle at about 38 km/h. The corresponding MET (metabolic rate) is, according to here, about 5, which corresponds to 5*65 = 325 W. Using the above diagram for the unacclimatized person (remember: this is the first day of a heat wave!) we see that the heat stress limit is about 24°C: so a cyclist starting at 14:00 UTC (16:00 local time) exceeds the limit (as the WBGT is 26.4°C) and puts himself at risk of a heat stroke.
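The check above can be sketched in code; the 65 W per MET and the ~24°C limit for an unacclimatized person at this workload are the values used in the text, not general constants:

```python
def metabolic_power(met, watts_per_met=65.0):
    """Heat production in W for a given MET level
    (65 W per MET, the value used in the text)."""
    return met * watts_per_met

def heat_stroke_risk(wbgt, wbgt_limit):
    """True if the measured WBGT exceeds the limit for this activity level."""
    return wbgt > wbgt_limit

power = metabolic_power(5)                                # 325 W for the cyclist
at_risk = heat_stroke_risk(wbgt=26.4, wbgt_limit=24.0)    # True: better ride earlier
```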
May I suggest that he do his cycling much earlier, starting for instance at 06:00 UTC (08:00 local time) and finishing by 10:00 local time at the latest.
The fact is that more people die from cold than from heat, but heat can nevertheless be an insidious danger. Usually normal common sense is all that is needed to remain safe, and maybe this blog comment will be of some help!
1. The “Consensus Science” ignores or belittles the solar influence.
A furious debate has been going on for many, many years about the influence of solar variations on global and regional climate. We all know that solar irradiance varies periodically over 11 years and the solar magnetic field over 22 years, and that many other periodic variations can be found (for instance the Suess cycle of 211 years, the millennium cycle explaining the Minoan, Roman and Medieval warm periods, and the very long Milankovitch cycles which pace the great ice ages).
The total irradiance (the power sent by the sun to the earth) varies little over one cycle: about 1 W/m2 (at the top of the atmosphere, on a surface perpendicular to the rays) from maximum to minimum. This must be compared to the ~1366 W/m2 mean irradiance. The “consensus” climatologists and the IPCC insist that this is too little to explain, for instance, the 0.8°C warming observed during the last 100 years, and that it is completely swamped by the radiative forcing of our greenhouse gas emissions (estimated at ~2 W/m2 for CO2 alone when the year 1750 is taken as zero). What this consensus science ignores is that the UV irradiance varies much more during a solar cycle, with many indirect and possibly amplified consequences due to ozone heating and influence on the great oceanic oscillations (see here).
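The “too little” argument can be made quantitative: the ~1 W/m2 swing is measured on a surface facing the sun, so as a global-mean forcing it must be divided by 4 (the sphere intercepts sunlight as a disk) and multiplied by the absorbed fraction. A minimal sketch, where the albedo of 0.3 is the standard textbook value, assumed here:

```python
def global_mean_forcing(delta_tsi, albedo=0.3):
    """Convert a change in total solar irradiance (W/m2, normal to the rays)
    into a global-mean radiative forcing: divide by 4 (disk/sphere ratio)
    and keep only the fraction not reflected back to space."""
    return delta_tsi * (1.0 - albedo) / 4.0

solar_cycle_forcing = global_mean_forcing(1.0)   # ~0.18 W/m2 peak to trough
# versus the ~2 W/m2 attributed to CO2 since 1750 in the text above
```

This is exactly why the direct TSI effect looks small; the realist argument rests on the much larger UV variability and its indirect amplification, which this simple budget does not capture.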
Also ignored is what we know from history: periods of low solar activity (such as the Maunder minimum during the first part of the 17th century, or the Dalton minimum around 1820) were much colder (by 0.2 to 0.4°C) and are described by historians as periods of famine and social unrest due to poor agricultural productivity.
2. The new Ineson et al. paper
Sarah Ineson et al. published on 23 June 2015 an interesting paper titled “Regional climate impacts of a possible future grand solar minimum” (Nature Communications, open access). This paper is interesting not for its usage of climate models (we all know how unreliable these can be), but for the steps made in acknowledging what climate realists have been saying for many years:
a. the current decline in solar activity is faster than any other such decline in the past 9300 years
b. this decline may lead to Maunder Minimum-like conditions with a probability of up to 20%
c. recent satellite data show that the variability of UV irradiance could be considerably larger than given by previous estimates
Their modeling exercise (EXPT-B) suggests a regional winter-time cooling for Northern Europe and the eastern USA of about 0.4°C.
This figure of the annual mean temperatures shows the solar-induced cooling, which touches nearly every part of the globe (under the hypothesis that the ongoing quiet solar situation will have the same effects as it had during the Maunder minimum).
This cooling could begin around 2030 and continue up to 2100! If you think of the coming Paris COP in December 2015, where all countries will be coaxed into binding warming-lowering policies, this new paper should make ripples in the naive enthusiasm of the anti-warming advocates.
Could it be that our rising CO2 emissions, besides their positive influence on plant productivity and planetary greening, will be our best insurance against a fall back into a Little Ice Age?
Read also these contributions in Climate Dialogue about what will happen during a new Maunder Minimum.
A couple of weeks ago I sent the Luxemburger Wort a letter with some comments on the introduction of smart meters (intelligent electricity counters), which will shortly begin here in Luxembourg. The client has no choice but to accept!
This letter was published in abridged form last Saturday (20 June 2015) on page 24 of the Luxemburger Wort. Here is the full text I wrote on this subject (originally in German, translated here):

Smart meters: naive enthusiasm?

The forced introduction of the intelligent electricity meter (smart meter) is being praised in the media in the highest tones and without any criticism. Of course it is useful and convenient when the electricity supplier can read the meter remotely, and when a digital display gives a more precise indication of the momentary consumption than the rotating disk of the old “Ferraris” meter can. What is almost systematically concealed, however, are several serious consequences of this new meter:

- The smart meter (known in France as the “compteur mouchard”, i.e. snitch meter) delivers a complete picture of our living habits, since the almost minute-by-minute polling of the meter can establish an extremely sharp temporal profile of electricity consumption.

- The traditional, simple and easily understandable tariff structure will very probably soon be replaced by strongly fluctuating price periods (possibly at 10-minute intervals), meant to force the customer into a consumption pattern that corresponds not to his wishes but to those of the supplier.

- The smart meter is the indispensable gateway to the “intelligent grid” with its DSM (Demand Side Management). Gone are the times when the maximum power, fixed at the main fuse box, was available at any time without ifs and buts: now household appliances are to be switched over the grid whenever the provider wishes (the pill is sweetened with a cheaper tariff), and/or the available power will be temporarily throttled by force.

The extravagant energy savings which are often the main motive for smart meters have proven to be a fallacy in the countries where these devices have been in use for some time or were introduced as a test. A 2011 study by the Fraunhofer Institut on German and Austrian households shows that the electricity savings amount on average to only 3.7%.

These “guilt meters”, as Prof. Woudhuysen called them in an article in the online magazine WIRED, can of course be hacked (as demonstrated by the Chaos Computer Club) and allow an intrusion into the private sphere which a report of Ryerson University (Canada) formulates as follows: “Smart appliances offer utilities the opportunity to control areas of life that Courts have considered to be private and intimate”.

Since the big consulting firms now have open access to all our levels of decision-making, let me close with a study by Ernst & Young. Its 2013 report “Kosten-Nutzen-Analyse für einen flächendeckenden Einsatz intelligenter Zähler” (cost-benefit analysis for a blanket deployment of intelligent meters) states quite clearly that the EU's targeted rollout quota of 80% by 2022 via a general installation obligation is economically unreasonable for the majority of customer groups.

PS: A more detailed discussion (with all references) was published by the author in APESS Récré No. 28 (2014) under the title “Compteurs et réseaux intelligents, clients impuissants”.
If you have a comment on this subject, please feel free to give it here! If you wish a copy of the APESS text, please ask for it.
1. Germany’s negative electricity prices.
We often read in the press that German wind and solar electricity exceeds demand, and must be exported at very low or even negative prices. Agora Energiewende wrote in June 2014 that negative prices are becoming more frequent, as the conventional fossil and nuclear producers cannot reduce their minimum production (the “spinning reserve”, often also called the “must-run minimum production”) below 20 to 25 GW. If all things remain as they are, Agora predicts more than 1000 hours per year of negative electricity prices for 2022! But they also say that up until now, renewables have never been able to produce more than 65% of demand, even during peak production periods. The excess in production leading to negative prices is caused by the inability of intermittent renewables to guarantee the needed electrical power at every moment of the year. Conclusion: wind and solar (the main renewables) are unsteady (and often unpredictable) producers which absolutely need spinning reserve to ensure year-round electricity availability.
2. Is there an upper limit for renewable producers in a stable electrical grid?
This question has been researched by Jesse Jenkins (MIT) and Alex Trembath (of the Breakthrough Institute). They arrived at a very easy-to-memorize rule of thumb: “When wind and solar work without subsidies (as other producers do), the maximum share they can reach in the total power production of the grid is equal to their capacity factor.”
They give the following graph from another publication showing the decline in the value factor of Wind and Solar electricity with increasing market share in Germany (boxes added by me):
Conclusion: Economics, and not so much technology impose an upper limit for the integration of wind & solar electricity.
Let me give two examples:
A. Rheinland-Pfalz (the German Land bordering Luxembourg), 2013 (see link):
Production from wind and solar: 3041916 MWh and 1369808 MWh
Capacity factors: 0.151 for wind and 0.092 for solar
Fraction of the total power produced: 0.203 for wind and 0.091 for solar
Conclusion: Wind production exceeds the limit given by the rule of thumb, and solar power has reached its maximum.
Production from wind and solar: 83027 MWh and 73738 MWh
Capacity factors: 0.162 for wind and 0.089 for solar
Fraction of total power produced: 0.029 for wind and 0.026 for solar
Conclusion: Wind and solar productions are both well below the rule of thumb limit.
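The rule-of-thumb comparison for both examples condenses into a few lines; the numbers are those given above (the second region is unnamed in the text, so it is labeled generically here):

```python
def exceeds_rule_of_thumb(share, capacity_factor):
    """Jenkins/Trembath rule of thumb: the unsubsidized market share of an
    intermittent source tops out at its capacity factor."""
    return share > capacity_factor

# (share of total production, capacity factor) from the two 2013 examples
regions = {
    "Rheinland-Pfalz wind":  (0.203, 0.151),
    "Rheinland-Pfalz solar": (0.091, 0.092),
    "second region wind":    (0.029, 0.162),
    "second region solar":   (0.026, 0.089),
}
over_limit = {name: exceeds_rule_of_thumb(s, cf) for name, (s, cf) in regions.items()}
# Only Rheinland-Pfalz wind is above its limit; its solar sits right at it.
```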
3. Two other papers/comments on this problem
Proteos writes in his French blog post “Les subventions à l’éolienne et au solaire sont parties pour durer” (subsidies for wind and solar are here to stay):
He writes that wind and solar have a marginal cost of zero (wind and sunshine are free), so on average they fetch a price lower than the average market price, and with increased renewable penetration this price keeps falling (the picture shows the German situation):
Without subsidies these renewable producers will go out of business, as will the conventional producers delivering the base load and spinning reserve. In the end all power producers must be subsidized to keep a working electrical infrastructure and power supply.
J.M. Korhonen writes in his blog that about 20% of wind and 74% of solar production are worthless, and that under a free market the renewables revolution will stop dead in its tracks once peak production reaches demand. He is also skeptical of the much-touted “demand side management” (to be introduced with the smart meters), which will not prevent PV from hitting a wall.
4. A limit imposed by material requirements.
Korhonen cites a 2013 paper by Vidal et al., published in Nature Geoscience, with estimates of the extraordinarily huge amounts of concrete, aluminium, copper and glass needed by wind and solar if a world production of 25000 TWh were to be reached in 2050. From 2035 on, these requirements would outstrip the total world production of the year 2010. The next figure is self-explanatory:
Note that despite all the hullabaloo about uranium mining, nuclear power comes out best!
5. Overall conclusion
Let me give this in the words of Korhonen:
“In conclusion, we may very well have too much of a good thing. And this is something that bears remembering the next time someone tells you that renewable overproduction is not a problem, or that renewables are reducing electricity prices and making existing plants uncompetitive. Or applauds, when 50% (or some other figure) of daily electricity production is met from renewable sources.”
As so often, reality bites harder than the teeth of naive greenies!
Dr. Philip Lloyd from South Africa has published in Energy & Environment a short but easily understandable paper addressing this question (link to abstract). He took several reconstructed temperature series reaching back more than 8000 years, i.e. covering the Holocene period. An example, shown in the figure below, is the GISP2 reconstruction from a Greenland ice core, where temperatures have been deduced from an analysis of the Ar (argon) and N2 (nitrogen) isotopes.
First he de-trended the data by subtracting the best polynomial fit (which in this case was linear); then he calculated the differences between temperatures 100 years apart, and finally computed the standard deviation of this ensemble. The following figure gives the result for the 4 series he used:
Taking these 4 results and computing the average gives 0.98°C (rounded to 2 decimals) with an uncertainty of 0.27°C.
So the historical data show that temperatures fluctuate naturally by ~0.98°C during one century. As the HadCRUT3 series for global temperatures gives a warming of 0.7°C from 1900 to 1999, this number is smaller than the natural centennial variability. The author concludes prudently that “the signal of anthropogenic global warming may not yet have emerged from the natural background”, which means in very simple words that human-caused centennial warming (by greenhouse gas emissions, for instance), if it exists, does not for the moment exceed natural temperature variability.
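Lloyd's three-step procedure (de-trend, take 100-year differences, compute their standard deviation) can be sketched as follows; the data here are synthetic random numbers standing in for an ice-core series (annual resolution assumed), so only the method, not the 0.98°C result, is reproduced:

```python
import random
import statistics

def centennial_variability(years, temps, lag=100):
    """Lloyd's procedure: remove a linear trend, take differences between
    temperatures 'lag' years apart, return their standard deviation."""
    n = len(years)
    # Least-squares linear fit (the 'best polynomial fit' was linear here)
    ybar = sum(years) / n
    tbar = sum(temps) / n
    slope = sum((y - ybar) * (t - tbar) for y, t in zip(years, temps)) / \
            sum((y - ybar) ** 2 for y in years)
    detrended = [t - slope * (y - ybar) for y, t in zip(years, temps)]
    # Differences between values 'lag' years apart (constant offset cancels)
    diffs = [detrended[i + lag] - detrended[i] for i in range(n - lag)]
    return statistics.stdev(diffs)

# Synthetic stand-in: a tiny linear trend plus white noise
random.seed(0)
years = list(range(8000))
temps = [0.0001 * y + random.gauss(0.0, 0.5) for y in years]
sigma100 = centennial_variability(years, temps)
```

For independent noise of standard deviation s, the 100-year differences have standard deviation s*sqrt(2), which is what the sketch recovers; real proxy series are autocorrelated, so their centennial variability must be measured, as Lloyd did, not predicted this way.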
Prof. Judith Curry (Georgia Tech) has a very interesting comment in her outstanding blog Climate Etc.
A journalist asked her: “What are the most controversial points in climate science related to AGW (anthropogenic global warming)?”
Prof. Curry gives two very simple, easily understandable answers:
1. Has the warming since 1950 been dominated by human causes?
2. How much will the planet warm during the 21st century?
The IPCC continues to ignore the importance of natural causes of climate change (and global warming), asserting as an act of faith that nearly all changes are caused by human activity. Some dubious graphs in the ARs and elsewhere showed that observations and models only become compatible when the climate models take account of greenhouse gas emissions (actually of the atmospheric CO2 mixing ratio):
(“Climate Change Attribution” by Robert A. Rohde)
This figure ignores the fudge factors (i.e. parameters) introduced into the models until their results become similar to the observations. This technique is closer to curve fitting than to an understanding of all the physical causes of the observed temperature variation.
The insufficient understanding of the natural causes of climate change remains the major obstacle to believing the IPCC's so-called “consensus”. We are still largely ignorant of the magnitude (and possibly even the sign) of the influence of factors such as cloud cover and solar variation (total irradiance, UV changes, possible amplification of small natural changes). What we are sure of is that the climate models enormously overstate the observed global temperature increase of the last 15 to 18 years (read this comment by Fyfe et al.):
This figure (link) documents clearly that climate models should not be used as a basis for political decisions: the green curve represents the global temperature anomaly according to HadCRUT4 (land- and ocean-based weather stations) and UAH (satellite data) up to 2013. It is remarkable how far the different models deviate from one another: the enormity of the differences makes the suggestion that the average (the black line) somehow represents the “truth” absolutely questionable. Are our political deciders aware of this?
The researchers of the University of Alabama in Huntsville (UAH), mainly John Christy and Roy Spencer, form one of the two teams that analyze the MSU (Microwave Sounding Unit) data delivered by about 16 successive or simultaneous satellites orbiting the earth since December 1978 (the other team is the RSS crew). The UAH people have now published version 6.0 of their dataset, which differs in many points from the previous version 5.6, as explained by Dr. Roy Spencer in this report.
The report contains an interesting figure which gives the decadal temperature trends (in °C/decade) for different global regions, also making a distinction between land and ocean regions.
We see that the Antarctic has not warmed at all, and the global oceans (which might represent the best estimate for global warming) have warmed by about 0.08 °C/decade; extrapolated, this would mean a warming of about 0.8°C by 2080, clearly nothing to be afraid of!
The Arctic oceans show the greatest trend (about 0.27 °C/decade), and it should be noted that the new reanalysis has cut the previous trend nearly in half!
Also interesting are the new data for the Northern and Southern hemisphere oceans (points NHemis_O and SHemis_O): both trends are astonishingly close (approx. 0.09 and 0.075 °C/decade).
This figure gives no information on the global temperature hiatus seen since about 1998; we will have to wait until the UAH people publish the relevant data.
I continue the discussion on natural disasters using the handy graphing feature of the em-dat website of the UCL (Université Catholique de Louvain).
First the graph of floods (the left axis shows the yearly numbers):
If we look at the total number of deaths caused by these floods, there clearly is a case for optimism (left axis = deaths in thousands):
The post-2000 casualties are no higher than those of 1960, when the reported number of floods was much lower!
The droughts statistics show much more variability:
Here one cannot see a spectacular rise, but a lot of inter-annual variability. The exceptionally high peak corresponds to 1981, the last maximum from the right to the year 2000. Since that year the global number of droughts has been decreasing (which does not mean that several regions, such as parts of California, are not suffering from an ongoing severe drought).
The numbers of deaths are astonishingly small and show no trend:
What is the conclusion of this little exercise? It is not correct to state that natural catastrophes like floods and droughts are continuously increasing due to ongoing global warming. This litany is dear to many environmentalists and politicians, whose agenda is impervious to real data.