h/t Ric Werme
There have been increasing concerns lately about the environmental impact of biomass power plants which use wood pellets.
Quite apart from the local environmental impact in the forests where the wood is sourced, it has been claimed by US scientists that biomass plants could actually increase CO2 emissions.
The giant facility at Drax is one of the UK power stations now converting from coal to biomass, encouraged by subsidies paid for by electricity consumers. Drax has attempted to defend its operation by claiming that it is “using off-cuts of wood that would otherwise be waste”.
One of their suppliers is a US company, Rentech Inc, which has been developing two new plants in Canada to supply the ever growing international market for pellets. On their website, they show this photo of the “First shipment of wood fibre to the Atikokan facility”, one of the two plants.
Now, I didn’t do GCE in trees, but they don’t look like off-cuts that would otherwise be waste to me.
By Paul Homewood
Antarctic sea ice extent continued to set new records in August, finishing the month at 19.154 million sq km, beating the record set last year by 87,000 sq km.
It is worth noting that the climatological maximum, using a 1981-2010 baseline, is 18.581 million sq km, reached on average on 22nd September. No year prior to 1998 had an annual maximum greater than the current level, and only seven years have had maxima higher than 19.154 million sq km.
By Paul Homewood
In April, DMI published their annual report on temperature trends in SW Greenland.
They have built up a temperature series built around the three stations of Nuuk, Ilulissat and Qaqortoq, and present this graph. (They also show Tasilaq, or Angmagssalik, which is in Eastern Greenland, for comparison).
The record mild year of 2010 stands out, but otherwise temperatures in the last decade have been around the same level as the 1930’s and 40’s. At Nuuk, for instance, the 2013 annual mean temperature was –0.3C; twenty-eight other years have been as warm or warmer, including five in the 1940’s.
Looking at the 10-year average for Nuuk, we find that the current figure is –0.2C, only slightly warmer than the 1928-37 figure of –0.3C.
The amount and rate of rise in the last two decades is also comparable to that in the years leading up to 1937.
What is interesting, though, is that GHCN/GISS have felt it necessary to adjust the DMI temperature record. And, you’ve guessed it, they’ve cooled the past.
By Paul Homewood
| Change from last month | +0.00 | +0.00 | -0.07 | | |
| 12 month running average | 0.23 | 0.25 | 0.53 | 0.64 | 0.66 |
| 12 month average – 1981-2010 Baseline | 0.13 | 0.25 | 0.24 | 0.24 | 0.24 |
By Paul Homewood
David Rose’s piece in the Mail on Sunday has already been picked up by WUWT and Bishop Hill, amongst others. But I want to concentrate on one aspect, that of cycles.
The Mail report:
However, Dr Hawkins added that the decline seen in recent years was not caused only by global warming. It was, he said, intensified by ‘natural variability’ – shifts in factors such as the temperature of the oceans. This, he said, has happened before, such as in the 1920s and 1930s, when ‘there was likely some sea ice retreat’.
Dr Hawkins said: ‘There is undoubtedly some natural variability on top of the long-term downwards trend caused by the overall warming. This variability has probably contributed somewhat to the post-2000 steep declining trend, although the human-caused component still dominates.’
Like many scientists, Dr Hawkins said these natural processes may be cyclical. If and when they go into reverse, they will cool, not warm, the Arctic, in which case, he said, ‘a decade with no declining trend’ in ice cover would be ‘entirely plausible’.
Peer-reviewed research suggests that at least until 2005, natural variability was responsible for half the ice decline. But exactly how big its influence is remains an open question – and as both Dr Hawkins and Prof Curry agreed, establishing this is critical to making predictions about the Arctic’s future.
Prof Curry said: ‘I suspect that the portion of the decline in the sea ice attributable to natural variability could be even larger than half.
‘I think the natural variability component of Arctic sea ice extent is in the process of bottoming out, with a reversal to start within the next decade. And when it does, the reversal period could last for several decades.’
This led her to believe that the IPCC forecast, like Al Gore’s, was too pessimistic.
‘Ice-free in 2050 is a possible scenario, but I don’t think it is a likely scenario,’ she concluded.
The cycle they refer to is the Atlantic Multidecadal Oscillation, AMO. Below is the detrended AMO, and you can see it runs on about a 60-year cycle. It is currently around or just below its peak, having risen strongly since the mid 1970’s.
It may remain at the current level for a few more years yet, but it will then fall away for the next 30 years, just as it did from the 1940’s to the 1970’s.
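The way a “detrended” index like the AMO is usually constructed is simple: fit a linear trend to the raw series and subtract it, leaving the multidecadal swing. The sketch below uses entirely synthetic data standing in for North Atlantic SST anomalies, with a ~60-year sine mimicking the cycle described above; it is an illustration of the method, not the actual AMO calculation.

```python
# Sketch of detrending a time series, as done for the AMO index.
# The data here are synthetic, not real SST anomalies.
import numpy as np

years = np.arange(1900, 2015)
# A warming trend plus a ~60-year oscillation, standing in for raw SSTs
raw = 0.005 * (years - 1900) + 0.2 * np.sin(2 * np.pi * (years - 1910) / 60.0)

# Least-squares linear fit, then subtract the fitted trend
coeffs = np.polyfit(years, raw, 1)
detrended = raw - np.polyval(coeffs, years)
# "detrended" now oscillates around zero on roughly a 60-year period,
# which is the shape seen in the detrended AMO graph
```

By construction the residuals average to zero, so what remains is the oscillation itself, peaks in the 1940s and 2000s, trough in the 1970s.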
So, what effect will this have on Arctic ice? We can glean much from examining temperature trends around the part of the Arctic affected by the AMO.
First, Iceland. Below is a report from the Icelandic Met Office in 2008, “Past temperature conditions in Iceland from 1798 to 2007”, which uses the long running, high quality site of Stykkisholmur.
Temperature in Stykkishólmur (Western Iceland)
The temperature (figure 1) has in the long run been increasing during the last 200 years at the rate of +0.7°C per century. This is similar to the general temperature increase in the whole Northern hemisphere during the same period. The warming has been very uneven, dominated by three cold periods and two warm ones.
Annual temperature in Stykkishólmur 1798 to 2007
Figure 1. Annual temperature in Stykkishólmur 1798 to 2007. Note that the values prior to 1845 are interpolated from observations at other stations. The confidence is very low for the years before 1830 and the values are preliminary and should not be referenced. Work on quality improvement is ongoing. A few warm and cold years are highlighted.
The time from 1925 onwards is dominated by a very large cycle that does not show an overall significant warming, although the temperature rise of the last 20 years is considerable.
There is also a large decadal variability before 1925. The year 1892 marked the end of a period dominated by a very large year-to-year variability and the end of a long run of very cold years. There was a relatively warm period during 1837 to 1858, and by overlooking the very cold year of 1835 and a few isolated cold months one can identify the interval 1813 to 1858 as a generally warm one.
The years 1807 to 1812 were very cold. Although the following warm period was considerably colder than the corresponding 20th century warm period it was noted as a generally favourable time for agriculture and the population of the country increased markedly.
The 20th century warm period that started in the 1920s ended very abruptly in 1965. It can be divided into three sub-periods: a very warm one to 1942, a colder interval during 1943 to 1952, and a decisively warm spell during 1953 to 1964.
The cold period 1965 to 1995 also included a few sub-periods: the so-called "sea ice years" of 1965 to 1971, a slightly warmer period from 1972 to 1978, and a very cold interval during 1979 to 1986, after which it became gradually warmer, the last cold year in the sequence being 1995. Since then it has been warm, the warmth culminating in 2002 to 2003. Generally the description above refers to the whole country, but there are slightly diverging details, depending on the source of the cold air.
Several points in this account are worth noting:

1) The reference to a long term temperature increase since 1798.
2) No significant warming since 1925.
3) The references to cold and warm periods, the most recent being the cold period from 1965 to 1986, followed by the recent warming. Note how these, and the warm period culminating in the 1940’s, correspond with the rise and fall of the AMO.
If we look further afield, we find similar patterns in Greenland (Godthab and Angmassalik), Norway (Vardo) and Russia (Murmansk and Salehard). The following graphs are from GISS, and use unadjusted data.
By Paul Homewood
There’s a new paper recently published by James Elsner et al, “The Increasing Efficiency of Tornado Days in the United States”.
The authors analyze the historical record of tornado reports in the United States and find evidence for changes in tornado climatology possibly related to global warming. They do this by examining the annual number of days with many tornadoes and the ratio of these days to days with at least one tornado and by examining the annual proportion of tornadoes occurring on days with many tornadoes. Additional evidence of a changing tornado climate is presented by considering tornadoes in geographic clusters and by analyzing the density of tornadoes within the clusters. There is a consistent decrease in the number of days with at least one tornado at the same time as an increase in the number of days with many tornadoes. These changes are interpreted as an increasing proportion of tornadoes occurring on days with many tornadoes. Coincident with these temporal changes are increases in tornado density as defined by the number of tornadoes per area. Trends are insensitive to the begin year of the analysis. The bottom line is that the risk of big tornado days featuring densely concentrated tornado outbreaks is on the rise. The results are broadly consistent with numerical modelling studies that project increases in convective energy within the tornado environment.
The claim that the average number of tornadoes per tornado day is increasing is subject to a number of potential flaws, which need to be highlighted.
Establishing long term tornado trends is notoriously fraught with problems. McCarthy & Schaefer analysed some of the issues in their 2003 paper, “Tornado Trends Over The Past Thirty Years”.
Changing Observation Practices
The biggest problem, though by no means the only one, is that many, many more tornado reports are filed nowadays. As McCarthy & Schaefer state:
This paper looks at the reported frequencies of tornadoes and their characteristics over the contiguous United States since 1970. There was a significant increase in tornado occurrence during two periods in the last 33 years – in the early 1980s when National Weather Service (NWS) warning verification began, and in 1990 when the WSR-88D became operational.
The increase in reported tornado frequency during the early 1990s corresponds to the operational implementation of Doppler weather radars. Other non-meteorological factors that must be considered when looking at the increase in reported tornado frequency over the past 33 years are the advent of cellular telephones; the development of spotter networks by NWS offices, local emergency management officials, and local media; and population shifts. Changnon (1982) and Schaefer and Brooks (2000) both discuss these influences on tornado reporting.
The growing “hobby” of tornado chasing has also contributed to the increasing number of reported tornadoes.
NOAA address this issue on their Tornado Climatology website, where they say:
Today, nearly all of the United States is reasonably well populated, or at least covered by NOAA’s Doppler weather radars. Even if a tornado is not actually observed, modern damage assessments by National Weather Service personnel can discern if a tornado caused the damage, and if so, how strong the tornado may have been. This disparity between tornado records of the past and current records contributes a great deal of uncertainty regarding questions about the long-term behavior or patterns of tornado occurrence. Improved tornado observation practices have led to an increase in the number of reported weaker tornadoes, and in recent years EF-0 tornadoes have become more prevalent in the total number of reported tornadoes. In addition, even today many smaller tornadoes still may go undocumented in places with low populations or inconsistent communication facilities.
With increased National Doppler radar coverage, increasing population, and greater attention to tornado reporting, there has been an increase in the number of tornado reports over the past several decades. This can create a misleading appearance of an increasing trend in tornado frequency. To better understand the variability and trend in tornado frequency in the United States, the total number of EF-1 and stronger, as well as strong to violent tornadoes (EF-3 to EF-5 category on the Enhanced Fujita scale) can be analyzed. These tornadoes would have likely been reported even during the decades before Doppler radar use became widespread and practices resulted in increasing tornado reports.
Elsner correctly excludes these EF-0 tornadoes from his study for this reason. However, when we analyse EF-1 tornadoes, we find a very similar phenomenon to the EF-0’s, namely a rapidly increasing proportion up to around 1990, as Figure 2 illustrates.
There can only be two reasons for this:

1) It is an artifact of changing observation methods and other non-climatological factors. In this case, Elsner’s inclusion of EF-1 tornadoes could significantly skew his results, as the number of tornadoes before 1990 will be substantially underestimated.
(An EF-1 count rising from 50% to 70% of the original total would add 20% extra tornadoes, assuming the number of EF-2’s and stronger remained the same.)
2) The increasing proportion of EF-1’s is due to genuine meteorological factors. In this case, we would need to acknowledge that the average intensity of tornadoes has been reducing.
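The parenthetical arithmetic above can be made explicit. This is a purely illustrative calculation using assumed round numbers, not figures from Elsner’s paper:

```python
# Suppose 100 tornadoes are reported in a pre-1990 year, of which 50
# are EF-1 (a 50% share of the total). If fuller reporting would have
# logged EF-1s equal to 70% of that original total, with EF-2+ counts
# unchanged, the year's total grows by 20%.
base_total = 100
ef2_plus = 50                           # EF-2 and stronger, held constant
ef1_reported = int(0.50 * base_total)   # 50 EF-1s actually reported
ef1_adjusted = int(0.70 * base_total)   # 70 EF-1s under fuller reporting
extra = (ef2_plus + ef1_adjusted) / (ef2_plus + ef1_reported) - 1
print(f"{extra:.0%} extra tornadoes")   # 20% extra
```

In other words, if the rising EF-1 share is an observational artifact, pre-1990 totals are undercounted by a material margin, which matters for any trend drawn through them.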
By Paul Homewood
I looked at snowfall records for Amherst, MA yesterday in relation to various claims about snowfall from MIT, as Boston was quoted as an example.
Just for completeness, we can look at annual snowfall amounts at Amherst as well.
Although there are big swings from year to year, there does not seem to be much long term change taking place, and snowfall totals in recent years are well within the usual bounds seen throughout the record.
The sort of threats about looming disasters contained in both this study and the UEA one on dengue fever, which I also posted on yesterday, seem to have much in common with the sort of threats made to children if they didn’t behave – hobgoblins, bogeymen, wicked witches and so on.
It’s rather pathetic really.
By Paul Homewood
h/t Dave Ward
“Experts” from the UEA warn us that global warming will bring Dengue fever to Europe.
The UEA report:
The study used current data from Mexico, where dengue fever is present, and information about EU countries to model the likelihood of the disease spreading in Europe. They found that coastal regions around the Mediterranean and Adriatic seas, the Po Valley and North East Italy were most at risk.
Dengue fever is a tropical disease caused by a virus that is spread by mosquitoes, with symptoms including fever, headache, muscle and joint pain. Each year, dengue infects 50 million people worldwide and causes approximately 12,000 deaths – mostly in South-east Asia and the Western Pacific.
Because the mosquitoes that carry and transmit the virus thrive in warm and humid conditions, it is more commonly found in areas with these weather conditions. Dense populations and global travel are also associated with increasing the spread of the disease, which was observed in the last few decades.
Dengue fever, a very old disease, has reemerged in the past 20 years with an expanded geographic distribution of both the viruses and the mosquito vectors, increased epidemic activity, the development of hyperendemicity (the cocirculation of multiple serotypes), and the emergence of dengue hemorrhagic fever in new geographic regions. In 1998 this mosquito-borne disease is the most important tropical infectious disease after malaria, with an estimated 100 million cases of dengue fever, 500,000 cases of dengue hemorrhagic fever, and 25,000 deaths annually. The reasons for this resurgence and emergence of dengue hemorrhagic fever in the waning years of the 20th century are complex and not fully understood, but demographic, societal, and public health infrastructure changes in the past 30 years have contributed greatly. This paper reviews the changing epidemiology of dengue and dengue hemorrhagic fever by geographic region, the natural history and transmission cycles, clinical diagnosis of both dengue fever and dengue hemorrhagic fever, serologic and virologic laboratory diagnoses, pathogenesis, surveillance, prevention, and control. A major challenge for public health officials in all tropical areas of the world is to develop and implement sustainable prevention and control programs that will reverse the trend of emergent dengue hemorrhagic fever.
The maps below show just how far the aegypti mosquitoes have spread in the Americas, but what is apparent is that they were equally well spread in the 1930’s. (Also note the spread around the tip of S America – hardly a hot spot! – and in Argentina.)
A. aegypti distribution in the Americas during the 1930s and in 1970 and 1998.
By Paul Homewood
It seems the Guardian has finally woken up to the fact that, while Britain is busy shutting down the last of its coal fired power plants, most of Europe is busy building new ones.
Worse still, many of these will burn lignite, which emits much more CO2 than black coal.
New coal power stations designed to burn Europe’s massive deposits of lignite pose a serious threat to the continent’s decarbonisation efforts, according to figures released on Wednesday.
Analysts from Greenpeace’s Energydesk compiled data from the German government that shows burning Europe’s reserves of lignite would wipe out the EU’s entire carbon budget from 2020 until the end of the century.
Lignite – also known as brown coal – power stations currently make up more than 10% of the EU’s total CO2 emissions. Greenpeace said that if Europe is to continue to play its part in keeping the world within the internationally accepted limit of 2C of warming, 90% of the carbon contained in its lignite reserves must remain buried.
Despite this, lignite-fuelled power stations are still being built, locking in consumption of the fuel for decades. There are 19 such facilities in various stages of approval, planning or construction in Bulgaria, Czech Republic, Greece, Germany, Poland, Romania and Slovenia. Greenpeace figures show these new projects alone would emit almost 120m tonnes of CO2 every year – equivalent to three-quarters of the annual carbon output of the UK’s energy sector. The average lifespan for a coal power station is about 40 years, meaning the plants could release nearly 5bn tonnes of CO2 into the atmosphere.
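The lifetime figure in that quote follows from simple arithmetic, and it is worth checking it against the numbers the Guardian itself reports:

```python
# 19 new lignite plants emitting ~120m tonnes of CO2 a year, run over
# the typical 40-year lifespan of a coal power station
annual_mt = 120                                   # million tonnes CO2 per year
lifespan_years = 40
lifetime_bn = annual_mt * lifespan_years / 1000   # billions of tonnes
print(lifetime_bn)                                # 4.8, i.e. "nearly 5bn tonnes"
```

So the "nearly 5bn tonnes" claim is simply 120m tonnes a year multiplied out over four decades, with no allowance for plants retiring early or running below capacity.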
Greenpeace energy analyst Jimmy Aldridge said: “The expansion of lignite mining in Europe is today the most serious symptom of the continent’s chronic addiction to dangerous fossil fuels, and a massive threat to its efforts to tackle climate change. The companies involved will continue for as long as they can – we need our political leaders to act in order to stop this situation from getting worse. [Barack] Obama has taken decisive action against coal in the US, it’s time European leaders did the same.”
By Paul Homewood
Just in case you wicked deniers thought a bit of snow disproved global warming, grant-funded climate scientists at MIT would like to remind you that global warming can lead to more snow as well as less snow. Nothing like an each way bet!
Using models (!), they tell us that winters overall may be less snowy. Which is, of course, the opposite of what has actually been happening.
But places like Boston may see more extreme snowfalls.
Well, let’s see what has been happening at Amherst, tucked away in the heart of Massachusetts, and one of the USHCN stations there. The chart below plots all days with 9 inches of snow or more, of which there have been 62 since 1893.
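A count like this is straightforward to reproduce from any daily snowfall record. The sketch below uses invented sample values, not the actual Amherst USHCN data, purely to show the filtering step behind the chart:

```python
# Given daily (year, snowfall_in_inches) records, pick out the
# 9-inch-plus days of the kind the chart plots.
# The sample values here are made up for illustration.
records = [
    (1893, 11.0), (1898, 4.5), (1920, 9.0),
    (1969, 13.2), (1996, 10.1), (2013, 6.0),
]
big_days = [(yr, snow) for yr, snow in records if snow >= 9.0]
print(len(big_days))  # 4 qualifying days in this sample
```

Applied to the full daily record since 1893, the same filter yields the 62 days plotted above.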
The distribution in recent years looks very similar to the early part of the record before 1920. I wonder whether they blamed heavy snowfall on global warming then!