By Paul Homewood
Back in 1993, Arctic alarmist Mark Serreze was joint author of a letter to Nature, reporting research on the evidence for greenhouse warming over the Arctic.
ATMOSPHERIC general circulation models predict enhanced greenhouse warming at high latitudes owing to positive feedbacks between air temperature, ice extent and surface albedo.
Previous analyses of Arctic temperature trends have been restricted to land-based measurements on the periphery of the Arctic Ocean. Here we present temperatures measured in the lower troposphere over the Arctic Ocean during the period 1950–90. We have analysed more than 27,000 temperature profiles, measured by radiosonde at Russian drifting ice stations and by dropsonde from US ‘Ptarmigan’ weather reconnaissance aircraft, for trends as a function of season and altitude. Most of the trends are not statistically significant. In particular, we do not observe the large surface warming trends predicted by models; indeed, we detect significant surface cooling trends over the western Arctic Ocean during winter and autumn. This discrepancy suggests that present climate models do not adequately incorporate the physical processes that affect the polar regions.
Before anybody jumps on me, yes this is only up to 1990, but it nevertheless shows there is no evidence at all for polar amplification, at a time when CO2 emissions were rising rapidly.
It is also interesting that they realised they could not trust the small number of land-based measurements.
But what is most interesting is that, according to GISS’s hand-selected handful of land-based stations, most of the Arctic did show moderate warming between 1950 and 1990.
From all of this we can gather that:
1) GISS’s temperatures for the Arctic are worthless.
2) Grant funding can trump the best of research.
By Paul Homewood
Roy Spencer has an interesting post up:
Why 2014 Won’t Be the Warmest Year on Record
October 21st, 2014 by Roy W. Spencer, Ph. D.
Much is being made of the “global” surface thermometer data, which three-quarters of the way through 2014 is now suggesting the global average this year will be the warmest in the modern instrumental record.
I claim 2014 won’t be the warmest global-average year on record.
…if for no other reason than this: thermometers cannot measure global averages — only satellites can. The satellite instruments measure nearly every cubic kilometer – hell, every cubic inch — of the lower atmosphere on a daily basis. You can travel hundreds if not thousands of kilometers without finding a thermometer nearby.
(And even if 2014 or 2015 turns out to be the warmest, this is not a cause for concern…more about that later).
The two main research groups tracking global lower-tropospheric temperatures (our UAH group, and the Remote Sensing Systems [RSS] group) show 2014 lagging significantly behind 2010 and especially 1998:
With only 3 months left in the year, there is no realistic way for 2014 to set a record in the satellite data.
Granted, the satellites are less good at sampling right near the poles, but compared to the very sparse data from the thermometer network we are in fat city coverage-wise with the satellite data.
In my opinion, though, a bigger problem than the spotty sampling of the thermometer data is the endless adjustment game applied to the thermometer data. The thermometer network is made up of a patchwork of non-research quality instruments that were never made to monitor long-term temperature changes to tenths or hundredths of a degree, and the huge data voids around the world are either ignored or in-filled with fictitious data.
Furthermore, land-based thermometers are placed where people live, and people build stuff, often replacing cooling vegetation with manmade structures that cause an artificial warming (urban heat island, UHI) effect right around the thermometer. The data adjustment processes in place cannot reliably remove the UHI effect because it can’t be distinguished from real global warming.
Satellite microwave radiometers, however, are equipped with laboratory-calibrated platinum resistance thermometers, which have demonstrated stability to thousandths of a degree over many years, and which are used to continuously calibrate the satellite instruments once every 8 seconds. The satellite measurements still have residual calibration effects that must be adjusted for, but these are usually on the order of hundredths of a degree, rather than tenths or whole degrees in the case of ground-based thermometers.
And, it is of continuing amusement to us that the global warming skeptic community now tracks the RSS satellite product rather than our UAH dataset. RSS was originally supposed to provide a quality check on our product (a worthy and necessary goal) and was heralded by the global warming alarmist community. But since RSS shows a slight cooling trend since the 1998 super El Nino, and the UAH dataset doesn’t, it is more referenced by the skeptic community now. Too funny.
In the meantime, the alarmists will continue to use the outdated, spotty, and heavily-massaged thermometer data to support their case. For a group that trumpets the high-tech climate modeling effort used to guide energy policy — models which have failed to forecast (or even hindcast!) the lack of warming in recent years — they sure do cling bitterly to whatever will support their case.
As British economist Ronald Coase once said, “If you torture the data long enough, it will confess to anything.”
So, why are the surface thermometer data used to the exclusion of our best technology — satellites — when tracking global temperatures? Because they better support the narrative of a dangerously warming planet.
Except, as the public can tell, the changes in global temperature aren’t even on their radar screen (sorry for the metaphor).
Read the rest here.
By Paul Homewood
A paper by NCAR scientists was published in GRL last April, which found that, apart from around the Antarctic Peninsula, both sea surface and surface air temperatures have declined since 1979, consistent with the expansion of Antarctic sea ice.
In their summary they find:
During the later period, the distribution of SST trends shows notable and widespread cooling over Southern Ocean, except for the area near the Antarctic Peninsula and adjacent West Antarctica where SST warming is observed. The widespread SST decreases correspond to areas of sea ice expansion, while the region of SST warming is associated with sea ice loss. Such a physically intuitive relationship suggests that SSTs can be used as a proxy for sea ice in areas near the Antarctic continent, allowing inferences of past sea ice behavior from conventional SST measurements.
Junk scientists would like to convince you that the opposite is true.
By Paul Homewood
It looks like Owen Paterson has let the genie out of the bottle. This is the editorial comment from yesterday’s Telegraph:
The fire at Didcot B power station is not going to bring the National Grid to its knees. But in combination with other fires at Ironbridge and Ferrybridge power stations, and problems with the Heysham and Hartlepool nuclear reactors, it will chip away at our surplus generating capacity, to the point where blackouts will become, if not likely, then far more likely than they should be.
The underlying problem, as Brian Wilson spells out on the opposite page, is simple. Our power stations are ageing fast. We have eked out their lifespan for longer than expected, but replacements are urgently needed. Yet for years, our politicians have failed to act, promoting costly and over-subsidised renewables rather than building new gas or nuclear plants. To make matters worse, much of our capacity has been scrapped, in compliance with environmental restrictions set in Brussels.
If things continue as they are, the prospect has been raised of Seventies-style restrictions on energy use, even rolling blackouts. That is a grim prospect for a 21st-century economy. To avoid it, we first need to get serious about energy efficiency. Even if they do not help to save the planet, measures such as better insulation, or more watchful monitoring of the electricity meter, would make sound financial sense. Unfortunately, it seems to go against the spirit of the times to put on a jumper to cope with the chill; it is far easier simply to turn up the thermostat.
Beyond that, there is an obvious need for more generating capacity. New nuclear plants are at last being approved, but they are expensive to build and take years to construct. There is also a case for suspending the provisions of the Climate Change Act, to buy Britain some time to get itself out of this mess: given the amount of CO2 emitted worldwide, it will hardly doom the planet if we take off our hair shirt for a spell. We should also consider the proposal by Owen Paterson, the former environment secretary, that we build small-scale nuclear reactors rather than pointless offshore wind farms.
The Didcot episode also raises extremely serious questions for Labour. Ed Miliband, who lumbered us with the Climate Change Act in the first place, has repeatedly promised that Labour will decarbonise the electricity supply by 2030. As the Didcot accident makes clear, it will already cost tens of billions just to keep the lights on – so where on earth would Mr Miliband find the tens of billions more to replace our coal and gas capacity completely? And what source of power would he use instead? This is fantasy policy, on an issue that could not be more important to Britain’s citizens, or Britain’s future.
By Paul Homewood
There are a few bits of the CCC response to Owen Paterson’s call to scrap the Climate Change Act which have slipped under the radar. These concern the costs of decarbonisation.
Here is what the CCC have to say:
1) 1 to 2% of GDP
UK GDP is around £1.5 trillion, so we are looking at an annual cost of £15 to 30 billion. The 2008 Climate Change Act talked of annual costs of £14.7 to 18.3 billion, albeit at 2008 prices, so it would appear that the CCC is estimating potentially much more.
The CCC claim we will be twice as rich by 2050, so will still be a lot better off. But this rather misses the point – if the economy, and in particular the industrial sector, is damaged as a result of decarbonisation, we may not get the growth they are forecasting, indeed we may get none at all.
They also ignore Paterson’s main contention, that there is no practical way, given current technology, that we can reduce emissions by 80%, without shutting down whole chunks of the economy.
2) Energy bills rising by £10/year
This is the usual trick that DECC employs, and it is extremely disingenuous: quoting the annual increment rather than the cumulative rise. Even on this logic, we can expect our energy bills to be £150 higher by 2025. (The baseline used by the CCC was 2010.)
But this only represents part of the cost. Domestic users only take about 35% of the power consumed. Other sectors, such as the Public Sector, Industry, Commerce, Retail and Transport, will also have to pay higher bills, which ultimately will end up being passed on to the public.
For instance, extra costs for the public sector will need to be paid for by higher taxes or reduced services. In industry the costs will be met either through higher prices, lower wages, or worst of all lost jobs.
A cost per household of £150/year (again at 2010 prices) works out, across roughly 26 million households, at about £3.9 billion. As this represents only 35% of the full cost, we can pro-rata it up to £11.1 billion, or a real cost per household of £429/year.
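The pro-rata step can be sketched in a few lines. (The ~26 million household count and the 35% domestic share are assumptions taken from the figures in the post, not official statistics.)

```python
# Sketch of the pro-rata arithmetic (assumed inputs:
# ~26 million UK households, 35% domestic share of power consumed).
HOUSEHOLDS = 26e6          # approximate number of UK households
DOMESTIC_SHARE = 0.35      # domestic users' share of power consumed

cost_per_household = 150.0                          # £/year, at 2010 prices
domestic_total = cost_per_household * HOUSEHOLDS    # ~ £3.9 billion
full_cost = domestic_total / DOMESTIC_SHARE         # ~ £11.1 billion
real_cost_per_household = full_cost / HOUSEHOLDS    # ~ £429/year

print(f"Domestic total: £{domestic_total / 1e9:.1f}bn")
print(f"Full cost:      £{full_cost / 1e9:.1f}bn")
print(f"Per household:  £{real_cost_per_household:.0f}/year")
```

Scaling the domestic figure up by the inverse of the domestic share is what turns £3.9 billion into £11.1 billion, and hence £150/household into £429/household.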
This figure of £429/year looks reasonable for the mid-2020s. We already know that the Government has budgeted a Levy Control Framework of £7.6 billion/year by 2020 (at 2012 prices), representing the cost passed on to consumers to cover all of DECC’s support mechanisms for low-carbon generation.
[The estimate for 2015/16 is already up to £4.3 billion]
Based on current prices, when Hinkley Point C comes on stream in 2023, we will be looking at an annual subsidy of £1.0 billion, and there will inevitably be a large expansion in offshore wind between 2020 and 2025, as well as more nuclear capacity, as the CCC state.
Each tranche of 1GW offshore capacity means a subsidy of £275 million/year, and we will need ten times that amount for offshore to increase its share of total supply by 10%.
So it is easy to see how the annual cost could increase to £11 billion or more by the mid-2020s.
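Putting the post’s figures together gives a rough build-up of that mid-2020s total. (All inputs are the post’s own estimates; any overlap between the Levy Control Framework and the individual items is ignored here, so this is an indicative sketch only.)

```python
# Rough build-up of the mid-2020s annual subsidy, using the
# figures quoted in the post (all assumptions, not official data).
levy_control_2020 = 7.6e9    # £/yr budgeted by 2020 (2012 prices)
hinkley_subsidy = 1.0e9      # £/yr once Hinkley Point C runs (2023)
offshore_per_gw = 275e6      # £/yr subsidy per 1 GW offshore tranche
extra_offshore_gw = 10       # ~10 GW more to lift offshore's share by 10%

total = (levy_control_2020
         + hinkley_subsidy
         + offshore_per_gw * extra_offshore_gw)
print(f"Indicative annual cost by mid-2020s: £{total / 1e9:.1f}bn")
```

On these inputs the total comes out a little above £11 billion/year, which is where the “£11 billion or more” figure comes from.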
3) Bills expected to fall after the mid-2020s
This looks very dubious, and they offer no reasons for it at all in the 2012 report that they link to.
Strike prices guaranteed for wind and other renewables are for at least 15 years, and the expansion of offshore, the most heavily subsidised sector, has barely begun. We will be seeing the cost of subsidising these increasing for many years yet, and not falling until, at the very earliest, well into the 2030s.
As already mentioned, expansion of nuclear capacity is also planned, and the contract for Hinkley Point C has guaranteed prices for 35 years. Assuming similar contracts are offered for other plants, there will continue to be significant upward pressure on energy bills beyond 2030.
It is impossible to forecast how things will look in 2050 (when most of us will be dead anyway!). But, in the shorter term, we can all expect to be paying heavily for Ed Miliband’s folly, the 2008 Climate Change Act.
By Paul Homewood
(You can tell it’s raining today, and the dog’s bored stiff!)
I came across this by accident, from DECC’s press release last December.
The statement, as I reported a week or so ago, claimed:
Increasing the amount of home-grown renewable energy will boost energy security, reduce reliance on imported fossil fuels, and support up to 200,000 jobs by 2020.
When I scrolled to the bottom, (well it is raining), I was astonished to find this.
So DECC quote figures for extra jobs which are provided by the Renewable Energy Association, who will doubtless have exaggerated the numbers in order to drum up more subsidies for their members.
Then DECC themselves use these numbers to justify their renewable strategy, which will pay out yet more huge subsidies to those same members!
Anyone for incest?
By Paul Homewood
The EU have now given the go-ahead to the Hinkley Point C nuclear power station, which is now, I understand, just awaiting the final signing-off of contracts.
DECC have issued a press release, which goes into a bit more detail on some of the key terms, including:
1) Strike Price of £89.50/MWh fully indexed to the Consumer Price Index. Price benefits from upfront reduction of £3/MWh built in on assumption that EdF will be able to share first of a kind costs of EPR reactors across Hinkley Point C and Sizewell C sites. If the final investment decision is not taken on Sizewell C, Strike Price for Hinkley Point C will be £92.50/MWh.
Note that these are in 2012 prices, so the latter price will already have increased to around £96/MWh at 2014 prices. With wholesale power prices around £50/MWh, this would equate to a subsidy currently of £1.03 billion/year, about £46 per household. (Assuming 80% utilisation of the plant’s 3.2GW capacity.)
2) Contract difference payment duration for each reactor of 35 years. Contract term for a reactor will be reduced if that reactor does not reach its Start Date within its Target Commissioning Window.
As already announced, the strike price will be guaranteed for 35 years – at £1.03 billion/yr, this works out at a subsidy of £36 billion. (In addition, of course, to the market price for power received).
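The subsidy arithmetic can be checked directly. (This sketch assumes Hinkley Point C’s 3.2GW capacity, 80% utilisation, and the £96 vs £50/MWh prices quoted above; the per-household split is left out, as it depends on how the cost is allocated.)

```python
# Sketch of the Hinkley Point C subsidy arithmetic from the post.
capacity_mw = 3200       # Hinkley Point C: two EPR reactors, ~3.2 GW
utilisation = 0.80       # assumed capacity utilisation
hours_per_year = 8760
strike_price = 96.0      # £/MWh: £92.50 in 2012 prices, uprated to 2014
wholesale = 50.0         # £/MWh: approximate wholesale power price

output_mwh = capacity_mw * utilisation * hours_per_year   # ~22.4m MWh/yr
annual_subsidy = output_mwh * (strike_price - wholesale)  # ~£1.03bn/yr
lifetime_subsidy = annual_subsidy * 35                    # ~£36bn over contract

print(f"Annual subsidy:  £{annual_subsidy / 1e9:.2f}bn")
print(f"Over 35 years:   £{lifetime_subsidy / 1e9:.0f}bn")
```

The subsidy is simply the gap between the strike price and the wholesale price, applied to every MWh generated, then multiplied across the 35-year contract term.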
3) Arrangements whereby the Strike Price could be adjusted, upwards or downwards, in relation to operational and certain other costs (including business rates, and balancing and transmission charges) at certain fixed points (including through opex reopeners at 15 and 25 years after the start date of the first reactor…..
Arrangements whereby Hinkley Point C would be protected from being curtailed without appropriate compensation. If export of power from Hinkley Point C is curtailed by the operator of the national transmission system and NNBG receives less compensation than is available under current market arrangements it will be able to claim compensation for this difference in respect of any power it has sold on the Season Ahead market.
As I read this, it means that if the Grid does not take power from Hinkley, for instance, because surplus power from windfarms takes priority, Hinkley will receive full compensation.
As I have pointed out before, there should be no surprise there. There is no way EDF would spend £25 billion building the plant without a guarantee that it could run at an economic utilisation of capacity.
By Paul Homewood
There’s a good comment piece at the Telegraph today by Brian Wilson, who was the Energy Minister in the Blair government.
It’s worth reading in full, but I am highlighting this bit:
Things were exacerbated by the Labour government’s refusal, from which I dissented, to allow new nuclear plants to be built. Instead, a fiction was created that imported gas and heavily subsidised renewables would fill the gap left by declining nuclear and polluting coal, which was scheduled to disappear from the scene by 2015. It was nonsense in both economic and environmental terms.
Rather says it all.
By Paul Homewood
The Telegraph have a good summary of the implications of the fire at Didcot power station.
Two more power plants are due to close by next winter under EU anti-pollution rules, making the UK even more vulnerable to blackouts if there are further incidents such as the Didcot fire.
Littlebrook oil plant in Kent is due to shut in March, while the remaining part of a former coal-fired Ironbridge plant in Shropshire will also be forced to close next year – despite having converted to burn “greener” biomass wood pellets.
Both are closing under an EU directive designed to phase out dirty old power plants that did not upgrade their equipment to help prevent acid rain and other forms of pollution.
The so-called Large Combustion Plant Directive has already forced the shutdown over the last two years of a series of coal plants: Kingsnorth in Kent, Didcot A in Oxfordshire, Cockenzie in East Lothian, and part of Ferrybridge in Yorkshire.
Oil-fired plants at Fawley in Hampshire and Grain in Kent have also been forced to close under the rule.
By Paul Homewood
According to NOAA, last month was globally the hottest September on record, 0.04C warmer than September 2005.
They apparently know the global temperature to such an exact amount, despite having no temperature data for most of the world’s landmass.
Conveniently, of course, they forget to mention the margin of error, which is +/-0.12C.
Allowing for the margin of error, September 2014 is statistically tied with 14 of the last 18 years, as the chart below shows.
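The “statistical tie” test amounts to comparing the gap between two anomalies with the margin of error. A minimal sketch (the helper function and the anomaly values are illustrative only, not NOAA’s actual method):

```python
# Two temperature anomalies are statistically indistinguishable when
# their difference falls within the stated margin of error.
def statistically_tied(anomaly_a, anomaly_b, margin=0.12):
    """Return True if the gap between two anomalies is within the margin."""
    return abs(anomaly_a - anomaly_b) <= margin

# Illustrative values 0.04C apart, as with September 2014 vs 2005,
# against NOAA's +/-0.12C margin of error:
print(statistically_tied(0.72, 0.68))   # a statistical tie
print(statistically_tied(0.30, 0.00))   # a genuine difference
```

With a 0.04C gap and a ±0.12C margin, the two Septembers cannot be separated, which is why so many recent years end up tied with 2014.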
Sometimes, margins of error are misunderstood, but GISS summed it up well in their report on 2010 global temperatures.
Global surface temperatures in 2010 tied 2005 as the warmest on record, according to an analysis released Wednesday by researchers at NASA’s Goddard Institute for Space Studies (GISS) in New York.
The two years differed by less than 0.018 degrees Fahrenheit. The difference is smaller than the uncertainty in comparing the temperatures of recent years, putting them into a statistical tie.
If GISS understand this, why do NOAA continue to publish false claims?