
Why The GHCN Adjustments In The Arctic Are Wrong

February 6, 2015

By Paul Homewood

 

GHCN are quite clear on why they adjust temperatures. This is what their Technical Report says:

 

The nature of the homogeneity adjustments made to remove non-climatic influences that can bias the GHCN-M temperature record are described in Lawrimore et al. 2011 for version 3.0.0. In brief, adjustments are necessary because surface weather stations are frequently subject to minor relocations throughout their history of operation and may also undergo changes in instrumentation as measurement technology evolves. Furthermore, observing practices may vary through time, and the land use/land cover in the vicinity of an observing site can be altered by either natural or man-made causes. Any such modifications to the circumstances behind temperature measurements have the potential to alter a thermometer’s microclimate exposure characteristics or otherwise change the bias of measurements relative to those taken under previous circumstances. The manifestation of such changes is often an abrupt shift in the mean level of temperature readings that is unrelated to true climate variations and trends.

 

To identify and correct these biases, GHCN use their “Pairwise Homogenisation Algorithm” (PHA). Essentially, this compares each station’s temperature series with those of its neighbours, flagging as an outlier any station that shows a shift its neighbours do not share.
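
To see the logic, here is a minimal sketch of the pairwise idea in Python. It is illustrative only, not NOAA’s actual PHA code: the 0.5C threshold, the breakpoint scan and the toy station data are all assumptions. The point to note is that a shift shared by the neighbours produces a flat difference series and no adjustment, so the verdict depends entirely on which stations are treated as “neighbours”.

    import numpy as np

    def biggest_step(diff):
        """Scan a difference series for its largest mean shift and
        return (breakpoint_index, size_of_shift)."""
        best_k, best_shift = None, 0.0
        for k in range(5, len(diff) - 5):   # keep a few years each side
            shift = abs(diff[:k].mean() - diff[k:].mean())
            if shift > best_shift:
                best_k, best_shift = k, shift
        return best_k, best_shift

    def neighbour_votes(target, neighbours, threshold=0.5):
        """Count the neighbours whose target-minus-neighbour series
        contains a step bigger than `threshold` degrees."""
        return sum(1 for nb in neighbours
                   if biggest_step(target - nb)[1] > threshold)

    # Toy data: a real regional cooling in 1965, shared by every station.
    years = np.arange(1930, 1990)
    regional = 4.0 - 1.5 * (years >= 1965)       # genuine climate shift
    rng = np.random.default_rng(0)
    target = regional + rng.normal(0.0, 0.2, years.size)
    neighbours = [regional + rng.normal(0.0, 0.2, years.size)
                  for _ in range(4)]

    print(neighbour_votes(target, neighbours))   # expect 0: shift looks real

Swap the shared regional series for neighbours that did not cool in 1965 and the same code votes to “correct” the target; that is the crux of issue 3 below.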

 

So, if we apply this principle to the Arctic temperature adjustments, there are three key issues to address.


1) Has there been an abrupt shift in temperatures?

Yes, emphatically so, as the raw temperature record below for Akureyri in Iceland shows. There was a sharp decline in temperatures between 1964 and 1966, when annual means fell from 4.7C to 2.18C.

 

[Graph: GISS raw annual mean temperature record for Akureyri, Iceland]

http://data.giss.nasa.gov/cgi-bin/gistemp/show_station.cgi?id=620040630003&dt=1&ds=1

 

And it was at precisely that time that the GHCN algorithm kicked in, cooling the years preceding the decline and adjusting upwards the years after it.

 

[Graph: GHCN adjustment plot for station 62004063000 (Akureyri)]

ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v3/products/stnplots/6/62004063000.gif

 

2) Was the shift real?

Was this abrupt drop in temperature a real, climatic event, or simply a reflection of changing observation practices?

The overwhelming evidence is that it was genuine; a detailed post on this will follow.

Indeed, it was so remarkable a change that it has been described as certainly one of the most dramatic events of the century in the Norwegian Sea.

 

3) What other stations have been used for homogenisation?

The PHA has concluded that this sudden drop in temperature was not real, based on trends at “nearby” stations. Yet, this is not a case of a solitary station in Iceland being adjusted to bring it into line with other Icelandic stations. Indeed, all Icelandic stations have been adjusted in the same fashion.

It is not even a case of the Icelandic stations being brought into line with other nearby Arctic sites in, say, Greenland, as they have also been adjusted in the same way.

Nick Stokes has a useful little app on his website, which maps the sites that have been adjusted up (pink), down (blue) or left unchanged (yellow). It shows the preponderance of pinks.

(One small note: there are a few blues in the area I have covered, but each either stopped recording many years ago or, in the case of Egedesminde, was not operational in 1940. The only exception is Angmagssalik, which has most definitely received a warming adjustment [this may be due to Nick’s data not being up to date].)

 

[Map: GHCN station adjustments (pink = adjusted up, blue = adjusted down, yellow = no change)]

http://moyhu.blogspot.com.au/2015/02/google-maps-app-showing-ghcn-adjustments.html

 

The only conclusion can be that GHCN stations hundreds of miles away, outside the Arctic region, have been used for homogenisation. And herein lies the problem: is there any reason why we would expect the climate of, for instance, Britain to follow the same pattern as that of Iceland or Greenland?

Certainly not according to the experts, as the paper “An Analysis of Icelandic Climate since the Nineteenth Century” by Hanna, Jonsson & Box, published in 2004, showed.

 

They had this to say:

The warming was non-uniform in time, occurring in three distinct phases, approximately from 1880 to 1900, from 1925 to 1940, and from 1983 to 2001. Warming was most rapid in 1919–33, reaching the maximum temperatures over the entire record in 1939 and 1941. The northwestern European records surveyed do not indicate any significant trends over the 1901–30 standard period, whereas Icelandic trends are highly significant, somewhat indicating a decoupling between the Icelandic and northwestern European climates.

In an analysis of Greenland temperature records, Box (2002) lists 1939 and 1941 among the five warmest years and 1907 and 1983 among the five coldest years for the nearest site to Iceland, Tasiilaq, southeast Greenland. This is consistent with the results from Reykjavik (Table IV). Furthermore, this is consistent with the often-cited temperature dipole between Greenland and northwestern Europe (e.g. Van Loon and Rogers, 1978). Thus, 1941 was one of the coldest years of the 20th century in northwestern Europe, e.g. Copenhagen, Oslo, Stockholm (Table IV). 1983 was the 11th warmest year in Copenhagen, 13th warmest year in Oslo, and 18th warmest in Stockholm.

The Icelandic cooling from the 1940s to the 1980s is in broad agreement with a general cooling between the late 1950s and the 1990s observed in western and southern Greenland (Przybylak, 2000; Box, 2002; Hanna and Cappelen, 2003) and also agrees with the P.D. Jones/Hadley Centre data shown in Serreze et al. (2000) of a widespread cooling (or at least muted warming) over southern Greenland, Iceland and the northwestern North Atlantic. These regions experienced a prolonged and deeper mid-20th century cooling when compared with the global warming trend (e.g. Houghton et al., 2001). The contrast is attributable to variations in the intensity of the Icelandic low and is thus linked to the NAO.

 

In short, there exists a temperature dipole, with cold years in Iceland and Greenland associated with warm years in NW Europe, and vice versa. No climatologist worth his salt would attempt to homogenise the two regions against each other.
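
The consequence for any pairwise scheme is easy to demonstrate. In the toy sketch below (numbers assumed purely for illustration), an anti-phase “neighbour” doubles a genuine Icelandic shift in the difference series, so a pairwise step test is guaranteed to read a real climatic event as a station fault:

    import numpy as np

    years = np.arange(1930, 1990)
    swing = -1.5 * (years >= 1965)          # genuine Icelandic cooling
    rng = np.random.default_rng(1)
    iceland = 4.0 + swing + rng.normal(0.0, 0.2, years.size)
    nw_europe = 8.0 - swing + rng.normal(0.0, 0.2, years.size)  # anti-phase

    diff = iceland - nw_europe
    step = diff[years >= 1965].mean() - diff[years < 1965].mean()
    print(round(step, 1))   # about -3.0: double the real 1.5C shift

Whichever side the algorithm then chooses to adjust, the “inhomogeneity” it has found is pure climate.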

 

Of course, all of these scientists might have got it totally wrong, and it might be Nick Stokes and his algorithm who have got it right. Who said pigs could not fly!

23 Comments
  1. A C Osborn permalink
    February 6, 2015 7:45 pm

    Paul, another good argument.
    Their Algorithms are just that, mathematical rules that take no cognizance of the Real Data or the Real Conditions at the time; they can’t, they are just a computer program.
    We all know with computers that if you put garbage in you get garbage out, and that applies even more to the “programming” than it does to the data.
    I wonder how people would feel if their bank used a similar technique. You pay your supermarket bill with your Debit Card, and it is usually around £200; this week you decide to buy a new £350 TV with your normal shopping.
    The bank says: this is not homogeneous, we will only pay out £200. That would go down really well.
    Or your bill only comes to £50, but they insist on paying the supermarket £200 as usual.
    Nobody else could get away with such crap.

  2. February 6, 2015 7:59 pm

    If I understand this correctly, some people have written GCMs to predict future temperatures, and some other people have written the PHA to correct historic data in an attempt to improve the accuracy of GCMs.

    From the description, it also seems that “Pairwise Homogenisation Algorithm” is a (deliberate?) misnomer, in that it compares multiple nearby stations.

  3. Keith permalink
    February 6, 2015 9:11 pm

    Cold years in Iceland appear to coincide with warm years in NW Europe.

    Relative to warm NW Europe, Iceland is a cold outlier.

    Relative to cold Iceland, NW Europe is a warm outlier.

    Why does it warm Iceland rather than cool Europe? How exactly does the algorithm work?

    There have to be lines in the code that control the process, such that a divergence corrects the cold outlier rather than the warm one.
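
    [In broad terms, the pairwise method pins a break on whichever series disagrees with the majority of its own neighbours. The sketch below is an assumption about that vote logic, not verified PHA source code, but it shows why a few dipole stations surrounded by many anti-phase ones will always be the side that gets “corrected”:]

        def attribute_break(step_sizes, threshold=0.5):
            """step_sizes: the largest step found in each
            target-minus-neighbour difference series."""
            votes = sum(s > threshold for s in step_sizes)
            if votes > len(step_sizes) / 2:
                return "adjust target"      # majority sees a step
            return "leave target alone"

        # Two Icelandic neighbours agree with the target (flat differences),
        # eight European ones do not, so the Icelandic station loses the vote.
        print(attribute_break([0.1, 0.2] + [3.0] * 8))   # adjust target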

  4. Mikky permalink
    February 6, 2015 9:30 pm

    This may be an example of failure of algorithms (and of expert meteorological checking of the results), but I would advise extreme caution in drawing premature conclusions.

    One possible problem in the analysis here is that only annual MEAN temperatures are being displayed; the analysis has to be done separately on MIN and MAX (produced by different thermometers), ideally looking at monthly or seasonal averages. A key clue for an inhomogeneity is that it occurs at an instant in time and affects ALL seasons, unlike, for instance, a natural shift to (say) only colder winters, or a spell of summer-only heatwaves.

    Stations separated by hundreds of miles can be compared. For example, Glasgow probably doesn’t match London in its year-to-year temperature fluctuations (Oxford probably does), but if (say) London is 5C warmer ON AVERAGE today, then that difference has probably been very similar for centuries. Thus, London could be used to homogenise Glasgow, maybe for temperature shifts greater than around 0.5C (a sketch of that difference-series check follows below).

    On the other hand, the Arctic may be a special case where the lower-latitude rules don’t apply. I don’t understand why algorithms are used at all in the Arctic; there must be enough Icelandic, Norwegian and Canadian meteorological expertise to do it manually.
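
    [A sketch of Mikky’s difference-series check, with place names and numbers assumed for illustration: the two stations need not track each other year to year, but if their long-term offset is stable, a step in the difference series flags a non-climatic problem.]

        import numpy as np

        rng = np.random.default_rng(2)
        years = np.arange(1900, 2000)
        london = 11.0 + rng.normal(0.0, 0.6, years.size)
        glasgow = london - 5.0 + rng.normal(0.0, 0.3, years.size)  # stable offset
        glasgow[years >= 1960] += 0.8   # simulated station move / new screen

        diff = glasgow - london
        print(round(diff[years < 1960].mean(), 1),
              round(diff[years >= 1960].mean(), 1))
        # about -5.0 then -4.2: the 0.8C inhomogeneity stands out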

    • February 7, 2015 1:10 am

      Mikky, your rationale can safely be ruled out. There is indisputable systemic warming bias. At a minimum it flows from confirmation bias obscuring at least two logical flaws in all (AFAIK) homogenization algorithms. See post below. Read essay When Data Isn’t for many more examples.

  5. February 6, 2015 10:30 pm

    The dipole has been known at least since 1937
    http://onlinelibrary.wiley.com/doi/10.1002/qj.49706327108/abstract

    A period of warm winters in Western Greenland and the temperature see-saw between Western Greenland and Central Europe
    Dr. F. Loewe
    Quarterly Journal of the Royal Meteorological Society
    Volume 63, Issue 271, pages 365–372, July 1937

  6. February 6, 2015 10:35 pm

    This is the oldest source:
    HANN, J. Zur Witterungsgeschichte von Nord-Grönland, Westküste. Meteor. Zeitschrift, 1890, 7: 109-115.

  7. February 7, 2015 1:05 am

    Paul, your post is a good example of the ‘regional expectations’ fallacy in all automatic homogenization algorithms. PHA is just one instantiation. The clearest single example is BEST station 166900. BEST rejected 26 months of extreme cold readings, turning an essentially flat raw record since the station was established in 1956 into BEST warming.
    #166900 is the US Amundsen Scott research station at the South Pole, arguably the most carefully maintained and certainly the most expensive weather station on the planet. Its records need no QC.
    And, for regional Antarctic expectations, the nearest station from which to derive those would be McMurdo, 1300 km away and 2700 metres lower, on the coast! QED. See the last footnote to essay When Data Isn’t for more.

  8. John F. Hultquist permalink
    February 7, 2015 6:41 am

    An off topic question:
    I just read this article from 3 weeks ago.
    http://www.telegraph.co.uk/news/weather/11355906/Mortuaries-overflowing-as-freezing-weather-causes-rise-in-deaths.html

    I do not understand why a body could not be released for burial. Perhaps the funeral rites in the UK are very different from those here in the USA. Here there are businesses (undertakers) that handle such things, and unless there is a legal issue, hospitals or government morgues would be very short-term keepers of a body.
    Your thoughts?

    • February 7, 2015 7:08 am

      Hi John,
      I think the problem here is that there are not enough personnel to determine cause of death (i.e. perform autopsies) because too many people died. They did not foresee so many deaths on account of the cooler weather; i.e. everyone still believes that it is getting warmer…

  9. February 7, 2015 7:01 am

    My investigations have shown that we are cooling from the top latitudes downward, i.e. the current global cooling is more pronounced at the higher latitudes. A typical example can be found in my results from Alaska:

    where the average of ten weather stations showed a decline of -0.055 K/annum since 1998. That is almost -0.9 degrees C (or K) since 1998.
    Unfortunately nobody is telling the poor farmers up there that it is not going to get any better…

  10. tom0mason permalink
    February 7, 2015 7:18 am

    Therein lies the problem with homogenized temperatures: the removal of climatic signals.

    If, for instance, there are three small events (for argument’s sake: a local change in ocean flow, an unseasonal wind change, and a temperature change) around a particular geographical area, and only one station registers the temperature variation, this signal will be lost when homogenized. How do they know that this signal is not significant?
    How do they know that this cluster of changes is not the precursor to something greater happening?

    Also, if homogenizing is such a good idea, why is it not done throughout science?
    E.g. why not homogenize the data from the Hubble Space Telescope? After all, we have a pretty good idea of what we are looking for, so why not just homogenize the data to confirm our pre-ordained view of the universe? Why bother examining the petty little details? Surely it’s the big blurred, averaged, and normalized picture we need.
    Or not.

  11. February 7, 2015 7:50 am

    “and it might be Nick Stokes and his algorithm who have got it right”

    It’s actually not my algorithm. And I don’t run it or use it. I’m not an authority on its workings.

    I’ve put my general thoughts on why it’s a good thing (but in practice has small effect) here.

    • Bloke down the pub permalink
      February 7, 2015 11:34 am

      Nick, this year has been claimed by some authorities to be the hottest evah, by a very small margin. The effect at an individual station may be small, but once smeared across an area otherwise lacking in data, it is sufficient to produce the desired new record. Being an old cynic, I find it hard to believe this is anything other than deliberate.

  12. johnmarshall permalink
    February 7, 2015 11:31 am

    The assumption that temperatures between local areas must be similar is a poor one. I have a pair of instruments giving temperature in two parts of my garden; they are both electronic instruments of 0.5C accuracy, with the same characteristics and made by the same manufacturer. I frequently get temperature differences of over 5C between them, and they are 50ft apart.

    It is also funny that all past data seems to have been too warm.

    • John F. Hultquist permalink
      February 7, 2015 5:14 pm

      Chart the temperature difference, not the actual temperature.
      If there are wide swings in the difference then you have an instrumentation problem.
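
      [A tiny sketch of that check, with the sensor readings simulated: a steady difference between the two sensors points to real microclimate, while wild swings would point to an instrument problem.]

          import numpy as np

          rng = np.random.default_rng(3)
          sensor_a = 15.0 + rng.normal(0.0, 2.0, 240)             # hourly readings
          sensor_b = sensor_a - 5.0 + rng.normal(0.0, 0.3, 240)   # 50 ft away

          diff = sensor_a - sensor_b
          print(round(diff.mean(), 1), round(diff.std(), 1))
          # mean ~5.0, std ~0.3: a stable offset, the instruments agree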

      • johnmarshall permalink
        February 8, 2015 12:39 pm

        That is what’s done. I get no wild swings; temperature differences are probably down to wind direction and strength. Also, one sensor has a deliberate input of heat from the house, to try to emulate UHI, but again that varies with the wind vectors. What it does undermine is the claim that temperatures for connected areas can be modelled from one another.
