
NOAA Tampering Exposed

July 20, 2015

By Paul Homewood   


Last month, NOAA caused a lot of controversy by adjusting historic global temperature data to show that the pause had never happened. This has been well covered by WUWT and others, but what is less well known is that NOAA have been making similar but subtle adjustments year by year for a while now.

When they do this, the old versions are never archived, and they do not publicly announce what they have done. Instead, the new figures simply replace the old version.

Fortunately, however, Walter Dnes has been archiving the old data each month since January 2010. His results were published at WUWT last week.

I have used his data to show in a simple fashion what the total effect of these changes since 2009 has been.



First, we can look at the changes that have been made to annual data back to 1880. Figure 1 shows, for instance, that the temperature anomaly for 1938 published currently is 0.13C lower than was published in January 2010. In contrast, the anomaly for recent years has been increased; for instance, 2009 has increased from 0.56C to 0.64C.



Figure 1


The overall effect of these changes has been to cool the 1930s and 40s, and increase warming in recent years.

Note as well, though, the way that the changes have steadily added to temperature anomalies since 1999, thus conveniently removing the pause.


The effect of these recent changes can be seen in more detail in Figure 2. Temperatures have been progressively increased as each year has gone by.



Figure 2

For instance, in their State of the Climate Report for 2010, NOAA showed 2010 and 2005 tying as the warmest years, 0.02C higher than 1998.



Figure 3



However, the latest version shows 2010 as being 0.04C warmer than 2005, and 0.06C warmer than 1998.



Figure 4


But the tampering is even worse than this, as changes were being made prior to Walter Dnes beginning his archiving in 2010. It is difficult to get a handle on the full extent, since NOAA do not archive these things.

We can though see how things changed between 2004 and 2010.

Take another look at the 2010 version in Figure 3. The anomaly for 1998 was 0.60C, 0.06C higher than the figure for 2004.

But when we look at the State of the Climate Report for 2004, we find that 1998 was 0.09C warmer than 2004. In other words, between 2004 and 2010, the temperature for 1998 was reduced relative to 2004 by 0.03C.

Add that to the 0.06C already identified, and by magic 1998 is now demoted to only the fifth warmest year!
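The arithmetic above can be checked in a few lines of Python. This is purely an illustrative sketch using the figures quoted from the two report versions in the text, not an official computation:

```python
# Margin of 1998 over 2004 (degrees C), as published in each report vintage:
# the 2004 State of the Climate Report had 1998 at 0.09C above 2004;
# the 2010 report (Figure 3) had 1998 at 0.60C, only 0.06C above 2004.
margin_in_2004_report = 0.09
margin_in_2010_report = 0.06

# Relative reduction of 1998 between the 2004 and 2010 report vintages.
reduction_2004_to_2010 = margin_in_2004_report - margin_in_2010_report

# Further relative reduction identified from the archives since 2010.
reduction_since_2010 = 0.06

total_relative_reduction = reduction_2004_to_2010 + reduction_since_2010
print(round(reduction_2004_to_2010, 2))    # 0.03
print(round(total_relative_reduction, 2))  # 0.09
```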





It is little wonder that, with tampering on this scale, the NOAA dataset has been diverging so drastically from the satellite numbers since 1998.

  1. July 20, 2015 10:58 am

    Reblogged this on Climate Collections and commented:
    Another insightful look at NOAA data tampering by Paul Homewood.

  2. Mark Hodgson permalink
    July 20, 2015 11:10 am

    I noticed among some of the alarmists’ comments on one of your stories reprinted in the Telegraph, the old chestnut that only “scientists” are qualified to comment on climate because “climate = science”.

    It was stated that you are a retired accountant, and therefore should not be listened to. I don’t know if you are a retired accountant or not, but if you are, I think it makes you eminently qualified to deal with this sort of thing. Aren’t auditors supposed to keep an eye out for fraud when auditing company accounts?

    It seems to me that it is the alarmists who have politicised the whole issue of climate change, and that makes it fair game for intelligent discussion by anyone qualified to look at the various aspects to the debate – inter alia it’s political, financial, statistical, and numerical. The alarmists know that of course, which is why they are so desperate to close down the debate and disqualify anyone but scientists from commenting, and then doing their best to intimidate scientists who are off message.

    Keep up the good work!

  3. Phantomsby permalink
    July 20, 2015 11:15 am


    • Phantomsby permalink
      July 20, 2015 11:17 am

      …would be my suggestion as a catch-all term for strange data tampering.

  4. Green Sand permalink
    July 20, 2015 11:30 am


  5. July 20, 2015 12:36 pm

    Reblogged this on eliquidassets.

  6. Bloke down the pub permalink
    July 20, 2015 1:30 pm

They never seem to identify what was wrong with readings in the 90s that could be put right by adding to the anomaly.

  7. July 20, 2015 1:38 pm

    Coyote Blog has had articles on this pre-2010. I don’t know if they kept the data, but I’m sure someone out there has a set of adjusted NOAA data going back years. Skeptics are likely to keep data sets, being more scientific than the actual keepers of the legend are. Legends are so much easier to rewrite if you don’t archive anything…..

  8. Eliza permalink
    July 20, 2015 1:38 pm
    Be very alert to any outside pressure on DMI to tamper this data to oblivion.
    This is one that’s going to be extremely hard to explain away by the climate cartels so be prepared for extreme pressure to be brought upon the poor ol folks at DMI to alter something to make this go away (ie change baseline, change ice border areas, change calculation method).

    • July 20, 2015 1:50 pm

      Agreed. Everyone should start archiving as much data as possible on temperature, etc from the published data. Anything is probably fair game to change if it salvages the myth and maintains the politics.
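In that spirit, a minimal archiving script is easy to set up. The sketch below is Python; the URL is a placeholder, not a real NOAA endpoint, so substitute whichever dataset file you actually follow. Each download is saved under a date-stamped name, so old versions are never overwritten:

```python
import datetime
import pathlib
import urllib.request

# Placeholder dataset location -- substitute the real file you track.
DATA_URL = "https://example.org/path/to/anomaly_timeseries.csv"
ARCHIVE_DIR = pathlib.Path("noaa_archive")

def archive_name(url: str, when: datetime.date) -> pathlib.Path:
    """Date-stamped path, e.g. noaa_archive/2015-07-20_anomaly_timeseries.csv"""
    return ARCHIVE_DIR / f"{when.isoformat()}_{url.rsplit('/', 1)[-1]}"

def archive_today(url: str = DATA_URL) -> pathlib.Path:
    """Fetch the current file and store it; earlier copies are left in place."""
    ARCHIVE_DIR.mkdir(exist_ok=True)
    dest = archive_name(url, datetime.date.today())
    urllib.request.urlretrieve(url, dest)
    return dest

# archive_today()  # run daily (e.g. from cron) to build up the archive
```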

  9. July 20, 2015 3:17 pm

They have also removed the IPO (PDO). Paul Vaughan (Talkshop) is working on details hidden in the adjustments. They are removing natural variation, in a desperate attempt to make CO2 correlate.
(July 19, 8:48am)

  10. July 20, 2015 3:17 pm

    Thanks, Paul. A very well-researched article.
What is fantastic is the leap NOAA has been performing: with tampering on this scale, they have now launched themselves "out of this world".

  11. July 20, 2015 3:33 pm

    And the conclusion is:

    The old thermometers ran hot and have to be cooled with adjustments, while the modern thermometers run cool and need adjusting upwards.

  12. robinedwards36 permalink
    July 20, 2015 3:53 pm

    Paul, I have on my computers files that were downloaded I think in 2004, but possibly earlier judging by some of the contents, consisting of GHCN data of various types, (Mean temp, Min, Max, Precip, etc) with the appropriate metadata (station identifiers with names, elevation etc etc) which I have used extensively in the past for my researches into climate. They are all in a format specific to the RISC OS operating system – which is what I use because my software runs on it – but I imagine that it could be transmitted by a Zip procedure. Individual file sizes range up to 70M or so. The temperature files are nearer 40M.

    It is possible that I could rake up earlier files too. My interest in climate began I guess in 1992, and Oxford University’s system was used to download some 9MB files of GHCN data long before the days of universal broadband. Also dating from around that time I have (somewhere!) much smaller files that were sent to me – very kindly indeed – by the now famous Phil Jones, way before he got on the bandwagon.

    Let me know if they might be interesting to someone.


    • July 20, 2015 6:05 pm

      Are they basically station data, Robin?

      • robinedwards36 permalink
        July 21, 2015 9:13 am

        They are Station data, Paul. I think that they are in text file format, but without delving in again can’t be sure. The ones I use are certainly text.

  13. A C Osborn permalink
    July 20, 2015 4:30 pm

    Paul, if you check back through your old posts you will find that I posted about NOAA making the mistake of posting both the Baseline and the Actual Temperature for 1997/8 which showed it was whole degrees warmer than it is shown now, not just a few 10ths of a degree.
    You will also note they only show Anomalies now.
    Remember this

  14. July 20, 2015 7:27 pm

The day-to-day variation of GHCN-M adjusted station records may also be of interest. As these are temperatures, not anomalies, it is of course older data that is adjusted, which varies the mean over the anomaly base period and, as a result, the recent anomaly values computed when that mean is subtracted.

    The Menne Williams Pairwise Homogenization procedure throws up some surprising, and rapidly varying, results.

I’ve been putting together a blog post, published already although still under construction, starting with Marseille/Marignane and surrounding stations. I will be adding a further post showing that this result variation is not confined to Marseille, or to France and neighbouring countries, but is widespread.

    • July 21, 2015 1:20 am

      I should probably make clear here that despite the title of the post I am commenting on I am not suggesting tampering, or as in at least one other comment, fraud. Some of the stations I illustrate show an increased trend through Pairwise Homogenization, others show a decreased trend, and the ones I am particularly interested in, where adjustment varies, often frequently and/or rapidly simply “cannot make their minds up” whether the adjusted trend should be increasing or decreasing. At this point I am simply suggesting that it may be desirable to study the performance of the Menne Williams procedure in greater depth.

      I have not as yet proceeded to study the variations of an area-weighted adjusted global mean from day to day, and expect that that step will be very much more demanding on computer time, and so much slower than my analysis so far, the examination of a single past month at each station for an archive of more than 500 daily GHCN-M datasets (archived daily with a few missed dates from April 2014 on, intermittently before that date but covering v3.0.0, v3.1.0, v3.2.0, v3.2.1, v3.2.2 and now v3.3.0). The anomalous day-to-day variations of some station adjustments may simply be reflected as a source of noise in the area-weighted adjusted global mean. I do not know, and I do not believe that the Pairwise Homogenization is designed to introduce a bias, either up or down, in the adjusted global mean.

      I do not question the need for homogenization. I do question the success of a homogenization procedure which leads to so many homogenized station records varying so much even from one day to the next.
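For readers unfamiliar with the approach discussed above: the core idea of pairwise homogenization (Menne and Williams) is to difference a target station's series against each of its neighbours. The shared regional climate signal cancels, so a step change appearing consistently across the target's difference series points to a local break (station move, instrument change) at the target. The following is a toy sketch of that diagnostic on synthetic data, illustrative only; the real procedure is far more elaborate:

```python
import random

def diff_series(target, neighbour):
    """Difference series: shared regional signal cancels, local breaks remain."""
    return [t - n for t, n in zip(target, neighbour)]

def step_at(series, i):
    """Mean shift between the segments before and after index i."""
    before, after = series[:i], series[i:]
    return sum(after) / len(after) - sum(before) / len(before)

# Synthetic example: a regional signal plus station noise; the target has a
# +1.0C step at t=60 (e.g. a station move) that homogenization should detect.
random.seed(0)
regional = [random.gauss(0, 0.5) for _ in range(120)]
neighbours = [[r + random.gauss(0, 0.1) for r in regional] for _ in range(5)]
target = [r + random.gauss(0, 0.1) + (1.0 if t >= 60 else 0.0)
          for t, r in enumerate(regional)]

# The break shows up consistently in every target-minus-neighbour series,
# even though it is invisible against the noisy regional signal itself.
shifts = [step_at(diff_series(target, nb), 60) for nb in neighbours]
print([round(s, 1) for s in shifts])  # each shift is close to 1.0
```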

  15. Lloyd Preston permalink
    July 20, 2015 8:53 pm

    All these climate scientists depend on the public purse – ie they are not employed in private industry. To keep the grants rolling in they need to create a sense of urgency and if that means they tell lies, well, as they see it: they need the salary.

  16. cheshirered permalink
    July 20, 2015 9:32 pm

    Given the money involved in public policies that are in part predicated on this very data, it’s hard not to conclude that this deliberate data adjustment is plain and simple fraud.

  17. July 21, 2015 2:08 am

    Reblogged this on Centinel2012 and commented:
It is worse than this; please take a look at my latest post. Comments and suggestions are welcome, especially if you can show I am wrong!

  18. July 21, 2015 2:45 am


    Keep up the good work! The constant adjustments just make no sense.

  19. July 21, 2015 9:23 am

    Reblogged this on Roald j. Larsen and commented:
The fraudsters are adjusting the data before the swindle meeting in Paris later this year. If there’s no man-made global warming on earth, do not despair: they’ll give you REAL man-made global warming in the datasets. This swindle will only continue as long as people let it.

  20. BLACK PEARL permalink
    July 21, 2015 2:51 pm

    Got this back from MetOffice on the subject
    “Thank you for your email regarding the NOAA global surface temperature dataset. The NOAA data set has had adjustments applied to it which reduce the effect of non-climatic artefacts in the raw data arising from things like the movement of observing stations, instrumentation changes and changes in measurement technique. Non-climatic artefacts show up as changes in temperature in these long station records which would not have occurred if the station had not moved, or the instrumentation had remained the same. This process of reducing the effect of these artefacts in global temperature records is known as homogenisation and is a necessary part of making a climate dataset from historical data.

    Over the years NOAA’s methods have changed and new data have been added both to the end of the data set and, occasionally, through the addition of new stations from data sources that weren’t previously included. As methods change and as data are added to the database, the global temperature record will change. The methods are described appropriately in the peer-reviewed scientific literature and the NOAA data set is in reasonable agreement with the data sets from other organisations around the world, produced using different methodologies from largely the same data.”

    • Mark Hodgson permalink
      July 21, 2015 6:00 pm

      It is always possible that the adjustments to the data are justified and necessary. If they can be explained to me in terms that establish both then I will back off.

However, the reply Black Pearl received from the Met Office, although two substantial paragraphs long, doesn’t explain it at all – it’s just a fob-off. Admittedly it would require a much longer answer by the Met Office to explain clearly and fully why the changes are necessary (and why they always seem to have the effect of cooling the past and warming the present), but the fact that they couldn’t even be bothered to make much effort suggests they don’t have a strong case. Their reply smacks of the same “press office” type reply Paul Homewood received to his specific questions in connection with “that” July record at Heathrow – his questions weren’t answered, and the response was similarly a lazy fob-off.

  21. Against Thelaw permalink
    July 21, 2015 5:35 pm

    Isn’t it against a law to destroy data in this way?

    • July 21, 2015 6:47 pm

      It should be. Photographers employed by the White House are not allowed to delete photos for any reason, AFAIK. That’s actually silly, as photographs are non-essential and there are hundreds of thousands taken each year. But there you have it.

    • July 21, 2015 8:00 pm

Maybe, but no one cares. There was the Rose Law Firm shredding by Hillary and the lost emails by the IRS and Hillary (who allegedly destroyed her server containing government property; learned that at the Rose Law Firm, no doubt). Someone would have to care, and then find an applicable law that has been used in the past three decades or so. It seems unlikely in both cases. Politicians and government entities destroy records all the time with impunity.

  22. Manniac permalink
    July 21, 2015 6:35 pm

    The Future is certain but the Past is always changing.

  23. July 21, 2015 8:17 pm

    The Met Office and Peter O’Neill’s “explanations” of the adjustments just do not wash.

They have no idea how the changes in instrumentation and location can actually be translated into accurate, repeatable and consistent adjusted temperatures. If they did, they would only have to make one adjustment, not numerous adjustments over a period of many years.
In any case, homogenisation using Kriging techniques is almost certainly not appropriate, as they do not have accurate semi-variograms and/or properly validated Kriging coefficients for each and every distinct air mass whose variability characteristics are known and whose temperature is being measured. Just averaging the max and min of 24-hour period measurements is hopeless.
A station could be situated in a cold dry air mass with no cloud cover for 20 hours of a 24-hour period, with the high recorded in a 4-hour period of a warm subtropical maritime air mass. The min/max average would then be way above the actual REAL 24-hour average. As for not archiving and preserving ALL original sample data: that speaks for itself.

In my business (mineral reserve and resource evaluation), if you did this and took money from people on this basis, you would at minimum be banned from practice, and sanctions would probably include compensation of people who lost money as a result of relying on your data and, in some cases, a jail sentence.


  24. July 23, 2015 12:20 pm

    David Rose reports: I can reveal that the US House of Representatives science committee, led by the Texas Republican Lamar Smith, also has doubts. At the end of last month, committee staff sent emails to several experts in Britain, saying Mr Smith ‘is making climate change data within NOAA a priority’. The committee, they added, was seeking outside help to ‘analyse’ NOAA’s claims – apparently, it would seem, because some members do not trust NOAA’s ‘input’ alone.

    A committee aide told me: ‘NOAA released a conclusion it claims is based on scientific analysis. It has provided the Committee with documents to show their methodology and we’re seeking to confirm that their conclusions are accurate.’

  25. ThinkingScientist permalink
    July 23, 2015 3:11 pm

    I have archive copies of GHCN data downloads as follows:

    GHCN1 downloaded June 2007
    GHCN2 downloaded June 2007
    GHCN2 downloaded December 2009
    GHCN2 final ever (which I think is 2011)***
    GHCN3 downloaded August 2013
    GHCN3 downloaded June 2014

    *** This is the absolute final version of GHCN2. I obtained this directly from NOAA after an email exchange requesting the final archive version of GHCN2. They provided it on an FTP link to me and I have the email trail with the provenance in support.

    All of these are the full datasets and include the station data as well as the results files.

  26. ThinkingScientist permalink
    July 23, 2015 3:15 pm

    And the questions of the temperature adjustments that are never answered are:

1. Why do the adjustments exhibit a systematic linear increasing trend over time, the slope of which accounts for just under half of the final temperature increase over the 20th century?
2. What possible physical explanation is there for a systematic bias in the temperature adjustments with time?

    • July 23, 2015 5:19 pm

Therein lies the problem with any attempted explanation of adjustments. When I was in college, if all the adjustments went in the direction of your conclusion, you would be flunked for fudging data. If you were lucky, you could do the experiment or study over. If not, you took the class over. Yet NOAA does this with impunity. This is not science.

      I like the “absolute final version”! At this point, data seems never to be final, which means it’s pretty much worthless.

  27. D.I. permalink
    July 24, 2015 11:32 pm

    Anybody interested in old Climate Data from the World Meteorological Organisation for the period 1961-1990 should explore here-
    (scroll down page)

  28. July 26, 2015 12:13 am

    Reblogged this on Quixotes Last Stand.

  29. Pinardi1 permalink
    December 31, 2016 5:20 am

It was just a week ago I emailed NOAA Snow and Ice questioning the calibration of their new satellite data as being about 500k miles lower than normal. They wanted to know my proof.


  1. Monster Waves, San Fran Quake, Michael | S0 News July 22, 2015 ~
  2. Monster Waves, San Fran Quake, Michael | S0 News July 22, 2015 | TheSurvivalPlaceBlog
  3. Monster Waves, San Fran Quake, Michael | S0 News July 22, 2015 | The Wave Chronicle
  4. Paris World Summit of Conscience, International interfaith gathering #2 | Marcus Ampe's Space

