
How Homogenization Destroys Climate Science

August 15, 2019

By Paul Homewood

This is a must see video from Tony Heller, which shows how UHI is corrupting good data at rural sites:

  1. August 15, 2019 11:15 am

    Reblogged this on Climate-

  2. August 15, 2019 11:19 am

    I would say that “climate science” was already destroyed. It is not science as I was taught it, or as Richard Feynman would define it: “It doesn’t matter how beautiful your theory is, it doesn’t matter how smart you are. If it doesn’t agree with experiment, it’s wrong” and “Science is the belief in the ignorance of experts.” There are plenty more of his fine quotes.

    • manicbeancounter permalink
      August 15, 2019 7:30 pm

      The above quote is from a short 1964 lecture on the scientific method. Feynman then goes on to say:

      “You cannot prove a vague theory wrong. If the guess that you make is poorly expressed and the method you have for computing the consequences is a little vague then ….. you see that the theory is good as it can’t be proved wrong. If the process of computing the consequences is indefinite, then with a little skill any experimental result can be made to look like an expected consequence.”

      Climate science long ago stopped making novel predictions that would distinguish the CAGW hypothesis from random variations. Instead, climatologists actively embrace judging the acceptability of data in terms of prior beliefs, the opposite of the aims of scientific methods. For example, consider an email from Kevin Trenberth to M Mann in 2009 (copied to Karl of NOAA, Schmidt of NASA and Jones of the Hadley Research Centre):

      “The fact is that we can’t account for the lack of warming at the moment and it is a travesty that we can’t. The CERES data published in the August BAMS 09 supplement on 2008 shows there should be even more warming: but the data are surely wrong. Our observing system is inadequate.”

      In Trenberth’s opinion, when the observations contradict the theory, it is the observations that are wrong. Temperature data has been homogenized a number of times, often producing unstable results. When you have a prior belief in what the results should look like, it is not surprising that the overall average comes to resemble the UHI-influenced data more closely than that from higher-quality temperature stations.

  3. It doesn't add up... permalink
    August 15, 2019 11:32 am

    Somebody ought to create a few datasets that only take rural stations into account, with separate datasets for sites affected by UHI. Then we can all agree that humans are responsible for UHI while basing wider climate change estimates on the rural data.

  4. Phoenix44 permalink
    August 15, 2019 11:38 am

    If Climate Science had any confidence in its own claims, it would simply remove data that comes from cities unless it can be shown that the station is well-sited and immune from UHI. The remaining data should prove warming – if it does not, how can you possibly still say the data shows warming?

    This is like a drug company homogenising three separate studies, two of which just had the drug being tested, which showed no effect, and the third having both the test drug and another existing, effective drug. Lo and behold, add them together and the test drug appears to work! But it doesn’t.
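The arithmetic behind the drug-trial analogy can be sketched in a few lines (the numbers here are purely hypothetical, for illustration only):

```python
# Illustrative arithmetic only (hypothetical numbers): pooling two
# no-effect trials of a test drug with one trial that also included
# an existing, effective drug can make the combined average look
# like the test drug works.

# Mean improvement vs. placebo in each trial (hypothetical units)
trial_a = 0.0   # test drug alone: no effect
trial_b = 0.0   # test drug alone: no effect
trial_c = 3.0   # test drug plus an existing effective drug

pooled = (trial_a + trial_b + trial_c) / 3
print(pooled)   # 1.0 -- a spurious "effect" created by mixing studies
```

The pooled average is positive even though the test drug contributed nothing, which is the blending effect the comment is warning about.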

  5. Saighdear permalink
    August 15, 2019 11:49 am

    Caught my attention – this affects me in more ways than just climate affairs. Homogenization: we have it in agriculture and engineering too, where the Society doesn’t want the wee guy to be adventurous and try things out – everything has to be “type approved” etc. for insurance purposes. “Claim culture” and the like stifles development. Pity help Otto, Carnot and Rudolf et al if they were born today.

  6. suka47 permalink
    August 15, 2019 11:49 am

    Hi, this is very interesting and great ammunition to show people. My question is: have you investigated more locations, or does Buenos Aires show the largest discrepancies? It would be great if the NASA data showed numerous similar locations. Thanks Paul



    • Broadlands permalink
      August 15, 2019 7:31 pm

      Charles Darwin went around the world on the HMS Beagle for almost five years. Every day (usually around noon) the temperature of the air and the water was recorded. Very few readings were taken on land. Those few were made in Rio de Janeiro at 9:00 AM and 9:00 PM. I compared his data with readings taken in Rio a few years ago and there is no statistical difference… UHI or not. The same is true for other places the Beagle stopped, but not on land. If Darwin were to make the same voyage today (even with all of the human changes) it is likely the results would be about the same… out on the ocean, away from the land.

  7. August 15, 2019 2:28 pm

    As an amateur homogeniser, I would say that this video is misleading and that its conclusions are probably false. Tony fails to mention that homogenisation is necessary to deal with station moves and equipment changes. It is probably adjustments for those issues that changed flat raw data into a 20th-century warming trend, not “mixing-in” data from a station with undoubted UHI; such mixing-in is not part of homogenisation.

    Adjustments are usually estimated from a short data window (maybe +/- 10 years) centred on the time of the adjustment. If the UHI station data were used in the estimation of the adjustments, then there may be a small error from any change in UHI over the 20-year window, but other station data will be used as well, likely making the error from UHI negligible.
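The windowed adjustment the commenter describes can be sketched as follows. This is a deliberately simplified illustration with hypothetical data, not the actual GHCN/GISS algorithm: it estimates a step change at a known breakpoint as the shift in the mean difference between a target station and the average of its neighbours, across a window centred on the break.

```python
# Minimal sketch of a neighbour-based breakpoint adjustment
# (simplified method, hypothetical data).

def estimate_adjustment(target, neighbours, break_idx, half_window):
    """Estimate the step change in `target` at `break_idx`, relative
    to the mean of the `neighbours` series (all lists of equal length)."""
    # Reference series: mean of the neighbour stations at each time step
    ref = [sum(vals) / len(vals) for vals in zip(*neighbours)]
    # Difference series: target minus reference
    diff = [t - r for t, r in zip(target, ref)]
    before = diff[max(0, break_idx - half_window):break_idx]
    after = diff[break_idx:break_idx + half_window]
    # The jump in the mean difference is the estimated adjustment
    return sum(after) / len(after) - sum(before) / len(before)

# Hypothetical annual means: the target jumps by +0.5 at index 10
# (e.g. a station move), while the neighbours stay flat.
target = [10.0] * 10 + [10.5] * 10
n1 = [10.0] * 20
n2 = [10.2] * 20
adj = estimate_adjustment(target, [n1, n2], break_idx=10, half_window=10)
print(round(adj, 2))  # 0.5
```

Because the estimate depends on the neighbour average, a trend in a neighbouring UHI station over the window would bias the result slightly, which is the small error the comment refers to.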

    NASA GISS undoubtedly makes bad mistakes in places (Iceland and Paraguay spring to mind), but I don’t think that this UHI error is significant.

    • A C Osborn permalink
      August 15, 2019 3:02 pm

      “Tony fails to mention that homogenisation is necessary to deal with station moves and equipment changes”

      No, it is not absolutely necessary: a station move should create a new station, and an equipment change that produces different values should also become a new station.
      New and old equipment should be run side by side for at least 3 years for verification purposes, unless an equipment “failure” was involved.
      When rural v urban values are compared there is quite a bit of difference between the two.
      There is also the official 0.6C addition to the trend from the TOBS and homogenisation adjustments.

      • August 15, 2019 4:10 pm

        See essay ‘When Data Isn’t’ in ebook Blowing Smoke for a fuller discussion with many more global illustrations.

    • JimW permalink
      August 15, 2019 3:05 pm

      I think you are conflating two different effects. NASA-GISS averages or homogenises data over the ‘squares’ it uses to cover the world where info is scarce. So if you have a UHI station in a square with, say, only two other stations covering past years (such as the example of BA), then the square’s data is definitely unduly influenced by the UHI record. This effect can be even more extreme in polar regions, oceans and poorly covered landmasses such as central Africa, Asian Russia etc.
      The errors involved with moves of stations/changing equipment are an additional problem.
      Indeed, if the error bars were correctly assigned to most records it would be impossible to see any meaningful change, as the bars exceed any possible underlying trend movement many times over.
      Of course NASA-GISS etc deny this, but then their rationale for existence and funding is at stake.
      What was a somewhat unfortunate ‘jump’ in the video was suggesting that the red squares over Brazil had anything to do with BA, rather than reflecting the same problem in Brazilian records.

    • August 15, 2019 3:43 pm

      The trouble is that when GHCN homogenise, they don’t specifically make an adjustment for, say, change of location, because they generally have no knowledge of such. Instead they compare trends with nearby stations, which are often urban, and alter accordingly.

    • George L permalink
      August 16, 2019 12:10 am

      NASA and NOAA Climate people.
      Since Hansen and Gore, for a scientist working on climate to get funding, or to keep their job if they are at NASA or NOAA, they have had to accept the “settled” conclusion and work back to the “evidence”. A few brave ones like Curry, Lindzen and Spencer can attest to that. Worldwide, that is where the IPCC and suchlike money goes. Obama sent billions of our money there – $500 million of State Department funds just before he left office – all to prop up what he decreed “settled”.

      As one close observer noted: Money, Money, Money. If they have grad students to support, when they send a proposal up for evaluation, if they parrot the company line about Anthropogenic Global Warming, their proposal gets funded. If they call it out for the nonsense it is, they will likely not get funded. At least, that is how it’s been for most of the last 20 years or so. I know of dozens of top meteorologists who have lost their funding because of this, including Dr. Lindzen.

    • Rob permalink
      August 16, 2019 12:58 pm

      “As an amateur homogeniser I would say that this video is misleading and that its conclusions are probably false.”

      Yet you just saw in the video how much homogenization contaminated the data over a large geographic area. This effect has been documented at numerous other stations as well. How much more proof do you need?

  8. August 15, 2019 2:56 pm

    Add on the whole of Africa: one fifth of the world’s land mass is mostly estimated.

    WMO – “Because the data with respect to in-situ surface air temperature across Africa is sparse, a one year regional assessment for Africa could not be based on any of the three standard global surface air temperature data sets from NOAA-NCDC, NASA-GISS or HadCRUT4. Instead, the combination of the Global Historical Climatology Network and the Climate Anomaly Monitoring System (CAMS GHCN) by NOAA’s Earth System Research Laboratory was used to estimate surface air temperature patterns”

  9. August 15, 2019 6:14 pm

    Here is Anthony Watts’s presentation regarding homogenisation of temperature data in the USA. His survey of 1,000 weather stations found that 92% were not fit for purpose.

  10. August 15, 2019 6:21 pm

    At some level there will be computer software used: 1) to handle the input of data; 2) to analyse the data in the way required; 3) to store and output the data.
    From my 35 years of writing computer software, mostly in research and development for real-time systems, I know there can be many problems in all these tasks.
    If you look at a detailed version history of the specification changes, the testing results against those specifications, plus the software and documentation changes made to meet the specifications, you should (if properly recorded) find a host of changes to fix the problems encountered. Then add in the hardware specifications, the hardware testing results, and the documented changes and fixes. These are normally done in house before site testing.
    The system has to be designed and tested in the environment in which it will be used, so every effort must be made to standardise the environment.
    Inevitably, over time you can end up with different hardware at different sites running different versions of the software.
    The idea of making adjustments to the data, or to the way the data is analysed, to allow for changes in the site environment sounds like a strange approach. The software and hardware should be able to work to specification in their current environment, and be changed if required by changes in the environment.

  11. August 15, 2019 11:35 pm

    Reblogged this on Climate Collections.

  12. I_am_not_a_robot permalink
    August 16, 2019 12:31 am

    Two lighthouses in Vic Aus, near perfect uncontaminated locations, have long temperature records back to 1880 indicating little or no net warming, viz. Wilson’s Promontory and Cape Schanck. By applying its ‘regional expectation’ filter, Berkeley Earth has been able to massage the data, including made-up data, into distinctly positive trends approximating the purported global trend.
    That’s just two examples.

    • George L permalink
      August 16, 2019 1:52 am

      “able to massage the data including made-up data into distinctly positive trends approximating the purported global trend.” I am sure Berkeley Earth does that a lot.
      The first half of the twentieth century was much hotter, with more ice melt. The Medieval Warm Period saw greenery and farming in Greenland, vineyards in England, and sea levels a mile further inland.

  13. I_am_not_a_robot permalink
    August 16, 2019 1:33 am

    Cape Otway gets the same treatment:

  14. tom0mason permalink
    August 16, 2019 7:39 pm

    ChiefIO (E.M. Smith) has reposted my favorite Tony Heller video from 2016.

    Tony Heller presents a detailed analysis of the USHCN data set and how the data are diddled.

  15. Vincent Syracuse permalink
    August 16, 2019 9:25 pm

    Climate change is a fraud and a scam. There is no truth, fact, or any credible scientific evidence to back up any of their wild, outlandish claims. Some of the world’s best scientists and meteorologists all say the same thing. It’s a total fraud and disgrace.

  16. Wellers permalink
    August 17, 2019 4:19 pm

    Today he just released another video about the “Hottest July on Record” claim by NOAA. Well worth watching – I subscribe to his YouTube videos and they’re all excellent, “setting the record straight on climate”. Be sure to click LIKE.

  17. Huston permalink
    August 21, 2019 3:02 pm

    Interesting…. I just went to the GISS website. For Buenos Aires I get the message “Error. I cannot construct a plot from your input. Invalid station id.” The Mercedes and Rocha stations only go back to 1952.
