
Two Decades of Temperature Data from Australia – Not Fit for Purpose

September 2, 2017

By Paul Homewood

An important post from Jennifer Marohasy:


Australia is a large continent in the Southern Hemisphere. The temperatures measured and recorded by the Australian Bureau of Meteorology contribute to the calculation of global averages. These values, of course, suggest catastrophic human-caused global warming.

Two decades ago the Bureau replaced most of the manually-read mercury thermometers in its weather stations with electronic devices that could be read automatically – so since at least 1997 most of the temperature data has been collected by automatic weather stations (AWS).

Before this happened there was extensive testing of the devices – parallel studies at multiple sites to ensure that measurements from the new weather stations tallied with measurements from the old liquid-in-glass thermometers.

There was even a report issued by the World Meteorological Organisation (WMO), entitled ‘Instruments and Observing Methods’ (Report No. 65), which explained that, because the modern electronic probes being installed across Australia reacted more quickly to second-by-second temperature changes, measurements from these devices needed to be averaged over a one- to ten-minute period to provide some measure of comparability with the original thermometers. The same report also stated that the general-purpose operating range of the new Australian electronic weather stations was minus 60 to plus 60 degrees Celsius.

This all seems very sensible, well-documented, and presumably is now Bureau policy.

Except, this winter I have discovered none of this policy is actually being implemented.

Rather than averaging temperatures over one to ten minutes, the Bureau is entering one-second extrema. This would bias the minima downwards and the maxima upwards. Except that the Bureau is also placing limits on how cold a temperature an individual weather station can record, so most of the bias is going to be upwards.
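The effect of the two practices described above can be illustrated with a minimal sketch. All numbers here are made up for illustration: a steady true temperature of minus 10.4 degrees with random sensor noise standing in for the probe's fast response, a hypothetical minus 10 degree floor, and a one-minute averaging window in place of the WMO's one- to ten-minute guidance.

```python
import random
import statistics

random.seed(42)  # reproducible illustration

def second_samples(true_temp, noise_sd=0.3, n=60):
    """Sixty one-second readings around a steady true temperature.
    The noise stands in for the fast response of an electronic probe."""
    return [true_temp + random.gauss(0, noise_sd) for _ in range(n)]

def clamp_floor(value, floor=-10.0):
    """The reported practice: limit how cold a recorded value can be."""
    return max(value, floor)

true_temp = -10.4            # a hypothetical cold winter morning
samples = second_samples(true_temp)

avg = statistics.mean(samples)       # WMO-style averaged reading
raw_min = min(samples)               # instantaneous one-second minimum
recorded_min = clamp_floor(raw_min)  # minimum after the -10 C limit

print(f"averaged: {avg:.2f}  raw 1-s min: {raw_min:.2f}  recorded: {recorded_min:.2f}")
```

The one-second minimum always sits below the minute average, so on its own it would bias minima downwards; but once the floor is applied, the recorded minimum cannot fall below minus 10, and the net bias flips upwards.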

Jen Marohasy at the Goulburn weather station where the Bureau acknowledges it set a limit of minus 10 degrees on how cold a temperature could be recorded this winter, never mind that this AWS recorded minus 10.9 degrees Celsius during a previous winter.

I have known for some time that the Bureau remodels these recordings in the creation of the homogenised Australian Climate Observations Reference Network – Surface Air Temperatures (ACORN-SAT), which is subsequently incorporated into HadCRUT, which in turn is used to inform the United Nations’ Intergovernmental Panel on Climate Change (IPCC). Nevertheless, I naively thought that the ‘raw data’ was mostly good data. But now I am sceptical even of this.

As someone who values data above almost all else – this is a stomach-churning revelation.

Indeed, it could be that the last 20 years of temperature recordings by the Bureau will be found not fit for purpose and will eventually need to be discarded. This would leave a rather large hole in the calculation of global warming – given the size of Australia.

Read the full post here.

17 Comments
  1. September 2, 2017 10:59 am

It seems the warmist community cannot tolerate data that contradicts its unwavering commitment to the cause.

  2. Chris, Leeds permalink
    September 2, 2017 11:15 am

This makes you wonder whether this also applies to the UK and elsewhere. I have always thought it likely that electronic thermometers react more quickly than the old mercury ones, and I did not know about the WMO advice. Makes you think that the recent notorious(?) warm ‘blasts’ recorded in the UK at London Heathrow Airport – which have produced allegedly record high temperatures in recent Junes and Julys – might also be down to this one-second sampling…..

  3. September 2, 2017 1:03 pm

    They’re not very good at data tampering either.
    https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2968352

  4. September 2, 2017 1:44 pm

    And they claim there is not an international conspiracy to manipulate the temperature record as a means of promoting CAGW. Birds of a CAGW feather, bastardize together.

  5. September 2, 2017 1:53 pm

    Is anyone surprised at this revelation?

  6. September 2, 2017 1:57 pm

    I have been trying to find out exactly what temperature sensing equipment is in use at the stations in question. It looks like the BOM may get its temperature sensors from Environdata, who make two air temperature sensor modules: the TA40 Series and the TA60 Series. If these are the ones in use, their specs are at

    TA40-Air-Temperature-Sensor.pdf

    TA60-Air-Temperature-Sensor.pdf

    The TA40 specs indicate calibrated temperature outputs between -20C and +60C. The TA60 spec only claims calibrated outputs between -10C and 60C. The spec sheets don’t give the circuit diagrams, but the specs say the temps are converted to frequencies. It is unclear if a thermocouple output voltage is converted to frequency or if an oscillator frequency is output and the oscillator frequency-temperature relation is used. The reported calibration range suggests the latter may be the case, but that would be surprising given the quality of thermocouples available.

In any event, if the TA40s (or one similar) are in place, there is no excuse, according to the specs, to throw out data below -10C. If, however, the TA60 (or one similar) is in use, they may have to figure out how to incorporate data that is technically “out of spec” for the device. If the number of sites at which this occurs is small, they could send someone out to each station and try to get calibrating data down to -20C or thereabouts by simply using a sodium chloride ice mixture at 1:3 salt to ice as a standard. If they are worried about non-linearity (and they probably should be) then they could get a couple of other measurements with other standard mixes, e.g. ice and ammonium chloride to get -5C.

    Anyone know what equipment is actually in place?

    • September 2, 2017 2:00 pm

Sorry, the additional calibration should be something like dry ice in ethylene glycol (antifreeze) to get -15C. The -5C won’t be much help.

    • September 2, 2017 6:51 pm

      The Environdata thermometers use semiconductor sensors, perhaps something like this: http://www.ti.com/product/LMT70, with a V-F converter, perhaps like this: http://www.ti.com/product/lm231

      • September 2, 2017 7:09 pm

        Edit: sorry, not a separate V-F converter. They use the internal ADC in their microcontroller and generate the transmitted frequency also using the microcontroller: http://environdata.com.au/wp-content/uploads/2016/02/TA40-Air-Temperature-Sensor.pdf
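The signal chain described in that comment can be sketched end to end: a linear analog temperature sensor read by the microcontroller's ADC, with the result re-encoded as an output frequency for transmission. Every constant below is illustrative, not taken from the TA40 documentation or the LMT70 datasheet.

```python
VREF = 3.3      # assumed ADC reference voltage (V)
ADC_BITS = 12   # assumed ADC resolution

# Assumed linear sensor transfer V = V0 + K * T; an LMT70-style
# semiconductor sensor has a negative temperature coefficient.
V0 = 1.375      # illustrative output at 0 C (V)
K = -0.00521    # illustrative volts per degree C

def adc_to_volts(code):
    """Scale a raw ADC code back to the sensor voltage."""
    return code * VREF / (2**ADC_BITS - 1)

def volts_to_celsius(v):
    """Invert the assumed linear sensor transfer function."""
    return (v - V0) / K

def celsius_to_freq(t, base_hz=1000.0, hz_per_deg=10.0):
    """Re-encode temperature as a frequency for transmission."""
    return base_hz + hz_per_deg * t

code = 1800  # example ADC reading
t = volts_to_celsius(adc_to_volts(code))
print(f"{t:.1f} C -> {celsius_to_freq(t):.0f} Hz")  # prints: -14.5 C -> 855 Hz
```

Note there is nothing in a chain like this that stops working at -10C; any cutoff at that point would have to live in the downstream software, not the sensor electronics.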

      • fah permalink
        September 2, 2017 11:36 pm

Thanks. That helps a lot. I could not get the TA40 doc link with /2016/02/ embedded to download. I went to the Environdata site and tried as well, and it does not work from there either. So far I am not seeing anything that should preclude the TA40 data from being used well below -10C. Are we sure the TA40 is the one used and not the TA60?

  7. September 2, 2017 3:29 pm

    Reblogged this on Climate Collections.

  8. MrGrimNasty permalink
    September 2, 2017 8:53 pm

    I have long wondered if the switch to electronic instruments is the main reason for the apparent level shift in the UK’s CET.

  9. Jennifer permalink
    September 3, 2017 1:02 am

    Here are the site specs for Goulburn – http://www.bom.gov.au/clim_data/cdio/metadata/pdf/siteinfo/IDCJMD0040.070330.SiteInfo.pdf

    I have watched the AWS weather stations measure below minus 10 and then this recording not be taken forward into the ADAM dataset. A relevant blog post is here: http://jennifermarohasy.com/2017/07/bureau-still-limiting-cooling-minus-10-degrees/

    • fah permalink
      September 4, 2017 7:14 pm

      Jennifer, looking at the site specs you give for Goulburn, it looks like they replaced the temperature probe there with a WIKA TR40. They make no mention of the Environdata equipment. They say they replaced the previous device which they say is a Rosemount, but no model number is given so it is impossible to tell what that device’s capability was. The TR40s are cable resistance devices which in general have a very wide range of thermal stability. If that is the equipment used for their measurements, there does not appear to be any good reason to truncate data at -10. Perhaps their downstream software is designed for some other equipment? Who knows. It seems reasonable to keep after them for this.
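If the TR40 probe at Goulburn is a standard platinum resistance thermometer (a Pt100, which is an assumption here – the site specs give only the model family), its resistance-to-temperature relation is the well-known Callendar–Van Dusen equation with IEC 60751 coefficients, and it is perfectly well defined far below -10C. A sketch, inverting the equation by bisection over the AWS operating range quoted by the WMO report:

```python
R0 = 100.0       # Pt100 resistance at 0 C (ohms)
A = 3.9083e-3    # IEC 60751 Callendar-Van Dusen coefficients
B = -5.775e-7
C = -4.183e-12   # the cubic term applies only below 0 C

def pt100_resistance(t):
    """Resistance (ohms) of a standard Pt100 at temperature t (C)."""
    if t >= 0:
        return R0 * (1 + A * t + B * t * t)
    return R0 * (1 + A * t + B * t * t + C * (t - 100) * t**3)

def pt100_temperature(r, lo=-60.0, hi=60.0):
    """Invert the (monotonic) resistance curve by bisection
    over the -60 C to +60 C operating range."""
    for _ in range(60):
        mid = (lo + hi) / 2
        if pt100_resistance(mid) < r:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Round-trip check at -15 C, comfortably below the -10 C cutoff.
print(round(pt100_temperature(pt100_resistance(-15.0)), 3))
```

Which supports the point: a resistance element of this type gives a clean, invertible reading down to -60C, so any -10C truncation must be imposed downstream.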

  10. Gerry, England permalink
    September 3, 2017 11:19 am

    I am sure NASA et al will be able to make up some ‘data’ to fill the gaps – they do already for most of Africa. Much easier than measuring and ensures no wrong results.

  11. Dave Vought permalink
    September 3, 2017 12:21 pm

    Keep up the good work Jennifer.
The head of the BOM should have been sacked by now – corrupt conduct and failure to act.
    Over 300 million dollars a year of taxpayers’ money for this mob of fraudsters. They can’t even acknowledge or answer an email sent to them a month ago.

  12. September 5, 2017 5:21 am

    Look, it’s perfectly simple.
    We know it’s warming – all our data says so.
    Therefore, any temperature which is low is clearly wrong, and we don’t want wrong data getting into the system and fueling denial.
    Therefore, we need to either remove these low outliers, or limit them to what we know from our records to be more realistic.
    See? Simple!
    What’s wrong with that?

Comments are closed.