
Why Do We Use Temperature Anomalies?

August 27, 2013

By Paul Homewood




As promised, a quick look at what temperature anomalies are and why we use them.


A temperature anomaly is the difference between the temperature recorded at a particular station in a particular month and the average for that same month over a selected baseline period. These baseline periods are different for each of the four datasets which I present. For instance, HADCRUT use 1961-90 as their baseline. The others are:-


GISS – 1951-80

RSS – 1979-98

UAH – 1981-2010


For this reason, the anomalies for each set cannot be directly compared with the others.
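As a simple illustration (using made-up numbers, not real station data), the calculation is just the observed monthly mean minus the baseline average for the same calendar month:

```python
# Hypothetical June mean temperatures (C) for one station over a
# five-year baseline period -- illustrative values only, not real data.
baseline_junes = [15.2, 14.8, 15.5, 15.1, 14.9]

baseline_mean = sum(baseline_junes) / len(baseline_junes)  # 15.1C

def anomaly(observed_c, baseline_c):
    """Anomaly = observed monthly mean minus the baseline average for that month."""
    return observed_c - baseline_c

# A June averaging 16.1C would be reported as a +1.0C anomaly.
print(round(anomaly(16.1, baseline_mean), 2))  # 1.0
```

This also makes clear why anomalies from datasets with different baselines cannot be compared directly: swap in a different baseline list and the same observed 16.1C produces a different anomaly.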


Why are anomalies used rather than absolute values?

1) To say that the absolute mean temperature for a particular station in a particular month is, say, 17C is pretty meaningless. To be able to say it is 1C more, or less, than a baseline period carries more meaning.

2) Temperature anomalies can be compared on a month-by-month basis, in a way in which absolute numbers cannot. To say that March was warmer than February is meaningless. To say that both months were colder than “normal” tells us something.

3) One big advantage of anomalies is that regional temperature trends can be developed. There is correlation between anomalies within a region, although there may be disagreement over just how far this pertains. In contrast, absolute temperatures can vary significantly over short distances, because of factors such as altitude and geography.

Because of this, temperature trends for, say, a US state can be built up over many decades, even though the individual stations may not have records throughout the period.
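A minimal sketch of that idea (hypothetical anomalies, and a deliberately naive average rather than any dataset’s actual gridding method): stations with gaps in their records can still contribute to a regional figure, because every reported anomaly refers back to the same kind of baseline.

```python
# Hypothetical station anomalies (C) for one month in a region.
# None marks a station with no record that month; because anomalies
# share a common reference, we can average whichever stations reported.
station_anomalies = {
    "station_a": 0.4,
    "station_b": 0.7,
    "station_c": None,  # record missing this month
}

reporting = [v for v in station_anomalies.values() if v is not None]
regional_anomaly = sum(reporting) / len(reporting)
print(round(regional_anomaly, 2))  # 0.55
```

Doing the same thing with absolute temperatures would fail, since dropping a high-altitude (cold) station from one month’s average would shift the regional figure for reasons that have nothing to do with climate.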

4) Who decides which baseline to use? The WMO recommend that the most recent three decades should be used as the “climatic norm”, i.e. 1981-2010. UAH changed their baseline to this a year or so ago.

Others argue that their baseline does not claim to be a norm, and that changing the numbers every decade would be too confusing.

My view is that there should be consistency between all sets, so that direct comparisons can be made, and that the most recent period is most appropriate.


I am probably as guilty as anyone of interchanging the terms “temperature” and “temperature anomalies”! But, bearing in mind that the baselines are fixed, if the global temperature anomaly for, say, 2012 is less than that of 2002, it follows that the absolute temperature is as well.


As for the “global absolute temperature”, there probably is no such thing! A look at GISS’s “The Elusive Absolute Surface Temperature” is worthwhile.

  1. Jeffery permalink
    August 27, 2013 2:37 pm

    Many thanks for the cogent explanation. It is as I had thought. And apparently we’ve been running above the baseline for some time. What that suggests to me is that we are in a warming period, but the level of warming is constant, not accelerating. So when we see a temperature anomaly below the mean for temperature anomalies, we’re seeing that the average temperature is above the baseline for that month, but below the expected amount of warming. In order for a month to be at the baseline, the graph would have to be at zero.

    This does seem useful for the reasons stated, but does also seem to minimize the fact that today’s temperatures are about a half degree C above the UAH 1981-2010 baseline and continue to range around that point.

    We completely agree on the difficulty — if not impossibility or even nonsense — of computing a “global average temperature” with the level of accuracy required to support claims being made about global average temperature!

    Again, many thanks. Love the work you’re doing and I constantly use it to definitively refute many ludicrous claims made by sincere but uninformed colleagues.

    • August 27, 2013 3:23 pm

      I think what the graph does show is that it is warmer now than it was 30 years ago, but that it stopped getting any warmer at least 10 years ago.

  2. August 27, 2013 3:10 pm

    Not so fast…anomalies carry their own sets of problems, and can be very misleading.

    Imagine on a planet two areas of the same size, one near the pole (P), the other at middle latitudes (M). Average temperature at P is -10C. Average temperature at M is +10C.

    Now for whatever reason temperatures change, going to -4C at P and +7C at M. Anomaly at P: +6C. Anomaly at M: -3C. Average anomaly: +1.5C. “Global warming”…or is it? Now P is still very cold, and M has gotten colder. The end result is a cool planet, whatever the anomaly might mislead you into believing.

    Something similar is happening (time-wise) on Earth at the North Pole. If you look at the Danish 80N temperature series for 2013, the positive anomalies clearly outweigh the negative ones. Does that mean the North Pole is getting warm? Not by a long shot…because the positive anomalies occurred when in winter temperatures were incredibly cold (so they went from unbearable to awful), and the negative anomalies are occurring now that temperatures are barely below 0C.

    The North Pole has been abysmally cold all year, whatever the anomalies say. Cue the deep red in the GISS thermometer-free Arctic…

    • August 27, 2013 5:29 pm

      Of course you get the same problem with absolute temperatures.

      The real issue is that temperature and energy are two different things.

      Joe Bastardi describes it well:-

      What is the relationship between cold dry air and warm moist energy as far as the energy budget? Do you understand that a 1 degree drop in temperature at a wet bulb of 80 has far more implication for the energy of the earth than a rise of 20 degrees where it’s 20 below. The same amount of energy that changes the temp from 80 to 81 will change it from 15 to 0. (I may even be underestimating; our friend Jay Schlegal said it was a doubling of the buoyancy for every 20 degrees.)

      See Note_to_Paul_Douglas_by_Joe_Bastardi.pdf

      So to average temperatures (or anomalies) over the whole world, including the poles, is a fairly meaningless exercise.

  3. Brian H permalink
    August 28, 2013 2:11 am

    Here’s a post made by an IPCC reviewer after an SA article dissing Judith Curry:
    ” Iconoclast 05:06 PM 10/23/10

    The proposition that the average temperature of the earth’s surface is warming because of increased emissions of human-produced greenhouse gases cannot be tested by any known scientific procedure.

    It is impossible to position temperature sensors randomly over the earth’s surface (including the 71% of ocean, and all the deserts, forests, and icecaps) and maintain it in constant condition long enough to tell if any average is increasing. Even if this were done the difference between the temperature during day and night is so great that no rational average can be derived.

    Measurements at weather stations are quite unsuitable since they are not positioned representatively and they only measure maximum and minimum once a day, from which no average can be derived. They also constantly change in number, location and surroundings. Recent studies show that most of the current stations are unable to measure temperature to better than a degree or two.

    The assumptions of climate models are absurd. They assume the earth is flat, that the sun shines with equal intensity day and night, and the earth is in equilibrium, with the energy received equal to that emitted.

    Half of the time there is no sun, where the temperature regime is quite different from the day.

    No part of the earth ever is in energy equilibrium, neither is there any evidence of an overall “balance”.

    It is unsurprising that such models are incapable of predicting any future climate behaviour, even if this could be measured satisfactorily.

    There are no representative measurements of the concentration of atmospheric carbon dioxide over any land surface, where “greenhouse warming” is supposed to happen.

    After twenty years of study, and as expert reviewer to the IPCC from the very beginning, I can only conclude that the whole affair is a gigantic fraud.”

  4. 4TimesAYear permalink
    March 4, 2014 12:16 am

    An anomaly in the weather is an extreme departure from the normal, right? Isn’t that what they tell us we can’t use?

  5. November 19, 2015 11:11 pm

    I’m sorry, but this reads like utter nonsense. Assuming that a baseline is “normal” deliberately introduces bias into your analysis. Referencing an arbitrarily chosen baseline value does not give you any more data than you already had. Charting average departures from an arbitrarily chosen baseline, for limited stations, doesn’t mean anything to me. It certainly doesn’t mean you can legitimately extend data analysis to locations or areas where you don’t have data measured.
    There is no “normal” temperature, anywhere. Temperatures change, location to location, minute to minute, hour to hour, day to day, month to month . . . . ad infinitum. An “average temperature” might be measured averaging the high and low for the day, or by averaging 24 measurements made on the hour, or 1440 measurements, etc., at a station, or at 1,000,000 stations. Averaging anomalies in a region, if you only have two stations, doesn’t give you a more accurate picture than “averaging” absolute temperatures.
    If you want to emphasize that temperatures change, then by all means focus your discussion on “anomalies,” like something bad is happening, and like temperatures “should be” different – i.e., non-anomalous. Oh my, are we supposed to be scared since we are experiencing temperature anomalies?? Is there really an appropriate temperature?
    This is gibberish.

  6. Brian Donovan permalink
    November 27, 2016 1:00 am

    Anomalies are just a way to make errors. You already listed a change in the baseline. Who maintains that baseline? No, it’s far better, for a long-term subject, to use the actual data in publications and databases, and let folks calculate the anomalies when they need to. Otherwise you don’t know if the baseline has changed or introduced errors, or how it was calculated. Just use temperature, averaged in time and space if you want. No hidden, possibly changed biases. Now people can check calculations.

    Worse, the use of anomalies for predicting temperatures has totally infected the commercial public weather sites, so now I can’t get the actual predicted temperatures for my area. It means nothing to me that it’s +3 or -3 degrees; what matters is whether it’s below freezing, by how much, and what steps I have to take for my home and business to be ready for that.

    This is a mistake and could cost millions of dollars in damage from errors in planning. We need to stop this.


