UAH And RSS Trends
By Paul Homewood
I had not spotted it before, but NOAA keep a database of satellite temperatures and provide a graphing tool which runs both RSS and UAH off the same 1981-2010 baseline (RSS normally uses 1979-98 to calculate its anomalies).
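Putting two anomaly series on a common baseline is straightforward: you subtract the mean over the new reference period from every value. Below is a minimal Python sketch of the idea; the function name, the column layout and the commented-out usage are my own illustrative assumptions, not NOAA's or RSS's actual processing code.

import pandas as pd

def rebaseline(anoms: pd.Series, new_start: int, new_end: int) -> pd.Series:
    """Shift a monthly anomaly series onto a new baseline period.

    anoms: Series with a monthly DatetimeIndex, expressed as anomalies
           from the dataset's native baseline (e.g. 1979-98 for RSS).
    """
    ref = anoms[(anoms.index.year >= new_start) & (anoms.index.year <= new_end)]
    # Subtracting the mean over the new reference period re-expresses
    # every value as a departure from that period instead.
    return anoms - ref.mean()

# Hypothetical usage: put RSS (1979-98 baseline) onto 1981-2010,
# the baseline the NOAA graphing tool uses for both series.
# rss_on_1981_2010 = rebaseline(rss_anoms, 1981, 2010)

Re-baselining only shifts the whole series up or down by a constant; it does not change the trends, which is why the two baselines can be compared on one chart at all.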
The Jan – July graph makes interesting reading:
http://www.ncdc.noaa.gov/temp-and-precip/msu/time-series
It not only reiterates that this year is nowhere near being a record, but also emphasises how closely RSS and UAH track each other.
It is generally accepted that climate models project faster warming in the troposphere than at the surface.
It is also known that satellite temperatures spiked to higher levels during the El Niño events of 1998 and 2010. Although they are likely to go higher before the end of the year, as they lag behind ENSO changes, the satellite temperatures should already be showing the effect of the El Niño conditions that have been in place for more than a year now.
http://www.esrl.noaa.gov/psd/enso/mei/
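That lag can be checked with a simple lagged-correlation comparison between an ENSO index such as the MEI and the lower-troposphere anomalies. The sketch below assumes both series have already been loaded as monthly pandas Series; the variable names and the 12-month lag range are illustrative choices, not anything prescribed by NOAA or the satellite groups.

import pandas as pd

def lagged_correlations(enso: pd.Series, tlt: pd.Series, max_lag: int = 12) -> pd.Series:
    """Correlate an ENSO index (e.g. MEI) with lower-troposphere anomalies
    at lags of 0..max_lag months (positive lag = temperature responds later)."""
    results = {}
    for lag in range(max_lag + 1):
        # Shift the temperature series back by `lag` months so that
        # ENSO at month t is paired with temperature at month t + lag.
        aligned = pd.concat({"enso": enso, "tlt": tlt.shift(-lag)}, axis=1).dropna()
        results[lag] = aligned["enso"].corr(aligned["tlt"])
    return pd.Series(results, name="correlation")

# The lag with the strongest correlation gives a rough estimate of how many
# months the satellite temperatures trail ENSO conditions.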
Finally, let’s remind ourselves of just how patchy the surface data really is.
http://www.ncdc.noaa.gov/sotc/service/global/map-land-sfc-mntp/201507.gif
I never like it too much when two different ways of measuring something track each other too closely, because it tells me that they aren’t really independent. Though I must admit I don’t like too much discrepancy either, as that suggests they aren’t actually measuring the same thing at all.
They are not independent. They both process (slightly differently) the raw data from the same satellite instruments. There are not two sets of satellites.
Thanks, Paul..
Homogenized temperatures are not only adjusted to suit the needs of “man-made global warming”.
They also have very small global coverage.
The match with RSS got much better after UAH’s latest update to its calculation software.
Yes. You can read about the changes at Spencer’s blog; the formal paper is in process. Basically, UAH was using a shortcut method to solve the ‘aperture’ problem (the Earth’s curvature as seen from space needs a correction). They went back and did it the hard but precise way in V6.0. Part of the motivation was that it also gives UAH better vertical resolution, and guess what: the modelled tropical upper-troposphere hotspot is for sure missing.
The nicest thing about UAH V6 is that they use a US West Coast (Southern California to northern Alaska) series of observational radiosonde temperatures to validate the new algorithms. Checks out beautifully for that latitude transect.
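For anyone wanting to run a similar sanity check of their own, a comparison like that usually boils down to a few agreement metrics between co-located monthly series. The sketch below is a generic illustration under that assumption, not the actual UAH validation procedure; the function and variable names are hypothetical.

import numpy as np
import pandas as pd

def compare_series(satellite: pd.Series, radiosonde: pd.Series) -> dict:
    """Basic agreement metrics between two co-located monthly temperature series."""
    both = pd.concat({"sat": satellite, "sonde": radiosonde}, axis=1).dropna()
    diff = both["sat"] - both["sonde"]
    return {
        "mean_bias": float(diff.mean()),                    # systematic offset
        "rmse": float(np.sqrt((diff ** 2).mean())),         # typical size of disagreement
        "correlation": float(both["sat"].corr(both["sonde"])),  # do they move together?
    }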
“series of observational radiosonde temperatures to validate the new algorithms. Checks out beautifully for that latitude transect.”
Validate.. ?????
Geez, don’t mention that word to the GISS mob..
They will have an apoplectic fit !!!
“It not only reiterates that this year is nowhere near being a record…”
Did you notice that the graph only goes up to 2014, and doesn’t include anything from “this year”?
Good point!
For some reason, NOAA have not updated the July numbers for this year.
Here are the June YTD ones.
http://www.ncdc.noaa.gov/temp-and-precip/msu/time-series/global/lt/jun/ytd
LOL, I saw this on Andrew Bolt’s blog the other day..
I invite people to compare it to the one Paul has just posted! 🙂
The word FRAUD comes to mind!!!
PS, here’s the actual thread:
http://blogs.news.com.au/heraldsun/andrewbolt/index.php/heraldsun/P30/
Sorry, picked up the wrong URL!!
http://blogs.news.com.au/heraldsun/andrewbolt/index.php/heraldsun/comments/how_did_our_cool_weather_get_logged_as_some_of_the_hottest_ever/
Reblogged this on Climate Collections.