Climate Fast Food

“How would you like your climate trends, sir? Homogenised, with the data treated the same the world over and no regard for local conditions? Or carefully prepared, using the best raw data from reliable sources with a long operational history and local knowledge?”

I know which I’d prefer, but we don’t get a choice. Just like the Western diet, climate prediction has degenerated into a morass of highly processed data that bears very little relation to its original form and is likely to be just as bad for our health. The Hockey Stick is a global brand designed to beat us into submission. It is instantly recognisable, but few seem to care about how it is produced or question the quality of the data and methods behind this icon or the climate predictions that fuel our daily diet of media-driven climate guilt. Did I say icon? Aye, Con. Well, no longer.

It is a really complicated thing, trying to measure the world’s average temperature. In fact the phrase “fools rush in …” comes to mind. But NASA/GISS have done it, and rather than saying “we’ve had a go – here is a first attempt, although it is a bit rough and ready”, it has become a fait accompli. The science is settled. We are being asked to commit billions to changing our society and reducing greenhouse gas emissions. Now, don’t you think it would be a good idea to be sure the whole thing is sound and that the data it is founded upon is the best we can get? That would seem reasonable.

The trouble is that the program used to produce that world temperature, GIStemp, is a bit of a monster. GIStemp is huge, badly written and basically impenetrable for all but the most dedicated, who need to be prepared to invest a lot of time in understanding what it does and how it does it. Oh, and did I mention that a reasonable familiarity with a computer programming language helps too? [This: http://chiefio.wordpress.com/2009/08/09/will-the-good-ghcn-stations-please-stand-up/ is a good place to start] Did I say it was a monster? Well, in programming terms it is a dinosaur, but that is another story.

Why GIStemp should really be MIStemp.

Anyone who starts digging around in GIStemp data quickly finds a lot that is far from perfect. In fact you start to shake your head in incredulity. The GISS site helpfully provides a clickable map through which you can search for graphs and data from temperature stations all round the world. This is where I started, and I quickly became hooked.

You see, I started to understand this ‘homogenisation’ thing and to see how flawed it is. I’d read about it (on Climate Audit, probably). It is supposed to correct for artificial warming, or Urban Heat Island (UHI) effects. Stations deemed ‘Urban’ that show a warming trend are compared with rural stations and altered to reduce the warming. This is done by warming the older data, which should be OK, since the ultimate aim is to look at the rate and ‘amount’ of warming, not the absolute temperature. Rural stations are not adjusted (supposedly), but are used to quantify the adjustments. Many of the rural stations are airports. No heat sources there, then? All that lovely wide open acreage of asphalt absorbing the sun’s rays!
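If it helps to see the idea in code, here is a toy sketch of that kind of adjustment – my own illustration in Python, not anything lifted from GIStemp itself: tilt the urban record until its trend matches the rural neighbours, pivoting on the most recent year so that it is the older data that get warmed.

    import numpy as np

    def trend(years, temps):
        """Least-squares slope of an annual-mean series, in C per year."""
        return np.polyfit(years, temps, 1)[0]

    def toy_uhi_adjust(years, urban, rural_reference):
        """Toy homogenisation: remove the urban station's excess trend
        relative to the rural reference by adding a ramp that is zero
        in the most recent year and grows going back in time."""
        excess = trend(years, urban) - trend(years, rural_reference)
        ramp = (years[-1] - years) * excess   # warms the older data
        return urban + ramp

Run the same arithmetic on a station that is cooling while its rural reference warms and the ramp flips sign: the older data get cooled and a warming trend appears – which is exactly what I keep finding below.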

[Graph of GIStemp data for Tokyo – the example used by GISS to show how homogenisation corrects UHI.]

I was shocked by what I found. I kept finding sites where there is a clear warming trend but, instead of reducing the warming, homogenisation actually increased it, and sites in which cooling was changed to warming (I kid you not!). There were examples of this everywhere I looked. These were minor changes – surely they couldn’t make a big difference overall? The more I looked, the more examples I found. I am still working on it. There are examples on every continent, often making up a sizeable percentage of the adjustments at a station. Here are some of my favourites.
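But first, for anyone who wants to repeat the exercise, the check itself is simple once you have a station’s raw and homogenised annual means side by side (however you pull them off the GISS site – the function and station handling below are just my own illustration):

    import numpy as np

    def trend_per_century(years, temps):
        """Least-squares slope of the annual means, in C per 100 years."""
        return np.polyfit(years, temps, 1)[0] * 100.0

    def flag_station(name, years, raw, homogenised):
        """Flag the two cases discussed here: homogenisation that increases
        an existing warming trend, or that turns cooling into warming."""
        t_raw = trend_per_century(years, raw)
        t_hom = trend_per_century(years, homogenised)
        if t_raw > 0 and t_hom > t_raw:
            print(f"{name}: warming increased, {t_raw:.2f} -> {t_hom:.2f} C/100yr")
        elif t_raw < 0 and t_hom > 0:
            print(f"{name}: cooling turned to warming, {t_raw:.2f} -> {t_hom:.2f} C/100yr")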

The GIStemp Hall of Shame.

I start with Oslo, Norway, where Al Gore was lauded for his promotion of Global Warming. Well guess what – Oslo isn’t (warming). Look first at a rural station just outside the city: Gardermoen, Oslo Airport. It seems to be warming, until you do a bit of digging.

Gardermoen became a military airfield in 1920 but, after WWII bombing, was upgraded with two separate runways, eventually handling intercontinental flights and then civilian charter flights in 1972. After another major upgrade it became Oslo’s main airport in 1998, serving more than 19 million passengers in 2008. Gardermoen’s temperature record shows an overall warming (ΔT) of 3.73°C/100 years, but there is a strong cooling trend until the late 1980s, around the time the construction work for the airport upgrade was underway. After that there was a temperature jump of 1.5°C and a strong warming trend. Now look at another record for Oslo – Blindern.

Blindern is in the city – at the University of Oslo. You might expect it to be an Urban Heat Island, and maybe it is, but it shows cooling over a century of (meticulous) records. What does GIStemp do? Compares it with rural data – you guessed it – BINGO – warming trend introduced. This is not the only whopper in Norway; I’ve only had time to look at a few. Bodø, a coastal town above the Arctic Circle, has a warming trend that is increased by homogenisation from 0.77 to 0.93°C/100 years. The trend in Trondheim changes from cooling of 0.16°C/100 years to warming of 0.72°C/100 years after homogenisation. I don’t doubt there are places that are naturally warming, but GIStemp introduces too much warming in too many places for my liking.

How the hell did the program come up with this mixed-up set of numbers in the Bahamas? Convenient little program, isn’t it? Instant warming.

How can we increase the warming? Oh, let a little bit more adjustment creep in. How about homogenising some of that rural data? Arcadia, Florida is listed as a rural area; it looks that way on Google too. So how come it gets the treatment?

In Aswan, Egypt, the station was moved to its present position in 1957, presumably at the commencement of construction of the dam. In fact GISS splices five temperature records together to get this set. What about all that talk of quality control? Should these data sets really be spliced together? Isn’t that really ‘no warming’ followed by ‘relocation to a cooler site’, which then gradually warms as construction grows around it?
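Here is a toy demonstration of why splicing worries me – invented numbers, nothing to do with the real Aswan series: join two perfectly flat records separated by a move to a cooler site and the fitted trend of the combined record reflects the move, not the climate at either location.

    import numpy as np

    # Two flat segments: the old site, then a cooler site after a move.
    years_old = np.arange(1900, 1957)
    years_new = np.arange(1957, 2009)
    temps_old = np.full(years_old.size, 26.0)   # no trend at the old site
    temps_new = np.full(years_new.size, 24.5)   # no trend at the new, cooler site

    years = np.concatenate([years_old, years_new])
    spliced = np.concatenate([temps_old, temps_new])

    slope = np.polyfit(years, spliced, 1)[0] * 100.0
    print(f"Trend of the spliced series: {slope:.2f} C/100 years")  # non-zero, purely from the join

And any local warm-up around the new site after the move turns up in the spliced record looking like a climate signal.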

Bombay – conveniently doubles the warming rate; Adelaide Airport changes from cooling of 0.69°C/100 years to warming of 0.23°C/100 years.

OK, this is my cherry pie. I stumbled on some really juicy ones, but I promise you there are plenty to find; you don’t have to look far. What I want to know is how these fit into the rest of the data. Is there enough unadjusted, or correctly UHI-corrected/homogenised, data that the maladjusted records won’t make a difference, or do they actually, as I think they do, increase the warming trend? And then what if we took all those small towns that have grown into cities all over the world and did a more accurate estimation of UHI? Would we still have the warming that is claimed – the unprecedented 0.6°C per century?
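Once enough stations have been checked, the back-of-the-envelope version of that question is just the average trend with and without the suspect records. Everything below is invented data and a naive unweighted mean – GIStemp’s real gridding and area weighting are far more involved – but it shows the shape of the test.

    # Hypothetical per-station trends in C/100 years, flagged True where the
    # homogenisation looks wrong (all names and numbers invented).
    trends = {
        "Station A": (0.95, True),
        "Station B": (0.70, True),
        "Station C": (0.10, False),
        "Station D": (0.20, False),
    }

    def mean_trend(station_trends, include_suspect=True):
        values = [t for t, suspect in station_trends.values()
                  if include_suspect or not suspect]
        return sum(values) / len(values)

    print("All stations:      ", round(mean_trend(trends), 2))
    print("Suspects excluded: ", round(mean_trend(trends, include_suspect=False), 2))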

So in the meantime, here’s my global warning. Climate temperature trends should be taken with a very large pinch of salt. If climate change is weather records homogenised, I’d rather have the raw data, as I’m finding the processed kind rather hard to digest.

[Update 30th Nov. 2009. Link to an excellent webpage by Lucy Skywalker on GIStemp records and UHI correction in the UK]

[Update 8th Dec. 2009. Link to Willis Eschenbach’s detailed post at WUWT on adjustments made to the Darwin record in Australia. Oh, and I know Steve McIntyre has detailed work on UHI, here at Climate Audit]


One Response to Climate Fast Food

  1. tonyb says:

    Much of climate science is experimental.

    Michael Mann created the hockey stick a couple of years after getting his PhD. He had probably intended to fiddle around with the calculations for a decade, quietly fine-tuning it, but it got catapulted to international stardom and he had to defend it.

    Sea level rises are based on three Northern Hemisphere tide gauges from which data has been heavily manufactured. It is interesting but experimental.

    As for Hadley and GISS… Hansen’s 1987 paper was interesting but failed to grasp the basic point that if you measure from the bottom of a climate valley you shouldn’t be surprised when temperatures start to climb towards the next summit. A well-reasoned paper, but experimental.

    Ice cores – highly theoretical – they may be right (and 100,000 CO2 readings by scientists back to 1830 wrong), but they were seized on as the truth whilst the science was still in its infancy.

    Climate science is unique in thinking that it has found the answers first time, instead of stumbling towards the truth over hundreds of years as other branches of science do.

    We know far less about the climate than we believe we do and much of what we do know is by no means settled and certain.

    Tonyb
