The only way is up

Charles Duncan has sent me a simple analysis comparing the unadjusted and adjusted versions of the data in GHCN V3.

The following graphs were created in an Excel spreadsheet as anomalies for each station, referenced to 1940, using a First Differences Method.  The annual anomaly for each station is compared between the unadjusted and adjusted versions and the differences summed without spatial weighting.  Only one distinction is made: that between Rural/Suburban/Urban stations.
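For anyone who wants to see the arithmetic, here is a minimal sketch (in Python rather than Excel) of the kind of first-differences comparison described above; the dictionary layout and function names are illustrative, not Charles's actual spreadsheet, and only the 1940 reference year comes from the post.

```python
# Minimal sketch of a First Differences combination, assuming each dataset
# is a dict of {station_id: {year: annual_mean_temp}}. Layout and names are
# illustrative; only the 1940 reference year comes from the post.

def first_differences(annual_means):
    """Year-to-year differences for one station, kept only where both
    consecutive years are present."""
    return {y: annual_means[y] - annual_means[y - 1]
            for y in sorted(annual_means) if (y - 1) in annual_means}

def combined_series(stations, ref_year=1940):
    """Average the first differences across stations (no spatial weighting),
    cumulatively sum them, and reference the result to ref_year."""
    diffs = {}
    for annual_means in stations.values():
        for year, d in first_differences(annual_means).items():
            diffs.setdefault(year, []).append(d)
    mean_diff = {y: sum(v) / len(v) for y, v in diffs.items()}

    series, total = {}, 0.0
    for year in sorted(mean_diff):
        total += mean_diff[year]
        series[year] = total

    offset = series.get(ref_year, 0.0)   # express as anomalies relative to 1940
    return {y: t - offset for y, t in series.items()}

# The curves plotted above would then be built per station class, e.g.
# effect = {y: adjusted[y] - unadjusted[y] for y in adjusted if y in unadjusted}
```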

Yes, that’s right: from 1940 onwards the adjustments, on average, increasingly warm the data. Here are the differences:

That is a lot of warming, especially in the Suburban stations (although they have little effect overall).  In fact the only cooling adjustment is for rural stations between 1940 and 1970, and the result of that is to increase the slope of the rural post-1970 graph.

Are you shocked? I was. Without spatial weighting this overall adjustment averages 0.24°C per century (trend of black line above).  Then I remembered I had seen something about this previously – from the slides of the NCDC presentation at the Exeter Workshop in September 2010.
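For what it’s worth, the per-century figure is nothing more exotic than a least-squares slope scaled up by 100; a minimal sketch, with the series layout assumed as in the snippet above:

```python
# Hedged sketch: the per-century number is just the linear trend of the
# adjusted-minus-unadjusted series, scaled from per-year to per-century.
# Requires Python 3.10+ for statistics.linear_regression.
from statistics import linear_regression

def trend_per_century(effect):
    """effect: dict of {year: adjusted_minus_unadjusted_anomaly}."""
    years = sorted(effect)
    slope, _intercept = linear_regression(years, [effect[y] for y in years])
    return slope * 100.0
```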

Slide 12 from Matt Menne's presentation

Let me just say I understand the need for adjustments. In the past station data may have been recorded at a different time of day (TOBS – Time of Observation – adjustment); stations move (SHAP – Station History Adjustment Program); and the equipment used changes. Hang on, isn’t there something about ‘adjustments may go down as well as up’?  Oh wait – that’s ‘investments’ 😉

This data also goes through Quality Assurance and Homogeneity Adjustment, which essentially looks for changes in stations relative to those around them.

In this approach, comparisons are made between numerous combinations of temperature series in a region to identify cases in which there is an abrupt shift in one station series relative to many others. The algorithm starts by forming a large number of pairwise difference series between serial monthly temperature values from a region. Each difference series is then statistically evaluated for abrupt shifts, and the station series responsible for a particular break is identified in an automated and reproducible way. After all of the shifts that are detectable by the algorithm are attributed to the appropriate station within the network, an adjustment is made for each target shift. Adjustments are determined by estimating the magnitude of change in pairwise difference series formed between the target series and highly correlated neighboring series that have no apparent shifts at the same time as the target.
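To make the idea concrete, here is a much-simplified sketch of the pairwise notion: form a difference series between a target and a neighbour, then look for the largest mean shift. This is emphatically not NCDC’s code; the crude two-segment-means test below stands in for the proper statistical break tests they use.

```python
# Illustrative only: pairwise difference series plus a naive break finder.

def difference_series(target, neighbour):
    """Element-wise differences between two aligned temperature series."""
    return [t - n for t, n in zip(target, neighbour)]

def largest_shift(diff):
    """Return (index, magnitude) of the split that maximises the separation
    between the mean before and after it - a crude stand-in for the break
    tests used in the real pairwise homogenisation algorithm."""
    best_k, best_mag = None, 0.0
    for k in range(2, len(diff) - 2):
        before = sum(diff[:k]) / k
        after = sum(diff[k:]) / (len(diff) - k)
        if abs(after - before) > abs(best_mag):
            best_k, best_mag = k, after - before
    return best_k, best_mag

# In the full algorithm, a shift that shows up in many pairs sharing one
# station is attributed to that station, and the adjustment is estimated from
# well-correlated neighbours showing no break at the same date.
```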

You would imagine that step changes due to site moves or new equipment are the easiest to detect, but gradual change due to land use alterations might be harder to pick up.  Growth of urban areas can also be picked up, although I imagine this might also be harder if gradual change is happening throughout a region. Here’s Slide 15 from Menne’s Exeter presentation:

NCDC’s methods seem thorough, so why am I still bothered by the fact that the adjustments seem only to amplify the warming, and by so much?  It’s just that I’ve seen even greater thoroughness, and we should not underestimate the effort it takes to get this right. It is not so much the adjustments – corrections for documented and undocumented station moves – that bother me, as the unknown quantity of unidentified, gradual change that may be spread around by homogenisation.

NCDC’s method was previously used just in the US (for USHCN) and in GHCN V3 is now being applied to the rest of the world.  Yet when these methods are compared to those used by a professional company (WeatherSource) for US data (Utah), further changes and differences are picked up.  As previously featured on Watts Up With That here, Mark Gibbas’ report An Investigation of Temperature Trends from weather station observations representing various locations across Utah is very detailed, picking up differences that both warm and cool the data. The company’s method, as described in the report, is also based on pair-wise adjustments, but uses daily figures and seasonal data, and their ‘clean data’ shows significant shifts compared to USHCN data. In some cases this increases trends, in others it reduces them. Of course, knowing about changes that affect the temperature is one thing; being able to correct for them accurately is a whole other kettle of fish.

Perhaps future generations will look back and laugh that we even tried to detect a warming signature in the global surface temperatures against a background of change in measurement environs that were so poorly documented and understood at the time.


7 Responses to The only way is up

  1. Tom Harley says:

    Ken’s Kingdom spent a lot of time and effort last year in an extensive audit of the BOM ‘High Quality’ data right across Australia, and found similar adjustments to the raw data, resulting in a significant percentage increase right across the continent, urban and rural.

  2. Doug Proctor says:

    The adjustments look odd to me not just for their increasing change over time, but for the lack of STEP functions. TOBS should be a step-function change, unless the boys got up at 6:00 am in 1910, 6:07 in 1912, etc. Weather station data was gathered under very strict rules: a new rule would have to be posted to everyone at the same time. Hence the step-function. Instrumental changes would also show up as step-functions: you don’t use half a new thermometer for the first year. And new instrumentation would sweep through a department over a short period of time, as funds are allocated for replacements rather than fixes.

    The UHIE changes should show up as incremental with population growth but not random, though you would expect a number of places in the world with declining or stabilized populations (Detroit? Dayton, Ohio? Russia?) to have reversed UHIE. SOME places must have reversed UHIE.

    The only RANDOM change I see is the SHAP correction, and it sure as heck won’t be incremental. The first year’s data isn’t collected half-way down the road. So, another step-function.

    The only situation of random and incremental I can perceive is actual global warming or cooling with “random” regional differences through time as well as location. Perhaps the data has been massaged too much.

    Gee, another indication that there is too much messing around?

    Do we have curves reflecting ONLY TOBS, SHAP, UHIE and instrumental changes? By sea, land and USA only?

  3. Doug Proctor says:

    The fundamental concern I see is that there is a difference in temperatures (as seen by the reported anomalies, i.e. deviation from mean temperatures) between the different types of stations. Should not the anomalies, not just the rise-rate, be the same at least regionally?

    How is this split justified by GISS/Hansen/the IPCC? Or do they not recognize it?

  4. boballab says:

    Verity:

    There is no unadjusted data in the GHCN v3 datasets; they might call it “Unadjusted” but it is not. Here is the deal: back in the GHCN v2 datasets the “Unadjusted” dataset wasn’t completely the raw temps from the stations, but it had only rudimentary changes done to it via “quality control”. You also got to see how many different thermometers and locations made up one “station”, going by the Station ID number. You even saw this in the fully adjusted set of GHCN v2.

    Now here is where they really put things into the black box in the new dataset. Whereas in GHCN v2 the combining of thermometers was part of the adjustment procedure, in GHCN v3 it is now part of the “quality control” procedures. So they did all the adjustments needed to stitch those multiple thermometers together and then released that dataset as the GHCN v3 “Quality Controlled Unadjusted” (QCU) dataset. Here is a simple way to check what I mean. Download the latest copy of the GHCN v2 unadjusted and the GHCN v3 QCU datasets. Now look up the London/Gatwick station. In the GHCN v2 unadjusted/raw dataset you will find two listed: 651037760000 and 651037760001. Number 651037760000 goes from the year 1961 to 1991 and number 651037760001 goes from 1987 to 1998. In the GHCN v3 QCU dataset you will find only one, 65103776000, which goes from 1961 to 1998. Notice that they combined those two thermometers, chopped off the last digit that signifies different locations/thermometers, and they call it “Unadjusted”.
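    A rough sketch of that check, assuming the published GHCN v2 “v2.mean” fixed-width layout (a 12-character ID, i.e. an 11-digit station plus a 1-digit duplicate flag, followed by the year); the file name and helper function are illustrative:

    ```python
    # Group GHCN v2 records by their 11-digit base ID to see how many
    # duplicate thermometers/locations each "station" contains.
    from collections import defaultdict

    def v2_duplicates(path):
        """Map base station ID -> duplicate digit -> list of years present."""
        dups = defaultdict(lambda: defaultdict(list))
        with open(path) as f:
            for line in f:
                station_id, year = line[:12], int(line[12:16])
                dups[station_id[:11]][station_id[11]].append(year)
        return dups

    # e.g. v2_duplicates("v2.mean")["65103776000"] should show duplicate "0"
    # spanning 1961-1991 and duplicate "1" spanning 1987-1998, whereas the
    # GHCN v3 QCU file carries a single merged 11-digit record for 1961-1998.
    ```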

    • Verity Jones says:

      Actually I am just following NCDC’s terminology in using ‘unadjusted’, but I do realise how much of a misnomer that really is. Thanks for summarising some of the issues.
