Dorothy behind the curtain (Part 1)

A Guest Post by Peter Morcombe

I started out with the aim of understanding “Global Warming” by analyzing raw data for myself rather than accepting whatever the “consensus” might be.  This meant that the data had to be digestible by a laptop rather than a super-computer.  My main data sources are the NOAA/NCDC GHCN v2 ground station records and the GISP2 ice core data.  While there are several alternative sources of data, the NOAA information is wide open to the general public, free of charge or fuss.

One of the problems with “Climate Change” is that we are trying to measure long-term changes of a few tenths of a degree in measurements that have huge variations from night to day and from winter to summer.  Observations show that temperature changes are magnified at high latitudes, leading to an improved SNR (signal-to-noise ratio).  Consequently, I decided to concentrate on Arctic Canada, Greenland and Russia.
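
To make tenths of a degree visible, the standard trick is to work with anomalies: subtract each calendar month’s long-term mean so the seasonal swing drops out before you look for a trend. A minimal sketch in Python (pandas; the column names are my own illustration, not the GHCN file layout):

    import pandas as pd

    # df holds one row per month with columns "year", "month", "temp" (deg C).
    # These column names are illustrative assumptions, not the GHCN v2 layout.
    def monthly_anomalies(df, base_years=(1961, 1990)):
        base = df[df["year"].between(*base_years)]
        # Long-term mean for each calendar month over the baseline period
        normals = base.groupby("month")["temp"].mean()
        # Anomaly = observation minus that calendar month's normal
        return df["temp"] - df["month"].map(normals)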

This approach soon ran into problems because the number of high-latitude stations reported in the GHCN v2 data set dropped like a stone after 1975.  This issue was discussed in some depth here in January 2010.  A few months later the issue was picked up by D’Aleo & Watts and the furor spread to live TV.  It was suggested that stations were being discarded from the GHCN to create a “Warm” bias.

It puzzled me at the time that nobody seemed interested in what NOAA had to say, so I was curious to visit Asheville to ask about this and other issues.  It proved to be easy to set up a meeting, but my business in North Carolina was postponed several months, so the meeting took place late in October.

The National Climatic Data Center in Asheville, North Carolina.

I was expecting to find a small group of scientists located in a dusty annex but instead there was a modern six-story building of at least 150,000 square feet.  My next thought was that NOAA would be a minor tenant of some other government department.  Wrong again!  NOAA is the “landlord” with over 400 staffers.

The “Station Drop Off Problem” has been known to the professionals since Peterson & Vose (1997), so amateurs such as myself have arrived at the scene of the crime a little late.  The trouble with the P&V paper is that while it explains the “What?”, it does not explain the “Why?”

Having spent a dozen years living off Department of Defense research grants, it was easy for me to relate to NOAA researchers.  The folks I met in Asheville were scientists rather than politicians, yet they have to respond to the folks who control the purse strings.  While the general public would like to think it a simple matter to collect every bit of data from every surface weather station, there will never be enough resources to do it.  The “Station Drop Off Problem” is not something that was deliberately planned to create a “Warm” bias; it just happened, owing to shifting priorities, changing arrangements with other countries and budget constraints.

Let’s see how this applies to one of my pet projects.  There are more than a dozen surface weather stations that meet WMO standards in the Canadian Arctic, but only two of them (Alert and Resolute) can be found in recent GHCN v2 records.  It turns out that there was a shake-up at Environment Canada (I confirmed this by contacting the Canadians) that took years to work itself out.  The good news is that the hiatus is over, so the number of stations reported is trending up again.

Then I asked about the GISP2 ice core records, which cover the last 50,000 years with decadal resolution.  Are there any station records at NCDC that could be used to connect the GISP2 data, which ends in 1905, to the present day?  The answer was an unequivocal “No”, but there was some good news: there are several sources of station data for Greenland other than the GHCN v2.

Several automatic stations were set up close to the GISP drilling site.  These are known as Barber, Cathy, Julie, Kenton, Klinck and Matt.  Data from these stations can be found at: ftp://ftp.ncdc.noaa.gov/pub/data/paleo/icecore/greenland/summit/gisp2/climate/climate.txt

Unfortunately, this data only covers 1988 to 1994, so it was not suited to the task of bridging the gap from 1905 to the present day.  Over their brief period of activity the mean temperature recorded in central Greenland was −29°C, compared to −7°C for the coastal stations, owing to the altitude of the central stations (72.6°N, 38.5°W, 3,205 m).
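
That difference is roughly what a standard atmospheric lapse rate would predict: about 6.5°C of cooling per kilometer of altitude times 3.2 km gives about 21°C, close to the observed 22°C gap between the Summit stations and the coast.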

In spite of the large difference in mean temperature, there is a good correlation in temperature anomalies between the coastal stations and the GISP stations.  It should therefore be reasonable to use anomalies calculated for coastal stations to extend the GISP2 temperature record from 1905 to the present.
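
As a rough sketch of the splice I have in mind (Python again; the two series are hypothetical stand-ins, and the real work of choosing baselines and weighting stations is glossed over):

    import pandas as pd

    # gisp2:   hypothetical annual temperature series (deg C) from the ice
    #          core, ending in 1905, indexed by year
    # coastal: hypothetical annual anomaly series (deg C) from coastal
    #          Greenland stations, 1905 to the present, indexed by year
    def extend_gisp2(gisp2: pd.Series, coastal: pd.Series) -> pd.Series:
        # Re-zero the coastal anomalies on the decade around the splice point
        coastal = coastal - coastal.loc[1905:1915].mean()
        # Pin the coastal series to the last GISP2 value and append it
        extension = coastal.loc[1906:] + gisp2.iloc[-1]
        return pd.concat([gisp2, extension])

To be continued…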


7 Responses to Dorothy behind the curtain (Part 1)

  1. John F. Hultquist says:

    Good Start. I saw the Part I at the top so was only partly disappointed at the “to be continued.”

    After we got a broadband connection and I became aware of all that was happening (about October of 2008), I just assumed the “data” were decent. Thus, I took a different path – right to the local university and the head of the physics department. Please give me a recent textbook, I asked. With that and the web I investigated the CO2 claim. Then I found Anthony Watts’ surface stations project.

    Can’t wait to hear the rest of your story.

  2. boballab says:

    Hmm, looks like you got “rolled” by the “scientists” at NCDC. Probably because you didn’t have the right information to ask the right questions, such as:

    1. Since NCDC Asheville is one of the three “World Data Centers for Meteorology”, first established under the auspices of the WMO and now under the ICSU, and is charged with the collection of all CLIMAT reports, why is it that they fail to do so?

    WORLD DATA CENTER FOR METEOROLOGY, ASHEVILLE

    Maintained by: U.S. Department of Commerce, National Oceanic and Atmospheric Administration (NOAA). The WDC for Meteorology, Asheville is operated by, and collocated with, the National Climatic Data Center (NCDC).

    Summary of Data Held: Various data sets from international programs and experiments, including meteorological and nuclear radiation data for International Geophysical Year (see IGY Annals Vol. 26); Global Atmospheric Research Program, World Climate Research Program, World Climate Data and Monitoring Program; and data exchanged with WDC-USA by participating countries.
    International Geophysical Year (IGY). Global meteorological and nuclear radiation data and data products, 1957-1958.
    International Quiet Sun Year (IQSY). Global meteorological data and data products, 1964-1965.
    Global Atmospheric Research Program (GARP):
    GARP Atlantic Tropical Experiment (GATE) 1974; First GARP Global
    Experiment (FGGE) 1978-1979; Winter and Summer Monsoon Experiments
    (WMONEX, SMONEX) for 4-month periods within FGGE; Alpine Experiment (ALPEX) for 2-month period in 1982.
    The World Climate Research Program (WCRP):
    International Satellite Cloud Climatology Project (ISCCP). Global analyses of satellite radiance measurements, 1982-2000. Data products are archived at ISCCP Central Archive and are available from WDC-USA.
    Tropical Ocean Global Atmosphere (TOGA) for specified ocean area, 1985-1994; TOGA Coupled Ocean-Atmosphere Response Experiment for a 12-month period in 1992-1993, including a 4-month intensive campaign in the Western Pacific.
    Global Precipitation Climatology Project (GPCP). Monthly precipitation data from surface, radar and satellite measurements for 1986 onwards.
    World Climate Data and Monitoring Program (WCDMP). Baseline data sets prepared in cooperation with WMO, WDCB and WDCD and exchanges with participating countries.
    Global Historical Climate Network (GHCN). Comprehensive monthly global baseline climate data set of temperature, precipitation, and pressure. The earliest record dates from 1697.
    Comprehensive Ocean-Atmosphere Dataset (COADS) from ships and buoys, some dating from the 1850s.
    Comprehensive Aerological Reference Dataset (CARDS) from radiosondes and rawinsondes, and station histories, 1948-1995.
    High altitude rocketsonde data for 1959-1976.
    Ozone Data for the World from 1965, Atmospheric Environment Service, Department of the Environment, Canada, in cooperation with WMO.
    Solar Radiation and Radiation Balance Data from World Radiation Data Center, St. Petersburg, Russia, in cooperation with WMO, from 1964.
    Synoptic Data for surface and upper air observations, daily and monthly summaries, some in computer form, from countries participating in data exchange activities with WDC-USA.

    2. Why is it that the Global Historical Climate Network, the dataset available to the public free of charge, is not the official dataset, while one compiled by the US Air Force, known as DATSAV2, is?
    National Climatic Data Center
    DATA DOCUMENTATION FOR
    DATA SET 9950 (DSI-9950)
    DATSAV2 SURFACE
    January 6, 2003

    1. Abstract: DATSAV2 is the official climatological database for surface observations. The database is composed of worldwide surface weather observations from about 10,000 currently active stations, collected and stored from sources such as the US Air Force’s Automated Weather Network (AWN) and the WMO’s Global Telecommunications System (GTS). Most collected observations are decoded at the Air Force Weather Agency (AFWA) formerly known as the Air Force Global Weather Central (AFGWC) at Offutt AFB, Nebraska, and then sent electronically to the USAF Combat Climatology Center (AFCCC), collocated with NCDC in the Federal Climate Complex in Asheville, NC. AFCCC builds the final database through decode, validation, and quality control software. All data are stored in a single ASCII format. The database is used in climatological applications by numerous DoD and civilian customers.

    Also of note: how is it the US Air Force is able to find 10,000 active stations but the agency charged with the storage of the world’s temperature data can barely find over 1,200? I can guarantee you that there are nowhere near 8,800 military bases making up the AWN, so most of the stations have to be coming in over the WMO’s GTS system in CLIMAT format, the same system that NCDC has full access to.

    3. Since the US Air Force has already done most of NCDC’s job for it by collecting and QC’ing the “Official” Dataset and then turned around and given it to NCDC:

    AFCCC sorts the observations into station-date-time order, validates each station number against the Air Weather Service Master Station Catalog (AWSMSC), runs several quality control programs, and then merges and sorts the data further into monthly and yearly station-ordered files. AFCCC then provides the data to the collocated National Climatic Data Center (NCDC).

    http://www1.ncdc.noaa.gov/pub/data/documentlibrary/tddoc/td9950.pdf

    Why does NCDC continue this farce that they have no idea why there is a drop in station count? They could drop the current GHCN and just use the “official” dataset, which comprises almost 8 times as many stations (especially since someone else did the work for them), call that the new GHCN, make it public, and then update it with DATSAV2’s replacement, DATSAV3.

    Abstract: The DATSAV3 Surface Database is composed of worldwide surface weather observations from about 10,000 currently active stations, collected and stored from sources such as the Automated Weather Network (AWN) and the Global Telecommunications System (GTS). Most collected observations are decoded at the Air Force Weather Agency (AFWA) at Offutt AFB, Nebraska, and then sent electronically to the USAF Combat Climatology Center (AFCCC). AFCCC builds the final database through decode, validation, and quality control software. All data are stored in a single ASCII format. The database is used in climatological applications by numerous DoD and civilian customers. DATSAV3 refers to the digital tape format in which decoded weather observations are stored. (Two older, discontinued formats were DATSAV and DATSAV2.) The DATSAV3 format conforms to Federal Information Processing Standards (FIPS). The DATSAV3 database includes data originating from various codes such as synoptic, airways, METAR (Meteorological Routine Weather Report), and SMARS (Supplementary Marine Reporting Station), as well as observations from automatic weather stations. The users handbook provides complete documentation for the database and its format. AFCCC sorts the observations into station-date-time order, validates each station number against the Air Weather Service Master Station Catalog (AWSMSC), runs several quality control programs, and then merges and sorts the data further into monthly and yearly station-ordered files. AFCCC then provides the data to the co-located National Climatic Data Center (NCDC).

    http://www.ncdc.noaa.gov/oa/documentlibrary/surface-doc.html

    Bottom line: the WMO’s Global Telecommunications System was set up for the express purpose of transmitting meteorological data, which includes temperature, from stations and agencies around the world. These reports are in a standardized form called CLIMAT, and NCDC has access to every single one that is sent out on that system, just like the US Air Force. The USAF somehow is able to come up with over 10,000 active stations, but NCDC can barely find over 1,200 even after the USAF gives them the dataset made up from those 10,000 active stations, and they don’t have a clue why?
    Rigggghhhht.
    If you believe that from NCDC I got this Bridge for sale with a nice view of Brooklyn.

    • Verity Jones says:

      The other excuse being – left hand, go see what right hand is doing 😉

    • David Jones says:

      DATSAV2 is primarily synoptic (that is, hourly) reports. The QA and QC that goes into these is often minimal. It’s easy to find METAR reports (collected by DATSAV2) that have missing temperatures, missing minus signs, misplaced fields, wet-bulb and dry-bulb swapped, and so on. Presumably the same is true of other hourly reports. Fully automated stations can be useful, but even they have off days when the sensors are faulty, the battery needs replacing, the cable fell out.

      You can easily turn synoptic reports into daily reports and daily reports into monthly reports, but without proper QA and QC you’ll have a lot of garbage data. Which is basically why there are loads more synoptic stations than climate stations (also, they’re much more useful for weather reporting and forecasting).
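
      Here is a minimal sketch of that roll-up in Python/pandas (the “temp” column is a made-up name, and the range check stands in for real QC, which involves far more than this):

          import pandas as pd

          # obs: one row per synoptic report, with a datetime index and a
          # "temp" column in deg C (the column name is an assumption).
          def synoptic_to_monthly(obs: pd.DataFrame) -> pd.Series:
              # Crude QC: drop physically implausible values. Note a missing
              # minus sign can turn -25 into 25 and still pass this in summer.
              ok = obs["temp"].between(-80, 60)
              daily = obs.loc[ok, "temp"].resample("D").mean()
              monthly = daily.resample("MS").mean()
              # Only trust months with a reasonably complete set of days
              counts = daily.resample("MS").count()
              return monthly.where(counts >= 21)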

      CLIMAT reports are monthly and they’re often produced by entirely different stations than the synoptic reports. Do you actually have any evidence that CLIMAT reports are missing?

      • boballab says:

        Let’s take the case of WMO station number 4037108200, Alert, Canada. GHCN says there is only data for that station beginning in 1950 and ending in 1990 in their Mean Raw file, and from 1954-1990 in the Mean Adj file. (You can get GHCN data here: http://www.ncdc.noaa.gov/ghcnm/v2.php)

        However, if you go to Environment Canada’s site (the agency that sent the data that NCDC uses in GHCN) you find that there is data from 1950 to 2005. (You can get the EC data here:
        ftp://ccrp.tor.ec.gc.ca/pub/AHCCD/Homog_monthly_mean_of_daily_mean_temp and station list here: ftp://ccrp.tor.ec.gc.ca/pub/AHCCD/Temperature_Stations)
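
        You can check the coverage gap yourself by pulling the years straight out of the v2.mean file. A rough Python sketch (the fixed-width layout, chars 1-11 station ID, char 12 duplicate number, chars 13-16 year, is per the v2 readme; verify it against your copy, and note I match the station number as a prefix):

            # Which years does GHCN v2.mean actually hold for Alert?
            # (-9999 marks missing monthly values, but any line counts as a year.)
            def ghcn_years(path: str, station_prefix: str) -> set[int]:
                years = set()
                with open(path) as f:
                    for line in f:
                        if line.startswith(station_prefix):
                            years.add(int(line[12:16]))
                return years

            # EC shows 1950-2005; which of those years is GHCN missing?
            missing = set(range(1950, 2006)) - ghcn_years("v2.mean", "4037108200")
            print(sorted(missing))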

        So you are asking me to believe that for those 15 years the Canadians couldn’t figure out how to send a CLIMAT report for that station? At the same time they were able to send CLIMAT reports for Eureka, which according to both the GHCN Raw and EC datasets has up-to-date data. (Note the EC dataset was last updated through 2008, and the GHCN Raw dataset has it up into 2010.)

        Did they somehow all of a sudden forget for 15 years how to send Alert’s data, or is it more likely that NCDC just decided to pitch that data? To me it’s the latter, since if you look at the Eureka data in the GHCN Raw dataset they have the data all the way into 2010, but in the GHCN Adj data they chopped it off way back in 1991, and if you look in the failed-QC file you find that only the years 2006 and 2008 are in it. So for no good reason NCDC pitched the data from the record. It seems that GISS doesn’t share the NCDC opinion that the data from 1991-2010 is no good, since they use the GHCN Raw dataset and have data for Eureka from 1947-2010 in their Adjusted dataset.

        Now if you think that everything there is just some wild coincidence I got some bottom land for sale…cheap…just don’t ask what it is on the bottom of.

        Also, I guess you didn’t read the documentation on DATSAV2, or you never would have made the false remarks you did:
        1. DATSAV2 does not collect anything; it is a dataset compiled by the people that did the collection, the USAF, and named after the tape format.

        DATSAV2 refers to the digital tape format in which decoded weather observations are stored.

        2. DATSAV2 is QC’ed by the Air Force before NCDC gets it.

        AFCCC sorts the observations into station-date-time order, validates each station number against the Air Weather Service Master Station Catalog (AWSMSC), runs several quality control programs, and then merges and sorts the data further into monthly and yearly station-ordered files.

        3. The DATSAV series is the official climatological dataset, not GHCN.

        1. Abstract: DATSAV2 is the official climatological database for surface observations.

        http://www1.ncdc.noaa.gov/pub/data/documentlibrary/tddoc/td9950.pdf

  3. Peter Morcombe says:

    boballab,
    If you got the impression that I was “rolled” by the scientists in Asheville, it may be because I plan to return for a more extended visit in 2011. The scientists I met were very helpful during my first visit and I would like to build on that.

    If there are any issues that you would like to raise, such as “DATSAV2”, I would be happy to add them to my agenda, but I would need some briefing beforehand as much of what you wrote is new to me.

    • boballab says:

      Peter

      I would recommend then that you read the WMO guide here:
      http://www.wmo.int/pages/prog/wcp/documents/Guide2.pdf

      Pay particular attention to section 2.6.2 on the Logging and Reporting of Observations. From there move to Chapter 3 where you find:

      A major step forward in climate database management occurred with the World Climate Data and Monitoring Programme (WCDMP) Climate Computing (CLICOM) project in 1985. This project led to the installation of climate database software on personal computers, thus providing NMHS in even the smallest of countries with the capability of efficiently managing their climate records.

      That was 25 years ago; I’m sure that by now most countries have at least a passing knowledge of what a computer is, what the Internet is and how to use them, and if you read the whole thing, they should be sending their climate data out over them as per the WMO. However, it gets better: under section 3.5, Exchange of Climatic Data, they tell you who every WMO member is supposed to send their climate data to.

      Data are also shared through International Council for Science World Data Centers (WDCs). The WDC system works to guarantee access to solar, geophysical, and related environmental data. It serves the whole scientific community by assembling, scrutinizing, organizing, and disseminating data and information. WDCs collect, document, and archive measurements and the associated metadata from stations worldwide and make these data freely available to the scientific community. In some cases WDCs also provide additional products including data analyses, maps of data distributions, and data summaries. There are climate related International Council for Science WDCs covering Meteorology, Paleoclimatology, Oceanography, Atmospheric Trace Gases, Glaciology, Soils, Marine Geology and Geophysics, Sunspots, Solar activity, Solar‐Terrestrial Physics, Airglow, Aurora, Cosmic Rays, as well as for other disciplines.

      Now, who was it that is one of the 3 WDCs for Meteorology again? NCDC, Asheville, North Carolina.

      Now, as can be seen, starting 25 years ago all climate records, which would include surface temperature readings, were to phase from paper to digital. The WMO provided software and hardware to even the smallest countries for this purpose. All that information is to be sent to the WDCs, whose job it is to archive it. So, bottom line, NCDC has access to every climate station record from every country that is part of the WMO. Now, while this was going on, the WDC-sponsored data set GHCN was dropping stations and getting smaller. You can find graphs of this all over the place, including NASA’s GISTEMP site. For example, using the graph from GISTEMP, there were about 2,500 stations in the GHCN in 2003, while at the same time the USAF somehow was able to find 10,000 active stations in the same year.
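
      If you want to reproduce that drop-off for yourself, counting distinct stations per year in the v2.mean file takes a few lines of Python. A sketch (same fixed-width layout as the v2 readme describes, chars 1-11 station ID, char 12 duplicate number, chars 13-16 year; verify against your copy):

          from collections import Counter

          # Count distinct stations reporting in each year of GHCN v2.mean,
          # ignoring duplicate records for the same station and year.
          per_year = Counter()
          seen = set()
          with open("v2.mean") as f:
              for line in f:
                  key = (line[:11], line[12:16])
                  if key not in seen:
                      seen.add(key)
                      per_year[int(line[12:16])] += 1

          for year in sorted(per_year):
              print(year, per_year[year])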

      Also, whenever you hear the responses from climate scientists they point out that the data is available and use GHCN, but when you read the documentation you find out it isn’t the official data set; the official one is the 10,000-station dataset compiled by the USAF. Where I come from that is called bait-and-switch, which is illegal and is something con men pull.

      What I have just put in this comment is the tip of the NCDC iceberg, and if you want to look at how bad it is I suggest you start here and spend about two weeks reading:
      http://chiefio.wordpress.com/2009/11/03/ghcn-the-global-analysis/

      Oh, another thing: that whole “you need 30 years for a baseline” BS? Here is what the WMO says:

      A number of studies have found that 30 years is not generally the optimal averaging period for normals used for prediction. The optimal period for temperatures is often substantially shorter than 30 years, but the optimal period for precipitation is often substantially greater than 30 years.
