The Horrors of Homogenization

Posted on February 6, 2015 by Roger Andrews

There has recently been a lot of discussion about the homogeneity adjustments that GISS and others have applied to surface air temperature records, and since this is a subject I’ve done some work on I thought it would be appropriate to say something about it.



The problem, however, is how to say it, because the subject defies exhaustive treatment in a single blog post, and I don’t think presenting yet another set of before-and-after examples of what homogenization does to raw records would greatly advance the state of knowledge. So what I will do here is touch on the basics and then work my way through a few examples of what homogenization actually does in practice, one of them in detail.

First a note on data sources. My main data source is the GHCN v3 mean temperature data set available at KNMI Climate Explorer, which includes the raw records (“GHCN all”) and the NOAA/NCDC homogeneity-adjusted records (“GHCN adjusted”). The GHCN v3 data are expressed as anomalies relative to 1981-2010 means. (Note that the GHCN v3 adjusted data set is not the same as the GISS “homogeneity adjusted” data set.) Supplementing the GHCN v3 data are raw GHCN v2 records that I downloaded from the GISTEMP GHCN v2 data set a few years ago. I am unable to update them because GISS has not updated them since October 2011. Other data sources are specified in the text.
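(A procedural note for anyone reproducing this: the anomaly calculation itself is simple. Below is a minimal sketch, with made-up numbers, of how absolute monthly temperatures reduce to anomalies against 1981-2010 monthly means; the `temps` structure is hypothetical and not the KNMI file format.)

```python
# Minimal sketch: absolute monthly temperatures -> anomalies relative to
# 1981-2010 monthly means (how the GHCN v3 data at KNMI are expressed).
# The `temps` dict is made-up illustrative data, not the KNMI file format.
import numpy as np

def monthly_anomalies(temps, base_start=1981, base_end=2010):
    """temps: dict mapping year -> length-12 array of monthly means (degC)."""
    years = sorted(temps)
    data = np.array([temps[y] for y in years])            # shape (nyears, 12)
    base = [i for i, y in enumerate(years) if base_start <= y <= base_end]
    baseline = np.nanmean(data[base], axis=0)             # one mean per month
    return years, data - baseline                         # seasonal cycle removed

# Synthetic demo: a flat 20 degC climate plus a seasonal cycle and noise
rng = np.random.default_rng(0)
season = 10 * np.sin(2 * np.pi * np.arange(12) / 12)
temps = {y: 20 + season + rng.normal(0, 0.5, 12) for y in range(1950, 2015)}
years, anoms = monthly_anomalies(temps)
print(round(float(np.nanmean(anoms)), 2))                 # ~0: a flat record
```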

Homogenization is a process whereby raw temperature records in the same general area are adjusted by computer algorithms to match each other to within acceptable limits. The adjustments are applied because the raw records are assumed to be distorted to a greater or lesser extent by station moves, equipment replacement, time-of-observation changes and/or physical changes in the vicinity of the station, and because it is further assumed that these distortions must be removed before the records can be considered suitable for use. There are, however, two problems with these assumptions.
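(To make the mechanics concrete, here is a toy sketch of the basic idea. It is emphatically not NCDC's actual pairwise algorithm, whose details are linked later in this post; it only illustrates how a difference-from-neighbors series can be scanned for a step and the step removed.)

```python
# Toy sketch of homogenization: difference a target record against a
# neighbor composite, find the single strongest mean shift in the
# difference series, and remove it as a stepwise adjustment.
# Illustrative only -- not the NCDC pairwise algorithm.
import numpy as np

def detect_step(diff):
    """Return (index, size) of the breakpoint that best splits `diff`
    into two segments with different means."""
    d = diff - diff.mean()
    n = len(d)
    best_k, best_score = None, -np.inf
    for k in range(10, n - 10):                  # keep both segments non-trivial
        m1, m2 = d[:k].mean(), d[k:].mean()
        score = k * m1 ** 2 + (n - k) * m2 ** 2  # between-segment variance
        if score > best_score:
            best_k, best_score = k, score
    size = diff[best_k:].mean() - diff[:best_k].mean()
    return best_k, size

def homogenize(target, neighbors):
    """Shift the pre-breakpoint segment of `target` so that the
    target-minus-neighbors series no longer steps at the breakpoint."""
    ref = np.mean(neighbors, axis=0)             # neighbor composite
    k, size = detect_step(target - ref)
    adjusted = target.copy()
    adjusted[:k] += size                         # the "homogeneity adjustment"
    return adjusted, k, size
```

With that picture in mind, back to the two problems.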

The first is that many raw temperature records show no sign of serious distortion. It’s common when comparing raw records from stations in the same area to find that they match quite well, meaning either that all of them are distorted by the same amount in the same direction at the same time, which is improbable, or that none of them is significantly distorted. Figure 1 shows an example from the Nova Scotia-Newfoundland area:



Figure 1: Five Nova Scotia-Newfoundland raw records (Charlottetown, Sydney, St. Johns, Gander, Shearwater). Data GISS


And often when a raw record does show what appears to be a large artificial discontinuity, such as the abrupt ~2C upward shift in the Nome, Alaska record after 1976 ….



Figure 2: Nome, Alaska, raw record. Data GISS


…. it turns out not to be. Other records in Alaska show the same feature (Figure 3). The upward shift was natural. It was caused by the 1976 “phase change” in the Pacific Decadal Oscillation:



Figure 3: Nome plus five other Alaska raw records (St. Paul, Bethel, Tanana, Bettles, Kodiak). Data GISS


Second is the problem of identifying any artificial distortions that may be present. Usually only the really large ones are visible in the raw records; small and medium-sized ones are difficult, and sometimes impossible, to detect either visually or statistically (a sketch of one standard detection test follows Figure 4). The Paraguayan records, some of which were recently featured on Paul Homewood’s blog and elsewhere, are an example. Figure 4 plots eleven unadjusted GHCN v3 temperature records from the area. They are fairly typical of what raw temperature records over much of the world look like:



Figure 4: Eleven raw records in Paraguay and surrounding area (Puerto Casado, Concepción, Ponta Pora, Bahia Negra, Pedro Juan Caballero, Mariscal, Asunción, Corumba, Las Lomitas, San Juan Bautista, Formosa). Data GHCN v3
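As an aside on detectability: one standard statistical changepoint test is the standard normal homogeneity test (SNHT). A compact sketch follows; the data are synthetic and the 95% critical value used is a rough assumption. The point is that a shift a couple of tenths of a degree high frequently fails to clear the threshold, while a Nome-sized shift stands out immediately.

```python
# Sketch of the standard normal homogeneity test (SNHT) for a single mean
# shift. Synthetic data; the 95% critical value (~9.2 for n=100) is a rough
# assumption. Small steps routinely fail to clear it; large ones stand out.
import numpy as np

def snht(x):
    """Return (T_max, k): the SNHT statistic and the best-split index."""
    z = (x - x.mean()) / x.std(ddof=1)
    n = len(z)
    best_T, best_k = 0.0, 0
    for k in range(1, n):
        z1, z2 = z[:k].mean(), z[k:].mean()
        T = k * z1 ** 2 + (n - k) * z2 ** 2
        if T > best_T:
            best_T, best_k = T, k
    return best_T, best_k

rng = np.random.default_rng(1)
n, noise = 100, 0.5                          # 100 years, 0.5 degC noise
for step in (0.2, 2.0):                      # subtle shift vs a Nome-sized one
    x = rng.normal(0, noise, n)
    x[n // 2:] += step
    T, k = snht(x)
    print(f"step={step}: T={T:.1f} at k={k}, flagged={T > 9.2}")
```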


Unless historic records that accurately document the history of each of these stations are available – and here I’m sure they aren’t – identifying artificial discontinuities in records like this is a hopeless task. Yet the records get homogenized anyway, with the results shown in Figure 5:



Figure 5: Paraguayan records after homogeneity adjustment. Data GHCN v3


The adjustments that achieve this result are shown in Figure 6. (The internal workings of the NOAA/NCDC computer algorithm that generated them – hereafter the NCDC algorithm – aren’t strictly relevant to this post but details are here should anyone want more information.)



Figure 6: Homogeneity adjustments applied to Paraguayan records. Data GHCN v3


It’s hard to see how this hodgepodge of adjustments could reflect actual artificial discontinuities in the records. The NCDC algorithm seems to have applied adjustments simply to make the records track each other. But at the same time it adds about a degree of warming that isn’t seen in the raw records – where did that come from? Clearly there are questions as to whether the homogenization process is working the way it’s supposed to here. (A case can also be made that if adjustments this large are needed to make the records “correct” then the records were too distorted to have been used in the first place, but we’ll let that pass.)

To obtain further insights into how homogenization works we now turn to a detailed example – Alice Springs in Australia, which, being the only continuous long-term record for many miles in any direction, is one of the more important records in the global surface air temperature database. Alice is a good example of what homogenization does in practice, and it also gives an idea of the degree of detail we have to go into before we can decide whether a record needs adjusting.

Figure 7 presents the Alice record in its raw state. It shows little or no warming since 1880 and no obvious evidence of artificial discontinuities:



Figure 7: Alice Springs raw record. Data GHCN v3


First we will review the history of the Alice station, which I reconstructed from historic records and Australian Bureau of Meteorology metadata (example here). Temperature measurements at Alice began in 1879 at the Telegraph Office north of town, shown in the old sepia photo below.



Figure 8: The Telegraph Office


A key factor in evaluating temperature records is station quality, and this looks like a good place for a station (at least urban warming wouldn’t have been a problem). But before we can confirm that it was, we need to know that the thermometer was out in the open and properly screened. Thanks to a painstaking job of restoration by the local authorities, plus a tourist photo posted on Google Earth, I was able to find the location. The structure inside the black circle is a Stevenson screen, and given how persnickety the restorers of historic sites are, we can reasonably assume that it was part of the original installation:



Figure 9: Tourist photo revealing screen location


Figure 10 pinpoints the screen on a Google Earth overhead view. The location probably would not have made WMO class 1, but it’s a high-quality site nonetheless (the red lines on this and following overhead views are provided for scale and are 100 m long unless otherwise specified).



Figure 10: Google Earth view of Telegraph Office thermometer location


In 1932 the Telegraph Office station was decommissioned and replaced by a station 3km south at the Alice Springs Post Office. Figure 11 shows what the site looks like now (the Post Office itself has moved to the suburbs). It would not of course have looked like this in 1932, but station quality would probably still have suffered and we might expect a discontinuity in the temperature record as a result.



Figure 11: Current Google Earth view of Post Office thermometer location (approximate)


The Post Office station operated until 1989, but in 1942 the station at what is now known as the Old Airport, located 11 km south, supplanted it as the official recording site. The Old Airport was a wartime base, so there would probably have been changes in equipment and observational procedures as well as a station move. The location of the Old Airport station is shown, as best I can fix it, along with the current airport station in Figure 12. The exact location is uncertain, but it was probably on or close to the asphalt hardstanding to the right of the buildings – hardly an ideal place for a thermometer – so we might expect another discontinuity in the Alice record in 1942.



Figure 12: Google Earth view of Old Airport and Current Airport thermometer locations


In 1974 the station was relocated for the third time, from the Old Airport to the current airport station a little less than a kilometer to the northeast (Figure 13). We might expect another discontinuity in the temperature record here, because the current airport station is a high-quality station (I rate it WMO class 2) while the Old Airport station probably was not:



Figure 13: Google Earth close-up of current airport station thermometer location


Figure 14 shows the four station locations relative to each other. The current airport station is 13km south of the original Telegraph Office station and 42m lower.

So here we have three documented station moves, each of which might be expected to have generated an artificial discontinuity in the Alice temperature record. Did they?



Figure 14: Google Earth view of the four Alice Springs thermometer locations


One way to find out is to compare the individual records from the four stations. First we compare the Telegraph Office and Post Office records. There are no reliable overlap values but visually there’s no obvious sign of a discontinuity:



Figure 15: Telegraph Office and Post Office temperature records. Data GISS



Next we superimpose the Old Airport record. It matches the Post Office record very closely. There’s definitely no discontinuity here:



Figure 16: Telegraph Office, Post Office and Old Airport temperature records. Data GISS



Finally we superimpose the current airport station. It also overlays the Post Office record almost exactly. No discontinuity here either.



Figure 17: Telegraph Office, Post Office, Old Airport and current airport temperature records. Data GISS



Matches this close are, however, suspicious. Temperature records from different stations rarely line up this well. Is it possible that the four Alice records were at some point adjusted to match each other? Indeed it is. But if this is what was done the raw Alice record is already homogenized. There’s no need to re-homogenize it. And if the numbers weren’t adjusted there’s also no need to homogenize it. It was homogeneous to begin with.
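(One way to formalize "no discontinuity at the joins" is to difference the Alice record against a reference series and test for a mean shift in windows around each documented move date. A minimal sketch follows, assuming annual anomalies on a common baseline; the array names, the 10-year window and the simple t-test are my choices, not part of the original analysis.)

```python
# Sketch: test a documented station move for a step by differencing the
# station against a reference series (removing shared regional climate)
# and comparing windows on either side of the move. The t-test and the
# 10-year window are illustrative choices.
import numpy as np
from scipy import stats

def step_at_move(station, reference, years, move_year, window=10):
    """station, reference: annual anomaly arrays on a common baseline;
    years: matching NumPy year axis. Returns (step in degC, p-value)."""
    d = station - reference
    before = d[(years >= move_year - window) & (years < move_year)]
    after = d[(years >= move_year) & (years < move_year + window)]
    t, p = stats.ttest_ind(before, after, equal_var=False)
    return after.mean() - before.mean(), p

# At Alice one would run this at 1932, 1942 and 1974; steps near zero with
# large p-values would match the visual conclusion that the joins are seamless.
```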

I ran a final check by comparing the raw Alice record with the raw records from stations around it, which as noted earlier is a good way of confirming that a record isn’t seriously distorted. Unfortunately there are no records close to Alice that are long enough to tell us anything, so I had to settle for the records from the six stations shown in Figure 18, which are up to 900km away.



Figure 18: Stations around Alice Springs


Figure 19 plots the records from these six stations against the Alice record. The match isn’t perfect, but with the stations covering an area of at least a million sq km we wouldn’t expect it to be. Yet the peaks and troughs generally line up and the overall trends are substantially the same, giving us confidence that all of them are recording mostly real, not spurious, temperature changes. (The plot excludes three 2-sigma outliers: Boulia in 1893 and 1894, and Halls Creek in 1949. The 1957-84 period is used as the baseline because all seven stations were operating in those years):



Figure 19: Alice Springs raw temperature record vs. raw records from surrounding stations. Data GISS
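The mechanics behind this comparison are straightforward. A minimal sketch, assuming each record is an annual series aligned on a common year axis (the function and array names are hypothetical):

```python
# Mechanics behind a Figure 19-style comparison: rebaseline every record on
# 1957-84 (the years all seven stations operated) and blank 2-sigma outliers
# such as Boulia 1893-94 and Halls Creek 1949.
import numpy as np

def rebaseline(series, years, base=(1957, 1984)):
    """Annual series -> anomalies relative to the common base period."""
    in_base = (years >= base[0]) & (years <= base[1])
    return series - np.nanmean(series[in_base])

def blank_outliers(anom, nsigma=2.0):
    """Replace values more than nsigma standard deviations from the
    mean with NaN so they drop out of the comparison."""
    mu, sd = np.nanmean(anom), np.nanstd(anom)
    out = anom.copy()
    out[np.abs(anom - mu) > nsigma * sd] = np.nan
    return out
```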



That concludes the detailed analysis of the raw Alice Springs temperature record (although it still isn’t as detailed as it should be; a full analysis would, among other things, require a review of the original paper records to confirm that the individual records shown in Figures 15 through 17 are what I think they are). Does the analysis prove beyond doubt that the raw record is correct? No, it doesn’t. But neither has it revealed any obvious flaws, and one of the things I learned in the years I spent analyzing assay databases in the mining industry is that if your raw data are not obviously flawed you don’t mess with them.

But few records in the Southern Hemisphere escape the gentle ministrations of NCDC’s homogenization algorithm, and the Alice record is no exception. Figure 20 shows what it looks like after the algorithm has done its work:



Figure 20: Homogeneity adjustments applied to Alice Springs raw record. Data GHCN v3



How did the algorithm obtain these results? The adjustments tell the story. The algorithm identified most of the peaks and troughs in the raw record as artificial discontinuities and smoothed them out with stepwise adjustments. But only one of these adjustments (1930-32) coincides with a known station move. The others coincide with fluctuations that are all visible to a greater or lesser extent in the records from surrounding stations (Figure 19), indicating that they are real climatic features and not spurious shifts. So instead of replacing distorted data with valid data the algorithm has replaced valid data with distorted data. (The gaps in the adjusted record, incidentally, show where the algorithm decided the raw data failed quality control.)

But again the adjustments add warming – in this case 1.5°C, even more than they added in Paraguay. How did the NCDC algorithm achieve this? I’m at a loss to explain it. Warming can be added if a record is homogenized with surrounding records that show more warming, but there’s no record anywhere near that shows as much as the +2°C of warming the adjusted Alice record shows.

Now homogeneity adjustments do not of course always add warming. In some cases they add cooling. But the impact over large areas is to bias the raw records towards warming. A few years ago I did a before-and-after analysis of 52 records in South America using the GISS raw records and an older version of NCDC’s adjusted GHCN v3 data set that has since been superseded, so the results are no longer current. Nevertheless they still illustrate the impact of homogeneity adjustments at the sub-continental scale. The results are summarized in Figure 21, which plots the warming/cooling trends measured from the raw GHCN v2 records against the adjustments applied by the homogenization algorithm:



Figure 21: Raw record warming trend vs. homogeneity adjustment, 52 records in South America


The implications of these results are not immediately obvious so I will summarize them verbally. The trend line slopes up to the left, signifying that raw records that show cooling have received larger warming adjustments than those that show warming. This is the way the trend line has to go if the raw records are to be homogenized. But the warming adjustments applied to the records that show cooling are not offset by cooling adjustments to those that show warming, and as a result the adjustments add about a degree of overall warming that isn’t present in the raw records (the average of all the adjustments is +1.05°C). Again I am unable to explain where the extra warming came from, but it’s clearly been manufactured somehow by the algorithm, like the warming at Alice Springs and in Paraguay. (I have a comparable plot for Africa south of the Equator which I won’t bother to show because it looks very much like the one for South America.)

And if you are still confused, look at where the trend line crosses (0,0). If the homogenization algorithm is unbiased the trend line will pass through (0,0). Here it passes more than a degree C above it.
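(For anyone wanting to reproduce this check, the test is a one-line regression: fit the per-station adjustments against the per-station raw trends and look at the intercept. A sketch with hypothetical inputs:)

```python
# The bias check is a simple regression of per-station adjustments on
# per-station raw trends. Unbiased homogenization should give an intercept
# near zero; the post reports a mean adjustment of +1.05 degC instead.
# Inputs are hypothetical arrays of per-station values in degC/century.
import numpy as np

def adjustment_bias(raw_trends, adjustments):
    slope, intercept = np.polyfit(raw_trends, adjustments, 1)
    return slope, intercept, float(np.mean(adjustments))

# A negative slope reproduces the up-to-the-left line in Figure 21 (cooling
# records get the largest warming adjustments); an intercept well above zero
# is warming added across the board, the pattern described in the text.
```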

So homogeneity adjustment adds warming in Central Australia, Southern Africa and South America, and similar adjustments by the Australian Bureau of Meteorology and NIWA add warming over the rest of Australia and over New Zealand too. Pretty much the entire Southern Hemisphere is adjusted. How does one justify adding warming to raw records over the entire Southern Hemisphere? One doesn’t. The warming is clearly manufactured, spurious, non-existent.

Curiously, however, the raw surface air temperature records in the Northern Hemisphere are rarely subjected to warming-biased homogeneity adjustments. I will not speculate as to why. I will just observe that they show approximately twice as much warming as the raw records in the Southern Hemisphere and leave it at that.

Two questions remain. First, how much difference does the manufactured warming in the Southern Hemisphere make? As a practical matter, not very much. The impact on global land air temperature series like CRUTEM4 is muted by the fact that less than a third of the Earth’s land area is in the Southern Hemisphere, so the impact on the global land surface temperature record would be only in the 0.1X degrees C range even if the amount of warming over the Southern Hemisphere landmasses had been artificially doubled. And the impact on “surface temperature” series like HadCRUT4, which are about 70% based on SSTs, would be down in the 0.0X degrees C range. So removing the homogeneity adjustments doesn’t make global warming go away.
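(The arithmetic behind those ranges, as a back-of-envelope check; the spurious-warming figure is an assumption for illustration, while the fractions come from the text:)

```python
# Back-of-envelope version of the impact arithmetic. The spurious-warming
# figure is an assumption for illustration; the fractions are from the text.
sh_spurious_warming = 0.4   # degC of added warming over SH land (assumed)
sh_land_share = 1 / 3       # SH holds less than a third of global land
land_weight = 0.30          # land weight in a ~70%-SST series like HadCRUT4

land_series_impact = sh_spurious_warming * sh_land_share   # ~0.13 degC
blended_impact = land_series_impact * land_weight          # ~0.04 degC
print(f"land-only: ~{land_series_impact:.2f} C; blended: ~{blended_impact:.2f} C")
```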

As to the second and potentially more troublesome question of why such obviously flawed adjustments are being applied, I will leave that up to the judgment of the reader.

http://euanmearns.com/the-horrors-of-homogenization/

............
Dave Rutledge says:
February 6, 2015 at 2:53 am
Hi Roger,

Thank you for a thoughtful post.

I spent some time trying to understand the temperature records for my father’s home of Detroit Lakes, Minnesota. This station got some attention in the blogs. I found that the adjustments were quite large, up to 2C. They were different for different months and for max and min readings. So the trends for different months and max and min readings changed in completely different ways. In addition, an obvious jump was missed by the algorithm when an air conditioner was added near the thermometer in 1999. The July max jumped 3.5 degrees the next year, while the average increase for the ten nearest stations was 0.5 degrees. However, no adjustment was made, and this biased the trend upward.

I spent thirty years in experimental electronics research, and I just could not imagine making the kinds of adjustments that were being made. Better to have a simple quality test for the raw data, and then just live with the results. This is how state maximum temperature records are done. And it turns out that 23 state records were set during the 30s, but only one has been set so far this decade.
............

edhoskins says:
February 6, 2015 at 6:13 am
As support for your findings, please have a look at the example of the Dale Enterprise station in Virginia. It is typical of these sorts of one-way adjustments being made to the land-based temperature record. It is a single, correctly sited, continuously well-maintained rural US weather station. Its records are instructive. The unadulterated record even shows modest cooling of 0.29°C per century, if the adjustments made by “climate scientists” are ignored.

However, NASA GISS has published “value added” temperatures for this same location. These show a massive adjustment, lowering past temperatures before 1965 to give the impression of very substantial (+0.78°C/century) warming at this station. Of particular interest is the apparent stepwise adjustment of the homogenised data, which would seem to be truly spurious. It is graphed, along with other articles, on the site below:

https://edmhdotme.wordpress.com/2014/10/06/official-adjustments-to-temperature-records-worldwide/

Cumulatively the result has been to emphasise warming in the US rural data sets by some 0.47°C/century. These adjustments are always a one-way street that emphasises the apparent amount of warming. The table at the linked site clearly shows the scale and impact of the overall adjustments in the USA.

.........
Euan Mearns says:
February 6, 2015 at 11:26 am
Fascinating insight, Roger. I really like the Alice Springs case study: a completely flat temperature record gets cooked into warming. It seems likely that the guys on the ground had already applied adjustments for site changes etc. and that GHCN has adjusted again. It’s really quite shocking that this is going on.

You claim that all S hemisphere records are cooked in this way, and, like Graham, I don’t understand why this evidently has so little impact on the global outcome. Perhaps another post to explain how the global record is constructed and weighted?
.........
A C Osborn says:
February 6, 2015 at 3:40 pm
And this is really what it is all about.
“This is probably the most difficult task we have ever given ourselves, which is to intentionally transform the economic development model, for the first time in human history,” Christiana Figueres, who heads up the U.N.’s Framework Convention on Climate Change, told reporters. See more at: http://www.thegwpf.com/un-climate-chief-we-are-remaking-the-world-economy/

JerryC says:
February 6, 2015 at 7:15 pm
Ah, I believe that intentionally transforming the economic development model has been attempted at least once previously.

Glorious Five Year Plans, anyone?


........
A C Osborn says:
February 7, 2015 at 4:00 pm
Javier, BEST “tinkering” is even worse than GISS & NCDC and they openly admit it.
Their Final Product is formulated based on Model Predictions of what the Temperatures “should look like”.
They combine many stations and also Splice & Dice the data.