Politics : The Environmentalist Thread


To: Wharf Rat who wrote (29013), 5/29/2010 11:36:08 AM
From: Brumar89
 
How bad is the global temperature data?

Pretty darn bad.

13 10 2009
By Joseph D’Aleo, AMS Fellow, CCM

In this recent post, we discussed the problems with recent data showing that the argument presented by the EDF's millionaire lawyer, playing clueless environmentalist on Lou Dobbs Tonight, that this will be the warmest decade, is nonsense. That claim was well refuted, and Al Gore's credibility dismantled, by Phelim McAleer of the new documentary Not Evil, Just Wrong, which challenges the LIES and exaggerations (totalling 35) in Al Gore's sci-fi horror comedy film, An Inconvenient Truth. Nine were serious enough for a UK judge to require that a disclaimer itemizing them be read whenever the movie was shown in schools.

The world's climate data has become increasingly sparse, with a big dropoff around 1990 and a tenfold increase in missing months around the same time. Stations (90% in the United States, which has the Cadillac of data systems) are poorly to very poorly sited and not properly adjusted for urbanization. Numerous peer-reviewed papers suggest an exaggeration of the warming by 30%, 50% or even more. The station dropout can be seen clearly in the two maps below, with the number of stations going from over 6,000 in April 1978 to just 1,079 in April 2008.

April 1978 GISS global plot

April 2008 GISS global plot

Note the big gaps in the recent data in Canada, Greenland, Africa, South America, parts of western Asia, and parts of Australia.

SEE FOR YOURSELF

Take this test yourself to see what bad shape the global database is in. Look for yourself by following these directions, using the window into the NOAA GHCN data provided by NASA GISS here.

Point to any location on the world map. You will see a list of stations and approximate populations. Locations with populations below 10,000 are assumed to be rural (even though Oke has shown that even a town of 1,000 can have an urban warming of 2.2°C).

A personal note: one of these weather stations is located in my hometown, and it would be classed as rural since the population is under 10,000. For decades it was run by a peach and apple farmer named Bon Hartline, and the temperature readings were taken on his farm on the outskirts of town. He's dead now, but the weather station is still there. His orchards and home are long gone, and the station now sits in a developed area surrounded by businesses, asphalt parking lots and so on ...

You will see that the stations have a highly variable range of years with data.

Try to find a few stations with data that extends to 2009. To see how complete the data set is for a station, click "Download monthly data as text" at the bottom left of the graph.

For many, many stations, you will see that the data set, in monthly tabular form, has many missing months, mostly after 1990 (designated by 999.9).
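If you would rather count the gaps than eyeball them, a few lines of code will do it. Here is a minimal Python sketch, assuming (hypothetically) rows of a year followed by 12 monthly values, with 999.9 as the missing-month flag described above; adjust the layout to the actual file you download:

def count_missing_months(path, sentinel=999.9):
    # Count how many monthly values in a downloaded station text file
    # carry the 999.9 missing-data flag.
    missing = total = 0
    with open(path) as f:
        for line in f:
            fields = line.split()
            # Assumed layout: YEAR followed by 12 monthly means (Jan..Dec)
            if len(fields) >= 13 and fields[0].isdigit():
                for value in fields[1:13]:
                    try:
                        v = float(value)
                    except ValueError:
                        continue
                    total += 1
                    if v == sentinel:
                        missing += 1
    return missing, total

missing, total = count_missing_months("station_monthly.txt")
print(missing, "of", total, "months missing")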


This required the data centers to estimate data for that location's grid box from other stations nearby (homogenization). In the 2008 plot above, only 1,079 stations were used. NASA went to locations within 250 km (155 miles) to find data for the grid boxes. Grid boxes without stations within 250 km are left blank, hence the large gaps.
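To make the infilling idea concrete, here is a toy Python sketch of distance-weighted grid-box estimation. This is not GISS's actual algorithm; the 250 km radius comes from the text above, while the linear distance weighting is simply an assumption for illustration:

import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points, in kilometres.
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def gridbox_estimate(box_lat, box_lon, stations, radius_km=250.0):
    # stations: list of (lat, lon, anomaly) tuples.
    # Returns None (a blank grid box) if no station is within radius_km.
    weights, values = [], []
    for lat, lon, anomaly in stations:
        d = haversine_km(box_lat, box_lon, lat, lon)
        if d <= radius_km:
            weights.append(1.0 - d / radius_km)  # linear taper (an assumption)
            values.append(anomaly)
    if not weights:
        return None  # no data within 250 km: the box is left blank
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)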

Most of the stations that dropped out were rural, so more of the missing months are being filled in with urban data in the grid boxes, maximizing the urban heat island effect.


WUWT volunteer John Goetz created a video that shows the worldwide dropout of weather stations.

One example of how well, or badly, this works comes from Maine. Volunteers completed surveys of the United States Historical Climatology Network (USHCN) temperature stations in Maine for Anthony Watts' surface station evaluation project. The survey determined that every one of the stations in Maine was subject to microclimate or urbanization biases. One station especially surprised the surveyors: Ripogenus Dam, a station that was officially closed in 1995.


Despite being closed in 1995, USHCN data for this station is publicly available through 2006! (GISS stopped in 1995.)

Part of the USHCN data is created by a computer program called "filnet", which estimates missing values. According to NOAA, filnet works by using a weighted average of values from neighboring stations. In this example, data was created for a station that no longer exists from surrounding stations which, as the same evaluation noted, were all subject to microclimate and urban biases that are no longer adjusted for. Note the rise in temperatures after this station, the best-sited and truly rural station in Maine, was closed. GISS does display this station, which incorporated the "filnet" data for missing months, although, as noted, GISS stopped its plot in 1995, while NOAA extended it artificially to at least 2006.
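As an illustration, here is a toy version of that weighted-average infilling in Python. It is not NOAA's actual "filnet" code, and the weights are invented; the point is that if the neighboring stations all carry an urban warm bias, the estimate for the closed rural station inherits it:

def filnet_style_estimate(neighbor_values, weights):
    # Weighted average of neighboring stations' readings for one month,
    # a toy stand-in for the infilling NOAA describes for "filnet".
    return sum(w * v for w, v in zip(weights, neighbor_values)) / sum(weights)

# Three urban-biased neighbors produce an urban-biased "rural" estimate:
print(filnet_style_estimate([1.2, 0.9, 1.4], [0.5, 0.3, 0.2]))  # 1.15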

How can we trust NOAA/NASA/Hadley assessments of global changes given these and the other data integrity issues? Given that Hadley has destroyed the old original data because they were running out of room in their data cabinets, can we ever hope to reconstruct the real truth?

As one of our scientist readers noted: "Well, the 999.9s you showed me today sure opened my eyes ... the ramifications are stunning. I knew about the drop-off of stations before, but never that existing station reports are so full of gaps or that they're getting temperature readings from 'ghost' stations. This is, as you have said, GARBAGE." See PDF here.

wattsupwiththat.com

-----------------------------------------------

GISS & METAR – dial “M” for missing minus signs: it’s worse than we thought

17 04 2010
Here's a story about how one missing letter, an M, can wreck a whole month's worth of climate data. This is one of the longest posts ever made on WUWT; I spent almost my entire Saturday on it. I think it might also be one of the most important, because it demonstrates a serious weakness in surface data reporting.

In my last post, we talked about a curious temperature anomaly that Jean S. found in the March GISS data and posted at Climate Audit:

The anomaly over Finland has an interesting signature to it, and the correction that GISS posted on their website confirms something I’ve been looking at for a few months.

The data shown between 4/13 and 4/15 were based on data downloaded on 4/12 and included some station reports from Finland in which the minus sign may have been dropped.

With some work I started back in late December and continued through January, and with GISS putting its stamp of approval on "missing minus signs", I can now demonstrate that missing minus signs aren't just an odd event; they happen with regularity, and the effect is quite pronounced when they do. This goes to the very heart of data-gathering integrity and is rooted in simple human error. The fault lies not with GISS (though they now need a new quality control feature) but mostly with NOAA/NCDC, which manages the GHCN and also needs better quality control. The error originates at the airport, likely with a guy sitting in the control tower. Readers who are pilots will understand this when they see what I'm talking about.

I’ve seen this error happen all over the world. Please read on and be patient, there is a lot of minutiae that must be discussed to properly frame the issue. I have to start at the very bottom of the climate data food-chain and work upwards.
.......
As mentioned in the recently updated compendium of issues with the surface temperature data by Joe D'Aleo and myself, there has been a move in the Global Historical Climatology Network (GHCN) to rely more and more on airports for climate data. This, in my opinion, is a huge mistake because, in addition to those issues ...

E.M. Smith aka “Chiefio” reports that in GISS (which uses GHCN) worldwide, there has been a wholesale migration towards airport weather data as a climatic data source.
............
He reports that in the year 2009, almost 92% of the USA's GHCN stations are airports.

So clearly, airports make up a significant portion of the climate data.

On the issue of airports as climate stations: setting aside the obvious problems with siting, UHI, failing ASOS instrumentation, and conflicting missions (aviation safety vs. climate), I'm going to focus on one other thing unique to airports: METAR.

What is METAR, you ask? Well, in my opinion, a government-invented mess.
........
The SA method originated with airmen and teletype machines in the 1920s and lasted well into the 1990s. But like anything these days, government stepped in and decided it could do it better. You can thank the United Nations, the French, and the World Meteorological Organization (WMO) for this one. SA reports were replaced by METAR in 1996.

From Wikipedia’s section on METAR

METAR reports typically come from airports or permanent weather observation stations. Reports are typically generated once an hour; if conditions change significantly, however, they can be updated in special reports called SPECIs. Some reports are encoded by automated airport weather stations located at airports, military bases, and other sites. Some locations still use augmented observations, which are recorded by digital sensors, encoded via software, and then reviewed by certified weather observers or forecasters prior to being transmitted. Observations may also be taken by trained observers or forecasters who manually observe and encode their observations prior to transmission.

History
The METAR format was introduced 1 January 1968 internationally and has been modified a number of times since. North American countries continued to use a Surface Aviation Observation (SAO) for current weather conditions until 1 June 1996, when this report was replaced with an approved variant of the METAR agreed upon in a 1989 Geneva agreement. The World Meteorological Organization’s (WMO) publication No. 782 “Aerodrome Reports and Forecasts” contains the base METAR code as adopted by the WMO member countries.[1]

Naming
The name METAR is commonly believed to have its origins in the French phrase message d’observation météorologique pour l’aviation régulière (“Aviation routine weather observation message” or “report”) and would therefore be a contraction of MÉTéorologique Aviation Régulière. The United States Federal Aviation Administration (FAA) lays down the definition in its publication the Aeronautical Information Manual as aviation routine weather report[2] while the international authority for the code form, the WMO, holds the definition to be aerodrome routine meteorological report. The National Oceanic and Atmospheric Administration (part of the United States Department of Commerce) and the United Kingdom’s Met Office both employ the definition used by the FAA. METAR is also known as Meteorological Terminal Aviation Routine Weather Report or Meteorological Aviation Report.
..........
Here is where METAR coding departs from normal numeric convention. SA reports did not have this problem.

In the METAR report above, instead of using the normal way we treat and write negative numbers, some policy wonk decided that we’ll use the letter “M” to report a negative number. Only a bureaucrat could think like this.

So instead of a below zero Centigrade temperature and dewpoint looking like this:
-04/-07
in the “new and improved” METAR coding, it looks like this:
M04/M07

OK, not a problem, you say? Well, I beg to differ, because it forces technicians who manually code METAR reports for transmission to do something they would not do anywhere else: write down an "M" instead of a minus sign. Using an "M" is totally counter-intuitive, runs against basic math training, and increases the likelihood of error.
It gets worse. Let's say the technician makes a boo-boo and puts a minus sign instead of an "M" in front of the numbers for temperature/dewpoint. You'd think this would be all right and the system would correctly interpret it, right?
.......
Hey, look at that: the temperature is 39°F (3.8°C). Minus signs are discarded during METAR decoding. Note that the decoded METAR temperature comes out the same whether the report reads 04/-07 or 04/M07; in both cases the missing "M" in front of the temperature makes it positive.

If it had been decoded correctly we would have gotten:
(-4) degrees Celsius = 24.8 degrees Fahrenheit
A whole 14.2 degrees F difference!
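To see how a decoder can produce that result, here is a small Python sketch of a METAR temperature-group parser that behaves the way described above: "M" means minus, while a stray "-" or a missing "M" silently comes out positive. This is an illustration, not any agency's actual decoder:

import re

def decode_metar_temp(group):
    # Decode a METAR temperature/dewpoint group such as "M04/M07".
    # "M" is read as minus; a bare "-" is discarded, as described above.
    m = re.fullmatch(r"(M?)-?(\d{2})/(M?)-?(\d{2})", group)
    if not m:
        return None
    temp = int(m.group(2)) * (-1 if m.group(1) == "M" else 1)
    dew = int(m.group(4)) * (-1 if m.group(3) == "M" else 1)
    return temp, dew

for g in ("M04/M07", "-04/-07", "04/M07"):
    t, d = decode_metar_temp(g)
    print(g, "->", t, "C =", round(t * 9 / 5 + 32, 1), "F")
# M04/M07 -> -4 C = 24.8 F  (coded correctly)
# -04/-07 ->  4 C = 39.2 F  (minus signs discarded)
# 04/M07  ->  4 C = 39.2 F  (missing "M": sign flipped)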
..........
Maddeningly, even when egregious errors in aviation weather data are pointed out, and even acknowledged by the reporting agency, NOAA keeps them in the climate record, as was demonstrated last year at Honolulu International Airport, where a string of new high temperature records was set by a faulty ASOS reporting station. NOAA declined to fix the issue in the records:

NOAA: FUBAR high temp/climate records from faulty sensor to remain in place at Honolulu

The key sentence from that story from KITV-TV:
The National Weather Service said that is not significant enough to throw out the data and recent records.
..............
Clearly, NOAA simply doesn't seem to care that erroneous records find their way into the climatic database.
...........
OK back to the METAR issue.

The problem with METAR reporting errors is worldwide; I've found many examples easily in my spare time. Let's take, for example, a station in Mirnvy, Russia. It is in Siberia at 62.5°N, 113.9°E, has an airport, is part of GHCN, and reports in METAR format.

Weather Underground logs and plots METAR reports worldwide, and these METAR reports are from their database on November 11th, 2009.

It shows a clear error in the 12:30PM (0330Z) and 1PM (0400Z) METAR reports for that day:
wunderground.com
UERR 010330Z 22005G08MPS 9999 -SN 21/M23 Q1026 NOSIG RMK QFE738 24450245
UERR 010400Z 22005G08MPS 9999 -SN SCT100 OVC200 20/M22 Q1025 NOSIG RMK QFE737 24450245
UERR 010430Z 21005G08MPS 4000 -SN SCT100 OVC200 M20/M22 Q1024 NOSIG RMK QFE737 24450245
UERR 010500Z 21005G08MPS 4000 -SN SCT100 OVC200 20/M22 Q1023 NOSIG RMK QFE736 24450245
Note the missing "M" on the 12:30PM (0330Z) and 1PM (0400Z) reports. It happens again at 2PM (0500Z). Of course, it isn't very noticeable looking at the raw METAR reports but, like the GISS plot of Finland, it stands out like a sore thumb when plotted visually, thanks to Weather Underground:

Mirnvy, Russia

The effect of the missing "M" is plotted above, and coincidentally the plot looks like an "M".
Put those METAR reports into this online METAR decoder, wx-now.com, and you get 70°F for 12:30PM and 68°F for 1PM.
What do you think a 70°F spike will do to monthly averaged climate data in a place where the temperature stays mostly below freezing the entire month?
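Here is a back-of-the-envelope sketch: even one sign-flipped reading moves a monthly mean noticeably. Assume, purely for illustration, one reading per day in a month that sits at -20°C:

# Toy illustration: one sign-flipped value in an otherwise -20 C month.
readings = [-20.0] * 30
clean_mean = sum(readings) / len(readings)   # -20.0 C

readings[10] = 20.0                          # a missing "M" flips the sign
bad_mean = sum(readings) / len(readings)     # about -18.7 C

print(round(bad_mean - clean_mean, 2))       # about +1.33 C of spurious warming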
...............
Yakutsk, Russia, also in Siberia, is part of GHCN and has METAR reporting errors. Here's an example of what one mis-coded hourly reading will do to the climate database.

The city of Yakutsk, one of the coldest cities on earth, reported a high of 79°F on November 14th with a low of -23°F.
..........
A month later it happened again, with a reported high of 93°F on December 14th and a low of -34°F.
...........
[ No wonder those anomaly maps show so much warmth in the Arctic!

Remember this when you hear we just had the warmest month/year ever.

GHCN, CRU, GISS - they operate on the Garbage In, Gospel Out principle. ]
..........
And it was not a one-time occurrence, happening again on Dec 25th, as shown in the monthly graph:
......
It happened twice on Feb 2nd, 2007, and with a space added between the M and 09 on the 0300Z report, it is a clear case of human error:
.....
It is not just Russia that has METAR reporting errors.
Lest you think this a fault of Russia exclusively, it also happens at other northern hemisphere Arctic sites and in Antarctica.
...........
Missing M’s – Instant Polar Amplification?

It has been said that the global warming signature will show up at the poles first. Polar Amplification is defined as:

“Polar amplification (greater temperature increases in the Arctic compared to the earth as a whole) is a result of the collective effect of these feedbacks and other processes.” It does not apply to the Antarctic, because the Southern Ocean acts as a heat sink. It is common to see it stated that “Climate models generally predict amplified warming in polar regions”, e.g. Doran et al. However, climate models predict amplified warming for the Arctic but only modest warming for Antarctica.

Interestingly, the METAR coding error has its greatest magnitude at the poles, because the difference made by a missing minus sign grows larger as the temperature gets colder. Eureka, NWT is a great example, going from -43°C to +43°C (-45.4°F to 109.4°F) with one missing "M".

You wouldn't notice METAR coding errors at the equator, because the temperature never gets below 0°C, so nobody would ever have to code an "M". In middle latitudes you might see it happen, but it is much more seasonal and the difference is not as great.

For example:

M05/M08 to 05/M08 brings the temperature from -5°C to +5°C, but in a place like Boston, Chicago, or Denver, a +5°C temperature could easily occur in any winter month in which a -5°C temperature occurred. So the error slips into the noise of "weather", likely never to be noticed. But it does bump up the temperature average a little bit for the month if uncorrected.

But in the Arctic and Antarctic, a missing M on an M20/M25 METAR report makes a 40°C difference when -20°C becomes +20°C. And since a winter month in Siberia or Antarctica is not likely to naturally hit 20°C, the error does not get lost in the "weather" noise; it becomes a strong signal if uncorrected.
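The arithmetic behind this is simple: a sign flip turns T into -T, so the size of the error is twice the magnitude of the temperature, and it grows as conditions get colder. A quick sketch:

# A sign-flip error has magnitude 2 * |T|, so it grows with cold.
for t in (-5, -20, -43):
    print(t, "C misread as", -t, "C: a", 2 * abs(t), "C error")
# -5 C:  a 10 C error (lost in mid-latitude weather noise)
# -20 C: a 40 C error (the Siberian cases above)
# -43 C: an 86 C error (the Eureka, NWT case above)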

Confirmation bias, expecting to see polar amplification, may be one reason why nobody seems to have pointed this out until now. Plus, the organizations that present surface-derived climate data, GISS and CRU, only seem to deal in monthly and yearly averages. Daily or hourly data is not presented, as far as I am aware, so errors at those time scales go unnoticed. Obviously GISS didn't notice the recent Finland error, even though it was glaringly obvious once plotted.

With NASA GISS admitting that missing minus signs contributed to the hot anomaly over Finland in March, and with the many METAR coding errors I've demonstrated on opposite sides of the globe, it seems reasonable to conclude that our METAR data from cold places may well be systemically corrupted by coding errors.

The data shown between 4/13 and 4/15 were based on data downloaded on 4/12 and included some station reports from Finland in which the minus sign may have been dropped.

4/15/10 data.giss.nasa.gov

That darned missing M, or an extra space, or even writing "-" when you mean "M" (which is counter-intuitive to basic math) all appear to be factors in the human error contributing to data errors in our global surface temperature database. To determine just how much of a problem this is, a comprehensive bottom-up review of all the data, from source to product, is needed. This needs to start with NOAA/NCDC, as they are ultimately responsible for data quality control.
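Even a crude automated check would catch most of these. Here is a sketch of the kind of quality control being argued for; the 15°C jump threshold is purely illustrative, not anyone's operational standard:

def flag_suspect_reports(hourly_temps_c, max_jump=15.0):
    # Flag hour-to-hour jumps larger than max_jump degrees C.
    suspects = []
    for i in range(1, len(hourly_temps_c)):
        if abs(hourly_temps_c[i] - hourly_temps_c[i - 1]) > max_jump:
            suspects.append(i)
    return suspects

# The Mirnvy sequence above, as a decoder would see it:
print(flag_suspect_reports([21, 20, -20, -20, 20]))  # [2, 4]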

It has been said that “humans cause global warming”. I think a more accurate statement would be “human error causes global warming”.
......
wattsupwiththat.com

[ Yeah, but this "human error" is convenient. It moves things the way they want, so they ignore it. If amateurs auditing the data can find this sort of error, the PhDs in charge of these databases could too. They don't want to! ]

..............
JRR Canada (21:03:36) :
I agree with Kirk. The deeper we get to look, the sadder this science seems. Is it because these people are all government employees? I mean, the errors are so pathetic it's like they just do not care: no repercussions for poor work, no consequences for mistakes. Let's save billions; if their work cannot be trusted, stop funding.
17 04 2010
GregO (21:04:11) :
Great work.
Reported thermometer readings matter. Instrument bias and uncertainty matter. Missing minus signs matter. They matter because we are attempting to measure a temperature rise allegedly caused by man-made CO2 and said temperature rise is conjectured to threaten to destroy the world. In this debate, thermometer readings matter.
The conjecture that world-wide AGW is from man-made CO2 is weak due to a paucity of observable evidence. Melting ice caps? What melting ice caps? Rising sea levels? Really? Disappearing glaciers? So what? They cover only a fraction of Earth's surface and, though they shrink and grow, are hardly disappearing (much to the embarrassment of the IPCC). Ifs, coulds, maybes. On and on.
Isn't this all about (global) catastrophic warming? If it's really getting catastrophically hotter here, show me the thermometer readings. If the readings are any good and it really is getting catastrophically hotter, the thermometers will show distinct, unmistakable catastrophic heating. Period.
What we actually see in surface thermometer temperature records is inconsistency, instrument bias, and human error, and this engineer has had it up to his keester with wishful thinking, the madness of crowds, and frauds and charlatans showing up to the debate with a suitcase full of political baggage, advocacy, and empty rhetoric. All we need in this debate is the temperature readings.
Kindly, Anthony, you and others have shown me the thermometers. I'm sure the analysis to follow will be interesting, but to me not as interesting as this post. I've seen enough to know that any global catastrophic warming from man-made CO2 is either being smothered by negative feedback within the climate system itself, and/or is so minute as to be within the measurement error and of little consequence in comparison to world destruction from AGW.
What is really fascinating to me is how a weak conjecture like AGW has grown into a mass delusion driving drastic political policy change despite a complete lack of observable results/catastrophes.
......
Mark (21:29:59) :
Another error that works in favor of the believers. The odds of this happening are remote. And it’s not just us noticing this:
Professor Watson, who served as chairman of the IPCC from 1997-2002, said: “The mistakes all appear to have gone in the direction of making it seem like climate change is more serious by overstating the impact. That is worrying. The IPCC needs to look at this trend in the errors and ask why it happened.”
timesonline.co.uk
.......
jaypan (21:44:51) :
There are well-funded institutions getting all these raw data, processing them in order to "enhance their value", then telling the world about the dangerous findings and projections, feeding politicians action plans, and even getting heard.
Why can't these highflyers do their homework first: check the siting of their reporting stations, double-check the numbers they report in case of suspicious results, just be sceptical, as a pre-post-normal scientist has to be?
Instead, the people doing this work voluntarily are called deniers and flat-earthers, and are looked down on from the ivory tower.
I am getting sick of all these a……..
17 04 2010
Leon Brozyna (21:45:23) :
A superb bit of detective work.
This is a finely concretized and particularized example of the data difficulties highlighted in the famed Harry_Read_Me file. It brings them into focus for the average layman.
BTW, thanks for the link to the updated work you’ve done with Joseph D’Aleo. I’m sure it’ll make for some fine reading tomorrow morning.
........
Mike D. (21:51:58) :
Excellent sleuthing Anthony!
Where is the data cleansing? Data cleansing is common in a host of disciplines. In my mundane field, forestry, we call measuring forests "mensuration" or "cruising". Data is either hand-written in field notes and key-punched in later, or entered into a data logger in the field. At some point in the data-gathering process, it is HIGHLY RECOMMENDED that some sort of data cleansing/editing software be applied, because mistakes are common. A decent data cleansing algorithm will note that a 3-inch-diameter tree cannot be 100 feet tall, or that it is unlikely that a 30-inch-diameter tree is only 10 feet tall.
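A minimal sketch of such a range check, with invented limits purely for illustration:

def plausible_tree(diameter_in, height_ft):
    # Crude plausibility bounds; the limits here are invented for
    # illustration, not forestry standards.
    if diameter_in <= 3 and height_ft > 40:
        return False   # a 3 inch tree cannot be 100 feet tall
    if diameter_in >= 30 and height_ft < 20:
        return False   # a 30 inch tree is unlikely to be 10 feet tall
    return True

print(plausible_tree(3, 100))   # False
print(plausible_tree(30, 10))   # False
print(plausible_tree(12, 60))   # True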

Since THE FATE OF THE WORLD depends (these days) on taking the globe’s temperature, it might be handy if the GENIUS CLIMATE SCIENTISTS take the same precautions with their data that a lowly forester does.

Note to NOAA. Please clean your data. If the job is too tough, too confusing, and you don’t even know where to begin, call a forester. They can help you out.
............