De Omnibus Dubitandum - Lux Veritas

Saturday, February 17, 2018

Overheated claims on temperature records

It’s time for sober second thoughts on climate alarms

Dr. Tim Ball and Tom Harris
Now that the excitement has died down over the news that Earth’s surface temperature made 2017 one of the hottest years on record, it is time for sober second thoughts.
Did the January 18 announcement by the National Oceanic and Atmospheric Administration (NOAA) that 2017 was our planet’s third-hottest year since 1880, and NASA’s claim that it was the second-hottest year, actually mean anything?
Although the Los Angeles Times called 2017 “a top-three scorcher for planet Earth,” neither the NOAA nor the NASA records are significant. One would naturally expect the warmest years to come during the most recent years of a warming trend. And thank goodness we have been in a gradual warming trend since the depths of the Little Ice Age in the late 1600s! Back then, the River Thames was covered by a meter of ice, as Jan Griffier’s 1683 painting “The Great Frost” illustrates.
Regardless, recent changes have been too small for most thermometers even to detect. More important, they are often smaller than the government’s own estimates of uncertainty in the measurements. In fact, we lack the data to properly and scientifically compare today’s temperatures with those of the past.
This is because, until the 1960s, surface temperature data was collected using mercury thermometers located at weather stations situated mostly in the United States, Japan, the United Kingdom and eastern Australia. Most of the rest of the planet had very few temperature sensing stations. And none of the Earth’s oceans, which constitute 70 percent of the planet’s surface area, had more than an occasional station separated from its neighbors by thousands of kilometers or miles.
The data collected at the weather stations in this sparse grid had, at best, an accuracy of +/-0.5 degrees Celsius (0.9 degrees Fahrenheit). In most cases, the real-world accuracy was no better than +/-1 deg C (1.8 deg F). Averaging such poor data in an attempt to determine global conditions cannot yield anything meaningful. Displaying average global temperature to tenths or even hundredths of a degree, as is done in the NOAA and NASA graphs, clearly defies common sense.
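To make the arithmetic concrete, here is a minimal sketch of how a stated +/-0.5 deg C per-reading uncertainty carries through a simple average under two limiting assumptions. The station count and the split between random and systematic error below are hypothetical illustrations, not either agency’s actual network or processing.

```python
import math

# Illustrative arithmetic only -- not NOAA's or NASA's actual method.
# The +/-0.5 deg C per-reading figure comes from the text above; the
# station count and the error-behaviour assumptions are hypothetical.
n_stations = 2000          # hypothetical number of stations being averaged
sigma_reading = 0.5        # per-station uncertainty, deg C

# Limiting case 1: errors are independent and random, so they partly
# cancel and the uncertainty of the mean shrinks as 1/sqrt(n).
sigma_mean_random = sigma_reading / math.sqrt(n_stations)

# Limiting case 2: errors are fully systematic (a shared bias), so
# averaging does not reduce them and the mean keeps the full +/-0.5 deg C.
sigma_mean_systematic = sigma_reading

print(f"independent errors: +/-{sigma_mean_random:.3f} deg C")
print(f"systematic errors:  +/-{sigma_mean_systematic:.3f} deg C")
```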
Modern weather station surface temperature data is now collected using precision thermocouples. But, starting in the 1970s, less and less ground surface temperature data was used for plots such as those by NOAA and NASA. This was done initially because governments believed satellite monitoring could take over from most of the ground surface data collection.
However, the satellites did not show the warming forecast by computer models, which had become so crucial to climate studies and energy policy-making. So bureaucrats closed most of the colder rural surface temperature sensing stations – the ones furthest from much warmer urban areas – thereby yielding the warming desired for political purposes.
Today, virtually no data exist for approximately 85 percent of the Earth’s surface. Indeed, fewer weather stations are in operation now than in 1960.
That means surface temperature computations by NOAA and NASA after about 1980 are meaningless. Combining this with the problems with earlier data renders an unavoidable conclusion: It is not possible to know how Earth’s so-called average surface temperature has varied over the past century and a half.
The data is therefore useless as input to the computer models that form the basis of policy recommendations produced by the United Nations Intergovernmental Panel on Climate Change (IPCC) and used to justify eliminating fossil fuels and replacing them with renewable energy.
But the lack of adequate surface data is only the start of the problem. The computer models on which the climate scare is based are mathematical constructions that require the input of data above the surface, as well as on it. The models divide the atmosphere into cubes piled on top of each other, ideally with wind, humidity, cloud cover and temperature conditions known for different altitudes. But we currently have even less data above the surface than on it, and there is essentially no historical data at altitude.
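As a purely illustrative sketch of what that stack of “cubes” looks like as a data structure, consider the following; the grid resolution and variable names are assumptions chosen for the example, not those of any particular model.

```python
import numpy as np

# Purely illustrative grid: 10 vertical levels on a coarse 5-degree mesh
# (36 latitude bands x 72 longitude bands), chosen only for this example.
n_lev, n_lat, n_lon = 10, 36, 72
variables = ["temperature", "humidity", "wind_u", "wind_v", "cloud_fraction"]

# One array per variable, indexed by (level, latitude, longitude) and
# initialised to NaN, i.e. "no observation for this cell".
state = {name: np.full((n_lev, n_lat, n_lon), np.nan) for name in variables}

# Hypothetical example: one surface-level cell gets a temperature value.
state["temperature"][0, 18, 36] = 15.2  # deg C, made-up number

# How much of the grid actually holds data?
filled = int(np.isfinite(state["temperature"]).sum())
total = state["temperature"].size
print(f"temperature cells with a value: {filled} of {total}")
```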
Many people think the planet is adequately covered by satellite observations, data that represents global 24/7 coverage and is far more accurate than anything determined at weather stations. But the satellites are unable to collect data from the north and south poles, regions that the IPCC, NOAA and NASA tout as critical to understanding global warming. Besides, space-based temperature data collection did not start until 1979, and 30 years of weather data are required to generate a single data point on a climate graph.
So the satellite record is far too short to allow us to come to useful conclusions about climate change.
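For context, the “30 years per data point” figure reflects the standard climatological-normal convention. Below is a minimal sketch of that convention using made-up annual values; the numbers and the helper function are illustrative assumptions, not any agency’s code.

```python
# A minimal sketch of the 30-year "climate normal" convention referred to
# above: one point on a climate graph is an average over a full 30-year
# window of annual values. The annual_means dictionary holds hypothetical
# placeholder numbers, not a real record.
annual_means = {year: 14.0 + 0.01 * (year - 1979) for year in range(1979, 2018)}

def climate_normal(values_by_year, start_year, window=30):
    """Average `window` consecutive annual values beginning at start_year."""
    years = range(start_year, start_year + window)
    if not all(y in values_by_year for y in years):
        raise ValueError("not enough years of data for a full 30-year window")
    return sum(values_by_year[y] for y in years) / window

# The satellite era (from 1979) supplies only a handful of complete
# 30-year windows -- the limitation the text above points to.
print(climate_normal(annual_means, 1979))  # first full satellite-era window
print(climate_normal(annual_means, 1988))  # latest full window ending in 2017
```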
In fact, there is insufficient data of any kind – temperature, land and sea ice, glaciers, sea level, extreme weather, ocean pH, and so on – to be able to determine how today’s climate differs from the past. Lacking such fundamental data, the climate forecasts cited by climate activists have no connection with the real world.
British Professor Hubert Lamb is often identified as the founder of modern climatology. In his comprehensive 1972 treatise, Climate: Past, Present and Future, he clearly showed that it is not possible to understand climate change without having vast amounts of accurate weather data over long time frames. Lamb also noted that funding for improving the weather database was dwarfed by money being spent on computer models and theorizing. He warned that this would result in wild and unsubstantiated theories and assertions, while predictions failed to improve. That is precisely what happened.
Each and every prediction made by the computer models cited by the IPCC has turned out to be incorrect. Indeed, the first predictions they made for the IPCC’s 1990 Assessment Report were so wrong that the panel started to call them “projections” and offered low, medium and high “confidence” ranges for future guesstimates, which journalists, politicians and others nevertheless treated as reliable predictions for future weather and climate.
IPCC members seemed to conclude that, if they provided a broad enough range of forecasts, one was bound to be correct. Yet, even that was too optimistic. All three ranges predicted by the IPCC have turned out to be wrong.
US Environmental Protection Agency (EPA) Administrator Scott Pruitt is right to speak about the need for a full-blown public debate among scientists about the causes and consequences of climate change. In his February 6 television interview on KSNV, an NBC affiliate in Las Vegas, Mr. Pruitt explained:
“There are very important questions around the climate issue that folks really don’t get to. And that’s one of the reasons why I’ve talked about having an honest, open, transparent debate about what do we know, and what don’t we know, so the American people can be informed and they can make decisions on their own with respect to these issues.”
On January 30, Pruitt told the Senate Environment and Public Works Committee that a “red team-blue team exercise” (an EPA-sponsored debate between climate scientists holding differing views) is under consideration. It is crucially important that such a debate take place.
The public needs to understand that even the most basic assumptions underlying climate concerns are either in doubt or simply wrong. The campaign to force America, Canada, Europe and the rest of the world to switch from abundant and affordable coal and other fossil fuels to expensive, unreliable, land-intensive alternatives – supposedly to control Earth’s always fluctuating climate – will then finally be exposed for what it really is: the greatest, most damaging hoax in history.
Dr. Tim Ball is an environmental consultant and former climatology professor at the University of Winnipeg in Manitoba. Tom Harris is executive director of the Ottawa, Canada-based International Climate Science Coalition.
