Paul
Driessen and David R. Legates
President Obama’s agreement with China
is about as credible as his “affordable care” pronouncements.
Pleistocene glaciers repeatedly buried
almost half of the Northern Hemisphere under a mile of ice. The Medieval Warm
Period (~950-1250 AD) enriched agriculture and civilizations across
Asia and Europe, while the Little Ice Age
that followed (~1350-1850) brought widespread famines and disasters. The Dust
Bowl upended lives and livelihoods for millions of Americans, while
decades-long droughts vanquished once-thriving Anasazi and Mayan cultures, and flood and
drought cycles repeatedly pounded African, Asian and Australian
communities. Hurricanes and tornadoes have also battered states and countries throughout history, in numbers and intensities that have defied any discernible pattern or prediction.
But today we are supposed to believe that climate variability is due to humans – and that computer models can now forecast climate changes with amazing accuracy. These models and the alarmist scientists behind them say greenhouse gases will trigger increasingly “severe, pervasive and irreversible impacts for people, species and ecosystems,” as a recent UN report insists.
In reality, carbon dioxide’s effect on
devastating weather patterns is greatly overstated. We are near a 30-year low
in hurricane energy (measured by the ACE index
of “accumulated cyclone energy”), and tropical cyclone and storm activity has
not increased globally over that period. In fact, as of November 18, it’s been
3,310 days since a Category 3-5 hurricane hit the US mainland – by far the
longest stretch since records began in 1900. This Atlantic hurricane season was
the least active in 30 years.
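For readers unfamiliar with the ACE metric just mentioned, it is essentially a running tally of storm wind energy: for each storm, the square of the maximum sustained wind speed (in knots) is summed over every six-hourly observation at or above tropical storm strength, and the seasonal total is scaled by 10^-4. The sketch below illustrates that arithmetic only; the storm track numbers and the function name are hypothetical placeholders, not real observations.

```python
# Minimal sketch of how accumulated cyclone energy (ACE) is conventionally tallied.
# The storm tracks below are invented placeholders, not actual hurricane data.

TS_THRESHOLD_KT = 34  # minimum sustained wind (knots) counted as tropical storm strength


def accumulated_cyclone_energy(tracks):
    """Sum 10^-4 * v_max^2 over every 6-hourly fix at or above
    tropical storm strength, across all storms in a season."""
    total = 0.0
    for storm in tracks:          # each storm: list of 6-hourly max sustained winds (kt)
        for v_max in storm:
            if v_max >= TS_THRESHOLD_KT:
                total += v_max ** 2
    return total * 1e-4           # conventional 10^-4 scaling


# Hypothetical example: a season with two short-lived storms
season = [
    [30, 40, 55, 65, 50, 30],     # storm A: 6-hourly winds in knots
    [35, 45, 45, 35],             # storm B
]
print(f"Seasonal ACE: {accumulated_cyclone_energy(season):.2f}")
```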
Moreover, there has been no warming
since 1995, several recent winters have been among the coldest in centuries in
the United Kingdom and continental Europe, the 2013-14
winter was one of the coldest and snowiest in memory for much of the United
States and Canada – and the cold spell
could continue.
Accurate climate forecasts one, five or
ten years in advance would certainly enable us to plan and prepare for, adapt
to and mitigate the effects of significant or harmful climate variations,
including temperatures, hurricanes, floods and droughts. However, such
forecasts can never be even reasonably accurate under the climate change
hypothesis that the IPCC, EPA and other agencies have adopted. The reason is
simple.
Today’s climate research defines carbon
dioxide as the principal driving force in global climate change. Virtually no
IPCC-cited models or studies reflect the powerful, interconnected natural
forces that clearly caused past climate fluctuations – most notably, variations
in the sun’s energy output.
They also largely ignore the significant effects of urban and other land use changes, as well as major high-impact ocean-atmosphere fluctuations like the El Niño and La Niña cycles, the Pacific Decadal Oscillation and the North Atlantic Oscillation. If we truly want reliable predictive capabilities, we must eliminate the obsession with carbon dioxide as the primary driver of climate change – and devote far more attention to studying all the powerful forces that have always driven climate change, the roles they play, and the complex interactions among them.
We also need to study variations in the sun’s energy output, winds high in the atmosphere, soil moisture, winter snow cover and volcanic eruptions, Weatherbell forecaster Joe D’Aleo emphasizes. We likewise need to examine unusual features like the pool of warm water that developed in the central Pacific during the super La Niña of 2010-2011 and slowly drifted with the wind-driven currents into the Gulf of Alaska, causing the “polar vortex” that led to the cold, snowy winter of 2013-2014, he stresses.
“The potential for climate modeling
mischief and false scares from incorrect climate model scenarios is
tremendous,” says Colorado State University analyst Bill
Gray, who has been studying and forecasting tropical cyclones for
nearly 60 years. Among the reasons he cites for grossly deficient models are
their “unrealistic model input physics,” the “overly simplified and inadequate
numerical techniques,” and the fact that decadal and century-scale circulation
changes in the deep oceans “are very difficult to measure and are not yet well
enough understood to be realistically included in the climate models.”
Nor does applying today’s supercomputers to climate forecasting help matters. NOAA, the British Meteorological
Office and other government analysts have some of the world’s biggest and
fastest computers – and yet their (and thus the IPCC’s and EPA’s) predictions
are consistently and stupendously wrong. Speedier modern computers simply make
the “garbage in, garbage out” adage occur much more quickly, thereby
facilitating faster faulty forecasts. Why does this continue? Follow the money.
Billions of dollars are doled out every
year for numerous “scientific studies” that supposedly link carbon dioxide and
other alleged human factors to dwindling frog populations, melting glaciers,
migrating birds and cockroaches, and scores of other remote-to-ridiculous assertions. Focusing on “dangerous human-induced” climate change in
research proposals greatly improves the likelihood of receiving grants.
American taxpayers alone provide a
tempting $2.5 billion annually for research focused on human factors, through
the EPA, Global Change Research Program and other government agencies.
Universities and other institutions receiving grants take 40% or more off the
top for “project management” and “overhead.” None of them wants to upset this
arrangement, and all of them fear that accepting grants to study natural factors or climate cycles might imperil funding from
sources that have their own reasons for making grants tied to manmade warming,
renewable energy or antipathy toward fossil fuels. Peer pressure and shared views on wealth redistribution via energy policies also play major roles.
When Nebraska lawmakers budgeted
$44,000 for a review of climate cycles and natural causes, state researchers
said they would not be interested unless human influences were included. The
“natural causes” proposal was ultimately scuttled in favor of yet another
meaningless study of human influences.
The result is a steady stream of computer model outputs that alarmists assure us accurately predict climate changes. However, none of them forecast the 18-years-and-counting warming pause, the absence of major hurricane landfalls, or other real-world conditions. Nearly every one predicted temperatures that trend higher with every passing year and exceed recorded global temperatures by ever-widening margins.
The constant predictions of looming
manmade climate disasters are also used to justify demands that developed
nations “compensate” poor and developing countries with tens or hundreds of
billions of dollars in annual climate “reparation, adaptation and mitigation”
money. Meanwhile, those no-longer-so-wealthy nations are implementing renewable
energy and anti-hydrocarbon policies that drive up energy costs for businesses
and families, kill millions of jobs, and result in thousands of deaths annually
among elderly pensioners and others who can no longer afford to heat their
homes properly during cold winters.
Worst of all, the climate disaster
predictions are used to justify telling impoverished countries that they may develop only to the extent enabled by wind and solar power. Financial institutions
increasingly refuse to provide grants or loans
for electricity generation projects fueled by coal or natural gas. Millions die every year because they do
not have electricity to operate water purification facilities, refrigerators to
keep food and medicine from spoiling, or stoves and heaters to replace wood and
dung fires that cause rampant lung diseases. As Alex Epstein observes in his
new book, The Moral Case
for Fossil Fuels:
“If you’re living off
the grid and can afford it, an installation with a battery that can power a few
appliances might be better than the alternative (no energy or frequently
returning to civilization for diesel fuel), but [such installations] are
essentially useless in providing cheap, plentiful energy for 7 billion people –
and to rely on them would be deadly.”
By expanding our research – to include
careful, honest, accurate studies of natural factors – we will be better able
to discern and separate significant human influences from the powerful natural
forces that have caused minor to profound climate fluctuations throughout
history. Only then will we begin to improve our ability to predict why, when,
how and where Earth’s climate is likely to change in the future. Congress should reduce funding for CO2-focused studies and earmark funds for research into the natural forces that drive climate change.
Paul
Driessen is senior policy analyst for the Committee For A Constructive Tomorrow
(www.CFACT.org) and author of Eco-Imperialism: Green power - Black death
and coauthor of Cracking Big Green: To
save the world from the save-the-Earth money machine. David R. Legates,
PhD, CCM, is a Professor of Climatology at the University of Delaware in
Newark, Delaware, USA.