Report Indicates IPCC Ignores Facts and Failed Predictions To Claim Better Results

by Dr. Tim Ball on August 18, 2013


The Intergovernmental Panel on Climate Change (IPCC) never followed the scientific method. They advanced the hypothesis that an increase in atmospheric CO2 due to human activities would inevitably cause a rise in global temperature. They then set out to prove it, when they should have tried to disprove it through the process Karl Popper called “falsification.” Over at least the last 15 years, global temperature has leveled and declined while CO2 levels continue to increase. What is actually happening contradicts their hypothesis and is essentially impossible according to the conclusion in their 2007 Report:

“Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic GHG concentrations. It is likely that there has been significant anthropogenic warming over the past 50 years averaged over each continent (except Antarctica).”

Despite this, on 16 August the Reuters news agency reported:

“Drafts seen by Reuters of the study by the U.N. panel of experts, due to be published next month, say it is at least 95 percent likely that human activities – chiefly the burning of fossil fuels – are the main cause of warming since the 1950s.”

They’re talking about a change in the next Report of the Intergovernmental Panel on Climate Change, Assessment Report 5 (AR5). It is significant because it is an increase from the 2007 Fourth Assessment Report (AR4), in which they were >90% certain.

If accurate, this claim is made in the face of evidence that their hypothesis is wrong. Perhaps it is explained by a recent comment from a leading member of the IPCC, who effectively said that failure to prove the hypothesis doesn’t matter because,

“Proof is for mathematical theorems and alcoholic beverages. It’s not for science.”

He added that all you need are “credible theories” and “best explanations.” The problem is that both must account for all the facts and be able to make accurate predictions. The IPCC abandoned “predictions” for “projections” or “scenarios” after the 1995 Report because of their failures. Now even the lowest projections are wrong.

The new claim of certainty extends the deceptions created in AR4 about the >90% certainty. The major deception deliberately created by the IPCC was the vast difference between what the Working Group I (WGI) Physical Science Basis Report says and what the Summary for Policymakers (SPM) says. The “conclusion” cited above appears in the SPM. There is no reference to the actual, or even an inferred, percentage in the WGI Report.

These differences and disparities appear frequently, which raises the question, why would they identify all the limitations of their work in the WGI? The answer is because if challenged, they could say they identified all the limitations. However, they followed a procedure that virtually ensured the message to the media and the public was very different. They orchestrated the focus by releasing the SPM, with fanfare, to the media months before the WGI Report was released. They relied on two things, that few would read the WGI Report and even fewer would understand what was being said. It has worked frighteningly well for all Reports to date.

But the deception takes many forms. For example, the actual >90% figure was never used directly, even in the SPM. Notice the term “very likely” in the cited comment. It is defined in a separate table in the Glossary of the SPM under the listing “Likelihood.”

If the next report does include the phrase “at least 95 percent,” then it is a departure from the table and a conflation of a number with a phrase. (Presumably we will see a revised table in the Glossary.) “At least 95 percent” falls within the >90 percent designation, but what is the corresponding descriptive phrase? This appears to be more evidence of a political motivation to reassure the public that the IPCC is increasingly certain of its work.

I understand the late Stephen Schneider created the table because he thought it would have more impact on the public than a percentage. The table and its categories are in themselves bizarre. “About as likely as not” is a nice catchall phrase, far removed from the precision of science. It also reflects Schneider’s philosophy that the end justifies the means, expressed in his 1989 comment to Discover magazine that reads in part:

“On the one hand we are ethically bound to the scientific method, in effect promising to tell the truth, the whole truth, and nothing but, which means that we must include all the doubts, caveats, ifs and buts. On the other hand, we are not just scientists, but human beings as well. And like most people, we’d like to see the world a better place, which in this context translates into our working to reduce the risk of potentially disastrous climate change. To do that we have to get some broad-based support, to capture the public’s imagination. That, of course, entails getting loads of media coverage. So we have to offer up scary scenarios, make simplified, dramatic statements, and make little mention of any doubts we might have. This double ethical bind which we frequently find ourselves in cannot be solved by any formula. Each of us has to decide what the right balance is between being effective and being honest. I hope that means being both.”

[emphasis added].

Of course there is no formula, because there is no decision to make: honesty must always trump effectiveness, especially in science. What is even more frightening is that the IPCC’s decision to be effective has created false science as the basis for completely unnecessary and devastating energy and economic policies. It’s time to hold them accountable, beginning by rejecting their Report and closing them down.