Shonky Climate Change Statistical Methodology By James Reed

With the IPCC “science” report on alleged climate change out, some critics have subjected its methodology to scrutiny, which is what good science should do, but of which we see too little. In particular, Dr Ross McKitrick has criticised one key method used by the IPCC, “optimal fingerprinting,” a statistical method used to attribute patterns in climatic data to greenhouse gases. The problem, he argues, is that the climate models used to construct the test already have assumptions about the climatic effects of such gases built in, so it is a garbage in, garbage out situation.
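In broad strokes, optimal fingerprinting is a generalised least squares regression: observations are regressed on a model-simulated response pattern, weighted by a noise covariance estimated from climate-model control runs. The toy sketch below (all numbers invented for illustration, not real climate data) shows the mechanics; the key point is that the weighting matrix C comes from the same models whose greenhouse assumptions are in dispute.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: y plays the role of observations, X a
# model-simulated greenhouse-gas response pattern (both invented).
n = 50
X = np.linspace(0.0, 1.0, n).reshape(-1, 1)    # assumed signal pattern
y = 0.8 * X[:, 0] + rng.normal(0.0, 0.1, n)    # synthetic "observations"

# Noise covariance matrix -- in real fingerprinting this is estimated
# from climate-model control runs, which is the circularity McKitrick
# objects to; here it is simply white noise.
C = 0.01 * np.eye(n)

# Generalised least squares: beta = (X' C^-1 X)^-1 X' C^-1 y
Ci = np.linalg.inv(C)
beta = np.linalg.solve(X.T @ Ci @ X, X.T @ Ci @ y)
print(beta[0])  # scaling factor, close to the true value 0.8
```

A scaling factor near 1 (here, near the planted 0.8) is read as “the fingerprint is present in the observations”; the dispute is over whether the assumed noise covariance, and the tests applied to it, justify that reading.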

https://www.naturalnews.com/2021-09-08-entire-climate-change-statistical-model-revealed-as-junk-science-hoax.html

“For decades the communist left in America has tried to scare us all into accepting an authoritarian lifestyle, the kind which our founders rebelled against.

During the countercultural ‘revolution’ of the 1960s, the big scare — the big lie, the hoax — was that our planet was going to die from overconsumption and “overpopulation,” so we should stop having children to ‘preserve’ our world.

Next up, it was global “cooling.” As average temperatures fell over a period of a decade or so and weather patterns changed somewhat, the unscientific left predicted a coming second “Ice Age” would snuff out all life, so we would need to make major alterations in our lifestyles in order to prevent that catastrophe.

But when temperatures began to warm again, and in some instances, unseasonably so, global cooling was changed to “global warming” but with the same dire predictions that modern life was ruining the planet and that we would have to give up livestock farming, fossil fuels and modernity in order to survive.

When ‘global warming’ didn’t scare enough people into compliance, they altered the verbiage again to “climate change” — because after all, our climate has always ‘changed,’ hasn’t it? Only, it changed because the earth is a living planet with a changing ecosystem, not because of cattle methane and SUVs.

Flawed research underpinning the current arguments about ‘climate change’

Now we learn that all of the statistical models used to predict our demise were nothing but junk science, too, as The Epoch Times is revealing:

A new study in “Climate Dynamics” has criticized a key methodology that the Intergovernmental Panel on Climate Change (IPCC) uses to attribute climate change to greenhouse gases, raising questions about the validity of research that relied on it and prompting a response from one of the scientists who developed the technique.

The new study’s author, economist Ross McKitrick, told The Epoch Times in an exclusive interview that he thinks his results have weakened the IPCC’s case that greenhouse gases cause climate change.

The methodology, known as “optimal fingerprinting,” has been used to link greenhouse gases to everything from temperature to forest fires, precipitation, and snow cover.

McKitrick went on to compare optimal fingerprinting to the way police officers use fingerprints to identify criminals.

“[They] take this big smudge of data and say, ‘Yeah, the fingerprints of greenhouse gas are on it,’” he told the outlet.

McKitrick went on to say that the research he was criticizing, a 1999 paper in Climate Dynamics titled “Checking for model consistency in optimal fingerprinting,” is a “cornerstone of the field of attribution” — that is, the body of research often cited to identify what causes climate change (namely, human activities).

The researcher said that the authors of that decades-old paper, Myles Allen and Simon Tett, made several mistakes in the way they validated their strategy.

“When you do a statistical analysis, it’s not enough just to crunch some numbers and publish the result and say, ‘This is what the data tell us.’ You then have to apply some tests to your modeling technique to see if it’s valid for the kind of data you’re using,” he said.

“They claimed that their model passes all the relevant tests—but there are a couple of problems with that claim. The first is they stated the conditions wrong—they left most of the relevant conditions out that you’re supposed to test—and then they proposed a methodology for testing that is completely uninformative. It’s not actually connected to any standard testing method,” he added.

And again, this is the paper the climate change whackjobs have often cited as their Bible of sorts, meaning that all policies related to its ‘findings’ are based on false premises, false data, and false conclusions.

“You’re dependent on climate model data to construct the test—and the climate model already embeds the assumptions about the role of greenhouse gases,” said McKitrick. “You can’t relax that assumption.”

Allen disputed McKitrick’s criticism in an email to The Epoch Times, brushing it off as irrelevant because climate science has ‘moved on.’

But McKitrick didn’t buy that and pushed back.

“Even if it were true that [Allen’s method] is no longer used and people have moved on to other methods, [given] its historical prominence, it would still be necessary as a scientific matter for Simon and Myles either to concede their paper contains errors or rebut the specific criticisms,” he wrote in an email to the outlet.

“And the reality is the climate profession hasn’t moved on. The IPCC still discusses the Optimal Fingerprinting method in the AR6 and relies on many papers that use it.”

We’ve all been duped again by these left-wing counterrevolutionary authoritarians who, for some reason, can’t stand the fact that humans like to live free and prosper.”

https://www.theepochtimes.com/statistical-method-used-to-link-climate-change-to-greenhouse-gases-challenged_3983949.html

“A new study in “Climate Dynamics” has criticized a key methodology that the Intergovernmental Panel on Climate Change (IPCC) uses to attribute climate change to greenhouse gases, raising questions about the validity of research that relied on it and prompting a response from one of the scientists who developed the technique.

The new study’s author, economist Ross McKitrick, told The Epoch Times in an exclusive interview that he thinks his results have weakened the IPCC’s case that greenhouse gases cause climate change.

The methodology, known as “optimal fingerprinting,” has been used to link greenhouse gases to everything from temperature to forest fires, precipitation, and snow cover.

McKitrick compared optimal fingerprinting to the way law enforcement officers use fingerprinting to identify criminals.

“[They] take this big smudge of data and say, ‘Yeah, the fingerprints of greenhouse gas are on it,’” he said.

McKitrick said the optimal fingerprinting research he criticized, a 1999 paper in Climate Dynamics, “Checking for model consistency in optimal fingerprinting,” is a “cornerstone of the field of attribution”—the branch of climate science focused on identifying the causes of climate change.

But according to McKitrick, the authors of that paper, Myles Allen and Simon Tett, made errors in the steps needed to validate their strategy.

“When you do a statistical analysis, it’s not enough just to crunch some numbers and publish the result and say, ‘This is what the data tell us.’ You then have to apply some tests to your modeling technique to see if it’s valid for the kind of data you’re using,” he said.

“They claimed that their model passes all the relevant tests—but there are a couple of problems with that claim. The first is they stated the conditions wrong—they left most of the relevant conditions out that you’re supposed to test—and then they proposed a methodology for testing that is completely uninformative. It’s not actually connected to any standard testing method.”

Their framework, McKitrick said, also assumes that a major chunk of climate change has to be attributable to greenhouse gases—so using that to prove greenhouse gases lead to climate change is meaningless.

“You’re dependent on climate model data to construct the test—and the climate model already embeds the assumptions about the role of greenhouse gases,” said McKitrick. “You can’t relax that assumption.”

McKitrick, who explained his results in more technical detail at JudithCurry.com, said the IPCC’s attribution of climate change to greenhouse gases is largely based on the 1999 paper or closely related research with the same problems.

Myles Allen, co-author of the 1999 paper McKitrick challenged, responded to McKitrick’s paper in an email to The Epoch Times.

“Fully addressing the issues raised by this paper might have made some difference to conclusions regarding human influence on climate when the signal was still quite weak 20 years ago,” Allen said.

He said the signal is now much stronger, whether one uses his 1999 technique or the simpler method employed at GlobalWarmingIndex.org. He also said that newer methods, including one in his own 2003 paper, have superseded the method from 1999.

“To be a little light-hearted, it feels a bit like someone suggesting we should all stop driving because a new issue has been identified with the Model-T Ford,” Allen said.

McKitrick responded to Allen’s argument in an email: “Even if it were true that [Allen’s method] is no longer used and people have moved on to other methods, [given] its historical prominence, it would still be necessary as a scientific matter for Simon and Myles either to concede their paper contains errors or rebut the specific criticisms.

“And the reality is the climate profession hasn’t moved on. The IPCC still discusses the Optimal Fingerprinting method in the AR6 and relies on many papers that use it.”

While Allen argued that his later 2003 paper superseded his 1999 paper, McKitrick responded that the 2003 paper, along with other more recent methods that Allen identified, “has all the same problems.”

McKitrick also argued that the method at GlobalWarmingIndex.org may have the same issues as Allen’s 1999 paper in large part because both studies cite a 1997 study from Klaus Hasselmann, which, McKitrick said, proposes the same method.

“Thus by Myles’ own examples, AT99 [Allen and Tett’s 1999 paper] is still central to the attribution literature,” McKitrick said.

Allen argued that McKitrick’s criticism of his use of a climate model is misguided, as his 1999 method may actually be “overly conservative” in attributing climate change to human influence.

According to Allen, standard climate models may yield results in which the amount of statistical “noise,” and thus uncertainty, is overstated.

This rebuttal, McKitrick said, “does not address the core problem I point out,” which has to do with testing for errors in their fingerprinting calculations.

Allen and McKitrick also sparred over a specific statistical test in the 1999 paper, with Allen saying McKitrick had greatly overstated its importance and McKitrick countering that it is the only such test researchers have used in this context.

Attribution researcher Aurélien Ribes, whose papers were among those Allen claimed had superseded the 1999 research, declined to comment on the paper in detail in an email to The Epoch Times, though he said he had looked at an earlier version of it.

“I do not expect a very large impact in terms of attribution results,” said Ribes, a climate change researcher at France’s National Centre for Meteorological Research.

He said that some of his own research wasn’t dependent on fingerprinting. He also said that certain attribution findings, such as on global mean temperature, are “very robust.”

But another expert, Richard Tol, believes much of McKitrick’s criticism is on the mark.

“McKitrick is right,” said Tol, a professor of economics at the University of Sussex and a professor of the economics of climate change at the Vrije Universiteit Amsterdam, in an email to The Epoch Times.”

https://link.springer.com/article/10.1007/s00382-021-05913-7
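The residual consistency check the two sides argue over can be caricatured in a few lines. This is a simplified, hypothetical stand-in, not Allen and Tett’s actual 1999 procedure: after the regression, the weighted residual sum of squares should behave like a chi-squared variable if the assumed noise covariance is right.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simplified stand-in for a residual consistency check: fit a signal,
# then ask whether the leftover residuals are about as large as the
# assumed noise level says they should be. All data here is invented.
n, sigma = 40, 0.1
x = np.linspace(0.0, 1.0, n)
y = 0.5 * x + rng.normal(0.0, sigma, n)   # toy "observations"

beta = (x @ y) / (x @ x)                  # least-squares fit
resid = y - beta * x
stat = (resid @ resid) / sigma**2         # ~ chi-squared, n-1 dof

# Crude check: stat should sit within a few standard deviations of its
# expected value n-1; a large excess would mean the assumed noise
# covariance understates the real noise.
dof = n - 1
print(stat, abs(stat - dof) < 3 * np.sqrt(2 * dof))
```

McKitrick’s charge, roughly, is that the 1999 paper’s version of such a check omits most of the conditions a standard specification test would cover, so passing it says little; Allen’s reply is that later methods, and a now much stronger signal, make the point moot.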
