By Joseph on Wednesday, 09 February 2022
Category: Race, Culture, Nation

The Censorship of So-Called Scientific Misinformation: Royal Society Cautions By Brian Simpson

We know things must be bad when the Royal Society itself is cautioning against the censorship of scientific misinformation. According to Professor Frank Kelly FRS, Professor of the Mathematics of Systems at the Statistical Laboratory, University of Cambridge, and Chair of the Royal Society report: "Science stands on the edge of error and the nature of the scientific endeavour at the frontiers means there is always uncertainty. In the early days of the pandemic, science was too often painted as absolute and somehow not to be trusted when it corrects itself, but that prodding and testing of received wisdom is integral to the advancement of science, and society. This is important to bear in mind when we are looking to limit scientific misinformation’s harms to society. Clamping down on claims outside the consensus may seem desirable, but it can hamper the scientific process and force genuinely malicious content underground."


But this does not go to the heart of the matter with the Covid issue. Scientists have been exposed as having vested and conflicting interests in their support of the natural-evolution hypothesis over the lab-leak hypothesis. Some scientists have been connected to the CCP. There have even been calls by established medical journals for the release of Big Pharma's clinical trial data, which, as covered in other articles, the FDA does not seem overjoyed to release. And then there is the ongoing debate among a minority of scientists about the alleged harms of the vaccines, which we cover daily. Overall, scientific misinformation is coming from an establishment with its own conflicts of interest and biases. Trust the science? Whose science?

https://royalsociety.org/news/2022/01/scientific-misinformation-report/?utm_source=gnaa


“Royal Society cautions against censorship of scientific misinformation online

19 January 2022

Governments and social media platforms should not rely on content removal for combatting harmful scientific misinformation online, a report by the Royal Society, the UK’s national academy of science, has said.

The Online Information Environment report also warns that the UK Government’s upcoming Online Safety Bill focuses on harms to individuals while failing to recognise the wider ‘societal harms’ that misinformation can cause. Misinformation about scientific issues, from vaccine safety to climate change, can cause harm to individuals and society at large. 

The report says there is little evidence that calls for major platforms to remove offending content will limit scientific misinformation’s harms and warns such measures could even drive it to harder-to-address corners of the internet and exacerbate feelings of distrust in authorities.

It recommends wide-ranging measures that governments, tech platforms and academic institutions can take to build resilience to misinformation and a healthy online information environment.

Professor Frank Kelly FRS, Professor of the Mathematics of Systems at the Statistical Laboratory, University of Cambridge, and Chair of the report said, "Science stands on the edge of error and the nature of the scientific endeavour at the frontiers means there is always uncertainty.

"In the early days of the pandemic, science was too often painted as absolute and somehow not to be trusted when it corrects itself, but that prodding and testing of received wisdom is integral to the advancement of science, and society.

"This is important to bear in mind when we are looking to limit scientific misinformation’s harms to society. Clamping down on claims outside the consensus may seem desirable, but it can hamper the scientific process and force genuinely malicious content underground."

The report defines scientific "misinformation" as content which is presented as fact but counter to, or refuted by, the scientific consensus - and includes concepts such as ‘disinformation’ which relates to the deliberate sharing of misinformation content. 

While the internet has led to a proliferation of misinformation, its impact on public understanding to date is less clear. 

The vast majority of British respondents to a YouGov poll, commissioned for the report, agreed that COVID-19 vaccines are safe and that human activity is changing the climate - around one in 20 disputed the scientific position. 

This group who dispute the science, although small, can be influential. They also express a range of motivations for sharing misinformation, from altruistic concerns to profit or political motivations, and are unlikely to be addressed by any single intervention. 

Instead, the report recommends a range of measures for policy makers, online platforms and others to understand and limit misinformation’s harms.

Professor Gina Neff, Professor of Technology & Society at the Oxford Internet Institute, and Executive Director at the Minderoo Centre for Technology and Democracy, University of Cambridge, and a member of the report’s working group said, "Scientific misinformation doesn’t just affect individuals, it can harm society and even future generations if allowed to spread unchecked.

"Our polling showed that people have complex reasons for sharing misinformation, and we won’t change this by giving them more facts.

"We need new strategies to ensure high quality information can compete in the online attention economy. This means investing in lifelong information literacy programmes, provenance enhancing technologies, and mechanisms for data sharing between platforms and researchers."

Dr Vint Cerf ForMemRS, Vice President and Chief Internet Evangelist at Google and a member of the report’s working group said, “Technology plays a big role in shaping our information environment and, as this report makes clear, it has a part to play in tackling scientific misinformation as well.

"Many technology platforms already use tools like demonetisation, regulating the use of recommendation algorithms and fact-check labels to reduce the harms of scientific misinformation without censoring debate.

"Misinformation is a complex problem. Technology, governments, science institutions, educators and the public all have a part to play in assuring the quality of scientific information that underpins so much of our day-to-day lives."”
