By John Wayne on Friday, 13 December 2024
Category: Race, Culture, Nation

Hoisted by His Own Misinformation Petard! By James Reed

The petard was a late Renaissance weapon: an explosive device used to breach a wall or gate. To be hoisted by one's own petard means to become a victim of one's own schemes. Thus, an expert on "misinformation," Professor Jeff Hancock of Stanford University, who played a big part in advising governments about mandatory social distancing during Covid (which was totally useless for a virus spread through the air as well as by contact), has himself become snared in the misinformation web. He was producing a review of the risks of misinformation for a court case, using the GPT-4o program, but did not check the false references the program generated. He apologised to the court and continued business as usual, not seeing that he was thereby discredited.

https://www.naturalnews.com/2024-12-09-stanford-misinformation-expert-cited-fake-journal-articles.html

"A renowned expert on technology and "misinformation" at Stanford University who played a pivotal role in advising governments to implement mandatory social distancing during the Wuhan coronavirus (COVID-19) "pandemic" has been outed for spreading misinformation himself.

Jeff Hancock, a professor of communications at Stanford, admitted in a court declaration that he overlooked "hallucinated citations" that were generated for him by the artificial intelligence (AI) program known as ChatGPT.

In his declaration, Hancock looked at scholarly literature to assess the risks of deepfake technology and the spread of misinformation through GPT-4o. Ironically, he failed to fact-check the citations generated by the AI program – citations that turned out to be from phony journals that do not even exist.

"I did not intend to mislead the Court or counsel," Hancock wrote. "I express my sincere regret for any confusion this may have caused. That said, I stand firmly behind all the substantive points in the declaration."

AI hallucinations

Hancock's original declaration was filed on November 1 as part of a court case in Minnesota involving that state's 2023 ban on the use of deepfakes to influence an election.

Plaintiffs in the case argue that the ban is unconstitutional in that it wrongly limits free speech. Hancock, on behalf of defendant Minnesota Attorney General Keith Ellison, submitted an expert declaration stating that deepfakes amplify misinformation while chipping away at the perceived legitimacy of "democratic institutions."

Attorneys for the plaintiffs then accused Hancock of using AI to craft the court declaration itself, pointing at two citations to articles that do not even exist.

Hancock filed another declaration detailing the process he went through to research and draft the first declaration, admitting that he used GPT-4o and Google Scholar to create the faulty citation list.

In addition to generating two "hallucinated citations," as the AI industry and its proponents are calling them, ChatGPT also produced a notable error in the list of authors for an existing study.

"I use tools like GPT-4o to enhance the quality and efficiency of my workflow, including search, analysis, formatting and drafting," Hancock wrote.

The error arose after Hancock asked GPT-4o to produce a short paragraph based on one of the bullet points he personally wrote. A "[cite]" he included as a placeholder, to remind himself to add the correct citations later, was misread by the AI model, which appended fake citations to the end of Hancock's court declaration instead of real ones.

Because Hancock was paid $600 per hour to create the declaration, the government seems to want that money back, especially since Hancock stated under penalty of perjury that everything included in the document was "true and correct."

"AI definitely is being used by the laziest among the 'academics' as a pass to doing work," someone wrote on X / Twitter. "It's only a matter of time with each of these that they are caught."

"They always do what they accuse others of doing ... 100% of the time," wrote another.

"What if this undermines trust in academia?!" wrote another, seemingly tongue in cheek since trust in academia is already at dismally low levels after COVID.

"Ivy League frauds," suggested another about the potential deeper layers to all this. "They will probably promote him to dean."

"Legacy media certainly should be blamed a lot, but the worst butchery of truth & integrity is done by these 'misinfo expert' academics. They are just fraud activists." 
