
Facebook Announces ‘Stronger Action’ Against Users Who Spread ‘Misinformation’

Prakash Gogoi
Prakash covers news and politics for Vision Times.
Published: May 30, 2021
In this photo illustration, a smart phone screen displays the logo of Facebook on a Facebook website background, on April 7, 2021, in Arlington, Virginia. (Image: OLIVIER DOULIERY/AFP via Getty Images)

Facebook has announced that it will be taking action against “misinformation” on the platform. Whether it be “content about COVID-19 and vaccines, climate change, elections or other topics,” fewer people will see misinformation “on our apps,” the company said in a blog post. 

The action includes three measures:

  • Context for pages that share misleading claims: If a user wants to like a page that has “repeatedly shared content that fact-checkers have rated,” Facebook will show a pop-up message. The pop-up will notify the user that the page has shared information deemed “false” by fact-checkers.
  • Penalizing individuals who share misinformation: “We will reduce the distribution of all posts in News Feed from an individual’s Facebook account if they repeatedly share content that has been rated by one of our fact-checking partners. We already reduce a single post’s reach in News Feed if it has been debunked,” the blog post said (see the sketch after this list).
  • User notifications for fact-checked content: If a post shared by a user is later “debunked” as false by a fact-checker, the user will be notified about the fact-checker’s article that debunks the post. Users will also be shown a prompt to share the fact-checker’s article with their followers. Those who repeatedly share “false information” may have their posts moved “lower in the News Feed so that other people are less likely to see them.”
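
Facebook has not published implementation details, but the behavior described above amounts to a ranking demotion applied at both the post level and the account level. The Python sketch below is purely illustrative and is not Facebook’s code: the Account class, the thresholds, and the penalty factors are hypothetical, chosen only to show the general idea.

```python
# Illustrative sketch only: demoting posts from accounts that have repeatedly
# shared content rated false by fact-checkers. All names and numbers are
# hypothetical and not drawn from Facebook's systems.
from dataclasses import dataclass

@dataclass
class Account:
    name: str
    false_ratings: int  # number of the account's shares rated false by fact-checkers

# Hypothetical thresholds and penalties, chosen only for illustration.
REPEAT_OFFENDER_THRESHOLD = 3
ACCOUNT_PENALTY = 0.5         # demote all posts from repeat offenders
DEBUNKED_POST_PENALTY = 0.2   # demote an individual post that has been debunked

def ranking_score(base_score: float, account: Account, post_debunked: bool) -> float:
    """Return a demoted News Feed ranking score for a post."""
    score = base_score
    if post_debunked:
        score *= DEBUNKED_POST_PENALTY
    if account.false_ratings >= REPEAT_OFFENDER_THRESHOLD:
        score *= ACCOUNT_PENALTY
    return score

# Example: a repeat offender's post ranks lower even when it has not itself been debunked.
acct = Account("example_page", false_ratings=4)
print(ranking_score(1.0, acct, post_debunked=False))  # 0.5
print(ranking_score(1.0, acct, post_debunked=True))   # 0.1
```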

Facebook fact-checkers criticized a book without bothering to read it

In a May 16 article published in The Wall Street Journal, physicist Steven E. Koonin accused Facebook of “spreading disinformation under the guise of ‘fact-checking.’”

Koonin’s criticism came after a WSJ review of his climate science book Unsettled: What Climate Science Tells Us, What It Doesn’t, and Why It Matters was flagged by Facebook as having “very low scientific credibility” after 11 self-appointed “fact-checkers” deemed it so. The review was written by Mark Mills.

The fact-checkers never actually assessed Koonin’s 75,000-word book, only Mills’ 900-word review of it, something Koonin said unfairly impacted the book’s credibility.

“By branding Mr. Mills’ review with ‘very low scientific credibility,’ the company directs its billions of users to a website that claims to discredit the review and, by direct implication, my book. This action adds to the growing suppression of open discussion of climate complexities,” Koonin wrote in the article.

He called the Facebook fact-checkers “trolls who pan political adversaries’ books on Amazon without bothering to read them.”

Facebook launched its fact-checking program in 2016. Its “independent third-party” fact-checkers need to be certified by the International Fact-Checking Network (IFCN).

IFCN was set up by the journalism non-profit Poynter, which in 2019 “was almost entirely funded” by Pierre Omidyar and billionaire George Soros, according to a report by The Epoch Times. Omidyar is the founder of eBay and a major Democratic donor. Facebook is also one of Poynter’s funders.

The two Americans on IFCN’s advisory board are Angie Holan of PolitiFact and Glenn Kessler of The Washington Post. PolitiFact is owned by Poynter, while Kessler has published a book titled “Donald Trump and His Assault on Truth.”

Shifting gears on ‘lab leak’ theory

Facebook’s new rules come as the company changed its stance on the COVID-19 lab leak theory. On Feb. 8 this year, Facebook announced that it would be removing “false claims” regarding COVID-19 and vaccines, including the claim that “COVID-19 is man-made or manufactured.”

On May 26, however, Facebook said that it will “no longer remove the claim that COVID-19 is man-made or manufactured from our apps.”

The reversal comes amid an upswing in mainstream coverage of the possibility that SARS-CoV-2, the virus that causes COVID-19, may have leaked from the high-security bioresearch laboratory in Wuhan, China.

During a recent press conference, Florida’s Republican Governor Ron DeSantis highlighted the danger of social media websites censoring content.

“When people last year were raising that [lab leak theory] as something that needed to be investigated, they were de-platformed… And now, even Fauci [White House chief medical advisor] admits that this may be something that very well is the case. Are they going to now censor Fauci and pull him down off social media?”

“So, this shows you… because corporate media said it was a conspiracy theory at the outset, these big tech oligarchs responded to that, pulled down instead of having an honest debate about something that’s very very important,” he said.

In an interview with Fox News, Facebook whistleblower Morgan Kahmann said that Facebook was censoring users who questioned the effectiveness of COVID-19 vaccines or discussed their side effects. Such topics are classified as “vaccine hesitancy” by Facebook algorithms.

“They’re afraid of what people might conclude if they see that other people are having negative side effects. They think that this is going to drive up vaccine hesitancy among the population and they see that as something that they have to combat,” he said in the interview.