Health misinformation on Facebook viewed nearly 3.8B times in last year
Health misinformation spread by networks on Facebook has been viewed 3.8 billion times over the past year, according to a new study released Wednesday.
According to the nonprofit advocacy group Avaaz, the misinformation peaked in April as the coronavirus pandemic ravaged US cities.
The networks involved received approximately 400 million views in just one month, and their reach far eclipsed that of leading health organization accounts on the social media giant.
Additionally, Avaaz said that just 16 percent of the health misinformation it discovered carried an advisory label, arguing that Facebook’s algorithm poses a major public health threat to the platform’s more than 2.7 billion users by failing to remove the false claims.
To draw these conclusions, Avaaz reportedly located third-party websites that had previously reached large audiences with health-related misinformation, identified the top Facebook pages driving traffic to those sites, and devised a metric that estimates video views based on interactions with the footage.
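Avaaz has not published the code behind that metric, but the approach it describes amounts to simple ratio estimation. The sketch below is illustrative only, with made-up numbers and hypothetical names: it assumes a views-to-interactions ratio is computed from videos where both figures are public, then applied to pages where only interaction counts are available.

```python
# Illustrative sketch only -- not Avaaz's actual methodology or code.
# Assumption: views can be extrapolated from likes/comments/shares using a
# ratio observed on videos where both views and interactions are visible.

# Sample videos with publicly visible view and interaction counts (made-up numbers).
video_sample = [
    {"views": 1_200_000, "interactions": 45_000},
    {"views": 800_000, "interactions": 31_000},
    {"views": 2_500_000, "interactions": 90_000},
]

# Average views-per-interaction ratio across the sample.
ratio = sum(v["views"] for v in video_sample) / sum(v["interactions"] for v in video_sample)

def estimate_views(interactions: int) -> int:
    """Extrapolate a view count from an interaction count using the sample ratio."""
    return round(interactions * ratio)

# Interaction totals for hypothetical pages in a misinformation-sharing network.
network_interactions = [150_000, 42_000, 310_000]
estimated_total_views = sum(estimate_views(i) for i in network_interactions)
print(f"Estimated network views: {estimated_total_views:,}")
```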
Its findings showed that as COVID-19 swept the nation, so did conspiracy theories and false characterizations.
Earlier this year, 100 frontline doctors and nurses sent a letter to America’s largest social media platforms warning that misinformation was impeding their ability to treat patients.
A study in the American Journal of Tropical Medicine and Hygiene earlier this month concluded that 800 people had died globally in the first three months of 2020 as a result of misinformation.
The claims may prompt fresh scrutiny of Facebook and other tech leaders, especially from Congress.
At the end of July, Zuckerberg was called to testify in a virtual hearing on bias, market power, data usage and other topics.
He squabbled with lawmakers, telling Rep. David Cicilline, D-R.I., that Facebook has a “good track record” when it comes to policing misinformation, including on topics related to the global health crisis.
Since the spring, Facebook has implemented several initiatives to reduce health misinformation, including the creation of a “coronavirus center,” the removal of political ads that misrepresented COVID-19 and the expansion of fact-checking efforts.
The Washington Post reported that between April and June Facebook added warning labels to 98 million pieces of content and took down 7 million fraudulent posts related to coronavirus.
In a recent press release, Facebook offered a detailed overview of steps it’s taking to make sure users have “access to accurate information.”
Just a few days ago, however, the Department of Justice said it had seized four Facebook accounts run by ISIS, Al Qaeda and Hamas-linked terrorists that sought funding for their organizations using ads for COVID-19 personal protective equipment.
Avaaz says Facebook’s efforts so far are insufficient and don’t match the scale of the problem at hand.
Instead, it says the company should provide independently fact-checked corrections to every user who has seen misinformation, and should reduce the visibility of misinformation in individual news feeds.
Facebook didn’t immediately respond to a request for comment from Fox News.