Facebook wants to toughen up on anti-vaccine misinformation
The American social media giant wants to do more to “remove false claims on Facebook and Instagram about Covid-19, Covid-19 vaccines and vaccines in general during the pandemic,” the company said in a press release.
Facebook has therefore expanded its list of false claims that will not be tolerated, claims that were already prohibited in advertisements.
It now includes posts stating that Covid-19 is man-made, that vaccines are ineffective, that catching the disease is less dangerous than being vaccinated, or that vaccines are toxic or cause autism.
People who share this kind of disinformation could be banned, the California group warned.
Group admins have been told that they will need to approve posts from members who tend to spread misinformation before those posts are shared.
And on Instagram, accounts that seek to discourage their followers from getting vaccinated will be harder to find.
The platform has been collaborating for months with major health organizations to highlight authoritative information on the health crisis, notably through its “Covid-19 Information Center.”
“More than 2 billion people from 189 countries have been connected to reliable information” via this tab, argues the company.
But Facebook critics are not convinced.
“For a year, Facebook has repeatedly promised to crack down on Covid-related and anti-vaccine disinformation,” tweeted the Center for Countering Digital Hate, an NGO that fights “digital hatred.” “Each time, they fail to meet their goals.”
The social network is soon to publish the results of a large study on the pandemic, which collected 50 million responses from people sharing their opinions or recounting their experiences on subjects such as Covid-19 symptoms, mask-wearing and access to health care.