After a year of the pandemic, you can no longer say just anything on social networks like Facebook: in the context of the health crisis we are going through, the danger of fueling mass disinformation is extremely high.
A year ago, no one would have "sanctioned" you for posting on your page that the virus does not exist; now, things are very different.
Humanity is going through a very sensitive period, and social networks play a key role in shaping public opinion. The information circulating in this environment should therefore not be ignored but filtered, and removed when its spread endangers people's health or social well-being.
In this context, Facebook has engaged in the fight against fake news.
How Facebook plans to remove fake information
To remove fake information, the network uses a combination of technology and human moderation, deleting large numbers of fake accounts and expanding its collaborations with fact-checking sites.
Facebook does not generally remove posts that contain misinformation. Instead, the fact-checking organizations the company works with label them as such, their distribution across the network is heavily throttled, and people who try to share that type of content are warned that it contains false information.
The platform itself, however, does remove content that poses an imminent danger to users' safety, and since the start of the Covid crisis many kinds of information have fallen into this category. Posts such as "the virus does not exist" or "vaccines are harmful", which contradict the position of the official health authorities, are considered harmful content and will therefore be removed from the platform.
What posts Facebook no longer supports
Monika Bickert, Vice President of Global Policy Management at Facebook, explained what types of content Facebook no longer supports:
What does Facebook allow in posts about vaccines?
"People can ask questions and give their opinion on whether to vaccinate or not. For example, we will NOT remove a post like: [...]. People can also post about doubts they have about the studies that have been done on vaccines, so we allow critical discussions of those studies. But we will not let people promote vaccine reluctance when health authorities have said that these vaccines are safe."
Bickert went on to describe another form of disinformation: messages such as "The virus was created in a lab in China and then intentionally released into the world to spread," according to HotNews.
Bickert also admitted that the platform has made errors over time, but said the situation will improve.
Bickert attributes Facebook's mistakes to the sheer number of active users, which the moderators employed by the platform cannot keep up with. Facebook reportedly has around 15,000 moderators working in 50 languages around the world, including Romanian.
With 2.79 billion users in the last quarter of 2020, it is clear that moderators cannot fully control the content, and fake news still finds its way onto the platform and spreads.
Moreover, the authorities' recommendations keep changing, which makes things harder for the platform: it has to react practically in real time and readjust to the new criteria.
The company itself has announced that it is doing its best to fight misinformation: tens of thousands of people, alongside algorithms, work to combat harmful content, and the platform is now designed to direct readers to official sources of information.