Should social media platforms censor vaccine misinformation?
When does knowledge become fact? There was a time when the greatest minds on our planet agreed that the Earth was flat or that leeches were a valid medical tool. A prerequisite for scientific integrity and development is informed debate.
But what happens when political and social agendas supersede evidence-based debate? In other words, can opinions that deviate from the scientific consensus become dangerous? This is a key question facing governments and scientists, amplified by the COVID-19 pandemic. The current debate is whether institutions have the right to silence dissenters from the scientific status quo, particularly if governing bodies believe lives are at risk.
This topic has been in the headlines in recent months as controversial figures express varying degrees of skepticism about vaccines. At the same time, governments have resorted to extreme measures, with Austria imposing compulsory vaccination in January of this year. Balancing free speech with reducing COVID-19 deaths, governments face an ethical dilemma. However, the power of censorship no longer resides solely in the hands of the state.
Social media platforms allow us to debate and share information, views and emotions. However, this rapidly expanding space with limited regulation can blur the lines between the realms of fact and opinion. These are the digital spaces that are now facing censorship issues over vaccine misinformation. Do these private companies have the right or responsibility to shut down potentially harmful content?
The Institute for Strategic Dialogue (ISD), a think tank that monitors extremism, has demonstrated that a few individuals can have a 'disproportionate influence on public debate' using social networks. Disinformation with harmful consequences has already spread in this way: content linking COVID-19 to the installation of 5G towers has resulted in violent threats against engineers. This highlights the dangerous real-world impact of pseudo-scientific conspiracy theories, although the question of how people make medical decisions about their own bodies arguably follows a different pattern.
The Royal Society recently urged social media companies not to remove 'legal but dangerous' content. Instead, it advises stemming the flow of vaccine misinformation by changing algorithms so that such content is harder to access and share, for example by preventing certain messages from automatically appearing in people's feeds.
While people have a right to express their opinions, some argue that they do not automatically have a right to a large audience that could be harmed by consuming those opinions. Prof Gina Neff, a social scientist at the Oxford Internet Institute, agrees that this use of algorithms '[ensures] that people can still say what they think' while denying them a broad reach of influence.
The Center for Countering Digital Hate (CCDH) takes an alternative position, advocating the removal of incorrect content on the grounds that it may cause harm.
However, suppressing dissenting opinions not only raises ethical questions but risks exacerbating the problem. Austrian research last year suggests that distrust of the COVID-19 vaccine correlated with distrust of authority. Professor Rasmus Kleis Nielsen, director of the Reuters Institute for the Study of Journalism at Oxford University, agrees: 'I imagine there are quite a few citizens who would see their worst suspicions confirmed about the workings of society if established institutions took a much more practical role in limiting people's access to information.'
However, not all vaccine skepticism sits at the extreme end of the spectrum. Removing all dissenting content could suppress the legitimate concerns of serious scholars, and such concerns are necessary to advance and improve scientific understanding. If direct censorship is implemented, social media platforms must be careful in drawing the line between dangerous, incorrect content and legitimate scientific concern.
Tackling vaccine misinformation is critical to protecting public health, but reasonable skepticism risks being caught in the crossfire. Providing the public with tools to identify erroneous content and analyze data preserves individual freedoms and curbs a reactionary movement. The best way to fight scientific misinformation is through education.