Should social media platforms be the arbiter of truth?
Facebook’s Oversight Board recently ruled that the company was right to ban Donald Trump over a blatant violation of Facebook’s community standards. However, it also criticized Facebook’s “indefinite” ban and inconsistent moderation standards, recommending that the company create a “proportionate” procedure for equal enforcement of its moderation policies and reconsider the ban within six months.
The Oversight Board is often dismissed as a way for Facebook to offload its responsibility when controversial decisions about free speech need to be made. Now that the board has sent the decision back to Facebook, an important question arises, one that social media companies have so far tried to sidestep.
Should social media platforms be the arbiter of truth on the internet, and if so, how far should they go?
From social sharing to social influence
Most social media platforms began as tools for keeping in touch, making new friends, and general entertainment. Facebook was designed as a social networking service for Harvard students. Twitter was designed to make text messaging work across multiple channels and devices. Instagram started as a way to share photos with friends.
Today, social media has become a powerful platform for shaping thoughts and ideas. With its easy accessibility and instant global outreach, social media has given everyone a voice and a stage. However, the lingering debate around free speech is not only whether all voices have the same right to be heard, but also who should moderate that right.
Facilitator vs referee
Historically, social media companies have been keen to distance themselves from the content moderation debate because it places enormous responsibilities and repercussions on them. They often frame their reluctance to moderate or remove problematic content as a duty: impartiality, they say, is necessary to safeguard freedom of expression.
But the right to free speech invariably comes with strong caveats: responsible behavior online and the need to respect the rights of others. When social media proves to be a direct accomplice in facilitating devastating events such as the Capitol riot, by amplifying hate speech and fake news, platforms cannot abdicate their responsibility and plead innocence. At best, it is disingenuous to preach the defense of free speech while refusing to recognize, and ultimately shirking, the duty to enforce responsible behavior online.
The big tech platforms are clearly aware that changes need to be made in this area, and some progress has been made. These platforms now have community engagement rules and channels for reporting offensive or problematic content, which is reviewed by moderation teams. They give genuine pages and profiles blue “verified” badges to deter imitators and fake news. Even Facebook’s aforementioned Oversight Board was supposed to act as an independent “supreme court” providing oversight and balance.
But it’s not enough. Being fully funded by Facebook, the Oversight Board cannot escape the public assumption of inherent bias, and its recent attempt to distance itself suggests an awareness of this problem. Elsewhere there are plenty of horror stories about the traumatic conditions in which Facebook’s moderation teams operate, which, among other ramifications, means moderation can be inaccurate, inconsistent and ineffective.
Considering the record billion-dollar earnings of these social media platforms, it’s hard to see these non-committal positions and half-hearted efforts as anything other than a business decision. Staying “neutral” means a larger audience and therefore more advertising revenue. If top decision makers like Mark Zuckerberg and Jack Dorsey wanted their businesses to do better, the changes would be visible. The fact that they are not speaks volumes.
With the prevalence of social media, online spaces are becoming eerie reflections of offline spaces. Social media companies should be held accountable to their users in the same way that public service providers are. They could mimic the way offline public spaces are constructed and design better digital spaces, with regulations and standards governing behavior determined collectively by regulators, stakeholders and users, for a healthier digital community.
Who will fill the void?
At the end of April this year, major soccer clubs, sports organizations, players and athletes joined a four-day social media boycott to highlight the constant abuse and discrimination they face on these platforms. The first line of the statement published by the English Premier League (EPL) said simply: “Social media companies must do more to stop online abuse.”
Vitriol on social networks towards personalities and sports bodies is nothing new. Just check the comments on any post from any football club after an EPL match. Clubs have been asking social media platforms to moderate this clearly abusive content for years, but the platforms have shown they are unwilling or unable to do so.
If social media companies refuse to step up, then who will? My bet is on governments, and that will come at a cost. We are already seeing India require social media platforms to appoint local representatives and attempt to put in place new rules requiring traceability. Twitter’s offices in India were raided by police after the company applied a “manipulated media” label to tweets from ruling party members. The government of Belarus rerouted a plane to arrest an outspoken dissident who used social media to criticize the government. Making governments the arbiter of truth can and will have serious ramifications for the neutrality and independence that social media companies so proudly protect.
The legal question
Much of the debate around “who should be the gatekeeper?” comes down to legality. Are the social media platforms themselves publishers, or does that role belong only to the users who post content? The fact that the digital world evolves faster than our legal frameworks can adapt makes this question even harder to answer. Many of the problems we currently face with social media are unprecedented.
Maybe it’s fair to say that social media platforms shouldn’t necessarily be the sole and final authority on the truth. Ideally, regulation and moderation of the internet would be a collaborative effort between the platforms, progressive governments, strong legal frameworks, and responsible users.
But as we as a society navigate towards this utopian understanding, what is clear is that social media companies have the capacity and should be doing more now. They gave the floor to everyone. Now they have to face the consequences, good and bad.
Pranav Rastogi is Managing Director of Redhill