Media platforms

Push social media platforms to self-regulate

Social media companies could increase their content oversight if motivated by federal incentives and liabilities.

Online platforms can be “double-edged swords,” generating spectacular economic value while providing a forum for bad actors and bad behavior. To address the dark side of platforms, only a combination of government regulation and pressure on companies to self-regulate has a chance of curbing the extremes of social media. U.S. Supreme Court Justice Clarence Thomas has hinted at a similar position.

While the challenges facing social media are global and require multinational solutions, our primary focus here is on the United States. We start by clarifying the problem we are trying to solve. On the one hand, social media platforms have an incentive problem: Facebook, Instagram, Google, YouTube, Twitter, and other platforms frequently accelerate the viral spread of extreme content because it generates greater user engagement, which translates into billions of dollars in advertising revenue.

The more outrageous the content – for example, stories of grand conspiracies around the U.S. presidential election, the coronavirus, and COVID-19 vaccines – the greater the opportunities for profit.

Yes, social media companies have “terms of service” that allow them to remove harmful or illegal content. But these companies have no financial incentive to police themselves.

On the other hand, the sheer volume of social media content – four billion videos are viewed on Facebook alone every day – makes it difficult and expensive, though not impossible, to monitor, curate, or edit so many posts.

We have been studying and writing about digital platforms since their emergence in the 1990s. Our research has led us to conclude that many of the proposals being discussed in the press and in the U.S. Congress are unlikely to work. Consider them one by one.

One option is to break up Facebook. This would reduce Facebook’s market power, but it would not resolve the incentive or conflict-of-interest issues identified by Frances Haugen, the former Facebook employee and whistleblower. Even if Facebook, Instagram, and WhatsApp became separate companies, each would remain a huge and powerful network with no new incentive to stop bad behavior. Similar problems would persist if YouTube were split off from Google.

Another option is for the federal government to directly regulate social media algorithms. This option, which Haugen has proposed, is possible but would be technically difficult. Social network and search algorithms are built with artificial intelligence technology, and they are hard for outsiders – and even for the software developers themselves – to take apart, modify, or fully understand. In addition, these algorithms rely on machine learning and big data, so they evolve continuously as users interact with the platforms billions of times a day. Establishing a bureaucratic change-control process overseen by government officials or even expert special masters – as occurred with the settlement of Microsoft’s antitrust lawsuit – would likely freeze the technology and hold back innovation.

A third option would be for lawmakers to eliminate Section 230, the provision of the 1996 Communications Decency Act that shields online platforms from civil liability for the content they disseminate. But Section 230 distinguishes between platforms and publishers, the latter being fully responsible for their content, and the law remains fundamental to the functioning of all online platforms. We want these internet companies to remain as open as possible to innovation and to facilitate free speech in user-generated content, without being legally responsible for each of the billions of posts on their sites.

A better alternative would be to combine government regulation that changes incentives with greater pressure on social media companies to self-regulate. We have three suggestions.

First, the U.S. government should require all social media platforms to strictly adhere to their own “terms of service,” giving enforcement authority over those terms to the Federal Communications Commission (FCC), the Federal Trade Commission (FTC), or even the Securities and Exchange Commission (SEC). These terms already give social media companies the ability to remove content or close accounts when they detect abuse of their platforms. For example, Facebook has a set of “community standards” with a long list of prohibited content, including posts deemed to encourage violence and criminal behavior, sexual exploitation of children, hate speech, fake news, and intellectual property violations.

However, it is not enough to have terms of service. The federal government must ensure that social media companies actually comply with their terms of service and other publicly declared standards. One way to force compliance would be through government lawsuits or civil litigation, with fines imposed not only on the companies but also on their CEOs, other officers, and board members. Just as the Sarbanes-Oxley Act of 2002 made executives and directors personally liable for financial statements, new regulations could make Mark Zuckerberg and Facebook’s board liable for failures to comply with the company’s terms of service.

Second, Congress, with the cooperation of the U.S. Department of Justice, should revise Section 230. We suggest revisions that make it possible to hold online platforms accountable for the profits they derive from the deliberate dissemination of harmful content. Companies would then be exposed to civil lawsuits or to specific actions by the FCC, FTC, SEC, or Department of Justice. Because Facebook and Alphabet, the parent company of Google, collectively earned $69 billion in net income in 2020, episodic fines are unlikely to change behavior, as we have seen with European antitrust fines against Google or the $5 billion FTC fine imposed on Facebook. However, it should be possible to require the platforms to identify the profits derived from the advertisements tied to viral harmful posts, and then to recover those sums as fines. Those numbers could be very large.

Finally, any additional government regulation should be administered in a way that promotes more effective self-regulation, as occurred with movie ratings; advertising for tobacco, alcohol, and pornography; terrorist recruiting; and self-preferencing in airline flight listings during the 1960s and 1970s. Digital platforms are too complex and fast-moving, and their content too vast, to rely solely on government oversight.

Historically, when complex new technologies and industries have emerged, a credible threat of intrusive government regulation has spurred many companies to regulate their own operations. In industries such as film and video games, the threat of government intervention prompted business coalitions to establish content standards and codes of conduct, including content rating systems, which have worked reasonably well. This coalition approach could work for social media platforms as well.

At the end of the day, responsible self-regulation is good for business: Social media platforms need to take more responsibility for their impact on the world, lest they risk continuing to damage the communal resources essential to information and commerce – the internet itself and users’ trust in digital content.

Michael A. Cusumano is the SMR Distinguished Professor and Deputy Dean at the MIT Sloan School of Management.

David B. Yoffie is the Max and Doris Starr Professor of International Business Administration at Harvard Business School.

Annabelle Gawer is Chaired Professor in Digital Economy and Director of the Centre of Digital Economy at Surrey Business School.

