Opinion: Time to hold social media platforms accountable: We don’t need another TikTok video inciting or suggesting school violence

TikTok: a place where we can watch cooking videos and silly dance moves, and, apparently, where people can post non-credible threats of school violence. Last month, dozens of school districts across the country announced closures due to a surge of anonymous TikTok videos threatening shootings and bombings.

This is incredibly dangerous for our children, our schools, our parents, our teachers and our communities. Now more than ever, it’s time for social media companies like TikTok to be held accountable for the content posted on their platforms.

TikTok has become one of the most popular social media platforms of this generation. Launched in 2016 by Chinese tech company ByteDance, it allows users to create, watch and share short videos. By the end of 2021, the app had around 1 billion active users. That is roughly 1 in 7 people in the world.

Because the platform is so huge, it can be difficult to regulate what is posted. Currently, TikTok has a US-based security team, and all uploaded content passes through automated systems that screen for policy violations. Flagged content is then reviewed by a human member of the app’s security team before being published. More recently, TikTok introduced software capable of automatically deleting videos that might violate its guidelines.

Obviously, this software is not enough.

In mid-December, school districts from Texas to Michigan issued warnings, canceled classes and increased security presence due to viral TikTok videos warning of impending bombings or shootings. Even the Austin ISD Police Department stepped up security and monitored the national social media trend in late December in an effort to prevent potential harm. Clearly, threats of school violence have slipped past the security software, and this is not the first time.

TikTok and other social media giants like Facebook and Twitter have come under fire for spreading harmful content to children and young adults. In 2021, teachers had to ask TikTok to step in when a “slap a teacher” challenge went viral. Two years ago, 4,000 people watched a live stream of a mass shooting posted on Facebook, which quickly spread across the internet and was reposted countless times.

In response to this dangerous and divisive content, the social media giants say they continue to strengthen their safety measures. I’m not so sure.

A TikTok spokesperson responded to the alleged threats of school violence, tweeting that “…we are working with law enforcement to review warnings about potential violence at schools even though we have found no evidence of such threats originating or spreading via TikTok.”

These are empty and broken promises. As long as dangerous content continues to slip through the so-called security measures in place at the social media giants, we will continue to see issues like school threats arise. We will continue to see hate speech, calls for violence and other toxic content affect our children and endanger our communities.

As such, it’s time big social media companies were held accountable.

Currently, under Section 230 of the Communications Decency Act, platforms like TikTok, Twitter and Facebook aren’t treated as publishers and aren’t legally responsible for the content their users post. The law, passed in 1996, was designed to protect websites from lawsuits if a user posts something illegal. President Joe Biden has suggested revoking Section 230 entirely, which would be a good start. The administration could act by stripping that legal immunity from social media giants, especially those that refuse to be proactive in removing dangerous content.

Enough is enough. It’s time to pass legislation that says social media networks can be held liable for damage caused by false information, harmful content and incitement to violence that is shared on their platforms.

We don’t need another video inciting or suggesting school violence. The clock is ticking, TikTok.


Annika Olson is associate director of policy research at the Institute for Urban Policy Research & Analysis. Annika is passionate about using research and legislative analysis to inform policies that impact the lives of vulnerable members of our community. She earned a dual master’s degree in psychology and public policy from Georgetown University and her bachelor’s degree in psychology from Commonwealth Honors College at UMass Amherst. Annika was previously an AmeriCorps member working with at-risk youth in rural New Mexico and Austin.
