Social media platforms aren’t doing enough to monitor domestic extremists – NBC4 Washington

A new report from the Digital Citizens Alliance and The Coalition for a Safer Web says social platforms need to do a better job of cracking down on domestic extremists who continue to use digital platforms to recruit new members, spread misinformation and glorify their violent acts.

The 56-page report, released Monday, calls on Congress to pass new sanctions for domestic terrorism. It also recommends that social media companies hire more people to monitor their sites and increase enforcement consistency by sharing information about bad actors with other online platforms.

“They brag about their ability to keep up with videos and serve ads that really relate to the experience. They need to use that prowess and stop treating it like a PR issue and start treating it like a national security problem,” said Tom Galvin, executive director of the Digital Citizens Alliance.

Researchers from the DC-based nonprofits teamed up and spent more than two months scouring the social media sites YouTube, Facebook, Instagram, Twitter, TikTok and Telegram. The report describes hundreds of disturbing posts, including photos and videos allegedly from domestic terrorists, militias and other extremists.

“They used it to plan their activities [at the Capitol] for January 6. It was a very convenient and efficient way for them to do that,” Galvin said. “We found that they were recruiting through digital platforms, just like jihadists do.”

Galvin said social media companies not only have a legal obligation, but also a moral obligation, to ensure that hate groups don’t use the platforms to spread misinformation or recruit new members.

But he says the typical business model is to seek out as much content and as many users as possible – to monetize them – which can run counter to the mission of removing posts or people who violate the platform’s rules.

“We’re not here to say platforms don’t do anything, what we’re saying is they don’t do enough,” Galvin said.

Galvin said once his researchers began searching for the posts, they encountered an endless loop of suggested content feeding them extremist ideas. He worries that with TikTok catering to younger users and talk of an Instagram platform for kids, the lack of a crackdown could help recruit a future generation of hate.

“I hope that with the investigation in this report, we can add to the public discourse on the role of digital platforms in, frankly, the rise of domestic extremism to heights not seen since the 1990s. It is something that we have to pay very close attention to,” Galvin said.

Social media platforms told News4 they try to be proactive and remove extremist content before it is even flagged. Several said they had already deleted many of the posts and profiles researchers cited in the report.

“Violent extremism has no place on TikTok,” a spokesperson said. “We are working aggressively to stop the spread of misinformation and ban accounts that attempt to co-opt our platform for violent or hateful behavior.”

According to TikTok’s latest transparency report, the platform removed 92.4% of non-compliant videos before they were flagged, and 93.5% were removed within 24 hours of posting.

The Digital Citizens Alliance acknowledged that removed content is often quickly replaced with a similar version, or the same content under a different name, citing a specific example from the report that is already back on Instagram.

In a statement, a spokesperson for Instagram and Facebook wrote: “We expect this movement to change its tactics to try to evade our enforcement. Our teams are monitoring any changes in behavior and the types of content shared so that we can adapt our policies. We continue to remove thousands of accounts, pages, groups, events and profiles that violate our policies against QAnon.”

A YouTube spokesperson disputed some of the report’s findings, writing, “We welcome more research on this front, but we disagree with the statement that we haven’t done enough to combat extremist content on YouTube. In fact, in some cases, our policies and efforts to limit the spread of harmful misinformation through our systems are more comprehensive than those of other platforms.”

YouTube pointed out that channels belonging to many extremist groups and leaders, including the Proud Boys and Chris Hill, as well as prominent QAnon figures, were shut down months before January 6.

In the fourth quarter of 2020 alone, YouTube said it removed more than 13,000 channels and 72,000 videos for violating its violent extremism policy.

Twitter and Telegram did not respond to requests for comment.