Media Platforms Must Combat Election Misinformation, Now – InsideSources
It’s a haunting video. A man with a megaphone on the steps of the US Capitol reads out a tweet from then-President Donald Trump as insurgents storm the building on January 6.
“Mike Pence didn’t have the guts to do what should have been done to protect our country and our Constitution.”
Trump added fuel to the fire even though he knew the violent and heavily armed mob had already entered the Capitol building.
Social media platforms have figured prominently as the January 6 Committee unravels the sprawling criminal plot orchestrated by President Trump and his henchmen. Unfortunately, the companies were unprepared to fight misinformation during the last election.
The big lie that the 2020 election was stolen has taken hold, with 68-74% of Republicans believing it. State legislatures took the opportunity to pass at least 34 laws in 19 states that complicate voting or sabotage our electoral process. Many candidates in 2022 are making election denial and voter suppression cornerstones of their campaigns. It is no wonder, then, that a recent survey shows 56% of Americans have “little or no confidence” in free and fair elections.
The challenges of election misinformation cannot all be solved before the 2022 election. But there are steps platforms can take now to help voters combat the spread of misinformation. Common Cause and more than 120 organizations sent a letter to the largest social media platforms urging them to proactively tackle election misinformation in 2022. Here are some of the actions we suggested:
—Platforms must commit to translating voting information into languages other than English, so that all voters have reliable and accessible information. Non-English misinformation continues to thrive online, and the gap between resources allocated to moderating English and non-English content reveals a wide disparity in how harmful content is handled. Platforms should increase resources dedicated to moderating non-English language content and make their resource allocation transparent.
—Platforms can — and should — ensure that content that calls for political violence is addressed and that posts that promote election disinformation are not allowed to go viral. Each platform not only decides what content is allowed, but also how it is promoted to users through algorithmic amplification, i.e. what the platform recommends or shows you. Platforms can apply “friction” to messages containing misinformation and thereby reduce the spread of misinformation.
—Platforms should be consistent in their civic integrity policies. This must include removing content spreading false claims about the 2020 election and staffing the teams that enforce these policies year-round, not just in the weeks immediately leading up to an election. There is no off year for election disinformation, and the weeks following an election can be among the most critical. False claims about 2020 continue to shape laws being developed in 2022 and need to be addressed. This is particularly critical in the post-election period to ensure peaceful transitions for newly elected officials. Fact-checking of political content should be prioritized, and loopholes that exempt such content from fact-checking should be closed.
—Platforms should provide real-time access to data so that researchers and watchdogs can identify the harms of misinformation. This includes greater transparency about political ads, enforcement practices, and algorithmic models. They should also reverse retaliatory measures taken against researchers and allow academic research into the impact of platforms and algorithmic harm to continue. This includes permission to access tools such as CrowdTangle, which are invaluable in measuring Facebook’s reach and effect on users.
It is essential that all social media platforms take steps to ensure that they do not play a role in future election misinformation. In our coalition letter, we endorsed some simple actions social media platforms can take to reduce the spread of election misinformation and the potential it holds for offline political violence. We believe that platforms have a responsibility to ensure that their products are not used by vote suppressors and disinformation providers.
The 2020 election and its aftermath have exposed the dangers misinformation poses to our democracy. We’ve seen how President Trump used social media platforms to rally and incite crowds on the Ellipse on January 6, then unleash them on Capitol Hill, where Congress was at work certifying his opponent’s election victory. The results were shocking and tragic, and they could easily have been much worse. We have to make sure nothing like it happens in 2022, or 2024, or ever again.