National View: Media Platforms, Start Fighting Election Misinformation Now – Duluth News Tribune

It’s a haunting video. A man with a megaphone on the steps of the U.S. Capitol reads a tweet from then-President Donald Trump as insurgents storm the building on January 6, 2021: “Mike Pence didn’t have the guts to do what should have been done to protect our country and our constitution.”

Trump added fuel to the fire even though he knew the violent, armed mob had already entered the Capitol building.

Social media platforms have played a prominent role as the January 6 committee unravels the sprawling plot orchestrated by the then president and his henchmen. Companies were unfortunately unprepared to fight misinformation during the last election.

The big lie that the 2020 election was stolen has taken hold, with 68-74% of Republicans believing it. State legislatures took the opportunity to pass at least 34 laws in states that complicate voting or undermine our electoral process. Many candidates in 2022 are making election denial and voter suppression cornerstones of their campaigns. Perhaps as a result, a recent survey shows that 56% of Americans have “little or no confidence” in free and fair elections.

The challenges of election misinformation cannot all be solved for the 2022 election. But there are steps platforms can take now to help voters combat the spread of misinformation. Common Cause and more than 120 other organizations sent a letter to the largest social media platforms urging them to proactively tackle election disinformation in 2022. We suggested that:

  • The platforms commit to providing voting information in languages other than English, so that voters have reliable and accessible information. Non-English misinformation continues to thrive online, and there is a wide disparity between the resources allocated to moderating English-language content and those allocated to other languages, including in how violent content is handled. Platforms should increase the resources dedicated to moderating non-English content and make their resource-allocation decisions transparent.

  • Platforms can and should ensure that content calling for political violence is addressed and that posts promoting election disinformation are not allowed to go viral. Each platform decides not only what content is allowed but also how it is promoted to users via algorithmic amplification, i.e., what the platform recommends or shows you. Platforms can apply “friction” to posts containing misinformation, slowing its spread.
  • Platforms should be consistent in their civic integrity policies. This must include removing content spreading false claims about the 2020 election and staffing the teams that enforce these policies year-round, not just in the weeks immediately leading up to an election. There is no off year for election misinformation, and the weeks following an election can be among the most critical. False claims about 2020 continue to shape laws being written in 2022 and need to be addressed. This is particularly critical in the post-election period to ensure peaceful transitions for newly elected officials. Fact-checking of political content should be prioritized and applied consistently.

  • Platforms should provide real-time access to data so that researchers and watchdogs can identify the harms of misinformation. This includes greater transparency about political ads, enforcement practices, and algorithmic models. Platforms should also reverse retaliatory measures taken against researchers and allow academic research into platform impact and algorithmic harm to continue. This includes access to tools such as CrowdTangle, which are invaluable in measuring Facebook’s reach and its effect on users.

It is essential that all social media platforms take steps to ensure that they do not play a role in future election misinformation. In our coalition letter, we endorsed some simple actions social media platforms can take to reduce the spread of election misinformation and its potential to fuel offline political violence. We believe that platforms have a responsibility to ensure that their products are not used by vote suppressors and disinformation providers.

The 2020 election and its aftermath have exposed the dangers misinformation poses to our democracy. We saw how President Trump used social media platforms to rally and incite crowds on the Ellipse on January 6, then unleash them on Capitol Hill, where Congress was working to certify his opponent’s election victory. The results were shocking and tragic, and they could easily have been much worse. We have to make sure they never happen again, not in 2022, not in 2024, not ever.

Emma Steiner is a disinformation analyst at Common Cause, a watchdog group in Washington, D.C., with chapters in 35 states, including a Minnesota chapter in St. Paul.