Is media literacy the magic bullet for fake news?
As online news and social media have proliferated over the past decade, a whole new category of information has entered the popular lexicon: fake news. Online misinformation has come a long way from "On the internet, nobody knows you're a dog" to a place where a malicious article from a troll farm can sit alongside legitimate, rigorous journalism. Awareness of the problem is widespread, but to put an end to misinformation, social media platforms, journalists, fact-checkers and citizens must all take action.
One touted solution to preventing the spread of misinformation is “media literacy,” but what is it and how effective is it?
Media literacy is essentially critical thinking applied to media. It may seem redundant to teach it in schools, where critical thinking is presumably already part of the curriculum, but the UK Commission on Fake News and the Teaching of Critical Literacy Skills, led by the All-Party Parliamentary Group (APPG) on Literacy and the National Literacy Trust, found that only 2% of children have the essential literacy skills they need to tell whether news is true or false.
While media literacy is essential for navigating online media, it is rarely a compulsory part of schooling – which is likely to limit how widely high levels of literacy are achieved. In Finland, the fact-checking organization Faktabaari teaches media literacy and fact-checking in schools. But the material is only intended for use with children up to grade 9 (15-16 years old), which limits how much can be taught. Programs that target even younger demographics, such as the Angelina Jolie-produced BBC My World project, are so broad in scope and aimed at such a young audience that they seem unlikely to have much impact.
If, on the other hand, media literacy is not taught in schools but left voluntary, then it is purely self-selected and unlikely to solve the problem. Those who choose to learn how to detect misinformation are probably already the least likely to be misled by it.
While it is encouraging that groups like UNESCO publish handbooks to help journalists combat fake news, journalists are usually not the most vulnerable group. Typically, they should already be verifying their stories and working with editors who will weed out information that cannot be trusted.
The crux of the matter is that misinformation taps into various psychological domains, such as social identity theory and feelings of group belonging or social isolation. People are most vulnerable on topics about which they believe themselves knowledgeable but are not.
Studies have asked to what extent greater media literacy can help combat this type of bias. A study in the Journal of Experimental Psychology found that repeated statements were easier to process, and therefore perceived as more truthful, than new statements – even when participants already knew the statements were false.
What can benefit the public are broader digital hygiene programs, such as Facebook's Digital Literacy Library, which break down big issues into smaller components (e.g. individual privacy settings and behavior among peers) – an approach better suited to public awareness campaigns.
More viable solutions will come from solving problems at a higher level. An NPR poll found that the American public sees misleading information as the biggest threat to election security, but the solution lies more with the media, tech companies and government than with the public.
What can the media do to counter misinformation? Organizations like Reuters publish guides on how to spot fake news, and The Guardian prominently displays the date an article was published in its social media thumbnails. This followed an earlier change making the dateline more visible on articles themselves, after The Guardian found that Facebook users often see only a shared post and don't dig deeper into the article itself. The change was intended to prevent users from miscontextualizing its reporting.
The line between a news platform and a social media platform can also be blurred. Traditional media platforms often strictly regulate political messaging and advertising, while social media platforms face far fewer restrictions. Although Facebook is the dominant social media platform, it also runs a journalism project and acts as a gatekeeper for the news its users see. As such, Facebook will continue to see itself as having a role in journalism, but it will not moderate political advertising on its platform.
It's clear, especially in markets like the UK, that Facebook – which has pledged not to remove inflammatory or misleading political ads – wants to see how it can capitalize on not having to follow the same rules as television broadcasters.
Meanwhile, platforms like Twitter and Spotify have banned political advertising to combat fake news. But these policies also affect smaller political organizations that lack the capacity (or financial power) to advertise anywhere other than online, since social media platforms remain the cheapest way to advertise.
There will certainly be no one-size-fits-all solution. Citizens cannot simply be asked not to fall victim to disinformation campaigns run by hostile intelligence agencies using manipulative psychological tactics, and there is no easy way to give people better self-awareness of their own biases. When it comes to issues of free speech and open debate, regional and cultural differences also matter, and governments will need to find solutions that work specifically for them.
Media literacy is a good start, but it’s not the end of the story.