The vital role of impact measurement in media literacy initiatives
Improving media literacy for the general population is essential in this time of rapid digital innovation and, more worryingly, profitable and even weaponized digital disruption. There is an urgent need to figure out how to combat misinformation through media literacy initiatives. We don’t have time to develop shiny materials, or to spend more of the budget on design than we leave for evaluation. Nor to worry about getting approvals for the correct logos or exact colours, or verifying that our assets reflect the organization’s brand.
Rather, we need to know that our initiatives are working:
- We need to know which initiatives work best for which audiences and how to effectively target our resources.
- We need to know what results we are trying to achieve in order to determine if we have succeeded.
- We need to share what works with others, rather than keeping our methodology exclusive.
- And we need to share what went wrong so that others don’t waste their resources making the same mistakes, even if it means admitting that we were wrong.
Having studied the impact of media literacy initiatives over the years, we know that it is essential to use effective measures and careful research designs, to avoid common mistakes, and to learn from best practices and from the mistakes of others. It is also essential to target specific audiences, clearly define outcomes, and make your methods and findings publicly available.
For example, for the European Commission-funded ySKILLS project, my colleagues and I recently conducted a systematic evidence review of newly published, peer-reviewed research on the adolescent outcomes of learning different dimensions of digital skills. This large body of research yields many findings, but I can summarize them by noting that efforts to teach adolescents only functional or technical skills can lead to negative as well as positive outcomes. This aligns with what we know from the EU Kids Online project – you might think you are promoting a particular skill, but people may put it to purposes you didn’t expect, such as using their new skills to seize risky opportunities and potentially get into new kinds of trouble online, precisely because they are now better informed about the media. It is therefore important not to forget to measure the unintended consequences of your intervention.
More positively, the systematic review also revealed that when young people acquire multiple dimensions of media literacy – in particular critical and informational skills, as well as communicative, functional and creative skills – they seem to gain deeper knowledge, which brings more positive and fewer negative outcomes.
Optimizing impact
I will end with some general conclusions about how best to maximize impact. The tricky part is that the answers vary depending on the approach chosen, the target audience and the desired outcome, but past experience allows some generalizations:
- If you deliver your message to a general audience, then those who are already advantaged (privileged, motivated, knowledgeable, interested) will benefit most. And those who are busy, distracted, or anxious about other things – not to mention those who don’t speak your language, lack connectivity, or can’t read your tiny fonts – won’t get the message at all. So the overall effect may be to improve media literacy while simultaneously exacerbating inequalities. Target the audience that really needs your intervention.
- If you deliver your message to an audience that you have not consulted, worked with, and listened to, you run a serious risk of missing your target, being misunderstood, coming across as condescending or incomprehensible, or focusing on the wrong priority. So remember to consult your audience, listen carefully to what they say, and co-design your intervention with them. Partner with civil society and advocacy groups that have a long track record of representing underserved groups, and involve them in your evaluation.
- People are good at learning when well-meaning people teach them, but there is a huge gap between knowledge and practice. In the heat of the moment – when they can’t be bothered, when nothing else seems to matter, when they crave a thrill – yes, they will share misinformation or act foolishly, even though they know better, even when you have told them better. So where do you look for impact – in their knowledge or in their actions? Hoping it’s the latter, how are you going to measure it?
- Media literacy is a matter of education. It takes time, it involves progression in learning, and it is multidimensional. It engages people’s faculties and changes who they are and how they relate to the world, digital and otherwise. As with print literacy – reading and writing, everyone’s ABCs – media literacy cannot be acquired sustainably through a one-off campaign, and it rarely offers “quick wins” because there is no quick fix. It is therefore crucial to develop a serious, costed plan to educate people about the digital environment in a way that respects their learning and meets their needs, and that has a good chance of being sustainable and transferable to future circumstances.
- However, people can only learn what is learnable – what someone can reasonably teach in the time available. And what is learnable often depends not only on the recipients but also on the digital technology itself. If it is designed to be opaque or even misleading, or if it is so complex and fast-changing that legal and technical experts themselves do not really understand it, then it will be difficult to deliver an effective media literacy initiative or to establish its impact. In such cases, we would be better off devoting our resources to changing the digital technology through regulation or design.
Neither improving media literacy nor evaluating initiatives is an easy task. But it is a task to be taken seriously, and I have high hopes for what can be accomplished.
This article was originally published on the EDMO website, and is republished with thanks. It represents the author’s point of view, not the position of the Media@LSE blog or of the London School of Economics and Political Science.