Ditch the cynicism of media literacy and get to work » Nieman Journalism Lab

A few days ago, YouTube star PewDiePie advised his 75 million subscribers to follow a YouTube channel associated with promoting alt-right and anti-Semitic content. He chose the channel on the strength of several of its video essays on movies, but the signs were there for anyone to read: visual “jokes” about Heather Heyer’s death in Charlottesville, a link to a Gab account that featured comments on the “Jewish question,” and, in one of the channel’s oldest videos, apparently even a speech by Adolf Hitler. Thanks to his recommendation, the channel gained 15,000 subscribers.

When I talk to people about media literacy, skeptics often voice a version of what I call the “homeostatic fallacy”: the idea that ultimately we all share and read things that confirm our beliefs, with no net effect on anything. It’s often described as a hard truth – that’s why media literacy can’t work, you silly little bunny! But it’s actually a deeply comforting belief for those who embrace it: the more things change, the more they stay the same.

Yet my classroom experience and the increasing frequency of events like the one above belie this analysis. I don’t know how many of those 15,000 subscribers knew they were subscribing to a channel that might put this content in their feed, but I’m guessing most came for the Death Note anime analysis, not the anti-Semitism. Had they known the nature of the channel, many would not have subscribed. It’s also true that PewDiePie, who lost major content partnerships when accused of anti-Semitism before, stands to lose millions of dollars in revenue with mistakes like this. He probably wishes he had vetted the channel better. And in our classrooms, we find that lack of skills is a far bigger driver of error than worldview: when students are taught basic verification skills, we see little discernible effect of tribalism.

Of course, perspectives change. Once a person subscribes to a page or channel, what Claire Wardle calls the drip of radical content begins to affect their view of the world. But this process so often seems to start with a series of small errors, small oversights that eventually lead to more permanent results. In reality, many forms of radicalization and infiltration would be more difficult with a media-savvy audience – particularly if those with the most influence had better skills and habits in assessing reputation and intent.

In some ways, the homeostatic fallacy has served us well in recent years. It reminded people of the complex reasons why people might share things that weren’t true. It emphasized the resilience of bad ideas in the face of correction. And it formed a useful counterpoint to naive Cartesianism, which viewed bad information primarily as bad inputs leading directly to bad conclusions, an idea that is now rightly dead and buried.

But as we watch this slow, uncontrolled, years-long slide toward radicalization, it’s probably best to bury the homeostatic fallacy as well. The “bias” part of confirmation bias has always meant something more specific than many realize: the tendency for errors to fall more on one side of an equation than the other. Leave bias aside: if you reduce errors, you reduce drift. And maybe, just maybe, the slide stops.

Mike Caulfield leads the American Democracy Project’s Digital Polarization Initiative.
