First things first. Sure as the sun rises in the east and sets in the west, someone is going to complain that this story has nothing to do with clean technology. If that’s your opinion, you are dead wrong. At a time when building a consensus about stopping the use of fossil fuels is essential to keeping the Earth habitable, misinformation spread via the internet is an Exocet missile in the hands of the Koch brothers and the fossil fuel companies. If you think those people are going to go away quietly, please see James Ayre’s recent article about ExxonMobil suing climate activists.
Misinformation campaigns, often referred to as “fake news,” are responsible for the disgraceful Donald Trump being president of the United States. Whether you think Vladimir Putin or Karl Rove was behind it, there can be little question that social media played a larger role in the last election than any other media source. The tools that enable manipulation of online information grow more sophisticated with every day that passes. Specialized algorithms filter through mountains of data to identify our worst fears and then design misinformation campaigns that play on them.
Such disinformation campaigns are responsible for the rise of lunacy such as the belief that the Earth is flat, a notion most second graders recognize as demonstrably false. Yet a recent article about the Flat Earth Society claiming the launch of the Falcon Heavy was faked was one of the most widely read stories in CleanTechnica’s entire history, with lots of people vehemently insisting that the world is indeed flat. Scary stuff, and disconcerting for those who wish to have an intelligent conversation about climate change.
Dr. Carolyn Fortuna, a media literacy expert and founder of IDigitMedia, writes, “Because social media platforms are a primary medium through which young people develop their political identities, disinformation distributed online with the intention of misleading voters or simply earning a profit has serious consequences for the future of informed citizenries everywhere. Young people absolutely need tools to help them navigate social media and to help them to assess what they hear and see around them as they’re growing up.”
In a report released in January 2017, researchers at the University of Cambridge demonstrated that exposing people to small snippets of misinformation could make them more resistant to similar false claims later on, particularly with regard to the debate about climate change. Now they have expanded their focus to help protect against all forms of online misinformation. The psychological theory behind the research is called inoculation.
“A biological vaccine administers a small dose of the disease to build immunity. Similarly, inoculation theory suggests that exposure to a weak or demystified version of an argument makes it easier to refute when confronted with more persuasive claims,” says Dr. Sander van der Linden, director of the Social Decision-Making Laboratory at the University of Cambridge. “If you know what it is like to walk in the shoes of someone who is actively trying to deceive you, it should increase your ability to spot and resist the techniques of deceit. We want to help grow ‘mental antibodies’ that can provide some immunity against the rapid spread of misinformation.”
Van der Linden and his colleague Jon Roozenbeek have collaborated with DROG, a Dutch group led by Ruurd Oosterwoud, to create an online game designed to teach people how to recognize misinformation and resist the ulterior messages they see. “You don’t have to be a master spin doctor to create effective disinformation. Anyone can start a site and artificially amplify it through Twitter bots, for example.
“But recognizing and resisting fake news doesn’t require a PhD in media studies either,” Roozenbeek says. “We aren’t trying to drastically change behavior, but instead trigger a simple thought process to help foster critical and informed news consumption. The framework of our game allows players to lean towards the left or right of the political spectrum. It’s the experience of misleading through news that counts,” he says.
The game the researchers created is now available online for anyone to play. The researchers will gather feedback from those who play their game in order to make improvements for future versions. Their goal is to increase media literacy and fake news resilience in what they call a ‘post-truth’ world. “We try to let players experience what it is like to create a filter bubble so they are more likely to realize they may be living in one,” says van der Linden.
One of the purposes of education used to be training people how to think. That role has now been largely taken over by social media. Marshall McLuhan once cautioned us, “We shape our tools and thereafter our tools shape us.” The internet and social media are tools, and boy, howdy, have they ever shaped us! Whether or not van der Linden et al. have created the perfect antidote to disinformation campaigns, it’s a beginning, and an important one at that. If you try the game, please share your feedback with us.