From Bach to Bull****: How Facebook Mismanages Disinformation (And What To Do About It)


Originally published by Union of Concerned Scientists, The Equation.
By Kate Cell, Senior Climate Campaign Manager

I’m an amateur classical pianist, and sometimes I share a video of my playing with my Facebook friends. Usually I post something, a couple of people listen and are nice about it, life goes on.

But in April, I posted a J.S. Bach fugue, and within 15 minutes I was advised that the Universal Music Group owned the copyright… on music performed by me and written by a guy who’s been dead for 270 years. Facebook “partially muted” my version, which was obviously recorded in my living room, and I had to contest it to get that limitation removed. Nearly two weeks later, UMG implicitly acknowledged that there is no copyright on music written that long ago and that my playing was mine, not that of someone good enough to be represented by a UMG label. Only then was my video finally released from “Facebook jail.”

This annoying experience led me to a stark realization: Facebook plainly has effective content controls when the profits of large corporations are concerned. This makes it all the more frustrating that they allow misinformation and outright disinformation on climate change (and elections, COVID, vaccines, etc.) — which often leads to greater harm to already vulnerable or under-represented populations such as women and BIPOC communities — to travel freely and speedily across the platform.

How big is the problem?

In 1710, the satirist Jonathan Swift wrote that “Falsehood flies, and the Truth comes limping after it.” He had the imagination to create Gulliver and the world he traveled, but he couldn’t have foreseen how swiftly falsehood would fly in the 21st century, enabled by Facebook and other social media platforms.

On Facebook, disinformation comes in waves. Last February, when Texas experienced a deadly ice storm for which its grid was willfully and woefully unprepared, Facebook was flooded by a storm surge of nonsense that it did little to contain. The previous September, Facebook also monetized misery by allowing the spread of false rumors about wildfires in California and Oregon.

And Facebook’s failure to handle lies about climate change-worsened extreme weather or the deployment of renewable energy in the United States is just one issue. It doesn’t cover, say, Facebook’s contribution to genocide in Myanmar or to the online abuse of women, or its role in facilitating global sex trafficking.

On climate, follow the (fossil fuel) money

My post was muted because the Universal Music Group is the biggest music company in the world: it owns labels and brands from Abbey Road Studios (the Beatles) to Young Money (Lil Wayne). Valued at $53 billion, the company can afford to pay a paltry $12 million fine for bribery and to guard its copyrights jealously, even from dancing babies.

In the case of climate change, we’re talking fossil fuel-scale money — the largest companies together are valued at around $2.7 trillion, which is more than the GDP of Saudi Arabia, Turkey, Switzerland, and the Netherlands combined. For decades these firms have protected their assets and social license to operate through a combination of disinformation, greenwashing, and empty promises.

Facebook makes money from fossil fuel advertising: $15.6 million from ExxonMobil alone between May 7, 2018, and October 8, 2020. The American Petroleum Institute, the trade association for the oil and natural gas industry, ramped up to more than $50,000 worth of daily ad buys immediately following President Biden’s April climate announcement. Battling disinformation or banning fossil fuel advertising outright would reduce Facebook’s ad revenue. But the company took in $25.44 billion from overall ad sales in 2020, which makes this look like a source of income it could choose to forgo if it truly believed, as founder Mark Zuckerberg has said, that “[s]topping climate change is something we can only do as a global community, and we have to act together before it’s too late.”

What the factcheckers say

It’s hard to keep up with the waves of mis- and disinformation, especially when Facebook takes poor old Bach down in minutes but lets COVID and climate bull**** stand for days or weeks. And labeling isn’t a cure: recent research found that even when a post is factchecked and labeled as false information, adding that label can double its virality.

I recently had a chance to talk to one of the founding members of a team of dedicated scientists, many of them at the very top of their field, who are among the factcheckers that Facebook touts as the solution to the problem of misinformation. He told me that it can take their team anywhere from 12 hours to two weeks, with an average response time of two days, to factcheck a post. Meanwhile, unlike my immediately muted Bach video, the falsehood is still out there, infecting new people through likes and shares.

I was fascinated to hear about their process. They use a combination of their own system and Facebook’s to identify posts that might contain misinformation and are getting a lot of likes, shares, and comments. Then they review the content and mark it based on Facebook’s classification system (with categories such as misleading, missing context, etc.). The factchecker I talked to explained that Facebook’s system for surfacing these posts is algorithm-based, and there is very little transparency about how it works, even for the factcheckers themselves. That is true despite transparency being a pillar of the non-partisan International Fact-Checking Network (IFCN), from which Facebook draws its factchecking partners.

What else can be done?

UCS is part of a coalition of groups calling on Facebook to act on climate disinformation. You can see some of our requests to Facebook here. One of the most straightforward would be for Facebook to actually enforce its limited existing policies, consistently and quickly.

The factchecker I spoke to also proposed a few ideas. First, Facebook could make sure something misleading can’t go viral. If the company can take down a harmless homemade music video within minutes, it can work out whether something climate-related is getting attention and “throttle” or isolate that content until it’s reviewed for misinformation. Second, if a Facebook account has published bull**** in the past, its new content could be prevented from going viral before it’s checked.

The idea that most intrigued me was the approach of expanding beyond factchecking. Facebook could identify the narratives that mis- or disinformation is feeding and actively work to counter them. If, for instance, a person saw three factchecked posts with disinformation on climate change, Facebook could include some factual articles in their feed. To my mind, this effort would work best if at least one of those articles exposed the disinformation playbook or other resources that show how bad-actor companies continue to deploy disinformation in an infinite loop.

Antisocial media

I’m focusing a lot on Facebook here, but let’s be clear: Facebook isn’t the only problem. YouTube broadcasts climate misinformation to millions, and it’s still hard for real content to break through. (Guess that’s one reason why Google removed its famous “don’t be evil” clause?) Twitter thinks ads about climate change are problematic, but ads from big oil are fine.

In the social media landscape, all content is treated as if it were equal. But not all content IS equal, especially content that leads to harm: misleading or downright false election information. “Un”science about the safety and effectiveness of vaccines. Lies on climate change that deceive the public about the realities and risks and delay desperately needed action. We need social media platforms like Facebook to take clear, concrete steps to address the mis- and disinformation being fostered on their platforms.

We also don’t have to rely on social media companies to voluntarily comply. Congress can legislate, agencies can regulate, courts can litigate their way to a world where social media supports, and doesn’t undermine, the public good. We ourselves can be savvy consumers of social media, attuned to the fact that right now moneyed interests get protection, but the public interest and actual people — particularly Black and brown people and women — not so much.

And we need to keep the pressure on. UCS is ramping up its efforts to combat social media disinformation, and we’ll need your help along the way. In the meantime, please report it if you see it, and learn how you can spot and stop disinformation.


