Published on November 2nd, 2018 | by Zachary Shahan
How Trolls Dominate The Internet
If you’ve been on the internet lately, you’ve probably noticed a few trolls. In fact, if you’ve spent much time in discussions about Tesla, politics, or climate change, you’ve seen swarms of trolls roll in. Sometimes you may have thought, “Hey, these peeps be trolls?” And sometimes you may have simply assumed they were normal human beings honestly expressing their point of view.
After blogging professionally for approximately a decade and serving as site director of multiple sites for most of that time, I've seen a lot of trolling, enough to notice patterns in how trolls enter a discussion, how they disrupt it, how they string real commenters along, and what kinds of stories or sites they hit. One struggle of the past several years has been figuring out how to deal with this phenomenon.
There are simplistic approaches out there. Some people think it makes sense to let practically every comment enter or stay in the discussion. That totally does not make sense. It sounds nice, sounds democratic, but it's not. If someone is paying even a few people to spend all day in comment threads about a certain topic, those few people can dominate and hijack the most popular threads. If someone is paying hundreds of people and employing bots to disrupt a topic, rest assured that no prominent internet discussion on that topic is safe, normal, or organic.
We actually know that there are sponsored political smear campaigns going after electric vehicles, and with Tesla being king of the EV world, that often means Tesla. It appears that solar and wind have long been targets of attack as well, and climate change itself was swarmed by Russian trolls long before Russian interference in the 2016 presidential election. But what are the actual aims of all these trolls and the people funding them?
There may be specific messaging aims: lies about the environmental impact of cleantech, misleading narratives on subsidies, and so on. However, there are also generic aims. If trolls make discussions about a topic argumentative, controversial, and downright nasty, they drive people away from that topic. If your cousin John or sister Suzie finds a nasty discussion under any article about Tesla or solar power, there's a decent chance they'll stay away from the topic to avoid getting slammed, or simply to keep their peace of mind intact. After all, people want to enjoy their lives, right?
There are also interesting scientific findings that if the comment thread under an article is negative, no matter what the points of the story were or which arguments you found more compelling, a few months down the road, you are more likely to have a negative opinion of the topic of the article (whether that be Tesla, rooftop solar power, or pumpkin-spiced popcorn). In other words, as long as trolls turn discussions into ugly poop-throwing fights, many passersby will develop negative attitudes about the targeted tech.
Think about that for a minute. How many nasty threads have you seen between pro-Tesla commenters and anti-Tesla commenters? What effect do you think that has on people who are strolling through Twitter, Facebook, reddit, or some tech blog?
And what is really disturbing is when manufactured dramas get big enough and gain enough real human legs that they grow into full-scale counterproductive movements pulling society down. Real people fall for the narratives, get hooked into troll-farm communities, and eventually start to outnumber the trolls.
A search I just conducted for the faux scandal “climategate” shows 2.2 million results.
Clearly, this article has been forming for a long time in my mind, but it was specifically triggered by a recent article about Russia’s Trump strategy on Twitter and my own observations of some nasty anti-Tesla Twitter users with suspect accounts and narratives — highly suspect, I should say. The article starts off with this sentence/paragraph: “Twitter on Wednesday released a trove of 10 million tweets it says represents the full scope of foreign influence operations on the platform dating back nearly a decade — including Russia’s consistent efforts to disparage Hillary Clinton and an initially erratic approach to Donald Trump that eventually settled on a concerted pro-Trump message during the 2016 campaign.”
Yes, 10 million tweets. How many tweets have you sent? How much influence do you think you have from all of those? How much influence do you think 10 million tweets could have, especially if coordinated?
“The huge data cache consists of tweets from some 3,400 accounts tied to the Kremlin troll farm known as the Internet Research Agency and 770 others linked to Iran. It also includes some 2 million GIFs, videos and other pieces of visual content.” 2 million GIFs, videos, and other visual content!
It’s been said that the vast majority of Americans were touched in some way and to some degree by this political influence campaign. Ya think?
“By Election Day, the Russian trolls’ tweets were nearly uniformly pro-Trump, expressing sentiments like, ‘I don’t want a criminal in office! I’d vote for Monica before I vote for Killary! #Trump #MakeAmericaGreatAgain #TrumpForPresident,’ according to the Atlantic Council lab’s findings published Wednesday.”
Politics is one beast, and I can’t say I’m particularly optimistic about the direction of US politics or global politics. Democracy is based on people being informed and engaged. People are by and large not informed and engaged. Thus, powerful, rich interests are running the show and manipulating people enough to steer the whole ship in a harmful direction.
The beast we primarily deal with here on CleanTechnica is cleantech, of course. While I also find it disturbing how much clear troll campaigns influence discussion about cleantech, especially Tesla, on Twitter and under articles, I think genuine consumer interest and action are more powerful. But people should be aware of what they're facing. People should consider where it's worthwhile to spend their time and where they are simply playing by the trolls' rules, on the trolls' home pitch, in the trolls' interest. Sometimes, engaging in a discussion is helpful. Sometimes, disruptive and suspect accounts should be flagged, reported, and questioned outside of the framing they prefer. (On many sites, just a few flags from other commenters will send a comment to moderation, unless the site is abdicating any responsibility over the narratives it enables.)
There’s a long-held maxim that one should not feed the trolls, but we also shouldn’t let them dominate the discussion.
Apathy, distraction, and muddying of the waters (aka gaslighting) are often their chief aims. Don’t let the trolls win the internet. And that also means putting pressure on site directors and community managers to do their freakin’ jobs and not pretend that every commenter is a genuine, honest, organic contributor in search of the truth.