Published on February 5th, 2018 | by Steve Hanley
“Don’t Be Evil” — Has YouTube Algorithm Violated Google’s Core Principle?
When Google was still a fledgling company, its founders adopted the phrase “Don’t Be Evil” as its corporate motto (the story of who came up with the original idea is a bit murky). According to Wikipedia, “While the official corporate philosophy of Google does not contain the words ‘Don’t be evil,’ they were included in the prospectus of Google’s 2004 IPO (a letter from Google’s founders, later called the ‘Don’t Be Evil’ manifesto): ‘Don’t be evil. We believe strongly that in the long term, we will be better served — as shareholders and in all other ways — by a company that does good things for the world even if we forgo some short term gains.’”
Do The Right Thing
Paul Buchheit, the creator of Gmail, says he “wanted something that, once you put it in there, would be hard to take out,” adding that the slogan was “also a bit of a jab at a lot of the other companies, especially our competitors, who at the time, in our opinion, were kind of exploiting the users to some extent.” That jab could very well have been aimed at Bill Gates and Microsoft, which was known for operating systems that often resulted in what computer users came to refer to as “the blue screen of death.”
Google is now Alphabet, which has adopted the phrase “Do the right thing” as its motto. Along the way, Google/Alphabet has acquired YouTube, one of the most popular social media sites on the internet. And that is where the whole “Don’t be evil” thing has apparently come off the rails. The narrative is a familiar one in the capitalist business model that rules the commercial world — make as much money as you can in as little time as possible and demolish your competitors along the way if you can.
Algorithms Rule Our Lives
The result is a cautionary tale that should concern us all, especially those who care about the fate of the planet. Whether it’s YouTube, Facebook, or Twitter, algorithms have taken over our lives. What started out as a simple “if you like this, you will probably like this, too” tool has now evolved to the point where bits of computer code are shaping our beliefs. Machines now play a larger role in what we believe to be true than family, friends, religion, or cultural norms.
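To make that “people who watched this also watched that” idea concrete, here is a minimal, purely illustrative sketch of a co-occurrence recommender. This is not YouTube’s actual system, which is vastly more sophisticated; the function name, data shapes, and video labels are all invented for this example.

```python
from collections import Counter

def recommend(history, watch_logs, top_n=3):
    """Toy 'viewers who watched X also watched Y' recommender.

    history: list of videos the current user has watched
    watch_logs: list of other users' watch histories
    Returns up to top_n videos most often co-watched with the
    user's history, ranked by raw co-occurrence count.
    """
    watched = set(history)
    scores = Counter()
    for log in watch_logs:
        log_set = set(log)
        if watched & log_set:               # this viewer overlaps with us
            for video in log_set - watched:  # count what else they watched
                scores[video] += 1
    return [video for video, _ in scores.most_common(top_n)]

# Hypothetical watch histories from three other viewers
logs = [
    ["cats", "dogs", "birds"],
    ["cats", "dogs"],
    ["cats", "flat_earth"],
]
print(recommend(["cats"], logs))
```

Even this toy version hints at the problem the article describes: it simply amplifies whatever co-occurs most, with no notion of whether a recommended video is true, false, or harmful.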
Marshall McLuhan is often credited with the observation, “We shape our tools and thereafter our tools shape us.” Algorithms are tools and boy, howdy, have they ever shaped us. Republicans in Illinois are poised to put up a Holocaust denier as a congressional candidate, something that would have been unthinkable in prior years. (Dwight Eisenhower ordered photographers and film crews to document the Nazi concentration camps being liberated by Allied troops because he knew there would be those in the future who would try to deny the atrocities committed under the Third Reich.) The number of people who believe the Earth is flat is skyrocketing, thanks to YouTube pushing that nonsense to viewers.
Research Reveals Secrets of YouTube Algorithm
According to research done by a former YouTube engineer, the algorithm the company created to suggest other videos for people to watch when they visit the site may have played a part in Donald Trump’s victory in the last election. No one factor is responsible for that happening, of course, but in an exhaustive survey that involved watching more than 6,000 videos, Guillaume Chaslot has provided a glimpse into how the YouTube algorithm operates. His research is available on his AlgoTransparency website.
Prior to the election, Chaslot collected data over several days to determine which videos the YouTube algorithm was recommending to viewers who searched for either “Clinton” or “Trump.” The results showed that more than 80% of the recommendations made when people searched for “Clinton” were to videos that favored the Republican candidate. When he searched for “Trump,” fewer than 20% of recommendations were favorable to the Democratic candidate.
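Chaslot’s actual methodology is far more involved, but the final tallying step he describes can be sketched in a few lines. This is a toy illustration only: the function name, data format, and favorability labels are assumptions made for the example, not his real code.

```python
def favorability_share(recommendations, label):
    """Fraction of recommended videos labeled as favoring `label`.

    recommendations: list of (video_id, favors) tuples, where
    `favors` is a hand-assigned label such as "trump", "clinton",
    or "neutral". Returns a value between 0.0 and 1.0.
    """
    if not recommendations:
        return 0.0
    favoring = sum(1 for _, favors in recommendations if favors == label)
    return favoring / len(recommendations)

# Toy stand-in for a labeled sample of recommended videos
sample = [("v1", "trump"), ("v2", "trump"),
          ("v3", "clinton"), ("v4", "neutral")]
print(favorability_share(sample, "trump"))    # 0.5 for this toy sample
```

The hard part of such a study is not this arithmetic but the labeling itself: deciding, video by video, which candidate a recommendation actually favors.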
A recent article in The Guardian about Chaslot’s research is long and contains many details about methodology that are pertinent but outside the scope of this article. You are encouraged to do your own reading if you are interested in this topic. The Guardian article concludes, “Partisan videos recommended by YouTube in the database were about six times more likely to favor Trump’s presidential campaign than Clinton’s.”
Senator Mark Warner, the highest-ranking Democrat on the Senate Intelligence Committee, has cited The Guardian’s article about Chaslot’s research to warn that the YouTube algorithm is “optimizing for outrageous, salacious, and often fraudulent content” and is susceptible to “manipulation by bad actors, including foreign intelligence entities.” That last phrase is clearly a reference to the whole Russian election interference story that is roiling Washington at the moment.
The Illusion Of Truth
The point for us here at CleanTechnica is that the same manipulation of algorithms is available to other interest groups as well, such as fossil fuel companies that are eager to promote climate denial. “A lie repeated often enough becomes the truth,” is a statement often attributed to Joseph Goebbels, the Nazi propaganda chief, but it likely predates him by several centuries. Niccolò Machiavelli was certainly familiar with the basic premise of how to turn lies into truth.
The phenomenon known as the “illusion of truth” was the subject of a study by Lisa Fazio at Vanderbilt University published in the Journal of Experimental Psychology in 2015. It found that “the illusion of truth effect worked just as strongly for known as for unknown items, suggesting that prior knowledge won’t prevent repetition from swaying our judgments of plausibility,” according to a report by the BBC.
The problem is that the human brain is programmed to formulate constructs that categorize certain information that occurs frequently in our daily lives so we don’t have to think about it every time. Things like which side of the road to drive on and what foods are part of a normal breakfast become routinized in our thinking so we can focus on the things that do require analysis. Otherwise, we would be paralyzed by the need to evaluate every stimulus that excites our senses all day every day. Malcolm Gladwell explains the phenomenon beautifully in his book Blink.
The BBC article concludes with this admonition: “Part of guarding against the [illusion of truth] is the obligation it puts on us to stop repeating falsehoods. We live in a world where the facts matter, and should matter. If you repeat things without bothering to check if they are true, you are helping to make a world where lies and truth are easier to confuse. So, please, think before you repeat.”
Does Google Practice What It Preaches?
And that lies at the heart of what is wrong with the YouTube algorithm. Its only interest is driving more traffic to the site. It is incapable of thinking before repeating falsehoods and innuendo. One of the oldest maxims in the world of journalism is, “If it bleeds, it leads.” Don Henley made it the point of his song “Dirty Laundry” in which he parodies the news industry in graphic detail: “Can we film the operation? Is the head dead yet? You know, the boys in the newsroom got a running bet. Get the widow on the set! We need dirty laundry.”
Have Google and Alphabet violated their own corporate policies in the interest of profits? Is the algorithm they constructed evil? Is feeding the most vile, hateful ideas to YouTube viewers really doing the “right thing”? Is YouTube really any different from Fox News, a disinformation organization that shamelessly panders to the least well informed members of society? Should the parent company take a step back and re-evaluate its policies in light of its publicly stated corporate ethics?
If nothing else, the lesson we can take away from all this is that even though some bot is suggesting we might like another video or news story, we don’t have to click through to watch it. We still have the power to control our exposure to information. One of the easiest ways to do that when using YouTube is simply to switch the Autoplay toggle that appears in the top right corner of the screen to “Off.” When we cede the power to analyze the information our senses gather every day to algorithms, we give up a measure of our humanity to machines. That is probably not a good thing.