An ex-YouTube insider reveals how its recommendation algorithm promotes divisive clips and conspiracy videos. Did it harm Hillary Clinton's bid for the presidency?
It was one of January's most viral videos. Logan Paul, a YouTube celebrity, stumbles across a dead man hanging from a tree. The 22-year-old, who is in a Japanese forest famous as a suicide spot, is visibly shocked, then amused. “Dude, his hands are purple,” he says, before turning to his friends and giggling. “You never stand next to a dead guy?”
Paul, who has 16 million mostly teen subscribers to his YouTube channel, removed the video from YouTube 24 hours later amid a furious backlash. But that was still long enough for the footage to receive 6m views and a spot on YouTube's coveted list of trending videos.
The next day, I watched a copy of the video on YouTube. Then I clicked on the “up next” thumbnails of recommended videos that YouTube showcases on the right-hand side of the video player. This conveyor belt of clips, which auto-play by default, is designed to seduce us into spending more time on Google's video platform. I was curious where it might lead.
The answer was a slew of videos of men mocking distraught teenage fans of Logan Paul, followed by CCTV footage of children stealing things and, a few clicks later, a video of children having their teeth pulled out with bizarre, homemade contraptions.
I had cleared my history, deleted my cookies, and opened a private browser to be sure YouTube was not personalising recommendations. This was the algorithm taking me on a journey of its own volition, and it culminated with a video of two boys, aged about five or six, punching and kicking one another.
“I'm going to post it on YouTube,” said a teenage girl, who sounded like she might be an older sibling. “Turn around and punch the heck out of that little boy.” They scuffled for several minutes until one had knocked the other's tooth out.
* * *
There are 1.5 billion YouTube users in the world, which is more than the number of households that own televisions. What they watch is shaped by this algorithm, which skims and ranks billions of videos to identify 20 “up next” clips that are both relevant to a previous video and most likely, statistically speaking, to keep a person hooked on their screen.
Company insiders tell me the algorithm is the single most important engine of YouTube's growth. In one of the few public explanations of how the formula works - an academic paper that sketches the algorithm's deep neural networks, which crunch a vast pool of data about videos and the people who watch them - YouTube engineers describe it as one of the “largest scale and most sophisticated industrial recommendation systems in existence”.
Lewd and violent videos have been algorithmically served up to toddlers watching YouTube Kids, a dedicated app for children. One YouTube creator who was banned from earning advertising revenue from his strange videos - which featured his children receiving flu shots, having earwax removed, and crying over dead pets - told a reporter he had only been responding to the demands of Google's algorithm. “That's what got us out there and popular,” he said. “We learned to fuel it and do whatever it took to please the algorithm.”
Google has responded to these controversies in a process akin to Whac-A-Mole: expanding the army of human moderators, removing offensive YouTube videos identified by journalists and de-monetising the channels that create them. But none of those moves has diminished a growing sense that something has gone profoundly awry with the artificial intelligence powering YouTube.
Yet one stone has so far been left largely unturned. Much has been written about Facebook and Twitter's impact on politics, but in recent months academics have speculated that YouTube's algorithms may have been instrumental in fuelling disinformation during the 2016 presidential election. “YouTube is the most overlooked story of 2016,” Zeynep Tufekci, a widely respected sociologist and technology critic, tweeted back in October. “Its search and recommender algorithms are misinformation engines.”
If YouTube's recommendation algorithm actually has evolved to promote more disturbing content, how did that happen? And what is it doing to our politics?
‘Like reality, but distorted'
Those are not easy questions to answer. Like all big tech companies, YouTube does not allow us to see the algorithms that shape our lives. They are secret formulas, proprietary software, and only a select few engineers are entrusted to work on them. Guillaume Chaslot, a 36-year-old French computer programmer with a PhD in artificial intelligence, was one of those engineers.
During the three years he worked at Google, he was placed for several months with a team of YouTube engineers working on the recommendation system. The experience led him to conclude that the priorities YouTube gives its algorithm are dangerously skewed.
“YouTube is something that looks like reality, but it is distorted to make you spend more time online,” he tells me when we meet in Berkeley, California. “The recommendation algorithm is not optimising for what is truthful, or balanced, or healthy for democracy.”
Chaslot explains that the algorithm never stays the same. It is constantly changing the weight it gives to different signals: the viewing patterns of a user, for example, or the length of time a video is watched before someone clicks away.
The engineers he worked with were responsible for continuously experimenting with new formulas that would increase advertising revenue by extending the amount of time people spent watching videos. “Watch time was the priority,” he recalls. “Everything else was considered a distraction.”
Chaslot was fired by Google in 2013, ostensibly over performance issues. He insists he was let go after agitating for change within the company, using his personal time to team up with like-minded engineers to propose changes that could diversify the content people see.
He was especially worried about the distortions that might result from a simplistic focus on showing people videos they found irresistible, creating filter bubbles, for example, that only show people content that reinforces their existing view of the world. Chaslot said none of his proposed fixes was taken up by his managers. “There are many ways YouTube can change its algorithms to suppress fake news and improve the quality and diversity of videos people see,” he says. “I tried to change YouTube from the inside but it didn't work.”
YouTube told me that its recommendation system had evolved since Chaslot worked at the company and now “goes beyond optimising for watch time”. The company said that in 2016 it started taking into account user “satisfaction”, by using surveys, for example, or looking at how many “likes” a video received, to “ensure people were satisfied with what they were viewing”. YouTube added that further changes had been implemented in 2017 to improve the news content surfaced in searches and recommendations and discourage the promotion of videos containing “inflammatory religious or supremacist” content.
It did not say why Google, which acquired YouTube in 2006, waited over a decade to make those changes. Chaslot believes such changes are mostly cosmetic, and have failed to fundamentally alter some disturbing biases that have evolved in the algorithm. In the summer of 2016, he built a computer program to investigate.
The software Chaslot wrote was designed to provide the world's first window into YouTube's opaque recommendation engine. The program simulates the behaviour of a user who starts on one video and then follows the chain of recommended videos- much as I did after watching the Logan Paul video- tracking data along the way.
It discovers videos through a word search, selecting a “seed” video to begin with, and recording several layers of videos that YouTube recommends in the “up next” column. It does so with no viewing history, ensuring that the videos being detected are YouTube's generic recommendations, rather than videos personalised to a user. And it repeats the process thousands of times, accumulating layers of data about YouTube's recommendations to build up a picture of the algorithm's preferences.
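The crawl described above can be sketched as a simple breadth-first walk over layers of recommendations. This is a hypothetical illustration, not Chaslot's actual code: the function and variable names are invented, and `get_recommendations` is a stub standing in for scraping the real “up next” column.

```python
from collections import Counter

def get_recommendations(video_id, n=3):
    # Stub: in a real crawler this would fetch the "up next" list for
    # video_id. Here it just fabricates child IDs so the walk is runnable.
    return [f"{video_id}-{i}" for i in range(n)]

def crawl(seed, depth=2, per_video=3):
    """Follow the chain of recommended videos from a seed, layer by layer,
    counting how often each video is recommended along the way."""
    counts = Counter()      # how many times each video was recommended
    frontier = [seed]       # videos whose recommendations we fetch next
    for _ in range(depth):
        next_frontier = []
        for vid in frontier:
            recs = get_recommendations(vid, per_video)
            counts.update(recs)
            next_frontier.extend(recs)
        frontier = next_frontier
    return counts
```

Run over thousands of seeds, aggregated counts like these reveal which videos the algorithm pushes most often - the essence of the picture the program builds up.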
Over the past 18 months, Chaslot has used the program to explore bias in YouTube content promoted during the French, British and German elections, and around global warming and mass shootings, publishing his findings on his website, Algotransparency.com. Each study finds something different, but the research suggests YouTube systematically amplifies videos that are divisive, sensational and conspiratorial.
When his program found a seed video by searching the query “who is Michelle Obama?” and then followed the chain of “up next” suggestions, for example, most of the recommended videos said she “is a man”. More than 80% of the YouTube-recommended videos about the pope detected by his program described the Catholic leader as “evil”, “satanic” or “the anti-Christ”. There were millions of videos uploaded to YouTube to satiate the algorithm's craving for content claiming the earth is flat. “On YouTube, fiction is outperforming reality,” Chaslot says.