An ex-YouTube insider reveals how its recommendation algorithm promotes divisive clips and conspiracy videos. Did they harm Hillary Clinton's bid for the presidency?

It was one of January's most viral videos. Logan Paul, a YouTube celebrity, stumbles across a dead man hanging from a tree. The 22-year-old, who is in a Japanese forest famous as a suicide spot, is visibly shocked, then amused. "Dude, his hands are purple," he says, before turning to his friends and giggling. "You never stand next to a dead guy?"

Paul, who has 16 million mostly teen subscribers to his YouTube channel, removed the video from YouTube 24 hours later amid a furious backlash. But that was still long enough for the footage to receive 6m views and a spot on YouTube's coveted list of trending videos.

The next day, I watched a copy of the video on YouTube. Then I clicked on the "Up next" thumbnails of recommended videos that YouTube showcases on the right-hand side of the video player. This conveyor belt of clips, which auto-play by default, seems designed to entice us to spend more time on Google's video-broadcasting platform. I was curious where they might lead.

The answer was a slew of videos of men mocking distraught teenage fans of Logan Paul, followed by CCTV footage of children stealing things and, a few clicks later, a video of children having their teeth pulled out with bizarre, homemade contraptions.

I had cleared my history, deleted my cookies, and opened a private browser to be sure YouTube was not personalising recommendations. This was the algorithm taking me on a journey of its own volition, and it culminated with a video of two boys, aged about five or six, punching and kicking one another.

"I'm going to post it on YouTube," said a teenage girl, who sounded like she might be an older sibling. "Turn around and punch the heck out of that little boy." They scuffled for several minutes until one had knocked the other's tooth out.

* * *

There are 1.5 billion YouTube users in the world, which is more than the number of households that own televisions. What they watch is shaped by this algorithm, which skims and ranks billions of videos to identify 20 "up next" clips that are both relevant to a previous video and most likely, statistically speaking, to keep a person hooked on their screen.

Company insiders tell me the algorithm is the single most important engine of YouTube's growth. In one of the few public explanations of how the formula works – an academic paper that sketches the deep neural networks behind the algorithm, crunching a vast pool of data about videos and the people who watch them – YouTube engineers describe it as one of the "largest scale and most sophisticated industrial recommendation systems in existence".
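The paper describes a two-stage funnel: a candidate-generation network narrows the catalogue to a few hundred related videos, and a ranking network orders them by predicted engagement. Here is a deliberately simplified sketch of that general design, not YouTube's actual code or model; every name and number below is invented for illustration, and the real system uses learned neural models rather than the toy stand-ins here.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    topic: str
    predicted_watch_minutes: float  # stand-in for a trained model's output

def recommend_up_next(current: Video, catalogue: list[Video], k: int = 20) -> list[Video]:
    # Stage 1: candidate generation. Keep only videos plausibly related to
    # the one just watched (the real system uses learned embeddings, not an
    # exact topic match).
    candidates = [v for v in catalogue if v.topic == current.topic and v is not current]

    # Stage 2: ranking. Order candidates by how long the model predicts the
    # viewer will keep watching, then keep the top k for the "up next" rail.
    candidates.sort(key=lambda v: v.predicted_watch_minutes, reverse=True)
    return candidates[:k]

# Made-up data to show the incentive: with watch time as the objective,
# the most gripping clip wins regardless of its quality or truthfulness.
catalogue = [
    Video("Sober news report", "logan paul", 3.0),
    Video("Outraged reaction clip", "logan paul", 9.5),
    Video("Conspiracy deep-dive", "logan paul", 14.0),
]
print([v.title for v in recommend_up_next(catalogue[0], catalogue, k=2)])
# -> ['Conspiracy deep-dive', 'Outraged reaction clip']
```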

Lately, it has also become one of the most controversial. The algorithm has been found to be promoting conspiracy theories about the Las Vegas mass shooting and incentivising, through recommendations, a thriving subculture that targets children with disturbing content such as cartoons in which the British children's character Peppa Pig eats her father or drinks bleach.

Lewd and violent videos have been algorithmically served up to toddlers watching YouTube Kids, a dedicated app for children. One YouTube creator who was banned from making advertising revenue from his strange videos – which featured his children receiving flu shots, removing earwax, and exclaiming over dead pets – told a reporter he had only been responding to the demands of Google's algorithm. "That's what got us out there and popular," he said. "We learned to fuel it and do whatever it took to please the algorithm."

Google has responded to these controversies in a process akin to Whac-A-Mole: expanding the army of human moderators, removing offensive YouTube videos identified by journalists and demonetising the channels that make them. But none of those moves has diminished a growing sense that something has gone profoundly awry with the artificial intelligence powering YouTube.

Yet one stone has so far been largely unturned. Much has been written about Facebook and Twitter's impact on politics, but in recent months academics have speculated that YouTube's algorithms may have been instrumental in fuelling disinformation during the 2016 US presidential election. "YouTube is the most overlooked story of 2016," Zeynep Tufekci, a widely respected sociologist and technology critic, tweeted back in October. "Its search and recommender algorithms are misinformation engines."

If YouTube's recommendation algorithm actually has evolved to promote more disturbing content, how did that happen? And what is it doing to our politics?

‘Like reality, but distorted'

[Video: How YouTube's algorithm distorts reality – video explainer, 3:13 – https://www.youtube.com/watch?v=aTxUetlqWmU]