In 2011, Google tapped Cristos Goodrow, who was then director of engineering, to oversee YouTube's search engine and recommendation system. Goodrow noticed another problem caused by YouTube's focus on views: it encouraged creators to use misleading tactics, like racy thumbnails, to dupe people into clicking. Even if a viewer immediately bailed, the click would goose the view count higher, boosting the video's recommendations.

Goodrow and his team decided to stop ranking videos based on clicks. Instead, they focused on “watch time,” or how long viewers stayed with a video; it seemed to them a far better metric of genuine interest.

By 2015, they would also introduce neural-net models to craft recommendations. The model would take your actions (whether you'd finished a video, say, or hit Like) and blend that with other information it had gleaned (your search history, geographic region, gender, and age, for example; a user's “watch history” became increasingly significant too). Then the model would predict which videos you'd be most likely to actually watch, and presto: recommendations, more personalized than ever.

The recommendation system became increasingly crucial to YouTube's frenetic push for growth.
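YouTube has never published this ranking code, but the shift Goodrow's team described boils down to a change of objective: score each candidate video by how long a model predicts you will watch it, not by how often it gets clicked. The sketch below is a loose illustration of that idea only; every name in it (`UserSignals`, `predicted_watch_minutes`, and so on) is hypothetical, and the “model” is a crude stand-in for a trained neural net.

```python
# Hypothetical sketch of watch-time ranking; this is not YouTube's code.
from dataclasses import dataclass

@dataclass
class UserSignals:
    watch_history: set[str]  # topics of videos this user actually finished
    region: str
    age: int

def predicted_watch_minutes(user: UserSignals, video_topic: str) -> float:
    """Stand-in for a trained neural model estimating expected watch time."""
    # A real model would learn from logged signals (finishes, Likes, search
    # history, demographics); crude topical overlap is the proxy here.
    return 5.0 if video_topic in user.watch_history else 1.0

def rank_by_watch_time(user: UserSignals, candidates: list[str]) -> list[str]:
    # Click-based ranking rewarded whatever got tapped, even if the viewer
    # bailed instantly; this objective rewards what they will keep watching.
    return sorted(candidates,
                  key=lambda topic: predicted_watch_minutes(user, topic),
                  reverse=True)

user = UserSignals(watch_history={"space", "gaming"}, region="US", age=34)
print(rank_by_watch_time(user, ["cooking", "space", "news"]))
# -> ['space', 'cooking', 'news']
```

The design point is the objective, not the model: once expected minutes watched is the score, a video that dupes a click but loses its viewer instantly no longer outranks one that people actually stay with.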
In 2012, YouTube's vice president of product, Shishir Mehrotra, declared that by the end of 2016 the site would hit a billion hours of watch time per day. It was an audacious goal; at the time, people were watching YouTube for only 100 million hours a day, compared to more than 160 million on Facebook and 5 billion on TV. So Goodrow and the engineers began thirstily hunting for any tiny tweak that would bump watch time upward. By 2014, when Susan Wojcicki took over as CEO, the billion-hour goal “was a religion at YouTube, to the exclusion of nearly all else,” as she later told the venture capitalist John Doerr.

People spent more and more time on the site, and the new code meant small creators and niche content were finding their audience. It was during this period that Sargent saw his first flat-earth video. All kinds of misinformation, some of it dangerous, rose to the top of watchers' feeds. Teenage boys followed recommendations to far-right white supremacists and Gamergate conspiracies; the elderly got stuck in loops about government mind control; anti-vaccine falsehoods found adherents. In Brazil, a marginal lawmaker named Jair Bolsonaro rose from obscurity to prominence in part by posting YouTube videos that falsely claimed left-wing scholars were using “gay kits” to convert kids to homosexuality.

As 2016 wore on and the billion-hour deadline loomed, the engineers went into overdrive. Recommendations had become the thrumming engine of YouTube, responsible for an astonishing 70 percent of all its watch time. In turn, YouTube became a key source of revenue in the Alphabet empire. Goodrow hit the target: On October 22, 2016, a few weeks before the presidential election, users watched 1 billion hours of videos on YouTube.

After the 2016 election, the tech industry came in for a reckoning. Critics laced into Facebook's algorithm for boosting conspiratorial rants and hammered Twitter for letting in phalanxes of Russian bots. In 2018, a UC Berkeley computer scientist named Hany Farid teamed up with Guillaume Chaslot to run his scraper again. This time, they ran the program daily for 15 months, looking specifically for how often YouTube recommended conspiracy videos. They found the frequency rose throughout the year; at the peak, nearly one in 10 videos recommended were conspiracist fare (a rough sketch of that tally appears at the end of this section). As Micah Schaffer, who worked at YouTube from 2006 to 2009, told me, “It really is: they are addicted to that traffic.” “It turns out that human nature is awful,” Farid tells me, “and the algorithms have figured this out, and that's what drives engagement.”

YouTube executives deny that the billion-hour push led to a banquet of conspiracies. “We don't see evidence that extreme content or misinformation is on average more engaging, or generates more viewership, than anything else,” Goodrow said. (YouTube also challenged Farid and Chaslot's research, saying it “does not accurately reflect how YouTube's recommendations work or how people watch and interact with YouTube.”)

But, within YouTube, the principle of “Broadcast Yourself,” without restriction, was colliding with concerns about safety and misinformation. On October 1, 2017, when a man used an arsenal of weapons to fire into a crowd of people at a concert in Las Vegas, YouTube users immediately began uploading false-flag videos claiming the shooting was orchestrated to foment opposition to the Second Amendment. Just 12 hours after the shooting, Geoff Samek arrived for his first day as a product manager at YouTube.
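Farid and Chaslot's scraper itself isn't reproduced in this piece, but the audit's arithmetic, as reported, is a daily tally: collect the recommendations the site surfaced that day, check each against a labeled set of conspiracy videos, and log the fraction. Here is a minimal sketch of that tally; the video IDs and the labeled set are invented for illustration, and none of this is the researchers' actual code.

```python
# Hypothetical sketch of the audit's daily tally, not Chaslot's real scraper.
CONSPIRACY_IDS = {"flat-earth-101", "vegas-false-flag"}  # hand-labeled set

def daily_conspiracy_rate(recommended: list[str]) -> float:
    """Fraction of one day's recommendations that fall in the labeled set."""
    if not recommended:
        return 0.0
    hits = sum(1 for vid in recommended if vid in CONSPIRACY_IDS)
    return hits / len(recommended)

# One number per day for 15 months; a peak near 0.10 matches the reported
# "nearly one in 10" recommended videos.
todays_recs = ["flat-earth-101", "cooking", "news", "gaming", "music",
               "sports", "travel", "tech", "movies", "diy"]
print(daily_conspiracy_rate(todays_recs))  # 0.1
```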