Sophie Bishop talking about 'algorithm bros' 😍🤜🤛 @sophiehbishop @DigiCultureKCL pic.twitter.com/bKFOEOk2W1 -- Zoe Glatt (@ZoeGlatt), November 7, 2018
The subject of her talk was a small cadre of young men who have managed to persuade millions of followers (as well as some large corporations) that they can reveal the "secrets" of the YouTube algorithm.
I want to comment on two aspects of her work in particular. First, there is the question, addressed in her previous paper, of "anxiety, panic and self-optimization". When people create content on YouTube or similar platforms, they have an interest in getting their content viewed as widely as possible - indeed, wide viewership ("going viral") is generally regarded as the measure of success. But these platforms are capricious, in the sense that they (deliberately) don't make it easy to manipulate this measure, and this generates a sense of precarity - not only among individual content providers but also among political and commercial organizations.
So when someone offers to tell you the secrets of success on YouTube, someone who is himself already successful on YouTube, it would be hard to resist the desire to learn these secrets. Or at least to listen to what they have to say. And risk-averse corporations may be willing to bung some consultancy money in their direction.
YouTube's own engineers describe the algorithm as "one of the largest scale and most sophisticated industrial recommendation systems in existence". Their models learn approximately one billion parameters and are trained on hundreds of billions of examples. The idea that a couple of amateurs without significant experience or funding can "reverse engineer" this algorithm strains credulity. Bishop points out several serious methodological flaws in their approach, while speculating that perhaps what really matters to the growth-hacking community is not what the YouTube algorithm actually does but what the user thinks it does. She notes that the results of this "reverse engineering" experiment have been widely disseminated, and presented at an event sponsored by YouTube itself.
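To give a sense of what "the algorithm" actually consists of, the candidate-generation stage that Covington et al. describe can be sketched very loosely as follows: a user's watch history is collapsed into a single embedding, and a softmax over video embeddings scores candidates. This is a toy illustration, not the real system - all names, vectors, and dimensions below are invented, whereas the production model learns roughly a billion parameters from massive interaction logs.

```python
import math

def average_embedding(vectors):
    """Collapse a watch history (a list of equal-length vectors) into one user embedding."""
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softmax(scores):
    """Turn raw similarity scores into a probability distribution."""
    m = max(scores)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def recommend(watch_history, video_embeddings, top_n=2):
    """Score every candidate video against the user embedding and
    return the top_n video ids by softmax probability."""
    user = average_embedding(watch_history)
    ids = list(video_embeddings)
    probs = softmax([dot(user, video_embeddings[v]) for v in ids])
    ranked = sorted(zip(ids, probs), key=lambda pair: pair[1], reverse=True)
    return [vid for vid, _ in ranked[:top_n]]

# Toy corpus of 2-d "embeddings" (a real model would learn these, at vastly larger scale).
corpus = {
    "cat_video": [0.9, 0.1],
    "music_video": [0.1, 0.9],
    "gaming_video": [0.8, 0.3],
}

# A user whose history leans towards cat/gaming-style content.
history = [[1.0, 0.0], [0.7, 0.2]]
print(recommend(history, corpus))  # → ['cat_video', 'gaming_video']
```

Even this caricature has knobs (embedding dimension, history weighting, candidate pool) whose learned values an outsider cannot observe, which is part of why claims to have "reverse engineered" the real thing are so doubtful.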
What is the effect of disseminating this kind of material? I don't know if it helps to make YouTubers less anxious, or conversely makes them more anxious than they were already. No doubt YouTube is happy about anything that encourages people to devote even more time to creating sticky content for YouTube. A dashboard (in this case, YouTube's Creator Studio) provides a framing device, focusing people's attention on certain metrics (financial gains and social capital), and fostering the illusion that the metrics on the dashboard are really the only ones that matter.
The other aspect of Bishop's work I wanted to discuss is the apparent gender polarization on YouTube - not only polarization of content and who gets to see which content, but also a significantly different operating style for male and female content providers. The traditional feminist view (McRobbie, Meehan) is that this polarization is a response to the commercial demands of the advertisers. But other dimensions of polarization have become apparent more recently, including political extremism, and Zeynep Tufekci argues that YouTube may be one of the most powerful radicalizing instruments of the 21st century. This hints at a more deeply rooted schismogenesis.
Meanwhile, how much of this was intended or foreseen by YouTube is almost beside the point. Individuals and organizations may be held responsible for the consequences of their actions, including unforeseen consequences.
Sophie Bishop, Anxiety, panic and self-optimization: Inequalities and the YouTube algorithm (Convergence, 24(1), 2018 pp 69–84)
Paul Covington, Jay Adams, Emre Sargin, Deep Neural Networks for YouTube Recommendations (Proceedings of the 10th ACM Conference on Recommender Systems, 2016, pp 191-198)
Paul Lewis, 'Fiction is outperforming reality': how YouTube's algorithm distorts truth (Guardian, 2 Feb 2018)
Zeynep Tufekci, Opinion: YouTube, the Great Radicalizer (New York Times, 10 March 2018)
Related posts: Ethical communication in a digital age (November 2018), Polarization (November 2018)