The Illusion of Choice: How Streaming Algorithms Limit What You Watch
Think you're choosing what to watch? Think again. A look at how streaming and social media algorithms create an illusion of choice, and what the research says about breaking free.
This post is an excerpt from my book, Lost in the Stream, available now wherever books are sold.
How did you first hear about the last movie you watched? For most of you, the answer is probably an online source. But it was also largely out of your control. If you saw it in a social media post or from a content creator like me, an algorithm pushed that content in front of you. If you saw a movie poster on Reddit, an algorithm pushed it to your front page. If a Google news alert surfaced a Variety article featuring an interview with Ben Affleck about his new movie, an algorithm decided to prompt you with that link.
This is what I, and experts in this field of study, would call “the illusion of choice.” We have more ways than ever to get our news, connect with our friends, listen to music, engage with our communities, and, of course, watch movies and TV shows. But we aren’t truly choosing any of it. It is all driven by machines that suck up our data and spit it back out as tightly curated experiences they believe we will enjoy.
While there are no definitive numbers on how many algorithms we interact with each day, educated guesses by industry experts put the figure at several hundred, if not 1,000 or more. There are algorithms shaping our healthcare, education, careers, criminal justice system, and, of course, entertainment. If you are looking for a deeper dive into algorithms and their massive impact on modern society, I highly recommend Hannah Fry’s book Hello World: Being Human in the Age of Algorithms, a fascinating and occasionally terrifying look at this recent phenomenon.
These algorithms can also be difficult to break, or reset, which amplifies the echo chamber users can get trapped in. A fascinating recent study from the University of Pennsylvania’s Annenberg School for Communication and its Computational Social Science Lab set out to measure how much YouTube’s algorithm, versus user input, shaped the content users saw on the platform. The study’s main goal was to investigate the algorithm’s role in the radicalization of individuals. For me, though, the most interesting part was the experiment the team ran to determine how long it took to “break free” from the algorithm’s grasp.
To do this, the researchers took a bot account with a long history (120 videos) of watching far-right news and switched the bot’s consumption to more moderate news channels for the next sixty videos. The findings surprised me. From the report:
If partisan consumers switch to moderate content, YouTube’s sidebar recommender “forgets” their partisan preference within roughly thirty videos regardless of their prior history, while homepage recommendations shift more gradually toward moderate content.
In the world of YouTube, thirty videos equates to roughly six hours of viewing, assuming an average length of twelve minutes per video. For many people, this may not take long at all: a couple of days of consistent viewing and you could “reset” your algorithm. However, the study did not account for all the other data points YouTube would hold on a typical user within the Google ecosystem. The bots had no documented history of search queries, emails, website logins via Google, or the wealth of phone data collected from Android users. With that data included, I suspect it would take even longer than thirty videos to break free from the algorithm’s hold.
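The back-of-envelope arithmetic above can be sketched in a few lines. The average video length and the daily viewing time here are assumptions for illustration, not figures from the study:

```python
# Rough estimate of how long an algorithm "reset" might take,
# based on the thirty-video figure discussed above.
VIDEOS_TO_RESET = 30         # videos before the sidebar "forgets" (per the study)
AVG_VIDEO_MINUTES = 12       # assumed average YouTube video length
DAILY_VIEWING_MINUTES = 120  # assumed: two hours of viewing per day

total_minutes = VIDEOS_TO_RESET * AVG_VIDEO_MINUTES
total_hours = total_minutes / 60
days_needed = total_minutes / DAILY_VIEWING_MINUTES

print(f"Total viewing time: {total_hours:.0f} hours")   # 6 hours
print(f"Days at two hours/day: {days_needed:.0f}")      # 3 days
```

Change the assumed daily viewing time and you can see why "a couple of days" holds only for fairly heavy viewers.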
If we apply some of that data to streaming apps and assume they use algorithms similar to YouTube’s, breaking free is likely an even harder task. Even if a streaming algorithm required less content, say ten to fifteen movies or TV shows, that still represents days or weeks of viewing for the average user. It would also require a consistent shift, which might be even harder to pull off.
For example, say you had a month or two where you were on a big comedy kick. Every time you opened Netflix, you went straight to I Think You Should Leave or Seinfeld or Arrested Development. Naturally, the algorithm is going to recommend similar content to keep you engaged. But say October rolls around and you want some horror movies, or it’s July and you are in the mood for summer romance movies. Well, you won’t see those recommended, even after several watches. And if you do jump back into Seinfeld in the middle of this shift, the algorithm will snap right back to its old habits.
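The dynamic above can be illustrated with a toy model. Real streaming recommenders are far more complex and their internals are not public; this is just a simple exponential-moving-average sketch of how a genre preference might build, erode, and snap back:

```python
def update_preference(score, watched_genre, target_genre, rate=0.1):
    """Nudge the preference score toward 1.0 when the watched genre
    matches the target genre, and toward 0.0 otherwise (a toy
    exponential moving average, not any real service's algorithm)."""
    signal = 1.0 if watched_genre == target_genre else 0.0
    return score + rate * (signal - score)

comedy = 0.0

# A month of comedy viewing builds a strong comedy preference.
for _ in range(30):
    comedy = update_preference(comedy, "comedy", "comedy")
print(f"After a comedy kick:       {comedy:.2f}")

# Ten horror watches only partially erode it...
for _ in range(10):
    comedy = update_preference(comedy, "horror", "comedy")
print(f"After ten horror films:    {comedy:.2f}")

# ...and a few returns to Seinfeld pull it back up quickly.
for _ in range(3):
    comedy = update_preference(comedy, "comedy", "comedy")
print(f"After revisiting Seinfeld: {comedy:.2f}")
```

The asymmetry is the point: a long history decays slowly under new behavior, but a handful of "relapse" watches recovers much of the old preference.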
There are ways to manually reset recommendations, usually buried in user settings, by deleting your existing history and starting fresh. But even then, you are served recommendations based on what is trending and popular with the masses. The reality is that you can’t put people into a box, yet that is exactly what these algorithms try to do. Humans don’t fit neatly into a single category; we are too complex for that. Algorithms will continue to improve and get “smarter,” but as long as they rely on past user input for their learning, a streaming service will struggle to make a bold recommendation or challenge you to try something new the way a human thinking outside the box can.
Is it depressing? Sure. Scary? Yep. But luckily, we don’t have to live that way. There are ways out of the bubble, ways to break free of the algorithm and find movies that aren’t trending. There are so many great movies out there, even from just the past decade, that you have likely missed. And if you are a younger millennial or part of Gen Z, even more “old” movies go unloved by these algorithms because they aren’t “sexy” and new. They don’t drive clicks and new sign-ups, so some of them simply disappear from the zeitgeist.
But they don’t have to disappear. You control what you watch. You control where you click. You control what you buy. And it is time that we all took back that control.