The Right to Reset on Social Media
Our social feeds have been taken over by algorithms. Here's a way we can put some power back in users' hands.
It's happened gradually, but over the last decade or so, we've lost control over the content that we're exposed to on social media. I don't mean that our feeds have gotten worse or less interesting — if anything, the opposite is probably true. But the determination of what posts show up when and for whom has evolved quite a bit.
"So what?" you may ask, and that's fair. Maybe you sense a protest coming, and fear that I may be like one of those people who was afraid of elevators in the 1800s. But I'm a daily user of social media. I love my TikTok For You feed. I'm not deleting Instagram.
Here's the thing: Our feeds have moved from being hand-picked (posts from accounts that we follow) to algorithmic (posts that the systems predict we'll like). That on its own isn't bad. The problem is that there aren't enough human controls to ensure that these algorithms work for us and not the other way around.
The Oh-So-Simple No-Algorithm Era
Early Facebook, Twitter, and Instagram were all about recency. When we opened any of those apps in their first few years of existence, the very top post was the newest, the next post was the next newest, and so on. Sure, Twitter was for words and Instagram was for pictures, but in this respect, they worked exactly the same: what was recent was first.
Another commonality those networks shared: We, the users, explicitly chose the accounts that we wanted to see content from. By friending and liking on Facebook and following on Twitter and Instagram, we were essentially subscribing to our friends' and favorite creators' content. It was understood that this was what would go into our feeds — nothing else.
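For the technically curious, that whole era can be summarized in a few lines of code. Here's a minimal sketch in Python (the `Post` structure and field names are my own illustration, not any platform's actual implementation):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    text: str
    created_at: datetime

def chronological_feed(posts: list[Post], following: set[str]) -> list[Post]:
    """The early-era feed: only accounts the user follows, newest first."""
    subscribed = [p for p in posts if p.author in following]
    return sorted(subscribed, key=lambda p: p.created_at, reverse=True)
```

No predictions, no scores. Just your subscriptions, in reverse-chronological order.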
But what if a friend posted something six hours ago, something that you'd be really interested in, and it got buried in your feed under other newer but less interesting posts? Enter algorithms.
The First Change to the Feed: "Popular" Instead of "New"
Over the next few years, each major social media app transformed how their feeds were sorted by employing relatively simple algorithms.
Facebook introduced a re-worked News Feed in 2009, with posts now ranked by "engagement" rather than recency. (Recency still figured into the algorithm, but at a lesser weight.) Instagram and Twitter held out longer, not moving to algorithmically sorted feeds until 2016.
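None of these companies published their formulas, so everything specific below (the weights, the engagement mix, the decay rate) is an illustrative guess on my part. But a ranking that puts engagement first and recency at a lesser weight might look something like this:

```python
import math
from datetime import datetime, timezone

# Hypothetical weights; the real News Feed formula was never public.
ENGAGEMENT_WEIGHT = 1.0
RECENCY_WEIGHT = 0.3
HALF_LIFE_HOURS = 24  # how quickly a post's recency bonus fades

def rank_score(likes: int, comments: int, shares: int,
               created_at: datetime) -> float:
    # Log-dampened engagement, so a viral post doesn't dominate forever.
    engagement = math.log1p(likes + 2 * comments + 3 * shares)
    age_hours = (datetime.now(timezone.utc) - created_at).total_seconds() / 3600
    recency = 0.5 ** (age_hours / HALF_LIFE_HOURS)  # exponential decay
    return ENGAGEMENT_WEIGHT * engagement + RECENCY_WEIGHT * recency

# The feed becomes: posts sorted by rank_score, not by created_at.
```

The details don't matter; the shift does. The sort key changed from a timestamp you could verify to a score you couldn't.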
Before, these companies were just facilitating connection — now, they were playing publisher, determining, by way of their algorithms, which posts we would see first.
I can't think of an analog equivalent that communicates the sneaky significance of this change. But this first step — when the social platforms started ranking and sorting the content that we see — was as consequential as any other development in social media since Mark Zuckerberg hit "publish" on thefacebook.com in 2004.
The Second Change to the Feed: "You May Also Like"
The social platforms weren't done tinkering with our feeds — they were just getting started.
When my high school class was starting on Facebook in 2006 and 2007, our networks were made up of our friends and classmates. There was a broad range of things we could comfortably post about. (And we've since spent years deleting it all!)
As time passed and aunts, uncles, neighbors, parents, professional acquaintances, etc. joined Facebook, the list of "acceptable" things to share shrank. College announcements, engagements, moves, new jobs, babies...only the big things remained, and sometimes, not even those. We began revealing much less of ourselves, and doing it less frequently.
This phenomenon is called context collapse. In social media, it's what happens when you flatten yourself online to make your content palatable for a diverse audience of connections.
So while Facebook's user base grew dramatically, people — or at least an important set of Facebook users — began sharing less. Zuckerberg's Law (his 2008 prediction that the amount people share would double every year) proved to be more like Zuckerberg's Wish.
This shift in behavior meant that there would be fewer posts from friends in our News Feeds, which would mean less content to sell ads against, which would result in less revenue to satisfy the demands of a once-in-a-generation company¹.
So what happens? Facebook (and eventually Instagram and Twitter) starts showing us posts that we didn't explicitly ask for. They use words like "recommendations" and "you may like" to describe this content, which slides in between and competes with our friends' and favorite creators' posts. And again: This on its own isn't bad! Algorithms surface content that we may be interested in but wouldn't otherwise see or know about. But they get it wrong sometimes, too.
Our feeds were completely changed and we didn't really notice. Maybe we should have. Facebook's mission statement until 2017 was, "Making the world more open and connected." An honest mission statement then and now would be "Keeping people swiping for as long as possible."
The Third Change to the Feed: TikTok, A New Beast
If these initial steps — changing how our friends' posts are sorted and inserting recommended content — cracked the door open for platform-manipulated feeds, TikTok kicked the thing down.
In the United States, the TikTok story starts with Musical.ly, a lip-syncing app that launched in 2014². Musical.ly had some unique features, particularly on the creation side, but it basically felt like other American social media apps (though the company was based in China).
In 2017, ByteDance, a Chinese tech conglomerate, acquired Musical.ly and combined it with TikTok (which ByteDance had successfully operated in China under the name Douyin). Over the next couple of years, ByteDance blanketed the American market with an aggressive ad spend and grew TikTok's user base very, very quickly.
I'll let the great Ben Thompson explain what makes the TikTok product so unique:
ByteDance’s breakthrough product was a news app called TouTiao; whereas Facebook evolved from being primarily a social network to an algorithmic feed, TouTiao was about the feed and the algorithm from the beginning. The first time a user opened TouTiao, the news might be rather generic, but every scroll, every linger over a story, every click, was fed into a feedback loop that refined what it was the user saw.
Meanwhile all of that data fed back into TouTiao’s larger machine learning processes, which effectively ran billions of A/B tests a day on content of all types, cross-referenced against all of the user data it could collect. Soon the app was indispensable to its users, able to anticipate the news they cared about with nary a friend recommendation in sight. That was definitely more of a feature than a bug in China, where any information service was subject to not just overt government censorship, but also an expectation of self-censorship; all the better to control everything that end users saw, without the messiness of users explicitly recommending content themselves[...]
On TikTok, the algorithm is the product. So instead of asking you to find your friends, invite your friends, or set your interests, TikTok starts by just serving you videos, confident that its algorithm and the many signals that it employs will soon know everything there is to know about you — or at least enough to keep you swiping, which is all that matters.
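To make that feedback loop concrete, here's a toy sketch in the spirit of what Thompson describes: every view nudges an interest profile, and candidate videos are scored against it. The topic vectors, the learning rate, and the watch-fraction signal are all simplifying assumptions on my part; real recommenders use large learned models, not a dictionary of weights.

```python
LEARNING_RATE = 0.1  # illustrative; how fast the profile adapts

def update_interests(profile: dict[str, float],
                     video_topics: dict[str, float],
                     watch_fraction: float) -> None:
    """Nudge the user's interest profile toward videos they lingered on."""
    for topic, weight in video_topics.items():
        old = profile.get(topic, 0.0)
        # Watching a video to the end pulls the profile toward its topics;
        # an instant swipe-away (watch_fraction near 0) pushes it away.
        signal = weight * (2 * watch_fraction - 1)
        profile[topic] = (1 - LEARNING_RATE) * old + LEARNING_RATE * signal

def score_candidate(profile: dict[str, float],
                    video_topics: dict[str, float]) -> float:
    """Rank candidate videos by similarity to the current profile."""
    return sum(profile.get(t, 0.0) * w for t, w in video_topics.items())
```

Notice what's missing: any explicit input from the user. You never tell the system what you want. It watches you watching, and that's enough.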
A New Challenge for Humans, Tech, and Washington
Because social media is (still) a truly new medium, the industry benefits from lawmakers' misunderstanding and enjoys lax regulation. Roll the clip of Zuckerberg explaining, "Senator, we run ads."
There's not nearly enough legislative curiosity about the effects or inner workings of a technology that the average American uses moments after waking up, moments before falling asleep, and for hours in between.
When public conversation does bubble up around policy and the social media industry, it's usually about topics like privacy, election integrity, and moderation — all important issues, to be sure, but separate from the feed itself, which facilitates our everyday usage of these apps.
Instead, the major social platforms are mostly left to govern themselves. And to be clear, there are changes that they push in the interest of user safety. (A cynic could suggest that these moves are token improvements that can be cited under pressure in Washington to keep the heat down, but I'll choose to assume good intent.) In fact, there are roles at these companies entirely dedicated to user safety and mental well-being. Unfortunately, we can only guess what happens when those teams’ interests conflict with their companies’ pursuit of profit.
One particular hazard of algorithmic feeds is their ability to send users down dangerous rabbit holes. Consider the 14-year-old girl getting inundated with dangerous weight-loss posts — a spiral that can begin with a single "like." Or the 23-year-old man who watches a couple of fringe political videos and soon has a feed flooded with conspiracy content. These spirals are self-reinforcing, and it can be hard for those caught in them to see the forest for the trees.
What's worse, there's no clear, easy way to course-correct our information diet on these platforms. Think about it: If you want to start eating better today, you can throw out your junk food, go to the grocery store, and shop for healthier options. If you don't want your brain to be polluted with low-quality content, there's no equivalent solution on social media, save logging off for good, which isn't practical for many, least of all young people.
The Idea
There's a lot that can be done, both by companies and by lawmakers, but a couple thousand words into this essay, I won't push my luck with your attention, and will share just one idea here.
I propose that on any social or entertainment app with an algorithmically determined experience, users should be given easy access to a button that would blow up the algorithm's learnings and assumptions about them, and then prompt the user to explicitly re-define their own interests.
From there, the algorithm would begin working again, starting fresh, with the user's self-defined interests prominently factored into its calculations.
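Continuing the toy profile model from earlier, the reset itself could be as simple as this sketch (the seed weight of 1.0 is an arbitrary assumption; the point is that everything learned gets wiped and the user's explicit choices become the starting signal):

```python
def reset_algorithm(profile: dict[str, float],
                    declared_interests: list[str]) -> dict[str, float]:
    """Discard everything learned; start fresh from explicit interests."""
    profile.clear()  # blow up the algorithm's accumulated assumptions
    for topic in declared_interests:
        profile[topic] = 1.0  # the user's own input, prominently weighted
    return profile

# Example: a user resets and declares what they actually want to see.
profile = reset_algorithm({}, ["cooking", "basketball", "woodworking"])
```

The engineering here is trivial. The hard part is the incentive: a well-tuned profile is the platform's most valuable asset, which is exactly why users, not platforms, need to hold the detonator.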
As-is, our options are limited. We can mark posts as "Not Interested," but the process is tedious, opaque (what will this action actually do?), and somewhat hidden. We can delete our account and start over, but then we lose the authentic connections we do have.
This button would give users an always-there choice. If you’re unhappy with your feed, here’s a way to remake it. And it doesn't have to be reserved only for instances of dangerous content, like those cited above — it could be pushed by anyone drowning in dumb/irrelevant/annoying posts that aren’t serving their well-being and happiness.
The result? Fewer empty (digital) calories. Time better spent. Some sense of control.
It’s Still Early, But It’s Time to Act
I'm not arguing for a return to social media's primitive beginnings. I'm arguing that we're still in a primitive beginning, at least in our understanding of the social, societal, and psychological implications of this medium. But the clock is ticking, and new tech is coming.
I touched on TikTok's entirely algorithmically determined feed — what happens when networks can create entirely synthetic content, meant just for us? Who's determining how that will work? What interests and incentives will they have, beyond just keeping people swiping?
Before we introduce new forms to consume content, perhaps we should debate and develop ways to control the formerly unfathomable, sometimes dangerous, always interesting sea of content that we already navigate in our feeds on a daily basis.
Let’s start with a button that lets us blow it all up.
¹ This is almost definitely more complex than I’ve breezily outlined here. Context collapse isn’t the only reason why Facebook and its competitors decided to start inserting recommended content, but it pretty clearly played a part. Also: it cannot be overstated how important Snap’s invention of the stories format was to Meta. Just as things were drying up in the feed, stories provided an intimate, low-pressure space for users to share freely.
² Musical.ly was an interesting product study in its own right. This talk, given by founder Alex Zhu about growing a user community, is a hidden gem for anyone building a consumer social product.
Thanks for reading! This is the second of four internet essays I’m going to write for my third Season of Writing. They’ll all be out by the end of March 2023. If you liked this one, please subscribe below, and you’ll receive my writing in your inbox.