It’s over. Facebook is in decline, Twitter in chaos. Mark Zuckerberg’s empire has lost hundreds of billions of dollars in value and laid off 11,000 people, with its ad business in peril and its metaverse fantasy in irons. Elon Musk’s takeover of Twitter has caused advertisers to pull spending and power users to shun the platform (or at least to tweet a lot about doing so). It’s never felt more plausible that the age of social media might end—and soon.
Now that we’ve washed up on this unexpected shore, we can look back with fresh eyes at the shipwreck that left us here. Perhaps we can find some relief: Social media was never a natural way to work, play, and socialize, though it did become second nature. The practice evolved via a weird mutation, one so subtle that it was difficult to spot happening in the moment.
The shift began 20 years ago or so, when networked computers became sufficiently ubiquitous that people began using them to build and manage relationships. Social networking had its problems—collecting friends instead of, well, being friendly with them, for example—but they were modest compared with what followed. Slowly and without fanfare, around the end of the aughts, social media took its place. The change was almost invisible, but it had enormous consequences. Instead of facilitating the modest use of existing connections—largely for offline life (to organize a birthday party, say)—social software turned those connections into a latent broadcast channel. All at once, billions of people saw themselves as celebrities, pundits, and tastemakers.
A global broadcast network where anyone can say anything to anyone else as often as possible, and where such people have come to think they deserve such a capacity, or even that withholding it amounts to censorship or suppression—that’s just a terrible idea from the outset. And it’s a terrible idea that is completely bound up with the concept of social media itself: systems erected and used exclusively to deliver an endless stream of content.
But now, perhaps, it can also end. The possible downfall of Facebook and Twitter (and others) is an opportunity—not to shift to some equivalent platform, but to embrace their ruination, something previously unthinkable.
A long time ago, many social networks walked the Earth. Six Degrees launched in 1997, named after a Pulitzer-nominated play based on a psychological experiment. It shut down soon after the dot-com crash of 2000—the world wasn’t ready yet. Friendster arose from its ashes in 2002, followed by MySpace and LinkedIn the next year, then Hi5 and Facebook in 2004, the latter for students at select colleges and universities. That year also saw the arrival of Orkut, made and operated by Google. Bebo launched in 2005; eventually both AOL and Amazon would own it. Google Buzz and Google+ were born and then killed. You’ve probably never heard of some of these, but before Facebook was everywhere, many of these services were immensely popular.
Content-sharing sites also acted as de facto social networks, allowing people to see material posted mostly by people they knew or knew of, rather than from across the entire world. Flickr, the photo-sharing site, was one; YouTube—once seen as Flickr for video—was another. Blogs (and bloglike services, such as Tumblr) raced alongside them, hosting “musings” seen by few and engaged by fewer. In 2008, the Dutch media theorist Geert Lovink published a book about blogs and social networks whose title summarized their average reach: Zero Comments.
Today, people refer to all of these services and more as “social media,” a name so familiar that it has ceased to bear meaning. But two decades ago, that term didn’t exist. Many of these sites framed themselves as part of a “web 2.0” revolution in “user-generated content,” offering easy-to-use, easily adopted tools on websites and then mobile apps. They were built for creating and sharing “content,” a term that had previously meant “satisfied” when pronounced differently. But at the time, and for years, these offerings were framed as social networks or, more often, social-network services. So many SNSes proliferated that a joke acronym arose: YASN, or “yet another social network.” These things were everywhere, like dandelions in springtime.
As the original name suggested, social networking involved connecting, not publishing. By connecting your personal network of trusted contacts (or “strong ties,” as sociologists call them) to others’ such networks (via “weak ties”), you could surface a larger network of the trusted contacts of trusted contacts. LinkedIn promised to make job searching and business networking possible by traversing the connections of your connections. Friendster did so for personal relationships, Facebook for college mates, and so on. The whole idea of social networks was networking: building or deepening relationships, mostly with people you knew. How and why that deepening happened was largely left to the users to decide.
That changed when social networking became social media around 2009, between the introduction of the smartphone and the launch of Instagram. Instead of connection—forging latent ties to people and organizations we would mostly ignore—social media offered platforms through which people could publish content as widely as possible, well beyond their networks of immediate contacts. Social media turned you, me, and everyone into broadcasters (if aspirational ones). The results have been disastrous but also highly pleasurable, not to mention massively profitable—a catastrophic combination.
The terms social network and social media are used interchangeably now, but they shouldn’t be. A social network is an idle, inactive system—a Rolodex of contacts, a notebook of sales targets, a yearbook of possible soul mates. But social media is active—hyperactive, really—spewing material across those networks instead of leaving them alone until needed.
A 2003 paper published in Enterprise Information Systems made an early case that drives the point home. The authors propose social media as a system in which users participate in “information exchange.” The network, which had previously been used to establish and maintain relationships, becomes reinterpreted as a channel through which to broadcast.
This was a novel concept. When News Corp, a media company, bought MySpace in 2005, The New York Times called the website “a youth-oriented music and ‘social networking’ site”—complete with scare quotes. The site’s primary content, music, was seen as separate from its social-networking functions. Even Zuckerberg’s vision for Facebook, to “connect every person in the world,” implied a networking function, not media distribution.
The toxicity of social media makes it easy to forget how truly magical this innovation felt when it was new. From 2004 to 2009, you could join Facebook and everyone you’d ever known—including people you’d definitely lost track of—was right there, ready to connect or reconnect. The posts and photos I saw characterized my friends’ changing lives, not the conspiracy theories that their unhinged friends had shared with them. LinkedIn did the same thing with business contacts, making referrals, dealmaking, and job hunting much easier than they had been previously. I started a game studio in 2003, when LinkedIn was brand new, and I inked our first deal by working connections there.
Twitter, which launched in 2006, was probably the first true social-media site, even if nobody called it that at the time. Instead of focusing on connecting people, the site amounted to a giant, asynchronous chat room for the world. Twitter was for talking to everyone—which is perhaps one of the reasons journalists have flocked to it. Sure, a blog could technically be read by anybody with a web browser, but in practice finding that readership was hard. That’s why blogs operated first as social networks, through mechanisms such as blogrolls and linkbacks. But on Twitter, anything anybody posted could be seen instantly by anyone else. What’s more, unlike posts on blogs or images on Flickr or videos on YouTube, tweets were short and low-effort, making it easy to post many of them a week, or even a day.
The notion of a global “town square,” as Elon Musk has put it, emerges from all of these factors. On Twitter, you can instantly learn about a tsunami in Tōhoku or an omakase in Topeka. This is also why journalists became so dependent on Twitter: It’s a constant stream of sources, events, and reactions—a reporting automat, not to mention an outbound vector for media tastemakers to make tastes.
In retrospect, social media had already arrived in spirit, if not in name. RSS readers offered a feed of blog posts to catch up on, complete with unread counts. MySpace fused music and chatter; YouTube did it with video (“Broadcast Yourself”). In 2005, at an industry conference, I remember overhearing an attendee say, “I’m so behind on my Flickr!” What does that even mean? I recall wondering. But now the answer is obvious: creating and consuming content for any reason, or no reason. Social media was overtaking social networking.
Instagram, launched in 2010, might have built the bridge between the social-network era and the age of social media. It relied on the connections among users, but as a mechanism to distribute content, which became the primary activity. But soon enough, all social networks became social media first and foremost. When groups, pages, and the News Feed launched, Facebook began encouraging users to share content published by others in order to increase engagement on the service, rather than to provide updates to friends. LinkedIn launched a program to publish content across the platform, too. Twitter, already principally a publishing platform, added a dedicated “retweet” feature, making it far easier to spread content virally across user networks.
Other services arrived or evolved in this vein, among them Reddit, Snapchat, and WhatsApp, all far more popular than Twitter. Social networks, once latent routes for possible contact, became superhighways of constant content. In their latest phase, their social-networking aspects have been pushed deep into the background. Although you can connect TikTok to your contacts and follow specific users, you are more likely to simply plug into a continuous flow of video content that has oozed to the surface via algorithm. You still have to connect with other users to use some of these services’ features. But connection as a primary purpose has declined. Think of the change like this: In the social-networking era, the connections were essential, driving both content creation and consumption. But the social-media era seeks the thinnest, most soluble connections possible, just enough to allow the content to flow.
Social networks’ evolution into social media brought both opportunity and calamity. Facebook and all the rest enjoyed a massive rise in engagement and the associated data-driven advertising profits that the attention-driven content economy created. The same phenomenon also created the influencer economy, in which individual social-media users became valuable as channels for distributing marketing messages or product sponsorships by means of their posts’ real or imagined reach. Ordinary folk could now make some money or even a lucrative living “creating content” online. The platforms sold them on that promise, creating official programs and mechanisms to facilitate it. In turn, “influencer” became an aspirational role, especially for young people for whom Instagram fame seemed more achievable than traditional celebrity—or perhaps employment of any kind.
The ensuing disaster was multipart. For one, social-media operators discovered that the more emotionally charged the content, the better it spread across their users’ networks. Polarizing, offensive, or just plain fraudulent information was optimized for distribution. By the time the platforms realized what was happening and the public revolted, it was too late to turn off these feedback loops.
Obsession fueled the flames. Compulsion had always plagued computer-facilitated social networking—it was the original sin. Rounding up friends or business contacts into a pen in your online profile for possible future use was never a healthy way to understand social relationships. It was just as common to obsess over having 500-plus connections on LinkedIn in 2003 as it is to covet Instagram followers today. But when social networking evolved into social media, user expectations escalated. Driven by venture capitalists’ expectations and then Wall Street’s demands, the tech companies—Google and Facebook and all the rest—became addicted to massive scale. And the values associated with scale—reaching a lot of people easily and cheaply, and reaping the benefits—became appealing to everyone: a journalist earning reputational capital on Twitter; a 20-something seeking sponsorship on Instagram; a dissident spreading word of their cause on YouTube; an insurrectionist sowing rebellion on Facebook; an autopornographer selling sex, or its image, on OnlyFans; a self-styled guru hawking advice on LinkedIn. Social media showed that everyone has the potential to reach a massive audience at low cost and high gain—and that potential gave many people the impression that they deserve such an audience.
The flip side of that coin also shines. On social media, everyone believes that anyone to whom they have access owes them an audience: a writer who posted a take, a celebrity who announced a project, a pretty girl just trying to live her life, that anon who said something afflictive. When network connections become activated for any reason or no reason, then every connection seems worthy of traversing.
That was a terrible idea. As I’ve written before on this subject, people just aren’t meant to talk to one another this much. They shouldn’t have that much to say, they shouldn’t expect to receive such a large audience for that expression, and they shouldn’t suppose a right to comment or rejoinder for every thought or notion either. From being asked to review every product you buy to believing that every tweet or Instagram image warrants likes or comments or follows, social media produced a positively unhinged, sociopathic rendition of human sociality. That’s no surprise, I guess, given that the model was forged in the fires of Big Tech companies such as Facebook, where sociopathy is a design philosophy.
If Twitter does fail, either because its revenue collapses or because the massive debt that Musk’s deal imposes crushes it, the result could help accelerate social media’s decline more generally. It would also be tragic for those who have come to rely on these platforms, for news or community or conversation or mere compulsion. Such is the hypocrisy of this moment. The rush of likes and shares felt so good because the age of zero comments felt so lonely—and upscaling killed the alternatives a long time ago, besides.
If change is possible, carrying it out will be difficult, because we have adapted our lives to conform to social media’s pleasures and torments. It’s seemingly as hard to give up on social media as it was to give up smoking en masse, as Americans did in the 20th century. Quitting that habit took decades of regulatory intervention, public-relations campaigning, social shaming, and aesthetic shifts. At a cultural level, we didn’t stop smoking just because the habit was unpleasant or uncool or even because it might kill us. We did so slowly and over time, by forcing social life to suffocate the practice. That process must now begin in earnest for social media.
Something may yet survive the fire that would burn it down: social networks, the services’ overlooked, molten core. It was never a terrible idea, at least, to use computers to connect to one another on occasion, for justified reasons, and in moderation (although the risk of instrumentalizing one another was present from the outset). The problem came from doing so all the time, as a lifestyle, an aspiration, an obsession. The offer was always too good to be true, but it’s taken us two decades to realize the Faustian nature of the bargain. Someday, eventually, perhaps its web will unwind. But not soon, and not easily.
A year ago, when I first wrote about downscaling social media, the ambition seemed necessary but impossible. It still feels unlikely—but perhaps newly plausible. That’s a victory, if a small one, so long as the withdrawal doesn’t drive us back to the addiction. To win the soul of social life, we must learn to muzzle it again, across the globe, among billions of people. To speak less, to fewer people, and less often; and for them to do the same to you, and everyone else as well. We cannot make social media good, because it is fundamentally bad, deep in its very structure. All we can do is hope that it withers away, and play our small part in helping abandon it.