Social Media Platforms and the Information Exchange They Run
In the first post of our blog series on social media, we ask: Why is regulating (or not regulating) social media such a big deal?
Social media has become our one-stop-shop for online content—a place where we can access friends, news, and entertainment all at once. Within this ecosystem, platforms—like Facebook, Twitter, and TikTok—play a central role. But how platforms operate remains largely unregulated. In light of recent events—including the Cambridge Analytica scandal and the release of the Facebook Files—many are now calling for social media oversight. What’s at the heart of the demands to regulate? What types of regulation are feasible? And, if we decide to regulate, where do we start? In this blog series, we’ll explore these questions and more.
Social media began as an online social network—a place where people could connect with friends and family. However, in the years since, this simple yet powerful idea has grown and evolved. Platforms like Facebook, Twitter, and TikTok are no longer just social networks. They’re the first places we turn to for updates on anything from yesterday’s sports highlights to the latest in politics. Today, social media is our one-stop-shop for online content, providing us with news, entertainment, and visibility into the lives of friends, family, and celebrities all at once.
In many ways, social media has improved our lives. It connects us with people across the world, removes barriers to collective action, and, above all, puts information at our fingertips so that we don’t have to search for it. As with any disruptive technology, however, it’s also caused its fair share of problems.
Over the past two decades, social media has come under fire for radicalizing teens, spreading misinformation, propagating discriminatory practices, amplifying hate speech, and more. In one of the most well-known incidents, Facebook was fined $5 billion by the US Federal Trade Commission after allowing Cambridge Analytica to collect the personal data of 87 million users (often without consent) and use that data to deliver personalized political ads. In 2021, internal documents known as the Facebook Files revealed that Meta (Instagram's parent company) hid research showing that Instagram negatively affects the mental health of its teenage users. Viral social media challenges, such as swallowing Tide Pods or choking oneself until passing out, have landed users in the hospital and even resulted in deaths. On top of all this, there's evidence that platforms are sometimes aware of what causes these problems but choose not to intervene.
So, should social media platforms bear the blame for any of these problems, or are these issues an inevitable part of moving our lives online? To answer this question, let’s take a step back and explore the role that platforms play in the social media ecosystem.
Key players in social media
Other than platforms themselves, there are three key players in the social media ecosystem: users, content creators, and advertisers.
Users are the main consumers (and lifeblood) of social media. They come to social media to create an online presence, connect with friends, catch up on celebrity gossip, read breaking news, post life updates, and more. Because platforms rely on an audience for their content (and their advertisements), they work hard to make sure that users keep coming back. Luckily for platforms, a lack of users hasn't been much of a problem in recent years, with social media now counting nearly five billion users worldwide.
This massive user base has, in turn, attracted content creators. Although the first content creators were primarily users themselves (back in the day, most feeds were a mix of posts from friends and family), there are now many types of creators. Some creators (like National Geographic or ESPN) produce content that redirects traffic back to their own websites. A separate but growing population of creators, known as influencers, produce content in order to build their own personal brands, which they then use to make a living through, for example, sponsorships. No matter the type, content creators all have something they want to share, and social media has become their favorite (and, sometimes, only) outlet.
No business is complete without a source of revenue, and the players that keep social media platforms afloat are advertisers. Advertisers come to social media for access to users and, in particular, the ability to target specific users. This ability is made possible by the detailed user data that platforms collect, including everything from user demographics (like gender and education level) to fine-grained behavior (like how a user scrolls down their feed). Using this information, platforms deliver ads only to users who are likely to be receptive to them, making each ad impression on social media worth more than one in traditional advertising. Indeed, advertisers pay (in total) hundreds of billions of dollars a year to feature their products on social media.
These three players are the key ingredients to the social media recipe. Creators produce content, users consume content, and advertisers keep the whole business going. So, what role do platforms play in all this?
Social media platforms as information brokers
Unlike the other key players in social media, platforms don’t create or consume information. Instead, they mediate interactions between those who do. For instance, although an influencer may create a video, it's the platform that determines which users get to see the video on their feeds. In this way, platforms run what we might call an information exchange.
On one side of this exchange, platforms gather content—including articles, videos, and ads—and decide how to deliver this content to users. Platforms choose not only what content users see, but also when, where, and how they see it.
On the other side of the exchange, platforms collect data—like which profiles a user visits most often, what keywords a user is interested in, or the date and time of the user’s every click—then decide how to process and share this information. Platforms may, for instance, crunch the numbers on which topics are going viral and share them with influencers. Or platforms may share information that helps lenders determine a user’s creditworthiness (in fact, Facebook patented a technology that supposedly allows them to predict a user’s creditworthiness based on the user’s social network).
But platforms don’t do all this for free. They leverage their position to generate revenue, primarily through advertisements or the sale of user data. For instance, platforms often charge advertisers a small fee (known as a pay-per-click fee) every time a user clicks on one of their ads. They also earn quite a bit by gathering and selling user data to third parties, like mortgage lenders and even political campaigns.
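To make the pay-per-click model concrete, here's a back-of-the-envelope sketch of a single ad campaign's economics. All of the numbers (impressions, click-through rate, fee per click) are made up for illustration; real rates vary widely by platform and audience.

```python
# Toy pay-per-click arithmetic for one hypothetical ad campaign.
# The platform shows the ad many times, but the advertiser only
# pays when a user actually clicks.

impressions = 1_000_000      # times the ad is shown to users
click_through_rate = 0.02    # fraction of impressions that get clicked
cost_per_click = 0.50        # fee (in dollars) charged per click

clicks = impressions * click_through_rate       # 20,000 clicks
revenue = clicks * cost_per_click               # $10,000 to the platform
```

Targeting is what makes this model work at scale: the better a platform predicts who will click, the higher the click-through rate, and the more each impression is worth.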
All in all, platforms act as information brokers. They execute information "transactions" between users, creators, and advertisers, all the while taking a “cut” from each transaction.
The power of information
In many ways, we need platforms. There’s simply too much online content to consume, so we rely on platforms to curate it for us. Platforms filter out content that they think is irrelevant, and personalize what’s left to our habits and interests. It’s in large part thanks to them that what we see on social media interests us at all.
Because they provide this much-needed service, we often turn a blind eye to the level of influence that platforms have. Indeed, platforms and only platforms have access to all the posts, videos, comments, likes, and messages between users. And as information brokers, platforms are the ones who determine where each piece of information ends up. They decide what users see, how data is collected, what content can be monetized, and even what content is removed.
By controlling the flow of information, platforms can shape how we think, feel, and behave—from what we buy to who we vote for. After all, information is power, and social media platforms have a lot of it.
This level of influence means that every decision platforms make can have outsized downstream effects, sometimes for the worse. In 2016, social media provided a venue for misinformation during the US elections and the Brexit referendum, not because platforms were interested in spreading misinformation, but because optimizing content delivery for user engagement unintentionally favored fake news. Even seemingly minor choices, like how much weight to give an "angry" reaction compared to a "thumbs-up", have significant trickle-down effects. In fact, Facebook's decision to weight emoji reactions more heavily than likes unintentionally promoted toxic content and calls to violence. And it was the nuanced and challenging nature of such design decisions that caused many to worry about Musk's recent (tumultuous) purchase of Twitter.
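To see how a single weighting choice can flip what an entire user base sees, consider this toy ranking sketch. It is not any platform's real algorithm, and the posts and reaction counts are invented; it only illustrates how the same content, scored under two different reaction weights, surfaces very different feeds.

```python
# Toy illustration (not a real platform's algorithm): how the weight
# assigned to each reaction type changes which posts a feed surfaces.

def rank_feed(posts, weights):
    """Sort posts by a weighted sum of their reaction counts, highest first."""
    def score(post):
        return sum(weights.get(r, 0) * n for r, n in post["reactions"].items())
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "cat_video", "reactions": {"like": 900, "angry": 10}},
    {"id": "hot_take",  "reactions": {"like": 100, "angry": 400}},
]

# Treating every reaction equally, the widely liked post ranks first...
equal = rank_feed(posts, {"like": 1, "angry": 1})

# ...but weighting "angry" five times as much flips the ranking,
# pushing the divisive post to the top of every feed.
charged = rank_feed(posts, {"like": 1, "angry": 5})
```

One line in a configuration file, multiplied across billions of feed refreshes, is the kind of "minor" design decision the debate is about.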
So that’s why we’re here. Even though platforms provide an undeniably useful service (or maybe because they provide this service), they now have unprecedented power over information, and many are no longer willing to look the other way.
Although platforms’ rising power has attracted public scrutiny, social media remains largely unregulated in the US. For the most part, platforms set their own standards and self-regulate. Facebook, for example, tries to detect and remove problematic posts (e.g., those that encourage violence or contain hate speech) of its own accord. Similarly, Twitter has developed methods to slow the spread of misinformation by, for example, warning users when they are about to share a misleading post.
Not everyone is satisfied with self-regulation. Many argue that governmental regulation is the only way to minimize the current and future harms of social media. They believe that leaving platforms to monitor themselves without government oversight is wishful thinking.
So, should the state regulate social media? Or should social media be left to evolve on its own? In the rest of this blog series, we’ll unpack these questions. We’ll begin in the next post by discussing the current debate on social media and why regulation of this space isn’t easy.
TL;DR
By connecting us to people around the world and putting information at our fingertips, social media platforms have earned themselves a large and loyal user base.
Recently, platforms have received a lot of attention for their roles in various issues, such as the rise in misinformation, declining mental health among teenagers, lack of user privacy protections, and discriminatory online practices.
In the social media ecosystem, we can think of platforms as information brokers that run an information exchange. On one side of the exchange, platforms gather content, then decide how to distribute the content to users. On the other side, they collect data (e.g., from users) and decide who to share it with. All the while, platforms take a cut, profiting from advertisements, the sale of user data, and more.
Running this information exchange has given platforms a lot of power but also attracted a lot of scrutiny. Many are now demanding that social media be regulated, and we’ll dive into the debate on social media regulations next.
We'll come back to this analogy—and its implications for regulating social media—in a later post.