Imagine a world where a few companies have the power to control the discourse of almost 4 billion people. These companies decide independently who gets to work with the most essential tools of the 21st century.
You don’t have to, because it already exists.
On May 5, Facebook’s Oversight Board decided to maintain former President Donald Trump’s ban from the platform, citing his incitement of the US Capitol riot on Jan. 6. In an era where social media is at the forefront of societal discourse, we need to rethink how social media companies operate and make decisions, and place reasonable checks on their power.
To simply say social media is popular greatly understates how important it is in our world. Sites like Facebook, YouTube, and Twitter have transformed our culture. Social media is used by half the global population, with the average person holding 8.6 social media accounts and spending 2 hours and 24 minutes a day on these sites. These are just a few of the many statistics that demonstrate social media’s prominence.
When people talk about the idea of banning users, or “deplatforming,” the most common defense is that “they’re private companies, they can do whatever they want.” And while Donald Trump’s Twitter ban doesn’t violate any US laws, this argument downplays how essential social media is. Our societal infrastructure relies heavily on it. Twitter has become a hub for breaking news. YouTube has become a beacon for video sharing, with content ranging from cooking tutorials to full-length movies. Social media has even shaped the political realm, with movements like the Arab Spring being documented online and US politicians announcing policy proposals on Twitter.
Social media is the new public square, and it’s unreasonable to keep these companies as private institutions when their products are used in all aspects of life. Social media is only getting bigger, and we need to start treating it like the public utility it has become.
Advocates of social media censorship argue that misinformation and hate speech are a threat and that deplatforming is a necessary move to protect our democracy. This argument is noble in its pursuit. We’ve seen the horrific effects of misinformation through Alex Jones spreading Sandy Hook conspiracies, Pizzagate, and Donald Trump’s lies about the 2020 election.
Hate speech and misinformation have terrible outcomes that need to be taken seriously, no doubt. But consider this: Who gets to define what hate speech and misinformation are? Some definitions might be reasonable, but what about Donald Trump’s? The former president was a pathological liar who refused to take any sort of criticism. If he were put in charge of who can and can’t post on Twitter, it’s fair to say that wouldn’t go well. The point is that these questions are vaguer than we think, and the answers will change depending on who you ask. These are genuinely tough decisions, and it’s wrong to put the responsibility solely on the shoulders of these corporations. This issue needs far more oversight, requiring the input of a broad community of qualified people from all over the globe.
In regard to misinformation, the idea of having some sort of “department of truth” is faulty for a number of reasons. Error is built into human nature. Not everyone will have an accurate picture of what’s true and what isn’t. Mistakes are made all the time, and no social media user, journalist, or news station is above that. The problem with having a select group of people make these important decisions is that it creates an incredible imbalance of power.
And that’s the key issue. Having large corporations with hierarchical power structures make decisions that impact almost 4 billion people creates a dangerous dynamic between people and the powerful. It leaves almost no way to place reasonable checks and regulations on their power, which can lead to serious setbacks for freedom of speech and of the press.
Hate speech and misinformation are dangerous and need to be addressed in some form, but we also need to confront the unbalanced power structures of the social media companies themselves.