By Joshua Anderson
The year 2020 has no doubt had its share of eventful moments. With the presidential election now leaving the news cycle, social media corporations have come under renewed scrutiny due to the Senate hearings of Facebook and Twitter, as well as the release of the increasingly popular film “The Social Dilemma.” Criticism of social media platforms has been on the rise for several years, but it recently received national attention when high-profile conservative public figures such as Dan Bongino and Sen. Ted Cruz opted to leave Twitter for a start-up competitor, Parler. The reasoning behind the switch consists of substantial accusations of bias, along with disputes over the recent “fact-checking” crackdown on these major platforms. Given this dramatic debate, the question that demands an answer is: “What is the future of social media?”
Launched in 2018, Parler is a Twitter-like platform that claims to be: “An unbiased social media focused on real user experiences and engagement. Free expression without violence and no censorship.” Facebook and Twitter have been unforgiving (and arguably biased) in their attempts to police “hate speech,” “violent content,” and “fake news.” In response, Parler has attracted many opposed to Big Tech’s method of enforcing controversial “exceptions” to the First Amendment.
Parler itself, however, has had its own conflicts over censorship. The platform has been accused of banning users in Twitter-like fashion over strange clauses in its community guidelines, removing a number of people for sexual content, foul language, and, oddly, posts relating to “fecal matter.” This has left many would-be users skeptical of the “free speech platform.”
Parler not only claims to be a “censorship-free platform,” but also promises not to sell user data to any outside entities. Its privacy policy states that the company collects not only the data you provide, but also implicit data gathered from user activity, including location, device information, usage, contacts, and information from cookies (or similar technologies). The policy lists how this data is used, mostly ordinary practices, but it also states that, despite the pledge not to sell user data, Parler reserves the right to share it with third parties for assistance in analysis.
Taking all of this into consideration, the more that is revealed about Parler, the more it seems to be the same package with a new face. So, if a new platform may not resolve the current tensions in social media, what will? The answer will likely reshape social media into something completely different.
Over the last several years, there has been bipartisan support for reforming the Communications Decency Act of 1996 (CDA), signed into law during the Clinton presidency. The original intent of this law was to regulate indecent content flooding the internet, such as pornography. The Supreme Court unanimously struck down the CDA’s ban on “indecent” online content, ruling that those clauses violated the First Amendment. The most relevant legislation still in force from the CDA is Section 230. The law was in the national spotlight as recently as 2018 due to a series of high-profile lawsuits against Facebook, and it has continued to be disputed since.
Section 230 of the CDA creates legal protections for internet platforms that are inconsistent with traditional American law. These platforms are immune from liability while retaining the privilege of moderating their content. In some ways this has been helpful, as it allows a company like Yelp to remove reviews from apparent non-customers without worry, yet it also enables the current, controversial fact-checking censorship with no accountability. This differs from traditional American communications law because it breaks the conventional distinction between publishers, distributors, and platforms. In short, that distinction historically tied liability to moderation on a public communication outlet: if a company chooses to regulate public content, it becomes legally liable for what it does not remove.
Amid the public’s increasing distrust of Big Tech, there has been significant bipartisan support for reforming this legislation, including from President Trump and apparent President-elect Biden, yet there is a divide over the next step. Some are pushing for a revived net neutrality approach, which was abandoned in 2017, while others push for the traditional publisher-versus-platform approach. Both have their drawbacks. Reviving net neutrality would reverse the nationwide increases in internet speeds, give the government access to monitor internet traffic, and hold social media companies liable for what is on their sites, redefining social media as we know it. Establishing social media companies as platforms rather than publishers would revoke Big Tech’s privilege to police content without providing an efficient alternative for addressing violations, and it would provide no immediate accountability for actions these companies have already taken.
I personally believe there is no objectively correct solution, as every route has its drawbacks. Either way, everyone agrees something needs to change. Over the next four years, we will likely see a dramatic shift that makes Parler, Twitter, and Facebook something other than what we know today. That uncertainty will ultimately be resolved by what we as a country decide is more valuable: protecting freedom or ensuring security.
Joshua Anderson is a first-year graduate student at Chapman University studying Computational and Data Sciences. He is a technology columnist for The Hesperian.