Let’s talk about free speech. It seems like a simple concept, right? The First Amendment of the United States Constitution guarantees that the government cannot restrict what Americans say. However, in no way does this require private companies to allow you the same freedom on their sites or in their establishments. It also does not mean that hateful speech is without repercussions, even under United States law.
But what does this have to do with social media exactly? The main problem: internet trolls.

Trolls are one of the internet’s oldest and most bothersome groups. Many people may not have the confidence to say something mean to someone’s face, but are willing to pester and mock others online at whim.
Many victims of these terrible internet interactions would claim that such people are not trolls but harassers: people who, again, do not have the guts to say something to someone’s face, or can’t, yet find comfort in making degrading and destructive comments on social media that can truly harm a person, business, or organization. By now, pretty much all social media sites have created “community standards,” and content that breaks them can be silenced by the site. Some would argue that this is a violation of free speech, but many others say that Facebook and sites like it have the right to cultivate the values of their platforms.
How does Facebook do this, you ask? Well, you probably didn’t, but let’s get into it anyway. Facebook uses its people. If someone sees something that they feel goes against Facebook’s Community Guidelines, they can report it, and reported content is most often taken down immediately. This process is explained here on CNET. In theory, this process makes a lot of sense. However, because censorship is at the whim of other users, who may be reporting content for reasons beyond the Community Guidelines, there has been uproar when Facebook takes content down. This practice is the new wave of “trolling,” especially in an increasingly divided America.

Recently, this topic of harassment and censorship has gone a step further. According to Forbes’ article “Facebook Should do More to Address Censorship and Harassment Issues, Rights Groups Say” (bit of a long title, no?), eighty rights groups have signed a letter to Facebook asking for censorship reform on its site. The letter demands that “Facebook provide a simple and accessible appeals process, increase transparency around content removal and carry out an external audit to review the human rights outcomes of its censorship, stating that the company’s current audits aren’t ‘sufficiently independent’”. These groups feel that activist content in particular has been targeted by trolls who report content they disagree with, not because it goes against Facebook’s Community Guidelines.

Beyond that, Facebook has willingly responded to these reports by removing the content without transparency toward the groups being reported or any option to appeal. The groups argue that these censorship practices have disproportionately affected people of color; they have found many instances of racist Facebook members reporting content simply because of its relation to “Black Lives Matter” and other activist initiatives.
Content on Facebook should absolutely be open to being reported, especially when harassment is involved. However, should content really be taken down just because it does not match another person’s opinions? I, and many others, think not. So what do we do with this new era of censorship trolls? I think this will be a hotly debated issue in the coming years under the current U.S. political regime, which has already done its own trolling by censoring non-harassment-based social media (ahem, EPA).