Monday, October 25, 2021

Deplatforming and Free Speech - Jessica Hartle

               “Cancel culture” is a buzzword we have all become very familiar with since it gained popularity in 2014. For the most part, however, cancellation does not have the permanent, detrimental effect the name implies. Comedian Louis C.K. was “canceled” in 2017 after he admitted to multiple acts of sexual misconduct, yet he returned with a 24-city tour in 2021. Does “cancellation” merely derail a career for a few years and then provide an opportunity to capitalize on having been canceled?

              It is arguable that the threat of “cancellation” is not an effective deterrent against harmful behavior, but what about deplatforming?

              In 2012, the horrors of the Sandy Hook mass shooting unfolded in front of the entire nation. Not everyone was moved by the deaths of twenty six- and seven-year-olds; radio personality Alex Jones questioned the truth of the event and embarked on a conspiracy campaign, alleging Sandy Hook was a “false flag” and accusing the parents of being crisis actors. During his daily three-hour InfoWars shows, Jones encouraged followers to harass the bereaved parents and survivors of the event. This has led to ongoing lawsuits by those affected by his speech, yet Jones continued to post freely on the largest social media platforms.

              It wasn’t until six years later that Jones was deplatformed, with the tech giants stating they blocked InfoWars not because of the conspiracy theories but because of hate speech:

  • Spotify:  Infowars “expressly and principally promotes, advocates, or incites hatred or violence against a group or individual based on characteristics.”
  • Facebook: for “glorifying violence, which violates our graphic violence policy, and using dehumanizing language to describe people who are transgender, Muslims and immigrants, which violates our hate speech policies.”
  • Apple: “Apple does not tolerate hate speech, and we have clear guidelines that creators and developers must follow to ensure we provide a safe environment for all of our users,” adding, “podcasts that violate these guidelines are removed from our directory.”

Alex Jones protesting a COVID lockdown at a Texas Park

Studies have begun to show that deplatforming effectively reduces “the overall activity and toxicity levels of supporters.” Is this reduction of harmful rhetoric, which often leads to pain and violence, worth the possible infringement on free speech?

       Is deplatforming by private companies a violation of free speech? Should we consider Twitter, Facebook, Youtube, etc. as a “public square” where citizens should be able to engage in the marketplace of ideas?

       Should tech platforms be allowed to remove users, limiting the venues available for such speech?

What level, if any, of “promoting violence or hate speech” do you find sufficient to justify deplatforming someone from a social media network?

       If deplatforming is a justifiable strategy, is there a route to redemption that allows the offender back onto the platforms? 

12 comments:

  1. I don't think that deplatforming is a violation of free speech. Ultimately these are private companies, and you agree to their terms of service when you register an account.

    I think it is fair to remove users who espouse hate speech. I remember reading a statistic not too long ago that said something like 90% of the COVID misinformation on Facebook came from just 10 individuals with large platforms. Limiting those users would go a long way toward stopping the spread of misinformation. The same goes for hate speech.

    I don't really find any level of hate speech acceptable!

    As for allowing people back onto the platforms once they've been banned, Facebook had a novel idea of appealing to a panel made up of independent people, but I'm not sure that one panel of 10 individuals has the capacity to handle these kinds of complaints. I think it's just something you have to deal with, much like getting banned from a local establishment. You either find something else or you don't, but letting you back into the original space almost always causes more trouble than it's worth.

    1. "I remember reading a statistic not too long ago that said something like 90% of the Covid mis-information on facebook came from just 10 individuals with large platforms."

      The disinformation dozen! They were just mentioned on a podcast I was listening to. https://www.counterhate.com/disinformationdozen

  2. Privately held companies and platforms have a right to set expectations for their users and to hold users to those expectations. Private entities and their rights have been a focus of discussions around business rights regarding masking and vaccines. These user requirements do not change our free speech rights; we simply cannot exercise them on a privately owned platform. Having a Facebook account, or any other social media account, is not a right but a choice, and we choose to abide by the requirements that come with using the product.

    These user expectations are not a violation of free speech. In my mind, these platforms are not a public square because they are privately owned, with limitations.

    Yes, tech platforms should be able to remove users for violating their terms and setting the limits they choose on their product. I do not find any level of promoting violence or hate speech as acceptable. I wish no one would use it and the venue does not matter to me.

    None of these platforms are limiting my free speech, I can say whatever I want, it just may not have a large audience, if any.

  3. No, deplatforming by private companies is not a violation of free speech and those companies shouldn’t be considered as a “public square” means of communication. Think of it this way…look at the space where you make your “speech”. Is it owned by the government or is it owned by a private party? Sure, you can go to the city park and stand on a platform and speak on how you feel about the way the grounds are kept. That’s public space owned by a public entity. Now imagine going to your neighbor’s house and standing on their front lawn giving that same speech about how their yard or grounds are kept. That neighbor has every right to kick you off the property and possibly follow that up with other legal action.

    Tech platforms should be allowed to remove users for whatever reasons they want to. It’s their company, their rules, their money, their technology, and their prerogative. Imagine the wailing and gnashing of teeth if the government were given the authority to have that deep a hold on any private company. Hell, the government can’t even collect its taxes.

    I don’t subscribe to any level of hate speech or promoting violence so my opinion would be that there is not an acceptable level. We’ve seen what sites like 4Chan and 8Chan are responsible for (and Alex Jones, for that matter) as they have made names for themselves allowing a policy of anything goes.

  4. This comment has been removed by the author.

  5. Even in public there are certain actions which are barred, regardless of free speech protections. We likely all know the old adage that you can't yell "FIRE!" in a crowded movie theater. This is because there is a level of dangerousness to it. Unfortunately, if someone wants to espouse hate in public they have the right to, so long as they are not promoting violence or inciting panic.

    With that being said, I agree with all the previous comments. The social media platforms are private entities and therefore have the right to remove anyone they want. It would be no different than if I started yelling hate speech in a store, and they kicked me out. It is a private business and as the signs say, they reserve the right to refuse service to anyone.

    I wish these platforms took stronger action against hate speech. I firmly believe social media is a leading cause of the degradation of civil society. I hope someone brings suit against every one of them for allowing this to go on for too long.

    On a side note, just a few weeks ago a Texas judge ruled in favor of two Sandy Hook parents because Alex Jones failed to follow discovery rules. Just thought that was ironic, given this post as well as the topic of discovery discussed in this class.

    https://www.nytimes.com/2021/10/01/us/alex-jones-lawsuit-sandy-hook.html

    1. Jacob, I agree with your thoughts on social media being a leading cause of the degradation of civil society. It's hard to really trust anything that is being posted because people oftentimes just blindly "repost" content, and with social media at everyone's fingertips, the information spreads so quickly. How do you track what the original source was and whether it's credible? I do not have social media (at the moment), but friends have shared the banners that Instagram (maybe FB too?) started putting on posts to help reduce the misinformation being shared about COVID. In theory it seems like a good move; I'm curious whether it helped with the content people were still sharing.

  6. I concur with everyone's positions above. Our society has been consumed with social media and making our voices heard. Some voices are worthy and most are not. I join wholeheartedly with Stephanie and Jacob: hate speech has no place in our society. Tech platforms should remove vulgar and offensive material from their platforms. In today's world, we don't need this. It is not what we are about or who we are. I understand the free speech element, but still, what is the purpose? Why not promote good and worthy endeavors? This may be an old-school approach, but I believe these platforms have no place in our lives.

  7. This comment has been removed by the author.

  8. I concur with all who have commented. Free speech is a right. Hate speech is not. Comparing this to Eric's cheerleader post, it seems that careful consideration of freedom of speech depends on the level of harm caused. In the case of these platforms, I believe the harm was significant, and removal of hate speech from social media sites is warranted. Unfortunately, the harm to some is done before any action is taken. Social media has taken over the well-reasoned thinking of many. It seems like the public often checks its discriminating mind at the door and jumps on the bandwagon of whatever is posted. I heard this commonly quoted phrase nearly every day from my 6th grade teacher: "Your right ends where someone else's nose begins."

  9. Susan, I love your 6th grade teacher! Just imagine if everyone kept that in mind. I also concur with much of what has been said. Being alive before the internet, I remember when the creators of newspapers, magazines, even radio and TV (the "media") were much more careful about self-regulating. Extremes existed but were not nearly as prevalent as what occurs on many internet platforms today. Keyboard warriors and laptop instigators often maintain little or no self-regulation. It kind of frightens me. Private tech companies absolutely have the right to shut down hate speech and cut off users. And I believe hate speech only leads to more hate.

  10. Is this reduction of harmful rhetoric, that often leads to pain and violence, worth the possible infringement on free speech?

    Well, since private companies are doing it, I would say no. However, deplatforming does not de-program alt-right folks. Deplatforming just redirects such rhetoric to different spaces.

    Is deplatforming by private companies a violation of free speech? Should we consider Twitter, Facebook, Youtube, etc. as a “public square” where citizens should be able to engage in the marketplace of ideas?

    No, it isn't. These companies are not public squares... yet. However, I believe that with time these virtual spaces will be where much discourse takes place. Either some are nationalized or there needs to be a government-sanctioned space in the virtual realm. Now that would be complex.

    Should the tech platforms be allowed to remove users and limit platforms for them to espouse such speech?

    Yes, but if collusion takes place amongst the platforms, then it could be dangerous. Also, many leftist individuals get shadow-banned or deplatformed for espousing BDS support, anti-US sentiment, and even anti-capitalist rhetoric.

    “As defined by the FBI and DHS, anarchist violent extremism encompasses the potentially unlawful use or threat of force or violence in furtherance of an anti-government or anti-authority violent extremist ideology that considers capitalism and centralized government to be unnecessary and oppressive.” (https://www.fbi.gov/file-repository/fbi-dhs-domestic-terrorism-strategic-report.pdf)

    What, if any, do you find as an acceptable level of “promoting violence or hate speech” to justify deplatforming someone from a social media network?

    Yeah, I am not sure. My bias against white supremacist, homo/trans-antagonistic, anti-immigrant, etc. ideologies makes me not want those beliefs to be out there for more people to take in and adopt.

    If deplatforming is a justifiable strategy, is there a route to redemption that allows the offender back onto the platforms?

    Of course there should be, but then again, it's not my call but rather the company's and its shareholders' responsibility.


