by Lauren Wong, Cassie Gaudes & Preston Wimbish
Friday, Sept. 20 brought two professors and one University of Tampa student to a Free Speech and Social Media panel, sponsored by a Board of Fellows grant, to debate whether technology companies have too much control over our speech. Room 245 in the Southard Family Building was packed with students and faculty, with people standing along the walls and sitting huddled around the podium.
Panelists included Stetson Law School professor Catherine Cameron, UT speech professor Kristen Foltz, and UT senior cybersecurity major Eustathios Siandris.
Cameron began the panel by defining free speech and the misconceptions that can come along with it. She explained how students should broaden their understanding of all the rights that fall under the First Amendment, and she offered examples from countries where citizens are not fortunate enough to be granted this right. The general principle is that as U.S. citizens, we are given the right of free speech, to say what we feel and believe. “We start with this broad concept and whittle it down,” Cameron said.
She described a case in Egypt in which a man was jailed for three years after posting a picture of the country’s president wearing Mickey Mouse ears.
Media companies such as Facebook, Instagram, and Twitter, and even a school email system, all have the right to censor words, photos, and anything else users post on their platforms. If you were to post a controversial comment or image on any of these outlets, the company could take it down because it is privately owned.
This brought up the topic of the government censoring social media. While social media companies have been censoring posts, the government cannot do so because of the First Amendment. “The government won’t get involved because of the idea that people will figure out what is permissible and what is not,” said Foltz. Explaining further, Siandris said, “There is not a formal definition of what counts as hate speech according to the Supreme Court.”
When a user posts on social media, the post goes through an algorithm to determine whether it violates the company’s policies. “When you post something it gets sent through a machine algorithm, it determines if a similar post has been removed, and [if so] it can remove a post,” said Siandris. “If the machine is unsure, it will send it to somebody to look at.” If the social media employee determines that the company’s policies were broken, the worker can remove the post.
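The flow Siandris described could be sketched in rough pseudocode-like Python. Everything here is illustrative: the function names, the thresholds, and the use of simple text similarity are assumptions for demonstration, not any company’s actual moderation system.

```python
# Illustrative sketch of the moderation flow described in the article:
# a new post is compared against previously removed posts; a confident
# match is removed automatically, and an uncertain one is escalated to
# a human reviewer. All names and thresholds are hypothetical.

from difflib import SequenceMatcher

# Hypothetical database of posts the platform has already removed.
REMOVED_POSTS = ["example banned post text", "another removed post"]

AUTO_REMOVE = 0.90   # above this similarity, remove automatically
NEEDS_HUMAN = 0.60   # between thresholds, send to a human reviewer

def moderate(post: str) -> str:
    """Return 'removed', 'human_review', or 'allowed'."""
    best = max(
        (SequenceMatcher(None, post.lower(), old.lower()).ratio()
         for old in REMOVED_POSTS),
        default=0.0,
    )
    if best >= AUTO_REMOVE:
        return "removed"       # closely matches a known-bad post
    if best >= NEEDS_HUMAN:
        return "human_review"  # the machine is unsure; escalate
    return "allowed"

print(moderate("example banned post text"))  # near-identical match
print(moderate("hello, nice weather today"))  # unrelated post
```

A real system would use machine-learned classifiers rather than string similarity, but the three-way outcome (remove, escalate, allow) mirrors the process the panelist outlined.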
This raises the question of who, or what, decides what is ethical and what is not. Humans do not check a post until there is a question; computers do most of the checking. “We’re trusting a computer to regulate speech. I don’t really trust it,” Cameron said.
Another ethical issue that arose is incitement, the encouragement of another person to commit a crime. It is a hard charge to prove. One of the more prevalent examples is the case of Michelle Carter, who texted her boyfriend urging him to commit suicide and was convicted of involuntary manslaughter.
Cameron closed with advice for students: “Slow down when you’re downloading things, and read the fine print, because they are taking away some of your rights to free speech.”
Lauren Wong can be reached at firstname.lastname@example.org
Cassie Gaudes can be reached at email@example.com
Preston Wimbish can be reached at preston.wimbish@theminaretonline.com