Before the Internet became part of daily life, we could rely on the idea that what the media published did not accurately reflect reality. Maybe that mentality held some truth back then, but we now live in a society governed in part by online media, and our digital landscape is far more a reflection of the uncomfortable and frightening world around us.
As the world has changed, social media companies have evolved to follow the moral guidelines of the majority.
When I was growing up in the early 2000s, social media was a lawless landscape where I could find pretty much anything I wanted and, in most cases, much more than I wanted. There were few filters and no one to tell me that the information I was after was not something a stranger on the other side of the world should be teaching me.
In a lot of ways, I’m grateful for the filters and content warnings modern social media provides. I’m glad middle schoolers cannot stumble across graphic images of corpses and brutal violence after a simple Google search like I did. I’m glad kids are getting a proper sex education from trustworthy adults, instead of from dramatized porn. I’m glad they have the protection I wasn’t offered, but I worry we might be protecting them too much.
Video sites like TikTok and YouTube are implementing content filters to remove “harassment, bullying and violence” from their platforms.
According to the MIT Technology Review, TikTok filters its content through the analysis of trigger words that moderators deem hateful, sensitive, or easily misused. However, these filters have caused many users to feel even more targeted and marginalized, as terms of self-identification, like “black” or “lesbian,” are often misinterpreted by the algorithm as hate speech. In addition to race and sexuality labels, TikTok’s algorithm also looks unfavorably on terms relating to sex, death, drugs, or COVID. Posts using these terms in the app’s voice-to-text function, captions, comments, or profiles will be banned, suspended, or “shadow-banned,” meaning the algorithm pushes those posts and users to the bottom of the feed, rendering them virtually invisible.
YouTube has implemented very similar policies to those of its short-form counterpart. In 2019, the YouTube channels Nerd City and YouTube Analyzed tested over 15,000 commonly searched terms against the platform’s monetization algorithm to see which terms would label a video as unfit for sponsorship. Their research found the obvious: any terms even slightly related to sex, violence or cursing would lose monetization. But they also found that anything relating to COVID-19 would trigger the algorithm, and some trigger words didn’t seem to follow any particular pattern or theme. The word “democrat” was flagged, but “republican” was not. “Lesbian” set off the alarm, but “gay,” “transgender” and “bisexual” did not.
My concern with these filters isn’t limited to the seeming randomness of many banned terms. I worry that giving massive media corporations like Google, ByteDance or Meta control over the language we use will set us back decades in terms of communication reform.
For generations, we have fought to distinguish “sex” from “rape,” “culture” from “racism,” “gay” from “unnatural.” We need these terms to define our experiences, pleasant or uncomfortable, and by taking away our ability to openly discuss those experiences, these companies risk a breakdown in communication.
We are only human. Sex, death, sickness and hate are going to happen whether we like it or not, so we might as well give our kids the proper language to talk about them in a healthy way.
Trying to eliminate discomfort and fear by censoring conversations about uncomfortable topics does not make the fear or shame disappear. In fact, it amplifies those feelings as we internalize sex and pain as taboo. Not having words for the ways the world discomforts you doesn’t make it any more comfortable; it just makes you less able to reach out about your problems.
The function of communication is not to minimize immediate discomfort; it’s to share ideas and experiences with the people around us.