The reporting feature across social media platforms has the potential to be very effective, and much of the time it is. The problem begins when the platforms deactivate accounts that are not harmful yet take far longer than necessary to remove the accounts that are.
Take Twitter, for example. A simple search for “locked out of twitter” turns up countless tweets from users whose accounts have been deactivated despite staying well within acceptable social media conduct. Some say the app failed to recognize their age, while others were locked out for including supposedly “bad” words, such as “choke” or “kill,” in their tweets.
Meanwhile, photos of a murder victim’s body can spread like wildfire, with any consequences for posting them arriving only after the damage has already been done.
In a 2017 BuzzFeed article, Twitter user Maggie H. described accounts that stalked her, threatened her, and revealed where she lived. She reported the accounts and contacted the platform in an attempt to get help, but no action was taken until her story reached the media.
There is only one way to say this: it is toxic. People across social media spread videos of animal abuse, Facebook has only recently started banning the white supremacist groups on its platform, and Twitter remains a comfortable host for pedophiles operating under the guise of something else. These things clearly should not be allowed on social media, yet they won’t go away, because the platforms are too focused on banning everyday users for significantly less harmful reasons.
Earlier this year, a video circulating around the internet showed a group of Birmingham, Alabama teens throwing around racial slurs and jokes about the Holocaust. Yes, they were met with backlash across social media, but given the observable trend, it is unlikely they faced any repercussions from the platforms they used to spread that hatred.
I have personally reported an Instagram video of a man yelling at an East Asian child, shouting “ching chong” and other mocking slurs at the poor kid. Unsurprisingly, the report had no effect. With the video sitting at more than 20,000 views, I fail to understand how it is funny rather than simply sad and offensive.
All in all, social media companies should focus on filtering out the genuinely harmful content on their platforms instead of wasting time policing harmless antics.
Written by: Jawana Kamal / @jawanakamal
Edited by: Isabel Hope / @isabama / @isabamahope
Olivia Blanton / @o.blanton