Standing up to big tech companies in the age of online hate speech
This blog was written by Omar Letson, intern at Spitfire Strategies
Numb - that's how many Americans now feel when headlines arise of yet another horrific mass shooting. With this demoralizing clockwork of news becoming painfully common, who we are as a country has become almost synonymous with the bigotry and hate that breed this constant, senseless violence. And while the frequency of mass shootings has risen over the past decade, a troubling trend has begun to change how we contextualize these tragedies.
We saw it in Buffalo with the Tops Friendly Markets massacre and, more recently, in the attack at Club Q in Colorado Springs. Just a few months into 2023, the US has already surpassed 100 mass shootings. In each of these gruesome scenarios, and countless others before them, the attacks were perpetrated by young men radicalized by racist, homophobic hate speech incubated in online forums and social media platforms. Why is hate speech on the rise? Why do big tech companies willfully ignore the warning signs? What can we do to change the trajectory of this dangerous cycle?
Hate Speech is More Than Words on a Screen
Statistics show a staggering uptick in harmful rhetoric targeting communities of color, queer people, and women, ranging from radicalized conspiracy theories to outright threats of violence. Forums like 4chan and Discord have long been incubators for communities that produce this sort of toxicity, with many instances of violence tied to anonymous posts. But with Elon Musk taking the helm of Twitter and ransacking its protective policies, one of the world's biggest platforms is slowly but surely becoming open season for derogatory messaging and blatant threats against Black, Indigenous, and people of color (BIPOC) and LGBTQIA+ communities.
Tweets containing anti-Semitic slurs skyrocketed 61% in just the two weeks after Musk's takeover. A report by the Center for Countering Digital Hate found that since Musk purchased Twitter, there have been nearly 3,900 posts a day that include a slur against Black people, more than triple the 2022 average of 1,282. Tweets using a slur against gay people rose 58%, from 2,506 to 3,964 a day. Posts that included a transgender slur jumped 62%, from 3,159 to 5,117 a day, and there was a 33% increase in posts using a disparaging term about women.
A space where tumultuous discourse is free to fester unchecked can only spell trouble. While some see these users as harmless keyboard warriors, we've seen time and time again how quickly radicalized online rhetoric can lead to lethal real-life consequences.
And yet, there’s still so much we can do to fight back against hate and cultivate safe online spaces for all. Check out Spitfire Gabriel Rodriguez's blog for guidance on navigating Twitter’s new policies and helpful tips to share effective messaging amid the chaos.
Standing Up to Bigotry
At a time when BIPOC, LGBTQIA+ and other communities that already face violence and discrimination endure onslaughts of abhorrent abuse daily, our mission as communicators to be a guiding force toward a brighter, safer future is more important than ever. Now is the time to take a stand against Big Tech companies that allow hate speech to go unchecked. Advocating against online bigotry through strategic messaging can pressure technology executives to make their platforms safer for all and ultimately save lives.
Recently, Spitfire partnered with a technology justice working group focused on countering hate speech to develop values-based messaging to help advocates communicate the root problem. Here are three important takeaways:
Lead with shared values. Drive forward with messages that resonate with your audiences' morals around online safety. Showcase the possibilities that can come from your advocacy - that enacting change isn't impossible! By centering lived experiences and describing shared values, your audience will begin to see what an internet free from hate speech could look like.
Place a name to the blame. There's no doubt that users propagating malicious content are part of the problem. But when you solely demonize a crowd of blank profiles spewing hate, you let Big Tech companies evade accountability. The scale of the problem needs to match the scale of the solution: holding these companies accountable for profiting off polarizing algorithms. Be sure to name Big Tech's role in profiting off of hate speech when describing the issue at hand. This is more than a few bad apples at their keyboards: this is a system that profits off of radicalizing people and capturing their attention.
Social math, a way of using data to tell a story, goes a long way. Nonprofit and political organizations have championed social math as a potential solution to the challenges of difficult-to-frame numerical data. Share statistics that contextualize the real-world implications hate speech can create. Numbers speak volumes and can be a powerful tool in conveying how this phenomenon of vitriol is far more sinister than a few mean words encapsulated in 280 characters. Whether it's the increasing rates of depression and fear amongst marginalized groups or the countless innocent lives lost to violence, it is vital audiences understand the grim reality of Big Tech's negligence.
Above all, there is power in solidarity. By communicating about the impact of online hate speech using an intersectional lens, we can see and address its full impact. Using social math, we can contextualize how online bigotry poses a threat to different communities across the nation. Together, we can build a more equitable internet by holding Big Tech accountable to end hate speech across platforms.
This entry was posted on Thursday, April 20, 2023 at 12:15 pm and is filed under Communication planning, Frame, narrative and message development and Opposition containment.