
Roe is the floor. But the world we’re reaching for is possible.

Last month marked what would have been the 50th anniversary of Roe v. Wade. 

At Spitfire, we’re reflecting on what’s needed to protect real access to abortion, in practice and not just on paper, for the next 50 years. We join advocates looking toward a future that, decades from now, is rooted not only in abortion access but in reproductive justice.

Over the last several months, we partnered with advocates and experts who work across abortion access, criminalization and technology to develop guidance for communicating about the intersection of these issues. Our goal is to equip leaders across these movements with messages that give all of us the tools and knowledge to protect ourselves, and that make our renewed fight a winning one.

Advocates in the technology and reproductive justice movements have spent decades fighting to enshrine autonomy, access and freedom. Those efforts existed long before Roe was at risk. But without powerful, united messaging and a shared vision for change across those two fields about this emerging threat, there is a real risk that our work will be met with indifference, hostility and powerlessness. To meet this moment, we need a narrative that proactively shares a unifying, winning vision for a just and free future. That future will ripple far beyond the fight for reproductive justice and propel our interconnected journey toward justice for all.

Here is some of what we learned and recommend for those communicating about reproductive justice and surveillance:
 

Frame abortion surveillance as a facet of the culture of surveillance, not as an outlier.
 

The use of technology to surveil abortions is a symptom of the broader culture of surveillance, in which personal data is pervasively tracked, bought and sold. As advocates, communicate that this issue is not about a single sinister app (such as a period tracker). Instead, emphasize that the technology people use every single day (such as Google searches, emails and text messages) is purposefully designed to extract an enormous amount of personal data, and that law enforcement can easily access that data through a variety of widely available tools.
 

Name the people, organizations or systems that need to be held accountable for failing to act on identified solutions. 
 

Many people, companies and movements, including law enforcement, tech companies, crisis pregnancy centers and anti-abortion policymakers, have incentives to expand the role of tech in surveilling pregnancy outcomes. But surveillance overreach extends beyond harmful technologies into culture, in the way people police one another. Some of the biggest risks of abortion criminalization come from health care workers, neighbors and friends who share known or suspected health information. When communicating, the responsible party you name should point people toward a logical solution: name a party responsible for the harm rather than leaving systems of power invisible, and make sure that person, organization or system also holds responsibility for acting on the solution you’re proposing.
 

Meet people where they are.
 

Many people rely on technology for access, ease and autonomy. That is particularly true for people with disabilities, who often depend on technology that is essential to their lives. There is also tension in the reality that, because of technology and internet access, medication abortion at home is increasingly the norm and highly accessible. Avoid conveying a blanket message that technology or its use is inherently bad, or that the solution to surveillance capitalism lies in individuals walking away from their digital products. Instead, name the people and systems that profit from extracting and selling data through current technology, such as Meta and Alphabet (Google’s parent company). Those responsible for monetizing personal data can then be held accountable for change and solutions. While individuals can take steps to maintain their digital hygiene, the focus should be on holding companies and the government accountable for protecting consumers, not on any one person’s actions.
 

Use a consciousness-raising model to make the connection between abortion and surveillance. 
 

Abortion affects at least 1 in 4 people. But despite how common the procedure is, the narrative around it has been plagued by stigma and misinformation, making it appear controversial when in fact it is well supported: sentiment analysis has shown that most people in the U.S. agree abortion should be legal in most cases. Yet although a majority of people already support abortion to some degree, most have little to no understanding of digital surveillance, so it’s vital to meet people where they are. Messaging should help people who favor abortion access connect the dots between protecting bodily autonomy and protecting digital autonomy, because the way many people live with technology makes the two inextricably intertwined. Make it clear throughout your communications that real control over reproductive health is not possible if people are subject to digital surveillance.
 

Leverage the value of ‘privacy’ cautiously, especially when calling for reproductive justice. 
 

Abortion rights advocates have long called for bodily autonomy beyond a protection of privacy. The right to privacy ultimately became the foundation of Roe v. Wade, but it proved a faulty cornerstone for securing abortion access and fighting abortion stigma. Broadly, abortion rights advocates have moved away from framing around ‘privacy’ or ‘choice,’ which emphasize personal decisions, and instead focus on more liberatory framing for the future, such as bodily autonomy, self-determination and reproductive justice. At the same time, privacy is a common and powerful frame for digital rights, particularly when it comes to data surveillance. But privacy is not something everyone has equal access to, especially in communities explicitly targeted by police surveillance. Advocates should use ‘privacy’ deliberately and cautiously: it may be a relatable entry point for some audiences, but messengers should quickly pivot to more visionary frames of autonomy, access and freedom.
 

Avoid doom and gloom. Relay the facts. 
 

When discussing technology and abortion rights, balance communicating the urgency of the moment with presenting the facts and known or emerging solutions in a way that will motivate, not disempower, your priority audiences. Avoid sensationalizing, pointing to a dystopian future or suggesting that technology overreach is already beyond our control. Dwelling on how dire the current situation is does not motivate people to make change. Instead, give your valuable airtime to describing what a just future would look like and naming that it is possible.
 

Talk about security as a function of community, with individual actions everyone can take.
 

Advocates have emphasized that security against abortion criminalization, and safety more broadly, is a function of community. Like washing your hands to keep yourself and everyone you interact with healthy, protecting your own data is an individual action that keeps both you and your community safe. For example, messaging can urge priority audiences to use Signal and get friends to join it too, while also pointing to campaigns or legislation holding tech companies and platforms accountable. Supplying audiences with concrete steps they can take to improve their own and their loved ones’ digital security will keep them from giving in to the feelings of powerlessness that often surround both abortion rights and privacy.

Beyond that, advocates are also calling for digital civil rights and imagining new systems outside of surveillance capitalism so that individuals carry less of the burden of protecting their data. Campaigns to pressure companies to encrypt communications and browser searches, to restrict data brokers, and to eliminate loopholes for seizing information without a warrant are some examples of broader efforts to mitigate digital harms. People will not be able to achieve autonomy, access and freedom alone. Underscore that throughout your messaging, and provide concrete examples of how people can protect themselves and their communities.
 

Be mindful of the messenger.
 

Who delivers a message is just as important as the message itself. Those advocating for digital civil rights can point to existing, trusted messengers calling for reproductive justice. People who have been criminalized for their reproductive choices can and should have safe, trusted platforms to tell their own stories, which advocate messengers can then amplify. Identify key stories that support the new narrative, and share them with advocates alongside considerations for ethical and safe storytelling practices. Current media coverage speaks alarmingly little to the impact that abortion criminalization and digital surveillance will have on people living with disabilities, so there is an opportunity to speak more inclusively about the various communities at risk.

Click here to download the full guide

 

