March 28, 2024

Design Ethics for Gender-Based Violence and Safety Technologies

Authored (and organized) by Kate Sim and Ben Zevenbergen.

Digital technologies are increasingly proposed as innovative solutions to the problems and threats faced by vulnerable groups such as children, women, and LGBTQ people. However, there is a structural lack of consideration for gender and power relations in the design of Internet technologies, as previously discussed by scholars in media and communication studies (Barocas & Nissenbaum, 2009; boyd, 2001; Thakor, 2015) and technology studies (Balsamo, 2011; MacKenzie and Wajcman, 1999). The intersection between gender-based violence and technology deserves greater attention. To this end, scholars from the Center for Information Technology Policy at Princeton and the Oxford Internet Institute organized a workshop at Princeton in the spring of 2017 to explore the design ethics of gender-based violence and safety technologies.

The workshop welcomed a wide range of participants: advocates working on intimate partner violence and sex work, as well as engineers, designers, developers, and academics working on IT ethics. The objectives of the day were threefold: (1) to better understand the lack of gender considerations in technology design, (2) to formulate critical questions for functional requirement discussions between advocates and developers of gender-based violence applications, and (3) to establish a set of criteria by which new applications can be assessed from a gender perspective.

Below, we present three conceptual takeaways from the workshop, followed by instructive primers for developers interested in creating technologies for those affected by gender-based violence.

 

Survivors, sex workers, and young people are intentional technology users

Increasing public awareness of the prevalence of gender-based violence, both online and offline, often frames survivors of gender-based violence, activists, and young people as vulnerable and helpless. Contrary to this representation, those affected by gender-based violence are intentional technology users, choosing to adopt or abandon tools as they see fit. For example, sexual assault victims strategically disclose their stories on specific social media platforms to mobilize collective action. Sex workers adopt locative technologies to make safety plans. Young people use secure search tools to find information about sexual health resources near them. To fully understand how and why some technologies do more for these communities than others, developers need to pay greater attention to the depth of these users' lived experience with technology.

 

Context matters

Technologies designed with good intentions do not inherently achieve their stated objectives. Functions that we take for granted as neutral, such as a 'Find My iPhone' feature, can have unintended consequences. In contexts of gender-based violence, abusers and survivors alike appropriate these technological tools. For example, survivors and sex workers can use such a feature to share their whereabouts with friends in times of need. Abusers, on the other hand, can use the same locative functions to stalk their victims. It is crucial to consider the context within which a technology is used, the user's relationship to their environment, and their needs and interests, so that technologies can begin to support those affected by gender-based violence.

 

Vulnerable communities perceive unique affordances

Drawing from ecological psychology, technology scholars have described this tension between design and use through the concept of affordance: a user's perception of what can and cannot be done with a device informs how they use it. Designers may create a technology with a specific use in mind, but users will appropriate, resist, and improvise their use of its features as they see fit. For example, rape victims have used hashtags like #SurvivorPrivilege to create in-groups on Twitter and engage in supportive discussions, without any intention of going viral.

 

ACTION ITEMS

  1. Predict unintended outcomes

Relatedly, the idea of devices as having affordances allows us to detect how technologies lead to unintended outcomes. Facebook's 'authentic name' policy may have been instituted to promote safety for victims of relationship violence. The social and political contexts in which this policy is applied, however, disproportionately affect the safety of human rights activists, drag queens, sex workers, and others, including survivors of partner violence.

 

  2. Question the default

Technology developers are in a position to design the default settings of their technology. Since such settings are typically left unchanged by users, developers must take into account their effect on the target end users. For example, the default notification setting for text messages displays the full message content on the home screen. A smartphone user may experience texting as a private activity, but the default setting allows other people who are physically co-present to read along. Opting out of this default requires some technical knowledge from the user. In abusive relationships, an abuser can therefore easily access the victim's text messages through this default setting. So, in designing smartphone applications for survivors, developers should question the default privacy settings, as the sketch below illustrates.
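As a minimal sketch of what a privacy-preserving default could look like (assuming an Android messaging app built on the androidx NotificationCompat API; the function name and parameters here are illustrative), a developer could keep message content out of the notification preview and out of the lock screen without requiring the user to change anything:

```kotlin
import android.app.Notification
import android.content.Context
import androidx.core.app.NotificationCompat

// channelId and smallIconRes are app-specific values supplied by the caller.
fun buildPrivateMessageNotification(
    context: Context,
    channelId: String,
    smallIconRes: Int
): Notification {
    // Redacted version shown on the lock screen: no sender, no message body.
    val redacted = NotificationCompat.Builder(context, channelId)
        .setSmallIcon(smallIconRes)
        .setContentTitle("New message")
        .build()

    return NotificationCompat.Builder(context, channelId)
        .setSmallIcon(smallIconRes)
        .setContentTitle("New message")
        .setContentText("Tap to read")                         // keep the message body out of the preview
        .setVisibility(NotificationCompat.VISIBILITY_PRIVATE)  // hide details when the device is locked
        .setPublicVersion(redacted)
        .build()
}
```

The point is not this particular API but the default itself: the survivor should not need technical knowledge to keep message content hidden from someone looking over their shoulder or holding their locked phone.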

 

  3. Inclusivity is not generalizability

There appears to be a tendency to equate generalizability with inclusivity. An alarm button that claims to serve general safety purposes may take a one-size-fits-all approach by automatically connecting the user to law enforcement. In cases of sexual assault, however, survivors who are people of color, sex workers, or LGBTQ are likely to avoid such features precisely because of their connection to law enforcement. This means that those who are most vulnerable are inadvertently excluded from the feature. Alternatively, an alarm feature that centers on these communities may direct the user to local resources instead. Thus, a feature that is generalizable may overlook the target groups it aims to support, while a more targeted feature may have less reach but meet its objective. Just as communities' needs are context-based, inclusivity, too, is contextual. Developers should realize that the broader mission of inclusivity can in fact be fulfilled by addressing a specific need, even though this may reduce the scope of end users. A hypothetical sketch of such a targeted default follows.
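As a purely hypothetical illustration (the types and names below are invented for this sketch, not taken from any existing application), an alarm feature could make trusted contacts and community resources the default destinations, with law enforcement strictly opt-in:

```kotlin
// Hypothetical sketch of a context-aware alarm: names and types are illustrative only.
enum class AlarmDestination { TRUSTED_CONTACTS, LOCAL_ADVOCATE_HOTLINE, LAW_ENFORCEMENT }

data class SafetyProfile(
    val trustedContacts: List<String>,        // contacts chosen by the user
    val localHotline: String? = null,         // e.g. a local advocacy organization
    val allowLawEnforcement: Boolean = false  // explicit opt-in, never a silent default
)

fun chooseAlarmDestination(profile: SafetyProfile): AlarmDestination? = when {
    profile.trustedContacts.isNotEmpty() -> AlarmDestination.TRUSTED_CONTACTS
    profile.localHotline != null         -> AlarmDestination.LOCAL_ADVOCATE_HOTLINE
    profile.allowLawEnforcement          -> AlarmDestination.LAW_ENFORCEMENT
    else -> null  // nothing configured: prompt the user rather than silently contacting police
}
```

Such a routing reaches fewer users than a one-size-fits-all police button, but it is more likely to be trusted and used by the communities the feature claims to serve.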

 

  4. Consider co-designing

How, then, can we develop targeted technologies? Workshop participants suggested co-design (similar to user-participatory design) as a process through which marginalized communities can take a leading role in developing new technologies. Instead of treating communities as passive recipients of technological tools, co-design positions both target communities and technologists as active agents who share skills and knowledge to develop innovative technological interventions.

 

  5. Involve funders and donors

Breakout group discussions pointed out how developers’ organizational and funding structures play a key role in shaping the kind of technologies they create. Suggested strategies included (1) educating donors about the specific social issue being addressed, (2) carefully considering whether funding sources meet developers’ objectives, and (3) ensuring diversity in the development team.

 

  6. Do no harm with your research

In conducting user research, academics and technologists aim to better understand marginalized groups' technology use, because these groups are often at the forefront of adopting and appropriating digital tools. While it is important to expand our understanding of vulnerable communities' everyday experience with technology, research on this topic can also be used by authorities to further marginalize and target these communities; consider, for example, how some tech startups align with law enforcement in ways that negatively affect sex workers. To ensure that research about communities actually contributes to supporting those communities, academics and developers must be vigilant and cautious about conducting ethical research that protects its subjects.

 

  7. Should this app exist?

The most important question to address at the beginning of a technology design process should be: should there even be an app for this? The idea that technologies can solve social problems as long as technologists just "nerd harder" continues to guide the development and funding of new technologies. Many social problems are not data problems that can be solved by efficient design and padded with enhanced privacy features. One necessary early strategy of intervention is simply to raise the question of whether a technology truly has a place in the particular context and, if so, whether it addresses a specific need.

Our workshop began with big questions about the intersections of gender-based violence and technology, and concluded with a simple but piercing question: Who designs what for whom? Implicated here are the complex workings of gender, sexuality, and power embedded in the lifetime of newly emerging devices from design to use. Apps and platforms can certainly have their place when confronting social problems, but the flow of data and the revealed information must be carefully tailored to the target context. If you want to be involved with these future projects, please contact Kate Sim or Ben Zevenbergen.

The workshop was funded by Princeton's Center for Information Technology Policy, Princeton's University Center for Human Values, the Ford Foundation, the Mozilla Foundation, and Princeton's Council on Science and Technology.

 

 
