I'm watching: "Designing Against Domestic Violence" by Eva PenzeyMoog. These are notes from her talk.

  • 95% of domestic violence is perpetrated by men, so the talk will not be gender neutral, says the speaker. This is primarily a problem of male violence. If the talk makes you uncomfortable as a man, Eva encourages you to sit with that feeling for a while.

1 in 3 women in the US will experience physical violence at the hands of an intimate partner in her lifetime.
1 in 4 men will.

"Domestic violence is not an edge case."

# Bank Accounts

The first example is of a couple with a shared bank account. The female partner is reliant on the male partner for the answers to the security questions whenever she logs in from a new computer to access the shared account. 90% of domestic violence situations involve an aspect of financial control. In this case, the male partner could withhold the answers to the questions, or change the shared password so she can't access the account at all.

  • Banking tools for couples should be fully joint, not a modified single-person account.
  • There are already tools for monitoring and flagging financial abuse in banking, used to monitor the accounts of the elderly. We could expand those tools to all accounts to help prevent intimate partner violence, too.
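To make the monitoring idea concrete, here is a minimal, hypothetical sketch in Python of a rule that flags repeated access-control changes on a joint account. The event names, actor labels, and threshold are all my own illustrative assumptions, not any real bank's rules; a real detector would be designed with domain experts.

```python
# Hypothetical sketch: flag patterns on a joint account that can signal
# financial control, modeled loosely on elder financial-abuse monitoring.
# Event names and the threshold are illustrative assumptions.
from collections import Counter

CONTROL_EVENTS = {
    "password_changed",
    "security_answers_changed",
    "joint_holder_access_revoked",
}

def flag_financial_control(events: list[tuple[str, str]],
                           threshold: int = 2) -> set[str]:
    """events is a list of (actor, event_type) pairs; return the actors
    who repeatedly made access-control changes on the joint account."""
    counts = Counter(actor for actor, kind in events
                     if kind in CONTROL_EVENTS)
    return {actor for actor, n in counts.items() if n >= threshold}

history = [
    ("partner_a", "password_changed"),
    ("partner_a", "security_answers_changed"),
    ("partner_b", "deposit"),
]
print(flag_financial_control(history))  # {'partner_a'}
```

A flagged account wouldn't trigger any automatic action; it would simply be a signal a human could review, the way elder-abuse monitoring works today.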

# Home automation

Next example: a man and woman live in a home filled with "smart home" technology. While he's out of town, she notices the lights start going off, even immediately after she turns them back on. She goes for a walk and when she comes back, her code for the door doesn't work. She calls her husband and he gives her the new code and claims that the code has always been set that way. Later it gets really warm in the house and she realizes the thermostat has been set higher. When she calls him he says she must have done it herself and forgot. The goal is to make her forget her own experience.

With the Alexa smart speaker in their house, he can use the 'drop in' feature to make a call that is automatically accepted and which allows him to listen in on her private conversations. She might not have noticed the call started if she was in the other room or the volume was turned down low.

89% of domestic violence workers report having dealt with cases of tech abuse in the last year, so this is a new but already pervasive problem.

  • History logs are essential. Who did what, and when?
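As a sketch of what such a history log might look like, here is a minimal append-only log in Python. The class and field names are my own illustration, not anything from the talk; the key property is that every change is attributed to an account and entries can be read but never edited or deleted.

```python
# Hypothetical sketch: an append-only "who did what, when" log for a
# smart home, so no partner can claim "it was always set that way".
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class LogEntry:
    """One immutable record of a change."""
    user: str
    device: str
    action: str
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

class HistoryLog:
    """Append-only: entries can be added and read, never altered."""
    def __init__(self) -> None:
        self._entries: list[LogEntry] = []

    def record(self, user: str, device: str, action: str) -> None:
        self._entries.append(LogEntry(user, device, action))

    def entries_for(self, device: str) -> list[LogEntry]:
        return [e for e in self._entries if e.device == device]

# Usage: every change is attributed to the account that made it.
log = HistoryLog()
log.record("partner_a", "front_door", "changed entry code")
log.record("partner_b", "thermostat", "set temperature to 68")
print([e.action for e in log.entries_for("front_door")])
```

With a log like this, the woman in the example could see exactly which account changed the door code and the thermostat, which directly undercuts the gaslighting.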

# Pregnancy

The number one cause of death of pregnant women in the US is murder at the hands of an intimate partner.

Pregnancy apps don't currently have a way to record injuries during pregnancy. What if pregnancy tracking apps offered an option to add injury records, where certain words would automatically flag the entry and offer domestic violence resources if the injury had characteristics similar to DV?

# Stalking

81% of abuse victims are stalked by their abusers. 1 in 6 women in America is stalked at some point in her life.

GPS tracking should always run in the foreground; it should not be a hidden background process. For example, a woman who was being stalked by a partner couldn't figure out how he was able to track her: she wasn't sharing her location on social media, and she regularly unsynced her phone in case he had synced it while they were together. Eventually she learned that her new vehicle had an on-by-default "feature" that allowed any registered vehicle user to track the car's position in real time. This capability should not be hidden, and the user should be alerted to it continuously.

Hotels have lists of registered guests who are allowed in, but social engineering can get past those ("I'm a delivery driver and she told me to come right up to her room, but she didn't give me the room number!"). Perhaps hotels should also keep an anti-guest list: if a person is on the list, they don't get access to the room no matter what they say.
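The anti-guest idea is essentially a deny list that overrides any story a visitor tells. A minimal sketch, assuming illustrative names and room numbers:

```python
# Hypothetical sketch of an "anti-guest" list for a hotel front-desk
# system: a per-room deny list that is checked before anything else.
# Names and room numbers are illustrative assumptions.
guests = {"101": {"Ana Reyes"}}        # registered guests per room
anti_guests = {"101": {"Mark Reyes"}}  # explicitly barred per room

def may_access(room: str, visitor: str) -> bool:
    """The deny list wins over everything, including a plausible story:
    a barred visitor is refused even if staff are sympathetic."""
    if visitor in anti_guests.get(room, set()):
        return False
    return visitor in guests.get(room, set())

print(may_access("101", "Ana Reyes"))   # True
print(may_access("101", "Mark Reyes"))  # False: on the anti-guest list
```

Checking the deny list first means social engineering has no path around it; the "no" is unconditional.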

# Framework for Designing for Safety

  1. Include people with experience with domestic violence in your user research.
  2. Imagine scenarios for abuse and design against them.
  3. Identify opportunities for safe and meaningful intervention.
  • Financial abuse, fitness/health/pregnancy products
  • Is there a place in your product where a certain type of user behavior could indicate abuse?

There's a lot of work to do, but this is the work we can do while we're at work.