I’ve been batting this around for a few weeks now, ever since I started drafting “Free As In Health Care”. I’m not sure where to take it at this point, so I’m going to publish it and see what happens. I’d like to thank a number of my friends and colleagues for their feedback on this list, but this is high-tension-wire stuff, so I’m not going to do that here. You know who you are and I’m grateful.
This was inspired by a combination of things, starting with Joel Spolsky’s now-ancient “The Joel Test: 12 Steps to Better Code” post and Atul Gawande’s “The Checklist Manifesto”, washed down with a few dozen pages of early automotive and highway safety legislation.
“A score of 12 is perfect, 11 is tolerable, but 10 or lower and you’ve got serious problems. The truth is that most software organizations are running with a score of 2 or 3, and they need serious help, because companies like Microsoft run at 12 full-time.” – Joel Spolsky
One notable feature of that early legislation is that, for the most part, it outlines the minimum standards that must be met without specifying how to meet them; the how was left to the manufacturers’ ingenuity. But those manufacturers had to prove they’d met or exceeded every one of those standards before a vehicle could go to market.
I believe that we can and should take the same approach to our design decisions about how software treats people. And I think we should be doing that at the very earliest stages of design and planning; like security, and like Jobs’ old line about design, this isn’t a coat of paint you can add later.
To that end, here’s a list of vulnerable user stories. My goal was to end up with a preflight checklist that developers can quickly run down, giving a yes, no, or doesn’t-apply answer for each of the risks and challenges that marginalized and vulnerable people will face if they’re navigating a life with this software in it.
This is not meant to be comprehensive (or perfect, or finished). All I want to do is set a bar, knowing what we know about social software in 2016: the very lowest bar you have to clear to consider yourself a responsible developer – that is to say, a responsible human being whose craft happens to be software.
I’ve put them on GitHub, if you want to look at them there.
The Minimum Viable Set of User Stories
- User changes email addresses
- User changes physical addresses
- User is or becomes homeless
- User changes legal status
- User changes legal name
- User changes gender
- User identifies themselves by a pseudonym
- User is not always, and/or not reliably, connected to the internet
- User does not control the hardware they use to access the internet
- User is trying to escape an abusive spouse or partner
- User is trying to escape an abusive family
- User is trying to escape a cult
- User is estranged from their family
- User is managing an addiction
- User is managing a mental health issue
- User is targeted for abuse by an individual
- User is targeted for abuse by an informally organized group
- User is targeted for abuse by a corporation
- User is targeted for abuse by a nation-state
- User is a member of a group or demographic targeted for abuse by an informally organized group
- User is a member of a group or demographic targeted for abuse by a nation-state
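
If it helps to make the preflight idea concrete, here is a minimal sketch of how a team might encode these stories so that every one of them gets an explicit answer during design review, or as a gate in CI. Everything in it (the STORIES list, the Answer values, the unanswered() helper) is a hypothetical illustration of one possible approach, not something from the GitHub repo above.

```python
from enum import Enum

# One possible shape for the checklist: every story gets an explicit answer,
# and anything left unanswered fails the preflight.

class Answer(Enum):
    YES = "yes"             # we've thought about this and we handle it
    NO = "no"               # we don't handle it, and that's a recorded gap
    DOES_NOT_APPLY = "n/a"  # genuinely out of scope for this product

STORIES = [
    "User changes email addresses",
    "User changes legal name",
    "User is trying to escape an abusive spouse or partner",
    "User is targeted for abuse by a nation-state",
    # ...and the rest of the stories listed above.
]

def unanswered(answers: dict) -> list:
    """Return every story that has no recorded answer yet."""
    return [story for story in STORIES if story not in answers]

if __name__ == "__main__":
    answers = {
        "User changes email addresses": Answer.YES,
        "User changes legal name": Answer.NO,
    }
    missing = unanswered(answers)
    for story in missing:
        print(f"UNANSWERED: {story}")
    raise SystemExit(1 if missing else 0)
```

The tooling itself isn’t the point; the point is that “we haven’t thought about it” stops being a silent default and becomes a visible, recorded decision.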