You Can’t Fix Unethical Design by Yourself
Individual action isn’t enough to spur the paradigm shift we need
Nearly every tech conference right now has at least one session about ethics, if not many: ethics in artificial intelligence, introductions to data ethics, why letting the internet go to sleep is the ethical thing to do, or just plain integrating the basics of ethics into your design. As a community, we are doing a great job raising questions about the implications of technology and spreading awareness of its potential for harm.
I should know — I’m one of the speakers doing it. My talk, “Designing Against Domestic Violence,” has been accepted to more conferences than I ever dreamed. I’ve been happily surprised to hear from conferences specifically themed around development languages that I never once mention. Conference organizers want talks like this. Overall, I’m thrilled with the interest in how our digital products can facilitate domestic abuse and how we can design against it.
Which is why I want you to trust me when I say that this kind of ethics education is not enough. Don’t get me wrong — it absolutely matters. I wouldn’t have given up all of my free time and quit my lucrative teaching side hustle if I didn’t think it was worth it. People seek me out during conferences to ask for help with a specific feature that my talk prompted them to realize could be used for interpersonal abuse, and others have said it’s changed how they do their design work.
That said, this type of ethics education won’t fully solve the problem. It will have a meaningful impact on individuals, yes — but in an environment with enormous profits to be gained and few consequences for bad behavior, where government oversight is laughable, where a CEO like Facebook’s Mark Zuckerberg has shown over and over again that he can issue an apology and continue doing the exact same terrible things as before without losing power or money, how much can individual designers, developers, and product managers really do?
The recent uproar about Superhuman is a perfect example, and I was happy to see Charlie Warzel, writer of The Privacy Project, my favorite email newsletter from The New York Times, lay it out so concisely: