Wednesday, April 5, 2017

"If you walk in someone else's shoes, then you've taken their shoes": empathy machines as appropriation machines

In a 2015 TED talk (pictured above), VR filmmaker Chris Milk claimed that virtual reality could be the ultimate "empathy machine". Instead of fading away into irrelevance like most TED talks, this concept of the VR empathy machine has somehow survived into 2017. VR boosters like the United Nations' VR program and the influential podcast Voices of VR continue to push this line of thinking.

I'm here to argue, in the strongest possible terms, against any claim to a "VR empathy machine" -- and I am against it forever.

The rhetoric of the empathy machine asks us to endorse technology without questioning the politics of its construction or who profits from it. Empathy is good, and VR facilitates empathy, so VR is good -- no questions, please. (And if you hate VR, that means you hate empathy!) It's a disturbing marketing strategy, and I hope it's obvious how making a refugee tourism simulator your "flagship" VR experience can come across as an extremely cynical use of pain and suffering to sell your product.

I also doubt any empathy machine supporters have ever been the actual "target" of an actual empathy machine. Ironically, as empathizers, they seem totally unable to empathize with the empathized, so let me spell this out. The basic problem with empathy machines is this: what if we don't want your fucking empathy?

You might think this is a photograph of hell itself, but it's also the "Scotiabank Digital Factory"
First, let's get some other basic questions about VR empathy out of the way: How do you know this is actually empathy you're feeling? Do you really need to wear a VR headset in order to empathize with someone? Can't you just fucking listen to them and believe them? You need to be entertained as well? Are you sure this isn't about you?

I'm very familiar with people annexing other peoples' experiences under the banner of empathy. Specifically, I've been making realistic 3D games about gay relationships for a while, and the vast majority of my players and fans happen to be straight people. This leads to a widely-held but incorrect assumption that I make my games for "straight people to understand what being gay is like" -- and some of the worst homophobes on YouTube even call my games "gay simulators" so they can react to them with disgust.

This "straight empathy" suddenly makes my games more about "how beautiful and benevolent the straight people are, to tolerate my gay existence instead of vomiting" -- instead of highlighting gay culture or queer solidarity, as I intended. I want to imagine fantastic worlds where straight people aren't as important -- and yet, they demand that I dance for them in VR, whenever they want, forever. For this reason, I hate it when people think my games are like empathy machines. I don't want your empathy, I want justice!

As usual, tech needs to step back and maybe, for once, consider the possible cruelty of what it's doing and demanding of people. Empathy, like any emotion, is not just a buzzword to decorate your bank-branded tech startup playground (see photo above) -- it is also a complicated tool whose deployment has a history and a politics.

In her book "Long Past Slavery: Representing Race in the Federal Writers' Project", historian Catherine Stewart talks about one of the most powerful "analog" empathy machines ever built: the American slave narrative. You can read the full write-up on Slate for a more detailed account, but I'll try to summarize it right here and now:

During the Great Depression, the US government sent out researchers to interview elderly ex-slaves across the American South to record their experiences and histories about slavery. This sounds like a great and noble thing, putting people to work to record and recognize the pain of oppressed people -- but, uh, that's not exactly how it actually turned out.

For one thing, the researchers were mostly white people. And they weren't just any white people, but local white people privileged enough to read and write and to win government research positions -- white people often descended from educated slave-owning families across the South. Some of the interviewers were even members of the United Daughters of the Confederacy.

So let's try this """empathy""" experiment. Imagine you are an ex-slave, forever traumatized by white people enslaving you. One day, totally unannounced, your former torturer's white friends knock on your door and ask you about your life as a slave...

Would you tell these white people about how bad white people were? Or...
  • You would try to put a positive spin on slavery, and tell this white person that it wasn't so bad, white people were always fair and good, and even claim that you deserved to be whipped.
  • Or maybe you would change the subject, tell them some folk tales. It'll seem to them like slave culture was so exotic and eclectic that it was a "shame" that it ended.
  • You could even tell a story about how you're afraid of ghosts and how white people remind you of ghosts. They're more likely to accept stereotypes of black superstition than tales of white cruelty.
When Stewart compared white interviewers' reports with the smaller number of black interviewers' reports, she found that black interviewers recorded many more upfront stories about cruel oppression and pain. In some cases, white supervisors even advised their black researchers to look for more "variety"; they had enough of the "white people are bad" content and wanted to hear more about the "good times" during slavery... So what seemed like "raw", "authentic", "immediate" stories to white readers (even Northern abolitionists) were actually extremely constructed, self-conscious stories that black people told to appease white people.

Black people were just trying to survive white empathy. In a similar way, are Syrian refugees trying to survive UN VR empathy, even as many of them blame the UN for not doing more? (Don't count on the UN to forward these objections, though!)

Do black people want you to feel better about enslaving them, or would they prefer safety and resources and equal treatment in the eyes of the law? Do Syrian refugees want you to appreciate their resilience, or do they want political justice and stability and to return to their homes?

If we want to empathize, we must always question who really "benefits" from our """empathy""". But VR empathy machines, especially slick UN-sponsored empathy productions built to milk donations from millionaires at Davos, definitely do not foster that kind of critical reflection.

To quote Wendy HK Chun's excellent talk at Weird Reality: if you "walk in someone else's shoes", then you've taken their shoes. If you won't believe someone's pain unless they wrap an expensive 360 video around you, then perhaps you don't actually care about their pain.

I think empathy machine apologists are lying to themselves. The "embodied" "transparent immediacy" of virtual reality (much less of 360 video) does not obliterate political divisions. Even a culturally advanced medium like books can barely chip away at the problem, so VR definitely can't. In this political sense, VR can't actually offer any embodiment, transparency, or immediacy to anyone. At best, VR can only offer the illusion of empathy.

I don't use the "A-word" lightly, but I think this is such a bad look for VR culture that we must deploy the white liberal kryptonite immediately: VR empathy machines are just VR appropriation machines. They are fundamentally about mining the experiences of suffering people to enrich the self-image of VR users... or, even worse, they're about mining the experiences of suffering people to enrich the cultural appeal of VR brands. The world is going to shit, and VR wants to profit from it. Great!

Wait, no. Yuck. Gross. Let's try to be better than this, OK?



Q: Shouldn't we try to reclaim or rehabilitate "empathy"?
A: Nope. Like tech's poisoning of "disrupt", empathy is now toxic and radioactive, so let's bury it underground for 10,000 years. Someone else will reclaim it, but it will not be us. Let's call this a strategic retreat. You have better things to do.

Q: What would an ethical "empathy machine" look like?
A: It would be made by the empathized community, to benefit the empathized community. Which probably means it can't be in VR, because currently anything made for VR mainly benefits the VR industry. This is not a technological problem, nor a design problem; it is a cultural and political problem. VR is currently an expensive tech-dude toy, and it will take a long time for VR culture to change, if it ever does. Don't use the suffering of others to force that culture change, because it won't even work anyway.