Black Mirror in the Rear-View Mirror: An Interview with the Authors
*Image courtesy of Wikimedia Commons.*
The Neuroethics Blog hosted a special series on Black Mirror over the past year, originally coinciding with the release of its third season on Netflix. Black Mirror is noted for its telling of profoundly human stories in worlds shaped by current or future technologies. Somnath Das, now a medical student at Thomas Jefferson University, founded the Blog’s series on Black Mirror. Previous posts covered "Be Right Back", "The Entire History of You", "Playtest", "San Junipero", "Men Against Fire", "White Bear", and "White Christmas". With Season 4 released at the end of December 2017, Somnath reconvened with contributing authors Nathan Ahlgrim, Sunidhi Ramesh, Hale Soloff, and Yunmiao Wang to review the new episodes and discuss the common neuroethical threads that pervade Black Mirror.
The discussion has been edited for clarity and conciseness.
*SPOILER ALERT* - The following contains plot spoilers for the Netflix television series Black Mirror.




Somnath: My next question, with respect to brain data and privacy, is more about public opinion and how ethicists respond to it. With Google Glass, we saw that many people were really uncomfortable with wearable recording technology being integrated into their lives. There were two issues here. One was that people didn't want random strangers wearing the Glass and taking photos or videos of them, which is a pretty obvious objection. The second was that people were uncomfortable with what the data could be used for. But as we've seen with a lot of technologies, attitudes change: people [used to be] scared of the internal combustion engine exploding, and nowadays we accept cars. We walk around them very easily; we're very familiar with them. I was wondering, in the vein of the episode "The Entire History of You," where everybody has a brain-computer interface (BCI) that can record and store memories: do you think people would eventually be able to accept these BCIs as normal?

Yunmiao: For example, in “Crocodile,” people do have access to neural data, and one could take that to the extreme. Mia (the central protagonist) essentially killed everyone who could potentially have a memory of her murder(s). On one hand, such technology might stop people from committing any crime, knowing someone might be watching. On the other hand, it could also threaten society, because people would live in constant fear of that information getting out.
*Image courtesy of Wikipedia.*
Somnath: My next question is about the emulated self, and focuses on storing people's consciousness against their will. In “Hang the DJ,” we're introduced to a dating app that simulates hundreds of versions of ourselves and other people, with a near-perfect emulation of our thoughts, feelings, and personalities. It basically takes the mystery out of dating. The app then mysteriously kills off the simulations, or deletes their code, when it determines that two people could be matched. But for me that raises the question: would emulating those perfect copies of people, taking their memories away, putting them in an unknown place, and then deleting their code be unethical? Is that considered imprisonment? And does that even matter?
Hale: You're not deleting an individual artificial intelligence within that universe; you're deleting the entire thing at once. So you're not causing any sort of relational harm. You're not killing an individual that other individuals know and will grieve over. Everyone disappears at once within that universe. But of course, a lot of it comes down to an unanswerable question: how can we possibly know whether a simulated person actually experiences the emotions that they appear to experience?
Nathan: Yes, “Hang the DJ” has a good outcome in the end. But I think it's unfair of us to judge that technology and its consequences based on a dating app, when the exact same technology could be used differently, like in the finale, “Black Museum.” With pretty much the same technology as the dating app, a man, whether deservedly or not, was put into perpetual torture. Or at least the emulation of his consciousness was.
*Image courtesy of Flickr user Many Wonderful Artists.*
Sunidhi: Also, how much of it is actually deleted? Is it fully deleted, or does it continue to exist somewhere?
Somnath: “San Junipero” showed us a positive use of a similar technology: as a way of ensuring a good death, or rather, a life beyond death. The episode concluded with one of the most memorable love stories in pop culture. When a person died in the real world, a new version of that person was created in the simulation. My question is: does the company then own your life? You'd be at the whims of that company. Is that necessarily a bad thing? The people inside are living a good life even though they're dependent on this company owning them. Is living in a simulation a good thing or not?
Nathan: It can never be a good thing as long as there is a distinction between the simulation and the real world. There was no perceptual difference between the simulation and the real world in “San Junipero.” Even so, the real world seemed to treat the simulation as something qualitatively different. The people in the simulation had different legal rights. We instinctively think of a person undergoing a change like Alzheimer's disease as retaining their identity. Their personality is different, their memories change, but we still see them as the same person, far more readily than we do a simulation in “San Junipero,” even though the simulation's personality is identical. As long as we think of a simulated world as something demonstrably different, what happens if you don't renew your contract with the company that built San Junipero? Then they're entitled to terminate you. You'd die. Again.
Sunidhi: What's interesting in “San Junipero” is that the simulated copies still retain the same memories. Then it becomes an iffy line: how can you be a different person but still retain the same memories, life experiences, and so on?
Yunmiao: I think the question is whether the simulated self is continuous with the original person, or whether it’s another life or person. What if they both exist at the same time, like in many other Black Mirror episodes? I don't think the copy, or simulated person, is an extension of the original person. I think they have their own mind, and they are their own person. Thus, the original person should not have any ownership of that emulated self.
Somnath: My final question is about emulation. We're pretty far away from emulating human brains; the research is still in its infancy. When I wrote about it on the blog, the research basically said that neuroscientists are still trying to figure out how to emulate ion channels, never mind complex neural activity or entire human beings. So why do you think the show keeps coming back to the emulated self if we're so far away from it? Do you think it just makes for a good story, or is there something more important about how the American consciousness reacts to this technology when it is portrayed on the show?
*Image courtesy of Pixabay.*
Hale: I agree, and I think people can't help but ask themselves, when they see it on the show: how will this affect me and my life? If a technology feels completely distant because you won't see it for hundreds of years, you will only be casually interested. But these technologies present a kind of pseudo-immortality. Even now, we might be close to the point of saving our brains, or brain data, if only cryogenically. One day in the future, when we have the technology to do something with that, we could digitally pop into existence again. People see this and feel that it's not within arm's reach, but it is just beyond their fingertips.
Nathan: I'm more of a skeptic when it comes to this technology ever bearing fruit. But I still think that, even if it is completely fantastical, it's important to get it into the public consciousness. Science fiction, as much as fantasy, can serve as an allegory for the questions we are really asking: like Yunmiao said, questions ranging from immortality to identity and personhood. It's a lot easier to enter the conversation if you're asking, ‘What if I upload myself to Star Trek?’ (as in “USS Callister”) instead of, ‘What if I misrepresent myself on Facebook and my boss thinks I'm someone completely different?’
*Image courtesy of Flickr user FrenchKheldar.*
Proponents of these technologies, like those who are trying to make emulation happen, often contend that our hesitancy is driven by fear, and that progress is impeded by hand-wringing ethicists. We can't ignore that the show brings a fascination to all of these technologies, regardless of the grim consequences. That's why ethicists do need to respond to the show; ethicists do better when they get out of their ivory tower. At the same time, pop-culture phenomena like Black Mirror bring these questions to audiences that ethicists could never reach on their own.
Somnath: Before we close, I have to ask: favorite episodes? For me, “White Bear” was the most fascinating for its neuroethical implications. But as a consumer, “The Waldo Moment” is my favorite.
Yunmiao: “White Christmas”
Hale: “USS Callister”
Sunidhi: “Men Against Fire”
Nathan: “Hated in the Nation”
Want to cite this post?
Ahlgrim, N. (2018). The Neuroethics Blog Series on Black Mirror: Black Mirror in the Rear-View Mirror: An Interview with the Authors. The Neuroethics Blog. Retrieved from http://www.theneuroethicsblog.com/2018/03/black-mirror-in-rear-view-mirror.html