The Neuroethics Blog Series on Black Mirror: San Junipero
By Nathan Ahlgrim
*SPOILER ALERT* - The following contains plot spoilers for the episode “San Junipero” of the Netflix television series Black Mirror.
Your body, in many ways, is an extension of your identity. The coupling of the physical to the psychological can be as straightforward as demographic details like sex, ethnicity, and age. Your body can also restrict your identity through illness, injury, and disability. But the link between body and identity is only unavoidable as long as you are stuck with what you’re born with. Science fiction, and some science fact, is working to decouple the mind and body using virtual worlds and virtual minds, casting a lure of limitless possibilities. Location, money, age, ability: all are at the user’s command. Advances in computer technology and neuroscience are making that lure more lifelike, more attractive, and (possibly) more attainable.
Plot Summary and Technology Used
“San Junipero” opens on a 1980s party town. The painfully dweebish Yorkie shuffles into a nightclub full of arcade games (and resident arcade geeks), where she meets her polar opposite in Kelly, by all accounts a carefree partier. During the couple’s first night together in San Junipero, Kelly discloses her lengthy previous marriage. Both appear to be young 20-somethings, and this is the first overt clue (although subtler ones had appeared before) that the timeline is not quite normal.
The two reunite after Yorkie launches a time-bending search for Kelly through the ’80s, ’90s, and ’00s. Their ensuing conversation reveals that Yorkie is getting married and that Kelly is dying. Kelly insists on meeting up in the real world. She says “the real world” because the city of San Junipero is an elaborate simulation that living people can visit, or take up permanent residence in by uploading their brains as their bodies expire.
San Junipero sharply contrasts with the real world. Images courtesy of Pexels and Pixabay.
Here we learn the totality of San Junipero’s technology: living people are allowed five hours of virtual life per week as a “trial run” before deciding to have their brains completely scanned and uploaded before they die. This, of course, requires whole brain emulation (WBE): the complete digital representation of a person’s brain.
Kelly successfully cajoles Greg, Yorkie’s caretaker, into giving her and Yorkie a covert five minutes in San Junipero. She proposes in the digital world, and Yorkie accepts. The following scene shows Kelly asserting her Power of Attorney back in the real world, authorizing Yorkie’s euthanasia and permanent upload to San Junipero.
The couple reunite in the virtual world, and Kelly repeats her wish to die without becoming a “local” – no brain scan and no eternal life. She is thrown back into the real world at midnight by the state-imposed limits on time in San Junipero, leaving Yorkie to think she may not return.
In a stark tonal change from the majority of Black Mirror endings, Kelly reappears at her San Junipero beach house, and the two literally drive off into the sunset. The final scene shows the pair as two placidly blinking discs of light, robotically archived and immediately lost amongst a wall of identical lights.
Current State of Technology
Complete WBE seems stuck in sci-fi territory, but scientists and philanthropists love a good moonshot, and all the better if success means immortality. Many projects, from the government-funded Human Brain Project and BRAIN Initiative to the startup Carboncopies, have been fueled by the dream of a San Junipero-esque human simulator. Fields from across the sciences, including engineering, quantum computing, neuroscience, and cryonics, are converging on this goal, but they have yet to even find where to stake the finish line. Even Anders Sandberg, one of the most vocal WBE evangelists, labels it a “theoretical technology” [2]. On the bright side, if we ever have the technology to upload human brains to a digital form, simulating a California town will be a humble afterthought.
Remembering the good old days often has positive psychological effects. Image courtesy of Wikimedia Commons.
Ethical Considerations: What Black Mirror gets right and what it misses
The narrative of “San Junipero” operates among people who all have access to the immortalizing technology, so the fair distribution of WBE technology will not be discussed here. Rather, the ethics of these incorporeal minds and the transition between the analog California and the digital San Junipero merit more consideration than was offered in the episode.
Does computer code have rights? The question sounds flippant, but the premise is not so outlandish when you consider that corporations now have rights under the U.S. legal system. If code deserves that much protection, will it ever be intelligent enough to deserve the same rights as flesh-and-blood humans? The debate is already swirling around artificial intelligence; the World Economic Forum now lists the rights of intelligent programs as one of the nine ethical challenges for the field. San Junipero locals, each embodied by a single disc of information, are treated wholly differently than the visitors, with some troubling implications.
San Junipero is a paradise. It is easy to imagine people choosing paradise over real-world pain brought on by loss, illness, disability, or simply the daily grind. Realizing this, the laws of this world put up barriers to entry, as Greg described to Kelly:
“[The] state's got a triple-lock on euthanasia cases. You gotta have sign off from the doc, the patient, and a family member. Stops people passing over just 'cause they prefer San Junipero flat out.”
Compare that to the locals. When Kelly balks at the idea of "forever", Yorkie throws back:
“you can remove yourself like that”
This distinction is taken as a given. Why, though, should physical euthanasia be more controlled than digital death? In being selectively protective, the government is treating the digital copies as less than human. Ironically, this distinction flies in the face of San Junipero’s purpose: to extend life after death. Once a person’s entire consciousness is uploaded, the distinction between biological and digital is seen by some to be more artificial than San Junipero itself.
The biological/digital distinction was the focal point of a convincing (but later debunked) fan theory that Kelly never actually changed her mind, and that she really, truly, permanently died. The theory posits that the simulation then made a copy of Kelly to accompany Yorkie, since its purpose is to make its residents happy. There again is the same question: if Kelly is simulated with all the same behaviors and thoughts, how is that different from the ‘normal’ uploading of consciousness? Michael Hendricks of McGill University points out that there is no distinction between transferring and replicating consciousness, since the computer code of a person did not exist before the transfer. Therefore, even if the fan theory were true, neither Yorkie nor Kelly could tell the difference between a simulation and a “real” copy.
It is for this reason that some ethicists follow the “principle of assuming the most.” Doing so means assuming that a simulation possesses as much sentience as is reasonable to believe, even in the absence of empirical data. Under this principle, any virtual mouse should be given the same protections as a real mouse, down to its virtual painkillers. In the words of Anders Sandberg, “it is better to treat a simulacrum as the real thing than to mistreat a sentient being” [6].
Conclusions
Digital versions of Grandma may never happen. Even so, the ethics of artificial intelligence and virtual worlds are pertinent to existing technologies. Virtual worlds and virtual personalities on platforms like Second Life have already spawned marriages, divorce, and even semi-official government embassies. These digital actions have real consequences even when those spaces are not populated by fully conscious computer programs. Should these avatars be held to the same moral standards and governed by the same laws that apply to flesh and blood people? “San Junipero” thinks not. It was constructed as an escape that offered wish-fulfillment and freedom from consequences. That kind of marketing pitch makes it obvious why the government put in so many controls to prevent anyone and everyone from ‘passing over’ when in perfect health, with heaven just a zap away.
Sadly, Black Mirror never addresses the most important question of them all: what happens to this heaven when the power goes out?
References
[1] Markram H et al. (2015) Reconstruction and simulation of neocortical microcircuitry. Cell 163:456-492.
[2] Sandberg A, Bostrom N (2008) Whole brain emulation: A roadmap. Technical Report #2008-3, Future of Humanity Institute, Oxford University.
[3] Sedikides C, Wildschut T (2016) Past forward: Nostalgia as a motivational force. Trends in Cognitive Sciences 20:319-321.
[4] Raunio H (2011) In silico toxicology – non-testing methods. Frontiers in Pharmacology 2:33.
[5] Leist M, Hasiwa N, Rovida C, Daneshian M, Basketter D, Kimber I, Clewell H, Gocht T, Goldberg A, Busquet F, Rossi A-M, Schwarz M, Stephens M, Taalman R, Knudsen TB, McKim J, Harris G, Pamies D, Hartung T (2014) Consensus report on the future of animal-free systemic toxicity testing. ALTEX 31:341-356.
[6] Sandberg A (2014) Ethics of brain emulations. Journal of Experimental & Theoretical Artificial Intelligence 26:439-457.
Want to cite this post?
Ahlgrim, N.S. (2017). The Neuroethics Blog Series on Black Mirror: San Junipero. The Neuroethics Blog. Retrieved on , from http://www.theneuroethicsblog.com/2017/10/the-neuroethics-blog-series-on-black.html.