Monday, October 9, 2017

The Neuroethics Blog Series on Black Mirror: San Junipero

By Nathan Ahlgrim

Image courtesy of Wikimedia Commons.
Humans in the 21st century have an intimate relationship with technology. Much of our lives are spent being informed and entertained by screens. Technological advancements in science and medicine have helped and healed in ways we previously couldn’t dream of. But what unanticipated consequences may be lurking behind our rapid expansion into new technological territory? This question is continually being explored in the British sci-fi TV series Black Mirror, which provides a glimpse into the not-so-distant future and warns us to be mindful of how we treat our technology and how it can affect us in return. This piece is part of a series of posts that will discuss ethical issues surrounding neuro-technologies featured in the show and will compare how similar technologies are impacting us in the real world.

*SPOILER ALERT* - The following contains plot spoilers for the episode “San Junipero” of the Netflix television series Black Mirror.

Your body, in many ways, is an extension of your identity. The coupling of the physical to the psychological can be represented by straightforward demographic details like sex, ethnicity, and age. Your body can also restrict your identity by illness, injury, and disability. The unavoidable link between body and identity only exists as long as you are stuck with what you’re born with. Science fiction, and some science fact, is working to decouple the mind and body using virtual worlds and virtual minds, casting a lure of limitless possibilities. Location, money, age, ability: all are at the user’s command. Advances in computer technology and neuroscience are making that lure more lifelike, more attractive, and (possibly) more attainable.

Image courtesy of the U.S. Department of Defense.
Technology has moved virtual worlds far beyond the days of The Sims or Second Life. The Emmy Award-winning Black Mirror episode “San Junipero” is hardly the first example of pop culture waxing poetic over the narratives of downloaded minds. Movies like Tron and Avatar describe worlds beyond reality, with such advanced computing that the "virtual reality" became a second reality. Therein, of course, lies the question: how should virtual minds be treated in a virtual world? “San Junipero” describes a world in which the entirety of human consciousness can be transferred and even downloaded via Whole Brain Emulation (WBE). In a world with fantastical technology, the treatment of the disembodied minds is taken as a given. However, real-world scientists and entrepreneurs are setting their sights on their own San Junipero, and we cannot assume the ethics of WBE and the treatment of digital people will fall into place on their own. The rights of computer code are not as straightforward as they seem when the code is Grandma.

Plot Summary and Technology Used

“San Junipero” opens in a 1980s party town. The painfully dweebish Yorkie shuffles into a nightclub full of arcade games (and resident arcade geeks) where she meets her polar opposite in Kelly, by all accounts a carefree partier. During the couple’s first night together in San Junipero, Kelly discloses her previous and lengthy marriage. Both are young 20-somethings, and this is the first overt clue (although many subtler ones had appeared before) that the timeline is not quite normal.

The two reunite after Yorkie enters a time-bending search for Kelly through the ’80s, ’90s, and ’00s. Their ensuing conversation reveals that Yorkie is getting married and that Kelly is dying. Kelly insists on meeting up in the real world. She says “the real world” because the city of San Junipero is an elaborate simulation in which living people can visit or take up permanent residence by downloading their brain as their body expires.

San Junipero sharply contrasts with the real world.
Images courtesy of Pexels and Pixabay.
In the real world, both women are elderly: Yorkie, a quadriplegic since adolescence, and Kelly, dying from an illness. Kelly visits Yorkie in the hospital, where she meets Greg, to whom Yorkie will be married later that month. Greg is a caregiver and agreed to marry Yorkie so that his Power of Attorney would allow her to “pass over”— to be euthanized and become a permanent digital resident of San Junipero.

Here we learn the totality of San Junipero’s technology: living people are allowed five hours of virtual life per week to have a “trial run” before making the decision to have their brains completely scanned and uploaded before they die. This, of course, requires WBE: the complete digital representation of a person’s brain.

Kelly successfully cajoles Greg into giving her and Yorkie a covert five minutes in San Junipero. She proposes in the digital world, and Yorkie accepts. The following scene shows Kelly asserting her Power of Attorney back in the real world, authorizing Yorkie’s euthanasia and permanent upload to San Junipero.

The couple reunite in the virtual world, and Kelly repeats her wish to die without becoming a “local” – no brain scan and no eternal life. She is thrown back into the real world at midnight by the state-imposed limits on time in San Junipero, leaving Yorkie to think she may not return.

In a stark tonal change from the majority of Black Mirror endings, Kelly reappears at her San Junipero beach house, and the two literally drive off into the sunset. The final scene shows the two – now placidly blinking discs of light – being robotically archived and immediately lost among a wall of identical lights.

Current State of Technology

Image courtesy of Wikipedia.
San Junipero is a world enabled by completely comprehensive WBE. The residents have their entire consciousness scanned and transferred in digital form to an ostensibly immortal computer bank. Everyone from Christof Koch of the Allen Institute for Brain Science to Michio Kaku recognizes the human brain as the most complicated object in the universe, so it should come as no surprise that such technology is more of a fantasy than the city of San Junipero itself. The human brain is estimated to have a petabyte (1 quadrillion bytes) memory capacity, and researchers from the Salk Institute say that a computer simulation with similar memory and processing power would need the energy from “basically a whole nuclear power station” for just one person. For all of our supercomputers, deep neural networks, and artificial intelligence, real-life brain modelling has currently peaked at a ‘crumb’ of rat cortex (0.29 ± 0.01 mm³ containing ~31,000 neurons [1]). Even this state-of-the-art model disregards any and all glial cells, blood vessels, and capacity for plasticity. All this to say that many computer scientists and neuroscientists doubt WBE will ever be possible.
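To get a rough sense of the gap between that ‘crumb’ and a whole brain, consider a back-of-envelope calculation. This sketch uses the ~31,000-neuron figure from the Markram reconstruction [1] together with the commonly cited estimate of ~86 billion neurons in a human brain (the latter is a general estimate, not a figure from this post):

```python
# Back-of-envelope scaling: state-of-the-art simulation vs. whole brain emulation.
# Neuron counts are rough public estimates, used only for illustration.

model_neurons = 31_000            # neurons in the simulated rat cortex 'crumb' [1]
human_neurons = 86_000_000_000    # ~86 billion neurons, a common estimate for a human brain

scale_factor = human_neurons / model_neurons
print(f"A whole human brain has roughly {scale_factor:,.0f} times "
      f"as many neurons as the best current simulation")

# Estimated human memory capacity cited above: ~1 petabyte
petabyte_bytes = 10**15
print(f"1 petabyte = {petabyte_bytes:,} bytes")
```

Even before accounting for glia, vasculature, and plasticity, the simulation would need to grow by nearly seven orders of magnitude in neuron count alone, which is why the “whole nuclear power station” energy estimate is not hyperbole.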

Complete WBE seems stuck in sci-fi territory, but scientists and philanthropists love a good moonshot, and all the better if success means immortality. Many projects, from the government-funded Human Brain Project and BRAIN Initiative to the startup Carboncopies, have been fueled by the dream of a San Junipero-esque human simulator. Fields from across the sciences, including engineering, quantum computing, neuroscience, and cryonics, are converging on this goal, but they have yet to even find where to stake the finish line. Even Anders Sandberg, one of the most vocal WBE evangelists, labels it a “theoretical technology” [2]. On the bright side, if we have the technology to upload human brains to a digital form, simulating a California town will be a humble afterthought.

Remembering the good old days often
has positive psychological effects.
Image courtesy of Wikimedia Commons.
In contrast to the potentially unreachable technology, the therapeutic purpose of San Junipero has been modelled outside of the Black Mirror universe, albeit in a decidedly analog fashion. Kelly, in describing the sales pitch for San Junipero, calls it “Immersive Nostalgia Therapy.” Nostalgia is known to encourage positive coping and psychological states, which can be brought on by old songs, familiar smells, or just by reminiscing about the good old days. Constantine Sedikides and colleagues have induced nostalgic feelings in hundreds of research participants easily and reliably, doing nothing more than letting their participants revel in those past experiences. The resulting feelings of reminiscence and bittersweet longing were sufficient to increase coping, closeness, and optimism while decreasing defensiveness [3]. Sadly, none of that research has induced feelings of nostalgia by transporting participants’ consciousness to an ’80s dance club.

Ethical Considerations: What Black Mirror gets right and what it misses

The narrative of “San Junipero” operates among people who all have access to the immortalizing technology, so the fair distribution of WBE technology will not be discussed here. Rather, the ethics of these incorporeal minds and the transition between the analog California and the digital San Junipero merit more consideration than was offered in the episode.

Does computer code have rights? That question sounds flippant, but the premise is not so outlandish when you consider that corporations now have rights according to the U.S. legal system. If code deserves the same protection, will it ever be intelligent enough to deserve the same rights as flesh-and-blood humans? The debate is already swirling over artificial intelligence; the World Economic Forum now lists the rights of intelligent programs as one of the nine ethical challenges for the field. San Junipero locals, who are embodied by a single disc of information, are treated wholly differently than the visitors, with some troubling implications.

San Junipero is a paradise. It is easy to imagine people choosing paradise over real-world pain brought on by loss, illness, disability, or simply the daily grind. Realizing this, the laws of this world put up barriers to entry, as Greg described to Kelly:
“[The] state's got a triple-lock on euthanasia cases. You gotta have sign off from the doc, the patient, and a family member. Stops people passing over just 'cause they prefer San Junipero flat out.”
Compare that to the locals. When Kelly balks at the idea of "forever", Yorkie throws back:
“you can remove yourself like that.”
This distinction is taken as a given. Why, though, should physical euthanasia be more controlled than a digital death? In being selectively protective, the government is treating the digital copies as less-than-human. Ironically, this distinction flies in the face of San Junipero’s purpose: to extend life after death. The distinction between biological and digital, once a person’s entire consciousness is downloaded, is seen by some to be more artificial than San Junipero itself.

The biological/digital distinction was the focal point of a convincing (but later debunked) fan theory that Kelly never actually changed her mind, and that she really, truly, permanently died. The theory posits that the simulation then made a copy of Kelly to accompany Yorkie, since its purpose is to make its residents happy. There again is the same question: if Kelly is simulated to have all the same behaviors and thoughts, how is that different from the ‘normal’ downloading of consciousness? Michael Hendricks of McGill University points out that there is no distinction between transferring and replicating consciousness, since the computer code of a person did not exist before the transfer. Therefore, even if the fan theory were true, neither Yorkie nor Kelly could tell the difference between a simulation and a “real” copy.

Image courtesy of Wikimedia Commons.
The conversation about digital people is, of course, theoretical. Even WBE proponents acknowledge that “brain emulation is … vulnerable to speculation, ‘handwaving’ and untestable claims” [2]. But science is on the brink of inflicting digital pain. Scientists, animal rights activists, and others are pushing for digital animal models, also known as in silico experiments. American and European governments are funding the development of non-animal models, especially for toxicology studies [4,5]. These experiments present precise simulations of biological systems to model new treatments and probe new questions. Currently, in silico experiments are largely performed on digital organs and not whole organisms, but the latter is in the works, which again presents that artificial distinction. If the model is complex enough to simulate the organism, then it stands to reason that it feels the same pain and suffers just like its skin-and-bones counterpart. The goal of replacing animal models with simulations is self-limiting: the better the simulation, the more digital suffering is inflicted.

It is for this reason that some ethicists follow the “principle of assuming the most.” Doing so assumes that simulations possess as much sentience as is reasonable to believe, even in the absence of empirical data. Under this principle, any virtual mouse should be given the same protections as a real mouse would, down to its virtual painkillers. In the words of Anders Sandberg, “it is better to treat a simulacrum as the real thing than to mistreat a sentient being” [6].

Image courtesy of Wikimedia Commons.
If it ever becomes sufficiently sophisticated, WBE will represent a new, practical dualism (the belief that the mind and body are distinct entities) that is completely independent from philosophical beliefs. After successfully recreating consciousness in a digital form, the physical body and brain become artifacts, unnecessary for maintaining thought. The question of whether the body is needed to maintain identity and personhood, though, will always remain a philosophical one. In fact, identity and personhood may need entirely new definitions in the face of WBE technology before philosophers can meaningfully debate the issue. Digital lab rats, digital family members, and digital worlds are immune from physical harm, but their complexity gives them the capacity to suffer regardless of “who” they are. As such, consistent ethical standards require the treatment of digital and physical life to be determined by that life’s complexity, not its relation to the physical world.

Conclusions

Digital versions of Grandma may never happen. Even so, the ethics of artificial intelligence and virtual worlds are pertinent to existing technologies. Virtual worlds and virtual personalities on platforms like Second Life have already spawned marriages, divorces, and even semi-official government embassies. These digital actions have real consequences even when those spaces are not populated by fully conscious computer programs. Should these avatars be held to the same moral standards and governed by the same laws that apply to flesh-and-blood people? “San Junipero” thinks not. It was constructed as an escape that offered wish-fulfillment and freedom from consequences. That kind of marketing pitch makes it obvious why the government put in so many controls to prevent anyone and everyone from ‘passing over’ when in perfect health, with heaven just a zap away.

Sadly, Black Mirror never addresses the most important question of them all: what happens to this heaven when the power goes out?

References

[1] Markram H et al. (2015) Reconstruction and simulation of neocortical microcircuitry. Cell 163:456-492.
[2] Sandberg A, Bostrom N (2008) Whole brain emulation: A roadmap.
[3] Sedikides C, Wildschut T (2016) Past forward: Nostalgia as a motivational force. Trends in cognitive sciences 20:319-321.
[4] Raunio H (2011) In silico toxicology – non-testing methods. Frontiers in Pharmacology 2:33.
[5] Leist M, Hasiwa N, Rovida C, Daneshian M, Basketter D, Kimber I, Clewell H, Gocht T, Goldberg A, Busquet F, Rossi A-M, Schwarz M, Stephens M, Taalman R, Knudsen TB, McKim J, Harris G, Pamies D, Hartung T (2014) Consensus report on the future of animal-free systemic toxicity testing. ALTEX 31:341-356.
[6] Sandberg A (2014) Ethics of brain emulations. Journal of Experimental & Theoretical Artificial Intelligence 26:439-457.

Want to cite this post?

Ahlgrim, N.S. (2017). The Neuroethics Blog Series on Black Mirror: San Junipero. The Neuroethics Blog. Retrieved on , from http://www.theneuroethicsblog.com/2017/10/the-neuroethics-blog-series-on-black.html.
