Tuesday, December 19, 2017

The Neuroethics Blog Series on Black Mirror: White Christmas

By Yunmiao Wang

Miao is a second-year graduate student in the Neuroscience Program at Emory University. She has watched Black Mirror since it first came out and has long been interested in topics in neuroethics.

Humans in the 21st century have an intimate relationship with technology. Much of our lives are spent being informed and entertained by screens. Technological advancements in science and medicine have helped and healed in ways we previously couldn’t dream of. But what unanticipated consequences may be lurking behind our rapid expansion into new technological territory? This question is continually explored in the British sci-fi TV series Black Mirror, which offers a glimpse into the not-so-distant future and warns us to be mindful of how we treat our technology and how it can affect us in return. This piece is the final installment in a series of posts discussing the ethical issues surrounding neurotechnologies featured in the show and comparing how similar technologies are impacting us in the real world.

SPOILER ALERT: The following contains plot spoilers for the Netflix television series Black Mirror.

Tuesday, December 12, 2017

Neuroethics in the News Recap: Psychosis, Unshared Reality, or Clairaudience?

By Nathan Ahlgrim

[Image: Even computer programs, like DeepDream, hallucinate. Courtesy of Wikimedia Commons.]
Experiencing hallucinations is one of the surest ways to be labeled with one of the most derogatory of words: “crazy.” Hearing voices that no one else can hear is a popular laugh line (look no further than Phoebe in Friends), but it can be a serious and distressing symptom of schizophrenia and other incapacitating disorders. Anderson Cooper demonstrated the seriousness of the issue, finding the most mundane tasks nearly impossible as he lived a day immersed in simulated hallucinations. With increasing visibility and sensitivity, psychotic symptoms are less frequently the butt of jokes, but people with schizophrenia and others who hear voices still face stigma. Of course, people with schizophrenia deserve care within the mental healthcare system to ease their suffering and manage their symptoms, but there is also a population at peace with the voices only they can hear. At last month’s Neuroethics and Neuroscience in the News meeting, Stephanie Hare and Dr. Jessica Turner of Georgia State University drew the contrast between people with schizophrenia and those whom scientists call “healthy voice hearers.” In doing so, they argued that hearing voices should not necessarily be considered pathological, reframing what healthy and normal behavior should include.

Tuesday, December 5, 2017

Neuroethics, the Predictive Brain, and Hallucinating Neural Networks

By Andy Clark

Andy Clark is Professor of Logic and Metaphysics in the School of Philosophy, Psychology and Language Sciences at Edinburgh University in Scotland. He is the author of several books, including Surfing Uncertainty: Prediction, Action, and the Embodied Mind (Oxford University Press, 2016). Andy is currently PI on a four-year ERC-funded project, Expecting Ourselves: Prediction, Action, and the Construction of Conscious Experience.

In this post, I’d like to explore an emerging neurocomputational story that has implications for how we should think about ourselves and about the relations between normal and atypical forms of human experience.