Tuesday, October 31, 2017

The Neuroethics Blog Series on Black Mirror: Men Against Fire

By Sunidhi Ramesh

Image courtesy of Pexels.
Humans in the 21st century have an intimate relationship with technology. Much of our lives are spent being informed and entertained by screens. Technological advancements in science and medicine have helped and healed in ways we previously couldn’t dream of. But what unanticipated consequences may be lurking behind our rapid expansion into new technological territory? This question is continually being explored in the British sci-fi TV series Black Mirror, which provides a glimpse into the not-so-distant future and warns us to be mindful of how we treat our technology and how it can affect us in return. This piece is part of a series of posts that will discuss ethical issues surrounding neuro-technologies featured in the show and will compare how similar technologies are impacting us in the real world. 

*SPOILER ALERT* - The following contains plot spoilers for the episode “Men Against Fire” of the Netflix television series Black Mirror.

Tuesday, October 24, 2017

Too far or not far enough: The ethics and future of neuroscience and law

By Jonah Queen

Image courtesy of Pixabay.
As neurotechnology advances and our understanding of the brain increases, there is a growing debate about whether, and how, neuroscience can play a role in the legal system. In particular, some are asking whether these technologies could ever be used to accomplish things that humans have so far been unable to do, such as performing accurate lie detection and predicting future behavior.

For September’s Neuroethics and Neuroscience in the News event, Dr. Eyal Aharoni of Georgia State University spoke about his research on whether biomarkers might improve our ability to predict the risk of recidivism in criminal offenders. The results were published in a 2013 paper titled “Neuroprediction of future rearrest,”1 which was reported in the media under headlines such as “Can we predict recidivism with a brain scan?” The study reports evidence that brain scans could potentially improve offender risk assessment. At the event, Dr. Aharoni led a discussion of the legal and ethical issues that follow from such scientific findings. He asked: “When, if ever, should neural markers be used in offender risk assessment?”

Tuesday, October 17, 2017

Hot Off the Presses: The Neuroethics Blog Reader and Issue 8.4

It is our pleasure to present you with two newly released publications: the second edition of The Neuroethics Blog reader and issue 8.4 of the American Journal of Bioethics Neuroscience.

Image courtesy of Flickr user Leo Reynolds.

Monday, October 9, 2017

The Neuroethics Blog Series on Black Mirror: San Junipero

By Nathan Ahlgrim

Image courtesy of Wikimedia Commons.
Humans in the 21st century have an intimate relationship with technology. Much of our lives are spent being informed and entertained by screens. Technological advancements in science and medicine have helped and healed in ways we previously couldn’t dream of. But what unanticipated consequences may be lurking behind our rapid expansion into new technological territory? This question is continually being explored in the British sci-fi TV series Black Mirror, which provides a glimpse into the not-so-distant future and warns us to be mindful of how we treat our technology and how it can affect us in return. This piece is part of a series of posts that will discuss ethical issues surrounding neuro-technologies featured in the show and will compare how similar technologies are impacting us in the real world.

*SPOILER ALERT* - The following contains plot spoilers for the episode “San Junipero” of the Netflix television series Black Mirror.

Tuesday, October 3, 2017

“It is sometimes a sad life, and it is a long life:” Artificial intelligence and mind uploading in World of Tomorrow

By Jonah Queen

"The world of tomorrow" was the motto of the
1939 New York World's Fair
Image courtesy of Flickr user Joe Haupt
“One day, when you are old enough, you will be impregnated with a perfect clone of yourself. You will later upload all of your memories into this healthy new body. One day, long after that, you will repeat this process all over again. Through this cloning process, Emily, you will hope to live forever.”

These are some of the first lines of dialogue spoken in the 2015 animated short film World of Tomorrow.* They introduce the technology and the society that this science fiction film imagines might exist in our future. With a sequel released last month, I am dedicating a post on this blog to discussing the film through a neuroethical lens.