The Social Impact of Brain Machine Interfaces: Value Sensitive Design in Neurotechnology
By Tim Brown and Karen Rommelfanger
*Image courtesy of Pixabay*
In our third and final workshop, we considered how brain–machine interfaces can be designed in ways that take human values into account—in order to recognize, prevent, or mitigate potential harms or injustices. To this end, we considered the role human values play (or don’t play) in medical device design broadly, whether those values align between stakeholders (patients, caretakers, clinicians, communities), what role these values play in the development of BMIs currently, and how to best incorporate human values into the design of future BMIs. In other words, we asked what values are embedded in neurotechnologies, who they benefit, and how the field can do better. Joining us for this conversation were panelists: Dr. Laura Y. Cabrera (Michigan State University), Dr. Batya Friedman (University of Washington), Dr. David G. Hendry (University of Washington), and Lassana Magassa (University of Washington). Here, I summarize the themes that emerged from our conversation, catalyzed by a series of five discussion questions.
Editor’s note: We previously published posts about the first and second of these workshops.
1. What do we (collectively) mean when we talk about human values? What are they?
Our panelists started with a simple working definition of values—as ideals and goals that we uphold and strive for. Very quickly, however, the group puzzled over the distinction between values held by individuals, values held by communities, and values shared universally. That is, values are (at least in part) socially constructed and highly contested. We asked whether universal values, or values we share insofar as we are all human, exist in the first place. After all, our practices of making sense of what and how we value seem to vary widely between social contexts—across groups, communities, cultures, and generations. Many of the values that medical professionals try to preserve or respect—e.g., dignity—seem to take on a different meaning across those contexts. Even further, people from different cultures might have radically different ideas about what it means to preserve, respect, or uphold a value they believe they share.
*Image courtesy of Pixabay*
Laura’s approach was to isolate a set of simpler values that we are more likely to share between cultures. Dignity, for example, is a complex concept that has a life of its own in academic circles—with long lineages of thinkers defending different conceptions of dignity, arguments about who it applies to, and recommendations for respecting it. It’s possible, however, that simpler values become complicated eventually, or that values we think of as “simple” are filled with complexities on further inspection. For example, many attempts to promote health as a value take for granted a narrow view of what constitutes health, and construe disability as a problem to fix. People with disabilities, however, maintain that there is value in disability, that their lives are good lives, and that a conception of health should capture lives like theirs. No matter what our approach, we need to avoid building stigmas into our system of values.
2. How are values exhibited (or not exhibited) in the design of medical technologies, and how are those exhibited (or not) in the design of BMIs specifically?
We started by recognizing that values and technology have a reciprocal relationship: our values shape the technologies we create, and our technologies shape our values. BMIs for medical use, in particular, are not developed and used in isolation—they are, at the very least, the result of interactions between doctors, patients, and engineers. But these interactions are often predicated, again, on the clinical goal of “fixing” disability. We considered the case of cochlear implants as an example of this. Some deaf parents would rather their deaf (or hard of hearing) children not receive cochlear implants: they worry that cochlear implants are not good enough to give their children full access to hearing communities, and they worry that these implants will leave their children at the margins of deaf communities. These parents face a backlash from hearing people—but this backlash is predicated on underlying value systems that cast deafness as a problem to fix. This is a deep problem for parents who would prefer their children to have full access to a deaf cultural identity instead of feeling stuck between two identities. We asked further: will other forms of BMI conflict with the cultural identities of people with disabilities? It seems very possible—and device designers will need to engage with communities to better understand these possibilities.
3. What are the risks of failing to recognize and incorporate human values into the design of BMIs?
One possible risk Batya identified is analogous to what Martha Nussbaum calls “the tragic question”: when none of our available options are free from moral wrongdoing. BMI users could run into the tragic question when the constraints of their BMI force them to make a choice between several difficult outcomes. Designers should do everything they can to avoid posing the tragic question through their devices. Take, for example, how some technologies (like wireless networks and cloud computing platforms) saturate our lives in ways that make it almost impossible to opt out. Will we soon live in a world where we’ll rely on BMIs in the same ways? Will BMIs tie into the same infrastructures that we might want to opt out of for the sake of upholding our values?
*Image courtesy of Pixabay*
4. Which values ought to play a role in the design of BMIs? Whose values should play a role? How do we decide?
One thing was clear from our conversation: we are late, and the technology is already here. But it isn’t too late. We must determine how to make sure BMI design aligns with values we (as a society) endorse, promote, and prioritize. We need to think about individual and social values, and especially to ask how to support and propagate the right social values. That is, device designers and key members of supportive infrastructures—from academics to corporations—need to engage users and communities of potential users to ascertain their values. After all, recognition, inclusion, and justice are values that are in high demand but in short supply. If marginalized people are forgotten in the design process, and are denied a seat at the table, it is almost certain that BMIs will codify values that further marginalize them. As such, we not only need to decide which values play a role in the design of BMIs—we need to actively decide who will have the power to uphold or enforce values, and how that power will be expressed.
5. What practices should engineers and designers adopt to recognize human values and incorporate those values into future BMIs?
*Image courtesy of Pixabay*
Beyond the design of technologies themselves, however, we must also excavate the social structures our technologies are developed and distributed in. Laura reminds us that infrastructures themselves are also technologies that should also reflect the values of the people who interact with them. We need to have conversations about technologies, their infrastructures, and human values in public in order to figure out what goals we should pursue, decide who is responsible for the consequences, and build structures of transparency and accountability. Further, the scientific community is responsible for producing the necessary literacy so that we can move forward through informed public deliberation rather than through the ideas of capitalists.
Further Reading:
- Cabrera, Laura Y. "How does enhancing cognition affect human values? How does this translate into social responsibility?" In Ethical Issues in Behavioral Neuroscience, pp. 223-241.
- Friedman, Batya, and Hendry, David G. (2019). Value Sensitive Design: Shaping Technology with Moral Imagination. MIT Press.
- Sparrow, Robert. (2005). "Defending deaf culture: The case of cochlear implants." Journal of Political Philosophy, 13(2), 135-152.
_________________

Timothy Brown is an NIH postdoctoral research associate in the Department of Philosophy at the University of Washington and the lead architect of the Social Impacts and BMI workshop series. His work explores the role neural technologies—like deep–brain stimulators and brain–computer interfaces—(will) play in our experiences of self, in our interpersonal relationships, and in our societies more broadly.
Dr. Karen S. Rommelfanger received her PhD in neuroscience and received postdoctoral training in neuroscience and neuroethics. Her research explores how evolving neuroscience and neurotechnologies challenge societal definitions of disease and medicine. Dr. Rommelfanger is an Associate Professor in the Departments of Neurology and Psychiatry and Behavioral Sciences, the Neuroethics Program Director at Emory University’s Center for Ethics, and Senior Associate Editor at the American Journal of Bioethics Neuroscience. She is dedicated to cross-cultural work in neuroethics and is co-chair of the Neuroethics Workgroup of the International Brain Initiative. She is an appointed member of the NIH BRAIN Initiative Neuroethics Working Group and is ambassador to the Human Brain Project’s Ethics Advisory Board. She also serves as a Neuroethics Subgroup member of the Advisory Committee to the Director at NIH for designing a roadmap for BRAIN 2025. She was recently appointed to the Global Futures Council on Neurotechnology of the World Economic Forum. A key part of her work is fostering communication across multiple stakeholders in neuroscience. As such, she edits the largest international online neuroethics discussion forum, The Neuroethics Blog, and she is a frequent contributor and commentator in popular media such as The New York Times, USA Today, and The Huffington Post.
Want to cite this post?
Brown, T. & Rommelfanger, K. (2020). The Social Impact of Brain Machine Interfaces: Value Sensitive Design in Neurotechnology. The Neuroethics Blog. Retrieved on , from http://www.theneuroethicsblog.com/2020/09/the-social-impact-of-brain-machine.html.