The Neuroethics of Brainprinting
Lawrence Farwell invented Brain Fingerprinting as a method of determining what information is contained in the brain (Ahuja & Singh, 2012). His work began in 1986 with the investigation of event-related potentials. Event-related potentials in the P300 response category are electrical signals that occur roughly 300 milliseconds after the subject has been shown a stimulus that is recognized as familiar (Farwell, 2014). Over time, many Brain Fingerprinting methods developed, with additional factors supplementing the analysis of the P300 brain response. One of the more prevalent approaches measures a neurological response named the “memory and encoding related multifaceted electroencephalographic response,” or MERMER (Ahuja & Singh, 2012).
|This image depicts Dr. Lawrence Farwell conducting a Brain Fingerprinting test on Terry Harrington.
(Image courtesy of Wikimedia Commons.)
In 1977, Terry Harrington was convicted of murder; 22 years later, Farwell performed a MERMER test on him that was influential in the decision to release him from prison (Farwell, 2014). The crime-related stimuli did not produce a MERMER response, while the alibi-related stimuli did, implying that Harrington’s brain contained information concerning the alibi but not the details of the crime. However, it was ultimately a Brady violation that led the Iowa court to overturn the conviction (Harrington v. State, 2003). Some researchers are skeptical of the highly acclaimed accuracy of brain fingerprinting, as the results have not been verified by researchers beyond Farwell’s lab (Holley, 2009).
|A sample of human EEG data.
(Image courtesy of Wikimedia Commons.)
These neural authentication systems herald a new wave of neurotechnology with heightened efficiency, demanding closer analysis of the privacy violations that may arise. While this constant mining of brain data can offer better security, it also poses a potential threat to the individual. Texas Tech researcher Abdul Serwadda and graduate student Richard Matovu found that a six-electrode EEG authentication system leaked personal information while verifying the identity of the user. In a study with 25 clinically diagnosed alcoholics and 25 non-alcoholics, the EEG security system recognized the alcoholics with approximately 68-75% accuracy (Matovu & Serwadda, 2016). These results were obtained through a mutual information metric, which quantified how much the users’ EEG data revealed about their alcohol use (Matovu & Serwadda, 2016). However, by slightly compromising authentication accuracy, the researchers could prevent this private information from being extracted from the EEG data (Matovu & Serwadda, 2016). Another study showed that all three beta bands (sub-ranges of the EEG frequency spectrum) displayed an unmistakable amplification in the EEG readings of alcohol-dependent subjects (Rangaswamy et al., 2002).
|Image courtesy of Pexels.
The use of neural authentication security systems turns brain data, previously accessible only to health care professionals, into a commodity. Some companies make raw EEG data available to the consumer only in their more expensive products, turning others’ neurological data into a tool that can be sold for greater profit. Such brain data provides corporations and governments with a whole new store of personal information that they can access in the name of quality assurance or national security. Without dialogue to educate the public on EEG systems and the information that can be derived from an EEG signal, many of us may be unaware of the richness of personal information that can be channeled through an electrode.
|Image courtesy of Wikimedia Commons.
Americans have compromised on privacy many times to better protect the general population (Larkin, 2011), with post-9/11 airport security initiatives as one example. However, the violation of neuroprivacy is unlike previous breaches of privacy executed in the name of security, for none of those compromises intrudes so personally and directly into the workings of the mind. Situational justification does not validate this abuse; instead, it sets a precedent that could eventually lead to a wide and indiscriminate acceptance of brain fingerprinting. Championing efficiency, safety, and convenience to push new biometric brain authentication methods misleads the public into thinking that these methods are harmless and guarantee protection from hackers.
Farwell, L. A., & Makeig, T. H. (2005).
Holley, B. (2009). It’s All in Your Head: Neurotechnological Lie Detection and the Fourth and Fifth Amendments. The Institute of Law, Psychiatry & Public Policy, University of Virginia, 28(1), 1-76. Retrieved April 20, 2017.
Want to cite this post?
Farrell, A. (2017). The Neuroethics of Brainprinting. The Neuroethics Blog. Retrieved on
, from http://www.theneuroethicsblog.com/2017/09/the-neuroethics-of-brainprinting.html