
The Neuroethics of Brainprinting

By Anna Farrell 

Anna Farrell is a rising second-year undergraduate student at Emory University. Early in her Neuroscience major, she became interested in the field's interdisciplinary nature and went on to declare a second major in English.

As cyber espionage and hacking rise (Watson, 2016), major corporations, governments, and financial systems have pushed for biometrics as a more secure way to guard their data. Biometrics measures unique physical characteristics to ascertain a person's identity. A wide range of characteristics is currently used, including DNA, iris, retina, face, fingerprint, finger geometry, hand geometry, odor, vein, and voice identification (Types of Biometrics). Governmental uses for biometrics span border control, customs services, and online access to critical systems. However, fingerprint and iris identification results are becoming easier to replicate as hackers' abilities advance (Watson, 2016), prompting researchers to look beyond the typical biometric features. One of the new methods being studied is electroencephalogram (EEG)-based neurological identification. Using brain wave biometrics as a means of identification, however, establishes a framework that, if underestimated, could put sensitive personal data in jeopardy.

Lawrence Farwell invented Brain Fingerprinting as a method of determining what information is contained in the brain (Ahuja & Singh, 2012). The work began in 1986 with the investigation of event-related potentials. Event-related potentials in the P300 response category are electrical signals that occur roughly 300 milliseconds after a subject is shown a stimulus recognized as familiar (Farwell, 2014). Over time, many Brain Fingerprinting methods developed additional factors to supplement the analysis of the P300 brain response. One of the more prevalent approaches measures a neurological reaction named the “memory and encoding related multifaceted electroencephalographic response,” or MERMER (Ahuja & Singh, 2012).
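The core logic of stimulus-locked ERP detection can be sketched in a few lines of code. The sketch below is a generic illustration of epoch averaging, the standard way a consistent response like the P300 is pulled out of noisy EEG; the sampling rate, window lengths, and numbers are invented for this example and are not Farwell's actual procedure.

```python
import numpy as np

FS = 250                  # assumed EEG sampling rate in Hz (illustrative)
WINDOW = int(0.6 * FS)    # analyze the 600 ms that follow each stimulus

def average_epochs(eeg, stimulus_onsets):
    """Average the EEG segments that follow each stimulus onset.

    eeg: 1-D array of voltage samples from one electrode.
    stimulus_onsets: sample indices at which stimuli were shown.
    Random background activity cancels out across many epochs, so a
    consistent event-related potential (like the P300) stands out.
    """
    epochs = [eeg[t:t + WINDOW] for t in stimulus_onsets
              if t + WINDOW <= len(eeg)]
    return np.mean(epochs, axis=0)

def p300_amplitude(avg_epoch):
    """Peak amplitude in the 250-400 ms window, where a P300 would appear."""
    lo, hi = int(0.25 * FS), int(0.40 * FS)
    return float(np.max(avg_epoch[lo:hi]))
```

In a concealed-information test, the averaged response to crime-related "probe" stimuli would be compared against responses to irrelevant stimuli; a markedly larger amplitude around 300 ms is taken as evidence of recognition.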

This image depicts Dr. Lawrence Farwell conducting a Brain Fingerprinting test on Terry Harrington. (Image courtesy of Wikimedia Commons.)

In 1977, Terry Harrington was convicted of murder; 22 years later, Farwell performed a MERMER test on him that was influential in the decision to release him from prison (Farwell, 2014). The crime-related stimuli did not produce a MERMER signal, while the alibi-related stimuli did, implying that Harrington's brain contained information about the alibi but not the details of the crime. Ultimately, however, it was a Brady violation that led the Iowa court to overturn the case (Harrington v. State, 2003). Some researchers are suspicious of the highly acclaimed accuracy of brain fingerprinting, as the results have not been verified by researchers outside Farwell's lab (Holley, 2009).

Now, Farwell’s brain fingerprinting is being researched for application to neurological authentication systems. The argument for developing such systems is compelling because brains are physically less accessible to hackers than fingerprints or DNA, and are thus harder to falsify. However, interference from the skin and skull can make it difficult to get a clean EEG signal (Usakli, 2010), which may lead to inconsistencies in signal identification. More recently, functional near-infrared spectroscopy (fNIR) security systems have been developed, which have a much higher signal-to-noise ratio than EEG. While EEG systems measure electrical brain activity (Spine, 2014), fNIR measures blood flow in the brain, as fMRI does, giving the method greater spatial resolution while retaining the mobility of an EEG system (Strait & Scheutz, 2014).
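Whatever the sensor, biometric authentication generally follows the same template-matching pattern: enroll a user by storing an averaged feature vector, then accept or reject new readings by their similarity to that template. The sketch below illustrates this general pattern with correlation as the similarity measure; real EEG systems use richer features and classifiers, and the 0.9 threshold is an invented value for illustration.

```python
import numpy as np

def enroll(feature_samples):
    """Average several feature vectors recorded from one user into a
    stored template (e.g., band-power values from each electrode)."""
    return np.mean(feature_samples, axis=0)

def verify(template, candidate, threshold=0.9):
    """Accept the candidate reading if its correlation with the enrolled
    template exceeds the threshold; reject it otherwise."""
    r = np.corrcoef(template, candidate)[0, 1]
    return bool(r >= threshold)
```

The privacy concern discussed below arises precisely because the stored template is not an opaque token: it is a structured summary of brain activity, and its features may correlate with traits the user never agreed to disclose.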

A sample of human EEG data. (Image courtesy of Wikimedia Commons.)

These neural authentication systems herald a new wave of more efficient neurotechnology, demanding greater analysis of the privacy violations that may arise. While constant mining of brain data can offer better security, it also poses a potential threat to the individual. Texas Tech researcher Abdul Serwadda and graduate student Richard Matovu discovered that a six-electrode EEG authentication system leaked personal information while verifying the identity of the user. In a study with 25 clinically diagnosed alcoholics and 25 non-alcoholics, the EEG security system recognized the alcoholics with approximately 68-75% accuracy (Matovu & Serwadda, 2016). These results were obtained with a mutual information metric, which measured how strongly the stored EEG templates related to alcohol usage behavior (Matovu & Serwadda, 2016). However, the researchers also showed that, at the cost of only a slight drop in authentication accuracy, the system could be adjusted so that this private information could no longer be extracted from the EEG data (Matovu & Serwadda, 2016). Another study has shown that all three beta bands, brain waves within signature frequency ranges, display an unmistakable amplification in the EEG readings of alcohol-dependent subjects (Rangaswamy et al., 2002).
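The kind of leakage analysis described above can be illustrated with a generic mutual-information estimator: it quantifies, in bits, how much knowing a template feature reduces uncertainty about a sensitive label. This is a textbook discrete-MI sketch, not Matovu and Serwadda's exact metric, and the feature/label values are hypothetical.

```python
import numpy as np

def mutual_information(feature, label):
    """I(X;Y) in bits between two discrete 1-D arrays, e.g. a binarized
    template feature (beta power above/below median) and a sensitive
    trait label. Zero bits means the feature reveals nothing."""
    feature, label = np.asarray(feature), np.asarray(label)
    mi = 0.0
    for x in np.unique(feature):
        for y in np.unique(label):
            pxy = np.mean((feature == x) & (label == y))  # joint probability
            px, py = np.mean(feature == x), np.mean(label == y)
            if pxy > 0:
                mi += pxy * np.log2(pxy / (px * py))
    return mi
```

In these terms, the privacy fix reported in the study amounts to pushing leaky template features toward zero mutual information with the sensitive trait, at a small cost in authentication accuracy.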

Although these technologies are not ready for commercial use, their ethical complications ought to be analyzed before the hypothetical becomes reality. Serwadda states that in the wrong hands, these brain waves could be used to gain insight into employees' possible mental health conditions, learning disabilities, substance abuse, and more (Watson, 2016). Although this is speculative, it rightly emphasizes precaution as we continue to develop authentication measures. If this sort of sensitive information can be gleaned from an EEG system, the unwarranted access that could be surrendered with the richer data collected by an fNIR security system should certainly not be underestimated.

Image courtesy of Pexels.

The use of neural authentication security systems turns brain data, previously accessible only to health care professionals, into a commodity. Some companies make raw EEG data available to the consumer only in their more expensive products, turning others' neurological data into a tool that can be sold for greater profit. Such brain data provides corporations and governments with a whole new database of personal information that they can access in the name of quality assurance or national security. Without dialogue educating the public on EEG systems and the information that can be derived from an EEG signal, many of us may remain unaware of the richness of personal information that can be channeled through an electrode.

Other concerns about neural fingerprinting focus on how its implementation may infringe on our constitutional rights. Should neurological data be treated like other non-invasive physiological samples, such as blood or urine? Or does it qualify as testimony, deserving protection as private? Because brain fingerprinting data does not fit fully into either category, some advocate a new legal framework to address possible violations of constitutional rights, specifically the Fourth (Farahany, 2012) and Fifth (Waller, Bernstein, & Ladov, 2012) Amendments. The Fourth Amendment protects the individual from unreasonable searches and seizures; the Fifth Amendment guarantees that no one can be compelled to testify against themselves. Possible inaccuracies of the proposed methods further cloud the discussion if these technologies are ever to be called upon in a legal setting.

Image courtesy of Wikimedia Commons.

Americans have compromised on privacy many times to better protect the general population, with post-9/11 airport security initiatives as one example (Larkin, 2011). However, the violation of neuroprivacy is unlike previous breaches of privacy executed in the name of security, for none of those compromises intrudes so personally and directly into the workings of the mind. Situational justification does not validate this abuse; instead, it sets a precedent that could eventually lead to wide and indiscriminate acceptance of brain fingerprinting. Championing efficiency, safety, and convenience to push new biometric brain authentication methods misleads the public into thinking that these methods are harmless and guarantee protection from hackers.

Dialogue about who owns brain data, and about how security systems should obtain consent for access to it, is essential to creating safe and effective authentication services. Research highlighting the compromises we may face with brain biometrics must continue to be guided by these principles, to ensure our autonomy over our brain data in the face of technological advances.


Ahuja, D., & Singh, B. (2012). Brain fingerprinting. Journal of Engineering and Technology Research, 4(6), 98-113. doi:10.5897/JETR11.061

Basulto, D. (2014, November 21). The heartbeat vs. the fingerprint in the battle for biometric authentication. Retrieved March 28, 2017, from 

Behavioral Biometrics. (n.d.). Retrieved April 20, 2017, from 

Beyond therapy: biotechnology and the pursuit of happiness. (2004). Choice Reviews Online, 42(03). doi:10.5860/choice.42-1550 

Bonaci, T., Calo, R., & Chizeck, H. J. (2015). App Stores for the Brain: Privacy and Security in Brain-Computer Interfaces. IEEE Technology and Society Magazine, 34(2), 32-39. doi:10.1109/mts.2015.2425551

Chuang, J., Nguyen, H., Wang, C., & Johnson, B. (2013). I Think, Therefore I Am: Usability and Security of Authentication Using Brainwaves. Financial Cryptography and Data Security Lecture Notes in Computer Science, 1-16. doi:10.1007/978-3-642-41320-9_1 

Dalbey, B. (1999, August 17). Farwell’s Brain Fingerprinting traps serial killer in Missouri. Retrieved May 18, 2017, from 

Farah, M. (2015). An Ethics Toolbox for Neurotechnology. Neuron, 86(1), 34-37. doi:10.1016/j.neuron.2015.03.038

Farahany, N. A. (2012). Searching Secrets. University of Pennsylvania Law Review, 160(5), 1239-1308. Retrieved August 13, 2017.

Farwell, L. A. (2014). Brain Fingerprinting: Detection of Concealed Information. Wiley Encyclopedia of Forensic Science, 1-12. doi:10.1002/9780470061589.fsa1013

Farwell, L. A., & Makeig, T. H. (2005). Farwell Brain Fingerprinting in the case of Harrington v. State. Open Court X, Indiana State Bar Assoc., 7-10. Retrieved May 17, 2017.

Goodman, M. (2015, February 24). Fingerprint and Iris Scanners Seem Secure, but They Aren’t Hack-Proof. Retrieved March 10, 2017, from 

Harrington v. State (February 26, 2003), FindLaw 01-0653.

Holley, B. (2009). It's All in Your Head: Neurotechnological Lie Detection and the Fourth and Fifth Amendments. The Institute of Law, Psychiatry & Public Policy, The University of Virginia, 28(1), 1-76. Retrieved April 20, 2017.

Jimmy Ray Slaughter. (n.d.). Retrieved May 18, 2017, from 

Larkin, P. (2011, December 9). How Must America Balance Security and Liberty. Retrieved March 26, 2017, from

Matovu, R., & Serwadda, A. (2016). Your substance abuse disorder is an open secret! Gleaning sensitive personal information from templates in an EEG-based authentication system. 2016 IEEE 8th International Conference on Biometrics Theory, Applications and Systems (BTAS). doi:10.1109/btas.2016.7791210

Liberatore, S. (2016, August 04). Hackers could get inside your BRAIN: Experts warn of growing threat from monitoring and controlling neural signals. Retrieved April 21, 2017, from 

Purcell, R., & Rommelfanger, K. (2015). Internet-Based Brain Training Games, Citizen Scientists, and Big Data: Ethical Issues in Unprecedented Virtual Territories. Neuron, 86(2), 356-359. doi:10.1016/j.neuron.2015.03.044

Rangaswamy, M., Porjesz, B., Chorlian, D. B., Wang, K., Jones, K. A., Bauer, L. O., . . . Begleiter, H. (2002). Beta power in the EEG of alcoholics. Biological Psychiatry, 52(8), 831-842. doi:10.1016/s0006-3223(02)01362-8

Spine, M. B. (2014, April). Electroencephalogram (EEG). Retrieved March 26, 2017, from

Strait, M., & Scheutz, M. (2014). What we can and cannot (yet) do with functional near infrared spectroscopy. Frontiers in Neuroscience, 8. doi:10.3389/fnins.2014.00117

Strong, K. (2014). “Pass-thoughts” and non-deliberate physiological computing: When passwords and keyboards become obsolete. The Neuroethics Blog. Retrieved April 20, 2017, from

The issues with biometric systems. (n.d.). Retrieved March 10, 2017, from

Types of Biometrics. (n.d.). Retrieved March 26, 2017, from 

Ulman, Y. I., Cakar, T., & Yildiz, G. (2015). Ethical Issues in Neuromarketing: “I Consume, Therefore I am!”. Science and Engineering Ethics, 20. doi:10.1007/s11948-014-9581-5

Usakli, A. B. (2010). Improvement of EEG Signal Acquisition: An Electrical Aspect for State of the Art of Front End. Computational Intelligence and Neuroscience, 2010, 1-7. doi:10.1155/2010/630649

Waller, G., Bernstein, S., & Ladov, L. (Directors). (2012, March 20). Neuroimaging in the Courtroom: Video by Neuroethics Creative Team[Video file]. Retrieved August 13, 2017, from 

Watson, G. (2016, September 30). Professor Shows Brain Waves Can be Used to Detect Potentially Harmful Personal Information. Retrieved February 20, 2017, from

Want to cite this post?

Farrell, A. (2017). The Neuroethics of Brainprinting. The Neuroethics Blog. Retrieved on , from

