Who Owns My Thoughts?

I attended the excellent Neuroscience, Law, and Ethics of Lie Detection Technologies Symposium in May, and as a consequence, I have spent the last month trying to answer questions I hadn’t even thought to ask before: Who owns the thoughts in my head? Could I be compelled to surrender them? Can someone else decide that keeping my ideas to myself is a violation of the law or a threat to my country? If they force me to give them up, do I lose ownership? So this week, I thought I would share some of the things I learned as I tried to find answers.


Two preliminary points: first, I want to specify what I mean when I say “compelled” to undergo a brain scan. It seems, at least it seemed to me while sitting in the audience, that Americans are pretty afraid of having someone else read their minds without their permission, or, worse, being forced to have their minds read. This extends even to a simplistic form of mind reading such as rudimentary lie detection. I have to say, I understand this fear, and for me, it boils down to this – I would be afraid that the government could, by compelling me to undergo a brain scan, make me give up information that I didn’t even know I was concealing. Lest I spin totally into conspiracy theory territory, I tried to approach the question more systematically by researching how brain scans fit into our current constitutional protections against unlawful search and seizure.

Second, a note about how lie detection is currently used. Both during the lunch we had earlier in the afternoon and the symposium itself, all of the featured speakers pointed out that fMRI lie detection evidence isn’t admissible in court.[1] Steven Laken himself has even been involved in several landmark cases. In actuality, the forensic application of lie detection technology goes far beyond courtrooms. Results from a lie detector, including those obtained via fMRI, can be used in a variety of situations, including, but not limited to: arbitration, civil commitment, and parole, sentencing, or administrative hearings. They can also, in theory, be used by law enforcement officials in the course of an investigation, as long as such procedures lead to evidence that can be used in court and nothing is obtained illegally (more on what that means later).[2] So, fMRI technology can already be used for legal applications, both in the civil and the criminal areas (although I am leaving aside some of the more complex legal arenas, like military courts and investigations conducted under the PATRIOT Act).

So, let’s go back to my original question: who owns the thoughts in my head? Do I own them? What process must someone follow in order to seize them?

I started by thinking about thoughts as a product of my body. After all, don’t I own what is inside my body? Well, it turns out, I only own it until someone takes it from me. This might happen as part of a routine medical examination, where a doctor takes a blood or urine sample for testing. I might even request that someone remove something from inside of me – a tumor, for instance. But once they have taken it, guess what? I don’t own it anymore.[3] And that is just for medical use. Legally, persons can be compelled to give up physical evidence, such as DNA, or to submit to measurement and recording, such as fingerprinting. All of these fall under the Fourth Amendment, meaning that as long as law enforcement obtains the proper warrants, they can gather physical evidence – even if that evidence is part of your body.

But thoughts are different – or, at least, they probably are. As both Paul Wolpe and Hank Greely emphasized during the symposium, lie detection evidence, even when gathered through fMRI, is likely going to be considered testimony and not physical evidence. That is, it would be subject to the rules of the Fifth Amendment and not the Fourth.[4] The Fifth Amendment, for anyone who hasn’t spent a ridiculous amount of his or her life watching Law & Order, is the rule that says you can refuse to testify if the testimony you give would incriminate you. Since about the mid-1960s, physical evidence has been exempt from the Fifth Amendment, meaning you can be forced to surrender physical evidence (or, for example, try on a glove or clothing in front of a jury) even if that evidence would incriminate you.

To paraphrase Nita Farahany, the Fifth Amendment covers what comes out of your mouth,
as long as what comes out is words and not saliva.

Okay, that’s all well and good – there are people considering how and when someone can scan my brain in the event that I am charged with a crime. But what about accidental discovery? What if, while being scanned about whether or not I ran that stop sign over on Clairemont Avenue last week (for the record, I absolutely did not), I happen to let slip that I’ve discovered the secret to safe, efficient nuclear power? (I also haven’t done that, just to be clear.) Does the person questioning me now own that statement too? Could they compel me to release it to the government? Or, worse yet, could they claim it as their own?

There are already laws and procedures in place for what happens if, in the course of the investigation of one crime, law enforcement officers find information about other criminal activity (you can, for example, give a witness immunity in order to convince them to testify). But my right to protect my knowledge about nuclear power is another matter entirely. In fact, I’d be willing to bet that questions about brain images and ideas that exist only in someone’s head (i.e., haven’t been written down yet) get into a fair amount of copyright, trademark, and patent law… and my head already hurts.

Luckily for me and my aching head, legal scholar Nita Farahany has already started investigating these questions. In “Incriminating Thoughts,” she points out that emerging neurotechnology has so changed the way we measure the mind that it justifies an entirely new system of cataloging evidence.[5] She argues for abandoning the older physical/testimonial dichotomy (which I’ve started thinking of as the Fourth/Fifth Amendment dichotomy) in favor of a spectrum of evidence that includes “identifying, automatic, memorialized, and uttered.” This would cover all the different ways a person’s thoughts could be measured or recorded during the investigation of a crime.

In a newer article titled “Searching Secrets,” set to be published later this year, she applies this standard to a wider spectrum of information, including “tangible and intangible thoughts, ambitions, and expressions.”[6] In her system, investigators would be guided by the rules of intellectual property law rather than by more traditional Fourth Amendment concepts of property (home, possessions, papers). This system integrates copyright concerns into discussions of which secrets can be investigated and uncovered, by whom, and for what purpose, and it would offer more protection. This is largely because it would rest on a wider concept of the “reasonable expectation of privacy,” the guiding principle when deciding what can and cannot be collected as evidence without a warrant.[7] This integration would also, as far as I understand, allow for a more thorough investigation into how copyright functions when it comes to un-uttered and un-written ideas.

Alright, so, that covers whether or not I can be forced to submit my mind to scrutiny, and what people can do with the thoughts they may find there. The answers are far from set in stone, but there are definitely debates going on, which puts my mind at ease (har har).


What about the things I am thinking of doing? What if, in the course of an investigation of my thoughts (admittedly one using a much more advanced system than we have now) law enforcement agents find that I am planning to commit a pretty terrible crime?
This may seem like I have ventured into the realm of science fiction (when an audience member asked a similar question during the symposium, Paul Wolpe answered, “What you are talking about is Minority Report.”). In fact, future dangerousness has long been a concern of forensic psychiatry, and some forms of prediction are already in forensic use. Civil commitment hearings are designed to determine the likelihood that someone will cause harm in the future – that is, whether the person in question is a danger to self or others, and therefore should be locked up. But what about beyond that? What about systems designed not only to curtail the actions of dangerously ill persons, but to prevent crime by predicting it?

In terms of brain imaging, and certainly as far as the technology discussed at the symposium, this is a futuristic vision indeed. But that doesn’t mean there aren’t emerging crime prediction technologies. (Go ahead and Google “precrime” if you don’t believe me.) Tune in next month, when I’ll be blogging about how, where, and why “precrime” technology is being developed.

Want to cite this post?
Cipolla, C. (2012). Who Owns My Thoughts?. The Neuroethics Blog. Retrieved on , from http://www.theneuroethicsblog.com/2012/06/who-owns-my-thoughts.html

[1] For details about the use of Steven Laken’s technology in court, see David Nicholson’s blog post. For an overview of the standards for admitting scientific evidence, see Jamie Witter’s guest post.  
[2] For an overview of emerging uses for neuroimaging, including fMRI, see Joseph R. Simpson, Neuroimaging in Forensic Psychiatry: From the Clinic to the Courtroom (Chichester, West Sussex: Wiley-Blackwell, 2012).
[3] The rules of ownership governing medical tissue samples have been the subject of a lot of recent media attention, largely due to the publication of Rebecca Skloot’s The Immortal Life of Henrietta Lacks.
[4] Sarah E. Stoller and Paul Root Wolpe, “Emerging Neurotechnologies for Lie Detection and the Fifth Amendment,” 33 Am. J.L. & Med. 359 (2007).
[5] Nita A. Farahany, “Incriminating Thoughts,” 64 Stan. L. Rev. 351 (2012). Available at SSRN.
[6] Nita A. Farahany, “Searching Secrets,” University of Pennsylvania Law Review (2012). Available via UChicago.edu.
[7] Basically, and I am really paraphrasing here, the key is that copyright also gives people the right not to publish something, that is, to keep it secret. Farahany uses the famous J.D. Salinger case as an example.
