Getting Out While the Getting’s Good
Dr. Davis is currently at Lehigh University. She taught at Cleveland-Marshall College of Law (Cleveland State University) and Central Michigan University. She received her doctorate in religion from the University of Iowa and her J.D. from the University of Virginia. Her specialty is bioethics, and her specific focus is the ethics of genetic medicine and genetic research. Dr. Davis’s latest book is Genetic Dilemmas: Reproductive Technology, Parental Choices, and Children’s Futures (2nd Edition, Oxford University Press, 2010). Dr. Davis has been a Fulbright scholar in India, Italy, Israel, Indonesia, and Sweden. Dr. Davis serves on the Central Institutional Review Board of the National Cancer Institute, and is a member of the NIH Embryonic Stem Cell Eligibility Working Group.
At this point, I hold my breath. I am about to ask my audience to choose whether they would prefer to be M or F, but the rest of my presentation relies on the assumption that most people will choose M. What if they don’t? At a recent conference in South Carolina, I almost funked it, made nervous by my own stereotypes about the South, and also because the previous speaker had given a heart-warming presentation of elderly people with dementia responding to music and clowns. At Emory University, my talk was preceded by a tremendously appealing presentation from a gentleman with a family history of Alzheimer’s. He spoke movingly of his aunt’s life and death with the disease; dare I suggest that most of us would prefer to die before we become symptomatic?
A portrait of a man with dementia.
(Image courtesy of Pixabay.)
However, despite the different venues, the response is always the same: virtually everyone in the room would prefer to die suddenly rather than live a decade longer with dementia. I know that there is a big gap between preferring to die before dementia and taking that death into one’s own hands. Nonetheless, I know that I am not the only person who plans to “get out while the getting’s good,” and to attempt to end my life before dementia robs me of the ability to act.
(Image courtesy of Flickr.)
Although my mother had spoken often during her life of her intention to commit suicide rather than live with dementia, she had left it too late and missed her window of opportunity. But what would have been the right time? My mother did not begin to experience dementia until she was nearly 90, so had she arbitrarily decided to end her life at 85 (an age by which nearly half of all Americans have some form of dementia), she would have lost some good years.
Where is the sweet spot?
This problem is brilliantly portrayed in Lisa Genova’s best-selling novel, Still Alice.1 Alice is a successful academic, at the top of her game, when she is diagnosed with early onset AD. She knows that research and teaching will soon be beyond her, but she is hoping for a few more years of enjoying her family and the mundane pleasures of an ice cream cone or a walk in the park. On the other hand, she is protective of her dignity and is determined not to end her life with a protracted decline into dementia.

She crafts a strategy in which she programs her smartphone to buzz her every week with a simple quiz; when she is no longer able to respond appropriately to questions about the date or the names of her daughters, she will be directed to open a folder on her computer in which she has left a letter, written by Alice now to her later, demented self. However, Alice fails to realize when she begins to fail the quiz, and eventually she leaves the phone in the freezer, which ruins it.

But one day, aimlessly clicking through files on her computer, she finds the letter she had written to her later self. The letter opens with words of love and reassurance, and then directs Alice to go upstairs to her bedroom, find a bottle marked “Alice” at the back of her nightstand drawer, take all the pills in the bottle with a big glass of water, get into bed, and go to sleep. The letter warns Alice not to discuss this with anyone—just do it. Alice wants to comply, but as she walks up the stairs to her bedroom, she forgets her purpose. She goes downstairs again to read the letter, remembers her purpose, but forgets again as she climbs the stairs. She wishes she could print out the letter to bring it with her, but no longer remembers how to work the printer. Eventually, she is distracted by her husband’s voice, and forgets the whole thing.
(Image courtesy of Wikimedia Commons.)
Although I doubt we will ever see a perfect solution to the “sweet spot” problem, the last decade has seen progress in a number of areas that can help individuals assess their background risk for Alzheimer’s, and the imminence of its approach. First, it is now possible to use direct-to-consumer genetic testing, such as 23andMe, to test oneself for an important genetic variable that influences one’s risk of getting Alzheimer’s: APOE. Although even having two APOE4 variants does not doom one to the disease, it substantially raises the likelihood. Those who inherit one copy of the e4 form have a three-fold higher risk of developing AD than those without the e4 form, while those who inherit two copies of the e4 form have an 8- to 12-fold higher risk.2 People who have themselves tested and discovered a higher than average risk might wish to take further steps to monitor for any signs of the disease. Intriguingly, Alzheimer’s is now seen as a “3-stage disease,” of which the first stage occurs even before symptoms develop, perhaps decades before.3 It is increasingly possible to identify those people for whom the disease process has begun, but before they are symptomatic. Even better, it might be possible to track the disease progression, so as to end one’s life as close as possible to the last “good” moment.
Current efforts to diagnose AD in the presymptomatic stage are driven by two scientific goals. First, finding the disease at the earliest possible stage identifies appropriate patients for treatments that could slow or perhaps even reverse the course of the disease. This is crucial, because there is general consensus that the reason there are no effective medications for Alzheimer’s is that by the time the disease produces symptoms, it is far too late. Second, presymptomatic diagnosis is a crucial building block in medical research that seeks to find and test those at high risk for the disease.
Presymptomatic testing runs the gamut from neuroimaging to track volume loss and cerebral blood flow in the brain, to concentrations of amyloid in the cerebrospinal fluid, PET scans, blood tests, and noninvasive tests of episodic memory.4 Other possibilities include motion sensors and “smart carpets” that diagnose impending dementia from changes in gait.5 These monitoring systems are part of a general movement to use technological surveillance to aid people in aging “successfully” at home, but there is no reason why a savvy and determined person could not make use of them to direct information only to herself.
(Image courtesy of Wikimedia Commons.)
The degree of certainty one needs to act is, obviously, a matter for each individual to decide. Each of us strikes a different balance between more years of life and the value of not becoming demented. People have been making these kinds of judgments for years, usually in the face of uncertainty. Women, for decades, have been asked to weigh the risk of having a child with Down syndrome against the risk of losing a pregnancy through amniocentesis. People with cancer balance the possible benefits of various treatments (many of them experimental) against the possibility of side effects that can include cardiac damage and even secondary cancers. This is no different. Death is irreversible, but so is dementia. And once one has started down the dementia road, it is too late to turn back.