Asking the Right Questions in Alzheimer’s Research
If the United States is the world’s innovation leader, it must lead the way on new directions in aging and neurodegeneration research.
I recently watched the final episode of the Wallander television series with Kenneth Branagh in the role of the eponymous Swedish police detective. Wallander is given his diagnosis of familial Alzheimer’s disease (ALZ), and proceeds to cognitively fall apart while still working to solve a complex crime. When he finally tells his daughter that the disease that destroyed her grandfather is now going to destroy her father, you can see in her eyes all that means to a young wife and mother who is for the moment thinking only about her father and wanting to care for him, even as Wallander’s eyes are filled with fear, doubt, and worry as he looks into his daughter’s future.
Forty years into a full-on effort to defeat Alzheimer’s disease as a major cause of cognitive decline and death—indeed, it is the sixth leading cause of death in the United States, afflicting nearly six million people and exacting an enormous emotional and financial toll on patients and families—this is where things stand: we have no treatments, and though efforts to improve early-stage diagnosis have had some success, their main impact is to inject enormous new uncertainties and anxieties into a patient’s view of the future and sense of self. Meanwhile, huge expenditures of scientific, corporate, and societal resources have yielded mainly a string of high-profile failures of ALZ therapies based on a single dominant scientific hypothesis. ALZ remains a disease with no known prevention, treatment, or cure.
We must, and we can, start doing better.
I am not a medical, legal, or ethical expert on aging or Alzheimer’s. So what perspective am I offering? One obvious perspective is that of a person entering her seventh decade along with many loved ones and friends who, if we are lucky, will continue aging while also enjoying rich and productive lives. Second, my own scholarly area of interest is metabolic brain disease and the physiology of the brain. Third, I am the president of a private foundation that has for the past 30 years supported emerging research in the fields of cognition and learning, complex systems, and enhancing recovery from neurological insults. The James S. McDonnell Foundation (JSMF) was an early supporter of the nascent field of network neuroscience, and I believe quite strongly that we must begin to take network science and what we are learning about complex, dynamic, adaptive systems seriously if we are to understand the alterations in cognitive function that accompany both normal and pathological aging.
An important part of JSMF’s mission is to support work that questions common wisdom, tests unexamined assumptions, and revisits dominant hypotheses. I was in graduate school when the amyloid hypothesis emerged as the dominant research direction in ALZ, and even at that time cogent concerns were expressed about putting all our eggs into this one basket of pathology and focusing solely on ridding the brain of a particular type of protein fragment, called beta-amyloid, as the main avenue to treatment.
How, then, did science and the drug industry drive themselves into the cul-de-sac of a single dominant theory of ALZ causation, leaving us, having invested billions of dollars over decades, little better off than we were when we started? There is no single answer, but for certain, dominant hypotheses have a way of reinforcing themselves. Once a field commits to a particular hypothesis, the research resources—funding, experimental models, and training—all get in line. Alternative hypotheses struggle to survive. An important consequence of hitching the field of ALZ research to the amyloid hypothesis is the dependence of researchers on established experimental animal models that have little or nothing to say about human neurodegeneration. Alzheimer’s disease is unique to the human brain. The rats and mice preferred by biomedical scientists do not develop Alzheimer’s. To be useful as disease models, the animals must be engineered to reflect characteristics of the human disease—but not in the human context. The engineering of disease models means further narrowing the quest for mechanisms and potential therapies because, as a practical matter, only a small number of specific genetic or molecular mechanisms can be pursued. The reductionist approach inherent in these models reproduces a few characteristics of the disease—often in a way that does not recapitulate the actual pathogenesis in the human brain. It has not just been the problem of looking for the keys under the streetlight, but of finding in that light a bent nail and declaring that the key has been found.
This “focus on the model and not the disease” approach in ALZ research dominates federal funding and academic science, leading to easily measurable outputs such as publications and citations, on which careers are built and millions of dollars spent. The “bent nail” thoroughly dominates the biomedical scientific imagination about how best to proceed. Although I would love to see some of the recently announced funding windfall for ALZ research used to create more and better sources of light, I am doubtful that borrowing reductionist experimental models from other diseases is the right solution, and worry that this will mostly just generate more bent nails.
Our work at JSMF has been, in part, an effort to open the imagination of scientists and science funders so they can go beyond the research boundaries so strongly circumscribed by the incentives and culture of the current system. Evidence of our too-limited imagination is technological as well as scientific. Every year The Scientist magazine strives to identify the latest and greatest tools, technologies, and techniques to hit the life-science landscape. The 2017 selection of winning products included such diverse items as a single-cell protein-analyzing microfluidic chip, a streamlined blood-testing device, advanced reagents for precision genome editing, and machines for analyzing transcriptomes, whole genomes, and peptide profiles. Glaringly absent from the list is anything that contributes to the life sciences at the level of the person—and certainly nothing at the level of a person embedded in a complex social ecosystem.
But what of the incredible emerging technological abilities to acquire knowledge directly from humans, whether through sophisticated and refined cognitive and behavioral testing or through wearables that can track changes in real behavior, such as the quality and frequency of social interactions or the intricacy of writing and speaking? Why is the revolution in technologies that track our every word and move as whole humans so rarely integrated into the biomedical approach? A richer understanding and characterization of the human disease in the full context of how an organism does or does not accomplish the behaviors needed to thrive in its environment might suggest very different targets for therapeutic interventions. Why wouldn’t we at least seriously entertain the possibility that such tools are a more fertile entry point for research on neurodegeneration than utterly abstracted tools such as genes and mouse brains? Yes, there are nascent attempts along these lines, but they represent boutique approaches in the vast enterprise of biomedical science.
In 2013 JSMF organized a multidisciplinary workshop on cognitive aging, with a focus on opening up the question of why scientific studies are always conducted, even when focused on normal aging, through the lens of cognitive loss. Participants represented the full range of expertise from cellular and molecular mechanisms of memory function to studies of wisdom. The impetus for the meeting was to revisit the commonplace “fact” that there is an inevitable (and perhaps irreversible) nonpathological “loss” of cognitive function in individuals over time.
We wanted to reexamine the findings on cognitive decline within a broader discussion of system aging and adaptation. What do we know about the life span trajectory of cognitive function as it is tuned not just by biology but by experience, knowledge, learning, and social relations? When given a problem to solve, is a 75-year-old adaptive system performing the same task as a 20-year-old system? These were the sorts of questions that drove our deliberations. We believe this network context allows new thinking about how neurodegenerative processes disrupt cognition and function. But network science remains thoroughly marginal to the ALZ research agenda.
We want to do more to change this. Beginning in 2018, through a partnership with the Santa Fe Institute, JSMF is supporting a working group of experts from neuroscience and complex systems science using what has been learned from studies of other networked systems—from transportation to the power grid—to seek new ideas about how brain networks “break.” We hope this reframing can open up new pathways in neurodegeneration research.
It is obvious that the biological determinants of cognition act in continual interconnected feedback with the environment, with both bottom-up and top-down influences of social context on gene expression, epigenetics, protein synthesis, neuromodulator levels, and circuit and network dynamics. There is no good reason to think that effective intervention to stem or adjust for cognitive decline will depend mostly on an understanding of the few biological responses we have selected and are most heavily invested in pursuing. Thinking at the whole human level may provide a better entry point.
As just one example, it is not unreasonable to think that a cognitive system in its youth, when many situations are novel, functions quite differently from a mature system that has “seen a lot” and tends to rely more on “gist”—calling on its store of knowledge and experiences for a solution-match that is good enough. Indeed, we know that with age, even as memory may decline, wisdom and other abilities continue to grow. But most cognitive psychology experiments don’t ask how that might be. Rather than accepting a story of decline, what if we think of this shifting balance among cognitive abilities as adaptive and compensatory? As our productive life spans continue to increase, and we expect to continue doing into our late years what we did in our younger years, what are the supportive environments that we might create to help cultivate, sustain, and even enhance such adaptation and compensation?
Such a perspective can also help reveal emerging challenges to adaptation. For example, in a world where social and economic relations are continually changing due to new information and communication technologies, will the strategy of replacing the active learning of a young brain with the wisdom, judgment, and “knowing enough to get by” of the mature brain be challenged even further? In a world where we are more likely to encounter novel situations and challenges across our entire life span, perhaps we need our 70-year-old brain networks to be more like 20-year-old networks in some ways and in some circumstances. If so, how do we accomplish that? How do we create supportive environments, assistive technologies, and smart homes and cars that include and empower our aging brains, rather than isolate and alienate them?
Such questions aim at “normal” cognitive evolution over a lifetime. But complex systems are also vulnerable to catastrophic or cascading failure. Are there ways to make brain systems more resilient to neurodegenerative diseases? We know almost nothing about how to make brains more resilient, but is there something to be learned from studying the normal adaptive strategies of the aging brain? Research purporting to show that advanced education “protects” against ALZ is likely nonsense, but perhaps some protection is conferred to those individuals who have more redundancy in their cognitive tool kit of solutions or more options for solving a problem because they have a wider set of life experiences.
What could we be doing—biologically, physiologically, and at the scale of the whole person, and even the community—for those suffering significant cognitive decline? How much of our ability to function effectively in everyday life is already supported by our habits, our strategies, our communities? How important are the cognitive shortcuts, heuristics, cues, and social arrangements that we develop over a lifetime? (Just think of how much longer it takes to find your way through an unfamiliar grocery store or airport to get a sense of how we all depend on such shortcuts to operate effectively in the world.) When an older person moves from a familiar environment to a new one, is it possible to tailor that unfamiliar setting in a way that allows for new learning to occur?
How might research priorities change if we start with the recognition that brains do not function on their own but rather are immersed in a rich context that includes awareness of possibilities and options? Current science, with its dominant focus on linking the molecular, genetic, and cellular to emergent functions—even with an increased focus on the brain-as-network—remains stuck. Humans are inescapably social beings who evolve individually and as a species through interaction with the world around them. Why would we imagine that a science that ignored this fundamental reality of human cognition could ask the types of questions necessary to understand and intervene in the processes of cognitive decline during aging?
Two-thirds of people over age 70 experience untreated hearing loss. Although the technology has improved and continues to advance, the reality for most is that hearing aids don’t work very well. Many aging adults also experience vision loss through conditions such as macular degeneration, as well as diminished taste, smell, and tactile sensation. Sensory losses often cause aging individuals to scale back social interactions, limiting engagement. Lives become less joyful, more isolated, and more sedentary in a reinforcing downward spiral. Changes in sleep patterns, diet, and fitness affect the person at every level of organization, and all of them influence the adaptive capacity of brain networks. This is the context in which science needs to engage the aging brain.
The research landscape could be incredibly rich once we escape from the shackles of reductionism and start to look at the complexity of the aging brain in its biosocial context, a context that demands that the brain be understood as an evolving, complex, adaptive network. Even if effective preventions and cures lie in the distant future, some technological, behavioral, and environmental interventions should be testable in the near future and could lead directly to reduced suffering for ALZ patients and their loved ones. But first we will need to learn how to ask the right questions.