After decades of sluggish progress, the battle against Alzheimer’s disease is moving fast. Just last year scientists showed that spinal fluid can be used to diagnose Alzheimer’s disease with 100 percent accuracy. And a pair of studies published earlier this month showed brain scans to be reliable enough for commercial use. Now a group of researchers has shown that blood can be used to diagnose the disease. Routine blood tests could lead to earlier diagnoses and prove invaluable in efforts to treat the disease early and, eventually, find a cure.
The findings of Samantha Burnham and her colleagues from the Australian national research organization CSIRO caused quite a buzz at the latest Alzheimer’s Association International Conference. Among the 273 study participants, the blood screen correctly identified Alzheimer’s in 83 percent of the people previously diagnosed with cognitive tests or brain scans. It also correctly excluded 85 percent of the people without the disease. For further confirmation, the researchers applied their model to two studies conducted by other groups. After plugging in the data, the model proved very good at diagnosing the disease as well as at predicting the amount of plaques, the clumps of protein thought to muck up neuron function in Alzheimer’s.
The screen measures the blood levels of nine different proteins or hormones. Included among these is beta-amyloid, the aggregating protein that forms plaques and the long-standing poster boy among Alzheimer’s markers. Five of the other markers had already been implicated in Alzheimer’s, but three had not. The group is concealing the identity of these latter three, waiting to verify the individual predictive power of the new markers before going public with them. If they’re confirmed, the field will have three new markers with which to diagnose the disease, and possibly three new targets for treatment.
A blood screen for Alzheimer’s disease represents a major advance over current diagnostic methods. Although both positron emission tomography (PET) brain scans and spinal taps are accurate, they’re far from convenient. Alzheimer’s disease is typically diagnosed when someone begins showing symptoms such as dementia. But the disease is believed to begin about ten years before those symptoms appear. Diagnosing Alzheimer’s before symptom onset would require people to undergo routine PET scans and/or spinal taps, which most people, understandably, aren’t willing to do. Simple blood screens make early detection of Alzheimer’s a realistic goal. Not only would early detection benefit patients, who could start treatment sooner, but it would enable scientists to track the disease’s progress early in its development. Not much is known about the physiological changes that take place during the early stages. Learning about them could lead to new insights into the disease’s underlying causes, something scientists at the moment sorely lack.
Which brings up an important point. Right now there is no cure for Alzheimer’s disease. Our small handful of drugs only treats symptoms. They can improve cognitive function or help with behavioral symptoms such as anxiety or sleeplessness. But even so, the effectiveness of the drugs is limited: they are only expected to improve symptoms for a few months to a few years. Being diagnosed with Alzheimer’s is about the worst kind of news a person can get at the doctor’s office. The only thing Americans fear more than hearing they have Alzheimer’s is hearing they have cancer.
So, if it’s a lost cause, why tell the patient in the first place? That question is currently being debated, and for a great number of people it’s a pertinent one. Right now 5.4 million people in the US are suffering from Alzheimer’s disease. It’s the sixth leading cause of death, with just under 75,000 deaths in 2009.
Brian Carpenter, associate professor of psychology at Washington University in St. Louis, has taken a scientific approach to the question of whether or not to tell someone they have Alzheimer’s disease. He cites clinicians who don’t tell their patients they have dementia because “they think patients won’t understand or remember what they are told, that there is little to be done for patients anyway, and that telling them could spark depression or suicide.” But Carpenter, who does think patients ought to be told, rebuts this reasoning by observing that “these concerns are not an issue for most patients.” In a 2008 study his lab followed 90 patients and their families over the course of their diagnosis. They didn’t see much depression or anxiety. The typical reaction, in fact, was relief at having their sudden and frustrating symptoms explained. In practice, though, most doctors do tell their patients when they’ve developed dementia. But with the new blood screen, doctors may very soon be deciding whether or not to tell otherwise perfectly healthy people that they will most probably develop Alzheimer’s.
Blood screens such as Burnham's are poised to fundamentally change clinical diagnosis. We've covered screens that detect Down syndrome and schizophrenia. There’s still some ground to cover before the current test can be used in clinics, however. The announcement by Burnham and her colleagues is based on a single study. But I’m sure the group is wasting no time in confirming their initial findings. We can only hope that the axiom that knowing more is better than knowing less will have real-world consequences for Alzheimer’s patients, their families, and the scientists searching for a cure.