Can we test for early signs of Alzheimer’s?
In some ways the human brain is like a well-oiled car engine, purring along without being noticed, until something goes awry. Neuroimaging techniques such as electroencephalography (EEG) and fMRI – which measures brain activity by detecting associated changes in blood flow – give us different ways of peering into the working brain, but mostly this happens in the rarefied world of academic research experiments, or in a clinic once we already know something has gone wrong.
We know much less about how the brain works out in our day-to-day lives, when people are doing real-life tasks such as talking to a friend, working at a desk, or watching TV.
Why does this matter? Well, think about the car again. If you’re buying a used car, you’ll want to know how “healthy” it is. One of the first things you’ll do is listen to the engine. But you can tell much more from how it performs on a test drive – slow around town, fast out on the motorway, and back by some winding country roads. And two important things are happening in neuroscience now that mean that soon you, or your doctor, might be able to “test-drive” your brain regularly, to catch any early signs of destructive diseases like Alzheimer’s.
One development is that EEG or “brainwave” technology is now getting so compact and so cheap that it is being sold as a consumer wearable device. And brainwaves carry a lot of information about a person’s brain health. New research we carried out at Queen’s University Belfast, working with Gabriele Miceli’s lab at the University of Trento in Italy, found that cognitive decline can be detected in older people from just 30 minutes of brainwaves recorded in the lab. We took 40 healthy people ranging in age from 25 to 80, and ten people with various early forms of dementia (age-related cognitive decline) in their 60s and 70s, and found we could estimate a healthy person’s “brain age” from how quickly and strongly their brain responded to pictures we flashed on a screen.
As with the rest of your body, the brain slows down naturally with ageing. In the healthy people we saw brain responses grow slower and weaker with age, but those with dementia were outliers: their brain activity appeared older than their chronological age would suggest, in a way that could be used in the future for automatic diagnosis.
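To make the idea concrete, here is a minimal sketch of how a “brain age” estimate of this kind can work in principle: fit the relationship between an EEG response feature (such as how quickly the brain reacts to a picture) and chronological age in healthy people, then see where a new person falls relative to that curve. The numbers and feature below are entirely synthetic illustrations, not the study’s actual data or analysis pipeline.

```python
import numpy as np

# Illustrative sketch only: regress a hypothetical EEG response latency (ms)
# against chronological age in a healthy group, then invert the fit to map a
# new measurement onto a predicted "brain age".

rng = np.random.default_rng(0)

# Synthetic healthy cohort, ages 25-80: latency grows with age, plus noise.
ages = rng.uniform(25, 80, size=40)
latencies = 300 + 1.5 * ages + rng.normal(0, 15, size=40)

# Fit latency = slope * age + intercept on the healthy participants.
slope, intercept = np.polyfit(ages, latencies, deg=1)

def predicted_brain_age(latency_ms: float) -> float:
    """Map a measured response latency back onto the healthy ageing curve."""
    return (latency_ms - intercept) / slope

# A person whose responses sit well above the healthy trend for their real age
# would be flagged as having an "older-looking" brain.
measured_latency = 430.0  # hypothetical new participant, in ms
print(f"Predicted brain age: {predicted_brain_age(measured_latency):.0f} years")
```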
Bringing stories alive
With another team based at Carnegie Mellon University, I also looked at how the brain reacted while reading one of the chapters of Harry Potter and the Philosopher’s Stone. When we are immersed in reading a story, a lot is going on. Apart from our emotional engagement with a good yarn, we need to recognise individual words, retrieve their meaning from our “mental dictionary”, sew those words into sentences, and then keep track of how the characters and story develop.
Traditional brain imaging experiments would tackle each of these mental processes separately. But in this work – led by Tom Mitchell and PhD student Leila Wehbe – we were able to observe these processes interacting during a real-world task. Eight people’s brains were scanned in an fMRI research scanner to isolate the parts of the brain involved in the complex and enjoyable process of reading. We were able to track how the brain performed in real time, during a real-world activity – a kind of brain workout.
We hope that these two developments – the consumerisation of brainwave technologies, and the ability to track brain activity during complex real-world tasks – could be used in detecting cognitive problems, and for measuring the effect of medication and other therapies. Memory and vocabulary are some of the first mental capabilities to go downhill as people get older. Combining slowing or weakening patterns from EEG with the engaging task and “all-round” workout of the story reading could help work out the best way forward.
Similar methods could be used for specific language and learning problems such as dyslexia, and also to measure attentional and emotional involvement as a probe for conditions like ADHD and autism. We are exploring commercial applications of this technology and how it may apply to consumers in their day-to-day life.
This article is published in collaboration with The Conversation. Publication does not imply endorsement of views by the World Economic Forum.
Author: Brian Murphy is a Lecturer in Electronics for Electrical Engineering and Computer Science at Queen’s University Belfast.
Image: An elderly man stands in Copacabana in Rio de Janeiro September 13, 2011. REUTERS/Ricardo Moraes.