Ultimate mind reader (almost)


OHSU researchers try to translate brain waves into words

TRIBUNE PHOTO: CHRISTOPHER ONSTOTT - Aimee Mooney, speech and language pathologist with OHSU's assistive technology program, applies gel to conduct brain signals past the scalp to electrodes in a cap. The cap has 16 electrodes picking up P300 signals.

Talk about pressure.

Sixteen electrodes on a cap attached to my skull are reading my mind, and I’m worried about what they are going to say.

Within the next few weeks, President Obama is expected to officially announce the next big public/private science initiative. The Human Genome Project has been judged a success by scientists around the world. New gene therapies for previously untreatable diseases appear within reach.

Next up is a push to map the human brain. Scientists agree it will be a much more expensive and complex undertaking than mapping the genome turned out to be. It’s the reason I’m sitting in the Reknew lab at Oregon Health & Science University. I’m getting a glimpse of the future, of the type of advances that might occur as research focuses big-time on the brain.

Thousands of people suffer from what is called “locked-in syndrome.” Lou Gehrig’s disease (ALS), spinal cord injuries or severe strokes have left them completely paralyzed. But their brains still work fine.

Their brains still think thoughts, choose one word over another, form sentences. Which is why the leads attached to my skull are picking up my brain waves. The computer software in the laptop off to my side is translating those waves. If I can mentally grab a letter or a word displayed on a laptop screen, the software might be able to interpret a change in a brain signal called P300 and produce the letter or word on the screen, reading my mind. In theory.
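The idea behind P300 detection can be sketched in a few lines of code. This is a toy illustration, not OHSU's actual software: during calibration, the epochs of EEG recorded after the letters the subject wanted are averaged into a template (the P300 bump appears roughly 300 milliseconds after the flash), and a new epoch is judged by how strongly it correlates with that template. All the numbers below are invented.

```python
# Toy sketch of P300-style detection (illustration only, not OHSU's software):
# average the EEG epochs that followed target flashes into a template,
# then score each new epoch by its correlation with that template.

def average_epochs(epochs):
    """Element-wise mean of equal-length EEG epochs (lists of samples)."""
    n = len(epochs)
    return [sum(samples) / n for samples in zip(*epochs)]

def correlation(a, b):
    """Pearson correlation between two equal-length signals."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

# Calibration epochs recorded after letters the subject wanted:
# a bump appears mid-epoch (the P300 peak). Values are made up.
target_epochs = [
    [0.0, 0.1, 0.9, 1.1, 0.3],
    [0.1, 0.2, 1.0, 0.9, 0.2],
]
template = average_epochs(target_epochs)

# Two new flashes: one epoch with a P300-like bump, one flat.
new_epoch = [0.0, 0.1, 1.0, 1.0, 0.2]
flat_epoch = [0.2, 0.1, 0.2, 0.1, 0.2]

print(correlation(template, new_epoch) > 0.5)   # bump: looks like a target
print(correlation(template, flat_epoch) > 0.5)  # flat: looks like noise
```

Real systems replace the correlation with a trained classifier and average over many flashes, which is part of why calibration sessions like the one described here matter.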

Hurdles remain

This is still experimental science. Yet, remarkably, most of the big hurdles have been overcome. Researchers now can select the brain waves they want and develop software to translate my thoughts. But hundreds of details still stand in the way of widespread use of mind-reading technology.

Even the way letters are displayed on the laptop screen poses a detail problem. In phase one, my job will be to grab letters as they quickly appear and disappear on the screen. Speech and language pathologist Aimee Mooney, running the experiment today, says I need to find one model of mental grabbing that works best for me. Some subjects, when they see the letter they want, imagine it exploding on the screen. Mooney says she’s verbal by nature, so when she sees the letter she mentally shouts it in her head.

I’d like all the letters of the alphabet up there so I can take my time and focus on the one I want. But many paralyzed patients can’t move their eyes around a screen. So instead, it’s one letter at a time at an incredibly rapid pace and in random order. Even the random order vexes me. If I want to grab a T, why not have the alphabet in proper sequence so I can anticipate its arrival? But OHSU researchers think the surprise response at seeing the T enhances my brain waves, so they’d rather I didn’t know what was coming.

I choose a two-pronged strategy. When the letter I want appears on the screen (for less than half a second) I imagine a hand reaching out and grabbing it. In addition, I attempt to add an emotional component, deliberately summoning a sense of excitement that I've found my letter.

The results are exciting. A 10-minute run in a darkened room yields an even baseline graph that shows I have been successful in quieting the noise in my mind so that my brain waves clearly peak when I try to grab a letter. The first time through, most subjects are unable to get up to 80 percent accuracy in grabbing the right letter. I’ve scored 88 percent. And I’m teaching the computer to recognize my brain waves.

In the second round I’m supposed to form the words “not,” “the” and “and.” I grab the letters perfectly and watch the words form on the screen.

Mind over matter fails

In the third round, I’m instructed to start by grabbing “from.” I successfully grab an F, R and O, but the next letter the computer shows is a T. I’ve stumbled.

I become aware of noise behind the lab door. I hear a cell phone ring somewhere nearby. I need to grab a delete sign as it flashes by, but I find it nearly impossible. My mistakes have created frustration.

Attempting to grab a delete symbol, I instead secure a second wrong letter. Grabbing two deletes is beyond me.

It’s clear that my growing frustration is keeping me from enthusiastically grabbing the delete signs. A missed letter has become a cascade of missed signals. My mind is no longer focused, and the brain wave sensors on top of my skull are recording all that.

Which provides just a hint at how complex the brain is, and how difficult it can be to decipher. That’s why it may take five years for OHSU’s Brain Computer Interface system to produce a product for home use, says Melanie Fried-Oken, director of OHSU’s assistive technology program.

Fried-Oken says she’s excited about the possibility of a national focus on mapping the brain, but she’d like to see that done with scientists working hand in hand with physicians and therapists who deal every day with patients. She recalls a BCI conference she attended five years ago. Nearly all the participants were engineers, she says. That made no sense to her.

“They were spending a week talking about how these machines are good for communication,” Fried-Oken recalls. “There was not one person there who understood communication.”

Fried-Oken is a rehabilitation specialist accustomed to working with patients. She knows there are a number of teams of scientists around the country taking different approaches to the direct brain communication problem. She says the OHSU team is the only one that includes a computational linguist. The idea is to incorporate an intuitive spelling program into the system so that, much like with computer search engines, if a familiar user grabs the first couple of letters in a word, the computer can recognize what the full word is probably going to be and get there faster.
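The prefix-completion idea Fried-Oken describes can be sketched simply. This is a hypothetical illustration with an invented word list and invented frequency counts, not the OHSU system: rank a vocabulary by how often each word occurs, and after a couple of grabbed letters, offer the most likely whole words.

```python
# Hypothetical sketch of search-engine-style word prediction for a BCI
# speller. The vocabulary and frequency counts below are invented.

WORD_COUNTS = {
    "from": 120, "friend": 45, "frustration": 8,
    "the": 500, "think": 60, "thought": 40,
}

def predict(prefix, counts, k=3):
    """Return up to k vocabulary words starting with prefix, most frequent first."""
    matches = [w for w in counts if w.startswith(prefix)]
    return sorted(matches, key=lambda w: -counts[w])[:k]

print(predict("fr", WORD_COUNTS))  # ['from', 'friend', 'frustration']
print(predict("th", WORD_COUNTS))  # ['the', 'think', 'thought']
```

After grabbing just F and R, a user could select "from" from the suggestion list instead of spelling it out letter by letter, which is exactly the speed-up the computational linguist is after.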

Barriers begin to fall

“We are on the cusp of making this work,” Fried-Oken says.

She is also aware of variations being used to conquer other neurological hurdles. An Australian company called Emotiv markets a product with a simpler electrode cap that, the company claims, allows users to communicate and play virtual games without moving a muscle. Imagine how much quicker gamers can play, Fried-Oken says, if they can bypass motor skills and make selections straight from their brains.

Others are using BCI technology for stroke victims whose brains are unable to get messages through to move limbs. The cap picks up the brain waves that tell the arm to move and sends them to a computer, which is separately attached to the nerves in the arm. The brain is once again able to tell the muscles to move.

Other scientists have talked about an electrode cap that can tell when a driver is falling asleep and send signals to wake him up. The eventual uses of BCI technology are countless, Fried-Oken says.

Too much information?

But they also come with new ethical considerations, she adds. The OHSU communication project is, in a very primitive way, reading the subject’s thoughts.

“If we can get it to work perfectly, you know when you have thoughts you don’t want to share with anyone else?” she says. “The machine is going to display those thoughts.” Somebody might have to devise a filtering system, she says, to identify which thoughts the user wants displayed.

Meanwhile, in the living room of a comfortable house in the Beaumont-Wilshire neighborhood of Northeast Portland, Cynthia Greene is embracing what may be her future. OHSU speech and language pathologist Betts Peters has brought the BCI equipment to Greene’s home for Greene’s ninth training session.

Eventually, BCI will have to be something people can use on their own, not just in OHSU's lab. And it will have to be simple to set up. For example, Peters says, there will need to be a simpler and more efficient cap, one that doesn't require squeezing on the gel she now applies to conduct the brain's signals past Greene's scalp and hair to the electrodes.

Fried-Oken says that finding a way to reduce the noise of unwanted brain activity at the outside of the skull may be the most significant hurdle that needs to be overcome. Some labs have chosen to open up patients’ heads and implant the electrodes beneath the skull in order to pick up cleaner signals. OHSU is betting that eventual users will prefer a less invasive model.
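One textbook way software can damp random noise in a sampled signal is to average each sample with its neighbors. The sketch below is a generic illustration of that idea, not OHSU's signal-processing pipeline; the sample values are invented.

```python
# Generic moving-average smoother: one simple way to damp random noise
# in a sampled signal. Illustration only, not OHSU's actual pipeline.

def moving_average(signal, width=3):
    """Smooth a signal by averaging each sample with its neighbors."""
    half = width // 2
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out

noisy = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]  # rapid swings, as pure noise would look
print(moving_average(noisy))  # the swings shrink after smoothing
```

Smoothing like this trades away fine detail for cleaner peaks, which is the same trade-off driving the debate between gel caps on the scalp and electrodes implanted beneath the skull.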

Greene, who suffers from ALS, can’t speak; her vocal cords don’t work. She can’t type because her hands don’t work. Wheelchair-bound, she wouldn’t be able to breathe without the ventilator beneath her seat. But she can think. Her brain works just fine. And each time Peters visits with the OHSU equipment, Greene is becoming more proficient at making her brain waves indicate the letter or word that she wants.

Greene is not completely locked in. She can still tap with her right foot, and she has a computer screen on her wheelchair that allows her to communicate by tapping to indicate letters or words she wants to spell out. But ALS is a progressive disease and Greene knows that someday she will lose her remaining muscle control.

For now, she practices when Peters brings the BCI cap and laptop out to her home. She says she prefers the letters on the screen to pop up faster because that helps her stay focused. Greene says she's also concentrating on not blinking, because the brain waves created when she moves her eyelids show up as noise on the laptop, which interferes with her grabbing the letter or word that she wants. But today she manages accuracy in the low 80s. She's improving, and all the while she's aware of how much her future depends on her overcoming her obstacles, and on the OHSU team overcoming theirs.

“I hope not to need this,” Greene taps out with her foot. “But it is hard to think about when it would be necessary. Betts better get all the bugs out.”