Quirks & Quarks

Protecting our neural privacy

The ability to tap into our deepest thoughts using technology is exciting but also alarming.
A woman poses with a brain-computer interface (BCI), a solution for future human-machine cooperation. (REUTERS)

We're already pretty intimate with our technology. Just look at our smartphones: they're key to our social lives and careers, and they work as our prosthetic memories. But it won't be long before our connection to technology is a lot more direct and intimate than it is now.

Companies like Facebook and Elon Musk's Neuralink want to do away with clumsy keyboards and thumb-typing and move to brain-computer interfaces. Our very thoughts would control our devices, and we could search directly from our minds.

Exciting and fascinating, but also more than a little scary. Do we really want computers routinely monitoring our thoughts? What about privacy? Who's going to have access to the information these devices are recording from our brains?

To get a picture of just how good current cutting-edge technology is at seeing inside our brains and decoding our thoughts, Bob McDonald spoke to Dr. Jack Gallant, a professor of psychology at the University of California, Berkeley.

As we move into the future, where the latest cool gadget in gaming or eventually in social media is a brain-computer interface, there is a real worry about what could be done with your raw brain data. This is something Dr. Howard Chizeck, a professor of electrical engineering at the University of Washington, and his team are concerned about. They've been studying just what kind of information you can get from these simple devices that measure brain waves emanating from our skulls.


These interviews have been edited for length and clarity.

Bob McDonald: You have done some work on decoding language and visual images from the brain. Let's start with language. How do you decode words that are happening in the brain?

Dr. Jack Gallant: In principle you can decode any kind of thought that is occurring in the brain at any given point in time. So if we have a suitable brain recording device, we can measure those patterns of activity and infer the relationship between what was going on in your brain and your behaviour or your thought.

And you can think about this sort of like writing a dictionary. If you were, say, an anthropologist, and you went to a new island where people spoke a language that you had never heard before, you might slowly create a dictionary by pointing at a tree and saying the word "tree," and then the other person would say what that tree was called in their language. And over time you could build up a sort of dictionary to translate between your language and this other foreign language.

A demonstration of a brainwave-reading headset as a controller for the video game "MindSet", produced by an American venture "NeuroSky." (AFP/Getty Images)

And we essentially play the same game in neuroscience. If we know what someone is thinking about, for example a dog, we can measure their brain activity and look at the patterns of brain activity that are associated with thinking about dogs. And then we can ask them to think about a cat, and we can measure the brain activity associated with cats.

And by doing that over a very long period of time, we can build up essentially a brain dictionary that translates between the patterns of brain activity that we can measure and other kinds of thought or behaviour. We don't do this one word at a time. That would be very inefficient. We have much more efficient ways to do it, but essentially, given enough time and money and a good enough brain measurement device, we can build an arbitrarily complicated and rich dictionary.
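(The "brain dictionary" Gallant describes amounts to a supervised mapping between measured activity patterns and labels. As a rough, hypothetical illustration rather than the Gallant lab's actual fMRI pipeline, here is a minimal Python sketch that learns to tell "dog" patterns from "cat" patterns on synthetic data; the channel counts, trial counts, and the use of scikit-learn are all assumptions made for this example.)

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "brain activity": 100 recordings of 50 measurement channels per concept.
# Real data would be fMRI voxels or EEG channels; here each concept just gets a
# small concept-specific offset so the example is self-contained.
n_trials, n_channels = 100, 50
dog_patterns = rng.normal(0.5, 1.0, (n_trials, n_channels))
cat_patterns = rng.normal(-0.5, 1.0, (n_trials, n_channels))

X = np.vstack([dog_patterns, cat_patterns])       # measured activity patterns
y = ["dog"] * n_trials + ["cat"] * n_trials       # what the person was thinking about

# The "dictionary" is a fitted model that translates a new pattern back to a label.
decoder = LogisticRegression(max_iter=1000).fit(X, y)

new_recording = rng.normal(0.5, 1.0, (1, n_channels))   # an unseen "dog-like" pattern
print(decoder.predict(new_recording))                    # expected: ['dog']
```

(In real experiments the inputs are many thousands of measurements and the labels are thousands of semantic categories, but the translation principle is the same.)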

BM: How rich is your dictionary at the moment? How much detail can you get?

JG: At the moment, we typically generate sort of feature spaces or dictionaries of a couple of thousand different categories. For example, we might have you listen to stories through a set of headphones while we're recording brain activity, and we can recover the representation of a couple of thousand different objects and actions in the stories. Or we can have you watch movies, and we can recover the representation of a couple of thousand different objects that occur in the movies and the interactions between the objects.

BM: But you're saying that if I'm watching a movie, you would be able to tell from my brain activity what type of movie I was watching, whether it was, say, a space movie or a romantic movie, or who the characters in it were? You can see that kind of detail?

JG: When you say you can decode this or you can tell what somebody is thinking, the question is: how well? You can crudely tell what they're watching. You can have somebody watch a movie in the MRI machine and tell, for example, how bright the screen is, and vaguely what the colours are, and maybe recreate sort of a pale shadow of what they're seeing. You might be able to classify whether the movie is making them happy or sad, or whether the movie has a lot of broad outdoor scenes like a western, or whether it's set in a small enclosed space, like a gangster movie where they're having a conference inside a room. But you wouldn't be able to tell most of the details. That's simply because our methods for measuring the brain are not that accurate at this point in time.

BM: So what are your biggest concerns when it comes to things like our neural privacy?

JG: As our ability to measure the brain proceeds, our ability to decode the brain will also advance, and eventually you will be able to decode anything that's in sort of an active working brain space. And when that happens, it opens up all kinds of both conscious and unconscious information that people normally keep hidden from others and that will be accessible to the brain measurement device. And the question is: who is going to control those devices, how is the information going to be managed, and how are you going to know what has happened to that information after it leaves your head?

Dr. Howard Chizeck, a professor of electrical engineering at the University of Washington, and his team have been studying what kind of information you can get from currently available consumer gaming headsets that use simple brain-wave monitoring.

Bob McDonald: When it comes to these consumer brain-computer interface devices, how vulnerable are we, and how urgent is this issue?

Dr. Howard Chizeck: I think there is a very high level of vulnerability. The reason is that you don't have to actually pick up entire thoughts. There are signals that our brains make when we see something that we recognize or we see something that's unexpected. And these are waveforms with very specific shapes that can be detected. So if the electrodes or the recording devices are accurate enough and the processing is available, you can detect these things. And this leads to an opportunity to interact with a person to extract information.

So for example, if I showed you certain people in a video game, of a certain racial type or sex, whatever, and you reacted differently, I could gather information about you. And if I play 20 questions and keep narrowing down the information, I could learn brand preferences or political opinions or things that you might be enticed to buy. The problem really is that there's two-way communication between these online video games and online tools. And so even without extracting entire thoughts, you can recognize interest.
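(To make the idea concrete, here is a rough, hypothetical sketch of detecting such a recognition response from EEG epochs time-locked to a stimulus. The data are synthetic, and the sampling rate, time window, and threshold are assumptions; this is not the Chizeck group's actual code.)

```python
import numpy as np

FS = 250                                              # sampling rate in Hz (assumed)
WINDOW = slice(int(0.25 * FS), int(0.50 * FS))        # 250-500 ms after the stimulus

def saw_something_familiar(epochs: np.ndarray, threshold: float = 0.5) -> bool:
    """Average EEG epochs (trials x samples) time-locked to one stimulus and test
    whether the post-stimulus window shows the bump that tends to accompany
    recognizing something familiar or unexpected."""
    evoked = epochs.mean(axis=0)                      # averaging suppresses unrelated activity
    return float(evoked[WINDOW].mean()) > threshold

# Synthetic demo: 40 presentations of a familiar image, with a small injected response.
rng = np.random.default_rng(1)
epochs = rng.normal(0.0, 1.0, (40, FS))               # 40 trials x 1 second of samples
epochs[:, WINDOW] += 0.8                              # simulated recognition response
print(saw_something_familiar(epochs))                 # True for the familiar image
```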

A man, wearing an EEG brain scanning apparatus on his head, plays a pinball game solely through willing the paddles to react with his brain. (Getty Images)

BM: Well, how did you test all of this in your laboratory?

HC: So we have an EEG system and we put it on an unsuspecting person. And we have a video game which we call Flappy Whale. Basically it's a video game where, either by hand control or muscle control or some other controller, you make the whale move along a pathway and avoid barriers. The game gets harder as you go on, so this focuses the attention of the individual.

Subliminal images can be put in the game. By monitoring the person's responses with EEG, we can gather information.- Howard Chizeck

HC: I have a graduate student who's working on different ways to extract things like PINs by showing patterns of numbers, seeing what the recognition is, and extracting that information. And my former Ph.D. student Tamara Bonaci did a set of trials looking at brand preferences: Starbucks versus another coffee brand, or different burgers.
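(A toy version of the PIN idea, again with synthetic data and assumed parameters rather than the student's actual method: flash each digit, score the recognition response it evokes, and keep the digit that stands out.)

```python
import numpy as np

FS = 250                                              # sampling rate in Hz (assumed)
WINDOW = slice(int(0.25 * FS), int(0.50 * FS))        # post-stimulus window, 250-500 ms

def recognition_score(epochs: np.ndarray) -> float:
    """Mean amplitude of the averaged response in the post-stimulus window."""
    return float(epochs.mean(axis=0)[WINDOW].mean())

# Synthetic data: the person "recognizes" the digit 7, so its epochs carry a small bump.
rng = np.random.default_rng(2)
epochs_per_digit = {d: rng.normal(0.0, 1.0, (30, FS)) for d in range(10)}
epochs_per_digit[7][:, WINDOW] += 0.8                 # simulated response to the familiar digit

# Rank the candidates by how strongly the brain reacted to each one.
best_guess = max(epochs_per_digit, key=lambda d: recognition_score(epochs_per_digit[d]))
print(best_guess)                                     # expected: 7
```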

Let me go back and say that an important thing is that we would like to get out ahead of these problems before they happen. With e-mail spam, we didn't get ahead of it before it happened. When we started this work about four years ago, our hope was to try to capture the issue, bring it to awareness, and begin to look for solutions before these things happen.