Interview 4 – Nicho Hatsopoulos, neuroscientist


In this series of interviews, I ask scientists, engineers, and ethicists how technology might change our future. We had these conversations during the research for my book, Welcome to the Future (Quarto, 2021).


Dr. Nicholas “Nicho” Hatsopoulos is a neuroscientist at the University of Chicago. He studies how the brain controls body movements and receives sensations from the body. He works on brain-computer interfaces (BCIs), systems implanted into or onto the brain that allow someone to control a computer or robot with their mind. I spoke with him in October 2019 and again in January 2021.

Do you think people might control all their devices with their minds in the future?

I think it could happen. What’s holding it back is the lack of a non-invasive or minimally invasive way to implant a device. If you’re talking about healthy individuals implanted with these things, the devices we’re using now wouldn’t be approved. No doctor would do this [to a healthy person].

How might researchers create a BCI that is safe enough?

To my knowledge, there is no technology that is completely non-invasive with the necessary spatial and temporal resolution. Somehow, you have to get into the brain.

I know Elon Musk’s company Neuralink is trying to develop an implant that could be minimally invasive, inserted with a sewing-machine-like apparatus. I think his mission [to implant healthy individuals] is somewhat far-fetched, at least in the short term.

There have been some other thoughts about putting implants into the blood vessels. In that case you wouldn’t have to actually open up the skull. You inject it into a big blood vessel, and then it travels up to the brain and implants itself. In fact, there is some current work being done using this approach.

What about MRI machines (which measure blood flow in the brain) or EEG caps (which measure electrical activity from outside the skull)?

MRI, or EEG electrodes that sit on the scalp, can’t give you the kind of control you’d need to interact with your cell phone, for example. EEG signals reflect the combined activity of thousands and thousands of neurons working in concert, creating global patterns of activity that are arbitrarily associated with certain behaviors, like moving a cursor to the left or right.

This is pretty different from what we’re doing when we implant a set of micro-electrodes in the motor cortex [the movement center of the brain]. We’re recording from single neurons. If you think about moving your arm to the left, certain neurons fire in a certain way. We’re using those signals.

So you’re actually reading the mind? Or at least, a small part of it?

We’re reading the natural coding scheme the brain uses to control the arm, and instead of controlling the arm, we’re controlling a cursor or a robot.
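To give a rough sense of what “using those signals” can mean, here is a toy sketch of one classic idea from the BCI literature: neurons in motor cortex are broadly tuned to a preferred movement direction, and a linear decoder can invert their firing rates to recover an intended cursor velocity. Everything below (the neuron count, tuning model, baseline, and gain) is invented for illustration and is not Dr. Hatsopoulos’s actual pipeline.

```python
import numpy as np

# Toy sketch of linear decoding for a BCI cursor. Four hypothetical
# motor-cortex neurons, each with a "preferred direction" in 2-D:
# a neuron fires most when the intended movement points its way.
preferred = np.array([[1, 0], [0, 1], [-1, 0], [0, -1]], dtype=float)

BASELINE = 10.0  # assumed baseline firing rate (Hz)
GAIN = 5.0       # assumed Hz of modulation per unit of velocity

def firing_rates(velocity):
    """Toy cosine-tuning model: rate rises with movement along each
    neuron's preferred direction."""
    return BASELINE + GAIN * preferred @ velocity

def decode(rates):
    """Invert the tuning model with least squares to recover the
    2-D velocity the firing rates encode."""
    velocity, *_ = np.linalg.lstsq(
        GAIN * preferred, rates - BASELINE, rcond=None
    )
    return velocity

true_velocity = np.array([1.0, -0.5])   # intended cursor movement
rates = firing_rates(true_velocity)     # what the electrodes would record
print(decode(rates))                    # recovers approximately [1.0, -0.5]
```

In a real system the mapping from rates to velocity is fit from data recorded while the subject moves (or imagines moving), and the decoder runs continuously on streaming spike counts rather than a single snapshot.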

In early experiments, back around 2000, monkeys were first trained to play a video game. That took months, because of course monkeys don’t know how to play video games. Once we got them trained to play, we switched them from arm control to brain control. And they had no problem doing it – they just continued the task, thinking they were moving the cursor with their arms, but actually moving it with their brains. Then eventually – and this is a bit of a mystery – they discover they don’t have to move their arms, and they stop moving their arms. But they can still activate those areas of the brain to move the cursor. Presumably those cells are controlling an arm. How is it you can still activate those cells but not move your arm?

But they can still activate those cells to move their arm when they’re not playing the game?

Yep. No problem.

Do you think it’s possible to get to that level of control with a system that’s outside the brain?

I can’t think of a way. It would have to be a new kind of science. I’m not saying it’s impossible but it would have to use a completely new technology. How do you get this kind of fine resolution of single cells or small groups of cells at a distance like that? I don’t know how to do it.

So you can use the brain to control a computer or robot. What about the reverse? What progress have you made in sending sensations to the brain?

That’s a big challenge: incorporating sensory feedback, meaning both touch and kinesthesia [the sense of the body’s posture and motion in space]. These are key to normal motor behavior. We know this because patients who have neuropathies that affect their sense of touch and kinesthesia are severely motor-disabled. Almost all brain-machine interfaces now [in 2019] are like these patients – there’s no somatosensory feedback. There’s visual feedback, but vision doesn’t help when you’re interacting with objects, because you can’t feel what you’re touching. You don’t know how much force you’re putting on an object; you can’t see that. The sense of touch provides that feedback. Kinesthesia (also referred to as proprioception) lets you sense the motion of your joints and the forces applied by your muscles. You don’t want to always have to see your arm when you’re doing tasks.

In the future, will people feel as if their brain-controlled devices are parts of themselves?

Yes, that’s the concept of embodiment. It’s the idea that these devices will become part of you. I almost have the feeling even today with my cell phone. It’s almost like I can’t be without it. Whether or not they’re directly connected to the brain, these devices will become a part of you.

There was an experience we had with one of our first human subjects. They were controlling a cursor [using a device implanted in their brain]. We told the subject, “Think about moving your hand to move a mouse so as to move the cursor. That’s how you’re going to control the cursor.” And they moved it. But then the subject said, “I’m no longer thinking about my hand. I’m thinking about the cursor. I’m just moving the cursor directly. I don’t think about my hand at all.” The cursor had become a part of them. They were directly controlling it. The subject had in some sense embodied the cursor.

If you can control a device with your brain, could someone use the device to control you?

As soon as you incorporate feedback, yes, in principle there would be a way to interact with the brain. That’s kind of frightening! That’s definitely going to be an ethical issue we’ll have to consider. But that’s far out.

How many years do you think it might be before everyone’s walking around with some sort of brain implant?

I’ve been asked this question repeatedly going back to the year 2000, and I’ve always said five years from now. And then it never happens. So I have no idea.

We reconnected in 2021, and I asked Nicho for an update on his team’s work. Here’s what he said.

Despite the pandemic, we managed to find a wonderful volunteer and implant him in November 2020.  Things are going really well. We are still at the early stages. He has two arrays implanted in his motor cortex. We ask him to think about moving his arm while he’s watching an avatar of an arm and hand moving in virtual reality. He has to reach out [in this virtual space] and grab an object and then release it. The next step is to focus on how we can decode forces when subjects grasp and manipulate objects. Right now, we’re kind of cheating. When his virtual hands get close to the object, we call that a grasp. But really what we want is for him to modulate the force he’s applying. That means if it’s a heavy or slippery object, he should grasp with higher force, but if it’s a delicate object, he should be more careful. That’s what we’re gearing up to do. Part of that effort is providing him with feedback. So he also has two electrode arrays in his sensory cortex that can deliver sensory stimulation to give him a sense of touch, so he can receive feedback that vision doesn’t provide.

Can he feel sensation through these arrays?

Yes, he can. We’re focused on the hands. He’s feeling different digits on his hand. We can get every digit except for the pinky.

If we ever get to a future where healthy people choose to have these kinds of implants, what ethical issues might we face?

There’s a question of equity. These are going to be expensive. Some people will try to enhance their performance in some way and will have an advantage over others who don’t have the means to do that. Obviously there are also health-related medical issues. The implant itself causes some damage. If someone is paralyzed, the implant is causing damage in an area that isn’t normally working. But if you start going to other brain areas involved in cognition and thinking, would getting an implant affect someone’s thinking? That’s an important ethical consideration.

Any last words?

We kind of already have brain-machine interfaces. Our cell phone is a brain-machine interface. We just happen to be using our fingers to interface with it and using our eyes to receive information.

The volunteer Dr. Hatsopoulos mentioned is named Scott Imbrie. I spoke with him as well and will post his interview later in this series.