
'Battle for your brain': What the rise of brain-computer interface technology means for you

Aamir Ahmed Khan, PhD, Principal Electrical Engineer for Paradromics, works on the Transceiver which connects to the brain implants. The device is wirelessly powered and does not have a battery to charge. 
(Photo by Julia Robinson for The Washington Post via Getty Images)


This rebroadcast originally aired on March 17, 2023.

Brain-computer interfaces used to be the stuff of science fiction.

Now, headphones and earbuds with sensors that can read your brain waves – and sell your data – are hitting the market.

"Nobody should walk into this blindly thinking that this is just another fun tool," Nita Farahany says.

"This is the most sensitive organ we have. Opening that up to the rest of the world profoundly changes what it means to be human and how we relate to one another."

But that brainwave information can also be used by corporations and governments.

"China has very clearly said that they believe that the sixth domain of warfare is the human brain," Farahany adds.

"They are investing tremendous dollars into developing brain computer interface, but also figuring out ways to disable brains or to spy on brains."

Today, On Point: Big business, big government and your brain.

Guest

Nita Farahany, professor of law and philosophy at Duke University. Her new book is titled The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology. (@NitaFarahany)

Margaret Kosal, associate professor of international affairs at the Georgia Institute of Technology, currently on leave to the Savannah River National Laboratory. (@mekosal)

Also Featured

Tan Le, CEO of EMOTIV, which manufactures wearable neural sensing devices. (@TanTTLe)

Transcript

Part I

MEGHNA CHAKRABARTI: Wearable tech, your Fitbit, smartwatch and the like. They can already do things like measure your heart rate or how well you're sleeping just based on how you're moving or signals through your skin. So, what do you think the next frontier might be in wearable tech? The next new thing devices can monitor and measure. Just think about it. Really think.

TAN LE: I use my earbuds every day because I want to know how my brain changes based on all of the things that I do, because my brain is changing all the time. It's the most sophisticated learning apparatus that we have.

So I use my earbuds as a way to understand what's happening to my brain as I play with my daughter, hang out with my cat, listen to music, work. And it's really interesting. I learn a lot about myself. I learn a lot about what makes me happy and perform better. And when I'm really stressed, what impact that has on me.

CHAKRABARTI: This is Tan Le, co-founder and CEO of EMOTIV, one of a new crop of companies that sees great potential in BCI, or brain-computer interface, technology.

Le believes the possibilities for such tech are endless: helping elderly people experiencing cognitive decline, empowering people with disabilities to perform actions simply by thinking, even helping you understand yourself better, how to be happier or more efficient.

Le says brain-computer interface tech will one day be able to do all of these things, through major advances in miniaturized electroencephalography (EEG) technology, which can read signals from the human brain and send them to amplifiers, which in her company's case are in those earbuds.

LE: It's giving you feedback on your computer. So if I click on the icon to see what's going on in my brain at the moment, I can see what's happening in my brain. And then I can also see a report over the course of the day, when during the day my brain was in an optimal state. And then I can correlate that with what I was doing at that time.

So when I look back on my afternoon on Sunday, I knew exactly what I was doing. So I knew why that was different to the barrage of back-to-back meetings I had on Friday afternoon, which caused my brain to be in a much more intense state. And so that allows me to change my day a little bit, carve out more time for focused work so that I can actually work more optimally.

CHAKRABARTI: Well, Tan Le isn't the only one who thinks this is utterly fascinating. Her three-year-old daughter sees her at her desk, wearing her earbuds and checking in on her state of mind.

LE: She said, Mommy, I want to see. And I said, This is mommy's brain. And she said, I want to see my brain. And I said, You're too little. So it doesn't fit her. But she's so intrigued by it.

CHAKRABARTI: Currently, EMOTIV earbuds are available only on their website. Le says she hopes that one day they'll be available in stores for widespread use in the consumer market. But for now, her main clients are not consumers, they're employers.

LE: One of our clients is JLL. JLL is a large real estate organization, and JLL came to us saying that, you know, the future of work is changing rapidly. How can we design our workplaces better so that we can make sure that when people are at work, they're getting what they want from the work environment?

So in that case, we will invite volunteers within the organization to sign up for a research study where they will wear a device for a certain period of time. And what we do is we capture brain data from those experiences in order to try and map out the relationship between an environment that's conducive to teamwork and collaboration and one that doesn't actually achieve those desired outcomes.

CHAKRABARTI: By the way, JLL is also known as Jones Lang LaSalle, Inc., one of the largest real estate companies in the world, ranked 185th on the Fortune 500, with $20 billion in revenue last year and 100,000 employees worldwide, some of whom have been asked to participate in the kind of research study Le mentioned. So what happens to the data those employees' brains are pumping out into EMOTIV earbuds?

LE: What's really important about EMOTIV is that fundamentally we do not believe in how companies have transacted with data in the past. We are a company that was born about ten years ago. And so we've seen a lot of the changes in the public's view of how data is mined for corporate advantage without the informed consent of the users and participants.

And so we conduct ourselves in a very thoughtful and ethical manner in regards to data. The users need to have control of when they collect data, how data is shared, and in fact, we don't sell or share your data with anyone without explicit consent.

CHAKRABARTI: Well, this is On Point. I'm Meghna Chakrabarti, and that was Tan Le, co-founder and CEO of the neurotechnology firm EMOTIV, one of a new group of companies that's rapidly advancing the possibilities of brain-computer interface technology. Well, my guest today says the positive possibilities of such tech are exciting and essential. But it's naive to think that the power to read brainwaves will be used exclusively for good, because the potential for exploitation is just too great, both by corporations and governments. So she says now, as brain-computer interface technology is starting to enter our lives and our minds, now is the time to establish new rules, to defend the right to think freely and to keep our minds our own private property.

That comes from Nita Farahany, professor of law and philosophy at Duke University. Her new book is The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology.

Professor Farahany, Welcome back to On Point.

NITA FARAHANY: Thank you. It's great to be here.

CHAKRABARTI: So I would like you to take us back to the first moment you realized that this revolution in tech was coming. You write about a 2018 summit at the Wharton School in Pennsylvania. What happened there?

FARAHANY: So I had been studying neurotechnology and even consumer neurotechnology for quite a few years, but at that summit, early on in the summit, Josh Duyan stood up.

He was one of the people at a startup called CTRL-labs. And he was showcasing this new device where they were taking electrodes and putting them into what looks like an everyday watch. And he held up his hands and he said, "Wouldn't it be great if, instead of having the kind of clumsy output that we have, that is these hands, these sledgehammer-like devices, we could interact much more seamlessly with all of the rest of our technology with a device like the one on my wrist?"

Or if we wanted to type, we could type by thinking about typing, rather than by having to pound away on a keyboard. And how we've gone backwards in time typing on phones with our two thumbs. What he was showcasing was something altogether different than anything I had seen before. Because while I had played with and seen these devices in the past, they hadn't really solved the form factor.

They were still electrodes that you would have to wear across your forehead, and a headband that was both silly-looking and uncomfortable. But the applications were also much more limited, to things like meditation or personal gaming devices that you might play. This idea that you could make brain-wearable devices integrated into our everyday devices, to power our interaction with all of the rest of our technologies, that was the moment when I realized all of the things that I'd been thinking about and worrying about for quite a long time.

Suddenly, they were going to come true. And I was convinced, given the form factor, that it would just make sense for Apple to acquire CTRL-labs. But I was floored when, a year later, it was Meta who acquired them instead.

CHAKRABARTI: (LAUGHS)

FARAHANY: I thought that was the pivotal acquisition and then I was like, okay, it is time to get writing this book.

CHAKRABARTI: A.k.a. Facebook. A.k.a. Mark Zuckerberg. Okay.

FARAHANY: Exactly. Exactly. I was like, if Mark Zuckerberg is investing in this technology the things I was worried about, they're going to come true. And it also just made it so clear to me that this is a mainstream movement. This is the next big thing.

It's not a niche application for people who are interested at home and trying to quantify and see their own brains. This was going to become the way in which we interact with the rest of our technology by using our brains and our thoughts as the way we interact with everything around us.

That was a revolutionary moment, and that acquisition was both terrifying, but also a call to action, to get writing and to get this message out.

CHAKRABARTI: Okay, but your view on brain-computer interface technology is quite nuanced. You don't see it as a universal bad. So we're going to talk about its potential, the complex potential, in a minute here. But doesn't it make sense, though, that this would be, as you call it, the last fortress that technology hasn't yet fully overwhelmed?

Because the brain is very much, in a sense, how we define ourselves as human beings: What happens in the brain is what generates all our thoughts, feelings and actions. So it would seem very logical that technology would want to understand, harness and maximize what it can do with that.

FARAHANY: Absolutely. So first of all, you're right. My view is nuanced, because I believe that this technology is the next step for humans in ways that can be deeply empowering. And I also think the fact that our brains have remained this black box, mysterious even to us, something we can only access through self-reflection in ways that aren't objective, that's not good for addressing any of the major causes of human suffering, such as neurological disease and disorder and mental illness, or even just understanding ourselves.

So of course it makes sense that this is where the next step would be happening: in self-quantification, but also in the thing that we as humans would be pushing for, which is access to and understanding of our own brains.

It also just makes sense that we have all of these clumsy interfaces between us and other technology, and the ability to much more seamlessly interact with other technology would be deeply appealing to companies. But I'm also a skeptic on motivations. Both my own cultural heritage, I think, and the work that I do as an ethicist and a law professor have always made me look at, okay, but what is the complex set of motivations that brings these different organizations to the table?

CHAKRABARTI: Yeah, that mix is what makes you our favorite kind of guest, Professor Farahany. So we'll talk a lot more about the positives, the negatives, and, really most importantly, what kind of questions you say we should be asking ourselves now as a society, as this technology comes at us at full pace.

Part II

CHAKRABARTI: Professor Farahany, you've actually worn some of these devices that currently exist yourself.

Can you tell us a little bit about what it is that you wore, how it worked and what it felt like to have it?

FARAHANY: Sure. I have been toying around in some ways with many of these devices, but also using them personally for some applications. So the earliest of these devices were hard, plastic devices that would go across your forehead and some of them tuck behind your ears or fit tightly across your scalp.

And the idea was to make contact with dry electrodes to your forehead or to different parts of your scalp that could allow the electrical activity in your brain to be picked up. And then interact with, through Bluetooth, some kind of application on your phone. And the more recent devices, as you described, and as the conversation with Tan Le made clear, that I also have had access to, are electrodes that are embedded inside of your ear buds.

And these feel just like the normal ear buds you would use to make a phone call or listen to music or do a Zoom call or headphones, where the soft cups that go around your ears have electrodes inside of them as well. So you can't detect them. They're just like the rest of the technology that you would wear or one of these watches that has a sensor inside.

I've used them primarily, both to test them out, but also for meditation. So I'm not great at self-meditation. Being able to both keep my focus and ability to stay in that state, but also am I doing it right? And what's neat about these devices is the interaction with an application lets you get real time, what's called neurofeedback.

So if I get my brain into a state that brings down my stress level and shows that I'm in this kind of meditative state (you have signatures in your brain that can be detected and decoded that suggest that), then you get something like chirping birds or some other kind of audible feedback.

And that's been really helpful for me. I'm a chronic migraineur, and high stress levels can really trigger a migraine for me. Using these as a preventive tool, even if I just spend a few minutes bringing my stress levels down and remaining in a meditative state, has been really helpful in limiting the frequency and the severity of my migraines.

CHAKRABARTI: It occurs to me that there are, then, there's so many potential applications, positive applications for this technology. I've suffered from depression for most of my life, and I think it would be amazing to have a device that would give me some sort of feedback to say, "Your brain patterns right now are indicating, I don't know, some sort of negative feedback loop that's going to deepen your depression." Or something, anything like that.

FARAHANY: No, you're right. You're right. So first I'm so sorry that it's [something] you've grappled with.

CHAKRABARTI: It's okay.

FARAHANY: But you're one of many millions of people who are grappling with different effects of the brain, whether it's migraines or depression, or people who suffer from epilepsy, for example, and need an early warning of an epileptic seizure. These devices can be quite powerful. In fact, I talk about some of those in the book, from using both neurofeedback and neurostimulation, which has been transformative for some people with depression or people who have ADHD.

For example, there are a lot of studies that show that using neurofeedback, especially in children, over a number of weeks can actually be more powerful than drugs, or than drugs alone, and certainly has far fewer side effects. Or somebody who has epileptic seizures: A very close family friend of ours died of an epileptic seizure without early warning. She was alone at the time. She vomited from the seizure and then died, they believe, from choking on her own vomit. If she had had a one-hour advance warning of that seizure, she could have gotten herself to safety.

She could have made sure that she took just-in-time medication. There's so much good that could come from being able to track our own brains and improve them, enhance them, use neurofeedback. Our own daughter, our eight-year-old, while she doesn't use one of these neurological devices, uses biofeedback through a heart rate monitor, which has been gamified.

She can play games which get harder when her stress levels and heart rate increase. And then the way that she wins the game is through self-control, by emotionally regulating. And learning those skills at a young age, I think, is powerful and important. So I'm definitely not a Luddite when it comes to this technology.

I think it's both coming, and it also has a lot of promise for humanity. If implemented with the right safeguards, if used in ways that benefit individuals, I think it can be incredibly transformative.

CHAKRABARTI: That poor word, 'if,' it carries so much weight on its shoulders.

FARAHANY: It does. It does. And unfortunately, I have to say that because I am somebody who is deeply optimistic and I want the good of this technology for humanity. But I see the risks and I see the risks, especially in this domain, because there is really nothing more sensitive and fundamental than what it means to be human, than having that space of inner monologue, of private thought, of being able to entertain and turn over ideas in your own mind without fear of it being misused by other people, accessed by other people, commodified by companies, interfered with by governments and the potential of connecting our brains to technology makes all of those risks a possibility.

CHAKRABARTI: Just as an aside: There's the technology in and of itself, the hardware. Then there's the means by which we interpret it, right? The kind of feedback the machines generate. But how much confidence do you have at this moment in the interim phase, the analysis of what those EEG signals from the brain actually mean? Do we actually know and understand how to read the signals?

FARAHANY: Yeah, it's a great question. A lot of people ask me, how good are the devices? And my answer to that is, it depends on what you're using them for.

Can it decode your literal thoughts, the true inner monologue that you're having? No. The technology itself, the electrodes, the sensors, the hardware, has improved vastly over the past decade, but there's still some noise and interference. And different people may have them applied in different ways that aren't quite the right fit to pick up exactly the right signal.

And there can be interference from muscle twitches or eye blinks or other devices in your environment, because it's electrical activity that it's picking up. And then there's the software, the AI. I think everybody knows that AI has been having just exponential growth in its capabilities.

And what we're picking up here from the brain through these devices, what they're detecting, really is patterns of data. And those patterns of data increasingly can be interpreted and decoded with the sophistication of the algorithms at play. So I think, depending on what we're talking about, it can be very accurate and very good for basic brain states like attention and boredom and cognitive decline and stress.

And are you happy? Are you sad? It can be very accurate for probing the brain for information through particular signals of recognition in the brain. But unless it's implanted neurotechnology, there's not very good real-time decoding of speech, for example, even though that is coming in many ways.

And in some ways, which we can talk about, even your intention to type or to communicate or to send a text message can be decoded with this technology. So it just depends.

CHAKRABARTI: Intention. Okay.

FARAHANY: Intention, right? I say that because there's you thinking in your mind, having a kind of moment of self-reflection, and then there's you intending to type something, which is speech that you mean to go from your mind out into the rest of the world, and that has a different representation in the brain. It's easier to decode speech you intend to communicate than that inner monologue.

CHAKRABARTI: Yeah. So this is where we get into Minority Report territory, but we're going to hold that thought, if I can use that pun here, for a moment.

Because now what I'd like to do is push into the possible futures that you think through in the book, The Battle for Your Brain, because we'll get to governments in a few minutes, but I think the most immediate place of change we might see was hinted at by Tan Le at the beginning of our show. Because workplaces would be very interested or are very interested in whatever means to make work better, workers more efficient, what have you.

So if you don't mind, I want to read a little bit of a scenario that you imagine at the beginning of the book, and then you can talk us through the rest of it. This is the scenario Nita Farahany says we might be closer to than we think. It goes like this.

"You're in the zone. You can't even believe how productive you've been.

Your memo is finished and sent. Your inbox is under control, and you're feeling sharper than you have in a decade. Sensing your joy, your playlist shifts to your favorite song, sending chills up your spine as the music begins to play. You glance at the program running in the background on your computer screen and notice a now-familiar sight that appears whenever you're overloaded with pleasure:

Your theta brainwave activity decreasing in the right central and right temporal regions of your brain. You mentally move the cursor to the left and scroll through your brain data over the past few hours. You can see your stress levels rising as the deadline to finish your memo approached, causing your beta brainwave activity to peak right before an alert popped up telling you to take a brain break.

But what's that unusual change in your brain activity when you're asleep? It started earlier in the month. You compose a text to your doctor in your mind and send it with a mental swipe of your cursor. 'Could you take a quick look at my brain data? Anything to worry about?'"

So what happens next in your imagined scenario here?

FARAHANY: So from there, there are a number of different pieces, from the employer looking at the brain data and sending a message to the employees saying, "Congratulations on your brain metrics over the past quarter. You've earned another performance bonus." You're excited about that.

You still have your earbuds in, not thinking about all of the data that you're giving to your employer as you go home jamming to the music and having forgotten that brainwave data is being collected at the same time. And then you come to the office the next day and a somber cloud has fallen over the office.

And you discover that the government has subpoenaed all of the brainwave data, along with all of the other information about employees, because they're looking for co-conspirators in a crime. And it's funny, that scenario: My brilliant editor at St. Martin's Press invited me to write a scenario that could really put it all together in one easy-to-understand narrative. What's the full spectrum of this, from the promise, which is your ability to do things like hone your own focus and attention, track your own brain activity, bring down your own stress levels and have real-time feedback about when you're suffering from cognitive overload.

To the risks and the ways in which employers are already using this technology. Where it becomes dystopian, I believe, is in having your brain be part of the performance metrics. There's so much happening in the workplace right now on productivity scoring and, I think, the over-surveillance of employees in ways that really are not helping morale or the dignity of work. And these brain metrics are already being used by companies, and increasingly will be.

And then there's the frightening possibility, which we've already seen with other kinds of personal health data, whether it's Fitbit data or heart rate data, which has been subpoenaed by law enforcement and used in criminal cases. And the idea that once you open up your brain passively, thinking that you're using it to track your own attention, all of it is fair game.

And it can give a lot of insights. The example that I use in that scenario is that they're looking for synchronization of brain activity between different workers. It turns out that when you're working with people, you have higher degrees of synchronization in your brainwave patterns, and you could actually use that to figure out who's collaborating, who's developing a union, who's working together, where you wouldn't expect to see those patterns of synchronization.

And so as I start to imagine all of that, and all of those scenarios are possible with existing research and existing technology, I think it makes clear what the dystopian possibilities are of surveillance of the brain.

CHAKRABARTI: You even talk about, in this imagined scenario, that maybe you might find a coworker attractive, and that would be recorded in your brain activity.

FARAHANY: Yes. Yes. You can pick up those. You can tell amorous feelings, these inward and deeply held feelings that are not things you would want to reveal. I'll tell you a funny story, Meghna, which is that my eight-year-old has her first crush. She'll be embarrassed that I'm sharing this with you and with the world, but her friend apparently has a crush on her as well.

It's the most mortifying thing possible to them that they both have a crush, right? It's the thing that everybody is teasing them about, even though we think it's darling and wonderful. But imagine: When you were a child and had these early crushes, which are so incredibly formative, you didn't want anybody else to know. Now imagine you're in a classroom, required to wear neural-interface brain wearables to track your attention and focus, which can pick up so much more information, including these kinds of amorous feelings.

Those are things you should be able to keep to yourself. Those are things that other people shouldn't have access to. Those are things that are so formative to self-identity. And so when I talk about these ideas of mental privacy and the importance of this last bastion of freedom, this last fortress, I think it's the most important fortress.

It's the one that's most formative to who you are as a human being.

CHAKRABARTI: About the workplace, then: It seems like there are two major sets of issues here. One is, A, how this technology can have an impact on workers, both positively and negatively. But, B, in terms of the economy that we all function in, this all sounds like surveillance capitalism potentially on megasteroids.

And we've got about a minute and a half before the next break, Professor Farahany. Can you walk us through a couple of the major questions, therefore, that you think we should be asking ourselves as a society right now, when it comes to commercial use, or capitalism's use, of this technology?

FARAHANY: Thank you. I think the first and most important is: Is there any justified use of brain metrics by employers? I outline an example in the book of commercial drivers who are already having their brain activity monitored for fatigue levels. Suppose the only thing that you were extracting through the algorithms and brainwave data was whether or not a commercial pilot or truck driver was sleepy or awake, and it was more precise than other kinds of information.

Maybe those are circumstances in which we might think it's a good use of the technology. But when you're using it to track attention, when periods of mind-wandering are punished rather than celebrated as the most important moments of insight, when you're introducing a more global surveillance of even what a person is thinking and feeling, I think that can undermine the ability of humans to flourish, to feel like they have trust in the workplace, to want to continue to think freely.

So those are the kind of worries that I have in that context.

CHAKRABARTI: So again, the need to put guardrails, legal and ethical guardrails around this.

Part III

CHAKRABARTI: Just before the break, Professor Farahany talked about a world in which kids in classrooms might put on headbands, and teachers would be able to measure how much they're focusing or how well they're able to concentrate on a given assignment. She wasn't just making that up, because that world actually exists.

The Wall Street Journal recently visited a classroom just a few hours outside of Shanghai to see how both AI and brain-computer interface technology are being used in Chinese classrooms.

WALL STREET JOURNAL: For this fifth grade class, the day begins with putting on a brain wave sensing gadget. Students then practice meditating.

The device is made in China and has three electrodes, two behind the ears and one on the forehead. These sensors pick up electrical signals sent by neurons in the brain. The neural data is then sent in real time to the teacher's computer. So while students are solving math problems, a teacher can quickly find out who's paying attention and who's not.

A report is then generated that shows how well the class was paying attention. It even details each student's concentration level at 10 minute intervals. It's then sent to a chat group for parents.

CHAKRABARTI: That's from a Wall Street Journal documentary about AI in China, and the Journal's reporters noted that it's not entirely clear what the headbands are measuring or whether they're accurate. But you can see the potential use and purpose of this kind of technology.

Now, combine that with what we already know about China's well-established surveillance state that is carefully observing its citizens.

DW DOCUMENTARY: Here in this Shanghai surveillance center, no resident goes unwatched. Hundreds of millions of cameras are installed all over China. We have algorithms that automatically recognize certain behaviors.

If someone isn't wearing a mask, for example, we immediately detect this wrongdoing.

CHAKRABARTI: So that's a little bit about China's established surveillance state from the documentary wing of the German broadcaster DW. Joining us now is Margaret Kosal. She's an assistant professor and teaches International Affairs at the Georgia Institute of Technology.

She's currently on leave to the Savannah River National Laboratory. Professor Kosal, welcome to you.

MARGARET KOSAL: Thank you, Meghna. I'm very happy to be here to talk about emerging technology and international security. I started my work as a PhD chemist and had experience in the high-tech startup field before I got to this work.

And by the way, I'm an associate professor not an assistant professor.

CHAKRABARTI: Oh, you know what? I had the word associate written on my page, but somehow my brain and mouth said assistant. My apologies. If I'd been wearing the right kind of device, maybe I wouldn't have made that mistake. But so go ahead.

Tell us a little bit more. And China is an easy place for us to focus. Because again, they already have such an established surveillance state. Do we know how the Chinese government is viewing the potential of this kind of brain computer interface technology?

KOSAL: There is a whole wealth of information to unpack there in your question.

And I do have to start out by saying that while I am currently a professor of international affairs at Georgia Tech, I have to convey that my views do not necessarily reflect the positions of the Department of Energy, the Department of Defense, or any other organization with which I've been affiliated. Back to your question.

Some of the empirical and quantitative work that I've done has looked at this question of the likely adoption of brain computer interfaces, as well as other neurotechnologies, particularly by China in comparison to the United States. First of all, understanding the inner workings of the PRC is often very difficult, but China has been quite articulate about some aspects of where it's going with what it calls the China Brain Project.

In particular, they have this articulated vision of what they call "one body, two wings": building the core and developing the applications. Some of this is intentionally aimed at effective approaches to the diagnosis and treatment of brain disorders, and some of it is intentionally aimed at more security-oriented applications of these types of technologies.

CHAKRABARTI: Okay, excuse me, pardon me, but let's, so let's focus on that second part. Because that's really where we wanted to take the conversation. Can you just give me what the maybe top concern is amongst the national security establishment here regarding China's view of the potential of brain computer interface technology?

KOSAL: So there hasn't been a specific concern articulated here within the United States with respect to any details. It sits within the broader concerns about Chinese technology, Chinese acquisition, and China's ability to challenge us. One of the concerns we have articulated in my work, looking at these different factors and studying not just the technology but how different technologies get deployed, is that these kinds of technologies, in particular BCIs, are more likely to be deployed and adopted first in the PRC.

CHAKRABARTI: Okay. So now help me understand something. Does China have this phrase, the sixth domain? I've seen it floating around. What does that mean?

KOSAL: So that's in reference to the war fighting domain.

So we have war fighting domains in the United States too. The sixth war fighting domain in China, in the People's Liberation Army, is the cognitive domain. The cognitive domain can be split into a number of different pieces. The biggest piece is information warfare, which can be everything from mischaracterization, disinformation, and traditional propaganda to use of the internet. But it can also include things that target neurotechnologies, target the brain, target the ability to undermine the self.

CHAKRABARTI: Wow. Okay. Associate Professor Margaret Kosal teaches International Affairs at Georgia Tech and currently on leave to the Savannah River National Laboratory. Thank you so much for joining us today.

KOSAL: You're most welcome.

CHAKRABARTI: Okay. Professor Farahany. Just give me your quick thoughts about what the China example tells us we need to be thinking about.

FARAHANY: So I think a couple of things. One is, we can think about it from a national security perspective in the United States. The Biden administration in late December 2021 sanctioned a number of Chinese companies for purportedly creating so-called brain control weapons. On this idea of the cognitive domain, there are both influence campaigns,

which is what we're worried about, for example, with TikTok shaping views and minds while picking up biometric data and precise profiles on American citizens, and also this anxiety about a kind of arms race in brain computer interfaces. And that could be everything from the development of super soldiers.

So there's been a lot of talk about that. There was even a conference recently held by the Commerce Department here in the U.S. with all of the major implanted BCI manufacturers about whether there should be export controls to prevent China from using our technology, and about what could become a race for capabilities within the military.

But then beyond the domains of influence and military use, there have been anxieties around the creation of weapons that could disable or disorient minds. And while the Havana syndrome cases have at this point largely been dismissed by the intelligence community as the work of foreign adversaries, there is still a lot of worry about that kind of focus on developing technologies, whether electromagnetic or microwave, that could be aimed at human brains and minds.

And I think we need to worry about it from a national security perspective. And then we also need to learn from and worry about government use of the technology in surveillance or in interference with freedom of thought. And so I worry about it not just from a U.S. v. China perspective, but what their example of the surveillance state shows us of interfering with what I think is the most important, again, aspect of what it means to have human flourishing, the ability to think freely.

And whether or not the technology works, if you're required, whether in the workplace or in everyday life, to wear brain computer interface technology that could be intercepted by the government, the informational asymmetry is usually so powerful that you might be afraid to even think bad thoughts or dissident thoughts.

I think their example teaches us a lot about the risks of the technology.

CHAKRABARTI: That's right. And not just in the national security context, but in our own lives, based on what governments can do to their own people anywhere in the world. Look, we're already living in a world where people are afraid to say certain things.

I can very much see the next step being that people are afraid to even think them. So in the last few minutes of the conversation, Professor Farahany, the real purpose of your book is, as you say, to get us to start thinking about a new aspect of freedom that we need to incorporate into social norms, into ethical guidelines, and into our legal structure.

And you call it cognitive liberty, which includes mental privacy, freedom of thought, and self-determination. So tell us more about how you conceive of this notion of cognitive liberty.

FARAHANY: Thank you. That, I think, is part of what gives me the optimism and the hope that we were talking about in the beginning of the conversation. We're at the forefront of this transformational moment with this technology, which really is going to become much more ubiquitous and part of our everyday lives.

And the question is, are we going to give up our rights, our mental privacy, our freedom of thought just as easily as we've given up all the rest of our privacy, in exchange for the convenience of typing or swiping with our minds? I think that at this moment, at this early juncture, we have a choice to make: to change the terms of service and tilt them in favor of individuals.

I see cognitive liberty as an update to our conception of liberty for the digital age. It's a concept that applies well beyond neurotechnology. It applies to how we think about social media, addictive technologies, neuromarketing, and weapons that are being designed to attack the brain.

And I think of it as both a legal and a cultural and social norm. As a legal norm, it would invite and require us to update our international human rights to recognize this right to cognitive liberty as a civil and political right, which would direct us to update three existing rights. First, our right to privacy, to explicitly include the right to mental privacy.

Second, freedom of thought, to apply more broadly than just religious freedom and belief, and to include a right not to have our thoughts used against us, not to be punished for our thoughts, and not to have our thoughts manipulated. And third, self-determination, updating it from what has really been understood as a political and collective right to an individual right: to access our own brains, to be able to enhance or change them, and to determine how we want to shape our own mental experiences.

CHAKRABARTI: I want to continue living in a world where my last truly safe and protected space is inside my own mind, right? It's our final retreat, right?

FARAHANY: Yes. Yes. It is our last fortress. And it's one I think we can't afford to quietly give up. I think it's so urgent that people join the conversation and the call to action now, because it will be too late to claw it back later.

But it isn't too late now to really define the way in which this technology will be integrated into society and how our relationships to others will be when it comes to the most precious thing we have, which is our minds, our ability to think freely.

CHAKRABARTI: We have one minute left, Professor Farahany, and there's something you teased us with a little earlier that I'd love to close with, and that is your own family background, your cultural background, and how this plays into how you're thinking about these technologies.

FARAHANY: So I'm Iranian American. My parents left Iran about a decade before the revolution, always intending to go back. They weren't able to as the political unrest occurred, but all of my first cousins and all of my aunts and uncles still live in Iran. And I've grown up in a world in which I understand and see people who are afraid to speak freely.

Family members who are afraid to tell us what's happening for fear of being persecuted. And that world, those conversations, have, I think, attuned me to the ways in which technology can be misused, the ways in which surveillance can interfere with people's ability to rise up and defend their own freedom and their own rights.

It's a world I don't want us to see unfold through a kind of Orwellian future of neurotechnology. Instead, we must ensure we safeguard our right to think freely.

CHAKRABARTI: But you also talk about how in the Iran example, you see the power of technology to mobilize people for change. All those Iranian women, for example.

FARAHANY: Yes. And the use of Twitter during the Green Revolution. There is hope and there is peril, and we get to decide which one we champion.

Book Excerpt

From The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology by Nita A. Farahany. Copyright © 2023 by the author and reprinted by permission of St. Martin’s Publishing Group.

This program aired on March 27, 2024.

Stefano Kotsonis Senior Producer, On Point
Stefano Kotsonis is a senior producer for WBUR's On Point.

Meghna Chakrabarti Host, On Point
Meghna Chakrabarti is the host of On Point.
