In a future with more ‘mind reading,’ thanks to neurotech, we may need to rethink freedom of thought

April 10, 2024


Our minds are buffeted by all kinds of influences, though some seem more menacing than others.
wenjin chen/DigitalVision Vectoria via Getty Images

Parker Crutchfield, Western Michigan University

Socrates, the ancient Greek philosopher, never wrote things down. He warned that writing undermines memory – that it is nothing but a reminder of some previous thought. Compared to people who discuss and debate, readers “will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing.”

These views may seem peculiar, but his central fear is a timeless one: that technology threatens thought. In the 1950s, Americans panicked about the possibility that advertisers would use subliminal messages hidden in movies to trick consumers into buying things they didn’t really want. Today, the U.S. is in the middle of a similar panic over TikTok, with critics worried about its impact on viewers’ freedom of thought.

To many people, neurotechnologies seem especially threatening, although they are still in their infancy. In January 2024, Elon Musk announced that his company Neuralink had implanted a brain chip in its first human subject – though competitors had reached that milestone well before. Fast-forward to March, and that person could already play chess with just his thoughts.

Brain-computer interfaces, called BCIs, have rightfully prompted debate about the appropriate limits of technologies that interact with the nervous system. Looking ahead to the day when wearable and implantable devices may be more widespread, the United Nations has discussed regulations and restrictions on BCIs and related neurotech. Chile has even enshrined neurorights – special protections for brain activity – in its constitution, while other countries are considering doing so.

A cornerstone of neurorights is the idea that all people have a fundamental right to determine what state their brain is in and who is allowed to access that information, the way that people ordinarily have a right to determine what is done with their bodies and property. It’s commonly equated with “freedom of thought.”

Many ethicists and policymakers think this right to mental self-determination is so fundamental that it is never OK to undermine it, and that institutions should impose strict limits on neurotech.

But as my research on neurorights argues, protecting the mind isn’t nearly as easy as protecting bodies and property.

Thoughts vs. things

Creating rules that protect a person’s ability to determine what is done to their body is relatively straightforward. The body has clear boundaries, and things that cross it without permission are not allowed. It is normally obvious when a person violates laws prohibiting assault or battery, for example.

The same is true about regulations that protect a person’s property. Protecting body and property are some of the central reasons people come together to form governments.

Generally, people can enjoy these protections without dramatically limiting how others want to live their lives.

The difficulty with establishing neurorights, on the other hand, is that, unlike bodies and property, brains and minds are under constant influence from outside forces. It’s not possible to fence off a person’s mind such that nothing gets in.

A light-colored wood fence set against a cloudy sky.
Go ahead, build a fence – but it’s easier to keep out intruders than ideas.
duckycards/E+ via Getty Images

Instead, a person’s thoughts are largely the product of other people’s thoughts and actions. Everything from how a person perceives colors and shapes to their most basic beliefs is influenced by what others say and do. The human mind is like a sponge, soaking up whatever it happens to be immersed in. Regulations might be able to control the types of liquid in the bucket, but they can’t protect the sponge from getting wet.

Even if that were possible – if there were a way to regulate people’s actions so that they don’t influence others’ thoughts at all – the regulations would be so burdensome that no one would be able to do much of anything.

If I’m not allowed to influence others’ thoughts, then I can never leave my house, because just by doing so I cause people to think and act in certain ways. And as the internet further expands a person’s reach, not only would I be unable to leave the house, I also couldn’t “like” a post on Facebook, leave a product review, or comment on an article.

In other words, protecting one aspect of freedom of thought – someone’s ability to shield themselves from outside influences – can conflict with another aspect of freedom of thought: freedom of speech, or someone’s ability to express ideas.

Neurotech and control

But there’s another concern at play: privacy. People may not be able to completely control what gets into their heads, but they should have significant control over what goes out – and some people believe societies need “neurorights” regulations to ensure that. Neurotech represents a new threat to our ability to control what thoughts people reveal to others.

There are ongoing efforts, for example, to develop wearable neurotech that would read and adjust the customer’s brainwaves to help them improve their mood or get better sleep. Even though such devices can only be used with the consent of the user, they still take information out of the brain, interpret it, store it and use it for other purposes.

In experiments, it is also becoming easier to use technology to gauge someone’s thoughts. Functional magnetic resonance imaging, or fMRI, can be used to measure changes in blood flow in the brain and produce images of that activity. Artificial intelligence can then analyze those images to interpret what a person is thinking.

Neurotechnology critics fear that as the field develops, it will be possible to extract information about brain activity regardless of whether or not someone wants to disclose it. Hypothetically, that information could one day be used in a range of contexts, from research for new devices to courts of law.

A tiny golden brain, about to be hit by a wooden gavel with a gold band on it.
Should there be limits on how people’s brain activity is used as legal evidence?
porcorex/iStock via Getty Images

Regulation may be necessary to protect people from neurotech taking information out. For example, nations could prohibit companies that make commercial neurotech devices, like those meant to improve the wearer’s sleep, from storing the brainwave data those devices collect.

Yet I would argue that it may not be necessary, or even feasible, to protect against neurotech putting information into our brains – though it is hard to predict what capabilities neurotech will have even a few years from now.

In part, this is because I believe people tend to overestimate the difference between neurotech and other types of external influence. Think about books. Horror novelist Stephen King has said that writing is telepathy: When an author writes a sentence – say, describing a shotgun over the fireplace – they spark a specific thought in the reader.

In addition, there are already strong protections on bodies and property, which I believe could be used to prosecute anyone who forces invasive or wearable neurotech upon another person.

How different societies will navigate these challenges is an open question. But one thing is certain: With or without neurotech, our control over our own minds is already less absolute than many of us like to think.

Parker Crutchfield, Professor of Medical Ethics, Humanities, and Law, Western Michigan University

This article is republished from The Conversation under a Creative Commons license. Read the original article.
