New neurotechnology is blurring the lines around mental privacy – but are new human rights the answer?

by Laura Y. Cabrera

A woman tries out neurotechnology equipment during Tech Week in Bucharest, Romania, in May 2023. IMAGE/Cristian Cristel/Xinhua via Getty Images

Neurotechnologies – devices that interact directly with the brain or nervous system – were once dismissed as the stuff of science fiction. Not anymore. Several companies are developing and some are even testing “brain-computer interfaces,” or BCIs, of which the most high-profile is likely Elon Musk’s Neuralink. He announced on Jan. 29, 2024, that the first human in the company’s clinical trials has received a brain implant.

Like other companies, Neuralink’s immediate goal is to improve autonomy for patients with severe paralysis or other neurological disorders.

But not all BCIs are envisioned for medical use: There are EEG headsets that sense electrical activity inside the wearer’s brain, with applications ranging from entertainment and wellness to education and the workplace. Yet Musk’s ambitions go beyond these therapeutic and nonmedical uses. Neuralink aims to eventually help people “surpass able-bodied human performance.”

Neurotechnology research and patents have soared at least twentyfold over the past two decades, according to a United Nations report, and devices are getting more powerful. Newer devices have the potential to collect data from the brain and other parts of the nervous system more directly, with higher resolution, in greater amounts and in more pervasive ways.

However, these improvements have also raised concerns about mental privacy and human autonomy – questions I think about in my research on the ethical and social implications of brain science and neural engineering. Who owns the generated data, and who should get access? Could this type of device threaten individuals’ ability to make independent decisions?

In July 2023, the U.N. agency for science and culture held a conference on the ethics of neurotechnology, calling for a framework to protect human rights. Some critics have even argued that societies should recognize a new category of human rights, “neurorights.” In 2021, Chile became the first country whose constitution addresses concerns about neurotechnology.

Advances in neurotechnology do raise important privacy concerns. However, I believe these debates can overlook more fundamental threats to privacy.

A glimpse inside

Concerns about neurotechnology and privacy focus on the idea that an observer can “read” a person’s thoughts and feelings just from recordings of their brain activity.

It is true that some neurotechnologies can record brain activity with great specificity: for example, developments in high-density electrode arrays that allow for high-resolution recording from multiple parts of the brain.

Read the full article at The Conversation.
