Colorado Bill Aims to Protect Consumer Brain Data
Consumers have grown accustomed to the prospect that their personal data, such as email addresses, social contacts, browsing history and genetic ancestry, is being collected and often resold by the apps and digital services they use.

With the advent of consumer neurotechnologies, the data being collected is becoming ever more intimate. One headband serves as a personal meditation coach by monitoring the user’s brain activity. Another purports to help treat anxiety and symptoms of depression. Another reads and interprets brain signals while the user scrolls through dating apps, presumably to provide better matches. (“‘Listen to your heart’ is not enough,” the manufacturer says on its website.)

The companies behind such technologies have access to the records of the users’ brain activity — the electrical signals underlying our thoughts, feelings and intentions.

On Wednesday, Governor Jared Polis of Colorado signed a bill that, for the first time in the United States, tries to ensure that such data remains truly private. The new law, which passed by a 61-to-1 vote in the Colorado House and a 34-to-0 vote in the Senate, expands the definition of “sensitive data” in the state’s current personal privacy law to include biological and “neural data” generated by the brain, the spinal cord and the network of nerves that relays messages throughout the body.

“Everything that we are is within our mind,” said Jared Genser, general counsel and co-founder of the Neurorights Foundation, a science group that advocated the bill’s passage. “What we think and feel, and the ability to decode that from the human brain, couldn’t be any more intrusive or personal to us.”

The law takes aim at consumer-level brain technologies. Unlike sensitive patient data obtained from medical devices in clinical settings, which are protected by federal health law, the data surrounding consumer neurotechnologies go largely unregulated, Mr. Genser said. That loophole means that companies can harvest vast troves of highly sensitive brain data, sometimes for an unspecified number of years, and share or sell the information to third parties.

Supporters of the bill expressed their concern that neural data could be used to decode a person’s thoughts and feelings or to learn sensitive facts about an individual’s mental health, such as whether someone has epilepsy.

“We’ve never seen anything with this power before — to identify, codify people and bias against people based on their brain waves and other neural information,” said Sean Pauzauskie, a member of the board of directors of the Colorado Medical Society, who first brought the issue to the attention of State Representative Cathy Kipp, one of the bill’s sponsors. Mr. Pauzauskie was recently hired by the Neurorights Foundation as medical director.

The new law extends to biological and neural data the same protections granted under the Colorado Privacy Act to fingerprints, facial images and other sensitive biometric data.

Among other protections, consumers have the right to access, delete and correct their data, as well as to opt out of the sale or use of the data for targeted advertising. Companies, in turn, face strict regulations regarding how they handle such data and must disclose the kinds of data they collect and their plans for it.

“Individuals ought to be able to control where that information — that personally identifiable and maybe even personally predictive information — goes,” said State Senator Mark Baisley, one of the bill’s sponsors.

Experts say that the neurotechnology industry is poised to expand as major tech companies like Meta, Apple and Snapchat become involved.

“It’s moving quickly, but it’s about to grow exponentially,” said Nita Farahany, a professor of law and philosophy at Duke.

From 2019 to 2020, investments in neurotechnology companies rose about 60 percent globally, and in 2021 they amounted to about $30 billion, according to one market analysis. The industry drew attention in January, when Elon Musk announced on X that a brain-computer interface manufactured by Neuralink, one of his companies, had been implanted in a person for the first time. Mr. Musk has since said that the patient had made a full recovery and was now able to control a mouse solely with his thoughts and play online chess.

While the prospect of machines reading brain signals can seem eerily dystopian, some brain technologies have led to breakthrough treatments. In 2022, a completely paralyzed man was able to communicate using a computer simply by imagining his eyes moving. And last year, scientists were able to translate the brain activity of a paralyzed woman and convey her speech and facial expressions through an avatar on a computer screen.

“The things that people can do with this technology are great,” Ms. Kipp said. “But we just think that there should be some guardrails in place for people who aren’t intending to have their thoughts read and their biological data used.”

That is already happening, according to a 100-page report published on Wednesday by the Neurorights Foundation. The report analyzed 30 consumer neurotechnology companies to see how their privacy policies and user agreements squared with international privacy standards. It found that all but one company had access to users’ neural data and placed no meaningful restrictions on that access, and that almost two-thirds could, under certain circumstances, share data with third parties. Two companies implied that they already sold such data.

“The need to protect neural data is not a tomorrow problem — it’s a today problem,” said Mr. Genser, who was among the authors of the report.

The new Colorado bill won resounding bipartisan support, but it faced fierce external opposition, Mr. Baisley said, especially from private universities.

Testifying before a Senate committee, John Seward, research compliance officer at the University of Denver, a private research university, noted that public universities were exempt from the Colorado Privacy Act of 2021. The new law puts private institutions at a disadvantage, Mr. Seward testified, because they will be limited in their ability to train students who are using “the tools of the trade in neural diagnostics and research” purely for research and teaching purposes.

“The playing field is not equal,” Mr. Seward testified.

The Colorado bill is the first of its kind to be signed into law in the United States, but Minnesota and California are pushing for similar legislation. On Tuesday, California’s Senate Judiciary Committee unanimously passed a bill that defines neural data as “sensitive personal information.” Several countries, including Chile, Brazil, Spain, Mexico and Uruguay, have either already enshrined protections on brain-related data in their state-level or national constitutions or taken steps toward doing so.

“In the long run,” Mr. Genser said, “we would like to see global standards developed,” for instance by extending existing international human rights treaties to protect neural data.

In the United States, proponents of the new Colorado law hope it will establish a precedent for other states and even create momentum for federal legislation. But the law has limitations, experts noted, and might apply only to consumer neurotechnology companies that are gathering neural data specifically to determine a person’s identity, as the new law specifies. Most of these companies collect neural data for other reasons, such as for inferring what a person might be thinking or feeling, Ms. Farahany said.

“You’re not going to worry about this Colorado bill if you’re any of those companies right now, because none of them are using them for identification purposes,” she added.

But Mr. Genser said that the Colorado Privacy Act protects any data that qualifies as personal. Given that consumers must supply their names in order to purchase a product and agree to company privacy policies, the neural data those companies collect is tied to an identity and qualifies as personal data, he said.

“Given that previously neural data from consumers wasn’t protected at all under the Colorado Privacy Act,” Mr. Genser wrote in an email, “to now have it labeled sensitive personal information with equivalent protections as biometric data is a major step forward.”

In a parallel Colorado bill, the American Civil Liberties Union and other human-rights organizations are pressing for more stringent policies surrounding the collection, retention, storage and use of all biometric data, whether for identification purposes or not. If that bill passes, its protections would apply to neural data as well.

Big tech companies played a role in shaping the new law, arguing that it was overly broad and risked harming their ability to collect data not strictly related to brain activity.

TechNet, a policy network representing companies such as Apple, Meta and OpenAI, successfully pushed to include language focusing the law on regulating brain data used to identify individuals. But the group failed to remove language governing data generated by “an individual’s body or bodily functions.”

“We felt like this could be very broad to a number of things that all of our members do,” said Ruthie Barko, executive director of TechNet for Colorado and the central United States.
