Kisha Walker
Your Brain Waves Are Safe Here: Colorado Passes The First U.S. Brainwave Privacy Law
2024-04-30
Colorado passed legislation to prevent companies from selling your brainwaves. But is it enough to stop the likes of Meta and Apple?
Colorado Governor Jared Polis just signed a bill into law that will protect people’s brainwaves, the first legislation of its kind. The bill had impressive bipartisan support, passing by a 34-to-0 vote in the state Senate and 61-to-1 in the House.
Sponsors of the bill said it was necessary because rapid advances in neurotechnology are making it increasingly feasible, and profitable, to scan, analyze and sell mental data.
State representative Cathy Kipp, a sponsor of the legislation, said in a statement that while advancements in the neurotechnology field hold great promise for improving the lives of many people, “we must provide a clear framework to protect Coloradans’ personal data from being used without their consent while still allowing these new technologies to develop.”
The law expands the definition of “sensitive data” under the Colorado Privacy Act to include “neural data,” meaning data derived from a person’s brainwaves. Neural data will now receive the same protections as fingerprints, facial scans and other sensitive biometric data.
While sensitive data collected from medical devices is already protected by federal health law, data from consumer-level brain technologies is largely unregulated.
Plenty of devices that record your brain data can be bought on Amazon right now (like the Muse headband, which uses EEG sensors to read patterns of activity in your brain, then cues you on how to improve your meditation). Since these aren’t marketed as medical devices, they’re not subject to federal regulations, and companies can collect and sell your data.
State senator Kevin Priola, another of the bill’s sponsors, said that neurotechnology “is no longer confined to medical or research settings” and that when it comes to consumer products, the industry “can currently operate without regulation, data protection standards, or equivalent ethical constraints.”
The bill states that these technologies “raise particularly pressing privacy concerns given their ability to monitor, decode, and manipulate brain activity,” noting that these technologies cause an involuntary disclosure of information.
Therefore, the law focuses on closing that loophole.
The Neurorights Foundation, a non-profit promoting the ethical development of neurotechnology, said Colorado’s bill, which it supported, was the first of its kind in the U.S.
While Colorado is leading the way, California has a similar bill, the NeuroRights Act (SB 1223), making its way through the state legislature. Authored by Senator Josh Becker (D-Menlo Park), SB 1223 defines neural data as sensitive personal information and applies the same consumer protections already in state law for other personal information to an individual’s neural data. Lawmakers in Minnesota are working on their own version as well; their draft would amend Minnesota Statutes 2022, section 13.04, by adding a subdivision on the “right to mental data”:
(a) An individual has the right to mental privacy. A government entity must not, without informed consent, collect data transcribed directly from brain activity.
(b) An individual has the right to cognitive liberty. A government entity must not interfere with the free and competent decision making of the individual when making neurotechnology decisions.
Several countries including Chile, Brazil, Spain and Mexico have already given brain data constitutional protection, or are taking steps to do so.
Indeed, the neurotechnology industry is growing rapidly. According to Harvard Business Review, the global market is expanding at an annual rate of 12% and is expected to surpass $21 billion by 2026, up from $13.47 billion in 2023.
Neurotechnology devices have traditionally been used in the medical field, but they are increasingly being marketed to consumers. Big Tech giants like Apple and Meta Platforms (META.O), the parent company of Facebook and Instagram, along with Elon Musk’s Neuralink, are developing technology that can detect brain activity and potentially put it to commercial use. Mined brain data has endless potential, be it to better target ads, exploit human moods, sell more stuff or regenerate lost brain function. As mentioned above, some products are already on the market.
According to a report from the NeuroRights Foundation, there are at least 30 products available for purchase by members of the public. These devices fall into the wellness, recreation/entertainment or research categories.
U.S. neurologist Sean Pauzauskie used to rely exclusively on expensive and cumbersome hospital equipment to capture his patients’ brainwaves and analyze problems in their electrical pathways.
In recent years, however, non-invasive brain-computer interfaces (BCIs) have gained the ability to decode continuous language from the brain, enabling an outside observer to read the general gist of what we’re thinking even if we haven’t uttered a single word.
This is possible due to the marriage of two technologies: fMRI scans, which measure blood flow to different areas of the brain, and large AI language models, similar to the now-famous ChatGPT.
The Colorado neurologist has turned to consumer headbands, commonly sold online to monitor sleep patterns or boost brain function, to capture the brain activity of some patients suffering seizures.
This option is cheaper and easier to use. The headbands, which can cost just a few hundred dollars, capture electronic data similar to that of state-of-the-art hospital machines, only with far less fuss.
“In the beginning I was thrilled, I thought: ‘patients can even do all this themselves, at home,’” he told Context.
“But then I thought: ‘wait a second, that means all their brain data is going to some private company.’”