On April 7, 2026, Anthropic did something no frontier AI company has ever done before. They finished training their most capable model — and announced they would not be releasing…
Most of us assume that when our health info moves between a doctor’s office, a hospital, a pharmacy, or an app, there are basic safety checks in place—like “you can’t log in without strong protection,” “your data is encrypted,” and “there’s a record of who accessed it.” Those basics matter because when health data leaks or gets misused, the harm isn’t abstract. It can mean an abuser finds you, an employer learns something they shouldn’t, your insurance situation gets complicated, or you lose trust in care and stop seeking it.
Protecting your privacy online can feel overwhelming. But we can help. As you browse for health conditions or advice—looking up symptoms, medications, or clinics—ad trackers can quietly broadcast clues about what you’re reading and where you are, billions of times a day across the web. Data brokers turn those clues into “audience lists” of people likely dealing with things like asthma, depression, or diabetes (sometimes even tagging caregivers or people in government or medical roles), even when platforms say they ban this. And once your data is pushed into that system, there’s no practical way to control who sees it, how it’s combined with other data, or who it’s sold to next.
When health systems and policymakers think about designing AI, they think about the clinic—slow, controlled, locked inside compliance and the limited scope of a clinical encounter. Technology companies like Meta, Google, and Microsoft, on the other hand, think about “users”—scaling fast, collecting data first, and asking forgiveness never. And right in that no-man’s land sit patients, who are using AI every day without rights, safety nets, or protections. They’re treated as neither full citizens of the clinic nor valued customers of tech—just data streams to be mined. That gap is where harm festers, where safety issues linger, where trust collapses, and where the most vulnerable are left to carry the risk alone.
Your DNA isn’t just personal — it’s permanently identifying, impossible to change, and increasingly valuable to insurers, researchers, marketers, and law enforcement. As technology advances, your genetic and health data becomes more profitable to others and more predictive of your future — not just your ancestry, but your disease risk, reproductive potential, even your kids’ health. And yet, the protections around it remain fragile, fractured, and easy to sidestep.
Today is World Patient Safety Day, and The Light Collective is proud to share new community-based participatory research on patient priorities in health technology. Health data regulations today are fraught…
Are you at a hospital that can’t help patients? Here is the short-term fix.
We lost our friend, colleague, and bright light in the breast cancer community Marlena Murphy. To know and work with Marlena was to love her. In 2018, at the young…
In a nutshell, the ruling means that hospitals can share patient browsing data with Meta, TikTok, and other third parties via adtech when patients view health-related content, voiding part of OCR’s ban on tracking technologies.
This year we recognized an unmet need to bring patient community representation into shaping the governance of AI policy, through our diverse and growing coalition of patient advocacy organizations and leaders. To address…
This year’s summit will feature discussions led by healthcare providers, cybersecurity researchers, public health policymakers, representatives of rural healthcare, and more. The event will focus on improving long-term resilience and sustainability of the public health sector, examining effectiveness and efficiency of so-called “best efforts” to date. Using real world examples, we will help to build a better framework for improving healthcare cybersecurity, drawing on public policy, operational experience, and economics.
This webinar features the latest updates from The Light Collective. Tommy Wang, PhD, will moderate a panel of speakers, including Andrea Downing and Valencia Robinson from The Light Collective, Ysabel Duron from the Latino Cancer Institute, and Liz Salmi from OpenNotes.
On April 26, 2024, the Federal Trade Commission (FTC) issued its finalized changes to the Health Breach Notification Rule. Some may remember the FTC’s prior failure to provide protections for health groups back in 2019. At the time, patients filed an FTC complaint about the privacy of ‘Closed’ groups, yet the complaint went unheeded. It is notable to see that, after five years, the tide is turning toward stronger consumer protection and health privacy.
In April 2024, the FTC (Federal Trade Commission) made important updates to a rule designed to protect people’s health information, especially when it’s managed by health apps and wearable devices that aren’t covered by traditional health privacy laws like HIPAA.
In response to national initiatives forming artificial intelligence (AI) standards, codes of conduct, and bills of rights in healthcare settings—largely in the absence of input from patient communities—The Light Collective today launched AI Rights for Patients.
Check out this resource from GoInvo which visually describes how your health data are used by different entities. When we as patients lack access to our own complete medical records, one thing to consider is how many others had such easy access to our data.
A 2023 study by Joanne Kim at Duke University found that data brokers are marketing highly sensitive data on individuals’ mental health conditions on the open market, with seemingly minimal…
“If you know the enemy and know yourself, you need not fear the result of a hundred battles.” ― Sun Tzu, The Art of War

AI Prompt: Sun Tzu, The Art of…
In an age where our lives are increasingly digitized, and patients are often reduced to numbers and odds of survival – our stories matter more than ever. We are very…
Safety – Back in the Day

For over a decade, I’ve had the privilege of engaging with various patient communities, both in person and online. During this time, I’ve witnessed…
Building a Protective Shield For Patient Communities Online

One of the most important, and hardest, lessons I’ve learned in life is to ask for help. My struggle has always been…
by Rachel Tobac, Social Proof Security

What happened in this data leak?

Cyber criminals were able to find passwords that were involved in other breaches online and use a method…
by Tambre Leighn, Well Beyond Ordinary

We’ve got a secret

In terms of healthcare privacy, my first significant encounter came when I was a young adult caring for my late…
We are at DEF CON this week presenting our latest work, and the timeline since our proof-of-concept study published last year.
The Light Collective is thrilled to announce our partnership with CancerX, a public-private partnership co-hosted by Moffitt Cancer Center and the Digital Medicine Society (DiMe).
The Light Collective is advancing the collective rights, voices, and interests of patient communities in healthcare technology. You can read all about our mission here. Healthcare technology can move faster than our…
It is with a heavy heart that we announce the unexpected passing of our dear friend KJ Surkan. KJ passed away peacefully in his sleep on Saturday, January 28th. Born February 6, 1969, KJ was a light in this world and was an integral part of our community. There are no words to express our deep sorrow but we are grateful to have known KJ.
It is both appropriate and regrettable that October is Breast Cancer Awareness Month and Cybersecurity Awareness Month. As breast cancer survivors, collaborators with online patient communities, and founders of an organization representing the rights and voices of patients in healthcare technology, we are committed to using technology to educate, support, and empower as many breast cancer patients as possible so they can have the knowledge and tools to beat cancer and live their best lives.
A new report from George Washington University’s Digital Trade and Data Governance Hub rates 68 countries on their approaches to data governance, scoring each country on 26 factors across six categories.
While these changes were widely praised and covered in the media as a step toward making the platform safer, this new analysis examines the current state of ad-targeting tools that infer an individual’s “interests” as a proxy for medical history.
Facebook pledged to remove race, health conditions, and political affiliation from ad-targeting options, but The Markup found advertisers can still easily target the same people.

By: Angie Waller and Colin…
We are expanding our team and hiring part time patient leaders to work on an exciting new project.
Our latest research was featured in Wired and Ars Technica!
We are thrilled to have the chance to make the news again, and will continue to shed light on ways patient communities can have their privacy and collective rights respected.
Such organizations often deal in sensitive issues, like mental health, addiction, and reproductive rights—and many are feeding data about website visitors to corporations.
There have been many new developments since we originally found this security flaw with Facebook’s group settings that impacted millions of people. SicGRL is no longer the only issue that the patient community has with Facebook. There have also been subsequent vulnerabilities discovered beyond SicGRL that impact the safety of Facebook users. This blog post is a high-level summary and update about what has changed since the FTC got involved—and, importantly, what has not changed.
Patients have been innovating for decades. HIV activists forced FDA action. The cystic fibrosis community moved Kalydeco from bench to bedside. People with breast cancer organized for access to Herceptin. Type 1 diabetes advocates normalized continuous glucose monitoring. Long COVID groups mapped symptoms and pushed for repurposed therapies. Different diseases, same playbook: build community‑run networks, get smart on the science, rewire trials and policy, and then use targeted leverage to change the rules — fast.
A growing number of experts and lawmakers are sounding the alarm on how AI and your personal data are being used by the federal government—with little oversight and massive potential consequences. This post outlines what you can do.
FOR IMMEDIATE RELEASE

The Light Collective, a patient advocacy group committed to protecting privacy and patient rights, today announced it has submitted a formal complaint to the Federal Trade Commission…