[Image: A collage featuring a locked login screen, medical symbols, a concerned woman holding a "Confidential" sign, a hacker-like figure, and insurance-denial imagery, with the message "No Aggregation Without Representation."]

Most of us assume that when our health info moves between a doctor’s office, a hospital, a pharmacy, or an app, there are basic safety checks in place—like “you can’t log in without strong protection,” “your data is encrypted,” and “there’s a record of who accessed it.” Those basics matter because when health data leaks or gets misused, the harm isn’t abstract. It can mean an abuser finds you, an employer learns something they shouldn’t, your insurance situation gets complicated, or you lose trust in care and stop seeking it.

That’s why we commented on HTI-5, a proposed federal rule in which ONC (the federal office that sets standards for certified health IT) aims to reduce burden and increase flexibility for technology developers. The problem is that one major part of the proposal would remove existing security certification requirements—the baseline “floor” that helps ensure certified systems meet minimum security expectations.

At The Light Collective, we say “No Aggregation Without Representation.” Health data is being shared and aggregated more than ever—and increasingly used in analytics and AI—yet patients aren’t consistently represented in the agreements and decisions that shape how that data is secured and reused. When people don’t have real power in the system, baseline security becomes even more important.

The key issues we raised

  • Don’t create a security gap. If the old security rules are removed before replacement protections are real and enforceable, patients could be exposed during the transition.
  • Keep a minimum security floor. Some basics shouldn’t be optional: strong sign-in protections (like multi-factor authentication), encryption, and audit logs (so misuse can be detected).
  • Independent security checks still matter. If ONC wants to reduce program testing, there should still be credible third-party security assessments so “secure” isn’t just a marketing claim.
  • Transparency people can actually use. Patients and providers should be able to quickly see what security protections a product supports and whether health data might be used for AI-related purposes.
  • Count real-world harm, not just paperwork burden. Breaches and misuse create safety and civil rights impacts—especially for people facing domestic violence, stigma, discrimination, or immigration-related risk.

What we’re asking

We’re not trying to block innovation. We’re asking for common-sense guardrails so that modernizing health tech doesn’t come at the expense of patient safety. If ONC changes the rules, it should do so in a way that doesn’t lower the baseline protections people rely on—especially when health data is moving faster and farther than most patients can track or control.

If you want to read our full public comment, here is the link.

