
By Fred Trotter

Over the past year, I have been part of a grassroots team of patient advocates, cybersecurity experts, and legal experts who have sought to close a massive security vulnerability that impacts health-related support groups on Facebook. I was a co-signer on an FTC complaint about Facebook. The complaint covers (in detail) the issues that Andrea Downing first discovered with the default settings on health support groups on Facebook. We named this vulnerability “SicGRL.” You can read about SicGRL, including the Facebook response to the problem and our subsequent FTC complaint, at missingconsent.org.

There have been many new developments since we originally found this security flaw in Facebook's group settings, a flaw that impacted millions of people. SicGRL is no longer the only issue the patient community has with Facebook: subsequent vulnerabilities have been discovered that also affect the safety of Facebook users. This blog post is a high-level summary and update about what has changed since the FTC got involved, and, importantly, what has not changed.

What is the SicGRL vulnerability?

All systems have flaws, and when those flaws expose information about users, we call it a vulnerability. As a cybersecurity expert who co-founded the Healthcare Industry CyberSecurity Task Force, and who originally trained at the US Air Force Information Warfare Center, I have overseen the resolution of many health data breaches in my career. The SicGRL problem has been unprecedented in scale and impact.

This vulnerability impacted anyone who became a member of a health support group on Facebook in the last decade. While the vulnerability is partially closed, it has not been fixed by Facebook. As of this writing, any group member can download the real names and (usually) locations of all of the other members of the group. When group membership represents a clinical fact, this constitutes a breach of healthcare data. For example, this can happen if a group is "only for people with Diabetes," i.e., there is a "strict inclusion" screening requirement that you must have a specific medical or other condition in order to join the group.

Instead of focusing on a specific individual, a malicious actor would seek to join as many "closed" (now renamed "private visible") support groups as possible, creating a lookup table of which individuals were in which groups. A SicGRL attack on a closed group can then create data profiles of these members by doing a "reverse lookup" that matches the group's membership list against the clinical fact the group implies, all without the consent or knowledge of the members. Hence, we call the problem a "Strict Inclusion Closed Group Reverse Lookup" attack, or SicGRL for short.
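To make the mechanics concrete, here is a minimal sketch of a reverse lookup in Python. The group names, member data, and data structures are all invented for illustration; this is not Facebook's actual data format, only a demonstration of how little work an attacker needs to do once membership lists are in hand.

```python
# Hypothetical illustration of a SicGRL-style reverse lookup.
# All groups and members below are invented; this is not Facebook data.
from collections import defaultdict

# What a group member can download: the real names and (usually)
# locations of every member of each strict-inclusion group they join.
scraped_groups = {
    "Living With Type 1 Diabetes": [("Alice Example", "Austin"),
                                    ("Bob Example", "Boston")],
    "BRCA Previvor Support":       [("Alice Example", "Austin")],
}

# The "reverse lookup": flip the group -> members listings into
# member -> inferred-health-condition profiles, without consent.
profiles = defaultdict(set)
for group_name, members in scraped_groups.items():
    for name, location in members:
        profiles[(name, location)].add(group_name)

for (name, location), groups in sorted(profiles.items()):
    print(f"{name} ({location}) is a member of: {sorted(groups)}")
```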

In its most dangerous permutation, before July 2018, Facebook exposed information in a way that allowed non-members of a group to exploit the SicGRL vulnerability. Before that time, attackers could scrape thousands of "Closed" groups at once without being members of those groups, and without any group member knowing that this had been done. After we reported the problem through Facebook's white hat portal, Facebook changed its design so that only members of a group could download the information. We consider this a major victory.

Yet Facebook never admitted that there was a problem, nor did it admit to a data breach, which is required by law in these situations. Facebook's quiet change to its privacy settings slows the pace at which the problem can be exploited, but it does not solve the problem. The remedy was more like putting a band-aid on an infected wound.

Our December FTC complaint estimated that a large number of significant casualties, including serious injuries and deaths, resulted from the current privacy design (most of these are the documented cases of Facebook-related violence in Myanmar). We believe that this vulnerability was also involved in casualties that have been documented among those seeking treatment for opioid addiction.

Didn’t the FTC agreement fix the SicGRL problem?

We hoped the FTC would respond to our complaint by requiring Facebook to put strict controls on its groups product. Indeed, in other specific areas, like facial recognition technology and password management, the new agreement did impose specific requirements on Facebook.

The FTC did not address the egregious ongoing flaws in the privacy design of health-related support groups. Nor has the FTC addressed the casualties that continue to result from those flaws, and the agreement absolved both Facebook and its executives from responsibility for them. The specific language was:

Furthermore, this Stipulated Order resolves all consumer-protection claims known by the FTC prior to June 12, 2019, that Defendant, its officers, and directors violated Section 5 of the FTC Act.

Facebook's support group product is clearly a Personal Health Record (PHR) under the FTC definition, and despite this, the FTC failed to regulate the product as a PHR. The new settlement with Facebook makes no mention of our complaint, nor of the fact that some of Facebook's products are PHRs.

We are not sure why the FTC has ignored this, but at this point, we regard the FTC as much to blame for failing to regulate Facebook’s clinical support products as Facebook is for failing to fix the underlying problem.  

Wasn’t there a Congressional investigation?

In February 2019, the House Energy and Commerce Committee asked Facebook to provide a staff briefing on this issue. While that sounds important, it is not the same thing as Congressional testimony. A staff briefing has no video recording, and any written materials provided to staffers are not covered under the Freedom of Information Act.

That means that a congressional staffer asked a Facebook employee to explain a complicated topic, and there was no accountability for what was said. We do not know what Facebook said, and we do not know what the staffers asked. We have no way of rebutting or correcting what was told to Congress on this matter. Given that we, as victims of Facebook's privacy violations, were entirely excluded from the accountability process, this "staff briefing" is the same as Congress doing nothing, while still appearing to do something.

What does it take for Facebook to fix the SicGRL Vulnerability?

In our original vulnerability report to Facebook in May 2018, we gave Facebook multiple options for addressing the privacy bug. We can no longer afford to leave that flexibility on the table. Instead, support groups need one thing: name privacy, or something that provides an equally high level of privacy, in all support groups.

Basically, right now if I am in a group that is only for people with heart disease, I am listed as:

“Fred Trotter, Houston”

We need that listing changed to just read:

“Fred”

All mentions of my profile, including the group membership listing and posts I make to the group, should not link to my full Facebook account. For the purposes of my interactions with other people on the clinical support group, I will just be “Fred” and they will not be able to find out more information about me.
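As a rough sketch of the behavior we are asking for, the logic could be as simple as the following. Every type, field, and flag here is hypothetical; this is not Facebook's code or API, just an illustration of how a group listing could withhold identifying details for health-related support groups.

```python
# A minimal sketch of the requested "name privacy" behavior.
# Member, is_support_group, and all field names are hypothetical.
from dataclasses import dataclass

@dataclass
class Member:
    first_name: str
    last_name: str
    location: str
    profile_url: str

def render_member_listing(member: Member, is_support_group: bool) -> dict:
    """Return only what other group members should be able to see."""
    if is_support_group:
        # Name privacy: first name only, no location, and no link
        # back to the member's full Facebook account.
        return {"display_name": member.first_name, "profile_link": None}
    # Ordinary groups keep today's behavior.
    return {
        "display_name": f"{member.first_name} {member.last_name}, {member.location}",
        "profile_link": member.profile_url,
    }

fred = Member("Fred", "Trotter", "Houston", "https://example.com/fred")
print(render_member_listing(fred, is_support_group=False))  # "Fred Trotter, Houston"
print(render_member_listing(fred, is_support_group=True))   # just "Fred"
```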

We should note that this is similar to Facebook's new "dating" features, which let you create a portion of your profile that is exposed to people you do not know as they consider whether to ask you out on a date.

Facebook’s current position amounts to: “if you want to meet one person for coffee, then we will facilitate privacy, but if you want to share your health details and location with the world, then we do not support this privacy feature.” 

Implementing name privacy would be an extreme redesign, but it would not prevent support groups from functioning properly. There are other, more subtle privacy designs that would also achieve these goals, but Facebook has demonstrated that it is not capable of designing these types of systems without supervision. We will compare any technical solution that Facebook proposes to the full name privacy feature that we are requesting here, and we will not accept any solution that is not at least as good at protecting privacy.

We are asking that Facebook make all of their “support” + “private” groups have name privacy until they can create a consortium of privacy experts and patient advocates to design a product that has more subtle privacy constraints, but still protects health-related support groups. 

Didn’t Facebook already fix SicGRL?

The answer is that Facebook has only implemented a partial fix. So far there have been three changes to the design of Facebook's clinical support products. Two of these changes have been improvements. One of them has made things decidedly worse.

1. It is no longer possible to download the membership list of “closed” Facebook groups without being a member.

Previously, anyone could download all of this data in one massive scraping project. Now, any attacker can still scrape this data, but they have to be a member of a given group first. This is an improvement and prevents full automation of scraping. 

Instead, it incentivizes the creation of "suck" puppets: fake user accounts that exist for no other reason than to slurp up the data of other users. Support group communities have seen an uptick in strange users attempting to join health-related support groups since public access was reduced. This is why our proposed solution specifically seeks to make scraping useless, rather than continuing a cat-and-mouse game that Facebook cannot win, and for which Facebook users pay the price of losing.

2. It is no longer possible to “Force Add” other users to clinical support groups.

The central problem with the Facebook apps that Cambridge Analytica abused was that User A could make privacy decisions for all of their friends at once by signing up for an app. The notion that your friends can make significant privacy decisions for you, without having any idea that they are doing so, is clearly deceptive and incompatible with any notion of informed consent. That this was done in a healthcare context should set off massive alarm bells.

Similarly, it was previously possible for one Facebook user to add other users to a group without their consent. So if you thought that your high school buddy really needed to be a member of an alcoholism support group, you could just "force add" them to that group and they became a member, even if they had previously refused to join.

Facebook fixed this problem too, and it also removed from groups all users who had never interacted with the group in any way. This is an improvement.

However, there are still likely millions of users who do not realize that they were force-added to clinical support groups, and who did not realize that the messages they "liked" or otherwise interacted with came from inside a clinical support group that they were forced into.

All members who were force-added to clinical support groups need to be removed from the clinical groups and given the right to re-join if they proactively consent.

3. Facebook changed groups to “private.” This did not fix the problem.

Recently, Facebook announced that it will rename "closed" groups to "private visible" and "secret" groups to "private hidden." So far, we have not actually seen this change on the platform itself. Not only does this fail to fix the privacy problems with clinical support groups, it makes them worse.

Currently, everything you do on Facebook is done under your real legal name because of Facebook’s controversial and broken real-name policy.

Previously, Facebook used the term “closed” to give the impression that information was protected in health support groups. This was deceptive language and gave a massively false impression. But this terminology gave Facebook some deniability. Now the same group settings that do nothing to protect privacy are simply called “Private”.

Facebook has knowingly deceived its users about the privacy settings on health support groups. In fact, Facebook recently took the position in court that nothing is "private" on Facebook, a position the judge rejected this week, saying: "Facebook's view is so wrong."

And while Facebook continues to claim that these groups are private, any user can exploit SicGRL from within a group. This means that if your group is breached, your information as a member of a health support group links back to your real name and location.

What do we do next?

We are a small community of health support group members who are deeply involved in figuring out how to respond to the problems with Facebook's privacy designs. We do not yet have a consensus on how to respond to the fact that the FTC has declined to enforce its PHR rule while forgiving Facebook for all of the times it violated that rule. Nor do we have a consensus on how to move forward as Facebook sweeps repeated privacy abuses and data breaches under the rug, while aggressively marketing its platform to support groups as a safe place for vulnerable users to gather for health information and community support.

At the present time, it is enough to explicitly state the following: 

  • The SicGRL vulnerability is not fixed and is still causing health data to leak from health-related support groups. This continues to result in casualties for Facebook users at an unknowable rate.
  • The privacy settings that are abused under SicGRL represent deceptive trade practices by Facebook that the FTC is responsible for regulating but has failed to correct.
  • The SicGRL data breach is covered under the PHR breach notification rule, which the FTC is responsible for enforcing but about which it has so far said nothing publicly.
  • SicGRL is a violation of the terms of the 2012 FTC consent order that Facebook agreed to follow. 
  • The FTC has granted immunity to both the Facebook corporation and its executives for violations of the PHR breach notification rule prior to June 12, 2019. They have done this without addressing any of the SicGRL or other problems that support group communities have faced on the Facebook platform.
  • According to the standards for fines associated with PHR breach notification violations, Facebook has already accrued hundreds of millions of dollars in new potential fines since June 12, 2019.
  • The ongoing SicGRL breach is also a violation of the second 2019 consent decree, and has been since the moment it was signed.
  • Despite all of this, Facebook has failed to fix the problem, and both the FTC and Congress have failed to shed light on the SicGRL breach and related issues. 

If you are interested in keeping in touch as privacy issues with Facebook evolve, please sign up for our email notice here.
