
Andrea Feedback: 

The detailed write-up we got jumped right into “no dark UX patterns.”  That is only one of a thousand ways that data shared in groups can be exploited by manipulating human vulnerabilities.  Instead of saying don’t do “bad practice XYZ” – we need to define this more generally.

When we are vulnerable online, we share things about ourselves that can be used against us.  Those things can also be used to manipulate or exploit us.

But underlying that – this theme needs to broadly say:  don’t use the things that make us vulnerable against us.  Protect vulnerabilities shared in groups, don’t exploit them or make voodoo dolls.  

Only then can we give examples like Dark UX patterns.  But the top needs to start with the simple theme, then explain how to find tech platforms that don’t do “bad practice XYZ”.

According to Wikipedia a “dark pattern” is “a user interface that has been carefully crafted to trick users into doing things, such as buying insurance with their purchase or signing up for recurring bills.”

The main website for exploring this concept is https://www.darkpatterns.org/

Basically, these are websites and computer programs that are intentionally confusing or deceptive, in a way designed to trick people.

It is hard to define precisely what a Dark Pattern is, but the worst cases are well-documented and clearly wrong. They have been used in the past to trick people into sharing clinical data about themselves.

Dark UX Patterns have no place in clinical information systems or anyplace that caters to patients.

While many Big Tech companies and many small websites have subjected patients to Dark Patterns, Facebook has used the following dark patterns in their software design against patients at a scale never before known. Here are a few of the manipulative things they did using their UX:

  • They have conflated “adding” users to groups and “inviting” users to groups. This terminology difference has been used to blur who needs to provide consent for a given user. No Facebook user ever decided to allow their friends to make important privacy decisions for them.
  • They avoided the word “privacy” in their terms and conditions, press releases, and keynote addresses for years. Recently they have begun saying it so much that it has become a platitude. But in their user interfaces they have used terms like “Closed”, which makes people think something private is happening, when in fact that is not the case.
  • Facebook has labeled groups as “support” groups in the UX and then failed to provide the specific features that any real support group would need. This is basically an “it’s safe to swim here” sign over the alligator pond.
  • Facebook allows a user to control which “public groups” are displayed on their profile. It is not possible to display participation in a “closed group” on a user’s public profile. This makes the user think “oh, people cannot see that I am in this closed group”. But this is not true. To see that a person is in a closed group, you just had to visit the membership page of that group. Any adversary could figure out which “closed groups” any Facebook user was a member of by checking every closed group. This is called a reverse lookup and it is a common method among informatics professionals. So the UX says “this is a secret” but the setting says “this is public, but only if you know enough to look hard enough”.
  • Sometimes a user would go to a Facebook group, see what it was (say, a forum exclusively for HIV patients), and then, after reading the privacy settings documentation, decide “this is not something I am comfortable doing”, click away, and refuse to join the group. But because they had started to fill in the form to join the group, the group admin could simply decide to add them to the group anyway. This means there was no way for the user to say “I definitely do not want to be in this Closed Group”.
  • Group administrators were forced to make patient support groups “closed” not because they wanted the security settings on that type of group, but because that was the most secure group type that is also findable in search. So if you actually wanted to provide support to patients (i.e. be found on Facebook), you had to choose piss-poor security settings for your forum in order to be included in search.
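The reverse-lookup mechanism described above is simple enough to sketch. This is a toy illustration only: the group names, user names, and the `closed_group_members` data structure are all hypothetical stand-ins for the public membership pages Facebook exposed at the time. The point is that hidden group *content* does nothing to hide group *membership* if the rosters can be enumerated.

```python
# Toy data standing in for the membership pages of "closed" groups.
# (All names here are hypothetical; in the real attack, an adversary
# would scrape each group's publicly visible member list.)
closed_group_members = {
    "HIV Support Forum": {"alice", "carol"},
    "Cancer Survivors": {"bob", "carol"},
    "Knitting Circle": {"alice", "dave"},
}

def reverse_lookup(target_user, groups):
    """Return every closed group the target user belongs to, found
    simply by scanning each group's membership roster in turn."""
    return sorted(
        name for name, members in groups.items() if target_user in members
    )

# The adversary never needs the target's consent or profile settings.
print(reverse_lookup("carol", closed_group_members))
# → ['Cancer Survivors', 'HIV Support Forum']
```

The design lesson is that a privacy setting is only meaningful if every path to the data respects it; here, the profile-display setting hid nothing that the membership page did not already reveal.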