
Trust gap: Health apps and data sharing

April 29, 2019, by Susannah Fox

There has been a steady drip-drip-drip of articles documenting how health apps are sharing data with third parties:

  1. Data sharing practices of medicines related apps and the mobile ecosystem: traffic, content, and network analysis, by Grundy et al. (British Medical Journal, Feb. 25, 2019)
  2. You Give Apps Sensitive Personal Information. Then They Tell Facebook. Wall Street Journal testing reveals how the social-media giant collects a wide range of private data from developers; ‘This is a big mess’, by Sam Schechner and Mark Secada (Wall Street Journal, Feb. 22, 2019)
  3. Is your pregnancy app sharing your intimate data with your boss? As apps to help moms monitor their health proliferate, employers and insurers pay to keep tabs on the vast and valuable data, by Drew Harwell (Washington Post, April 10, 2019)
  4. Assessment of the Data Sharing and Privacy Practices of Smartphone Apps for Depression and Smoking Cessation, by Huckvale, Torous, and Larsen (JAMA Network Open, 2019)

This post is my chance to share some relevant data, add my perspective, and ask for your input.

First, the data from a 2018 Hopelab/Well Being Trust study I helped write:

  • 71% of female teens and young adults say they have tried mobile apps related to health, compared with 57% of males. Three in ten (30%) females say they currently use a health app, compared with two in ten (20%) males.
  • Fully 48% of females ages 18 to 22 and 25% of teen girls say they have used a period-tracking app, compared with 2% of males.
  • 16% of females use a meditation app, compared with 5% of males.

From a 2015 Pew Research Center study:

  • Eight in ten smartphone owners said they have downloaded apps, and 60% of app downloaders surveyed said they have “chosen not to install an app when they discovered how much personal information it required in order to use it, while 43% had uninstalled an app after downloading it for the same reason.”
  • People appear to use popularity as a proxy for trustworthiness: 57% of app downloaders say it is important to know how many times an app has been downloaded when choosing which app to use.

This is a fraction of the data available about people’s use of apps. Bottom line: This is a big and growing market.

From my perspective, here are the two sentences at the crux of the JAMA Network Open article:

Our data highlight that, without sustained and technical efforts to audit actual data transmissions, relying solely on either self-certification or policy audit may fail to detect important privacy risks. The emergence of a services landscape in which a small number of commercial entities broker data for large numbers of health apps underlines both the dynamic nature of app privacy issues and the need for continuing technical surveillance for novel privacy risks if users and health care professionals are to be offered timely and reliable guidance.

What does this mean in practice?
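To make that concrete, here is a minimal sketch of one way to “audit actual data transmissions”: route a test phone’s traffic through an intercepting proxy such as mitmproxy and flag any request an app sends to a watchlist of third-party analytics hosts. This is my own illustration, not the JAMA authors’ tooling; the host list and keywords are hypothetical placeholders.

```python
# audit_flows.py -- illustrative sketch of a data-transmission audit,
# not the JAMA authors' actual methodology.
# Run with: mitmdump -s audit_flows.py
# (with a test phone's traffic proxied through this machine).
from mitmproxy import http

# Hypothetical watchlist of third-party analytics/advertising hosts.
THIRD_PARTY_HOSTS = {
    "graph.facebook.com",
    "app-measurement.com",
    "api.mixpanel.com",
}

# Illustrative keywords that might signal health data in a request body.
SENSITIVE_KEYWORDS = (b"mood", b"medication", b"cycle", b"smoking")

def request(flow: http.HTTPFlow) -> None:
    """Flag any request the app under test sends to a watched host."""
    host = flow.request.pretty_host
    if host not in THIRD_PARTY_HOSTS:
        return
    body = (flow.request.get_content(strict=False) or b"").lower()
    hits = [kw.decode() for kw in SENSITIVE_KEYWORDS if kw in body]
    note = f" | possible health terms: {hits}" if hits else ""
    print(f"[audit] {flow.request.method} {host}{flow.request.path}{note}")
```

Exercise the app on the test device and watch what leaves it. Doing that at scale, and repeating it as apps update, is essentially what the researchers above did. Which raises the question: who is responsible for this kind of ongoing auditing?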

In England, the NHS features an apps library on its front page, and NHS England takes responsibility for evaluating the apps featured there (although it appears to rely heavily on self-certification). The existence of the library is emblematic of how the UK approaches health and health care: leaning into digital health a bit more paternalistically than we in the U.S. have done. I was the CTO at HHS when the NHS was launching its library and warned them to be cautious (which of course they already knew they needed to be), and we had some spirited discussions about how to approach the creation of their recommendations list. I’m happy to see the library featured so prominently, since I suspect that means it is a popular feature.

HHS, by contrast, limits itself to recommending fact sheets about prevention and wellness. Nothing dynamic or personalized: just the basics of immunization schedules, physical activity guidelines, etc. Very much the “better safe than sorry” approach to digital health. HHS’s main regulatory arm, the Food and Drug Administration (FDA), has chosen to focus its oversight of digital health on medical apps, not wellness apps, but it has also created a cybersecurity oversight structure that provides guidance for developers. HHS’s Office for Civil Rights (OCR) maintains fact sheets about which entities are covered by HIPAA. (OCR also created a developer portal, but I’m not linking to it since a warning keeps popping up in my browser that it’s an insecure site. Oops.) Meanwhile, the Federal Trade Commission (FTC) has created a handy checklist for app developers who want to stay on the right side of all the laws that could apply to them.

The basic regulatory structures for consumer protection are being locked into place, but they are still just scaffolding.

Here are my other takeaways:

  • We know people are hungry for guidance and are eagerly using health apps, with varying degrees of success and satisfaction.
  • We also know people are not fully aware of the data sharing that health apps are engaging in.
  • Public shaming by reporters and researchers is currently the main check on companies’ use of people’s personal data.
  • There is a big trust gap that could be filled by government agencies or by companies and organizations willing to do the work of continually testing and auditing health apps’ effectiveness and security practices.

Now: Your turn.

What are you seeing in the health apps marketplace? Which apps do you trust? Which apps have you stopped using or deleted because of data sharing concerns? If you are in a leadership role, either in the government or at an organization that could hold sway, what are you doing to build toward a vibrant ecosystem? Or are you in a protective crouch? (Note: You can comment with a pseudonym.)
