Why Healthcare Chief Information Security Officers Advise Deleting Your Health App: Technical & Legal Drivers

Sherri Douville
3 min read · Jul 1, 2022

Is your data private? The answer is sometimes. What about private, regulated data? Again, the answer is sometimes. This post will cover some basic technical and legal themes that inform this position.

The Technical Bottom Line: Data Re-identification is exactly how it sounds

Industry insiders have known for years that re-identification of de-identified data is a real risk, as described in this 2012 paper. For the layperson, this 2019 Becker’s summary gives the gist: “AI can re-identify de-identified health data, study finds” https://www.beckershospitalreview.com/healthcare-information-technology/ai-can-re-identify-de-identified-health-data-study-finds.html
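To make the risk concrete, here is a minimal sketch of a classic linkage attack, one common way de-identified data gets re-identified. All names, records, and datasets below are fabricated for illustration; the point is only that quasi-identifiers left in a “de-identified” record (ZIP code, birth year, sex) can be joined against a public dataset, such as a voter roll, to recover identities.

```python
# Hypothetical illustration of a linkage (re-identification) attack.
# All data here is invented; no real dataset or person is referenced.

# "De-identified" health records: names removed, but quasi-identifiers remain.
deidentified_records = [
    {"zip": "95050", "birth_year": 1984, "sex": "F", "diagnosis": "asthma"},
    {"zip": "95050", "birth_year": 1991, "sex": "M", "diagnosis": "diabetes"},
]

# A public dataset (e.g., a voter roll) containing the same quasi-identifiers
# alongside names.
public_roll = [
    {"name": "Alice Example", "zip": "95050", "birth_year": 1984, "sex": "F"},
    {"name": "Bob Example",   "zip": "95113", "birth_year": 1991, "sex": "M"},
]

def reidentify(records, roll):
    """Join the two datasets on quasi-identifiers.

    When exactly one person in the public roll matches a record's
    quasi-identifiers, that person's name is linked back to the
    supposedly anonymous diagnosis.
    """
    hits = []
    for rec in records:
        matches = [
            p for p in roll
            if (p["zip"], p["birth_year"], p["sex"])
            == (rec["zip"], rec["birth_year"], rec["sex"])
        ]
        if len(matches) == 1:  # a unique match means re-identification
            hits.append({"name": matches[0]["name"],
                         "diagnosis": rec["diagnosis"]})
    return hits

print(reidentify(deidentified_records, public_roll))
# The first record matches "Alice Example" uniquely and is re-identified;
# the second has no matching ZIP in this roll, so it stays anonymous.
```

Real attacks, including the AI-driven ones in the studies above, use richer signals (step counts, location traces, prescription sequences), but the underlying principle is the same: removing direct identifiers does not remove identifiability.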

The Legal Bottom Line: What protection is there for the re-identification of data?

HIPAA allows for the processing and use of de-identified data. Each consumer is left to trust that the data will not be re-identified, or to make that call without enough information to know. Under California’s law, the CCPA, as long as the conditions outlined in section 2 of this article are met, the entity releasing the de-identified data achieves a “safe harbor,” which is just legalese for “you are now off the hook and can wash your hands of responsibility for the privacy of that data.” There are disclosures and other limitations, but you are then trusting any related tech or other company to fully understand and voluntarily comply with these practices, rather than simply deciding to pay any potential fines that likely don’t dent their P&L much. Privacy advocates point out that, from an applied use case standpoint, the impact of weak enforcement is much the same as having no regulations at all.

Does GDPR Fix this?

It appears that anonymized (de-identified) data falls outside the GDPR as well, similar to HIPAA. I’d love to hear from legal experts in Brussels about how this plays out with health information from your perspective.

My app maker is in Europe, don’t they have better privacy practices?

Vendors are, in theory, legally obligated to comply with the rules of each jurisdiction where their users reside. One thing that cracks me up is people citing “it’s so hard and so complex” as a reason to do nothing, or to skip compliance and just pay the fines. Proceed with that in mind.

The Bottom Line for You

If you are young, male, and perfectly healthy, you probably don’t need to worry about this. But if you buy insurance, are female, or have ever had a health problem, you want to pay attention to how data can be used for actuarial, credit, and employment decisions. User beware.

The Bottom Line For Us

At Medigram, Inc., we predicted that working with patient- and consumer-generated health data would become an increasingly messy tangle: conflicting incentives, questionable clinical value, an evolving, complex, and poorly understood legal landscape, and economic pressure to handle data in ways that would alienate the medical community (who, for the most part, take their oaths and ethics seriously), along with the potential to alienate even more stakeholders. That’s one big reason why we don’t work in patient-generated data and focus instead on enabling the care team.

By Sherri Douville, CEO at Medigram, the Mobile Medicine company. Recognized in 8 categories of top CEOs by Board Room Media (across SMS, mHealth, iOS, IT, Database, Big Data, Android, Healthcare). Top ranked medical market executive worldwide and #1 ranked in mobile technology categories (mHealth, iOS, Android), #1–2 (on any given day) for the cybersecurity market in the U.S. on Crunchbase. Best-selling editor/author, Mobile Medicine: Overcoming People, Culture, and Governance & Advanced Health Technology: Managing Risk While Tackling Barriers to Rapid Acceleration, Taylor & Francis; Series Editor for Trustworthy Technology & Innovation + Trustworthy Technology & Innovation in Healthcare (contracted to advise top academic and professional education publisher Routledge, Taylor & Francis).

Sherri is the co-chair of the IEEE/UL JV for the technical trust standard SG project for Clinical IoT in medicine, P2933. She is passionate about redefining technology, software and data for medicine and advanced health technologies in a way that’s worth the trust of clinicians, our family, and friends. Ms. Douville leverages her books to inform her work on the CHIMECDH security specialization certification. She also advises and co-founded the Cybersecurity curriculum for the Black Corporate Board Readiness and Women’s Corporate Board Readiness programs at Santa Clara University.