Where the American Data Privacy and Protection Act Stands Today, and Why It Lacks Significant Relevance to Medicine Right Now
The American Data Privacy and Protection Act, H.R. 8152, has passed out of the House Energy and Commerce Committee. What does that mean for you? This post covers the things we know for sure, and please note that it is written specifically for medical industry stakeholders. It looks at risk holistically: how the bill impacts users, consumers, and patients across data/apps, devices, and networks.
First, we want to recognize the hard work of lawmakers and staff, along with the difficulties a national privacy law presents.
For now, the federal privacy bill appears stalled over questions including whether its duty of loyalty is as strong as it could be (whereby data is to be used in ways that serve the consumer's best interests, not in ways that are purely self-serving to the corporate bottom line).
At the same time, Kelvin Low, Professor at the Faculty of Law, National University of Singapore, recently commented on LinkedIn, about a separate regulatory matter, that the devil is in the details and that "regulatory pushes can be hijacked by industry lobbies." This perspective is critical because it is the enforcement of regulations flowing from passed laws that really makes the difference in changing the behavior of companies and industries. As I wrote in this post, regulatory inflection is a long road full of uncertainty and misalignment, with undesirable side effects. Paradoxically, I have seen that when a company does live and breathe compliance, it can earn incredible trust and license to operate.
One comparator is HIPAA, which is not the main subject of this article; it has had some impact in healthcare, though with identified gaps in patient and clinician safety.
My perspective and the bias I bring to this post: I absolutely love appropriate uses of technology that are useful, helpful, and safe, and that don't create unnecessary training, explanation, or extra work for physicians. Some of my tech friends have said that I'm critical of technology. I told them I'm discerning. I love tech, for the right things. What I don't like in medical technology are irrelevant, reckless, useless (no improved clinical outcome), and thoughtless applications where the technology doesn't meet the needs of the use case.
Illustrated here are the basics of how a bill gets passed; further below are more details for those who want them, along with how this contrasts with, for example, medical device regulation, a hurdle that must be cleared before medical products can be marketed at all.
What we know for sure:
- Thousands of incredibly talented public affairs, lobbying, and PR professionals at several individual companies have already started flooding the market, media, and op-eds with content. While most doctors are immune to propaganda, this content is designed to influence, and will have a yet-to-be-determined impact on, general public and investor sentiment.
- Entire ecosystems and their fans and followers have parroted, and will increasingly parrot, talking points with context and, more often, without it. Congratulations in advance on the successful messaging/parroting that will take place.
- There is, for practical purposes, no impact on medical technology, since the bill primarily concerns software for consumers. We anticipate, as inferred from the excerpt at the bottom of this post, that software for medical use will continue to be assigned risk-informed classes and regulated accordingly.
What it all means: landmark activity has taken place, but there is still a long road to meaningful change, even for consumer (not medical) privacy, as explained below.
In medicine, it's important to be clear that patients and clinicians are impacted by loss of privacy across connectivity, devices (see chapter 11 of this book), data, and more. Most technical, engineering, product, and even privacy stakeholders sitting in companies come from a siloed perspective, which leaves a gap in priority for privacy risk in the other main areas of the technical stack that impact consumers and patients.
This study in Nature Communications found that anonymized data can be re-identified 81% of the time: https://lnkd.in/gajkHFzd
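To see why "anonymized" does not mean unidentifiable, here is a toy sketch with entirely hypothetical data: even after names are removed, a handful of quasi-identifiers such as ZIP code, birth year, and gender often combine into a fingerprint that matches exactly one person.

```python
from collections import Counter

# Hypothetical "anonymized" records: names removed, but quasi-identifiers
# (ZIP code, birth year, gender) retained for analytics.
records = [
    ("94301", 1984, "F"),
    ("94301", 1984, "M"),
    ("94301", 1990, "F"),
    ("94025", 1984, "F"),
    ("94025", 1975, "M"),
    ("94025", 1975, "M"),
]

def uniqueness_rate(rows):
    """Fraction of records whose quasi-identifier combination is unique,
    i.e. re-identifiable by anyone who already knows those attributes."""
    counts = Counter(rows)
    unique = sum(1 for r in rows if counts[r] == 1)
    return unique / len(rows)

rate = uniqueness_rate(records)
print(f"{rate:.0%} of records are uniquely identifiable")  # prints "67%..." for this toy data
```

Real datasets carry far more attributes than these three, which is why studies like the one above report such high re-identification rates.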
Further, most app developers fail to recognize that our devices can be tracked via Bluetooth and other means: https://lnkd.in/gvgi4N28
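As an illustration of why broadcast identifiers matter, here is a hypothetical sketch (fabricated data, not a real capture): if a device advertises a stable address, anyone logging sightings at a few receivers can reconstruct a movement trail. Modern operating systems mitigate this with address randomization, but many peripherals still broadcast stable identifiers.

```python
from collections import defaultdict

# Hypothetical log of Bluetooth advertisement sightings:
# (receiver_location, broadcast_address). A device broadcasting a fixed
# address can be linked across every location where it was heard.
sightings = [
    ("pharmacy", "AA:BB:CC:11:22:33"),
    ("clinic",   "AA:BB:CC:11:22:33"),
    ("bus_stop", "DD:EE:FF:44:55:66"),
    ("clinic",   "DD:EE:FF:44:55:66"),
    ("pharmacy", "DD:EE:FF:44:55:66"),
]

def movement_profiles(log):
    """Group sighting locations by broadcast address: one location trail per device."""
    trails = defaultdict(list)
    for location, address in log:
        trails[address].append(location)
    return dict(trails)

profiles = movement_profiles(sightings)
# Each entry is now a per-device trail, e.g. the first device was seen
# at the pharmacy and then at the clinic.
```

The point is not the code but the asymmetry: the tracked person does nothing except carry a device, while the tracker needs only passive receivers.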
Just as a note: Medigram, Inc. is not interested in selling any of your data in any form, and our perspective therefore prioritizes patient and clinician safety and experience.
As a second side note, what we found when writing the best-selling medical technology and medical informatics book Mobile Medicine is alarming: few companies (we couldn't find ANY) had medical-grade, or even enterprise-grade, risk management comprehension for mobile devices. So we created a resource for you ourselves, with one of the top health system CISOs in the industry, who managed their own mobile device deployment and implementation: Ch. 11, "Risk Considerations for Mobile Device Implementations." It guides you in building your risk management infrastructure for mobile devices. Find it here: https://lnkd.in/gYBh_6d
There are real potential harms right now to physicians and patients seeking legitimate healthcare, which is also not the focus of this post. If that is your concern, here are some resources organized into playlists, such as this one for healthcare providers and seekers from the Electronic Frontier Foundation: https://lnkd.in/g8QJyTxy
One key point that medical stakeholders will care about is that “data sales to government agencies have a carveout in the latest draft of the American Data Privacy and Protection Act, H.R. 8152, a bill that would make it more difficult to collect and sell sensitive data, which includes location data.”
In tech, nothing stops anyone from marketing harmful products that lack privacy protections for consumers. In my view, the new House bill doesn't change that, for practical purposes, for the foreseeable future.
What do I mean by practical? I am looking at the gulf between expectations in medicine and what could be characterized as nonexistent protection in the consumer space.
If the App Isn’t Classified as a Medical Device, It Can Be Marketed Freely
The difference between the markets is that you can't market medical products without regulatory approval; herein lies another gulf with consumer technology. Who cares about a privacy law if it doesn't prevent useless apps from being marketed and isn't enforced in a meaningful way? "Meaningful" is highly subjective; for example, a company may consider a $5B fine the cost of doing business rather than bothering with compliance.
Now that the House E&C Committee has sent the American Data Privacy and Protection Act to the full House floor, it will face a tougher vote there, after several committee members communicated that they still weren't ready to send the bill to the Senate.
Two members who voted against sending the ADPPA in its current form to the full House were U.S. Reps. Anna Eshoo and Nanette Barragán, both from California, who were concerned that the federal bill as written could weaken privacy laws in California and other states.
Bluntly, the "two sides" (I'm sure there is really a spectrum of grey) are: the App Association, which appears motivated to minimize regulation and make doing business as an app developer as easy as possible; and consumer watchdogs, illustrated by Consumer Reports and organizations such as the ACLU and the Electronic Frontier Foundation (EFF), advocating for consumer protection from harms.
Important note: “State attorneys general oppose preemption in proposed American Data Privacy and Protection Act” (via @DailyDashboard) ow.ly/i4CX50K0ujU.
The rubber meets the road at enforcement, which happens only after the bill advances through the Senate and is signed into law. Will enforcement rest with the FTC alone? What about FCC authority? We don't know whether the FCC would lose enforcement authority under the ADPPA, so whether the ADPPA meaningfully increases or decreases the overall risk profile for patients across devices, connectivity, and apps is uncertain. We will have to see whether the ADPPA blocks the Federal Communications Commission (FCC) from enforcing this federal privacy law.
This post does not focus on software as a medical device, though here is what to know about SaMD in a nutshell:
“In the EU, under the recently revised regulations pertaining to medical devices, software can be considered a medical device if it is “active.” That is, if the device depends on a source of energy other than that generated by the human body or by gravity, and acts by changing the density of or converting that energy. Additionally, a classification system has been developed, and depending on a variety of parameters, rules categorize software into several classes (I, Im, IIa, IIb and III; Im is a sub-class of Class I for which the involvement of the notified body is limited to aspects relating to the conformity of the devices with the metrological requirements).
In the US, FDA’s perspective on Software as a Medical Device (SaMD) defines it as “software intended to be used for one or more medical purposes that perform these purposes without being part of a hardware medical device.” Also, software does not meet the definition of SaMD if its intended purpose is to drive a hardware medical device. Expect the definitions and classifications that do or do not consider software a medical device to change as new medical devices requiring software are developed. As noted by Walker, questions of whether software can be used by itself or with other medical devices, who the end users are, what type of information is provided as output, and how the information is used will continue to be variables needing review.” (Fillmore, 2019)
Fillmore, Randolph. “Is Your Software a Medical Device?” Regulatory Focus, March 8, 2019.
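The risk-informed class assignment described above can be pictured as a decision table keyed on worst-case patient impact. The sketch below is a simplified, non-authoritative encoding loosely inspired by the EU MDR's classification logic; the category names and mapping are illustrative assumptions, not regulatory guidance.

```python
# Simplified, NON-AUTHORITATIVE sketch of risk-informed software
# classification, loosely inspired by EU MDR classification rules.
# The real rules have far more nuance; consult the regulation itself.

def classify_decision_support(worst_case_impact: str) -> str:
    """Map the worst-case impact of a wrong software output to an
    indicative device class (higher class = stricter oversight)."""
    impact_to_class = {
        "death_or_irreversible_deterioration": "III",
        "serious_deterioration_or_surgery": "IIb",
        "other_clinical_decision_impact": "IIa",
    }
    # Anything outside these illustrative categories falls to Class I here.
    return impact_to_class.get(worst_case_impact, "I")
```

The design point is that classification follows consequences of failure, not features, which is exactly the gulf between medical device regulation and the consumer app market discussed above.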
In summary, I must express gratitude to my longtime prior employer Johnson & Johnson, as well as the Junior League of Palo Alto•Mid Peninsula, for providing two master classes, from different perspectives, on legislation, regulation, and the criticality of trust when I was a young professional in my 20s. Today I must thank the Santa Clara University Markkula Center for Applied Ethics (the largest applied ethics center in the country), Internet ethicist Irina Raicu and their ecosystem, and my illustrious book coauthors, including the gifted former regulator Lucia Savage and digital health privacy big gun Peter McLaughlin, along with many other colleagues and coauthors including Medigram CTO/CSO Eric Svetcov and the IEEE/UL JV for Trust, Identity, Privacy, Protection, Safety, & Security (TIPPSS) of Clinical IoT, for the ongoing debate and learning. This blog represents the views of myself and the Medigram organization, not of any other collaborators mentioned.
In closing, I expect this space to evolve over decades. Eventually there will be alignment and convergence between the wellness and consumer-grade software that consumers use and medicine; but in my view, we are at least a decade away from that, and the ADPPA doesn't change it right now.
By Sherri Douville, CEO at Medigram, the Mobile Medicine company. Recognized in 8 categories of top CEOs by Board Room Media (across SMS, mHealth, iOS, IT, Database, Big Data, Android, Healthcare). Top-ranked medical market executive worldwide, #1 ranked in mobile technology categories (mHealth, iOS, Android), and #1–2 (on any given day) for the cybersecurity market in the U.S. on Crunchbase. Best-selling editor/author of Mobile Medicine: Overcoming People, Culture, and Governance and Advanced Health Technology: Managing Risk While Tackling Barriers to Rapid Acceleration (Taylor & Francis); Series Editor for Trustworthy Technology & Innovation and Trustworthy Technology & Innovation in Healthcare (contracted to advise top academic and professional education publisher Routledge, Taylor & Francis).
Sherri is the co-chair of the IEEE/UL JV for the technical trust standard SG project for Clinical IoT in medicine, P2933. She is passionate about redefining technology, software, and data for medicine and advanced health technologies in a way that's worthy of the trust of clinicians, our families, and friends. Ms. Douville leverages her books to inform her work on the CHIME CDH security specialization certification. She also co-founded, and advises on, the cybersecurity curriculum for the Black Corporate Board Readiness and Women’s Corporate Board Readiness programs at Santa Clara University.