At the time of writing, there are probably around 500,000 medical "apps" (i.e., software tools) on the 'market' (in quotes, since some may be free, albeit nothing is really free). Some are designed to be used by patients, others by clinicians. This article assesses the regulatory environment and the evidence on their safety with respect to privacy and data integrity. While current regulations seem sufficient to ensure data integrity at the healthcare-provider level, the weak link appears to be third-party apps that allow patients to interact with provider data.
There are many usage categories for medical apps.
Diagnosis/advice
These can target patients and/or clinicians, and range from the mundane ("do I have a cold?") to the serious (heart attack and stroke, for example).
Nudging
Mostly for patients: nudge people to exercise, stick to a diet, and so on.
Monitoring vital signs
Monitoring treatment compliance
General patient relationship management (e.g., MyChart).
Medical records storage, maintenance, and transcription.
As with healthcare and technology in general, the regulatory environment encompasses multiple organizations across multiple jurisdictions, some with legal enforcement authority and some without.
https://researchdirect.westernsydney.edu.au/islandora/object/uws:69000/datastream/PDF/view
WHO places these apps in the following categories:
mHealth: Monitoring using mobile devices, e.g., for diabetes or heart issues such as afib. These generally do not require regulatory approval.
Software as Medical Device: A broad category applying to more sophisticated devices not exclusively mobile. Many health agencies are involved in regulating these devices and, in particular, the FDA requires manufacturers to provide risk assessments.
Digital Therapeutics: Requires a doctor’s prescription and must be cleared by the FDA.
As discussed above, most healthcare apps run on either Apple or Android devices. This concentration has two obvious opposing effects on app security. On the one hand, a single security flaw in the OS can expose far more apps than it would in a more diverse OS ecosystem. This concentration also increases hackers' motivation to find flaws in the first place. And since one's healthcare history is fixed (in contrast to, say, credit card info, which can be made obsolete simply by cancelling the card), it is quite lucrative to hackers.
On the other hand, Google and Apple have access to a large number of the world's best programmers, and so one would expect their OSes to have the fewest flaws. In particular, Android programmers have gifted us a "privacy sandbox":
https://developer.android.com/design-for-safety/privacy-sandbox/introduction.
However, even if based on a secure OS, the apps are written by third party developers, and therein lies the potential problem. As reported here,
https://www.theverge.com/2021/10/18/22732615/health-record-app-hacks-patinet-data
the app security company Approov studied the security of apps built using
"Fast Healthcare Interoperability Resources" (FHIR)
https://www.healthit.gov/sites/default/files/2019-08/ONCFHIRFSWhatIsFHIR.pdf
which is essentially a set of HIPAA-compliant standards for storing and electronically sharing healthcare data. Approov found the systems were secure when restricted to the healthcare providers' databases, which are governed by HIPAA. However, the third-party apps that allowed these primary databases to communicate (e.g., with insurance companies, testing services, and patients) were often hackable, and Approov was able to access millions of medical records from over 25,000 providers. It is (to me) an oddity of regulation that the third-party apps are deemed minor, and so not required to be HIPAA compliant.
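To make concrete how a third-party app layered on an otherwise secure database can leak records, here is a toy sketch of one common class of API flaw: the server authenticates the caller but never checks that the record requested actually belongs to that caller. This is a hypothetical illustration in Python (the record IDs, field names, and functions are all invented for the example), not code from the Approov report.

```python
# Toy in-memory "provider database": record ID -> record.
# All names and data here are invented for illustration.
RECORDS = {
    "patient-1": {"owner": "patient-1", "note": "afib history"},
    "patient-2": {"owner": "patient-2", "note": "type 2 diabetes"},
}

def get_record_vulnerable(caller_id: str, requested_id: str) -> dict:
    """Flawed handler: trusts the ID supplied in the request, so any
    authenticated user can read any patient's record."""
    return RECORDS[requested_id]

def get_record_safe(caller_id: str, requested_id: str) -> dict:
    """Fixed handler: verifies the record belongs to the caller
    before returning it."""
    record = RECORDS[requested_id]
    if record["owner"] != caller_id:
        raise PermissionError("caller is not authorized for this record")
    return record

# patient-1 requesting patient-2's record:
leaked = get_record_vulnerable("patient-1", "patient-2")   # leaks the record
try:
    get_record_safe("patient-1", "patient-2")              # correctly refused
except PermissionError:
    pass
```

The point of the sketch is that the flaw lives entirely in the intermediary's request handling; the underlying database can be perfectly HIPAA-compliant and still be emptied out through an app that skips the ownership check.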
In another study, Approov found that all 30 of the mobile health apps they tested had some security flaw.
https://medtech.citeline.com/MT145768/Mobile-Health-Apps-Are-Falling-Behind-In-Cybersecurity-Report-Finds
Of course, there are potential conflicts of interest in all studies of data privacy, cybersecurity, and data accuracy. Cybersecurity and database companies, who are in the best position to report on such things, have an obvious bias toward findings that drum up business.
In addition to outright theft of patient data, many Americans apparently do not realize that some of their data can legally be sold by the app company.
https://www.beckershospitalreview.com/cybersecurity/81-of-americans-unaware-digital-health-apps-can-sell-personal-data.html
Logically, once venture-capital largesse runs out, app developers must either charge for the app or sell the data; but of course, selling the data is usually part of the business model from the get-go. Notably, about half of Americans (and a higher percentage among those under age 40) do not mind that their health data is sold. Clearly, though, a better job needs to be done on disclosure so that dissenters can opt out.