E&T: Health apps frequently neglect privacy of users, study finds


Hilary Lamb at Engineering & Technology reports on a study that has found “Health apps frequently neglect privacy of users”.

mHealth Insights

“A European study has found that half of health apps could be sharing sensitive personal data via insecure connections, and the majority of these apps share health-related data with third-party companies. The study involved a collaboration of researchers from the University of Piraeus, Greece, and Rovira i Virgili University, Spain, who are working to develop improved solutions to protect European citizens’ online privacy. The researchers looked at 20 free apps available on Google Play, all of which had been downloaded between 100,000 and 10 million times and had a minimum rating of 3.5/5. They studied how the apps stored and monitored personal data, such as information about past health conditions. Of the apps analysed in the study, 80 per cent shared health-related data to third-party companies, with the other 20 per cent storing data on the users’ phones. This data included text as well as images, such as X-rays”

While I recognise we have a dire situation in which caring Healthcare Professionals are shaking in their shoes over the decision to share their mobile number with their Patients, I find it amazing that people are really surprised by this finding. I wonder how people who think this is the big issue of the day react to discovering that the biggest Cancer Charities in the world are using private investigators to research the families of people who have cancer, so that they can profile them for highly targeted donation strategies based on their personal wealth, etc.?

The reality is that most of these app users will have public Facebook profiles and likes that would probably tell you more about them than you’d learn by hacking the citizens’ mobiles or the websites they interact with. With some resources and considered use of keyword advertising you could probably engage them too (because, as we know, most Patients are googling their condition/diagnosis).

I also think the research could have had a lot more impact had the top-line findings included the names of the 20 free apps, the developers and their partners, whether there was evidence that they were being recommended by Medics and Patient Associations, how long Patients used these apps for (we know most apps just get downloaded and deleted soon after), etc.

I can easily point you to 20 free ‘health’ apps on Google Play that meet their criteria but are just nonsensical: they have 3.5* ratings and 100,000+ downloads only because they’ve been gaming the app store, they make money via scams, and their users aren’t Patients but citizens using them for fun/discovery. Here’s one I found within 5 seconds that meets all the researchers’ criteria, and there are thousands of cookie-cutter similar apps that promise, like this one does, to give a ‘Doctor Diagnosis’ based on a paint-by-numbers approach to collecting basic symptoms:

[Screenshot of the app’s Google Play listing]

There is probably also no way the researchers could verify whether these apps were being used by actual people/Patients at all, eg. click-farm operations can make lots of money from dumb mobile advertisers if they get the context right (it’s easy to imagine the dumb drug companies and the ‘charities’ they sponsor being very gullible about spending on these types of ads), etc.

“Only half of these apps shared this data securely, using https connections to manage user login. More than half of the apps transmitted data using URL links: this made the data potentially accessible to anybody who could gain access to those links. 20 per cent of the apps did not refer users to a privacy policy or failed to do so in the language of the app. Some of the health apps required access to camera and microphone, contacts list, external storage, Bluetooth and location, despite their functionality not being dependent on this access”
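The quoted finding about apps “transmitting data using URL links” typically means sensitive values placed in the URL’s query string, where they end up in server access logs, proxies, browser history and referrer headers, readable by anyone who obtains the link, even when the connection itself uses HTTPS. A minimal sketch of the leak (the endpoint and parameter names here are hypothetical, not taken from the study):

```python
from urllib.parse import urlencode, urlsplit, parse_qs

# Insecure pattern: sensitive health data encoded into the URL itself.
params = {"user": "alice", "condition": "type-2 diabetes"}
insecure_url = "https://api.example-health-app.com/sync?" + urlencode(params)

# The "accessible to anybody who could gain access to those links" problem:
# the data can be read straight back out of the URL string, which is exactly
# what gets written to logs and shared when links are forwarded.
leaked = parse_qs(urlsplit(insecure_url).query)
print(leaked["condition"][0])  # prints: type-2 diabetes
```

The safer pattern the study contrasts this with is sending the same values in the body of an HTTPS POST request behind an authenticated login, since the request body is encrypted in transit and is not normally written to access logs or browser history.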

I think it’s interesting that only half of the apps shared personal Patient data securely yet 80% referred users to a privacy policy. Until the app stores start enforcing standards, privacy policies are meaningless: no one is reading them and, for the most part, they just provide a smokescreen of false reassurance for Patients.

I’m a huge fan of Patient Champions and think it would be interesting to see what level of endorsement the apps reviewed by the researchers were getting from Patients (I don’t think that a +3.5 star app store rating is a proxy for this), Medical Professionals and healthcare provider organisations.

Related: NIH funded researchers spend $270 on misleadingly labelled iPhone apps & conclude that apps offered for ‘EDUCATIONAL USE ONLY’ can’t be relied on for ‘PROPER MEDICAL ADVICE’

“According to the study, the majority of the apps did not meet legal requirements or standards intended to protect users from inappropriate data use and disclosure to third parties. “We strongly support the use of mobile health apps, but users must know that apps’ popularity does not ensure privacy and security,” said Professor Agusti Solanas of Rovira i Virgili’s department of computer engineering and mathematics”

I think it’s clear this is like other minimally regulated markets (eg. the multi-billion-dollar supplement industry), and a better piece of advice would be for Patients to download and use apps that have been recommended to them by their Healthcare Professionals (who have undertaken quality mHealth training from an accredited training body).

“The issue of health data being shared insecurely has been a concern for years. It has been reported that UK doctors frequently use their phones to share personal health data with their colleagues, including sending text and pictures via SMS to request their professional opinion. In 2015, the NHS was forced to remove health apps from its library of accredited apps after they were found to be leaking patients’ medical details online”

Of course health data shouldn’t be shared insecurely, but there are also issues with how health data that is being shared securely is used and passed on, which this report doesn’t seem to touch on, eg. Patients in the UK’s NHS gave their health data to medics and it was then given free of charge to an advertising company in the USA that we know is making billions in profit from doing things like selling adverts to referral agents masquerading as free helplines for addicts, etc.


And this is all before we start thinking about what smart connected homes are chattering away about…

*** UPDATE 8 Feb 2018: LINK TO THE JOURNAL PAPER ***

[Screenshot of the journal paper]

About 3G Doctor

The Corporate Blog of 3G Doctor