NIH funded researchers spend $270 on misleadingly labelled iPhone apps & conclude that apps offered for ‘EDUCATIONAL USE ONLY’ can’t be relied on for ‘PROPER MEDICAL ADVICE’

JAMA Dermatology: Diagnostic Inaccuracy of Smartphone Applications for Melanoma Detection

Over the next few days I expect journalists from all over the world will pick up on this paper published in JAMA Dermatology and get confused about the conclusions it draws. It’s already started at the Daily Mail:

Daily Mail: “Smartphone apps that diagnose skin cancer give misleading results and could delay life-saving treatment”

NPR: “Skin Doctors Question Accuracy Of Apps For Cancer Risk”

Ars Technica: “Siri, does this look malignant?”

Mashable: “Can a Smartphone App really detect skin cancer?”

Wall Street Journal: “Apps Aim to Detect Skin Cancer”

My Thoughts:

I think it’s a great topic for researchers, but on reading the paper it’s quite obvious that several significant gaps undermine its value considerably:

1) None of the 4 apps that are reviewed are named:

Whilst the authors explain that this decision was taken “Because the purpose of our study was to determine the accuracy of such applications in general and not to make a direct statement about a particular application”, I fail to see how anyone is supposed to draw general conclusions from such a small sample. To do that, they’d have been better off testing 50 apps with 4 images each, rather than testing 4 apps with 50 images each.
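To make that sampling point concrete, here’s a toy simulation (the app population and its accuracy figures are entirely invented for illustration, not taken from the paper). If what you want to estimate is how accurate such apps are *in general*, the dominant source of uncertainty is the variation between apps, so spreading a fixed image budget over more apps gives a far more stable estimate than testing a few apps very thoroughly:

```python
import random
import statistics

random.seed(0)

# Hypothetical population of 1,000 melanoma-detection apps whose true
# sensitivities vary widely (values invented purely for illustration).
population = [random.uniform(0.1, 0.9) for _ in range(1000)]

def estimate_mean_sensitivity(n_apps, images_per_app):
    """Run one simulated study: sample n_apps apps, test each on
    images_per_app lesions, return the estimated average sensitivity."""
    apps = random.sample(population, n_apps)
    per_app = []
    for true_sens in apps:
        hits = sum(random.random() < true_sens for _ in range(images_per_app))
        per_app.append(hits / images_per_app)
    return statistics.mean(per_app)

def spread(n_apps, images_per_app, trials=1000):
    """Standard deviation of the study's estimate over many repeats:
    a smaller value means a more reliable general conclusion."""
    return statistics.stdev(
        estimate_mean_sensitivity(n_apps, images_per_app)
        for _ in range(trials)
    )

# The same total budget of 200 images, split two ways:
print("4 apps x 50 images, spread:", round(spread(4, 50), 3))
print("50 apps x 4 images, spread:", round(spread(50, 4), 3))
```

Under these made-up assumptions the 50-app design produces a noticeably tighter estimate, because averaging over more apps washes out the app-to-app variation that a 4-app study is at the mercy of.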

Considering the researchers expressed concern that apps like these present risks that are of “particular concern for economically disadvantaged and uninsured patients”, I think this no-name policy was a very bad decision, but perhaps clinicians in the USA aren’t under the same obligations as those in the UK/Ireland when it comes to reporting patient safety concerns.

From a research perspective it’s also not very helpful, as it doesn’t give readers any capacity to evaluate these apps themselves, or any view of the consumer perspective (we’ll never know, for example, if all these apps were popular but had a zero user rating because they were mostly being downloaded for fun/research purposes).

If I were doing this research, at the very least I would include a snapshot of each app’s app store rating and comments from users who had previously tried it (all app stores feature these, and they have a significant impact on users’ confidence, trust, expectations and use of an app).

2) The selection process was biased towards low-quality apps:

I found several things that were fundamentally wrong with the app selection process used:

i) As a qualifying feature, the researchers excluded popular apps that “could not use existing photographs”, as they required the applications they tested to allow “the use of existing images”, i.e. images not captured with the smartphone but perhaps with a dermatoscope using polarised light by the clinicians who treated the patients.

Although we have no way of determining this (the apps tested were not named), I think this filter is likely to have inadvertently biased the research, because it will have ruled out higher-quality apps and services: there is greater complexity involved in producing an app that can take control of the smartphone camera.

By automatically ruling out apps that required greater commitment, programming talent and investment from a developer, I’m left with the feeling that the research was designed to fail the apps tested.

ii) The test of the apps wasn’t fair. Wouldn’t it be obvious to the individual(s) offering “App 4” that this wasn’t a typical patient using the service? This one particular patient paid them more than $250 and sent them 50+ high-quality images of different moles on different skin types!

3) All of the apps analysed are misleadingly labelled:

It amazes me that the researchers haven’t commented more on this, as it is really the biggest failing here, and it’s the easiest way to ensure patients aren’t put at risk going forward: Apple and Google need to enforce the bans they already have in place, which promise to reject “Apps that contain false, fraudulent or misleading representations”.

The research states that all the apps carried disclaimers stating that they “are intended for educational purposes” only, before then offering to “additionally give an assessment of risk or probability that a lesion is benign or malignant”.

The internet is chock full of misleading websites that can provide the service the researchers wanted (a review of an image sent via email, etc.) without the disclaimers the researchers tell us all of these apps provided: in a few minutes I could find you a hundred websites based outside of the USA where someone unqualified will give you a dermatological opinion in exchange for money. Moreover, none of these nameless website providers has any need to comply with any App Store guidelines.

4) This research ignores apps being offered by named, qualified Doctors:

It’s a shame that this research doesn’t build on previous work showing that mobile teledermatology can be an effective way for registered Dermatologists to provide remote triage:

Mobile teledermatology: a feasibility study of 58 subjects using mobile phones

When you appreciate that this earlier research dates from before Apple launched the App Store, it surprises me that the NIH-funded researchers at the Department of Dermatology, University of Pittsburgh Medical Center, didn’t choose to review at least one app offered by a named, registered clinician.

Perhaps it’s because I have access to the European App Store, but I found it easy to find an example of this there, e.g. the iDoc24 app produced by Alexander Borve (who recently published a research paper claiming that “the diagnostic accuracy and adequacy of the triage and management decisions achieved using MMS referrals were similar to those obtained with other store-and-forward teledermatology methods”):

iDoc24 - Ask the dermatologist today (Europe only)

My conclusion:

I’m amazed that, with the profits Apple and Google are making from their App Stores (e.g. Apple will have taken at least 30% of the revenues made by these app developers), they are okay with providing billing and distribution in the USA for apps that are so misleadingly labelled and are not provided by named, registered Clinicians.

Although we have no way of knowing (the apps weren’t named, so we can’t look for them ourselves), perhaps in reality patients who have concerns about their skin and are using their mobiles to search for help are much more likely to arrive at highly rated, quality information from the likes of the American Academy of Dermatology long before they even find these dangerously misleading apps.

Have I missed something? What do you think?

About 3G Doctor

The Corporate Blog of 3G Doctor

2 Responses

  1. JD says:

    The study got it right. Automated image analysis is not going to be an effective way to diagnose melanoma, and it will be a sure way to do harm. Pathologists have trouble diagnosing melanoma when looking at the actual stained tissue under the microscope. Patient concern is a symptom that actually facilitates the diagnosis of melanoma, so if you take that away with an incorrectly analyzed cell phone image there is great potential to do harm. The best thing to do is combine mobile imaging with a definitive detection system like expression analysis.

  2. Hi JD,

    I agree 100% that “Patient concern is a symptom that actually facilitates the diagnosis of melanoma”.

    I think that a significant problem arises because Patients don’t have ready access to Doctors, and busy Doctors don’t have time to listen to and document all of a Patient’s concerns.

    As for “the best thing to do is combine mobile imaging with a definitive detection system like expression analysis”:

    I think even better results will be found by listening to Patients sharing their medical history (using connected mobile tools that help to effectively document this).
