Diagnostic imaging

Smartphone apps fall short on skin cancer diagnosis

11 Feb 2020 Tami Freeman
Algorithm-based smartphone apps designed to diagnose skin cancer are not accurate enough.

Skin cancer is one of the most common cancers worldwide and early detection, particularly of melanoma, is crucial to improve survival. With this objective in mind, there has been an influx of new dermatology smartphone apps that aim to help people with suspicious skin lesions decide whether to seek further medical attention.

Many of these apps use artificial intelligence algorithms to classify images of lesions as high or low risk for skin cancer (usually melanoma) and then provide a recommendation to the user. But how accurate are these algorithm-based smartphone apps? And how valid are the studies used to assess their accuracy? A research team led by Jon Deeks at the University of Birmingham and Hywel Williams at the University of Nottingham aimed to find out (BMJ 10.1136/bmj.m127).

The researchers identified nine relevant studies that evaluated six different skin cancer detection apps. Six studies assessed the diagnostic accuracy of the apps by comparison with histology, while three verified the app recommendations against a reference standard of expert recommendations.

The team found that the studies were small and overall of poor quality. For instance, studies included suspicious moles chosen by clinicians not app users, and used images taken by experts on study phones, rather than by users on their own phones. Images that could not be evaluated by the apps were excluded. And many studies did not follow up on lesions identified as “low risk” by the apps, removing the opportunity to identify any missed cancers.
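The lack of follow-up matters because of what is known as partial verification bias: if lesions labelled "low risk" are never checked, missed cancers are never recorded as false negatives, so the app's sensitivity appears higher than it really is. A minimal sketch of the effect, using invented counts (not figures from the BMJ review):

```python
# Hypothetical illustration of partial verification bias.
# All numbers are invented for illustration.

def observed_sensitivity(true_positives, false_negatives):
    """Sensitivity computed only from lesions whose true status is known."""
    return true_positives / (true_positives + false_negatives)

# Suppose 100 lesions are truly cancerous. The app flags 70 as high
# risk (all biopsied, so 70 true positives) and calls 30 "low risk".
# With full follow-up, all 30 missed cancers are counted:
true_sens = observed_sensitivity(70, 30)    # 0.7

# If none of the 30 low-risk lesions are followed up, no false
# negatives are ever recorded and sensitivity appears perfect:
biased_sens = observed_sensitivity(70, 0)   # 1.0

print(true_sens, biased_sens)
```

The same study data can therefore report near-perfect sensitivity simply by not checking the lesions the app dismissed.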

“This is a fast-moving field and it’s really disappointing that there is not better quality evidence available to judge the efficacy of these apps,” says Jacqueline Dinnes from the University of Birmingham’s Institute of Applied Health Research. “It is vital that healthcare professionals are aware of the current limitations both in the technologies and in their evaluations.”

Future studies, the researchers suggest, should be based on a clinically relevant population of smartphone users with concerns about a skin lesion. Studies must include follow-up of all lesions, not just those referred for further assessment. It’s also important to report all of the data, including failures due to poor image quality.

Poor regulation

Despite the limitations of this evidence base, two of the apps have obtained European CE marking: SkinScan and SkinVision. SkinScan was evaluated in a single study of 15 moles including five melanomas, with 0% sensitivity for detection of melanoma. SkinVision, meanwhile, was evaluated in two studies (252 lesions including 61 cancers) and achieved a sensitivity of 80% and a specificity of 78% for detecting malignant or premalignant lesions. Three studies verifying SkinVision against expert recommendations showed its accuracy was poor. While SkinVision produced the highest estimates of accuracy, its actual performance is likely to be worse, because studies were small and did not evaluate the app as it would be used in practice.
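For readers unfamiliar with the metrics quoted above: sensitivity is the fraction of true cancers the app correctly flags as high risk, and specificity is the fraction of benign lesions it correctly flags as low risk. A short sketch with hypothetical counts (chosen only to illustrate the arithmetic, not taken from the studies' raw data):

```python
# Illustrative only: sensitivity and specificity from a 2x2
# confusion matrix. Counts below are hypothetical.

def sensitivity(tp, fn):
    """Proportion of cancers correctly flagged as high risk."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Proportion of benign lesions correctly flagged as low risk."""
    return tn / (tn + fp)

# E.g. 61 cancers of which 49 are flagged high risk, and 191 benign
# lesions of which 149 are flagged low risk:
print(round(sensitivity(tp=49, fn=12), 2))   # 0.8
print(round(specificity(tn=149, fp=42), 2))  # 0.78
```

Note that a sensitivity of 80% would still mean one in five cancers is wrongly reassured as low risk.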

The researchers point out that smartphone apps are defined as class 1 devices (the European classification for low-risk devices such as plasters and reading glasses) for CE marking. They note that no skin cancer assessment app has received regulatory approval in the US, where the FDA has a stricter assessment process for smartphone apps.

“Regulators need to become alert to the potential harm that poorly performing algorithm-based diagnostic or risk monitoring apps create,” says Deeks. “We rely on the CE mark as a sign of quality, but the current CE mark assessment processes are not fit for protecting the public against the risks that these apps present.”

The researchers conclude that their review “found poor and variable performance of algorithm-based smartphone apps, which indicates that these apps have not yet shown sufficient promise to recommend their use.” They emphasize that healthcare professionals must be aware of the limitations of such apps and inform potential app users about these limitations.

“Although I was broad minded on the potential benefit of apps for diagnosing skin cancer, I am now worried given the results of our study and the overall poor quality of studies used to test these apps,” says Williams. “My advice to anyone worried about a possible skin cancer is ‘if in doubt, check it out with your GP’.”

Copyright © 2025 by IOP Publishing Ltd and individual contributors