Your life in your phone’s hands: can an app really detect cancer?

In a video, 30-year-old Stacey Everson tells the story of how she picked up her phone, snapped a selfie, and saved her own life. She might have easily overlooked the small, irregular mole on her upper left arm. But prompted by friends and family, she took a picture of the growth with an app named SkinVision, and followed up on the app’s recommendation that she see a doctor, urgently. The doctor removed and tested the growth. “A week later, it came back positive for early-stage melanoma,” she says. “Something like that, I wouldn’t have thought it would be cancer.”

Her testimonial is one of several on the product’s website, and SkinVision is only one of several artificial intelligence (AI)-based apps that aim to help anyone with a smartphone catch a slew of skin diseases – including lethal cancers – earlier than ever. The latest entrant is Google’s Derm Assist, a tool that aims to help users identify 288 common skin conditions. Internet users run almost 10bn searches for terms related to skin conditions each year, says Peggy Bui, a product manager at Google.

But few of those searchers receive expert care. The US faces a dearth of dermatologists, and many tend to cluster in urban areas, so large swaths of the population find themselves driving several hours to seek care, or waiting weeks or months for appointments.

AI-based apps such as SkinVision and Derm Assist could ease these difficulties. None of them offers a diagnosis – at best, they flag growths as harmless or “high risk” and recommend whether a patient should seek care. But many moles turn out to be harmless, so the apps could help patients or primary care physicians – who might not feel confident identifying a skin cancer – figure out which patients really need specialist care.

“There are many different ways that artificial intelligence can help with triage and decision making to provide support to the physician rather than trying to do their job,” says dermatologist Roxana Daneshjou of Stanford University. “There are opportunities for these algorithms to improve patient care.”

But the algorithms are far from ideal, in part because they threaten to amplify existing racial biases in the field of dermatology. In 2019, researchers studying six apps designed to spot skin cancer found they had been tested only in small, poorly conducted studies. The researchers also raised concerns about how the algorithms are regulated. None of these apps is approved for use in the US. Some, such as SkinVision and Google’s Derm Assist, are approved for sale in the EU, although the researchers’ analysis suggested that approval “does not provide adequate protection to the public”.

“We lack a rigorous framework for even thinking about how we should evaluate and test these algorithms before they’re used by patients,” says dermatologist Veronica Rotemberg of Memorial Sloan Kettering Cancer Center in New York.

To develop these dermatology apps, researchers present a computer with a library of pictures of common skin conditions and teach the machine to classify each one correctly. Then, the algorithm is tested for its ability to “diagnose” a different set of images based on what it learned. As the algorithm analyzes images that users upload, it learns and evolves – ideally, making fewer mistakes over time.
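
In code, that train-then-test pipeline is a standard supervised-learning loop. The sketch below, in Python with PyTorch, is a minimal illustration of the general approach – not any company’s actual system; the folder layout, model choice and training settings are all assumptions made for the example.

```python
# A minimal sketch of the train-then-test loop described above. The folder
# layout ("train/" and "test/" directories with one subfolder per condition)
# and the model are illustrative assumptions, not any app's real design.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Each subfolder name ("melanoma", "benign_nevus", ...) becomes a class label.
train_set = datasets.ImageFolder("skin_images/train", transform=transform)
test_set = datasets.ImageFolder("skin_images/test", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
test_loader = DataLoader(test_set, batch_size=32)

# Start from a generic pretrained image model and retrain its final layer.
model = models.resnet18(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):                      # learn from the labeled library
    model.train()
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

model.eval()                                # "diagnose" images it has not seen
correct = total = 0
with torch.no_grad():
    for images, labels in test_loader:
        preds = model(images).argmax(dim=1)
        correct += (preds == labels).sum().item()
        total += labels.numel()
print(f"held-out accuracy: {correct / total:.2%}")
```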

AI algorithms developed in similar ways are already approved by the FDA for use in clinics: more than 100 are available to help radiologists and other clinicians interpret X-rays, CT scans or retina images.

But researchers have found these tools vary widely in their performance, as well as in how and where they are trained. For instance, an algorithm developed at one clinic is likely to make more mistakes when diagnosing patients at a different clinic. In a preprint posted online in October 2020, researchers found that AI algorithms for analyzing chest X-rays produced systematic biases across race, age and insurance type.

The problems arise, in part, because of how algorithms learn to recognize patterns in pictures, says Stanford University researcher James Zou. A tool developed on images from a population of older, white male patients might pick up on cues unique to that cohort rather than the disease itself. Then, if those cues are absent in a younger Black woman, it may misdiagnose her symptoms.
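
One way researchers probe for exactly this failure is to report accuracy per patient subgroup rather than a single overall number, which can hide large gaps. A minimal sketch in Python – the record layout and its “skin_tone” field are hypothetical, for illustration only:

```python
# Sketch of a per-subgroup audit: a single overall accuracy can hide large
# gaps between groups. The record layout and "skin_tone" field are
# illustrative assumptions, not any app's real schema.
from collections import defaultdict

def subgroup_accuracy(records):
    """records: dicts with 'skin_tone', 'label' and 'prediction' keys."""
    hits, counts = defaultdict(int), defaultdict(int)
    for r in records:
        counts[r["skin_tone"]] += 1
        hits[r["skin_tone"]] += int(r["prediction"] == r["label"])
    return {tone: hits[tone] / counts[tone] for tone in counts}

records = [
    {"skin_tone": "light", "label": "melanoma", "prediction": "melanoma"},
    {"skin_tone": "light", "label": "benign", "prediction": "benign"},
    {"skin_tone": "dark", "label": "melanoma", "prediction": "benign"},
    {"skin_tone": "dark", "label": "benign", "prediction": "benign"},
]
print(subgroup_accuracy(records))  # {'light': 1.0, 'dark': 0.5}
```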

Another fundamental flaw lies in the databases that algorithms study – particularly for skin conditions. Common databases of skin images rarely capture the myriad variations in skin tones and textures from around the world. That’s in part because Black and Hispanic people see dermatologists only about half as often as white patients do. Patients with less education or lower socioeconomic status are also far less likely to be represented in these image libraries.

While many companies claim their proprietary databases overcome these problems, regulators and clinicians have no way to know for sure. Others are more transparent about their methods but still contend with the whiteness of image libraries. These biased libraries are problematic even for human experts; studies of US dermatologists find they tend to be less comfortable diagnosing skin conditions in patients of color.

In preliminary studies, Google is working to solve the problem by using another type of AI to generate artificial images of disorders on darker skin tones, which may eventually help improve the algorithms. For now, Derm Assist lets users know if there is greater uncertainty about their results.

According to Adewole Adamson, a dermatologist at the University of Texas at Austin, other apps should include such warning labels to let users know the results might be less accurate if they have darker skin types. “It’s a little messed up to think an app is only for white people or Black or Asian people – you don’t want a segregated algorithm,” he says. “But at least that would be transparent.”

Smartphone apps for skin conditions face another hurdle. Photographs captured by average users can vary widely. One user might snap a closeup on a sunlit beach, another might do so from a dimly lit bedroom. A growth that appears malignant maroon in one setting might look benign brown in another.
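
Developers commonly try to blunt this problem by simulating lighting and color shifts during training. The sketch below uses torchvision’s ColorJitter transform to generate such variants; whether any particular app trains this way is an assumption, not something the companies have disclosed.

```python
# Sketch: simulating the lighting and color variation described above with
# torchvision's ColorJitter. "mole.jpg" is a hypothetical user photo.
from PIL import Image
from torchvision import transforms

jitter = transforms.ColorJitter(
    brightness=0.5, contrast=0.4, saturation=0.4, hue=0.05
)

mole = Image.open("mole.jpg")
for i in range(3):
    variant = jitter(mole)              # same growth, different "lighting"
    variant.save(f"mole_variant_{i}.jpg")
```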

SkinVision’s website counts nearly 2 million users worldwide. And although success stories such as Everson’s make headlines, it’s uncertain how many users were flagged for growths that turned out to be harmless. Even without apps to scrutinize every mole, rates of melanoma diagnosis in the US are six times higher than they were 40 years ago. But there has been no corresponding rise in how many people die of the disease. To Adamson and others, the data hint at an “epidemic of scrutiny” – not necessarily one of cancer itself.

Companies that make the smartphone apps “are banking on this accumulation of anecdotes”, says Adamson. “There’s a less provocative opposite version: the app said I had something, I went in for a biopsy, and it was nothing. You’re not going to hear that story.”

But those stories are already sprinkled across the internet. On patient forums, people report photographing their moles and finding themselves at “high risk” for cancer. Spurred by the SkinVision app tagging an old mole on his foot as high-risk, one user turned to the Reddit community for reassurance. “SkinVision says it’s high risk and now I’m absolutely terrified!” he wrote. A dermatologist suspected the mole was harmless and suggested he could either watch and wait or have it biopsied. The user – who asked the Guardian for anonymity owing to the personal nature of his health concerns – chose the biopsy, and is now awaiting results.

Tracy Callahan, a 46-year-old nurse in Cary, North Carolina, has had five early-stage melanomas removed in the past eight years. Even as a cancer survivor who scrutinizes every inch of her skin, she’s unconvinced of the utility of these apps. “A lot of benign lesions can mimic an early stage melanoma, or it might be something bad, and the app might not pick up on it,” she says. “I don’t know if these apps necessarily help someone like me.”

For algorithms to truly detect skin cancer well enough to bridge the gaps in dermatologic care, researchers, companies and regulatory agencies such as the FDA must converge on standards for these tools, Rotemberg says. One important factor, she says, is for algorithms to learn not just how to spot a disease, but also when to hold back. Can a tool learn to recognize when it’s out of its depth and defer to a human?

“Even if an algorithm is able to say, ‘I’ve never seen an image with this lighting or this skin type,’ it helps you as a clinician to know how useful its interpretation is,” Rotemberg says. “In those instances, you can fall back to the gold standard – a specialist’s opinion. And you’re not creating problems by introducing this imprecise tool in between.”
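
In its simplest form, that kind of deferral can be a confidence threshold: if the model’s top probability for an image falls below a cutoff, the tool abstains and refers the case to a specialist instead of emitting a label. A minimal sketch, with an illustrative cutoff:

```python
# Minimal sketch of "knowing when to defer": abstain whenever the model's
# confidence is low instead of always emitting a label. The 0.85 cutoff is an
# illustrative assumption; a real system would calibrate it on held-out data
# and might add dedicated out-of-distribution detection on top.
import torch
import torch.nn.functional as F

def classify_or_defer(model, image_batch, class_names, threshold=0.85):
    model.eval()
    with torch.no_grad():
        probs = F.softmax(model(image_batch), dim=1)
    confidences, predictions = probs.max(dim=1)
    results = []
    for conf, idx in zip(confidences, predictions):
        if conf.item() < threshold:
            results.append("defer to specialist")   # the model is out of its depth
        else:
            results.append(class_names[idx.item()])
    return results
```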