Google says ‘Lens’ can now search for skin conditions based on images. Here’s how

(Getty Images)

Google says its “Lens” image search can now help people understand what is going on with their skin.

The tool is intended as a smart image search: users can take pictures and use them to search for whatever is in them. Google has previously suggested it is useful for identifying the clothes that make up an outfit, for instance, or looking up certain items of food.

But Lens can also be used for looking up skin conditions or other unusual things on the body, the company suggested.

It warns that the tool is “informational only and not a diagnosis” and urges users to consult authorities for advice. But it suggested that it could be a useful way of starting to look up certain things on the body that might be otherwise hard to put into words.


“Describing an odd mole or rash on your skin can be hard to do with words alone,” Google said. “This feature also works if you’re not sure how to describe something else on your body, like a bump on your lip, a line on your nails or hair loss on your head.”

The feature was described in a wider-ranging Google blog post that focused on other, more familiar uses, such as pointing the camera at a “cool building or landmark” or using it to translate street signs or menus.

Google said the feature was new within Lens, but did not specify when it had been released.

The company has tried to use artificial intelligence to help with skin conditions before. In 2021, it released a new tool called “DermAssist”.

Google says it sees “billions of skin-related searches each year”. DermAssist was built to assist with those, though it too includes a disclaimer indicating it is only intended “for informational purposes” and not for a medical diagnosis.

Since that DermAssist feature is more specifically focused on helping with medical conditions, it is subject to more stringent regulation. As such, Google has still only made it available in a “limited release” and asks people to sign up to be part of that testing on its website.

DermAssist requires users to answer a few questions and upload three photos. Lens, on the other hand, simply appears to use Google’s algorithms to match a single picture with similar images of skin conditions and give some indication of what that condition might be.