Google has boosted its Lens image recognition tool so that it is now capable of recognising certain forms of skin disorders.
While a smartphone is obviously no substitute for a certified healthcare professional, it could be a key tool in alerting us to potential health risks.
Among other new features, Google Lens can now analyse the health of your skin, whether it's scrutinising a suspicious mole or a rash.
To do this, point your smartphone's camera at the skin (or take a photo of it) and the app will suggest a number of possible corresponding conditions. These are accompanied by numerous images which can be compared with your own skin.
This area is not a totally new one for Google, which demonstrated DermAssist, a dedicated application capable of identifying various skin anomalies, at its annual Google I/O conference in 2021.
It's key to keep in mind that Google Lens is by no means a medical diagnostic tool or a replacement for a doctor's opinion, but rather a means of detecting potential problems before consulting a healthcare professional.
The latest version of Lens offers a wide range of functions beyond skin analysis.
Other practical applications include translating road signs or menus into over a hundred languages, solving mathematical problems simply by pointing the camera at an equation, or discovering new dishes and eating establishments near your location.
The power of Lens is soon to be integrated into Bard, Google's generative artificial intelligence chatbot.