Researchers at Stanford University have developed a system based on convolutional neural network technology that can visually identify cancerous skin moles and lesions, according to CNN.com.
In a paper published in the journal Nature, the researchers said the system can recognize the signs of skin cancer with the same level of accuracy as a human dermatologist. The technology eventually could be implemented in a mobile app, allowing users to detect skin cancer early, an important factor in improving survival rates.
The Stanford team employed a convolutional neural network, a deep-learning algorithm that recognizes visual patterns. The researchers first taught the algorithm what the world looks like by showing it images of common everyday objects over about a week.
Once the system had learned enough to identify such items, the team trained it to detect signs of skin cancer. This posed a major challenge, given the wide variability in how skin abnormalities appear across different patients.
To overcome this challenge, the researchers presented the algorithm with 129,450 images representing more than 2,000 skin diseases. Each image was paired with its diagnostic information, which was also fed into the system.
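The two-stage process described above — reusing a network already trained on everyday images, then training only a new classifier on labeled medical images — is commonly called transfer learning. The toy sketch below illustrates the idea, not the Stanford system itself: the "pretrained" feature extractor is a stand-in hand-coded function, the images are synthetic, and all names are illustrative.

```python
# Toy illustration of transfer learning: a frozen, "pretrained" feature
# extractor plus a small classifier head trained on new labeled data.
# This is a conceptual sketch, NOT the Stanford model (which was a deep CNN).
import math
import random

random.seed(0)

def pretrained_features(image):
    """Stand-in for a CNN feature extractor already trained on everyday
    objects. Its 'weights' stay frozen; only the head below is trained."""
    return [sum(image) / len(image), max(image) - min(image)]

# Synthetic training set: "malignant" images are brighter on average.
data = []
for _ in range(200):
    label = random.randint(0, 1)
    base = 0.7 if label else 0.3
    image = [min(1.0, max(0.0, random.gauss(base, 0.1))) for _ in range(16)]
    data.append((pretrained_features(image), label))

# Train a simple logistic-regression head on top of the frozen features.
w = [0.0, 0.0]
b = 0.0
lr = 0.5
for _ in range(300):
    for x, y in data:
        z = w[0] * x[0] + w[1] * x[1] + b
        p = 1.0 / (1.0 + math.exp(-z))   # predicted probability of malignancy
        g = p - y                        # gradient of the log loss w.r.t. z
        w[0] -= lr * g * x[0]
        w[1] -= lr * g * x[1]
        b -= lr * g

correct = sum(
    ((1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))) > 0.5) == (y == 1)
    for x, y in data
)
print(f"training accuracy of the new head: {correct / len(data):.2f}")
```

In a real system the feature extractor would be a deep CNN pretrained on millions of photos, and the head would be retrained (or the whole network fine-tuned) on the dermatology images and their diagnoses.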
When presented with new images of skin cancer, the algorithm could “diagnose multiple different kinds of skin cancer, not just melanoma, and we were able to do this with regular clinical images, rather than with specialized dermoscopic images,” Stanford dermatologist Roberto Novoa told CNN.com.
Stanford said a dermatology app built on the algorithm could provide universal access to skin cancer diagnostic care.
Tyler Schulze is vice president, strategy & development at Veritone. He serves as general manager for developer partnerships, cognitive engine ecosystem, and media ingestion for the Veritone platform. Learn more about our platform and join the Veritone developer ecosystem today.