Convolutional neural networks, or CNNs, have certain limitations compared to the trained human eye when it comes to recognizing melanoma and other skin cancers. A CNN can misread a cancerous lesion as healthy skin, missing the diagnosis entirely.
Researchers have described this confusion as a kind of adversarial attack on a CNN's ability to properly diagnose skin cancer: small variations in skin color, or in how images of the skin are captured and fed to the machine, can be enough to fool the model. The result can be a misclassification of the lesion in question.
Newer research shows that when these systems are tested on image data unlike the images they saw during training, diagnostic accuracy drops. Accuracy can even vary depending on whether the images came from an iPhone or a digital camera, because the two devices reproduce colors differently.
More clinicians are learning about the limitations of CNN architectures and are looking for strategies to make the models less susceptible to adversarial attacks. One such strategy is to retrain the CNNs on adversarial images so they learn to classify them correctly. More broadly, the medical field is applying this lesson to other forms of medical imaging by exposing models to as wide a range of skin tones, cameras, and imaging conditions as possible during training.
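To make the retraining idea concrete, here is a minimal, purely illustrative sketch of adversarial training. It uses a toy logistic-regression "classifier" on synthetic data rather than a real CNN or real lesion images, and it generates adversarial examples with the Fast Gradient Sign Method (FGSM), one common way such perturbed training images are produced. All data, feature names, and parameter values here are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a lesion classifier: logistic regression on two
# synthetic features (imagine hypothetical color/texture scores).
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)  # synthetic labels

w = np.zeros(2)
b = 0.0
lr, eps = 0.1, 0.2  # eps = adversarial perturbation budget

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm(X, y, w, b, eps):
    """Fast Gradient Sign Method: nudge each input in the direction
    that most increases the loss, mimicking small image variations."""
    p = sigmoid(X @ w + b)
    grad_x = np.outer(p - y, w)  # d(logistic loss)/d(x)
    return X + eps * np.sign(grad_x)

# Adversarial training: each step, train on clean AND perturbed copies.
for _ in range(300):
    X_adv = fgsm(X, y, w, b, eps)
    X_mix = np.vstack([X, X_adv])
    y_mix = np.concatenate([y, y])
    p = sigmoid(X_mix @ w + b)
    w -= lr * (X_mix.T @ (p - y_mix)) / len(y_mix)
    b -= lr * np.mean(p - y_mix)

acc_clean = np.mean((sigmoid(X @ w + b) > 0.5) == y)
acc_adv = np.mean((sigmoid(fgsm(X, y, w, b, eps) @ w + b) > 0.5) == y)
print(acc_clean, acc_adv)
```

The key design point is the `vstack` step: the model never sees only pristine inputs, so the decision boundary it learns must also hold up under the small, worst-case variations that would otherwise fool it.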
Have you experienced a misdiagnosis?
If you suspect that you have received an incorrect skin cancer diagnosis, your situation may qualify as medical malpractice. It may be wise to speak with an attorney to better understand your rights.