As artificial intelligence (AI) becomes increasingly sophisticated, its use is spreading across a range of healthcare applications, and diagnosis is a prime example. For instance, IBM's AI system, Watson for Health, is beginning to help healthcare organisations apply cognitive technology to unlock vast amounts of health data and aid diagnosis. The system can rapidly store and analyse information from medical journals worldwide, including data on symptoms, treatments and patient responses, at a scale no human could match.1
Google is similarly driving the movement of AI into healthcare with DeepMind Health. The AI lab is collaborating with physicians, researchers and patients to take a new look at real-world healthcare problems, such as the management of sight-threatening eye disease.1 Results recently published in Nature Medicine demonstrate that the AI system can interpret eye scans from routine clinical practice in seconds with exceptional accuracy, making diagnoses and recommending referrals for treatment across more than 50 serious eye diseases.2 Although preliminary, these results show the potential for AI systems to shorten the time between diagnosis and treatment, and to help doctors prioritise patients who need urgent care.2
And the potential for AI-aided diagnosis doesn't stop there. With the aim of widening access to medical care, researchers at Stanford University developed a deep learning algorithm that can identify skin cancer from an image alone. Using a database of almost 130,000 skin disease images, the researchers trained the algorithm to visually diagnose skin lesions, then tested its accuracy against 21 board-certified dermatologists. The algorithm was assessed on three key diagnostic tasks: keratinocyte carcinoma classification, melanoma classification, and melanoma classification using dermoscopy images. Across over 370 images of both cancerous and non-cancerous lesions, the algorithm matched the performance of the dermatologists in all three tasks.3
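The Stanford system itself is a fine-tuned deep convolutional network trained on roughly 130,000 labelled images, which is well beyond a short sketch. But the underlying workflow the paragraph describes, train a classifier on labelled examples, then measure its accuracy on held-out cases, can be illustrated with a toy model. Everything below (the synthetic "image feature" vectors, dimensions and learning rate) is invented purely for illustration and is not the researchers' actual method.

```python
import numpy as np

# Toy illustration of the train-then-evaluate workflow described above.
# Synthetic 16-dimensional "image feature" vectors stand in for real
# dermatology images; a simple logistic-regression classifier stands in
# for the deep convolutional network.

rng = np.random.default_rng(0)

# Synthetic data: 200 feature vectors per class, separated by a mean shift.
n, d = 200, 16
benign = rng.normal(loc=-1.0, size=(n, d))
malignant = rng.normal(loc=+1.0, size=(n, d))
X = np.vstack([benign, malignant])
y = np.array([0] * n + [1] * n)

# Shuffle, then hold out 20% of cases for evaluation.
idx = rng.permutation(len(y))
X, y = X[idx], y[idx]
split = len(y) * 4 // 5
X_tr, y_tr, X_te, y_te = X[:split], y[:split], X[split:], y[split:]

# Train logistic regression by gradient descent on the training set.
w, b, lr = np.zeros(d), 0.0, 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X_tr @ w + b)))   # predicted P(malignant)
    w -= lr * (X_tr.T @ (p - y_tr)) / len(y_tr)
    b -= lr * np.mean(p - y_tr)

# Evaluate: fraction of held-out "lesions" classified correctly.
pred = (X_te @ w + b) > 0
accuracy = np.mean(pred == y_te)
print(f"held-out accuracy: {accuracy:.2f}")
```

The same pattern, fit on one portion of the data and report accuracy on an unseen portion, is what allows a like-for-like comparison against the dermatologists' performance on the same test images.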
“Advances in computer-aided classification of benign versus malignant skin lesions could greatly assist dermatologists in improving diagnosis for challenging lesions and provide better management options for patients,” said Susan Swetter, professor of dermatology and director of the Pigmented Lesion and Melanoma Program at the Stanford Cancer Institute. While the algorithm currently exists on a computer, the Stanford team's next aim is to make it smartphone compatible, so that those without easy access to a doctor could receive a diagnosis remotely.3
Some studies are even beginning to suggest that AI may achieve greater diagnostic accuracy than clinicians. Researchers from the John Radcliffe Hospital in Oxford, England, have recently demonstrated that their AI diagnostics system, Ultromics, was more accurate than doctors at diagnosing heart disease across a number of clinical trials.4 The technology works by extracting over 80,000 data points from a single echocardiogram image, reducing the subjectivity of human interpretation and raising diagnostic accuracy from 80% to over 90%.5
Despite results such as these, technology is not set to replace human doctors. AI and its growing capabilities ought to be viewed not as a threat to physicians, but as a valuable tool that can assist them in their day-to-day roles.