DeepMind, Google’s artificial intelligence business, is planning clinical trials of technology that can help diagnose eye disease by analysing medical images, after early tests showed its results were more accurate than those of human doctors.
It emerged in February that DeepMind had developed AI technology that could analyse 3D retinal scans for signs of major eye diseases, such as glaucoma or diabetic retinopathy, after striking a partnership with London’s Moorfields Eye Hospital.
Initial findings, published in the Nature Medicine journal on Monday, showed that DeepMind’s algorithm was better than eight retinal specialists at Moorfields in making referrals when tested on 997 patient scans. DeepMind’s algorithm had an error rate of 5.5 per cent compared with between 6.7 per cent and 24.1 per cent for the eight doctors, according to Pearse Keane, a consultant ophthalmologist at Moorfields who co-wrote the paper.
According to Dr Keane, the AI can also analyse the scan immediately, while patients would ordinarily have to wait days for a specialist to review the images. When the specialists were provided with the contextual information they would usually have about patients, in order to make the comparison more realistic, their error rates dropped to between 5.5 per cent and 13.1 per cent — on a par with, or worse than, the AI.
Dr Keane said the findings had been “absolutely jaw-dropping” and patients could expect the AI to be rolled out in the NHS within three years. “We have designed this algorithm with a very specific real-world application in mind,” he said. “There are already pathways in the NHS where this algorithm could be [used] . . . this is a technology that is coming in the next three years for direct patient benefit.” DeepMind’s algorithm was trained using 14,884 anonymised 3D retinal scans provided by Moorfields and labelled for signs of disease by doctors.
To make its decisions more transparent, the AI was developed with two neural networks — a type of machine learning model that can spot patterns and make predictions from large volumes of data. The first neural network analyses scans to identify diseased areas, while the second makes a referral recommendation to a clinician, along with suggested diagnoses for the type of disease, according to DeepMind.
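The two-stage design described above can be illustrated with a minimal sketch. Everything below — the tissue classes, thresholds and referral categories — is invented for illustration only; the real DeepMind system uses deep networks trained on thousands of labelled 3D retinal scans, not hand-written rules.

```python
import numpy as np

# Stage 1 stands in for the segmentation network: it labels each
# voxel of the scan with a (hypothetical) tissue class.
# Stage 2 stands in for the referral network: it turns the tissue
# map into a (hypothetical) referral suggestion.

def segment(scan: np.ndarray) -> np.ndarray:
    """Map raw scan intensities to per-voxel tissue labels.
    0 = healthy, 1 = 'drusen', 2 = 'fluid' (illustrative only)."""
    seg = np.zeros(scan.shape, dtype=int)
    seg[scan > 0.5] = 1
    seg[scan > 0.8] = 2
    return seg

def refer(seg: np.ndarray) -> str:
    """Map the tissue map to a referral decision using simple
    area-based rules (a stand-in for the second network)."""
    if np.mean(seg == 2) > 0.05:   # large 'fluid' area
        return "urgent"
    if np.mean(seg == 1) > 0.10:   # large 'drusen' area
        return "routine"
    return "observation"

# A synthetic 3D "scan": low-intensity background plus a bright lesion.
rng = np.random.default_rng(0)
scan = rng.uniform(0.0, 0.4, size=(8, 32, 32))
scan[2:6, 8:24, 8:24] = 0.9        # simulated fluid region

print(refer(segment(scan)))        # prints "urgent"
```

Separating the two stages is what gives the transparency the article mentions: a clinician can inspect the intermediate tissue map to see *why* the second stage recommended a referral, rather than receiving a single opaque prediction.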
“We are really excited about the research findings and are now thinking about what is the thoughtful and safe way to accumulate the right sort of evidence . . . to [allow the technology to] enter clinical practice,” said Dominic King, clinical lead for DeepMind Health. He added that similar partnerships with University College London Hospitals to analyse radiotherapy scans and with Imperial College London to analyse mammograms were also showing “really promising signs”.
Moorfields will retain control over the database of retinal scans and receive free access to the algorithm for at least five years. DeepMind, which is based in London, has come under pressure to clarify its business model and relationship to its parent company Alphabet as it develops technology for use in the NHS.
A review panel set up to scrutinise its partnerships with the NHS raised concerns in June that the company might eventually use its funding and access to data to “drive monopolistic profits”. But Dr King said the company’s technology would help the NHS cut costs: “One of the important things to do in forthcoming studies is to show that not only does this approach improve clinical outcomes, it reduces cost,” he said.