Irish researcher develops AI to help prevent sight loss

Algorithm developed by ophthalmologist helps predict progression of retinal diseases

Automation in analysing scans is about to revolutionise patient outcomes with faster results affording earlier diagnosis and prompt treatment.

The ability to apply artificial intelligence (AI) to ophthalmology is gathering pace, a consequence of remarkable collaboration between eye specialists and technologists whose forte is the ability to process vast amounts of data quickly.

Irish ophthalmologist Dr Pearse Keane – based at Moorfields Eye Hospital in London – has been the chief catalyst in developing AI software to detect 50 sight-threatening eye diseases. It operates by interpreting optical coherence tomography (OCT) scans of the back of the eye, which will soon be routine in eye checks.

Automation in analysing scans for diseases such as wet age-related macular degeneration (AMD), the main cause of blindness in Europe, and diabetic retinopathy, is about to revolutionise patient outcomes with faster results affording earlier diagnosis and prompt treatment, and ultimately preventing avoidable sight loss.

Since that initial breakthrough, the Keane team has developed an alert system for the one in three people with AMD who go on to develop it in their good eye and, potentially, an early-warning system for the onset of neurodegenerative diseases, notably Alzheimer’s.

Keane is the main speaker at Retina 2020, an annual Irish conference showcasing the latest advances in treating eye disease and curing sight loss, which takes place later this week.

OCT is similar to ultrasound but uses light rather than sound waves to generate 3D digital images. These provide a detailed map of the eye but are hard to read and need expert analysis to interpret, Keane explains. If people suddenly develop a problem, such as a bleed at the back of the eye, delays can cost them their sight; already the AI tool is helping to reduce avoidable sight loss from AMD.

From Dublin, Keane studied medicine in UCD, qualifying in 2002. He always wanted to become an ophthalmologist or eye surgeon, with a desire to work in clinical research, and had a deep sense “you could bring profound benefits to people’s lives by curing their blindness”.

But it could also simply be the joy of helping someone see the scores in a match in the top corner of their television screen, he says, as was the outcome for a Waterford man following a cataract operation he performed early in his career.

He was conscious that ophthalmology was one of the most technology-driven medical specialities "with treatments utilising high-spec medical lasers, advanced microsurgical techniques and diagnostics involving ultra-high-resolution imaging". The opportunity to study at UCLA in Los Angeles with world leaders in ophthalmic imaging, particularly OCT scans – now the dominant way eye diseases are diagnosed – was pivotal.

In 2010, he joined Moorfields, one of Europe’s top ophthalmology centres, in a post that came with “50 per cent research time”, and he now has his own research team.

AI software interprets optical coherence tomography scans of the back of the eye.

Huge patient volumes

A bottleneck in providing treatment, however, soon became obvious: Moorfields was conducting 1,000 OCT scans a day, yet people were losing their sight because they could not be seen quickly enough by retinal specialists. This was mainly due to huge patient volumes in hospital eye services – a problem evident across the world.

He hoped AI, or deep learning, could revolutionise eye testing, allowing specialists to spot conditions earlier and prioritise patients with the most serious eye diseases before irreversible damage sets in.

In 2015, he reached out to DeepMind, a UK-based AI company, which had just been acquired by Google. He was conscious of possibilities, recognising major tech companies such as Google and Amazon had "flipped a switch to become AI-first companies" by deploying artificial neural networks to simulate the way the human brain analyses and processes information.

The era of powerful technologies such as speech recognition (think Alexa), automated translation and self-parking cars was emerging, he recalls. He knew collaborators at the cutting edge of AI would be valuable, as there was then little such expertise in academic settings, though “it helped DeepMind was based in King’s Cross, two stops from Moorfields on the tube”.

Together they developed an AI system – an algorithm – that can help to predict the progression of sight-threatening retinal diseases with accuracy on par with that of eight world-leading eye consultants. It looks for symptoms such as haemorrhages, lesions and fluid build-up. The “jaw-dropping results” were published in the journal Nature Medicine in August 2018.

The first neural network (the segmentation network) analyses the OCT scan to provide a map of the different types of eye tissue and the features of disease it sees. This map allows eyecare professionals to gain insight into the system’s “thinking”.

The second (the classification network) analyses this map to present clinicians with diagnoses and a referral recommendation. Crucially, the network expresses this recommendation as a percentage, allowing clinicians to assess the system’s confidence in its analysis.
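Conceptually, the pipeline described above can be sketched as two functions chained together: a segmentation model that turns a raw OCT volume into a labelled tissue map, and a classification model that turns that map into referral probabilities. The Python below is a minimal illustrative sketch only – the model internals are placeholder stand-ins, and the tissue classes, referral categories and function names are assumptions for illustration, not the Moorfields/DeepMind implementation.

```python
import numpy as np

# Hypothetical labels, assumed for illustration only.
TISSUE_CLASSES = ["healthy", "haemorrhage", "lesion", "fluid"]
REFERRALS = ["routine", "non-urgent", "semi-urgent", "urgent"]


def segment_oct(oct_volume):
    """Stage 1 (segmentation network): label every voxel with a tissue class.

    Placeholder: a trained 3D segmentation model would go here.
    """
    rng = np.random.default_rng(0)
    return rng.integers(0, len(TISSUE_CLASSES), size=oct_volume.shape)


def classify_tissue_map(tissue_map):
    """Stage 2 (classification network): convert the tissue map into referral
    probabilities that sum to 1, so clinicians can read off the system's
    confidence in each recommendation.
    """
    # Placeholder logic: the larger the pathological fraction, the more urgent.
    frac_pathological = float(np.mean(tissue_map > 0))
    logits = np.array([1.0 - frac_pathological, 0.3, 0.4, frac_pathological])
    probs = np.exp(logits) / np.exp(logits).sum()
    return dict(zip(REFERRALS, probs.round(3)))


if __name__ == "__main__":
    scan = np.zeros((64, 128, 128))            # dummy OCT volume
    tissue_map = segment_oct(scan)             # inspectable intermediate "map"
    print(classify_tissue_map(tissue_map))     # e.g. {'routine': 0.3, ...}
```

The design point carried over from the article is the inspectable intermediate map: clinicians can check what the first network "saw" before deciding how much weight to give the second network's percentages.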

“That was the concept car,” Keane says. Now they are going to production model stage – “the much harder ‘code to clinical’ phase” – which demands rigorous clinical trials and regulatory approval before being used in practice.

OCT scanners – “the hardware” – are now in every eye hospital in the UK and Ireland and are being rolled out in community clinics. When someone is seen by an optometrist, there is a 50:50 chance the practice uses the technology.

A welcome “spin-off” is the ability to identify those most at risk of further deterioration: a second AI system estimates the likelihood of the disease developing in a patient’s “good eye”. The disease is prevalent among white Irish people, including many Irish emigrants to Britain whom he sees as patients.

Inspiration

Elaine Manna, now in her 70s, has been the inspiration for their AI research, he says. She noticed problems with her sight when the television began "going wavy" and it felt like she was wearing glasses covered in raindrops. She was diagnosed with wet AMD in 2000 and eventually went blind in her left eye.

Problems in her right eye were diagnosed late, and she was transferred to Moorfields after being told she would otherwise have to wait six weeks to be seen. Since 2013 she has received treatment there and the sight in her right eye has been saved. She now has regular injections, which stop the vessels growing or bleeding, and has become a great backer of the AI solution for others. “Our North Star was always patient benefit,” Keane adds.

Moorfields patient Elaine Manna with Dr Pearse Keane, who says Elaine was the inspiration for their AI research.

Problems, however, are compounded by many “false positives”. Out of 7,000 urgent AMD referrals in 2016, only 800 had the disease. Ideally those with AMD need to be treated within two weeks, he says. “If it was my mother I would want to see her within six days.”

AI is not going to replace consultants. “No one is going to be having an injection in their eye or surgery purely on the basis of the machine,” Keane says. This “instant triaging process” is about getting people in front of the appropriate specialist at the earliest possible point.

A new frontier is investigating use of retinal images to design a diagnostic technique to identify neurodegenerative conditions.

“We know from small studies there’s retinal degradation in people with Alzheimer’s,” he says. It may also be the case with cardiovascular disease, Parkinson’s and multiple sclerosis. Having such a vast amount of data to work on, “you could get excited deep learning will be successfully applied”.

The AlzEye project seeks to examine changes to people’s eyes over time, and correlate these with emergence of Alzheimer’s disease in the same patient. Building such a dataset linking more than a million anonymised scans with the UK’s National Health Service (NHS) central database is immensely challenging ethically. It’s possible, Keane believes, as Moorfields has the technical infrastructure to aggregate data and rigorous governance on use of personal information.

NHS data shows ophthalmology was the busiest outpatient specialty in the UK during 2019-2020, with 7.9 million attendances over the year; the next busiest was trauma and orthopaedics with 7.4 million. This helps explain why patient safety is being undermined by delays, with indications the problem is worse in Ireland, he says.

Increased sight loss

The prevalence of AMD stands out; it has been estimated that 25 per cent of people over 60 in the European Union have early or intermediate forms of the disease. Given demographic trends and the increased incidence of diabetes, there is every likelihood of increased sight loss and blindness. The sheer volume of patients underlines the need for novel solutions, he adds.

On the impact of screens on modern lives, Keane says in a sense his parents' much-repeated "don't sit so close to the TV" advice applies. Sitting in front of a screen for long periods is not going to do too much damage, he says, but it is unquestionably leading to shortsightedness (myopia) – already a particular problem in Asia and among young people. "It can lead to problems later in life."

He has no doubt a survey of any batch of university students would show many are shortsighted. He believes it may also be related to “not getting enough outdoor time, absence of natural lighting and sitting in a hunched position for long periods in front of a screen”.

Ophthalmologist Dr Pearse Keane analyses an optical coherence tomography scan, which provides a detailed map of the eye.

Keane believes ophthalmology could be the first branch of medicine to be fundamentally reinvented through the application of AI, particularly in the context of 70 to 80 per cent of sight loss being preventable by early detection alone. His work not only indicates what is possible when clinicians and technologists work together, it has enabled one giant leap to be taken.

Following the initial breakthrough, Keane remarked: “Ultimately, if something produces better outcomes and is better for our patients, then it should be irrelevant whether it’s a human or a machine or a fusion of the two that achieves that.” Through human endeavour, the machine is on the brink of delivering much more.

The charity Fighting Blindness hosts the Retina 2020 conference virtually on November 6th and 7th. Over 21 years, it has earned an international reputation for highlighting the latest developments on diagnostics and new treatments/cures for sight loss.

The first day features leading ophthalmologists including Dr Pearse Keane and is aimed at clinicians and scientists. The second day is an opportunity for people with a vision impairment – and their families – to hear directly from experts of advances being made, and to ask questions.