Artificial intelligence (AI) tool checks retinal scans for heart disease and stroke risk: AI-enabled eye scan delivers 10-year risk scores

A team of researchers in the UK has developed a fully automated artificial intelligence-enabled system that can scan retinal images for vascular health, helping identify those at high risk of heart disease and stroke.

The old adage “the eyes are windows to the soul” is not that far off when considering how much one can infer about a person’s general health by studying their eyes.  Diseases such as rheumatoid arthritis and hyperthyroidism can be detected in the eyes, and recent innovations suggest neurodegenerative diseases like Alzheimer’s and Parkinson’s could be diagnosable through retinal scanning.

Considering how sensitive blood vessels in the eye can be to general cardiovascular changes, researchers have long studied the relationship between retinal features and conditions such as diabetes or coronary artery disease. With the advent of computerized systems that can automatically identify minute differences in a person’s retinal vasculature, researchers are now closing in on a new kind of diagnostic tool.

The AI system is dubbed QUARTZ (QUantitative Analysis of Retinal vessels Topology and siZe), and a new study put the algorithm to the test on more than 88,000 retinal images from two large ongoing population health studies. Each person included in the study had an average of seven to nine years of follow-up data, allowing the researchers to evaluate the predictive capacity of the system.

The results showed the AI-driven system, when incorporating age, sex, smoking status and medical history, could deliver 10-year risk scores for stroke and heart disease comparable to those from one of the most commonly used risk-assessment tools, the Framingham Risk Score (FRS). Because FRS calculations require blood tests and blood pressure measurements, an automated eye-scanning technique that reaches similar conclusions with less effort could allow more people to be monitored if the technology were widely deployed.

The researchers do note that the majority of retinal images evaluated in the study were captured by “non-expert personnel,” so the results could potentially be improved with higher-quality imaging from healthcare professionals. On the other hand, the efficacy of less sophisticated retinal imaging points to the possibility of this kind of technology being incorporated into a smartphone app.

In an accompanying commentary, two University of Dundee researchers who did not work on the new study discuss the clinical implications of this kind of tool. They suggest serious discussions are needed to work out how this diagnostic information could affect clinical practice.

Should these tools be restricted to ophthalmologists or doctors? If a patient is flagged as high-risk by one of these tools, what clinical follow-ups are required? What would widespread availability of a tool like this mean for current healthcare systems? Would new patients suddenly flood clinics over risks flagged by a smartphone app?

“What is now needed is for ophthalmologists, cardiologists, primary care physicians and computer scientists to work together to design studies to determine whether using this information improves clinical outcome, and, if so, to work with regulatory bodies, scientific societies and healthcare systems to optimize clinical workflows and enable practical implementation in routine practice,” the commentary concludes.

The new research was published in the British Journal of Ophthalmology.

REFERENCE: Rich Haridy; New Atlas; 6 October 2022