Retinal Images are Weirdly Predictive

Original article was published by Andrew Hetherington on Artificial Intelligence on Medium



Keep your eyes open for these developments in deep learning

Photo by nrd on Unsplash

If you’ve ever been to see an ophthalmologist, you’ve probably undergone a routine procedure in which a specialist takes a picture of the back of your eye.

You will not be surprised to hear that retinal images are rather handy for diagnosing eye diseases. However, you may not have expected that they can also provide a lot of insight into a person’s risk of cardiovascular disease. Retinal imaging is a non-invasive way to examine the condition of someone’s blood vessels, which may be indicative of that person’s wider cardiovascular health.

If you’ve seen one of these retinal images before, you’ll probably be able to point out the optic disc and the various blood vessels (if you haven’t, try sticking “retina image” into Google — or “fundus”, which is the medical term for the back of the eye).

A doctor will be able to go one step further by identifying abnormalities and suggesting features that may warrant further investigation or treatment.

However, feed one of these images to a machine and it’ll be able to predict:

  • how old you are;
  • your gender;
  • your ethnicity;
  • whether or not you smoke; and even
  • what you had for breakfast that morning.

Okay, so I may have made that last one up, but remarkably, the rest are true. Make no mistake, retinal images are weirdly predictive.

The eyes have it

Researchers at Google wrote a 2017 paper setting out an investigation into how deep learning could be used to predict a range of cardiovascular risk factors from retinal images. The paper briefly explains the more traditional approach to medical discovery: first observing associations and correlations between potential risk factors and disease, and only then designing and testing a hypothesis. Ryan Poplin et al. then go on to demonstrate how deep learning architectures can pick up these associations by themselves, without being told what to look for.

I’m sure we’ve all heard at some point the assertion that certain medical specialists are going to be replaced by AI algorithms that will be able to outperform them at recognising abnormalities in medical images. This research takes things in a slightly different direction — not seeking to outperform doctors at an existing task but to see what new information machines can glean from these particular images.

Early on in their research, the team found that their model was remarkably good at predicting variables like age and gender — so much so that they initially thought that it was a bug in the model (Ryan walks us through how the project developed on TWiML talk 112). But as they looked further into things, they discovered that these were real predictions. Not only that, they were incredibly robust ones as well — age, for example, could be successfully predicted with a mean absolute error of 3.26 years.
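For a concrete sense of what that 3.26-year figure means: mean absolute error is simply the average size of the prediction errors. A minimal sketch using made-up ages (illustrative values only, not data from the paper):

```python
import numpy as np

# Hypothetical true ages and model predictions (illustrative values only,
# not data from the paper).
true_ages = np.array([54, 61, 47, 70, 58], dtype=float)
pred_ages = np.array([51, 63, 49, 66, 60], dtype=float)

# Mean absolute error: the average magnitude of the errors, here in years.
mae = np.mean(np.abs(true_ages - pred_ages))
print(f"MAE: {mae:.2f} years")  # errors of 3, 2, 2, 4, 2 years average to 2.60
```

So an MAE of 3.26 years means the model’s age estimate was, on average, a little over three years off.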

A number of other associations were found, and it turned out that the team could obtain better predictive power than their baseline model across all kinds of variables including blood pressure, blood glucose levels and even ethnicity — all risk factors for cardiovascular disease.

After observing these results, the team reasoned that if this range of cardiovascular risk factors could be predicted so well, the model might also have predictive power when it came to identifying which patients were most likely to suffer a major cardiovascular event (e.g. a stroke or heart attack) in the future. Despite some limitations in their training data, a model trained only on retinal images (with no explicitly given risk factors) achieved an AUC of 0.70 (skim the ROC/AUC section of this article to learn more about AUC as a performance metric), which is especially impressive compared to the 0.72 obtained by an existing risk scoring system that uses many more input variables.
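As a quick refresher on that metric: AUC is the probability that the model scores a randomly chosen positive case higher than a randomly chosen negative one, so 0.5 is no better than chance and 1.0 is perfect ranking. A minimal sketch with made-up labels and scores (illustrative values only, not data from the paper):

```python
import numpy as np

# Hypothetical labels (1 = had a cardiovascular event) and model risk scores
# (illustrative values only, not from the paper).
y_true = np.array([0, 0, 1, 0, 1, 1, 0, 1])
scores = np.array([0.1, 0.4, 0.35, 0.8, 0.65, 0.9, 0.2, 0.7])

def auc(y_true, scores):
    """AUC as the probability that a random positive outscores a random
    negative (ties count as half), computed by pairwise comparison."""
    pos = scores[y_true == 1]
    neg = scores[y_true == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

print(f"AUC: {auc(y_true, scores):.3f}")  # 12 of 16 positive/negative pairs
                                          # are ranked correctly -> 0.750
```

The pairwise definition used here is equivalent to the area under the ROC curve; libraries like scikit-learn compute the same quantity from the curve itself.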

More than just windows to the soul

Photo by Liam Welch on Unsplash

In the TWiML podcast mentioned earlier, Ryan speculates about a possible future in which retinal images are taken as vital signs to give a picture of overall patient health, rather than being used only to diagnose eye diseases. As we have seen, this isn’t just fantasy: this straightforward, non-invasive procedure could give a much broader snapshot of a patient’s health than we might previously have expected.

To conclude: cardiovascular disease remains the leading cause of death worldwide, but 80% of premature heart disease and stroke is preventable. Research like the paper discussed above can help us better understand who is at highest risk of cardiovascular disease and how those groups can best be managed. Appropriate early interventions could go a very long way towards extending lives and improving their quality.