
Retinal scan - what happens after the photos are taken?

Status
This thread is now closed. Please contact Anna DUK, Ieva DUK or everydayupsanddowns if you would like it re-opened.

helli

Well-Known Member
Relationship to Diabetes
Type 1
I have just come back from my annual retinal scan where I had a chat with the scanner about how they check the photos.
The process is different in England to Scotland (sorry, I didn't ask about Wales or Northern Ireland).
In England, they take two photos per eye at different angles. In Scotland they take one photo per eye.
The English photos are checked by at least three humans: an initial check, a second check and then a "blind check". If all three checks give the same result, that's it. If they differ, the photos are reviewed by another person. And if any check sees something "serious", the photos are sent to a more senior checker to review.
Then the results are sent off.
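
For anyone who likes to see the logic spelt out, the workflow above could be sketched roughly like this. This is purely illustrative - the function names and grade labels are invented, not the actual screening software:

```python
# A rough sketch of the English grading workflow described above.
# Purely illustrative - the names and grades are invented, and this is
# not how the actual NHS screening software works internally.

def final_grade(photos, grader_1, grader_2, blind_grader, arbitrator, senior):
    """Return one patient's result from the three-grader workflow."""
    grades = [grader_1(photos), grader_2(photos), blind_grader(photos)]

    # Anything "serious" goes straight to a more senior checker.
    if "serious" in grades:
        return senior(photos)

    # If all three checks agree, that's it.
    if len(set(grades)) == 1:
        return grades[0]

    # Otherwise another person reviews the disagreement.
    return arbitrator(photos)

# Toy usage: three graders who all see background retinopathy.
agree = lambda photos: "background"
print(final_grade(["left_1", "left_2"], agree, agree, agree, agree, agree))
# -> background
```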

In Scotland, they use AI software developed by Google to do the checks.
The algorithm is unable to correlate the two views per eye taken in England, resulting in too many false positives and false negatives.
Research is ongoing to adopt the Google software in England, but the AI needs more "I" (intelligence).
The chap I was talking to believes the development and approval of this for England is four or five years away.
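
For anyone not used to the jargon: a "false positive" is the software flagging retinopathy that isn't there, and a "false negative" is it missing retinopathy that is. A toy illustration of counting both, with entirely made-up data - nothing to do with the real screening software:

```python
# Toy example of counting false positives and false negatives
# when comparing software output against human grading.
# The data below is invented purely for illustration.

predictions  = ["retinopathy", "clear", "retinopathy", "clear", "clear"]
ground_truth = ["clear", "clear", "retinopathy", "retinopathy", "clear"]

false_positives = sum(p == "retinopathy" and t == "clear"
                      for p, t in zip(predictions, ground_truth))
false_negatives = sum(p == "clear" and t == "retinopathy"
                      for p, t in zip(predictions, ground_truth))

print(false_positives, false_negatives)  # -> 1 1
```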

Whilst there is double (and sometimes triple) checking, I can understand how a minor aberration on a scan could produce differing results, and may contribute to the common scenario of alternating between a diagnosis of background retinopathy one year, all clear the next, background the next, all clear, .....
 
I'm a lot more interested (and always have been) in whether my 'background changes' are near enough to the bit of my eye(s) that actually affects eyesight - the macula, is it? - than in the fact that I have them at all, cos that's the 'real' difference in my understanding (between me going "right, I won't lose any sleep over that then" and panicking).
 
Well, I don’t know. All I do know is my retinal screening was perfectly normal in Scotland, and it’s perfectly normal in England.

The problem with Google AI is that nobody knows what it is looking for. Being AI, it constructs its own rules and algorithms. A few years back they tried a similar project with skin growths, to identify melanomas and basal cell carcinomas. It was trained on photos taken previously of various skin problems, including melanomas and BCCs. It worked well for a time, then failed.

Why did it fail? Because the pictures of BCCs and melanomas it was trained on mostly had rulers in them, indicating the size of the lesion. All it ended up looking for was rulers. That was easy to check: they just showed it a picture of a normal patch of skin with a ruler included.

A similar thing happened in the early days, with an AI trained to distinguish between photos of a wolf and a husky. That failed after a while too; it had just ended up looking for snow in the photos.
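
The standard way to expose that kind of "shortcut learning" is exactly what they did with the ruler: present the shortcut on its own, or the real subject without the shortcut, and see what the model says. A toy sketch of the idea (the classifier below is invented for illustration, not any real model):

```python
# Rough sketch of how you'd expose "shortcut learning".
# This toy classifier is invented purely for illustration: it
# "diagnoses" melanoma whenever a ruler is present, like the failed model.

def toy_classifier(image_features):
    return "melanoma" if "ruler" in image_features else "benign"

# The check: healthy skin photographed with a ruler should come back benign...
print(toy_classifier({"ruler", "normal skin"}))  # -> melanoma (shortcut exposed)
# ...and a real lesion with no ruler should still be caught.
print(toy_classifier({"lesion"}))                # -> benign (missed)
```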

So what is the AI looking for in the photographs? As I said, nobody knows.

But what I do know is that ophthalmologists tend to use white marker arrows to point out areas of abnormality for future reference. I assume Google has learned not to make life easy for its AI by letting it learn from such material.
 