How do I…
Engage with AI as an optometrist?
Dr Peter Hampson, clinical director at the AOP, provides an introduction to the often complicated subject of artificial intelligence (AI)
30 May 2023
John McCarthy, a renowned computer scientist from Stanford University, defined artificial intelligence (AI) as: “the science and engineering of making intelligent machines, especially intelligent computer programs. It is related to the similar task of using computers to understand human intelligence, but AI does not have to confine itself to methods that are biologically observable.”
AI in optometry practice
The short answer to whether any optometrists are already using AI software in practice is yes, but this might be so simple as to be slightly misleading. There is a natural instinct to think AI is new, but it isn't; in fact, many optometrists will have been using basic versions of AI for many years. Take, for example, visual fields testing and the algorithms commonly built in to speed up the testing process. Perhaps surprisingly, this is an example of AI.
We are now starting to see the first commercially available products that can help to analyse optical coherence tomography (OCT) data and spot defects. You might have this or similar software in your practice already.
Ethical concerns
Again, the question of ethical concerns when it comes to AI has a somewhat complex answer. The short answer is yes. However, the ability to do anything about it isn’t necessarily going to be in the control of AOP members. The ethical concerns around AI involve much wider issues that society is grappling with.
Some of the risks in this area include:
- Bias in the AI software, based upon the training data used. Is your patient accurately represented within the training data the AI has used to ensure that the answer given is correct? This may not always be the case
- Access for all parts of society, regardless of ability to pay. A number of emerging AI applications involve the assessment of OCT scans, but what if the patient can’t afford the scan? This might mean they cannot benefit from the earlier diagnosis that OCT, in combination with AI, may bring
- The risk of not understanding the technology, or its limitations. Specsavers now refers to AI internally as ‘supported decision making.’ But when do you pay attention to the advice the AI offers, and when do you ignore it? What does ‘being a professional’ mean if you start to defer decision making to technology? This is another question that does not have a straightforward answer
- Regulation. The Government has recently launched an AI regulation white paper which appears to favour a light touch approach. This is linked to the previous point: if you do not understand what the AI technology is doing and it ultimately gets it wrong, who should be responsible? The technology provider, or the professional utilising it?
These are the questions that as a profession and as a society we must address if we are to make use of AI and not simply find ourselves being used by it. Of course, this means a longer and more in-depth conversation than one that is likely to take place between optometrists at a practice level.
Comments (3)
MDCET, 03 June 2023
I think it is a good thing that society is exercising due caution about how to use AI and I think Optometry Schools could help by making this more prominent in the curriculum.
Learning about Bayes' rule may be the simplest way to start. This rule is over 250 years old, was the subject of three introductory articles written for Ophthalmic and Physiological Optics by Aspinall and Hill about 40 years ago, and its use for better interpretation of clinical test findings is clearly described in the first chapter of Prof David Elliott’s widely adopted textbook on Clinical Procedures in Primary Eye Care.
In my opinion, application of Bayes' rule is no more mathematically demanding than what first year undergraduate optometrists already have to master with step-along ray tracing calculations.
A free and open source platform called Orange Data Mining is available for everyone and was designed by engineers and educators to teach students at school and university how to build, test and use their own AI based on Bayes' rule and many other algorithms such as Logistic Regression and Neural Networks. It does not require computing skills. I worked with undergraduates and postgraduates at Aston University who really got to grips with developing their own AI clinical decision 'recommendation engines'.
The great thing about using Bayes' rule is that it can utilize parameters like sensitivity and specificity, that optometrists are already acquainted with, to generate Bayes' factors (or Likelihood Ratios) that can reveal the relative importance of different clinical signs and symptoms and can drive AI recommendations for diagnoses and management plans. Orange Data Mining makes simple work of judging the reliability of these recommendations through the use of techniques like cross-validation.
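The likelihood ratio calculation the comment describes can be sketched in a few lines. This is a minimal illustration of the odds form of Bayes' rule; the sensitivity, specificity, and pre-test probability figures below are made-up teaching numbers, not clinical values from any study.

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios (Bayes' factors)
    derived from a test's sensitivity and specificity."""
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

def post_test_probability(pre_test_prob, lr):
    """Update a pre-test probability using the odds form of Bayes' rule:
    post-test odds = pre-test odds x likelihood ratio."""
    pre_odds = pre_test_prob / (1 - pre_test_prob)
    post_odds = pre_odds * lr
    return post_odds / (1 + post_odds)

# Illustrative example: a test with 85% sensitivity and 90% specificity,
# applied to a patient with a 5% pre-test probability of disease.
lr_pos, lr_neg = likelihood_ratios(0.85, 0.90)
p_after_positive = post_test_probability(0.05, lr_pos)
# A positive result raises the probability from 5% to roughly 31%,
# showing how a single sign's LR quantifies its diagnostic weight.
```

A sign with a large positive likelihood ratio shifts the probability substantially, while one with an LR near 1 adds little, which is exactly how such ratios can rank the relative importance of different clinical findings in a recommendation engine.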
AI recommendation engines provide probability values for differential diagnoses and alternative management plans. So, students learn that AI provides individualized precision evidence-based practice. Here, the optometrist bases their final clinical decision on their own experience, their patient's preferences and point of care evidence-based recommendations provided by AI. As such, AI is just part of the decision making process. It does not replace the practitioner and so the legal responsibility for the final decision remains with the practitioner.
Introducing optometry students to creation of their own AI recommendation engines can also make use of huge amounts of untapped clinical data collected in Optometry School clinics. Students are then faced with the ethical concerns of doing this (and a trip to the AI NOW website shows that researchers have been debating the social consequences of AI for many years now) along with an appreciation of the inherent strengths and weaknesses of the AI recommendation engines themselves (i.e. the 'rubbish in = rubbish out' idea).
Maybe it is time for more discussion about AI on the optometry curriculum.
Mark Dunne, Visiting Senior Lecturer, Aston University
(I do not have any proprietary interest in the Orange Data Mining platform)
Anonymous, 01 June 2023
When bringing up the topic of AI, it would be helpful to those lacking understanding to break it down a bit more: AI can be implemented in simple testing and parameterisation outputs, as we know from VF and OCT. However, it is different if we look at using AI in conjunction with clinical data and imaging data to predict outcomes. The same can be said for AI image analysis: it can simply be a programme recognising and identifying arteries and veins, or it can go well beyond that depending on its purpose. But to appreciate such a system, it is necessary to also show its performance, and some of that is still lacking.
AI is not mysterious, but how we as optoms can use it in clinical practice will depend a lot on what outputs it can produce, how safe they are, regulatory and input issues, and more.
On top of that, discussions about its use for image analysis, and how this could improve care for remote and hard-to-reach communities as well as those in low-income countries, have yet to confront the reality: image quality is crucial. Even publications using "research"-curated images such as the UK Biobank are still only able to analyse 80% of the images, the rest being of "insufficient quality", and this needs to be addressed.
As with many other methods, quality input leads to quality output; otherwise, rubbish in = rubbish out.
Anonymous, 01 June 2023
On the notion of who is responsible for clinical decisions: it is still the practitioner, not the AI algorithm provider. This is both a hindrance and a safety measure because, no matter what methodology is used to train the system, experience and clinical judgement are not as simply implemented.