touchOPHTHALMOLOGY
Artificial Intelligence, Cataract Surgery
Watch Time: 7:56 mins

Mark Packer, ASCRS 2023: avoiding postoperative complications using artificial intelligence

Published Online: Jul 10th 2023

Data that can be collected and interpreted by machine learning can be extremely helpful for identifying postoperative complication risks, according to Dr Mark Packer (Packer Research Associates, Inc., Boulder, CA, USA). Dr Packer talks to touchOPHTHALMOLOGY about the potential of this emerging technology in cataract surgery.

The presentation, ‘Predicting Vision Outcomes in Cataract Surgery with Machine Learning’, was delivered at the American Society of Cataract and Refractive Surgery meeting, 5–8 May 2023.

Questions:

 

  1. How can artificial intelligence help prevent postoperative complications? (0:58)
  2. What is the future for artificial intelligence in eye care? (5:13)

 

Disclosures: Mark Packer has nothing to disclose in relation to this video interview.

Support: Interview and filming supported by Touch Medical Media. Interview conducted by Lisa Glass.

Filmed as a highlight of ASCRS


Transcript

So, I was in private practice for many years, up until about 2012, working as a principal investigator, key opinion leader and so on, publishing and speaking around the world. But over time, I became more and more involved in clinical research and really enjoyed doing that, and so that’s primarily what I do now. I’m involved in clinical research projects all around the world, mostly anterior segment, but extending from dry eye all the way to, more recently, retinal diseases.

How can artificial intelligence help prevent postoperative complications?

That’s a great question, and it’s a somewhat different area from what I’ve been talking about, because I’ve been talking more about patient satisfaction and choosing the right lens for the right patient. That kind of assumes that everything goes well, but we know that’s not the case; not everything goes well all the time, and we do have complications.

And we know there are some known risk factors for these complications, right? We know that if someone has a wobbly cataract that’s shaking back and forth in the eye, that’s a danger sign going in. But there are other, perhaps more subtle factors, like a shallow anterior chamber or a short axial length, that could predispose to corneal edema.

And so part of what AI can do is, again, to integrate all those biometric factors that we’re collecting anyway and come up with some kind of risk category based on that. Maybe it’s just high, medium and low, or maybe it’s an actual percentage saying, patients like this in your practice have a nine percent chance of capsular rupture. That would be very important to know, and it could actually pop up on my screen when I’m talking to the patient after all their data has been collected. Most surgeons now do all the biometry first, before they see the patient, so you have all that data and can give a more intelligent assessment of the patient’s potential outcomes.

So now, through AI and looking at thousands of cases that I’ve personally performed over the years, as the data grows and grows within this sort of robot brain, it can say that patients just like this have ruptured capsules nine percent of the time. So now I can tell that person, look, this is the reality: you have roughly a one in ten chance of this complication. And I can probably also tell them what the outcomes are. Oh, and by the way, ninety percent of those people end up with vision better than 20/40. So they’re doing pretty well, not perfect, but well enough to drive, or to read without reading glasses, or whatever it is. Right?
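As a rough illustration of the kind of biometric risk stratification described above, here is a minimal sketch. Every threshold, baseline risk and feature name is a hypothetical assumption for demonstration, not a clinical value or any real model’s output.

```python
# Hypothetical sketch: stratifying capsular-rupture risk from routine
# biometric inputs. All numbers below are illustrative assumptions,
# not clinical cut-offs.

def rupture_risk(axial_length_mm: float,
                 anterior_chamber_depth_mm: float,
                 phacodonesis: bool) -> tuple[str, float]:
    """Return a (category, estimated probability) pair."""
    risk = 0.02  # assumed baseline risk
    if phacodonesis:                      # the "wobbly cataract" danger sign
        risk += 0.06
    if axial_length_mm < 21.0:            # assumed short-eye threshold
        risk += 0.02
    if anterior_chamber_depth_mm < 2.5:   # assumed shallow-chamber threshold
        risk += 0.01
    category = "high" if risk >= 0.08 else "medium" if risk >= 0.04 else "low"
    return category, risk
```

In practice a learned model would replace these hand-set increments, but the interface, routine biometry in, a risk category or percentage out, is the idea the speaker describes.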

So really, all of these data that can be collected and interpreted by machine learning can be so helpful to us in terms of actually knowing our risks. The big stumbling block has always been data collection on the back end. We want to collect the data on the front end because we need it to do the cataract operation; there’s no way around it. Without an axial length, I can’t choose an IOL. I need that. It’s absolutely critical, and so my technicians are going to take the time to collect that data and put it into our IOL calculation platform.

But on the back end, it’s been problematic because it’s not seen as required. I don’t really need to know whether the patient’s thrilled, delighted or unhappy. It would be nice to know, but I’m not going to spend time sending a full validated questionnaire to that patient to get that response; it’s too cumbersome. Similarly, putting postoperative refractive data into a computer program takes technician time, or it has in the past, to type it in.

What would be optimal is if the electronic health record were linked directly to the machine learning algorithm, and the patient feedback were collected easily, using a simple rating system like the one on a restaurant reservation website. How many stars do you give this restaurant? It’s super easy, and people are used to doing it.

So let’s employ that as well. And now we can get important data on safety. Did they have a complication? That’ll be in the electronic health record.

You know, are they satisfied? That will be in their own ranking that they can do conveniently at home. And all of this can go together to help us improve our outcomes.
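The back-end linkage described here, a complication flag from the EHR joined with a star rating the patient submits at home, could be sketched roughly as follows. The record fields and summary logic are assumptions for illustration, not any real EHR schema or vendor integration.

```python
# Hypothetical sketch of joining back-end outcome data: a complication
# flag pulled from the electronic health record plus a 1-5 star rating
# submitted by the patient at home.

from dataclasses import dataclass

@dataclass
class Outcome:
    patient_id: str
    complication: bool   # safety data, from the EHR
    stars: int           # 1-5 satisfaction rating, collected at home

def summarize(outcomes: list[Outcome]) -> dict:
    """Aggregate safety and satisfaction across a practice."""
    n = len(outcomes)
    return {
        "complication_rate": sum(o.complication for o in outcomes) / n,
        "mean_stars": sum(o.stars for o in outcomes) / n,
        # patients who said "okay" in clinic but rated themselves 2/5 or worse
        "flagged": [o.patient_id for o in outcomes
                    if o.stars <= 2 and not o.complication],
    }
```

The "flagged" list is one way to surface the mismatch the speaker describes later: a patient recorded as "doing okay" whose own rating is really two stars out of five.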

What is the future for artificial intelligence in eye care?

Well, the future is huge. I mean, this is a very hot topic right now.

We just had our first digital day at ASCRS in San Diego this year, so there were some very interesting presentations, really around a lot of different platforms, but everyone’s very interested in this. How can this help? I mean, we’re really in this era now where we’re thinking, you know, machine learning can help with so many things.

Let’s really try to put that to use. So all the way from diagnostics and risk stratification to patient satisfaction.

I mean, it’s not just cataract surgery, right? It’s also treatment of dry eye disease, for example, where we have a range of different treatments, and we’re looking for what would be optimal for this particular patient with this configuration of signs and symptoms. Maybe we can learn that if your OSDI score is 32 and you’ve got inferior corneal staining, then the best treatment is x, y and z. I don’t know.

But that could really come out of something like this. And to me, the ability to incorporate patient feedback directly is key, especially with these more subjective types of findings, like intraocular lens selection or the degree of comfort you have with dry eye. That’s been relegated to a thirty-second interview when the patient comes back to the office after three months: how are you doing? Well, I’m doing okay.

The tech writes down: patient doing okay. Now the doctor comes in and interprets that as, well, they’re fine. I don’t need to do anything else.

We’ll just continue. But really, that ‘okay’ may have been only two out of five stars, really not so great. An Uber driver wouldn’t be happy with a rating like that.

And so I really look forward to being able to incorporate patient feedback in real time, in a convenient way that they can do at home, not just this kind of snapshot at random moments when they come back to the office.

We’re similarly looking at ongoing ways to measure intraocular pressure at home and get a much more realistic picture of how well our glaucoma treatments work, rather than just one snapshot every six months at the office. We don’t know what’s going on the rest of the time. Are they even using their eye drops? I don’t know. Maybe they’re just using them the day before they come in. Things like that could be really helpful and can be incorporated thanks to our ability now to have digital, real-time feedback.

Subtitles and transcript are autogenerated

 
