Key points from the article:
A new study published in Annals of Surgery introduces an AI-powered system that can identify surgical incisions and detect potential infections from wound photos submitted by patients. Developed by researchers at the Mayo Clinic, the model was trained on over 20,000 images from 6,000 patients across nine hospitals. The project, led by Dr. Hala Muaddi (first author) and Dr. Cornelius Thiels (co-senior author), aimed to improve postoperative care by enabling faster, remote assessment of surgical wounds—an increasingly important need with the rise of outpatient and virtual care.
The AI model, based on a Vision Transformer architecture, first determines whether an image contains a surgical incision and then evaluates it for signs of infection. It achieved 94% accuracy in detecting incisions and an AUC of 0.81 in identifying infections. This performance makes the tool promising for triaging patient-submitted images, helping clinicians focus their attention on cases that need urgent review.
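The two-stage triage flow described above can be sketched in a few lines of Python. Note that this is an illustrative outline only: the stub `incision_model` and `infection_model` functions, the 0.5 threshold, and the sorting scheme are assumptions for the sake of the sketch, not the study's actual classifiers or published logic.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class TriageResult:
    image_id: str
    is_incision: bool
    infection_prob: float  # only meaningful when is_incision is True

def triage(images: List[Tuple[str, Dict[str, float]]],
           incision_model: Callable, infection_model: Callable,
           incision_thresh: float = 0.5) -> List[TriageResult]:
    """Two-stage screen: keep only incision photos, then score infection risk."""
    results = []
    for image_id, pixels in images:
        # Stage 1: is this photo actually a surgical incision?
        p_incision = incision_model(pixels)
        if p_incision < incision_thresh:
            results.append(TriageResult(image_id, False, 0.0))
            continue
        # Stage 2: does the confirmed incision show signs of infection?
        p_infection = infection_model(pixels)
        results.append(TriageResult(image_id, True, p_infection))
    # Surface the highest-risk confirmed incisions first for clinician review.
    results.sort(key=lambda r: (r.is_incision, r.infection_prob), reverse=True)
    return results

# Toy stand-ins for the trained models (hypothetical, not the study's weights):
incision_model = lambda px: px["incision_score"]
infection_model = lambda px: px["infection_score"]

queue = triage(
    [("img_a", {"incision_score": 0.90, "infection_score": 0.8}),
     ("img_b", {"incision_score": 0.20, "infection_score": 0.0}),
     ("img_c", {"incision_score": 0.95, "infection_score": 0.1})],
    incision_model, infection_model)
```

Run on the three toy images, the queue puts the confirmed incision with the highest infection score first and the non-incision photo last, which mirrors the triage role the authors envision for the system.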
The team envisions the AI system as a frontline screening tool that can support clinicians in early infection detection, possibly even before symptoms become visually obvious. This could help reassure patients sooner or prompt early interventions when needed—particularly beneficial in rural or resource-limited settings where access to specialists may be delayed.
While the results are encouraging and the model showed consistent performance across different demographic groups, the researchers note that further clinical validation is necessary. Still, the study lays the groundwork for AI-assisted remote wound monitoring, potentially transforming how surgical patients are cared for after they leave the hospital.