Advancing Surgeon Training Through Data Science
Crowdsourcing appears to be a scalable method for capturing annotations of surgical tools in surgical video images, which could help advance surgical training, said Tae Soo Kim, MSE, of the Department of Computer Science at Johns Hopkins University, during the American Society of Cataract and Refractive Surgery's 2018 annual meeting in Washington, DC.
Participants included 11 of 19 individuals on Amazon's Mechanical Turk platform who were trained to annotate images. Investigators assessed the reliability of more than 1,900 annotations of 200 images from 2 cataract surgeries, covering 6 tool types: keratome, cystotome, forceps, anterior chamber cannula, irrigation-aspiration cannula, and phacoemulsification probe. Among the results:
- Moderate inter-annotator reliability in recognizing different tool types was observed.
- Tool type selection accuracy was 97% for all tools except the cystotome (38%).
- Average pixel errors for the 6 tool types were approximately 15, 53, 13, 14, 9, and 8 pixels, respectively.
- The cystotome appeared similar to the anterior chamber cannula in still images, which likely played a role in errors.
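To illustrate the pixel-error metric reported above: a natural way to score a crowd annotation against a reference location is the mean Euclidean distance, in pixels, between each annotator's click and the reference point. The sketch below is an illustration only, not the study's actual pipeline; the function name, the click coordinates, and the reference point are all hypothetical.

```python
import math

def average_pixel_error(annotations, reference):
    """Mean Euclidean distance (in pixels) between crowd-sourced
    annotation points and a reference point for one tool.
    Illustrative only; not the presenters' actual scoring code."""
    return sum(math.dist(point, reference) for point in annotations) / len(annotations)

# Hypothetical (x, y) clicks from three annotators marking one tool tip,
# scored against a hypothetical reference location of (100, 100).
clicks = [(110, 100), (100, 112), (91, 100)]
print(round(average_pixel_error(clicks, (100, 100))))  # -> 10
```

Under this kind of metric, a visually ambiguous tool such as the cystotome would yield widely scattered clicks and thus a much larger average error, consistent with the ~53-pixel figure reported for it.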
Kim T. Advancing surgeon training through data science: Pilot study of cataract surgical tool annotation through crowd sourcing. Talk presented at: 2018 ASCRS-ASOA Annual Meeting; April 13-17, 2018; Washington, DC.