An Assessment Tool to Provide Targeted Feedback to Robotic Surgical Trainees: Development and Validation of the End-to-End Assessment of Suturing Expertise (EASE)

Abstract

Purpose: To create a suturing skills assessment tool that comprehensively defines criteria for the relevant sub-skills of suturing and to confirm its validity.

Materials and Methods: Five expert surgeons and an educational psychologist participated in a cognitive task analysis (CTA) to deconstruct robotic suturing into an exhaustive list of technical skill domains and sub-skill descriptions. Using the Delphi methodology, each CTA element was systematically reviewed by a multi-institutional panel of 16 surgical educators and incorporated into the final tool once its content validity index (CVI) reached ≥0.80. In the subsequent validation phase, 3 blinded reviewers independently scored 8 training videos and 39 vesicourethral anastomoses (VUA) using EASE; 10 VUA were also scored with the Robotic Anastomosis Competency Evaluation (RACE), a previously validated but simpler suturing assessment tool. Inter-rater reliability was measured with the intra-class correlation coefficient (ICC) for normally distributed scores and prevalence-adjusted bias-adjusted kappa (PABAK) for skewed distributions. Expert (≥100 prior robotic cases) and trainee (<100 cases) EASE scores from the non-training cases were compared using a generalized linear mixed model.
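As a minimal illustration of two of the statistics named above (not the authors' analysis code), the item-level content validity index and a two-rater, binary-outcome PABAK can be sketched as follows; all rating values are hypothetical.

```python
# Minimal sketch, not the authors' code; all ratings below are hypothetical.

def item_cvi(ratings, relevant=(3, 4)):
    """Item-level CVI: fraction of panelists rating the item relevant
    (e.g., 3 or 4 on a 4-point relevance scale)."""
    return sum(r in relevant for r in ratings) / len(ratings)

def pabak_binary(rater_a, rater_b):
    """Prevalence-adjusted bias-adjusted kappa for two raters and a
    binary score: 2 * observed agreement - 1."""
    agree = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
    return 2 * agree - 1

# One Delphi item rated by a 16-member panel (hypothetical values):
print(item_cvi([4, 4, 3, 4, 2, 4, 3, 4, 4, 3, 4, 4, 3, 4, 4, 4]))  # 0.9375 -> retained (>= 0.80)

# Two raters scoring the same 5 cases on a binary sub-skill (hypothetical values):
print(pabak_binary([1, 1, 0, 1, 1], [1, 1, 0, 1, 0]))  # 0.6
```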

Results: After two rounds of the Delphi process, panelists agreed on 7 domains, 18 sub-skills, and 57 detailed sub-skill descriptions, each with a CVI ≥0.80. Inter-rater reliability was moderately high (median ICC: 0.69, range: 0.51-0.97; median PABAK: 0.77, range: 0.62-0.97). Multiple EASE sub-skill scores distinguished surgeon experience. The Spearman's rho correlation between overall EASE and RACE scores was 0.635 (p=0.003).
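The EASE-RACE comparison is an ordinary rank correlation; a sketch of how such a figure could be computed with SciPy is shown below, using made-up score totals for the 10 doubly scored cases.

```python
# Minimal sketch, not the authors' analysis; score totals are made up.
from scipy.stats import spearmanr

ease_totals = [52, 61, 48, 70, 55, 66, 59, 73, 50, 64]  # hypothetical overall EASE scores
race_totals = [22, 27, 20, 30, 24, 28, 25, 29, 21, 26]  # hypothetical overall RACE scores

rho, p_value = spearmanr(ease_totals, race_totals)
print(f"Spearman's rho = {rho:.3f}, p = {p_value:.3f}")
```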

Conclusions: Through a rigorous CTA and Delphi process, we have developed EASE, whose suturing sub-skills can distinguish surgeon experience while maintaining rater reliability.

Cite As
Haque, T. F., Hui, A., You, J., Ma, R., Nguyen, J. H., Lei, X., Cen, S., Aron, M., Collins, J. W., Djaladat, H., Ghazi, A., Yates, K. A., Abreu, A. L., Daneshmand, S., Desai, M. M., Goh, A. C., Hu, J. C., Lebastchi, A. H., Lendvay, T. S., … Hung, A. J. (2022). An Assessment Tool to Provide Targeted Feedback to Robotic Surgical Trainees: Development and Validation of the End-To-End Assessment of Suturing Expertise (EASE). Urology Practice, 9(6), 532–539. https://doi.org/10.1097/upj.0000000000000344
Journal: Urology Practice
Rights: Publisher Policy
Type: Article
Version: Author's manuscript