Foundations for Studying Clinical Workflow: Development of a Composite Inter-Observer Reliability Assessment for Workflow Time Studies

Date
2019
Language
American English
Found At
American Medical Informatics Association
Abstract

The ability to understand and measure the complexity of clinical workflow gives hospital managers and researchers the knowledge needed to assess some of the most critical issues in healthcare. Given the central role of workflow time studies in influencing decision makers, major efforts are underway to address the technique's existing methodological inconsistencies. Chief among these concerns is the lack of a standardized methodology for ensuring the reliability of human observers. In this paper, we highlight the limitations of current Inter-Observer Reliability Assessments (IORAs) and propose a novel composite score for conducting them systematically. The composite score comprises a) an overall agreement based on Kappa, which evaluates naming agreement on virtually created one-second tasks and provides a global assessment of agreement over time; b) a naming agreement based on Kappa, which requires pairing observations by time overlap; c) a duration agreement based on the concordance correlation coefficient, which evaluates the correlation between task durations; d) a timing agreement based on descriptive statistics of the gaps between timestamps of same-task classes; and e) a sequence agreement based on the Needleman-Wunsch sequence alignment algorithm. We hereby provide a first step toward standardized reliability reporting in workflow time studies. This composite IORA protocol is intended to empower workflow researchers with a standardized and comprehensive method for validating observers' reliability and, in turn, the validity of their data and results.
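
The five components named in the abstract map onto standard statistical and alignment procedures. The minimal Python sketch below illustrates one way they could be computed, assuming each observer's log is a list of (task_name, start_second, end_second) tuples. The "idle" filler label, the maximum-overlap pairing rule, the alignment scoring parameters, and the toy task names are illustrative assumptions, not the published protocol's exact choices.

```python
# Hedged sketch of the composite IORA components described in the abstract.
# Assumes each observer's log is a list of (task_name, start_s, end_s) tuples
# with times in seconds; scoring details are illustrative assumptions.
import numpy as np
from sklearn.metrics import cohen_kappa_score


def per_second_labels(log, total_seconds):
    """(a) Expand a task log into one virtual task label per second."""
    labels = ["idle"] * total_seconds  # "idle" filler is an assumption
    for task, start, end in log:
        for t in range(int(start), min(int(end), total_seconds)):
            labels[t] = task
    return labels


def overall_kappa(log_a, log_b, total_seconds):
    """(a) Overall agreement: Cohen's kappa on the one-second labels."""
    return cohen_kappa_score(per_second_labels(log_a, total_seconds),
                             per_second_labels(log_b, total_seconds))


def pair_by_overlap(log_a, log_b):
    """(b) Pair each observation in A with the B observation it overlaps most."""
    pairs = []
    for task_a, sa, ea in log_a:
        best, best_overlap = None, 0.0
        for task_b, sb, eb in log_b:
            overlap = max(0.0, min(ea, eb) - max(sa, sb))
            if overlap > best_overlap:
                best, best_overlap = (task_b, sb, eb), overlap
        if best is not None:
            pairs.append(((task_a, sa, ea), best))
    return pairs


def concordance_correlation(x, y):
    """(c) Lin's concordance correlation coefficient for paired durations."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)


def needleman_wunsch_identity(seq_a, seq_b, gap=-1, match=1, mismatch=-1):
    """(e) Global alignment score of the task-name sequences, length-normalized."""
    n, m = len(seq_a), len(seq_b)
    dp = np.zeros((n + 1, m + 1))
    dp[:, 0] = np.arange(n + 1) * gap
    dp[0, :] = np.arange(m + 1) * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            score = match if seq_a[i - 1] == seq_b[j - 1] else mismatch
            dp[i, j] = max(dp[i - 1, j - 1] + score,
                           dp[i - 1, j] + gap,
                           dp[i, j - 1] + gap)
    return dp[n, m] / max(n, m)


if __name__ == "__main__":
    # Toy two-minute logs from two hypothetical observers (task names made up).
    obs_a = [("chart review", 0, 45), ("med administration", 45, 90),
             ("documentation", 90, 120)]
    obs_b = [("chart review", 0, 40), ("med administration", 42, 95),
             ("documentation", 95, 120)]

    print("overall kappa:", overall_kappa(obs_a, obs_b, 120))             # (a)
    pairs = pair_by_overlap(obs_a, obs_b)
    print("naming kappa:", cohen_kappa_score(
        [a[0] for a, _ in pairs], [b[0] for _, b in pairs]))              # (b)
    print("duration CCC:", concordance_correlation(
        [a[2] - a[1] for a, _ in pairs], [b[2] - b[1] for _, b in pairs]))  # (c)
    start_gaps = [abs(a[1] - b[1]) for a, b in pairs if a[0] == b[0]]
    print("mean start-time gap (s):", np.mean(start_gaps))                # (d)
    print("sequence agreement:", needleman_wunsch_identity(
        [t for t, _, _ in obs_a], [t for t, _, _ in obs_b]))              # (e)
```

In practice the timing agreement (component d) would likely be reported as descriptive statistics such as means, standard deviations, or medians of both start- and end-timestamp gaps, rather than the single mean shown here.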

Cite As
Lopetegui, M., Yen, P. Y., Embi, P., & Payne, P. (2020). Foundations for Studying Clinical Workflow: Development of a Composite Inter-Observer Reliability Assessment for Workflow Time Studies. AMIA ... Annual Symposium proceedings. AMIA Symposium, 2019, 617–626.
Journal
AMIA ... Annual Symposium Proceedings
Source
PMC
Type
Article
Version
Final published version