Browsing by Author "Ford, Kathleen"
Item
Evaluating and Extending the Informed Consent Ontology for Representing Permissions from the Clinical Domain (IOS Press, 2022)
Umberfield, Elizabeth E.; Stansbury, Cooper; Ford, Kathleen; Jiang, Yun; Kardia, Sharon L. R.; Thomer, Andrea K.; Harris, Marcelline R.; Health Policy and Management, School of Public Health
The purpose of this study was to evaluate, revise, and extend the Informed Consent Ontology (ICO) for expressing clinical permissions, including reuse of residual clinical biospecimens and health data. This study followed a formative evaluation design and used a bottom-up modeling approach. Data were collected from the literature on US federal regulations and a study of clinical consent forms. Eleven federal regulations and fifteen permission-sentences from clinical consent forms were iteratively modeled to identify entities and their relationships, followed by community reflection and negotiation based on a series of predetermined evaluation questions. ICO already included fifty-two of the classes and twelve of the object properties necessary for this modeling, demonstrating the appropriateness of extending ICO for the clinical domain. Twenty-six additional classes were imported into ICO from other ontologies, and twelve new classes were recommended for development. This work addresses a critical gap in formally representing clinical permissions, including reuse of residual clinical biospecimens and health data. It makes missing content available to the OBO Foundry, enabling use alongside other widely adopted biomedical ontologies. ICO serves as a machine-interpretable and interoperable tool for responsible reuse of residual clinical biospecimens and health data at scale.

Item
Lessons Learned for Identifying and Annotating Permissions in Clinical Consent Forms (Thieme, 2021)
Umberfield, Elizabeth E.; Jiang, Yun; Fenton, Susan H.; Stansbury, Cooper; Ford, Kathleen; Crist, Kaycee; Kardia, Sharon L. R.; Thomer, Andrea K.; Harris, Marcelline R.; Health Policy and Management, School of Public Health
Background: The lack of machine-interpretable representations of consent permissions precludes development of tools that act upon permissions across information ecosystems, at scale. Objectives: To report the process, results, and lessons learned while annotating permissions in clinical consent forms. Methods: We conducted a retrospective analysis of clinical consent forms. We developed an annotation scheme following the MAMA (Model-Annotate-Model-Annotate) cycle and evaluated interannotator agreement (IAA) using observed agreement (Ao), weighted kappa (κw), and Krippendorff's α. Results: The final dataset included 6,399 sentences from 134 clinical consent forms. Complete agreement was achieved for 5,871 sentences, including 211 positively identified and 5,660 negatively identified as permission-sentences across all three annotators (Ao = 0.944, Krippendorff's α = 0.599). These values reflect moderate to substantial IAA. Although permission-sentences share a set of common words and structures, disagreements between annotators were largely explained by lexical variability and ambiguity in sentence meaning. Conclusion: Our findings point to the complexity of identifying permission-sentences within clinical consent forms. We present our results in light of lessons learned, which may serve as a launching point for developing tools for automated permission extraction.
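Both abstracts above center on making consent permissions machine-interpretable. As a purely illustrative sketch of what a structured permission record might look like (the field names below are hypothetical, not actual ICO term labels), a permission-sentence such as "residual biospecimens may be reused for research" could be captured roughly as:

```python
from dataclasses import dataclass, field

@dataclass
class Permission:
    """Hypothetical machine-interpretable permission record.
    Field names are illustrative, not drawn from ICO itself."""
    permitted_action: str                 # what the permission allows
    data_subject: str                     # whose specimen or data is covered
    grantee: str                          # who receives the permission
    conditions: list = field(default_factory=list)  # constraints on the action

p = Permission(
    permitted_action="reuse of residual clinical biospecimens",
    data_subject="patient",
    grantee="research institution",
    conditions=["de-identification required"],
)
```

A record like this is what would let downstream tools filter or enforce permissions at scale, rather than re-reading free-text consent forms.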
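The second abstract reports both observed agreement (Ao = 0.944) and Krippendorff's α (0.599), which can differ sharply on imbalanced data like permission-sentence labeling. As a rough illustration of how the two metrics relate (the toy data below are invented, not the study's; "observed agreement" is computed here as the unanimous-unit fraction, one common definition), a minimal computation for nominal labels with no missing values:

```python
from itertools import combinations
from collections import Counter

def observed_agreement(units):
    """Fraction of units on which all annotators assigned the same label.
    units: list of tuples, one tuple of annotator labels per unit."""
    return sum(len(set(u)) == 1 for u in units) / len(units)

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal labels, no missing values."""
    # Observed disagreement: average pairwise disagreement within units
    d_o, n_pairs = 0, 0
    for unit in units:
        for a, b in combinations(unit, 2):
            d_o += (a != b)
            n_pairs += 1
    d_o /= n_pairs
    # Expected disagreement from the pooled label distribution
    counts = Counter(label for unit in units for label in unit)
    n = sum(counts.values())
    d_e = 1 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))
    return 1 - d_o / d_e if d_e else 1.0

# Three annotators, three toy units
labels = [("y", "y", "y"), ("n", "n", "n"), ("y", "n", "n")]
print(observed_agreement(labels))        # 2 of 3 units are unanimous
print(krippendorff_alpha_nominal(labels))
```

Because α discounts the agreement expected by chance from a skewed label distribution (here, far more non-permission sentences than permission-sentences), a high Ao can coexist with a much lower α, exactly the pattern the abstract reports.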