ScholarWorksIndianapolis

Browsing by Subject "Cognitive load"

Now showing 1 - 2 of 2
    Essential Care for Every Baby: Neonatal Clinical Decision Support Tool
    (Springer, 2020-07) Rajapuri, Anushri Singh; Ravindran, Radhika; Horan, Kevin; Bucher, Sherri; Purkayastha, Saptarshi; Medicine, School of Medicine
    Unacceptably high rates of neonatal mortality are an urgent global health challenge. Consistent application of Essential Newborn Care (ENC) interventions reduces newborn mortality. However, ENC has failed to scale up in low- and middle-income countries, where the bulk of neonatal deaths occur. The American Academy of Pediatrics designed an evidence-based, simplified training and educational curriculum called Essential Care for Every Baby (ECEB), which includes a clinical practice guideline covering the time of delivery through 24 hours after birth. However, the scale-up of ECEB has been hampered by the need to provide a wide variety of time-sensitive ECEB interventions to numerous mother-baby pairs. This imposes a significant cognitive load on providers, who perform varied tasks every few minutes for each baby. In this high-load, stressful situation, there are often profound gaps in the delivery of crucial ECEB strategies. We propose an innovative, scalable clinical decision support mobile app that prioritizes recognition over recall and addresses these existing challenges.
    Validity evidence for an instrument for cognitive load for virtual didactic sessions
    (Wiley, 2022-02-01) Hickam, Grace; Jordan, Jaime; Haas, Mary R. C.; Wagner, Jason; Manthey, David; Cico, Stephen John; Wolff, Margaret; Santen, Sally A.; Emergency Medicine, School of Medicine
    Background: The COVID-19 pandemic necessitated a shift to virtual resident instruction. Learning via virtual modalities has the potential to increase cognitive load. It is important for educators to reduce cognitive load to optimize learning, yet there are few available tools to measure it. The objective of this study was to identify and provide validity evidence, following Messick's framework, for an instrument to evaluate cognitive load in virtual emergency medicine (EM) didactic sessions. Methods: This study followed Messick's framework for validity, including content, response process, internal structure, and relationship to other variables. Content validity evidence included (1) engagement of a reference librarian and a literature review of existing instruments and (2) engagement of experts in cognitive load and relevant stakeholders to review the literature and choose an instrument appropriate for measuring cognitive load in EM didactic presentations. Response process validity was gathered by using the format and anchors of instruments with previous validity evidence and by piloting among the author group. A lecture was delivered by one faculty member to four residency programs via Zoom™. Afterwards, residents completed the cognitive load instrument. Descriptive statistics were collected; Cronbach's alpha assessed internal consistency of the instrument; and correlation assessed the relationship to other variables (quality of lecture). Results: The 10-item Leppink cognitive load instrument was selected with attention to content and response process validity evidence. Internal structure of the instrument was good (Cronbach's alpha = 0.80). Subscales performed well: intrinsic load (α = 0.96, excellent), extraneous load (α = 0.89, good), and germane load (α = 0.97, excellent). Five of the items were correlated with overall quality of lecture (p < 0.05).
Conclusions: The 10-item Cognitive Load instrument demonstrated good validity evidence to measure cognitive load and the subdomains of intrinsic, extraneous, and germane load. This instrument can be used to provide feedback to presenters to improve the cognitive load of their presentations.
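The abstract above reports internal consistency via Cronbach's alpha for the instrument and its subscales. As a minimal sketch of how that statistic is computed, the following uses synthetic item responses (not the study's data; the matrix shape and values are illustrative assumptions):

```python
# Sketch of Cronbach's alpha: ratio of shared variance across items to
# total score variance, scaled by the number of items k.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: respondents x items matrix of item ratings."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Synthetic data: 30 respondents, 10 items driven by one shared factor,
# so the items are strongly correlated and alpha should be high.
rng = np.random.default_rng(0)
shared = rng.normal(size=(30, 1))
scores = shared + rng.normal(scale=0.5, size=(30, 10))
alpha = cronbach_alpha(scores)
```

With strongly correlated items like these, alpha lands near 1; uncorrelated items would push it toward 0, which is why values around 0.8 or higher are conventionally read as "good" internal consistency.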
  • Copyright © 2025 The Trustees of Indiana University