Browsing by Subject "Education research"
Now showing 1 - 3 of 3
Item: Do the benefits continue? Long term impacts of the Anatomy Education Research Institute (AERI) 2017 (BMC, 2022-11-24)
Husmann, Polly R.; Brokaw, James J.; O’Loughlin, Valerie Dean; Anatomy, Cell Biology and Physiology, School of Medicine

Background: The Anatomy Education Research Institute (AERI) was held in Bloomington, Indiana, in July of 2017. Previous research has shown that AERI was successful in meeting Kirkpatrick's first two levels of evaluation via positive initial reactions and learning gains identified at the end of AERI. This manuscript demonstrates continued success at Kirkpatrick levels two and three via six-month and thirty-month follow-up surveys and nine-month follow-up focus groups and interviews.

Methods: Quantitative analyses were completed using Microsoft Excel (2019) and SPSS version 26, while qualitative analyses of both survey responses and focus groups/interviews were completed using thematic analysis.

Results: The learning gains seen immediately post-AERI 2017 were sustained for all participants (accepted applicants and invited speakers). Qualitative results continued to demonstrate positive reactions to AERI 2017. Both quantitative and qualitative results demonstrated that the main obstacle to educational research for most participants is time, while collaboration, IRB approval, institutional roadblocks, and the devaluing of educational research were also identified as obstacles.

Conclusions: The research presented here indicates positive outcomes at Kirkpatrick levels 1, 2, and 3 of evaluation following AERI 2017. However, substantial obstacles still exist for researchers in medical education.
The need for a sustained community of practice for educational researchers was suggested as a potential buffer against these obstacles, and multiple options for providing that community are discussed.

Item: Geoscience Education Perspectives on Integrated, Coordinated, Open, Networked (ICON) Science (Wiley, 2022)
Fortner, Sarah K.; Manduca, Cathryn A.; Ali, Hendratta N.; Saup, Casey M.; Nyarko, Samuel Cornelius; Othus-Gault, Shannon; Perera, Viranga; Tong, Vincent C. H.; Gold, Anne U.; Furman, Tanya; Arthurs, Leilani; Mulvey, Bridget K.; St. John, Kristen; Singley, Joel G.; Johnson, Elijah Thomas; Witter, Molly; Batchelor, Rebecca L.; Carter, Deron T.; Damas, M. Chantale; LeMay, Lynsey; Layou, Karen M.; Low, Russanne; Wang, Hui Hui; Olson-Sawyer, Kai; Pallant, Amy; Ryker, Katherine; Lukes, Laura; LaDue, Nicole; van der Hoeven Kraft, Kaatje J.; Earth and Environmental Sciences, School of Science

Practitioners and researchers in geoscience education embrace collaboration by applying ICON (Integrated, Coordinated, Open science, and Networked) principles and approaches, which have been used to create and share large collections of educational resources, to advance collective priorities, and to foster peer learning among educators. These strategies can also support the advancement of coproduction between geoscientists and diverse communities. For this reason, many authors from the geoscience education community have co-created three commentaries on the use and future of ICON in geoscience education. We envision that sharing our expertise with ICON practice will be useful to other geoscience communities seeking to strengthen collaboration.
Geoscience education brings substantial expertise in social science research and its application to building individual and collective capacity to address earth sustainability and equity issues at local to global scales. The geoscience education community has expanded its own ICON capacity through access to and use of shared resources and research findings, enhanced data sharing and publication, and leadership development. We prioritize continued use of ICON principles to develop effective and inclusive communities that increase equity in geoscience education and beyond, support the leadership and full participation of systemically non-dominant groups, and enable global discussions and collaborations.

Item: Measurement in STEM education research: a systematic literature review of trends in the psychometric evidence of scales (Springer, 2023)
Maric, Danka; Fore, Grant A.; Nyarko, Samuel Cornelius; Varma-Nelson, Pratibha

Background: The objective of this systematic review is to identify characteristics, trends, and gaps in measurement in Science, Technology, Engineering, and Mathematics (STEM) education research.

Methods: We searched across several peer-reviewed sources, including a book, similar systematic reviews, conference proceedings, one online repository, and four databases that index the major STEM education research journals. We included empirical studies that reported on the psychometric development of scales developed with college/university students for the context of post-secondary STEM education in the US. We excluded studies examining scales that ask about specific content knowledge and scales containing fewer than three items. Results were synthesized using descriptive statistics.

Results: Our final sample included a total of N = 82 scales across N = 72 studies. Participants in the sampled studies were majority female and White, most scales were developed in an unspecified STEM/science and engineering context, and the most frequently measured construct was attitudes.
Internal structure validity emerged as the most prominent validity evidence, with exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) being the most common methods. Reliability evidence was dominated by internal consistency evidence in the form of Cronbach’s alpha, with other forms scarcely reported, if at all.

Discussion: Limitations include focusing only on scales developed in the United States and in post-secondary contexts, which limits the scope of the systematic review. Our findings demonstrate that when scales are developed for STEM education research, many types of psychometric properties, such as differential item functioning, test–retest reliability, and discriminant validity, are scarcely reported. Furthermore, many scales report only internal structure validity (EFA and/or CFA) and Cronbach’s alpha, which alone are not sufficient evidence. We encourage researchers to look toward the full spectrum of psychometric evidence both when choosing scales to use and when developing their own. While constructs such as attitudes and disciplines such as engineering were dominant in our sample, future work can fill the gaps by developing scales for disciplines such as the geosciences and by examining constructs such as engagement, self-efficacy, and perceived fit.