Browsing by Subject "Learning analytics"
Now showing 1 - 2 of 2
Item: Do They Even Care? Measuring Instructor Value of Student Privacy in the Context of Learning Analytics (AIS, 2021)
Jones, Kyle; Vanscoy, Amy; Bright, Kawanna; Harding, Alison
Library and Information Science, Luddy School of Informatics, Computing, and Engineering
Learning analytics tools are becoming commonplace in educational technologies, but student privacy issues remain largely unresolved. It is unknown whether faculty care about student privacy and see privacy as valuable for learning. This research presents findings from a survey of over 500 full-time higher education instructors. The findings detail faculty perspectives on their own privacy and on students' privacy, and the high degree to which they value both. The data indicate that faculty believe privacy is important to intellectual behaviors and to learning. This work reports initial findings of a multi-phase, grant-funded research project that will further uncover instructor views of learning analytics and its student privacy issues.

Item: Sensemaking during the use of learning analytics in the context of a large college system (2017-04-05)
Morse, Robert Kenneth; Brady, Erin; Bolchini, Davide; Boling, Elizabeth; Hook, Sara
This research was a cognitive exploration of how program chairs make sense of learning analytics at Ivy Tech Community College of Indiana. For the courses with the largest online enrollment, quality standards in course design are maintained by creating sections from a course design framework, so every section starts with the same content and the same framework for assessment. The course design framework is maintained by the curriculum committee, composed of the program chairs who oversee the program to which the course belongs. This research proposed to develop a learning analytics dashboard to elicit, from the perspective of the program chair, best practices in instantiating a course design framework. The Instructional Design Implementation Dashboard (IDID) was designed to address the sensemaking needs of program chairs, who were asked to make sense of IDID as built around data collected from the course management system and the student information system. IDID combined user activity and learner performance metrics from the learning management system with student demographic data captured from the student information system, and it was used to identify highly successful sections and examine the instructor behaviors that might be considered best practices. Data Frame Sensemaking theory was confirmed as an accurate description of the experience of program chairs when using IDID, and a revised model of the theory was developed to explain the interactions of those using the IDID platform.