Evaluating the Effects of Assignment Report Usage on Student Outcomes in an Intelligent Tutoring System: A Randomized Encouragement Design

Ph.D. Qualifier Presentation

Wen Chiang Ivan Lim

Time: Tuesday, December 10, 2024, 3:00 pm to 4:00 pm

Location: Unity Hall, UH320G

Committee:

Prof. Neil Heffernan, Ph.D. Advisor, Department of Computer Science, WPI

Prof. Adam Sales, Co-Advisor, Department of Mathematical Sciences, WPI

Prof. Andrew Trapp, Co-Advisor, Department of Operations and Industrial Engineering, WPI

Abstract:

As online learning platforms become more popular and more deeply integrated into education, understanding their effectiveness, and what drives that effectiveness, becomes increasingly important. This study examines how teachers' review of assignment reports affects student outcomes within an intelligent tutoring system (ITS), using a randomized encouragement design (RED). Prior research has demonstrated the benefits of ITS for student learning, including the value of immediate feedback for students' next-problem correctness and large-scale evidence of overall impact in schools. However, there is limited research on whether teachers actively use ITS data to inform their teaching practices, in particular whether they review assignment reports and what benefits doing so provides.

Randomized controlled trials that require teachers to use ITS features according to their random assignment pose both ethical and practical challenges. As a result, much of the existing research on teachers' ITS usage relies on qualitative studies, small-scale experiments, or survey data, making it difficult to identify the causal effects of their engagement with these systems. To address this gap, we implemented a RED on ASSISTments, an online mathematics platform, randomly assigning teachers to an encouragement or control group after they created an assignment. Teachers in the encouragement group received a popup prompt encouraging them to explore the assignment report, while those in the control group continued without any additional prompt. Using the encouragement as an instrument, we estimated the local average treatment effect of viewing the report on student outcomes in the next assignment, focusing our analysis on teachers new to ASSISTments.
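
To make the estimation strategy concrete, the sketch below (purely illustrative, using simulated data and hypothetical variable names rather than the study's actual code) treats the popup prompt as a binary instrument Z for report viewing D and computes the Wald/IV estimate of the LATE on a teacher-level outcome Y.

# Illustrative only: simulated data and hypothetical names, not the study's code.
import numpy as np

rng = np.random.default_rng(0)
n = 330  # number of teachers, matching the scale reported in the abstract

# Z: randomized encouragement (1 = popup prompt shown after assignment creation)
Z = rng.binomial(1, 0.5, n)
# D: whether the teacher actually viewed the first assignment report;
# encouragement raises uptake, but compliance is imperfect
D = rng.binomial(1, 0.2 + 0.3 * Z)
# Y: a student outcome aggregated per teacher, e.g. the proportion of students
# completing the next assignment
Y = np.clip(0.5 + 0.1 * D + rng.normal(0, 0.2, n), 0, 1)

# Intent-to-treat effects of the encouragement
first_stage = D[Z == 1].mean() - D[Z == 0].mean()   # effect on report viewing
reduced_form = Y[Z == 1].mean() - Y[Z == 0].mean()  # effect on the outcome

# Wald / instrumental-variable estimate of the local average treatment effect
late = reduced_form / first_stage
print(f"first stage: {first_stage:.3f}  reduced form: {reduced_form:.3f}  LATE: {late:.3f}")

In this just-identified case, with a single binary instrument and a single binary treatment, the ratio of the two intent-to-treat effects coincides with the two-stage least squares estimate of the LATE.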

The results reveal that while viewing the first assignment report had no significant effect on the proportion of students starting the next assignment or on their value-added scores, it significantly increased the proportion of students completing the next assignment. This effect, confirmed by the Anderson-Rubin test, which is robust to weak instruments, represents a measurable causal impact of teachers' use of assignment reports on student outcomes. Drawing on data from 330 teachers, this large-scale study provides new insight into the causal effects of teachers' engagement with ITS data on student outcomes. By highlighting the importance of teachers' roles within an ITS, this research contributes to the growing body of evidence on effective teaching practices in online learning environments.
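
As a minimal sketch of how such a weak-instrument-robust check can be carried out (again on simulated data with hypothetical names, not the study's analysis), the snippet below computes the Anderson-Rubin p-value for the null of no effect and inverts the test over a grid of candidate effect sizes to form a confidence set: under the null that the effect equals beta0, the adjusted outcome Y - beta0 * D should be unrelated to the randomized encouragement Z.

# Illustrative only: simulated data and hypothetical names, not the study's analysis.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 330
Z = rng.binomial(1, 0.5, n)                 # randomized encouragement
D = rng.binomial(1, 0.2 + 0.3 * Z)          # report viewed
Y = 0.5 + 0.1 * D + rng.normal(0, 0.2, n)   # outcome (e.g. completion rate)

def ar_pvalue(beta0):
    # Under H0: effect = beta0, regress the adjusted outcome on Z and
    # test whether the coefficient on Z is zero.
    fit = sm.OLS(Y - beta0 * D, sm.add_constant(Z)).fit()
    return float(fit.f_test("x1 = 0").pvalue)

# Anderson-Rubin p-value for the null of no effect
print("AR p-value at beta0 = 0:", ar_pvalue(0.0))

# Weak-instrument-robust 95% confidence set: every beta0 not rejected at the 5% level
grid = np.linspace(-1.0, 1.0, 401)
accepted = [b for b in grid if ar_pvalue(b) > 0.05]
print("AR 95% confidence set:", (min(accepted), max(accepted)) if accepted else "empty")

Because the test statistic does not depend on the strength of the first stage, the resulting p-values and confidence set remain valid even when the encouragement only weakly shifts report viewing.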

Department(s): Data Science

Contact Person: Kelsey Briggs
