The goal of this study was to determine possible associations between student performance in an introductory statistics course and the use of online resources, course delivery method, and course length. The research questions asked whether more frequent use of online resources was positively correlated with better course outcomes and, if so, which specific resources accounted for the most variability in grades. The course offers three online sources of student support: online homework assistance, online practice questions, and online practice tests, designed to address different stages of learning. The homework assistance is generic and can be used with any raw data. The online practice questions are intended to mimic test questions and can be refreshed with random numbers to provide an essentially unlimited supply of questions. Both of these resources give feedback on the correctness of calculations (including intermediate steps) to assist the student. Additionally, because the course was forced online during the pandemic and was offered in a condensed format during the summer (2 versus 8 months), we were also interested in whether student performance differed between online delivery and face-to-face instruction and between the short and long formats of the course, and whether delivery method and resource use interacted in predicting student performance.
For this purpose, we used archival data collected from the university’s educational website, Sakai. The study resources included practice questions, homework helpers, and practice exams. The participants were roughly 1,000 students enrolled in the undergraduate statistics course in Psychology between 2018 and 2021; the data were fully anonymized and stripped of all identifiers. Course performance was measured by final course grades, and resource use was assessed by the number of site visits and downloads. We expected a positive correlation between student performance and use of resources, traditional delivery of the course, and longer course duration.
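The correlational analysis described above can be sketched as a Pearson correlation between resource use (site visits) and final grades. The sketch below is illustrative only; the visit counts and grades are hypothetical values, not data from the study:

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical data: number of site visits and final course grades (%)
visits = [3, 10, 1, 7, 12, 5]
grades = [62, 85, 55, 74, 90, 70]

r = pearson_r(visits, grades)  # positive r would support the hypothesis
```

In the actual analysis, each resource (practice questions, homework helpers, practice exams) would contribute its own usage measure, allowing a comparison of which resource accounts for the most variability in grades.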
A preliminary analysis of the current dataset compared students who had used an online test preparation tool with those who had not. Students who used the tool performed significantly better (M = 13.97, SD = 5.47, out of a maximum of 18) than those who did not (M = 9.04, SD = 5.57), t(113) = 4.46, p < .001. This preliminary analysis is being followed up with more comprehensive analyses across multiple years and online resources. This research may have important pedagogical implications.
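The comparison above is an independent-samples t-test. A minimal sketch of the pooled-variance (Student's) form is shown below; the score lists are hypothetical stand-ins, not the study's data:

```python
import math
from statistics import mean, stdev

def independent_t(a, b):
    """Student's independent-samples t-test with pooled variance.

    Returns the t statistic and degrees of freedom (n_a + n_b - 2).
    """
    na, nb = len(a), len(b)
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    # Pooled variance across the two groups
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    t = (mean(a) - mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2

# Hypothetical test scores (out of 18) for tool users vs. non-users
users = [16, 12, 18, 14, 11, 17, 13]
non_users = [9, 7, 12, 8, 10, 6]

t, df = independent_t(users, non_users)  # large positive t favours users
```

A significance decision would then compare t against the critical value of the t distribution at the given df, which in practice is handled by a statistics package rather than by hand.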
As an increasing number of courses are offered online or include online components, it is critical that we assess the effectiveness of these modes of delivery so that we can design courses that are conducive to more effective learning and help students attain better course outcomes.

Jennifer Roters - Ishita Kapoor poster