Sometimes courses with large enrollments spawn useful innovations, and this study looked at one empirically. Large courses almost always mandate the use of multiple-choice tests, and incorporating quizzes in these courses can present sizeable logistical challenges. To cope with that situation in a large microbiology course, the faculty involved here developed an online multiple-choice question (MCQ) authoring, testing, and learning tool they dubbed Quizzical. Students in the course used the tool to write MCQs that were then used as quiz questions. The research explores how use of Quizzical affected performance in the course.
Riggs, C. D., Kang, S., & Rennie, O. (2020). Positive impact of multiple-choice authoring and regular quiz participation on student learning. CBE—Life Sciences Education, 19(2). https://doi.org/10.1187/cbe.19-09-0189 [open access]
Lots of prior research supports the use of interventions where students work with exam questions and take quizzes. The design of these activities and the logistical support the software provides make this intervention of particular interest. Students were assigned a lecture, after which they authored two exam questions using the lecture content. In addition to preparing the questions, students provided justifications for the correct answers and the distractors. The software provided guidance in writing MCQs, and the questions were reviewed (and could be revised) before they were posted. Students could take a 10-question quiz within 14 days of the lecture, and scores of 60 percent or higher counted toward the 6 to 8 percent of the course grade allotted to quiz participation. For missed questions, students were able to review the supporting material prepared by the question author. Both parts of the intervention were handled by the Quizzical software with minimal instructor involvement after initial setup.
The researchers studied the use of Quizzical in 500-student sections of a sophomore molecular biology course taken in 2017 and 2018.
The software tracked use of the quiz option as well as students’ scores. The research team categorized quiz use and scores and then compared them with scores on three exams and a cumulative final. They analyzed the level of engagement between exams and its impact on exam scores. Various statistical tests and controls for prior academic performance were used.
The researchers report that they did find a significant gender bias in their results: male students performed significantly better than female students in both years of the study. Students in this study were predominantly STEM majors, and previous research on STEM students reports conflicting findings with respect to gender bias. The research team writes, “The underlying reasons for these apparent gender biases are not clear, as different studies employ different metrics, assessment formats, and methods of analysis, but there are undoubtedly sociocultural and psychological factors involved. Despite the fact gender bias exists for the course grade in our study, we found no gender differences in Quizzical participation.”
Any study done with cohorts that share the same or similar majors raises the question of whether the findings can be generalized to students in other majors. So, although it is not possible to guarantee the same results with the Quizzical software when it’s used with other student cohorts, research results in many fields confirm that exam scores and course grades improve when students are involved with potential test questions.
The strongest implications rest on the now well-established value of what’s called “test-enhanced learning.” Whether it’s giving students old exams, having them write potential test questions, or having them take quizzes, significant exam score improvements are regularly reported.
This particular study shows the value of incentivizing participation in exam preparation activities. Doing so plays into that typical student response: don’t do anything that doesn’t earn points. But better exam scores mean more learning, so perhaps that makes it a worthwhile tradeoff. Moreover, as most of us have learned, even minimal rewards motivate student participation.
Finally, the beauty of this particular intervention rests on students completing activities that benefit their learning while relieving teachers of time-consuming tasks. Students write (and revise, if needed) what become the quiz questions. They justify the right answers and explain the errors in the distractor items. With the quizzes, students decide when to take them and how many to complete. The software manages the quiz process, including keeping track of quiz scores and other relevant data.
Information about using Quizzical may be obtained from one of the authors, whose email appears in the article. Various support materials guide the setup process. The researchers write that “deploying Quizzical is straightforward, as it was designed for instructors with little or no experience in using educational software. . . . Once the course has been set up, there is little or no intervention needed.”
Batsell Jr., W. R., Perry, J. L., Hanley, E., & Hostetter, A. B. (2017). Ecological validity of the testing effect: The use of daily quizzes in introductory psychology. Teaching of Psychology, 44(1), 18–23. https://doi.org/10.1177/0098628316677492
Brame, C. J., & Biel, R. (2015). Test-enhanced learning: The potential for testing to promote greater learning in undergraduate science courses. CBE—Life Sciences Education, 14(2). https://doi.org/10.1187/cbe.14-11-0208 [open access]