Coverage of Course Topics in a Student Generated MCQ Repository

Paul Denny, Andrew Luxton-Reilly and John Hamer
Dept. of Computer Science
University of Auckland
Auckland, New Zealand
{paul, andrew, j.hamer}@cs.auckland.ac.nz

Helen Purchase
Dept. of Computing Science
University of Glasgow
Glasgow, United Kingdom
hcp@dcs.gla.ac.uk

ABSTRACT
A recent approach to engaging students in deep learning involves an online tool, PeerWise, through which students contribute multiple-choice questions to a shared question bank. Earlier work demonstrated a strong correlation between the use of PeerWise and student performance.
In this study we investigate the quality of the MCQ repository created by students in an introductory programming course by analysing the range of topics on which students chose to write questions (i.e. the repository coverage) without guidance from an instructor. We assess the repository coverage by comparing it with a common list of typical introductory programming topics, and by looking at its extent. We find that, despite having the freedom to choose any topic, students created a repository that covered all the major topics in the curriculum.

Categories and Subject Descriptors: K.3.1 [Computers and Education]: Computer Uses in Education
General Terms: Human factors.
Keywords: MCQ, peer assessment, automated, question test bank, PeerWise, contributing student, topic coverage.

1. INTRODUCTION
PeerWise is a web-based system that allows students to create multiple choice questions (MCQs) and answer those created by their peers [3]. Traditionally, MCQs have been constructed by teaching staff and used for summative assessment. Multiple choice question test repositories have also been used in a drill-and-practice fashion by students.
Their use in this traditional manner has been criticised as encouraging surface learning strategies [9] and being favoured by students with low academic self-concept and poor learning skills [1].
By asking students to generate an MCQ and provide an appropriate explanation of the answer, we hope to provide a richer and deeper learning experience. Furthermore, asking students to critically evaluate existing MCQ items and provide formative feedback about the quality of a question requires the application of higher order cognitive skills such as making critical judgements. PeerWise transforms the use of MCQs from an assessment tool into a learning opportunity. Used in this way, MCQs overcome the limitations associated with their traditional use [7].
Earlier studies with PeerWise have found that most students enjoy using the tool [3], that they use it more often than required for assessment purposes, and that they use it extensively for revision during exam study periods [4]. We found that many of the student-generated questions were of high quality, that students can identify these high quality questions, and that students use the ratings to answer the high quality questions more frequently than low quality questions [5]. In addition, we found significant correlations between student use of PeerWise and performance, including performance on non-MCQ exam questions [2].

[ITiCSE’09, July 6–9, 2009, Paris, France. Copyright 2009 ACM 978-1-60558-381-5/09/07 ...$5.00.]
It is characteristic of courses using “contributing student pedagogies” for students to be given a greater degree of power and control over their own learning [6]. Given the increasing interest in student driven learning practices, it is important to investigate how coverage of course topics is affected when students have control over the development of learning resources.
This study investigates the range of topics covered by a student-created MCQ repository in a CS1 course. No guidance or incentive was given to the students on the nature of the questions they should create, so the topic distribution of the repository was not influenced by any external factors. We assess the repository coverage by comparing it with a common list of typical introductory programming topics compiled by Schulte and Bennedsen [8] and by looking at its extent.

2. METHODOLOGY
In this section, we report on our research methodology. We analysed the repository of questions developed by students in a standard first-year programming course (CS1) at The University of Auckland. This course was taught using Java in the first semester (March–June) of 2008.
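The coverage comparison described above can be sketched in a few lines of code. This is a hypothetical illustration only, not the authors' actual analysis tooling: the topic names, the per-question tags, and the `coverage_report` function are all invented for the example, standing in for the Schulte and Bennedsen topic list and the topics assigned to each student-written question.

```python
# Hypothetical sketch: given a reference list of CS1 topics and one
# topic tag per student-written question, report which reference
# topics the repository covers, how often, and which are missing.
from collections import Counter

def coverage_report(reference_topics, question_topics):
    """Count questions per reference topic and list uncovered topics.

    reference_topics: topic names from the reference list.
    question_topics: topic names, one entry per tagged question.
    """
    counts = Counter(question_topics)
    covered = {t: counts[t] for t in reference_topics if counts[t] > 0}
    uncovered = [t for t in reference_topics if counts[t] == 0]
    pct = 100.0 * len(covered) / len(reference_topics)
    return covered, uncovered, pct

# Illustrative data only -- these topic names are made up.
reference = ["variables", "loops", "arrays", "recursion"]
tags = ["loops", "loops", "arrays", "variables", "variables"]
covered, uncovered, pct = coverage_report(reference, tags)
print(pct)        # 75.0: three of the four reference topics are covered
print(uncovered)  # ['recursion']
```

A real analysis would also need a tagging step (assigning each question to one or more topics), which in practice dominates the effort; the counting itself is trivial once the tags exist.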