Abstract
Classroom response systems are widely recognized as effective tools for providing formative feedback and engaging students, but our research supports the hypothesis that these systems also offer opportunities to improve content retention. We conducted an experiment on two distinct sections of an introductory course in computer science, presenting large collections of classroom response system questions to the different sections at different stages. Questions offered immediately after the corresponding material served the express purpose of providing formative feedback, while questions presented later were expected to improve content retention. We then reviewed participants' performance on the corresponding questions of the final examination; statistical analyses indicate that participants performed better on the questions that corresponded to the classroom response system questions provided for content retention.
Appendix: Questions Used in the Study
This appendix contains each of the final exam questions associated with this investigation (i.e., the stem and alternatives, where applicable, for the multiple choice questions designated MCQ1 through MCQ4 and the short answer question SAQ1), together with the corresponding classroom response system questions. As previously noted, participants had 150 min to submit a response, after which the results were analyzed, each alternative was discussed, and the solution was presented.
MCQ1
Multiple Choice Question 1 [MCQ1] from the Final Exam. The EMPLOYEE table has the following columns: EmployeeID, FirstName, LastName, Job, and Salary. Which of the following WHERE clauses would be used to retrieve data for all the salespersons that have a salary less than 35000 and everyone who is not a salesperson and whose salary is above 35000? (n.b., the data should all appear in the same table.)
(a) WHERE (Job = 'Sales' OR Salary < 35000)
    AND (Job != 'Sales' OR Salary > 35000)
(b) WHERE (Job = 'Sales' OR Salary < 35000)
    OR (Job != 'Sales' OR Salary > 35000)
(c) WHERE (Job = 'Sales' AND Salary < 35000)
    AND (Job != 'Sales' OR Salary > 35000)
(d) WHERE (Job = 'Sales' AND Salary < 35000)
    OR (Job != 'Sales' OR Salary > 35000)
The Corresponding Classroom Response System Question. What WHERE clause should you use if you want to retrieve all the male employees from the marketing department together with all the female employees from the research department?
(a) WHERE (GENDER = 'Male' AND DEPT = 'Marketing')
    AND (GENDER = 'Female' AND DEPT = 'Research')
(b) WHERE (GENDER = 'Male' AND DEPT = 'Marketing')
    OR (GENDER = 'Female' AND DEPT = 'Research')
(c) WHERE (GENDER = 'Male' OR DEPT = 'Marketing')
    AND (GENDER = 'Female' OR DEPT = 'Research')
(d) WHERE (GENDER = 'Male' OR DEPT = 'Marketing')
    OR (GENDER = 'Female' OR DEPT = 'Research')
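As a sketch of how the grouping of AND and OR determines which rows survive such a clause, the snippet below runs a condition of this general shape against a small, hypothetical EMPLOYEE table (the table contents and the exact clause shown are illustrative, not taken from the exam):

```python
import sqlite3

# In-memory database with a hypothetical EMPLOYEE table; the rows below are
# invented to cover all four job/salary combinations.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE EMPLOYEE "
    "(EmployeeID INTEGER, FirstName TEXT, LastName TEXT, Job TEXT, Salary INTEGER)"
)
conn.executemany("INSERT INTO EMPLOYEE VALUES (?,?,?,?,?)", [
    (1, "Ann", "Lee", "Sales", 30000),   # salesperson under 35000
    (2, "Bob", "Roy", "Sales", 40000),   # salesperson over 35000
    (3, "Cam", "Foy", "Admin", 50000),   # non-salesperson over 35000
    (4, "Dee", "Kim", "Admin", 20000),   # non-salesperson under 35000
])

# Each group is a conjunction of two tests, and the two groups are joined
# with OR: a row is kept if it satisfies either complete group.
rows = conn.execute("""
    SELECT FirstName FROM EMPLOYEE
    WHERE (Job = 'Sales' AND Salary < 35000)
       OR (Job != 'Sales' AND Salary > 35000)
""").fetchall()
print(sorted(r[0] for r in rows))  # ['Ann', 'Cam']
```

Swapping the inner or outer operators, as the alternatives above do, changes which of the four sample rows pass the filter.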
MCQ2
Multiple Choice Question 2 [MCQ2] from the Final Exam. If table EMPLOYEE has 250 rows of data and table DEPARTMENT has 10 rows of data, how much data (i.e., how many rows) is retrieved by: SELECT * FROM EMPLOYEE, DEPARTMENT?
(a) 10
(b) 250
(c) 260
(d) 2500
The Corresponding Classroom Response System Question. If the DEPARTMENT table has 5 rows and the EMPLOYEE table has 100 rows, how many rows does the following query return: SELECT * FROM DEPARTMENT, EMPLOYEE?
(a) 5
(b) 20
(c) 500
(d) 2000
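Both versions of this question turn on the same fact: listing two tables in a FROM clause with no join condition yields their Cartesian product, so the row counts multiply. A minimal sketch using the clicker question's sizes (5 × 100), with hypothetical single-column tables:

```python
import sqlite3

# Hypothetical tables sized as in the clicker question:
# 5 departments and 100 employees.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE DEPARTMENT (DeptID INTEGER)")
conn.execute("CREATE TABLE EMPLOYEE (EmpID INTEGER)")
conn.executemany("INSERT INTO DEPARTMENT VALUES (?)", [(i,) for i in range(5)])
conn.executemany("INSERT INTO EMPLOYEE VALUES (?)", [(i,) for i in range(100)])

# No join condition: every department row is paired with every employee row,
# producing the Cartesian product of the two tables.
n = conn.execute("SELECT COUNT(*) FROM DEPARTMENT, EMPLOYEE").fetchone()[0]
print(n)  # 500
```

The same multiplication gives 250 × 10 = 2500 rows for the exam version of the question.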
MCQ3
Multiple Choice Question 3 [MCQ3] from the Final Exam. Which of the following benefits does UDP offer relative to TCP?
(a) UDP consumes fewer computer resources by not managing connections.
(b) UDP guarantees that packets of a transmission arrive in the same order they were sent.
(c) UDP does not divide a message into packets.
(d) UDP guarantees that all packets successfully arrive at the destination.
The Corresponding Classroom Response System Question. Which of the following statements is true?
(a) UDP is a connection-oriented protocol.
(b) TCP is analogous to the postal system.
(c) TCP and UDP are based on packet-switching.
(d) None of the above.
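The connectionless nature of UDP that these questions probe can be seen directly in code: a datagram is sent with no prior handshake, and the protocol itself offers no delivery or ordering guarantees (on the loopback interface, as in this sketch, delivery does in practice succeed):

```python
import socket

# Minimal sketch: UDP is connectionless, so a datagram can be sent without
# establishing a connection first. Contrast TCP, which would require
# listen/accept/connect before any data flows.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))        # let the OS pick a free port
addr = receiver.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello", addr)          # no connect() or handshake needed

data, _ = receiver.recvfrom(1024)
print(data)  # b'hello'

sender.close()
receiver.close()
```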
MCQ4
Multiple Choice Question 4 [MCQ4] from the Final Exam. Suppose that you have opened the webpage at “my.ucalgary.ca”. Which of the following statements is true?
(a) POST is generated when you click the “Sign In” button.
(b) GET is generated when you click the “Disclaimer” hyperlink.
(c) GET is generated when you click the “About CAS” hyperlink.
(d) All of the above.
The Corresponding Classroom Response System Question. Under the Hypertext Transfer Protocol...
(a) POST is generated when clicking a hyperlink.
(b) GET is generated when clicking a hyperlink.
(c) GET is generated when you press the button to log in to D2L.
(d) POST is generated when a URL is entered in the browser’s address field.
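The distinction both questions rest on is visible in the first line of the HTTP request a browser emits: following a hyperlink or entering a URL produces a GET, while submitting a form whose method is "post" produces a POST. A minimal sketch (the paths are hypothetical):

```python
def request_line(method: str, path: str) -> str:
    """Build the first line of an HTTP/1.1 request, as a browser would."""
    return f"{method} {path} HTTP/1.1"

# Clicking a hyperlink to a hypothetical /about page issues a GET:
print(request_line("GET", "/about"))    # GET /about HTTP/1.1

# Submitting a sign-in form declared with method="post" issues a POST,
# carrying the form fields in the request body rather than the URL:
print(request_line("POST", "/login"))   # POST /login HTTP/1.1
```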
SAQ1
Short Answer Question 1 [SAQ1] from the Final Exam. Give an example of a graph that has five (5) vertices and the minimum number of colours to legally colour it is also five (5). To clarify, if the minimum number of colours to legally colour the graph is five, then you cannot legally colour it with four or fewer colours. Recall that a legal colouring requires that no two adjacent vertices have the same colour.
The Corresponding Classroom Response System Question. What is the minimum number of colours necessary for colouring the following graph such that no two adjacent vertices have the same colour?
![figure a](http://media.springernature.com/lw685/springer-static/image/chp%3A10.1007%2F978-3-319-94640-5_1/MediaObjects/470244_1_En_1_Figa_HTML.gif)
(a) 1
(b) 3
(c) 4
(d) 5
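One answer to SAQ1 is the complete graph on five vertices, K5, where every pair of vertices is adjacent and therefore every vertex needs its own colour. The small brute-force sketch below (not part of the study) verifies that no assignment of four colours legally colours K5 while five colours suffice:

```python
from itertools import combinations, product

# K5: five vertices, every pair joined by an edge (10 edges in all).
vertices = range(5)
edges = list(combinations(vertices, 2))

def colourable(k: int) -> bool:
    """True if some assignment of k colours to the five vertices
    leaves no edge with both endpoints the same colour."""
    return any(
        all(c[u] != c[v] for u, v in edges)
        for c in product(range(k), repeat=5)
    )

print(colourable(4))  # False: four colours never suffice for K5
print(colourable(5))  # True: giving each vertex its own colour works
```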
© 2018 Springer International Publishing AG, part of Springer Nature
Collier, R., Kawash, J. (2018). Effectively Using Classroom Response Systems for Improving Student Content Retention. In: Escudeiro, P., Costagliola, G., Zvacek, S., Uhomoibhi, J., McLaren, B. (eds) Computers Supported Education. CSEDU 2017. Communications in Computer and Information Science, vol 865. Springer, Cham. https://doi.org/10.1007/978-3-319-94640-5_1
Print ISBN: 978-3-319-94639-9
Online ISBN: 978-3-319-94640-5