Abstract
Evidence-Centered Game Design (ECgD) is an increasingly popular model for stealth assessment in games, employing educational data mining techniques to measure learning within serious (and other) games (GlassLab, 2014). There is a constant tension in ECgD between how pre-defined the learning outcomes and measures need to be and how much important but unanticipated learning can be detected in gameplay. The EdGE research team is employing an emergent approach to developing a game-based assessment mechanic that starts empirically from what players do in a well-crafted game and detects patterns that may indicate implicit understanding of salient phenomena. Implicit knowledge is foundational to explicit knowledge (Polanyi, 1966), yet it is largely ignored in education because of the difficulty of measuring knowledge that a learner has not yet formalized. This chapter describes our approach to measuring implicit science learning in the game Impulse, which is designed to foster an implicit understanding of Newtonian mechanics, using a combination of video analysis, game log analyses, and comparisons with pre-post assessment results. This research demonstrates that it is possible to reliably detect strategies that demonstrate an implicit understanding of fundamental physics by applying data mining techniques to player-generated data.
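For readers unfamiliar with this style of detector building, the sketch below illustrates the general recipe such work draws on (cf. Quinlan, 1993; Moore, 2003; Cohen, 1960; Hanley & McNeil, 1982): human video coding supplies ground-truth labels for gameplay clips, features distilled from game logs feed a decision-tree classifier, and cross-validated Cohen's kappa and AUC (A') gauge detector reliability. This is a minimal, hypothetical sketch in Python with scikit-learn, not the authors' actual pipeline; the feature names and data are invented for illustration.

# Minimal sketch of a gameplay-strategy detector (hypothetical data/features).
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import cohen_kappa_score, roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical per-clip log features, e.g., mean impulse angle relative to
# the goal, click rate, fraction of impulses applied against particle motion.
X = rng.random((200, 3))
# Ground-truth labels from human video coding: 1 = strategy observed.
y = (X[:, 0] + 0.3 * X[:, 2] > 0.8).astype(int)

# Shallow decision tree, in the spirit of C4.5-style classifiers.
clf = DecisionTreeClassifier(max_depth=3, random_state=0)

# Cross-validated predictions guard against overfitting (cf. Moore, 2003).
pred = cross_val_predict(clf, X, y, cv=10)
prob = cross_val_predict(clf, X, y, cv=10, method="predict_proba")[:, 1]

print(f"Cohen's kappa: {cohen_kappa_score(y, pred):.2f}")  # agreement beyond chance
print(f"AUC (A'):      {roc_auc_score(y, prob):.2f}")      # cf. Hanley & McNeil (1982)

In practice the labeled clips would come from coded videos of play, the features from parsed game logs, and a detector would be accepted only if its cross-validated kappa and AUC clear agreed-upon thresholds before being applied to unlabeled players.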
References
Asbell-Clarke, J., & Rowe, E. (2014). Scientific inquiry in digital games. In F. Blumberg (Ed.), Learning by playing: Video games in education. New York: Oxford University Press.
Asbell-Clarke, J., Rowe, E., & Sylvan, E. (2013, April). Assessment design for emergent game-based learning. Paper presented at the ACM SIGCHI conference on human factors in computing systems (CHI’13). Paris, France.
Asbell-Clarke, J., Rowe, E., Sylvan, E., & Baker, R. (2013, June). Working through impulse: Assessment of emergent learning in a physics game. Paper presented at the 9th annual meeting of the Games+Learning+Society (GLS) conference, Madison, WI.
Baker, R. S., & Clarke-Midura, J. (2013). Predicting successful inquiry learning in a virtual performance assessment for science. In User modeling, adaptation, and personalization (pp. 203–214). Berlin: Springer.
Baker, R. S., Ocumpaugh, J., Gowda, S. M., Kamarainen, A., & Metcalf, S. J. (2014). Extending log-based affect detection to a multi-user virtual environment for science. In Proceedings of the 22nd conference on user modeling, adaptation, and personalization (pp. 290–300).
Benjamini, Y., & Hochberg, Y. (1995). Controlling the false discovery rate: A practical and powerful approach to multiple testing. Journal of the Royal Statistical Society. Series B (Methodological), 57, 289–300.
Clark, D. B., Nelson, B., Chang, H., D’Angelo, C. M., Slack, K., & Martinez-Garza, M. (2011). Exploring Newtonian mechanics in a conceptually-integrated digital game: Comparison of learning and affective outcomes for students in Taiwan and the United States. Computers and Education, 57(3), 2178–2195.
Clearleft Ltd. (2013). Silverback (Version 2.0) [Software]. Retrieved from http://silverbackapp.com
Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20(1), 37–46. doi:10.1177/001316446002000104.
Collins, H. (2010). Tacit and explicit knowledge. Chicago: University of Chicago Press.
diSessa, A. A. (1993). Toward an epistemology of physics. Cognition and Instruction, 10(2/3), 105–225. doi:10.2307/3233725.
Fisch, S. M., Lesh, R., Motoki, E., Crespo, S., & Melfi, V. (2011). Children’s mathematical reasoning in online games: Can data mining reveal strategic thinking? Child Development Perspectives, 5(2), 88–92.
Gee, J. P. (2003). What video games have to teach us about learning and literacy (1st ed.). New York: Palgrave/Macmillan.
Gee, J. P. (2007). What video games have to teach us about learning and literacy (2nd ed.). New York: Palgrave/Macmillan.
GlassLab (2014). Psychometric considerations in game-based assessment. Institute of Play. Retrieved July 1, 2014, from http://www.instituteofplay.org/work/projects/glasslab-research/
Halverson, R., Wills, N., & Owen, E. (2012). CyberSTEM: Game-based learning telemetry model for assessment. Presentation at 8th Annual GLS, Madison, WI.
Hanley, J. A., & McNeil, B. J. (1982). The meaning and use of the area under a receiver operating characteristic (ROC) curve. Radiology, 143(1), 29–36. doi:10.1148/radiology.143.1.7063747.
Kinnebrew, J. S., & Biswas, G. (2012). Identifying learning behaviors by contextualizing differential sequence mining with action features and performance evolution. In Proceedings of the international conference on educational data mining, pp. 57–64.
McCloskey, M. (1983). Intuitive physics. Scientific American, 248(4), 122–130.
Minstrell, J. (1982). Explaining the “at rest” condition of an object. The Physics Teacher, 20(1), 10–14.
Mislevy, R., & Haertel, G. (2006). Implications of evidence-centered design for educational testing. Educational Measurement: Issues and Practice, 25(4), 6–20.
Moore, A. W. (2003). Cross-validation for detecting and preventing overfitting. Statistical Data Mining Tutorials.
National Research Council. (2011). Learning science through computer games and simulations. Committee on Science Learning: Computer Games, Simulations, and Education; M. A. Honey & M. L. Hilton (Eds.). Washington, DC: National Academies Press.
Pardos, Z.A., Baker, R.S.J.d., San Pedro, M.O.C.Z., & Gowda, S.M. (2013). Affective states and state tests: Investigating how affect throughout the school year predicts end of year learning outcomes. Proceedings of the 3rd international conference on learning analytics and knowledge, pp. 117–124.
Plass, J., Homer, B. D., Kinzer, C. K., Chang, Y. K., Frye, J., Kaczetow, W., et al. (2013). Metrics in simulations and games for learning. In M. Seif El-Nasr, A. Drachen, & A. Canossa (Eds.), Game analytics: Maximizing the value of player data (pp. 694–730). London: Springer.
Polanyi, M. (1966). The tacit dimension. Chicago, IL: University of Chicago Press.
Quinlan, J. R. (1993). C4.5: Programs for machine learning. San Francisco: Morgan Kaufmann.
Rowe, E., Asbell-Clarke, J., Bardar, E., Kasman, E., & MacEachern, B. (2014, June). Crossing the bridge: Connecting game-based implicit science learning to the classroom. Paper presented at the 10th annual meeting of Games+Learning+Society. Madison, WI.
Rowe, E., Baker, R., Asbell-Clarke, J., Kasman, E., & Hawkins, W. (2014, July). Building automated detectors of gameplay strategies to measure implicit science learning. Poster presented at the 7th annual meeting of the international educational data mining society, July 4–8, London.
Sabourin, J., Mott, B., & Lester, J. (2011). Modeling learner affect with theoretically grounded dynamic Bayesian networks. In Proceedings of the 4th international conference on affective computing and intelligent interaction (pp. 286–295). Memphis, TN.
Sao Pedro, M. A., Baker, R. S. J., Gobert, J., Montalvo, O., & Nakama, A. (2013). Leveraging machine-learned detectors of systematic inquiry behavior to estimate and predict transfer of inquiry skill. User Modeling and User-Adapted Interaction, 23(1), 1–39.
Sao Pedro, M., Baker, R.S.J.d., & Gobert, J. (2012) Improving construct validity yields better models of systematic inquiry, even with less information. In Proceedings of the 20th international conference on user modeling, adaptation and personalization (UMAP 2012), pp. 249–260.
Shute, V. J., Masduki, I., Donmez, O., Kim, Y. J., Dennen, V. P., Jeong, A. C., et al. (2010). Assessing key competencies within game environments. In D. Ifenthaler, P. Pirnay-Dummer, & N. M. Seel (Eds.), Computer-based diagnostics and systematic analysis of knowledge (pp. 281–309). New York: Springer-Verlag.
Shute, V., & Ventura, M. (2013). Stealth assessment: Measuring and supporting learning in video games. Cambridge, MA: MIT Press.
Shute, V., Ventura, M., Bauer, M., & Zapata-Rivera, D. (2009). Melding the power of serious games and embedded assessment to monitor and foster learning: Flow and grow. Serious Games: Mechanisms and Effects, 1(1), 1–33.
Shute, V., Ventura, M., & Kim, J. (2013). Assessment and learning of qualitative physics in Newton’s playground. Journal of Educational Research, 106(6), 423–430. doi:10.1080/00220671.2013.832970.
Srikant, R., & Agrawal, R. (1996). Mining sequential patterns: Generalizations and performance improvements. In Advances in database technology, EDBT '96 (pp. 1–17). Berlin: Springer.
Thomas, D., & Brown, J. S. (2011). A new culture of learning: Cultivating the imagination for a world of constant change. Lexington, KY: CreateSpace.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
Acknowledgments
We are grateful for NSF/EHR/DRK12 grant #1119144 and our research group, EdGE at TERC, which includes Erin Bardar, Teon Edwards, Jamie Larsen, Barbara MacEachern, Emily Kasman, and Katie McGrath. Our evaluators, the New Knowledge Organization, assisted with establishing the reliability of the coding.
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this chapter
Rowe, E., Asbell-Clarke, J., Baker, R.S. (2015). Serious Games Analytics to Measure Implicit Science Learning. In: Loh, C., Sheng, Y., Ifenthaler, D. (eds) Serious Games Analytics. Advances in Game-Based Learning. Springer, Cham. https://doi.org/10.1007/978-3-319-05834-4_15
DOI: https://doi.org/10.1007/978-3-319-05834-4_15
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-05833-7
Online ISBN: 978-3-319-05834-4