Abstract
Combining user experience and learning efficacy studies, this chapter presents a design case in the domain of food safety and food import inspection. Within five months, an effective self-paced online training was developed. Multiple methods of user and learner experience research were applied, including instructional flow analysis, prototyping, and a heuristic review; a usability test and a learning efficacy study with pretests and posttests (administered before and after the course) were then conducted. Major results indicate needed improvements in representation and engagement for learning as well as in design, functionality, and content. In addition, the flexible self-paced online course positively impacted student learning: compared with their pretest scores, most learners' posttest scores increased to a statistically significant degree, although some advanced learners did not reach the targeted learning growth because of their existing knowledge of the topic (the expertise reversal effect). Guessing rates for most learners also declined after the training. The design followed the educational design research methodology of formative, iterative design and research phases, and the case shows how combining design and research can contribute both to improvement and to scholarship. Although this process may sound complicated, it is straightforward to follow and implement, and it can be completed in a rather short time span through an agile process that keeps academic rigor in mind. However, it requires someone with project management skills to keep the different teams on schedule and to ensure they deliver on time.
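The pre/post comparison the abstract describes is commonly analyzed with a paired-samples t-test on each learner's pretest and posttest scores. As an illustration only (the scores below are hypothetical, not the chapter's data), a minimal sketch using just the Python standard library:

```python
import math
import statistics

def paired_t(pre, post):
    """Paired-samples t statistic for matched pre/post scores.

    Returns (t, df). A larger |t| at the given degrees of freedom
    indicates a more reliable pretest-to-posttest change.
    """
    assert len(pre) == len(post), "scores must be paired per learner"
    diffs = [b - a for a, b in zip(pre, post)]   # per-learner gain
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)               # sample SD of the gains
    t = mean_d / (sd_d / math.sqrt(n))
    return t, n - 1

# Hypothetical percent-correct scores for six learners
pre = [55, 60, 48, 70, 65, 52]
post = [72, 75, 61, 74, 80, 70]
t, df = paired_t(pre, post)
print(f"t({df}) = {t:.2f}")
```

In practice the resulting t value would be compared against a critical value (or converted to a p value) for the given degrees of freedom; libraries such as SciPy provide this in one call.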
Acknowledgments
This work was funded by the College of Agriculture, Food, and Natural Resources (CARNR) at the University of Missouri-Columbia, from the USDA-Foreign Agricultural Services (FAS) Emerging Markets Program.
Copyright information
© 2023 Springer Nature Switzerland AG
Cite this entry
Jahnke, I., Li, S., Singh, K., Yu, F., Riedel, N. (2023). Combining User Experience and Learning Efficacy in Design and Redesign. In: Spector, M.J., Lockee, B.B., Childress, M.D. (eds) Learning, Design, and Technology. Springer, Cham. https://doi.org/10.1007/978-3-319-17727-4_179-1
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-17727-4
Online ISBN: 978-3-319-17727-4
eBook Packages: Springer Reference Education, Reference Module Humanities and Social Sciences, Reference Module Education