Abstract
Computational analysis methods and machine learning techniques introduce innovative ways to capture classroom interactions and display data on analytics dashboards. Automated classroom analytics employ advanced data analysis, providing educators with comprehensive insights into student participation, engagement, and behavioral trends within classroom settings. Through the provision of context-sensitive feedback, automated classroom analytics systems can be integrated into the evidence-based pedagogical decision-making and reflective practice processes of faculty members in higher education institutions. This paper presents TEACHActive, an automated classroom analytics system, by detailing its design and implementation. It outlines the processes of stakeholder engagement and mapping, elucidates the benefits and obstacles associated with a comprehensive classroom analytics system design, and concludes by discussing significant implications. These implications propose user-centric design approaches for higher education researchers and practitioners to consider.
Introduction
The emergence of computational analysis methods and machine learning techniques provides new ways to understand classroom interactions and design automated systems that can analyze instructor and student classroom behaviors. Classroom analytics can be displayed on dashboards, offering instructors automated observations and feedback regarding classroom interactions and activities (Sedrakyan et al., 2020). Classroom analytics applications are implemented in higher education institutions to enhance student engagement and learning through timely and evidence-based pedagogical actions (Larrabee Sønderlund et al., 2019). With the ability to process and interpret vast amounts of data, these systems can shift the focus from mere information collection to meaningful action, enhancing both teaching practices and student experiences in higher education classrooms.
Classroom analytical systems face a range of challenges that stem from the complexity of educational environments and the diverse needs of stakeholders. One notable challenge is the sheer volume of data generated by these systems, which often overwhelms educators and administrators. Additionally, ensuring data accuracy and integrity can be a daunting task, as discrepancies can lead to misinformed decision-making. The integration of classroom analytics into existing teaching practices and curriculum alignment poses another hurdle, requiring seamless adoption without disrupting established workflows. However, technological advancements are stepping up to address these challenges. The emergence of advanced machine learning algorithms allows for more efficient data processing and pattern recognition, aiding educators in deriving meaningful insights from vast datasets. Moreover, user-friendly visualization tools are being developed, simplifying the interpretation of analytics and making them accessible to educators who may not be data experts. These advancements also facilitate real-time monitoring, enabling prompt intervention when students disengage or require additional support.
As classroom analytical systems continue to evolve, they are becoming increasingly equipped to navigate the challenges of the educational landscape, providing educators with actionable insights to enhance learning outcomes. However, the design of classroom analytics systems is predominantly influenced by technocentric perspectives, constraining their pedagogical implications (Li et al., 2021). Users' interaction with analytics interfaces—whether they are learners, instructors, or administrators—is not isolated; rather, intricate interactions among diverse stakeholders underpin the successful implementation of such systems (Larusson & White, 2014). The drive toward constructing human-centered analytics systems, emphasizing user and stakeholder needs and their dynamic roles in the design process, is gaining momentum (Boy, 2017). It is important for various stakeholders to actively engage throughout the design, implementation, and evaluation phases, aligning practices with needs and expectations while fostering long-term sustainability. A holistic approach is essential to grasp how analytics systems and interfaces can be formulated through these intricate stakeholder interactions.
This paper presents the design and implementation of a professional development system called TEACHActive, which is founded upon computational analysis and automated context-sensitive feedback (AlZoubi et al., 2021a). TEACHActive introduces an innovative approach to displaying classroom behavioral data, which is automatically analyzed through an automated classroom sensing system. The design and implementation of TEACHActive take place within the context of a project funded by the National Science Foundation (NSF). The primary objective of the project was to enhance student engagement and promote active learning within engineering classrooms, utilizing an analytics-driven faculty professional development framework. This paper outlines the implications of stakeholder engagement and mapping methods that emphasize the interconnectedness and relationships among diverse stakeholders, along with their involvement throughout the design and deployment processes.
TEACHActive: Automated Classroom Analytics System
Classroom interactions can be captured by sensors and wearables in physical classrooms and quantified through multimodal classroom analytics that provide a deeper understanding of embodied learning and aspects of teaching (Martínez-Maldonado et al., 2022a, b). Computer vision advancements using simple cameras can serve as whole-classroom sensors instead of individual student and teacher wearables (Ahuja et al., 2019). The effectiveness of reflective practice is magnified when informed by empirical data, facilitating a more nuanced understanding of educational contexts (Avramides et al., 2015; Wise, 2020). Reflective practice operates in iterative cycles, encompassing the identification of pedagogical challenges, the implementation of targeted interventions, and subsequent reflection on their outcomes (Horton-Deutsch & Sherwood, 2017). The TEACHActive system integrates reflective practice with analytics, thus making it a practical tool for instructors. For example, if classroom analytics indicate that an instructor's facilitation is excessively dominant, they can use reflective practice to allocate more time for collaborative activities in future sessions. Similarly, if students appear disengaged, the instructor can introduce more interactive elements to enhance student participation.
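The analytics-to-reflection loop described above can be illustrated with a small sketch. This is not the TEACHActive implementation; the metric names, thresholds, and prompt wording are illustrative assumptions only.

```python
# Illustrative sketch only: maps hypothetical session metrics to reflective
# prompts, echoing the paper's examples (dominant facilitation, low engagement).
# The thresholds (0.8, 0.4) and metric definitions are assumptions, not
# TEACHActive's actual logic.

def reflective_prompts(instructor_talk_ratio: float,
                       engagement_ratio: float) -> list[str]:
    """Return reflection prompts for two session metrics, each in [0, 1]."""
    prompts = []
    if instructor_talk_ratio > 0.8:  # assumed cutoff for "excessively dominant"
        prompts.append("Instructor talk dominated the session; consider "
                       "allocating more time for collaborative activities.")
    if engagement_ratio < 0.4:  # assumed cutoff for "disengaged"
        prompts.append("Engagement was low; consider introducing more "
                       "interactive elements.")
    return prompts

print(reflective_prompts(0.9, 0.3))  # both prompts fire for this session
```

The point of the sketch is the cycle, not the rules: the instructor receives the prompts, adjusts the next session, and the next round of metrics shows whether the adjustment worked.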
TEACHActive transforms raw classroom data into meaningful metrics, utilizing these outcomes to offer practical feedback to instructors aiming to enhance student engagement and active learning through classroom analytics. The TEACHActive system comprises three key components: (1) pedagogical training and analytics orientation, (2) automated classroom observation, and (3) feedback and reflection. Figure 1 presents the TEACHActive processes and components.
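As a rough illustration of how raw observations become session-level metrics, the sketch below aggregates hypothetical frame-level detections. The field names and metrics are assumptions made for illustration, not the actual EduSense or TEACHActive data schema.

```python
# Hypothetical sketch of turning per-frame classroom observations into
# session-level metrics. Field names ("speaker", "hands_raised") and the
# chosen metrics are illustrative assumptions, not the real schema.

from collections import Counter

def session_metrics(frames: list[dict]) -> dict:
    """Aggregate per-frame detections into session-level metrics."""
    n = len(frames)
    # Tally who was detected speaking in each frame.
    speakers = Counter(f["speaker"] for f in frames)
    hand_raises = sum(f["hands_raised"] for f in frames)
    return {
        "instructor_talk_ratio": speakers["instructor"] / n,
        "student_talk_ratio": speakers["student"] / n,
        "hand_raises_per_frame": hand_raises / n,
    }

# A toy four-frame session.
frames = [
    {"speaker": "instructor", "hands_raised": 0},
    {"speaker": "instructor", "hands_raised": 1},
    {"speaker": "student", "hands_raised": 0},
    {"speaker": "none", "hands_raised": 2},
]
print(session_metrics(frames))
```

A dashboard layer would then render metrics like these over time, which is where the feedback and reflection component picks up.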
Component 1. Pedagogical Training and Analytics Orientation
The instructors who participate in the automated classroom observation attend a series of group and one-on-one pedagogical training sessions and workshops to understand and explore how they might integrate classroom analytics into their classroom decision making processes. The training sessions also help instructors become familiar with the system’s capabilities and offer them an opportunity to share feedback regarding their needs before implementation.
Component 2. Automated Classroom Observation
TEACHActive deploys EduSense, a computer vision-based classroom sensing system that relies on passive video footage captured by video cameras placed at two vantage points in the physical classroom space. These cameras capture student and teacher activity (video) and spoken communication (audio) (Ahuja et al., 2019).
Stakeholder Engagement and Mapping
Successful implementation of TEACHActive depended on continuous interaction and collaboration with and between multiple stakeholders within and outside the university. There is increasing interest in involving users and stakeholders in the design of analytics systems and interfaces (Sanders & Stappers, 2008). A stakeholder is "any group or individual who can affect or is affected by the achievement of the organization's objectives" (Freeman, 1984, p. 46). Stakeholders possess invaluable knowledge of, experience with, and interaction with the system. Stakeholder engagement among instructors, administrators, and students is an important consideration in classroom analytics system design and implementation (Cober et al., 2015; Dollinger et al., 2019). These systems involve sensors and devices such as cameras to gather data about student behaviors that potentially provide valuable insights into student learning and classroom dynamics (Blikstein & Worsley, 2016). Thus, the goal of classroom analytics is to provide stakeholders with insights into student learning and behavior, and to support evidence-based decision-making. However, the use of automated classroom analytics systems can also raise concerns about privacy, data security, and the potential for biased or inappropriate data usage (Kitto & Knight, 2019). While stakeholder involvement in the design processes varies, their input is considered critical in making design decisions, particularly when individuals serve in multiple roles (Grimpe et al., 2014).
Stakeholder engagement is critical to ensure that a system is adopted by potential users and addresses privacy, security, and ethical concerns, leading to more sustainable practices. It is also important to provide ongoing support and communication to stakeholders so they have the resources and information needed to effectively use and benefit from an automated classroom analytics system. Three types of TEACHActive stakeholders are identified based on their level of engagement: key, primary, and secondary. Key stakeholders were a group of decision makers directly involved in the design and implementation. These included instructors (users), research team members, UI designers, and software developers. Primary stakeholders were individuals and/or groups who directly impacted the system, including the system administrator (infrastructure/servers/security), audio-visual department, classroom scheduling unit, institutional review board (IRB) administrators, and the EduSense Team. Secondary stakeholders were individuals and/or groups indirectly impacted by both key and primary stakeholders' efforts, including external evaluators, participating faculty's students, and the College of Engineering faculty. Figure 5 presents the stakeholder diagram that maps the three stakeholder types. Three factors determined successful stakeholder engagement throughout the design, deployment, and evaluation processes: (a) implementation of human-centered design methods, (b) active and continuous collaboration, and (c) evaluation for impact and sustainability.
Implementation of Human-Centered Design Methods
Human-centered design methods were implemented throughout all system design processes to ensure that key stakeholders (e.g., UI designers, software developers, instructors, and the research team) were involved in the design processes. This involved gathering user requirements, creating personas and mock-ups, conducting user walkthroughs, and making design decisions after reaching user and research team consensus.
The research team, user interface (UI) designers, and software developers met periodically and collaborated on key features and elements. User requirements were gathered through semi-structured interviews. Personas were created to understand users' needs, goals, barriers, frustrations, expected outcomes, and experiences. UI designers created mock-ups, conducted user walkthroughs and interviews with instructors, collaborated on developing the system's steps, and completed a series of user tests (AlZoubi et al., 2021b). Software developers worked on the system's deployment and maintenance and investigated the possibility of adding new software features based on the research team's decisions and themes. Researchers met with instructors (users) on a regular basis, involved them in the dashboard design processes, reported themes, and identified key points that could be addressed in future iterations.
Active and Continuous Collaboration
Active, continuous, and collaborative engagement with primary stakeholders was central to our design approach, including the system administrators (infrastructure/servers/security), Audio-Visual Department, classroom scheduling unit, Institutional Review Board (IRB) administrators, and EduSense Team. Due to the complexity of implementing a system within a higher education institution's traditional structure, with units at various operational levels, establishing frequent, sustained interactions with stakeholders was key to improved user experience and effective design decisions (Buchan et al., 2017). The system's successful application depended on the inclusion and alignment of multiple stakeholders' perspectives within and across the institution. Achieving support from various institutional units was challenging due to dissimilar goals, operations, and relationships with the project.
Including units such as the university IT services in planning conversations proved helpful in addressing concerns and suggestions during the project's early developmental phases. The team therefore spent considerable time on relationship building at the beginning of the project, as well as on aligning goals throughout the system's implementation. The system was implemented within the context of a research project involving human subjects; IRB administrators therefore constituted one of the primary stakeholders and ensured that the research followed ethical guidelines. Because the system depends on collecting and automatically analyzing video data from classrooms, the IRB team and researchers collaborated to protect instructors and students. After meeting the criteria for ethical data collection and protection, we collaborated with the university's classroom scheduling services and audio-visual unit to identify potential classrooms that fit system criteria and enabled successful camera installation. Transparency among stakeholders was critical when addressing privacy and ethical concerns. For example, attributes of power (those who decide which design considerations will move forward), ownership (how and with whom data are shared, the consequences of opting out, and how long data are kept), and responsibility (those responsible for data accuracy) should be addressed transparently while designing systems. Working with system administrators and the IT team was crucial to ensure security and privacy. System admins helped set up a data collection server for the system, and an IT security group ensured that all data collected were securely stored on the university network. EduSense is a licensed (BSD 3-Clause) open-source system that allows source code modification (Ahuja et al., 2019).
Challenges of Stakeholder Mapping and Engagement
The following challenges emerged during the stakeholder engagement processes:
Identifying stakeholders: Identifying all relevant stakeholders was a challenging task. To address this challenge, the research team thoroughly investigated the context and environment that would house the system prior to designing, deploying, and implementing it.
Ensuring representation: Even when all relevant stakeholders were identified, it was difficult to make sure they were all represented in the design process and their needs adequately addressed. The team met periodically with stakeholders to clarify which points could be moved forward and which were outside the scope of the project, and to communicate them clearly.
Building trust and engagement: Building trust and engagement with stakeholders required a deep understanding of the stakeholders' perspectives, needs, and concerns. This could be difficult when stakeholders were skeptical about using automated systems.
Addressing privacy concerns: Privacy concerns such as data security and data governance can be a significant challenge when designing automated systems, since these systems collect, store, and process data that must be protected. Thus, it was crucial to ensure that the system admin was one of the stakeholders involved in system deployment and implementation.
Ethical considerations: Automated systems can have a significant impact on groups; thus, it is important to ensure that these systems are fair and unbiased. In the implementation of such a system, the IRB office was one of the stakeholders responsible for anticipating and addressing potential ethical issues.
Limited resources: Stakeholder engagement is resource-intensive and requires a budget. There may be limitations on the research budget or the staff time needed for engagement activities.
The TEACHActive system presented in this paper provides examples of including stakeholders in the design, development, and evaluation processes, and supports sustainable system design in the classroom analytics field.
Key Implications and Final Remarks
For successful design and implementation of such systems, it is critical to identify stakeholders early in the process and to include them in design decisions. The stakeholder map presented in this paper illustrates the groups and individuals who held key, primary, and secondary roles in the project's implementation. The map can be updated as conditions change and newly recognized needs emerge. Building stakeholder maps and identifying stakeholder engagement strategies help designers and developers improve the impact of their work and contribute to sustainable design and implementation practices. Human-centered design tools, as well as stakeholder engagement and mapping, are becoming increasingly important in the design of analytics systems and interfaces. The comprehensive analytics system presented in this paper is uniquely situated within the growing literature on automated classroom observation and multimodal classroom analytics (Blikstein & Worsley, 2016; Martínez-Maldonado et al., 2022a, b). Further system improvement will depend on aligning the goals of researchers, individual faculty users, and institutional units. The project team's commitment to agile collaboration with stakeholders within the context of institutional structures and decision processes, as well as prolonged engagement with participating faculty to identify factors that influence their classroom actions, will be important. Further commitment to iterating on the dashboards' interface through participatory or co-design practices will improve users' experience and promote user trust, agency, and dashboard ownership. Finally, integrating TEACHActive with a community of practice (CoP) presents an opportunity to foster collaborative learning, reflection, and continuous improvement of higher education pedagogies.
In a CoP, instructors, researchers, and educational technologists can come together to share their experiences, insights, and best practices related to innovative classroom pedagogies and the use of classroom analytics. Such a CoP would provide a space for instructors to discuss and dissect the data generated by TEACHActive, collectively exploring its implications for their teaching methods while engaging in reflective practice. It would also facilitate cross-disciplinary dialogue, enabling educators from various sub-disciplines to learn from each other and adapt successful strategies to their own classrooms, all while reflecting on their pedagogical approaches. Moreover, this CoP can serve as a platform for research-driven discussions, allowing stakeholders to delve deeper into the nuances of classroom analytics and its impact on student engagement and learning outcomes, promoting reflective practice at both the individual and collective levels. Ultimately, the TEACHActive-CoP has the potential to create a thriving ecosystem where the system's benefits are maximized through collaborative knowledge sharing, reflective practice, and continuous professional development.
References
Ahuja, K., Kim, D., Xhakaj, F., Varga, V., Xie, A., Zhang, S., Townsend, J. E., Harrison, C., Ogan, A., & Agarwal, Y. (2019). EduSense: Practical classroom sensing at scale. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 3(3), 1–26. https://doi.org/10.1145/3351229
AlZoubi, D., Kelley, K., Baran, E., Gilbert, S. B., Karabulut-Ilgu, A., & Jiang, S. (2021a). TEACHActive feedback dashboard: Using automated classroom analytics to visualize pedagogical strategies at a glance. In CHI conference on human factors in computing systems extended abstracts (pp. 1–6). Association for Computing Machinery (ACM). https://doi.org/10.1145/3411763.3451709
AlZoubi, D., Kelley, J., Baran, E., Gilbert, S. B., Jiang, S., & Karabulut-Ilgu, A. (2021b). Designing the TEACHActive feedback dashboard: A human centered approach. In Proceedings of the 11th International Conference on Learning Analytics and Knowledge (LAK21), Online.
Avella, J. T., Kebritchi, M., Nunn, S. G., & Kanai, T. (2016). Learning analytics methods, benefits, and challenges in higher education: A systematic literature review. Online Learning, 20(2), 13–29.
Avramides, K., Hunter, J., Oliver, M., & Luckin, R. (2015). A method for teacher inquiry in cross-curricular projects: Lessons from a case study. British Journal of Educational Technology, 46(2), 249–264. https://doi.org/10.1111/bjet.12233
Baran, E., AlZoubi, D., & Karabulut-Ilgu, A. (2022). Leveraging engineering instructors' professional development with classroom analytics. In E. Langran (Ed.), Proceedings of Society for Information Technology & Teacher Education International Conference (pp. 1769–1775). Association for the Advancement of Computing in Education (AACE). https://www.learntechlib.org/primary/p/220948/
Bernacki, M. L., Greene, M. J., & Lobczowski, N. G. (2021). A systematic review of research on personalized learning: Personalized by whom, to what, how, and for what purpose(s)? Educational Psychology Review, 33(4), 1675–1715.
Blikstein, P., & Worsley, M. (2016). Multimodal learning analytics and education data mining: Using computational technologies to measure complex learning tasks. Journal of Learning Analytics, 3(2), 220–238.
Boy, G. A. (2017). The handbook of human-machine interaction: A human-centered design approach. CRC Press.
Brown, T. (2008). Design thinking. Harvard Business Review, 86(6), 84.
Buchan, J., Bano, M., Zowghi, D., MacDonell, S., & Shinde, A. (2017). Alignment of stakeholder expectations about user involvement in agile software development. In Proceedings of the 21st International Conference on Evaluation and Assessment in Software Engineering (pp. 334–343). https://doi.org/10.1145/3084226.3084251
Cober, R., Tan, E., Slotta, J., So, H.-J., & Könings, K. D. (2015). Teachers as participatory designers: Two case studies with technology-enhanced learning environments. Instructional Science, 43(2), 203–228.
Dollinger, M., Liu, D., Arthars, N., & Lodge, J. M. (2019). Working together in learning analytics towards the co-creation of value. Journal of Learning Analytics, 6(2), 10–26.
Freeman, R. E. (1984). Strategic management: A stakeholder approach. New York.
Grimpe, B., Hartswood, M., & Jirotka, M. (2014). Towards a closer dialogue between policy and practice: Responsible design in HCI. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 2965–2974). https://doi.org/10.1145/2556288.2557364
Horton-Deutsch, S., & Sherwood, G. D. (2017). Reflective practice: Transforming education and improving outcomes (Vol. 2). Sigma Theta Tau.
Kitto, K., & Knight, S. (2019). Practical ethics for building learning analytics. British Journal of Educational Technology, 50(6), 2855–2870.
Larrabee Sønderlund, A., Hughes, E., & Smith, J. (2019). The efficacy of learning analytics interventions in higher education: A systematic review. British Journal of Educational Technology, 50(5), 2594–2618.
Larusson, J. A., & White, B. (2014). Learning analytics: From research to practice (Vol. 13). Springer.
Li, Q., Jung, Y., & Friend Wise, A. (2021). Beyond first encounters with analytics: Questions, techniques and challenges in instructors' sensemaking. In LAK21: 11th International Learning Analytics and Knowledge Conference (pp. 344–353). https://doi.org/10.1145/3448139.3448172
Mandinach, E. B., & Gummer, E. S. (2016). What does it mean for teachers to be data literate: Laying out the skills, knowledge, and dispositions. Teaching and Teacher Education, 60, 366–376. https://doi.org/10.1016/j.tate.2016.07.011
Martínez-Maldonado, R., Yan, L., Deppeler, J., Phillips, M., & Gašević, D. (2022a). Classroom analytics: Telling stories about learning spaces using sensor data. In E. Gil, Y. Mor, Y. Dimitriadis, & C. Köppe (Eds.), Hybrid Learning Spaces (pp. 185–203). Springer International Publishing. https://doi.org/10.1007/978-3-030-88520-5_11
Martínez-Maldonado, R., Kay, J., Tomitsch, M., Yacef, K., & Siemens, G. (2022b). Multimodal learning analytics: A comprehensive review. Journal of Computer Assisted Learning, 38(2), 133–169.
Ndukwe, I. G., & Daniel, B. K. (2020). Teaching analytics, value and tools for teacher data literacy: A systematic and tripartite approach. International Journal of Educational Technology in Higher Education, 17(1), 1–31.
Norman, D. (2013). The design of everyday things: Revised and expanded edition. Basic Books.
Ranalli, J., Link, S., & Chukharev-Hudilainen, E. (2017). Automated writing evaluation for formative assessment of second language writing: Investigating the accuracy and usefulness of feedback as part of argument-based validation. Educational Psychology, 37(1), 8–25.
Sanders, E. B.-N., & Stappers, P. J. (2008). Co-creation and the new landscapes of design. CoDesign, 4(1), 5–18. https://doi.org/10.1080/15710880701875068
Sedrakyan, G., Malmberg, J., Verbert, K., Järvelä, S., & Kirschner, P. A. (2020). Linking learning behavior analytics and learning science concepts: Designing a learning analytics dashboard for feedback to support learning regulation. Computers in Human Behavior, 107, 105512.
Tissenbaum, M., Matuk, C., Berland, M., Lyons, L., Cocco, F., Linn, M., Plass, J. L., Hajny, N., Olsen, A., Schwendimann, B., Boroujeni, M. S., Slotta, J. D., Vitale, J., Gerard, L., & Dillenbourg, P. (2016). Real-time visualization of student activities to support classroom orchestration. In C. K. Looi, J. L. Polman, U. Cress, & P. Reimann (Eds.), Transforming learning, empowering learners: The International Conference of the Learning Sciences (ICLS) 2016, Volume 2. Singapore: International Society of the Learning Sciences.
Walkington, J., Christensen, H. P., & Kock, H. (2001). Developing critical reflection as a part of teaching training and teaching practice. European Journal of Engineering Education, 26(4), 343–350. https://doi.org/10.1080/03433790110068242
Wise, A. (2020). Educating data scientists and data literate citizens for a new generation of data. Journal of the Learning Sciences, 29(1), 165–181. https://doi.org/10.1080/10508406.2019.1705678
Acknowledgements
This research is supported in part by the National Science Foundation under Grant No. 2021118. The opinions, findings, and conclusions or recommendations expressed are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and Permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Cite This Article
Baran, E., AlZoubi, D., & Morales, A. S. (2023). Design and implementation of an automated classroom analytics system: Stakeholder engagement and mapping. TechTrends, 67, 945–954. https://doi.org/10.1007/s11528-023-00905-2