Abstract
We present a method that aggregates four facial cues to help identify distraction among online learners: facial emotion detection, micro-sleep tracking, yawn detection, and iris distraction detection. In our proposed method, the first module identifies facial emotions using both 2D and 3D convolutional neural networks (CNNs), enabling a comparison between spatiotemporal and purely spatial features. The other three modules use a 3D facial mesh to localize eye and lip coordinates, tracking a student's facial landmarks to identify iris positions and signs of drowsiness such as micro-sleeps and yawns. The results from each module are combined into a single overall label displayed on an integrated user interface, which can further be used to provide real-time alerts to students and instructors when required. In our experiments, the emotion, micro-sleep, yawn, and iris monitoring modules individually achieved accuracy scores of 72.5%, 95%, 97%, and 93%, respectively.
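The abstract does not specify the rule used to fuse the four module outputs into the overall label. A minimal sketch of one plausible fusion scheme is shown below; all names (`FrameCues`, `aggregate_label`, the label strings, and the priority ordering) are hypothetical illustrations, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class FrameCues:
    """Per-frame outputs of the four modules (hypothetical interface)."""
    emotion: str          # e.g. "happy", "neutral", "bored"
    micro_sleep: bool     # micro-sleep detected (eyes closed too long)
    yawning: bool         # yawn detected from lip landmarks
    iris_distracted: bool # iris position indicates looking away

def aggregate_label(cues: FrameCues) -> str:
    """Combine module outputs into one overall engagement label.

    Drowsiness cues take priority over gaze, which takes priority
    over emotion; the ordering here is an assumed design choice.
    """
    if cues.micro_sleep or cues.yawning:
        return "drowsy"
    if cues.iris_distracted:
        return "distracted"
    if cues.emotion in {"bored", "sad", "angry"}:
        return "disengaged"
    return "engaged"

print(aggregate_label(FrameCues("happy", False, False, False)))  # engaged
print(aggregate_label(FrameCues("happy", False, True, False)))   # drowsy
```

In a real-time setting, such a rule would typically be applied per frame and smoothed over a short window before triggering an alert, to avoid flickering labels.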
Acknowledgements
This work was supported by Mitacs Globalink Research Internship Program and NSERC Discovery Grant, Canada.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Ravi, P., Ali Akber Dewan, M. (2023). Real-time Multi-module Student Engagement Detection System. In: Sharma, H., Shrivastava, V., Bharti, K.K., Wang, L. (eds) Communication and Intelligent Systems. ICCIS 2022. Lecture Notes in Networks and Systems, vol 686. Springer, Singapore. https://doi.org/10.1007/978-981-99-2100-3_22
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-2099-0
Online ISBN: 978-981-99-2100-3
eBook Packages: Intelligent Technologies and Robotics (R0)