Real-time Multi-module Student Engagement Detection System

  • Conference paper
Communication and Intelligent Systems (ICCIS 2022)

Part of the book series: Lecture Notes in Networks and Systems ((LNNS,volume 686))


Abstract

We present a method that aggregates four facial cues to identify distraction among online learners: facial emotion detection, micro-sleep tracking, yawn detection, and iris distraction detection. The first module identifies facial emotions using both 2D and 3D convolutional neural networks (CNNs), which enables a comparison between spatiotemporal and purely spatial features. The remaining three modules use a 3D facial mesh to localize eye and lip coordinates, tracking a student's facial landmarks to identify iris positions as well as signs of drowsiness such as micro-sleeps and yawns. The outputs of the modules are combined into a single, all-encompassing label displayed on an integrated user interface, which can further be used to provide real-time alerts to students and instructors when required. In our experiments, the emotion, micro-sleep, yawn, and iris monitoring modules individually achieved accuracy scores of 72.5%, 95%, 97%, and 93%, respectively.
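The abstract describes fusing four per-frame module outputs into a single engagement label. As a minimal illustrative sketch of such a fusion step, the following assumes landmark-based heuristics commonly used in this area (eye aspect ratio for eye closure, mouth aspect ratio for yawning); the threshold values, label names, and priority ordering here are assumptions for illustration, not the paper's actual values or method.

```python
# Hypothetical sketch of fusing four engagement-cue modules into one label.
# Thresholds and label names are illustrative assumptions, not from the paper.
from dataclasses import dataclass


@dataclass
class FrameSignals:
    """Per-frame outputs of the four (assumed) detection modules."""
    emotion: str                # label from a CNN emotion classifier
    eye_aspect_ratio: float     # low values suggest closed eyes (micro-sleep)
    mouth_aspect_ratio: float   # high values suggest an open mouth (yawn)
    iris_centered: bool         # False when gaze drifts off-screen


EAR_DROWSY = 0.21        # assumed eye-closure threshold
MAR_YAWN = 0.60          # assumed yawn threshold
NEGATIVE_EMOTIONS = {"bored", "sad", "angry"}  # assumed label set


def engagement_label(s: FrameSignals) -> str:
    """Combine the four cues, checking stronger distraction signals first."""
    if s.eye_aspect_ratio < EAR_DROWSY:
        return "drowsy"
    if s.mouth_aspect_ratio > MAR_YAWN:
        return "yawning"
    if not s.iris_centered:
        return "distracted"
    if s.emotion in NEGATIVE_EMOTIONS:
        return "disengaged"
    return "engaged"
```

In practice the eye and lip coordinates feeding these ratios would come from a 3D face-mesh model such as MediaPipe (cited by the paper), and the label would be smoothed over several frames before triggering an alert.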



Acknowledgements

This work was supported by Mitacs Globalink Research Internship Program and NSERC Discovery Grant, Canada.

Author information


Corresponding author

Correspondence to Pooja Ravi.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Ravi, P., Ali Akber Dewan, M. (2023). Real-time Multi-module Student Engagement Detection System. In: Sharma, H., Shrivastava, V., Bharti, K.K., Wang, L. (eds) Communication and Intelligent Systems. ICCIS 2022. Lecture Notes in Networks and Systems, vol 686. Springer, Singapore. https://doi.org/10.1007/978-981-99-2100-3_22

