A Simple Reshaping Method of sEMG Training Data for Faster Convergence in CNN-Based HAR Applications

Original Article · Journal of Electrical Engineering & Technology

Abstract

Convolutional neural networks (CNNs) have demonstrated excellent performance in image recognition and have also been successfully extended to human activity recognition (HAR) applications, which aim to recognize the intent of human actions, or to support clinical diagnosis, using surface electromyography (sEMG) signals collected from the human body. It has been observed in the literature that the visual training data used to train a CNN for image recognition typically form square matrices, whereas sEMG training data in HAR applications inherently form rectangular matrices with a small number of rows and a very large number of columns. This motivates the hypothesis that a CNN may converge much faster during learning if the sEMG training data are reshaped from a rectangular matrix into a square matrix without losing any information from the original data. This study proposes a simple but very effective method for reshaping sEMG training data from a rectangular matrix into a square matrix with no loss of the original information. Empirical studies confirm that the proposed reshaping method speeds up CNN convergence markedly, regardless of the optimizers and CNN models considered in the study. Our findings strongly support the use of CNN learning in sEMG-based HAR applications.
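
As a concrete illustration of the idea (not necessarily the authors' exact procedure, which the abstract does not spell out), the following sketch reshapes a hypothetical rectangular sEMG window into a square matrix with NumPy. The channel count, window length, and row-major flattening order are illustrative assumptions; the only requirement is that the total number of elements be a perfect square.

    import numpy as np

    # Hypothetical sEMG window: 4 channels x 1024 samples, i.e. a
    # rectangular matrix with few rows and many columns.
    channels, samples = 4, 1024
    window = np.random.randn(channels, samples)  # stand-in for real sEMG data

    # The total element count (4096) is a perfect square, so the window
    # can be rearranged into a 64 x 64 square matrix. reshape() only
    # changes the arrangement of the samples, so no information is lost,
    # and the operation is undone by reshaping back to (channels, samples).
    side = int(np.sqrt(channels * samples))
    assert side * side == channels * samples, "element count must be a perfect square"
    square = window.reshape(side, side)

    print(window.shape, "->", square.shape)  # (4, 1024) -> (64, 64)

Under these assumptions the square matrix can be fed to a standard 2-D CNN in place of the long, thin rectangle, and square.reshape(channels, samples) recovers the original window exactly.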



Acknowledgements

This research was supported in part by the Soonchunhyang University Research Fund and in part by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2021R111A3043994).

Author information

Correspondence to Chun-Ki Kwon.


Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Batgerel, G., Kwon, CK. A Simple Reshaping Method of sEMG Training Data for Faster Convergence in CNN-Based HAR Applications. J. Electr. Eng. Technol. 19, 2607–2619 (2024). https://doi.org/10.1007/s42835-023-01736-0

