
Fuzzy Logic Control of a Head-movement Based Semi-autonomous Human–machine Interface

  • Research Article
  • Published:
Journal of Bionic Engineering

Abstract

Quadriplegia is a neuromuscular condition that may cause varying degrees of functional loss in the trunk and limbs. In such cases, head movements can serve as an alternative communication channel. In this study, a human–machine interface controlled by head movements is designed and implemented. The proposed system enables users to set the desired movement direction and to control the speed of an output device using head movements. Head movements are detected with a 6-DOF IMU comprising a three-axis accelerometer and a three-axis gyroscope. The head movement axes and the Euler angles are mapped to movement direction and speed, respectively. To ensure driving safety, the system speed is determined from both the speed requested by the user and the distance to obstacles on the route. To this end, a fuzzy logic algorithm performs closed-loop speed control based on distance sensor readings and the reference speed. A car model was used as the output device on the machine interface; however, the wireless communication between the human and machine interfaces allows the system to be adapted to any remote device or system. The implemented system was tested by five subjects. Performance was evaluated in terms of task completion times and subject feedback on their experience with the system. Results indicate that the proposed system is easy to use, and that control capability and speed improve with user experience.
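To make the head-angle mapping and the closed-loop speed control described above concrete, the Python sketch below pairs an illustrative angle-to-command mapping with a minimal Sugeno-style fuzzy speed limiter. The function names, angle thresholds, distance and speed ranges, and rule base are assumptions for illustration only; they are not taken from the paper.

```python
# Hedged sketch, not the authors' implementation: thresholds, ranges, and
# the rule base below are illustrative assumptions.

def head_to_command(pitch_deg, roll_deg, dead_zone=10.0, max_tilt=40.0):
    """Map head Euler angles to a movement direction and a requested
    speed (0-100 %). Thresholds are assumed for illustration."""
    if abs(roll_deg) > abs(pitch_deg):
        direction = "right" if roll_deg > dead_zone else "left" if roll_deg < -dead_zone else "stop"
        tilt = abs(roll_deg)
    else:
        direction = "forward" if pitch_deg > dead_zone else "backward" if pitch_deg < -dead_zone else "stop"
        tilt = abs(pitch_deg)
    speed = 0.0 if direction == "stop" else min(100.0, 100.0 * (tilt - dead_zone) / (max_tilt - dead_zone))
    return direction, speed

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_speed(reference_speed, obstacle_distance):
    """Combine the user's requested speed (0-100 %) with the obstacle
    distance (0-200 cm, assumed range) into a safe output speed."""
    # Fuzzify obstacle distance: near / medium / far
    near   = tri(obstacle_distance, -1, 0, 60)
    medium = tri(obstacle_distance, 40, 100, 160)
    far    = tri(obstacle_distance, 120, 200, 201)

    # Fuzzify requested speed: slow / fast
    slow = tri(reference_speed, -1, 0, 60)
    fast = tri(reference_speed, 40, 100, 101)

    # Illustrative rule base (AND = min), output as speed singletons
    rules = [
        (min(near, slow), 0.0),                     # near obstacle -> stop
        (min(near, fast), 0.0),
        (min(medium, slow), reference_speed),       # medium distance, slow -> keep
        (min(medium, fast), 0.5 * reference_speed), # medium distance, fast -> reduce
        (min(far, slow), reference_speed),          # far obstacle -> pass request through
        (min(far, fast), reference_speed),
    ]

    # Weighted-average (Sugeno-style) defuzzification
    num = sum(w * v for w, v in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0

if __name__ == "__main__":
    direction, requested = head_to_command(pitch_deg=30.0, roll_deg=5.0)
    print(direction, requested)                      # forward, ~67 %
    print(fuzzy_speed(requested, obstacle_distance=150))  # slightly reduced
    print(fuzzy_speed(requested, obstacle_distance=30))   # forced to stop
```

In this sketch the fuzzy stage only limits the user's request; the direction command passes through unchanged, which mirrors the semi-autonomous behaviour described in the abstract under the stated assumptions.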



Data availability

Data sharing is not applicable to this article as no datasets were generated or analyzed during the current study. The algorithm and program codes of the current study are available from the corresponding author on reasonable request.



Acknowledgements

This study was supported by the Scientific and Technological Research Council of Turkey (TUBITAK).

Author information

Corresponding author

Correspondence to Eda Akman Aydin.

Ethics declarations

Conflict of interest

The authors have no conflicts of interest or competing interests to declare.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Özlük, Y., Akman Aydin, E. Fuzzy Logic Control of a Head-movement Based Semi-autonomous Human–machine Interface. J Bionic Eng 20, 645–655 (2023). https://doi.org/10.1007/s42235-022-00272-3

