Abstract
Mobile robots have become popular not only in industrial applications such as material transportation but also in non-industrial applications, e.g., human assistance. Among the configurations developed to date, omnidirectional mobile robots have attracted considerable attention owing to their superior maneuverability over conventional platforms. In this research, an application of a four-mecanum-wheeled omnidirectional mobile robot (4-MWMR) to human assistance is developed. Using image processing, the 4-MWMR is capable of following an authorized person, thereby assisting users in transporting large or heavy materials. Experimental results demonstrate that the developed system is suitable for practical use.
This research is funded by the Hanoi University of Science and Technology (HUST) under project number T2022-PC-005.
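The person-following behaviour summarized in the abstract can be sketched as two stages: map the detected person's bounding box to desired body velocities, then convert those velocities to individual wheel speeds via the standard inverse kinematics of a four-mecanum-wheel platform. The sketch below is illustrative only, assuming a standard X-configuration mecanum layout; all geometry values, controller gains, and function names are assumptions, not details taken from the paper.

```python
def mecanum_wheel_speeds(vx, vy, wz, lx=0.25, ly=0.20, r=0.05):
    """Inverse kinematics of a four-mecanum-wheel platform.

    vx, vy: desired body velocities (m/s); wz: yaw rate (rad/s).
    lx, ly: half wheelbase / half track (m); r: wheel radius (m).
    Geometry defaults are illustrative, not from the paper.
    Returns wheel angular speeds (rad/s) in the order
    [front-left, front-right, rear-left, rear-right].
    """
    k = lx + ly
    return [
        (vx - vy - k * wz) / r,  # front-left
        (vx + vy + k * wz) / r,  # front-right
        (vx + vy - k * wz) / r,  # rear-left
        (vx - vy + k * wz) / r,  # rear-right
    ]

def follow_command(bbox_cx, bbox_area, img_w=640,
                   target_area=0.15, kp_yaw=2.0, kp_fwd=4.0):
    """Map a detected person's bounding box to (vx, vy, wz).

    bbox_cx: box centre x in pixels; bbox_area: box area as a
    fraction of the image. Gains kp_yaw/kp_fwd are hypothetical.
    """
    # Steer so the person stays horizontally centred; drive forward
    # until the box reaches a target apparent size (distance proxy).
    wz = -kp_yaw * (bbox_cx / img_w - 0.5)
    vx = kp_fwd * (target_area - bbox_area)
    return vx, 0.0, wz
```

For example, a pure forward command `mecanum_wheel_speeds(0.1, 0.0, 0.0)` drives all four wheels at the same speed, while a pure rotation command produces opposite-signed speeds on the left and right sides. The paper's actual detector (a deep-learning model) and controller are described in the full text and may differ from this proportional sketch.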
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Nguyen, T.C., Bui, T.N., Nguyen, V.N., Nguyen, D.P., Nguyen, C.M., Nguyen, M.L. (2023). Interactive Control Between Human and Omnidirectional Mobile Robot: A Vision-Based Deep Learning Approach. In: Nguyen, T.D.L., Verdú, E., Le, A.N., Ganzha, M. (eds) Intelligent Systems and Networks. ICISN 2023. Lecture Notes in Networks and Systems, vol 752. Springer, Singapore. https://doi.org/10.1007/978-981-99-4725-6_66
Print ISBN: 978-981-99-4724-9
Online ISBN: 978-981-99-4725-6