Abstract
The purpose of this research was to predict human falls using deep learning and dimensionality-reduction techniques for human activity recognition and behavioral prediction from smart watch and smart phone data. Deep learning techniques combined with multiple sensor data are used to classify daily activities. Previous work on human fall detection focused on multiple accelerometers placed on different parts of the body; more recent work uses sensors embedded in smartphones to classify activities. This research classifies activities using data from three sensors: accelerometer, gyroscope and magnetometer. In addition to comparing standard evaluation metrics, each network's confusion matrix, feature importance and multisensory fusion are analyzed to determine which network best suits the data and most successfully classifies the daily activities in question. A further aim of this research is to compare two data clustering techniques for visualizing the smart watch and smart phone dataset, and to identify the better visualization technique through a comparative study of the two. All six machine learning classification algorithms consistently outperformed state-of-the-art baselines. The Deep Neural Network (99.97% accuracy) and the MLP (90.55% accuracy) performed excellently on the data, with very few misclassified instances. All six classification algorithms produced more insightful, predictive results than existing baselines, while the DNN successfully clustered and visualized the data. The results show that each algorithm is suited to the smart watch and smart phone dataset, with high performance achieved throughout. The DNN model struggles to distinguish the falling activity from the running activity, with 7% of those activities misclassified.
The DNN nevertheless outperforms the MLP in this respect, misclassifying 3% of activities between jogging and running. A solution would be to place an additional sensor on the thigh to distinguish between the two activities: such a sensor would detect the greater acceleration and range of motion in the upper thigh when the subject is running compared with falling.
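The fusion step described above, combining accelerometer, gyroscope and magnetometer streams into fixed windows and extracting features before classification, can be sketched as follows. This is a minimal illustration only: the window length (128 samples), step size and feature set are assumptions for the sketch, not the preprocessing actually used in the study.

```python
import numpy as np

def window_features(acc, gyro, mag, win=128, step=64):
    """Segment tri-axial sensor streams into overlapping windows and
    extract simple statistical features per window.

    acc, gyro, mag: arrays of shape (n_samples, 3).
    Returns an array of shape (n_windows, n_features).
    """
    signals = np.hstack([acc, gyro, mag])          # (n, 9) fused channels
    feats = []
    for start in range(0, len(signals) - win + 1, step):
        w = signals[start:start + win]             # one window, shape (win, 9)
        mean = w.mean(axis=0)                      # per-channel mean (9 values)
        std = w.std(axis=0)                        # per-channel spread (9 values)
        # resultant acceleration magnitude, often discriminative for falls
        acc_mag = np.linalg.norm(w[:, :3], axis=1)
        feats.append(np.concatenate([mean, std, [acc_mag.max(), acc_mag.min()]]))
    return np.array(feats)

# Synthetic stand-in for logged sensor data (512 samples per sensor)
rng = np.random.default_rng(0)
n = 512
X = window_features(rng.normal(size=(n, 3)),
                    rng.normal(size=(n, 3)),
                    rng.normal(size=(n, 3)))
print(X.shape)   # 7 windows, 9 means + 9 stds + 2 magnitude features
```

The resulting feature matrix would then be fed to any of the classifiers compared in the paper (e.g. the DNN or MLP); the peak resultant acceleration per window is the kind of feature a thigh-mounted sensor would sharpen for separating running from falling.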
Data availability
All data generated or analyzed during this study are included in the manuscript.
Code availability
Not applicable.
Change history
24 January 2024
A Correction to this paper has been published: https://doi.org/10.1007/s00500-024-09672-5
Funding
No funds or grants were received by any of the authors.
Author information
Authors and Affiliations
Contributions
ANAS and SK contributed to the design and methodology of this study, the assessment of the outcomes, and the writing of the manuscript.
Corresponding author
Ethics declarations
Conflict of interest
There is no conflict of interest among the authors.
Ethical approval
This article does not contain any studies with human participants or animals performed by any of the authors.
Informed consent
Informed consent was obtained from all individual participants included in the study.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
The original article has been updated due to a co-author name update.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Al-Shawi, A.N., Kurnaz, S. Deep neural network for human falling prediction using log data from smart watch and smart phone sensors. Soft Comput (2023). https://doi.org/10.1007/s00500-023-09295-2
Accepted:
Published:
DOI: https://doi.org/10.1007/s00500-023-09295-2