Abstract
Convolutional neural networks (CNNs) are widely used in object detection and image segmentation thanks to their high performance. The choice of architecture and activation function is of great importance to the learning process when performing object detection on an image. Among the many activation functions, the Rectified Linear Unit (ReLU) is widely used in CNNs and in hemorrhage classification tasks; however, ReLU suffers from the negative-region problem during neuron activation. Numerous studies have sought improved activation functions for training convolutional neural networks, addressing challenges such as learning saturation, vanishing/exploding gradients, and the formation of dead neurons. We propose a new activation function called HardSReLUE. In this study, retinal blood vessels were detected and removed from the image using the Gabor transform in order to detect and classify hemorrhages in diabetic retinopathy lesions. Detection and classification of hemorrhagic areas were performed with the VGG-19, ResNet-50, and YOLOv5 CNN architectures. Moreover, experiments were carried out with the ReLU, ELU, SELU, PReLU, Mish, Swish, and proposed HardSReLUE activation functions to increase the classification performance of the CNN architectures. The EyePACS database was used in the experiments because of its large and diverse collection of retinal images, and an additional experiment on the MNIST dataset was performed to support the effectiveness of the proposed activation function. The results show that the proposed HardSReLUE activation function combined with the YOLOv5 architecture outperforms the alternatives: final training accuracies after 100 epochs were 91.72% for VGG-19, 93.38% for ResNet-50, and 94.75% for YOLOv5.
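For readers unfamiliar with the baseline activation functions compared in the abstract, the following is a minimal NumPy sketch of their standard, published definitions. The proposed HardSReLUE is deliberately not reproduced here, since its closed form is not given in this excerpt; everything below follows the original formulations (ReLU, ELU, SELU, Swish, Mish) from their respective papers.

```python
import numpy as np

def relu(x):
    # max(0, x): output is exactly zero in the negative region,
    # which is the source of the "dead neuron" problem mentioned above.
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # Smooth negative saturation toward -alpha (Clevert et al.).
    return np.where(x > 0, x, alpha * (np.exp(np.minimum(x, 0.0)) - 1.0))

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # Scaled ELU with fixed self-normalizing constants (Klambauer et al.).
    return scale * elu(x, alpha)

def swish(x, beta=1.0):
    # x * sigmoid(beta * x) (Ramachandran et al.); beta=1 is also called SiLU.
    return x / (1.0 + np.exp(-beta * x))

def mish(x):
    # x * tanh(softplus(x)) (Misra); log1p(exp(x)) is softplus.
    return x * np.tanh(np.log1p(np.exp(x)))
```

Unlike ReLU, the ELU/SELU/Swish/Mish variants pass small (or bounded) gradients through the negative region, which is the property motivating the search for alternatives such as the HardSReLUE function proposed here.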
Data availability
The datasets generated and analyzed during the current study are available from the corresponding author upon request.
The datasets generated and/or analyzed during the current study are available in the Kaggle repository: https://www.kaggle.com/datasets/mariaherrerot/eyepacspreprocess
Acknowledgements
This research article was supported by Bandırma Onyedi Eylül University Scientific Research Projects Coordination Unit (no: BAP-22-1003-004).
Ethics declarations
Ethical approval
Not required.
Competing interests
None.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Kiliçarslan, S. A novel nonlinear hybrid HardSReLUE activation function in transfer learning architectures for hemorrhage classification. Multimed Tools Appl 82, 6345–6365 (2023). https://doi.org/10.1007/s11042-022-14313-w