
Biological Eagle-eye Inspired Target Detection for Unmanned Aerial Vehicles Equipped with a Manipulator

  • Research Article
  • Published in: Machine Intelligence Research

Abstract

Inspired by eagle-eye mechanisms, the structural and information-processing characteristics of the eagle's visual system are exploited for the target capture task of an unmanned aerial vehicle (UAV) equipped with a mechanical arm. In this paper, a novel eagle-eye inspired multi-camera sensor and a saliency detection method are proposed. A combined camera system is built by simulating the double-fovea structure of the eagle retina. A salient target detection method based on the eagle midbrain inhibition mechanism is proposed, which measures both static saliency information and dynamic features. Salient targets can thus be accurately detected through the collaborative work of the different cameras in the proposed multi-camera sensor. Experimental results show that the eagle-eye inspired visual system is able to continuously detect targets in outdoor scenes and that the proposed algorithm strongly suppresses moving-background interference.
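As an illustration of the kind of pipeline the abstract describes, the sketch below fuses a static saliency map (here a spectral-residual measure) with a dynamic motion cue, coupling them with a divisive, inhibition-style rule so that each cue suppresses the other's spurious responses. This is a minimal sketch assuming OpenCV and NumPy; every function name, parameter value, and the fusion rule itself are illustrative assumptions, not the authors' implementation, which is behind the paywall.

```python
# Toy static + dynamic saliency fusion. Hypothetical sketch only;
# not the paper's algorithm.
import cv2
import numpy as np

def static_saliency(gray):
    # Spectral-residual static saliency (Hou & Zhang style).
    small = cv2.resize(gray, (64, 64)).astype(np.float32)
    spec = np.fft.fft2(small)
    log_amp = np.log1p(np.abs(spec))
    phase = np.angle(spec)
    residual = log_amp - cv2.blur(log_amp, (3, 3))
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    sal = cv2.GaussianBlur(sal.astype(np.float32), (9, 9), 2.5)
    return cv2.resize(sal, (gray.shape[1], gray.shape[0]))

def dynamic_saliency(prev_gray, gray):
    # Motion energy from dense optical flow magnitude.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2)
    # Subtracting the median flow crudely cancels global (ego/background)
    # motion, so only locally distinctive movement survives.
    return np.clip(mag - np.median(mag), 0.0, None)

def fused_saliency(prev_frame, frame):
    g0 = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    g1 = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    s = cv2.normalize(static_saliency(g1), None, 0.0, 1.0, cv2.NORM_MINMAX)
    d = cv2.normalize(dynamic_saliency(g0, g1).astype(np.float32),
                      None, 0.0, 1.0, cv2.NORM_MINMAX)
    # Divisive, inhibition-like combination: responses strong in only
    # one cue are damped relative to responses supported by both.
    return (s * d) / (1.0 + s + d)
```

Applied to consecutive frames from one camera of a multi-camera rig, a simple threshold or peak search on the fused map would yield candidate target locations; the median-flow subtraction is a crude stand-in for the moving-background suppression the paper attributes to the midbrain inhibition mechanism.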



Acknowledgements

This work was supported by the National Natural Science Foundation of China (Nos. T2121003, U1913602 and U19B2033) and the Science and Technology Innovation 2030 Key Project of "New Generation Artificial Intelligence" (No. 2018AAA0100803).

Author information


Corresponding author

Correspondence to Yi-Min Deng.

Ethics declarations

The authors declare that they have no conflicts of interest regarding this work.

Additional information

Colored figures are available in the online version at https://springer.longhoe.net/journal/11633

Yi-Min Deng received the B.Sc. and Ph.D. degrees in control science and engineering from the School of Automation Science and Electrical Engineering, Beihang University, China in 2011 and 2017, respectively. He is currently an associate professor at the School of Automation Science and Electrical Engineering, Beihang University, China. He is enrolled in the Young Elite Scientists Sponsorship Program of the China Association for Science and Technology (CAST) and the Young Top Talent Support Program of Beihang University.

His research interests include biological computer vision and autonomous flight control.

ORCID iD: 0000-0003-1533-3839

Si-Yuan Wang received the B.Sc. degree in automation from the School of Automation and Electrical Engineering, University of Science and Technology Beijing, China in 2019. She is currently a master's student in guidance, navigation and control at the School of Automation Science and Electrical Engineering, Beihang University, China.

Her research interests include bioinspired computation and artificial intelligence.


About this article


Cite this article

Deng, YM., Wang, SY. Biological Eagle-eye Inspired Target Detection for Unmanned Aerial Vehicles Equipped with a Manipulator. Mach. Intell. Res. 20, 741–752 (2023). https://doi.org/10.1007/s11633-022-1342-3

