Abstract
Approaching a non-cooperative spacecraft is a prerequisite for many on-orbit servicing missions, and it is considerably more difficult than approaching a cooperative one: a non-cooperative target carries no fiducial markings on its surface, and the servicing spacecraft cannot access its pose information. Several vision-based navigation methods have been proposed to address this problem, but most are validated only in ideal environments and do not consider the interference introduced by the complex conditions of real space. In this work, we propose a method based on the Ringed Residual U-Net (RRU-Net) to detect non-cooperative targets under a complex space environment, improving the performance and robustness of on-orbit servicing missions for non-cooperative spacecraft. In addition, we compare the Binary Cross Entropy loss (BCELoss) and the Tversky loss to further improve the accuracy and robustness of the method. Experimental results demonstrate that the proposed method achieves high accuracy and stability under a complex space environment.
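The two segmentation losses compared in the abstract can be written down directly from their standard definitions. The following is a minimal NumPy sketch, not the authors' implementation: the Tversky loss penalizes false positives and false negatives asymmetrically via weights `alpha` and `beta` (values here are illustrative; `alpha = beta = 0.5` recovers the Dice loss), while BCELoss averages per-pixel cross-entropy.

```python
import numpy as np

def tversky_loss(pred, target, alpha=0.3, beta=0.7, eps=1e-6):
    """Tversky loss for binary segmentation.
    pred: predicted foreground probabilities in [0, 1].
    target: binary ground-truth mask.
    alpha weights false positives, beta weights false negatives."""
    pred = pred.ravel().astype(np.float64)
    target = target.ravel().astype(np.float64)
    tp = np.sum(pred * target)            # soft true positives
    fp = np.sum(pred * (1.0 - target))    # soft false positives
    fn = np.sum((1.0 - pred) * target)    # soft false negatives
    return 1.0 - (tp + eps) / (tp + alpha * fp + beta * fn + eps)

def bce_loss(pred, target, eps=1e-7):
    """Binary cross-entropy averaged over all pixels."""
    pred = np.clip(pred.ravel().astype(np.float64), eps, 1.0 - eps)
    target = target.ravel().astype(np.float64)
    return float(-np.mean(target * np.log(pred)
                          + (1.0 - target) * np.log(1.0 - pred)))
```

With `beta > alpha`, a missed target pixel (false negative) costs more than a spurious one (false positive), which is why the Tversky loss is often preferred when the target occupies a small fraction of the image, as a spacecraft typically does against a space background.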
Acknowledgements
The authors would like to express their gratitude for the support of the Open Project Funds for the Key Laboratory of Space Photoelectric Detection and Perception (Nanjing University of Aeronautics and Astronautics), Ministry of Industry and Information Technology (No. NJ2022025-3), and of the Fundamental Research Funds for the Central Universities (No. NJ2022025).
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Yu, S., Wu, Y., Guo, H., Tian, W. (2023). Non-cooperative Spacecraft Detection Based on Deep Learning Under Complex Space Environment. In: Yan, L., Duan, H., Deng, Y. (eds) Advances in Guidance, Navigation and Control. ICGNC 2022. Lecture Notes in Electrical Engineering, vol 845. Springer, Singapore. https://doi.org/10.1007/978-981-19-6613-2_586
DOI: https://doi.org/10.1007/978-981-19-6613-2_586
Publisher Name: Springer, Singapore
Print ISBN: 978-981-19-6612-5
Online ISBN: 978-981-19-6613-2
eBook Packages: Intelligent Technologies and Robotics (R0)