MOVRO: Loosely Coupled EKF-Based Monocular Visual Radar Odometry

  • Conference paper
Intelligent Autonomous Systems 18 (IAS 2023)

Part of the book series: Lecture Notes in Networks and Systems (LNNS, volume 795)

Abstract

Reliable and accurate ego-motion estimation constitutes an integral part of any autonomous system. To achieve this, it is essential to integrate sensors with complementary characteristics while also minimizing costs. The most commonly used ego-motion estimation setup consists of a monocular camera and an inertial measurement unit (IMU), which can drift due to the inaccuracy of modest-performance IMUs. In lieu of enhancing such setups with a higher-performing IMU, in this paper we propose to fuse the monocular camera with a radar sensor. Specifically, we combine measurements from a 3D millimeter-wave frequency-modulated continuous-wave radar and a monocular camera within an error-state extended Kalman filter framework. The filter is implemented using loose coupling, which allows for independent and scalable development of the sensor models. We evaluated the proposed method against state-of-the-art radar-inertial, visual-inertial, and radar-visual-inertial odometry methods. The results show that the proposed method is comparable with other multi-domain solutions, especially in outdoor sequences.
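
To make the loosely coupled architecture concrete, the sketch below illustrates the kind of filter structure the abstract describes: a single shared prediction step and a separate, self-contained update routine per sensor, so that each measurement model can be developed and swapped independently. It is a simplified illustration rather than the authors' implementation: the class and method names, noise values, and state layout are assumptions; it keeps a linear position/velocity state instead of a full error-state formulation, assumes the body-to-world rotation R_wb is supplied externally, and reduces the monocular camera to a direction-of-travel measurement to reflect its unknown metric scale.

```python
# Minimal loosely coupled filter sketch (illustrative only): one shared
# constant-velocity prediction and independent per-sensor updates.
import numpy as np


class LooselyCoupledFilter:
    def __init__(self):
        self.x = np.zeros(6)   # state: world-frame position (3) and velocity (3)
        self.P = np.eye(6)     # state covariance

    def predict(self, dt, accel_sigma=0.5):
        """Constant-velocity motion model shared by all sensor modules."""
        F = np.eye(6)
        F[:3, 3:] = dt * np.eye(3)
        G = np.vstack([0.5 * dt**2 * np.eye(3), dt * np.eye(3)])
        Q = accel_sigma**2 * (G @ G.T)         # noise from unmodelled acceleration
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def _update(self, z, h, H, R):
        """Generic EKF update; a sensor module only supplies (z, h(x), H, R)."""
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - h)
        self.P = (np.eye(6) - K @ H) @ self.P

    def update_radar_velocity(self, v_body, R_wb, sigma=0.1):
        """Body-frame ego-velocity, e.g. estimated from radar Doppler returns."""
        H = np.zeros((3, 6))
        H[:, 3:] = R_wb.T                      # v_body = R_wb^T * v_world
        self._update(v_body, H @ self.x, H, sigma**2 * np.eye(3))

    def update_camera_direction(self, d_unit, sigma=0.05):
        """Monocular cue: unit direction of translation (scale unobservable)."""
        v = self.x[3:]
        n = np.linalg.norm(v)
        if n < 1e-3:
            return                             # direction undefined near standstill
        h = v / n
        J = (np.eye(3) - np.outer(h, h)) / n   # Jacobian of v/||v|| w.r.t. v
        H = np.zeros((3, 6))
        H[:, 3:] = J
        self._update(d_unit, h, H, sigma**2 * np.eye(3))
```

Because the radar and the camera each interact with the filter only through their own measurement model, either module can be retuned or replaced without touching the other, which is the practical benefit of the loose coupling described above.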

Acknowledgements

This research has been supported by the European Regional Development Fund under grant KK.01.2.1.02.0119 (A-UNIT).

Author information

Corresponding author

Correspondence to Vlaho-Josip Štironja.

Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Štironja, VJ., Peršić, J., Marković, I., Petrović, I. (2024). MOVRO: Loosely Coupled EKF-Based Monocular Visual Radar Odometry. In: Lee, SG., An, J., Chong, N.Y., Strand, M., Kim, J.H. (eds) Intelligent Autonomous Systems 18. IAS 2023. Lecture Notes in Networks and Systems, vol 795. Springer, Cham. https://doi.org/10.1007/978-3-031-44851-5_11
