Abstract
Reliable and accurate ego-motion estimation constitutes an integral part of any autonomous system. To achieve this, it is essential to integrate sensors with complementary characteristics while also minimizing costs. The most common ego-motion estimation setup consists of a monocular camera and an inertial measurement unit (IMU), which can drift due to the limited accuracy of modest-performance IMUs. In lieu of enhancing such setups with a higher-performing IMU, in this paper we propose to fuse the monocular camera with a radar sensor. Specifically, we combine measurements from a 3D millimeter-wave frequency-modulated continuous-wave radar and a monocular camera within an error-state extended Kalman filter framework. The filter was implemented using loose coupling, which allows for independent and scalable development of sensor models. We evaluated the proposed method against state-of-the-art radar-inertial, visual-inertial, and radar-visual-inertial odometries. The results show that the proposed method is comparable with other multi-domain solutions, especially in outdoor sequences.
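The loosely coupled fusion described above can be illustrated with a minimal sketch: the radar front end produces an ego-velocity estimate that is treated as an independent measurement in a Kalman filter update. This is a hypothetical toy model, not the paper's error-state implementation; the 2D constant-velocity state, the noise values, and the function names are all assumptions for illustration.

```python
import numpy as np

# Toy loosely coupled fusion sketch (hypothetical, not the authors' code):
# state x = [px, py, vx, vy], propagated with a constant-velocity model;
# a radar ego-velocity estimate (e.g. from Doppler least-squares over
# detections) arrives as a direct observation of the velocity sub-state.

dt = 0.1
F = np.block([[np.eye(2), dt * np.eye(2)],
              [np.zeros((2, 2)), np.eye(2)]])   # constant-velocity transition
Q = 1e-3 * np.eye(4)                            # process noise (assumed value)
H = np.hstack([np.zeros((2, 2)), np.eye(2)])    # radar observes velocity only
R = 1e-2 * np.eye(2)                            # radar ego-velocity noise (assumed)

def predict(x, P):
    """Propagate state and covariance one step."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, v_radar):
    """Fuse a radar ego-velocity measurement into the state."""
    y = v_radar - H @ x                   # innovation
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    return x + K @ y, (np.eye(4) - K @ H) @ P

x, P = np.zeros(4), np.eye(4)
x, P = predict(x, P)
x, P = update(x, P, np.array([1.0, 0.0]))  # radar reports 1 m/s along x
```

Because the coupling is loose, the radar block only has to deliver `(v_radar, R)`; it can be developed and replaced independently of the camera model, which is the scalability argument the abstract makes.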
Acknowledgements
This research has been supported by the European Regional Development Fund under the grant KK.01.2.1.02.0119 (A-UNIT).
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Štironja, VJ., Peršić, J., Marković, I., Petrović, I. (2024). MOVRO: Loosely Coupled EKF-Based Monocular Visual Radar Odometry. In: Lee, SG., An, J., Chong, N.Y., Strand, M., Kim, J.H. (eds) Intelligent Autonomous Systems 18. IAS 2023. Lecture Notes in Networks and Systems, vol 795. Springer, Cham. https://doi.org/10.1007/978-3-031-44851-5_11
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-44850-8
Online ISBN: 978-3-031-44851-5
eBook Packages: Intelligent Technologies and Robotics