Visual-inertial-actuator odometry for multirotor UAVs with rotor drag and external disturbance

Published in: International Journal of Dynamics and Control

Abstract

Accurate knowledge of an unmanned aerial vehicle’s (UAV) state and of the external forces acting on it is crucial for motion control. One way to obtain this information is Visual-Inertial-Actuator Odometry (VIAO), which combines commonly available vision and inertial sensors with actuation data, i.e., a model of the vehicle dynamics together with input measurements. In this paper, we propose a VIAO system based on the existing VIMO (Visual Inertial Model-Based Odometry) method. Our approach includes a dynamical model for a multirotor UAV with rotor drag, and a disturbance observer is added for constant force estimation. The observer is integrated into the optimization-based method by adding a residual term to the cost functional. The proposed system can distinguish the constant or slowly time-varying component of the external force from the accelerometer bias. We evaluate the performance on a benchmark dataset and show an average improvement of \(21\%\) in position estimation accuracy.
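To make the description above concrete, the sketch below illustrates the kind of rotor-drag model and augmented cost functional the abstract refers to. The notation (attitude \(R\), velocity \(\mathbf{v}\), collective thrust \(T\), diagonal drag matrix \(D\), external force \(\mathbf{f}_{\mathrm{ext}}\), residuals \(\mathbf{r}\) and their weights) is assumed for illustration only and does not reproduce the paper's exact formulation.

```latex
% Sketch only: assumed notation, not the paper's exact formulation.
% Translational dynamics of a multirotor with a diagonal rotor-drag matrix D
% and a constant (or slowly time-varying) external force f_ext:
\[
  m\,\dot{\mathbf{v}} \;=\; -\,m g\,\mathbf{e}_3
    \;+\; R\big(T\,\mathbf{e}_3 \;-\; D\,R^{\top}\mathbf{v}\big)
    \;+\; \mathbf{f}_{\mathrm{ext}},
  \qquad
  \dot{\mathbf{f}}_{\mathrm{ext}} \;\approx\; \mathbf{0}.
\]
% Sliding-window cost with camera, IMU-preintegration, actuation (dynamics)
% and added disturbance-observer residuals, each weighted by its covariance:
\[
  \min_{\mathcal{X}}\;
      \sum_{(i,j)} \big\lVert \mathbf{r}_{\mathrm{cam},ij} \big\rVert_{P_{ij}}^{2}
    + \sum_{k}     \big\lVert \mathbf{r}_{\mathrm{IMU},k}  \big\rVert_{P_{k}}^{2}
    + \sum_{k}     \big\lVert \mathbf{r}_{\mathrm{dyn},k}  \big\rVert_{Q_{k}}^{2}
    + \sum_{k}     \big\lVert \mathbf{r}_{\mathrm{obs},k}  \big\rVert_{S_{k}}^{2}.
\]
```

Roughly speaking, the dynamics residual ties the measured rotor inputs to the estimated motion, while the added observer residual penalizes rapid variation of the estimated external force; this extra constraint is consistent with the abstract's claim that a constant or slowly time-varying force can be separated from the accelerometer bias, which is already constrained by the IMU term.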

Data availability

No data were generated.

Code availability

Code available upon request.

References

  1. Scaramuzza D, Fraundorfer F (2011) Visual odometry: part I: the first 30 years and fundamentals. IEEE Robot Autom Mag 18(4):80–92

  2. Mur-Artal R, Montiel JMM, Tardos JD (2015) ORB-SLAM: a versatile and accurate monocular SLAM system. IEEE Trans Rob 31(5):1147–1163

  3. Martinelli A (2012) Vision and IMU data fusion: closed-form solutions for attitude, speed, absolute scale, and bias determination. IEEE Trans Rob 28(1):44–60

  4. Park S, Schöps T, Pollefeys M (2017) Illumination change robustness in direct visual SLAM. In: Proceedings of the 2017 IEEE international conference on robotics and automation, Singapore, pp 4523–4530

  5. Weiss S, Achtelik MW, Lynen S, Chli M, Siegwart R (2012) Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments. In: Proceedings of the 2012 IEEE international conference on robotics and automation, Saint Paul, MN, pp 957–964

  6. Scaramuzza D, Achtelik MC, Doitsidis L, Friedrich F, Kosmatopoulos E, Martinelli A, Achtelik MW, Chli M, Chatzichristofis S, Kneip L et al (2014) Vision-controlled micro flying robots: from system design to autonomous navigation and mapping in GPS-denied environments. IEEE Robot Autom Mag 21(3):26–40

  7. Mourikis AI, Roumeliotis SI (2007) A multi-state constraint Kalman filter for vision-aided inertial navigation. In: Proceedings of the 2007 IEEE international conference on robotics and automation, Rome, Italy, pp 3565–3572

  8. Kelly J, Sukhatme GS (2011) Visual-inertial sensor fusion: localization, mapping and sensor-to-sensor self-calibration. Int J Robot Res 30(1):56–79

  9. Qin T, Li P, Shen S (2018) VINS-Mono: a robust and versatile monocular visual-inertial state estimator. IEEE Trans Rob 34(4):1004–1020

  10. Klein G, Murray D (2007) Parallel tracking and mapping for small AR workspaces. In: Proceedings of the 2007 IEEE and ACM international symposium on mixed and augmented reality, Nara, Japan, pp 225–234

  11. Strasdat H, Montiel JM, Davison AJ (2012) Visual SLAM: why filter? Image Vis Comput 30(2):65–77

  12. Abeywardena DMW, Dissanayake G (2013) Tightly-coupled model aided visual-inertial fusion for quadrotor micro air vehicles. In: Field and service robotics. Springer tracts in advanced robotics, vol 105, pp 153–166

  13. Loianno G, Brunner C, McGrath G, Kumar V (2017) Estimation, control, and planning for aggressive flight with a small quadrotor with a single camera and IMU. IEEE Robot Autom Lett 2(2):404–411

  14. Bangura M (2017) Aerodynamics and control of quadrotors. Ph.D. thesis, College of Engineering and Computer Science, The Australian National University

  15. Omari S, Hua M-D, Ducard G, Hamel T (2013) Nonlinear control of VTOL UAVs incorporating flapping dynamics. In: Proceedings of the 2013 IEEE/RSJ international conference on intelligent robots and systems, Tokyo, Japan, pp 2419–2425

  16. Kai J-M, Allibert G, Hua M-D, Hamel T (2017) Nonlinear feedback control of quadrotors exploiting first-order drag effects. IFAC-PapersOnLine 50(1):8189–8195

  17. Faessler M, Franchi A, Scaramuzza D (2017) Differential flatness of quadrotor dynamics subject to rotor drag for accurate tracking of high-speed trajectories. IEEE Robot Autom Lett 3(2):620–626

  18. Suter D, Hamel T, Mahony R (2002) Visual servo control using homography estimation for the stabilization of an X4-flyer. In: Proceedings of the 41st IEEE conference on decision and control, vol 3. Las Vegas, NV, pp 2872–2877

  19. Moeini A, Lynch A, Zhao Q (2020) Disturbance observer-based nonlinear control of a quadrotor UAV. Adv Control Appl Eng Ind Syst 2(1):1–20

  20. Nisar B, Foehn P, Falanga D, Scaramuzza D (2019) VIMO: simultaneous visual inertial model-based odometry and force estimation. IEEE Robot Autom Lett 4(3):2785–2792

  21. Moeini A, Lynch AF, Zhao Q (2021) Exponentially stable motion control for multirotor UAVs with rotor drag and disturbance compensation. J Intell Robot Syst 103(1):15

  22. Supplementary material to: Visual-inertial-actuator odometry for multirotor UAVs subject to rotor drag and external disturbance. Technical report (2022). https://tinyurl.com/4htyv2fj

  23. Furgale P, Rehder J, Siegwart R (2013) Unified temporal and spatial calibration for multi-sensor systems. In: Proceedings of the 2013 IEEE/RSJ international conference on intelligent robots and systems, Tokyo, Japan, pp 1280–1286

  24. Civera J, Davison A, Montiel J (2008) Inverse depth parametrization for monocular SLAM. IEEE Trans Rob 24(5):932–945

  25. Nister D (2004) An efficient solution to the five-point relative pose problem. IEEE Trans Pattern Anal Mach Intell 26(6):756–770

  26. Lepetit V, Moreno-Noguer F, Fua P (2008) EPnP: an accurate O(n) solution to the PnP problem. Int J Comput Vis 81(2):155

  27. Xu Y, Chen Z, Luo ACJ (2019) On bifurcation trees of period-1 to period-2 motions in a nonlinear Jeffcott rotor system. Int J Mech Sci 160:429–450

  28. Antonini A, Guerra W, Murali V, Sayre-McCord T, Karaman S (2020) The Blackbird UAV dataset. Int J Robot Res 39(10–11):1346–1364

  29. Zhang Z, Scaramuzza D (2018) A tutorial on quantitative trajectory evaluation for visual(-inertial) odometry. In: 2018 IEEE/RSJ international conference on intelligent robots and systems (IROS), Madrid, Spain, pp 7244–7251

  30. Prokhorov D, Zhukov D, Barinova O, Anton K, Vorontsova A (2019) Measuring robustness of visual SLAM. In: 16th international conference on machine vision applications, Tokyo, Japan, pp 1–6

Funding

Natural Sciences and Engineering Research Council of Canada (NSERC), Award Number: RGPIN-2023-04156, Recipient: Alan Lynch.

Author information

Contributions

All authors contributed equally.

Corresponding author

Correspondence to Alan F. Lynch.

Ethics declarations

Conflict of interest

The authors have no relevant financial or nonfinancial interests to disclose.

Informed consent

Not applicable.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Moeini, A., Lynch, A.F. & Zhao, Q. Visual-inertial-actuator odometry for multirotor UAVs with rotor drag and external disturbance. Int. J. Dynam. Control (2024). https://doi.org/10.1007/s40435-023-01357-5

Keywords

Navigation