Real-time onboard 3D state estimation of an unmanned aerial vehicle in multi-environments using multi-sensor data fusion

Research output: Contribution to journal › Article › peer-review

48 Scopus citations

Abstract

Estimating the state of an unmanned aerial vehicle (UAV) in real time across multiple environments remains a challenge. Although the global navigation satellite system (GNSS) is widely used, a drone cannot estimate its position when the GNSS signal is unavailable or disturbed. In this paper, the multi-environment state estimation problem is solved by employing an extended Kalman filter (EKF) to fuse data from multiple heterogeneous sensors (MHS): an inertial measurement unit (IMU), a magnetometer, a barometer, a GNSS receiver, an optical flow sensor (OFS), Light Detection and Ranging (LiDAR), and an RGB-D camera. Field flights in unstructured, indoor, outdoor, and indoor-outdoor transition scenarios verify the robustness and effectiveness of the EKF-based multi-sensor data fusion system.
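To illustrate the predict/update structure of the EKF fusion the abstract describes, here is a minimal sketch reduced to one axis: IMU vertical acceleration drives the prediction of an [altitude, velocity] state, and a barometric altitude reading drives the correction. All noise values, the dimensionality, and the sensor model are illustrative assumptions, not the paper's actual parameters or implementation.

```python
def predict(x, P, accel, dt, q=0.05):
    """Propagate state x = [altitude, velocity] with an IMU vertical
    acceleration reading. P is the 2x2 covariance flattened row-major
    as [p00, p01, p10, p11]. q is an assumed process-noise level."""
    z, v = x
    z_new = z + v * dt + 0.5 * accel * dt * dt
    v_new = v + accel * dt
    # Covariance propagation P = F P F^T + Q with F = [[1, dt], [0, 1]]
    p00, p01, p10, p11 = P
    p00n = p00 + dt * (p01 + p10) + dt * dt * p11 + q
    p01n = p01 + dt * p11
    p10n = p10 + dt * p11
    p11n = p11 + q
    return [z_new, v_new], [p00n, p01n, p10n, p11n]

def update_baro(x, P, z_meas, r=0.5):
    """Correct the state with a barometer altitude measurement.
    Measurement model H = [1, 0]; r is an assumed measurement noise."""
    p00, p01, p10, p11 = P
    s = p00 + r                       # innovation covariance S = H P H^T + R
    k0, k1 = p00 / s, p10 / s         # Kalman gain K = P H^T / S
    innov = z_meas - x[0]             # measurement residual
    x = [x[0] + k0 * innov, x[1] + k1 * innov]
    # Covariance update P = (I - K H) P
    P = [(1 - k0) * p00, (1 - k0) * p01,
         p10 - k1 * p00, p11 - k1 * p01]
    return x, P
```

In a full system each additional sensor (GNSS, optical flow, LiDAR, RGB-D) would contribute its own `update_*` step with its own measurement model, which is what lets the filter keep estimating when any one source drops out. Running the loop with a constant barometer reading drives the altitude estimate toward that reading over successive cycles.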

Original language: English
Article number: 919
Journal: Sensors (Switzerland)
Volume: 20
Issue number: 3
DOIs
State: Published - 1 Feb 2020
Externally published: Yes

Keywords

  • Multi-environments
  • Multi-sensor data fusion
  • State estimation
  • Unmanned aerial vehicle
