Files in this item: CHEN-THESIS-2017.pdf (application/pdf, 14 MB)

Title: Graphical SLAM for urban UAV navigation
Author(s): Chen, Derek
Advisor(s): Gao, Grace X
Department / Program: Aerospace Engineering
Discipline: Aerospace Engineering
Degree Granting Institution: University of Illinois at Urbana-Champaign
Subject(s): Global Positioning System (GPS)
Abstract: In recent years, there has been rising interest in commercial applications of Unmanned Air Vehicles (UAVs). Examples of their usage include aerial photography, infrastructure inspection, and emergency first response. For widespread commercial adoption of these applications to occur, UAV navigation must be made safe and reliable. In open-sky environments, Global Positioning System (GPS) receivers are most commonly used to provide accurate and globally referenced positioning for UAVs. However, many applications, such as consumer product delivery, require UAVs to operate in densely populated urban environments. In these environments, buildings and structures reflect and block GPS signals, leading to multipath and low satellite visibility. These factors create GPS-challenged environments that result in large errors in UAV positioning or make GPS unavailable. To improve urban GPS, one approach uses environment modeling, such as 3D city models, to mitigate the effects of multipath and non-line-of-sight (NLOS) errors. Others pair GPS with odometry measurements from relative positioning sensors, such as Light Detection and Ranging (LiDAR) sensors. LiDAR-based odometry provides an accurate relative navigation solution in GPS-challenged environments, but requires distinguishable features in the surrounding environment and is susceptible to drift and biases. As a result, there is a need for sensor fusion techniques that can provide reliable and robust positioning in urban environments. In this thesis, we apply a Simultaneous Localization and Mapping (SLAM) approach to fuse GPS pseudorange measurements with LiDAR point clouds and 3D building footprint data of the region, for UAV trajectory estimation and environment mapping. Our approach consists of three main aspects: graphical modeling, map-based processing, and inference. First, we use a probabilistic graph, specifically a directed acyclic graph, to model the trajectory of the UAV.
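The graphical model described in the abstract — UAV states and GPS satellites as nodes, sensor measurements as directed edges — can be sketched as a small data structure. The following is a minimal illustrative sketch only; the class and field names are hypothetical and are not drawn from the thesis implementation.

```python
# Illustrative sketch of a directed acyclic trajectory graph: UAV states
# and GPS satellites are nodes; sensor measurements create directed edges.
# All names here are invented for illustration.

class Node:
    def __init__(self, node_id, kind, state):
        self.id = node_id      # unique identifier
        self.kind = kind       # "uav" or "satellite"
        self.state = state     # e.g. an (x, y, z) position estimate

class Edge:
    def __init__(self, src, dst, measurement, kind):
        self.src = src                 # parent node id
        self.dst = dst                 # child node id
        self.measurement = measurement # odometry delta, pseudorange, ...
        self.kind = kind               # "odometry", "pseudorange", ...

class TrajectoryGraph:
    """Directed acyclic graph over UAV states and GPS satellites."""
    def __init__(self):
        self.nodes = {}
        self.edges = []

    def add_node(self, node_id, kind, state):
        self.nodes[node_id] = Node(node_id, kind, state)

    def add_edge(self, src, dst, measurement, kind):
        self.edges.append(Edge(src, dst, measurement, kind))

    def edges_into(self, node_id):
        """Edges directed at a node (used later for inference)."""
        return [e for e in self.edges if e.dst == node_id]

# Usage: two UAV states linked by a LiDAR odometry edge, plus one
# pseudorange edge from a satellite node (all values invented).
g = TrajectoryGraph()
g.add_node("x0", "uav", (0.0, 0.0, 10.0))
g.add_node("x1", "uav", (1.0, 0.0, 10.0))
g.add_node("sv5", "satellite", (1.2e7, 1.8e7, 1.5e7))
g.add_edge("x0", "x1", (1.0, 0.0, 0.0), "odometry")
g.add_edge("sv5", "x1", 2.2e7, "pseudorange")
```

As the UAV navigates, new state nodes and measurement edges are appended, which is what allows the maps anchored at those states to grow with the trajectory.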
Nodes in the graph represent states of the UAV or GPS satellites, while edges represent relations between states created by sensor measurements. We then use the graph to structure our environment maps. We represent our environment in two mapping formats: a point cloud map and an urban building map. The point cloud map is formulated by anchoring each collected LiDAR point cloud to its respective state. The result is a large point cloud collected throughout the trajectory of the UAV. The urban building map is a geometric representation of the large-scale structures in the environment. It is first initialized with available sources of 3D building model data for the navigating region. In this work, we use Champaign building footprint data from the State of Illinois data portal. We then run plane-fitting algorithms on the collected point clouds at each state to update the urban building map. As the UAV navigates, the graph is populated by additional nodes and corresponding LiDAR measurements, allowing for SLAM of the environment. Next, we apply the formulated maps in two ways: mitigation of errors resulting from reflected GPS signals, and map matching with existing or previously collected maps of the region. The urban building map is first initialized with city building footprint data. We then draw line-of-sight vectors from the UAV to each satellite and identify NLOS satellites from intersections with the urban building map. Next, we use the density of surrounding buildings to identify satellite measurements potentially affected by multipath. After identifying multipath-affected GPS measurements, we propose a multipath model via covariance adjustment to deweight their effects on the UAV state estimate. We then append our graph with additional map matching edges. Based on the probabilistic distribution of a state, we generate and propagate particles representing potential states of the UAV.
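The NLOS check and covariance deweighting described above can be illustrated with a simplified 2D geometry: building footprints become wall segments with a height, and a satellite is flagged as NLOS when the line-of-sight ray from the UAV passes below a wall top. This is a hedged sketch under invented conventions (east-north azimuth, hypothetical inflation factors), not the thesis model.

```python
# Simplified NLOS test: a satellite's line of sight is blocked if the
# ray from the UAV toward it intersects a building wall below its top.
# Conventions and parameter values are illustrative assumptions.
import math

def _ray_segment_t(origin, d, p1, p2):
    """Horizontal distance along ray (origin, unit d) to segment p1-p2,
    or None if they do not intersect in the forward direction."""
    ex, ey = p2[0] - p1[0], p2[1] - p1[1]
    denom = d[0] * ey - d[1] * ex
    if abs(denom) < 1e-12:          # ray parallel to the wall
        return None
    ox, oy = p1[0] - origin[0], p1[1] - origin[1]
    t = (ox * ey - oy * ex) / denom   # distance along the ray
    s = (d[1] * ox - d[0] * oy) / denom  # position along the segment
    return t if t > 0 and 0.0 <= s <= 1.0 else None

def los_blocked(uav, sat_elev_deg, sat_az_deg, walls):
    """uav = (x, y, z); walls = [((x1, y1), (x2, y2), height), ...].
    True if the LOS ray crosses any wall below the wall's top."""
    el = math.radians(sat_elev_deg)
    az = math.radians(sat_az_deg)
    dx, dy = math.sin(az), math.cos(az)  # east-north horizontal direction
    for (p1, p2, h) in walls:
        t = _ray_segment_t(uav[:2], (dx, dy), p1, p2)
        if t is not None:
            z_at_wall = uav[2] + t * math.tan(el)  # LOS height at the wall
            if z_at_wall < h:
                return True
    return False

def deweight_sigma(base_sigma, nlos, multipath_suspect,
                   k_nlos=10.0, k_mp=3.0):
    """Inflate a pseudorange standard deviation for flagged satellites
    (inflation factors are illustrative, not the thesis values)."""
    if nlos:
        return base_sigma * k_nlos
    if multipath_suspect:
        return base_sigma * k_mp
    return base_sigma
```

For example, a UAV at 10 m altitude looking east at a 30-degree elevation satellite is blocked by a 20 m wall 5 m away, but a 70-degree satellite clears it; inflating the flagged measurement's sigma shrinks its weight in the state estimate rather than discarding it outright.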
Then, using the urban map, we compare the expected buildings observed by each particle with the buildings observed from the LiDAR measurements. We find the most likely particle and use it as a constraint measurement in the graph. We then compare the point cloud collected at each time step with the point cloud map to perform loop closure and create constraint measurements to the initial position of the UAV. Afterwards, we take a probabilistic approach to trajectory estimation in the graphical inference step. Using the edges directed at the UAV nodes in the graph, we formulate a joint probability that represents the likelihood of the state estimate given the collected measurements. We then perform inference on the graph and formulate a Maximum A Posteriori (MAP) estimate of the UAV trajectory. Since the collected LiDAR point clouds are anchored at the corresponding state estimates, we simultaneously optimize for the maps generated by the system. Finally, we experimentally validate our algorithm by presenting the results of a series of UAV flight tests in both GPS-challenged and GPS-friendly environments on and near the University of Illinois at Urbana-Champaign campus. We show that our probabilistic graphical sensor fusion approach provides an accurate and available navigation solution that allows a UAV to navigate an urban environment under the presence of GPS signal reflections and occlusions.
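The MAP inference step can be made concrete with a toy example: when each edge contributes a Gaussian factor, maximizing the joint probability is equivalent to weighted least squares over the trajectory. The sketch below solves a 1D problem with two UAV states and three invented factors (a prior, an odometry delta, and a GPS-like absolute fix) via the normal equations; it illustrates the principle, not the thesis solver.

```python
# Toy MAP estimate over a 1D trajectory (x0, x1) with Gaussian factors:
#   x0 ~ prior,  x1 - x0 ~ odom,  x1 ~ gps
# Maximizing the joint probability = minimizing the weighted squared
# residuals, i.e. solving H x = g with H = A^T W A, g = A^T W b for
# factor rows A = [[1, 0], [-1, 1], [0, 1]] and b = [prior, odom, gps].

def map_estimate_1d(prior, odom, gps, s_prior, s_odom, s_gps):
    # Information (inverse-variance) weight of each factor.
    wp, wo, wg = 1 / s_prior**2, 1 / s_odom**2, 1 / s_gps**2
    # Normal-equation matrix and right-hand side, assembled by hand.
    H = [[wp + wo, -wo],
         [-wo, wo + wg]]
    g = [wp * prior - wo * odom,
         wo * odom + wg * gps]
    # Solve the 2x2 system by Cramer's rule.
    det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
    x0 = (g[0] * H[1][1] - H[0][1] * g[1]) / det
    x1 = (H[0][0] * g[1] - H[1][0] * g[0]) / det
    return x0, x1

# Usage: consistent measurements (prior 0, odometry +1, GPS fix at 1)
# recover x0 = 0, x1 = 1 regardless of the chosen sigmas.
print(map_estimate_1d(0.0, 1.0, 1.0, 1.0, 1.0, 1.0))
```

Because the point clouds are anchored at the state estimates, re-solving this least-squares problem after each new edge (including map-matching and loop-closure constraints) updates the maps and the trajectory together, which is the simultaneous optimization the abstract describes.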
Issue Date: 2017-09-08
Rights Information: Copyright 2017 Derek Chen
Date Available in IDEALS: 2018-03-13
Date Deposited: 2017-12
