Mapping on Demand: Local Perception and 3D Navigation

I also participated in a micro aerial vehicle (MAV) project as a research assistant. The goal of this project is to develop a MAV featuring real-time state estimation, obstacle detection, mapping, and navigation planning. The project focuses on the local detection and representation of obstacles around a lightweight MAV; for this, measurements from a laser range scanner, ultrasonic distance sensors, and stereo cameras are fused.


Our MAV integrates a multimodal sensor setup for omnidirectional environment perception and 6D state estimation. I was mainly responsible for processing the images from the six stereo cameras to detect AprilTags in a complex environment; this work contributed to localizing the MAV and building the map. The outcomes were published at the International Conference on Unmanned Aircraft Systems (ICUAS), a premier international conference in the field.
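
As a rough sketch of the per-camera detection step, the snippet below runs an off-the-shelf AprilTag detector on a single grayscale frame. The `pupil_apriltags` library, the `tag36h11` family, the tag size, and the intrinsics are assumptions for illustration; the onboard pipeline may use a different detector and calibration.

```python
# Minimal AprilTag detection sketch (assumed library: pupil_apriltags).
# Tag family, tag size, file name, and intrinsics are illustrative.
import cv2
from pupil_apriltags import Detector

detector = Detector(families="tag36h11")  # common AprilTag family (assumption)

# Hypothetical pinhole intrinsics (fx, fy, cx, cy) of one camera.
camera_params = (600.0, 600.0, 320.0, 240.0)
TAG_SIZE = 0.16  # tag edge length in meters (assumption)

frame = cv2.imread("camera0_frame.png", cv2.IMREAD_GRAYSCALE)
detections = detector.detect(
    frame,
    estimate_tag_pose=True,        # also recover each tag's pose ...
    camera_params=camera_params,   # ... from the pinhole intrinsics
    tag_size=TAG_SIZE,
)

for det in detections:
    # pose_t is the tag position in the camera frame (meters).
    print(f"tag {det.tag_id}: t = {det.pose_t.ravel()}")
```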

Scheme of the sensors, actuators, computers, and bus systems on our MAV. We use a high-bandwidth USB 3.0 connection for the cameras due to their high data rates, and lower-bandwidth buses for the laser scanner and flight control. The dashed lines indicate a WiFi connection and connections that are inactive during flight.


AprilTags detected and localized in the 3D environment using the images captured by the six cameras mounted on the MAV. A SLAM method localizes the MAV and builds a map of the AprilTags simultaneously.
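
The "simultaneous" part can be illustrated with a toy update rule: a tag already in the map constrains the MAV position, while an unknown tag is added to the map at the current estimate. This is a minimal sketch with made-up names and a naive blend standing in for a proper SLAM back end.

```python
# Toy tag-SLAM loop: known tags localize the MAV, new tags extend the map.
# Names and the 50/50 blend are illustrative, not the actual estimator.
tag_map = {}  # tag id -> world-frame position (3-vector)

def process_detection(tag_id, p_rel_world, mav_pos):
    """One SLAM-style update from a single tag observation.

    p_rel_world: tag position relative to the MAV, already rotated
                 into the world frame (orientation assumed known here)
    mav_pos:     current MAV position estimate in the world frame
    """
    if tag_id in tag_map:
        # Localization: a known tag constrains the MAV position.
        corrected = tag_map[tag_id] - p_rel_world
        mav_pos = 0.5 * (mav_pos + corrected)  # naive blend, not a filter
    else:
        # Mapping: a new tag is anchored at the current estimate.
        tag_map[tag_id] = mav_pos + p_rel_world
    return mav_pos
```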

AprilTags are tracked in the environment; we obtain the positions of the tracked tags relative to the camera.
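
For illustration, tracking tags across frames can be as simple as filtering each tag's camera-frame position per id. The sketch below assumes an exponential filter and a made-up data layout; it stands in for whatever tracker the system actually uses.

```python
# Per-id tag tracking sketch: each tag's camera-frame position is
# smoothed over frames. The smoothing factor is an assumption.
import numpy as np

ALPHA = 0.3   # smoothing factor (assumption)
tracks = {}   # tag id -> filtered camera-frame position

def update_tracks(detections):
    """detections: iterable of (tag_id, p_cam) pairs for one frame,
    where p_cam is the tag position relative to the camera."""
    for tag_id, p_cam in detections:
        p_cam = np.asarray(p_cam, dtype=float)
        if tag_id in tracks:
            tracks[tag_id] = (1 - ALPHA) * tracks[tag_id] + ALPHA * p_cam
        else:
            tracks[tag_id] = p_cam
    return tracks
```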

Map of AprilTag detections from all six cameras during a test flight. Each dot, colored by tag id, is one detection; clusters of detections of the same tag are circled. The tags are projected into the world frame using position estimates from our localization (red arrows). Grid lines indicate 1 m distances.
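
The projection into the world frame mentioned here is a chain of two rigid transforms: camera to body (extrinsic calibration) and body to world (the localization estimate). A minimal sketch, with assumed frame names and example values:

```python
# Lift a camera-frame tag position into the world frame by chaining
# two rigid transforms. Frame names and example values are assumptions.
import numpy as np

def to_world(p_cam, R_bc, t_bc, R_wb, t_wb):
    """p_cam:      tag position in the camera frame (3-vector)
    R_bc, t_bc: camera pose in the MAV body frame (extrinsics)
    R_wb, t_wb: MAV body pose in the world frame (localization)"""
    p_body = R_bc @ p_cam + t_bc   # camera frame -> body frame
    return R_wb @ p_body + t_wb    # body frame  -> world frame

# Example: identity extrinsics, MAV hovering 1 m above the origin.
p = to_world(np.array([0.5, 0.0, 2.0]),
             np.eye(3), np.zeros(3),
             np.eye(3), np.array([0.0, 0.0, 1.0]))
print(p)  # -> [0.5 0.  3. ]
```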