Some time ago, we started working with the ARDrone, a cheap, commercially available toy originally intended for augmented-reality games. The drone has an open API, which allows reading its sensory data and obtaining images from its cameras. This makes it a useful platform for robotics research and has allowed us to use it in several proof-of-concept experiments. For a brief introduction to using the ARDrone for robotics research, see the article AR-Drone as a Platform for Robotics Research and Education.
The SDK provided by the manufacturer was too complex for our purposes, so we created a simple program that acquires the drone's sensory data, streams video from its cameras, and allows it to be controlled with a joystick or keyboard. The software is available for download.
The ARDrone's sensory equipment makes it suitable for monocular mapping, localization, and exploration tasks.
The following videos demonstrate the drone's ability to navigate autonomously in indoor and outdoor environments.
The drone is controlled by the method from the article Simple, yet stable bearing only navigation, which was originally designed to let ground robots traverse a given path.
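The core idea of the bearing-only approach is that heading corrections can be computed from the horizontal image displacement of features recorded during a mapping run: a histogram vote over the per-feature shifts gives a robust estimate of the heading error, while the distance traveled along the path comes from odometry. Below is a minimal sketch of that voting step; the function name, bin width, and gain are illustrative, not taken from the paper's implementation.

```python
from collections import Counter

def heading_correction(pairs, bin_width=10, gain=0.001):
    """
    Estimate a yaw correction from matched image features.

    `pairs` holds (mapped_u, observed_u) horizontal pixel coordinates of
    features matched between the mapping and navigation runs. The shifts
    are binned and the modal bin wins, which suppresses mismatched
    features; the winning shift is scaled by `gain` into a yaw command.
    """
    if not pairs:
        return 0.0
    # Horizontal displacement of each matched feature.
    shifts = [observed - mapped for mapped, observed in pairs]
    # Histogram voting: round each shift to a bin and pick the modal bin.
    bins = Counter(int(round(s / bin_width)) for s in shifts)
    modal_bin, _ = bins.most_common(1)[0]
    # Average only the shifts that fall into the winning bin (inliers).
    inliers = [s for s in shifts if int(round(s / bin_width)) == modal_bin]
    mean_shift = sum(inliers) / len(inliers)
    # Turn so as to cancel the displacement (sign and gain illustrative).
    return -gain * mean_shift
```

For example, three features shifted by about +20 px and one gross outlier at -200 px would produce a small negative yaw correction, with the outlier out-voted by the consistent majority.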
The extension of this method to UAVs is described in the article A simple visual navigation system for an UAV.
Note that the drone uses only its default sensory equipment (i.e., accelerometers, gyros, sonar and cameras); no external position reference (such as GPS) is used.
In the following videos, the drone takes snapshots of a series of locations. In the first case, it travels between these points using the aforementioned navigation method. To position the drone over the places of interest with sufficient precision, the planned path has to take the uncertainty of the quadcopter's position into account; see the article Surveillance Planning with Localization Uncertainty for UAVs.
In the second video, the drone takes snapshots of places that are not accessible to a ground robot. The robot travels autonomously to a predefined set of locations, where the drone takes off and flies over the places of interest. While flying, the drone uses the robot's helipad to determine its position relative to the robot. Once a place of interest has been photographed, the drone lands back on the robot, which then proceeds to the next location. For details, see the article Cooperative Micro UAV-UGV Autonomous Indoor Surveillance.
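The drone's position relative to the helipad can be recovered with a plain pinhole-camera model: once the helipad pattern is detected in the bottom camera, its pixel offset from the image centre, combined with the sonar-measured altitude, yields a metric displacement. The sketch below illustrates the geometry; the resolution and focal length are placeholder values, not our actual camera calibration.

```python
def helipad_offset(u, v, altitude, image_size=(640, 360), focal_px=700.0):
    """
    Convert the helipad's position in the downward-looking camera image
    into a metric (x, y) offset of the drone from the pad.

    u, v       -- pixel coordinates of the detected helipad centre
    altitude   -- height above the pad in metres (e.g. from the sonar)
    image_size -- camera resolution in pixels
    focal_px   -- focal length in pixels
                  (both are placeholder values, not a real calibration)
    """
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    # Pinhole model: metric offset = altitude * pixel offset / focal length.
    x = altitude * (u - cx) / focal_px
    y = altitude * (v - cy) / focal_px
    return x, y
```

A pad detected 70 px right of the image centre at 1 m altitude then corresponds to a lateral offset of 0.1 m, which a position controller can drive to zero before landing.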