Lab #5: Virtual AR-Drone


The objective of this lab is to practice working with the simulated AR-Drone. I encourage you to work in teams if you'd like to. You are welcome to copy code from the internet, but make sure to credit where it came from.


Upon successful completion of this lab, you will be able to:

  1. Send commands to a simulated AR-Drone
  2. Receive imagery from the AR-Drone and do vision processing to locate a landing target
  3. Implement control logic to search and land on a landing target
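
The landing step in objective 3 can be as simple as a proportional controller that nudges the drone until the detected target sits under the center of the downward camera. A minimal sketch in plain Python; the gain, deadband, and sign conventions below are illustrative guesses that you would tune in the simulator, not values prescribed by the lab:

```python
def centering_command(target_px, frame_w, frame_h, gain=0.002, deadband=15):
    """Map a target centroid in image coordinates to a lateral velocity command.

    target_px: (x, y) pixel position of the target in the downward camera image.
    Returns (vx, vy, land), where land is True once the target is roughly
    centered and it should be safe to issue a land command.
    """
    x, y = target_px
    ex = x - frame_w / 2.0   # +ve: target is right of image center
    ey = y - frame_h / 2.0   # +ve: target is below image center
    if abs(ex) < deadband and abs(ey) < deadband:
        return (0.0, 0.0, True)   # close enough to centered: land
    # Sign conventions are an assumption about the camera mounting;
    # flip them if the drone drifts away from the target instead of toward it.
    vx = -gain * ey   # target low in frame -> move backward
    vy = -gain * ex   # target right of center -> move right
    return (vx, vy, False)
```

In practice you would feed `vx` and `vy` into whatever velocity-command interface the provided repository exposes, and only trigger the actual landing once `land` has been True for a few consecutive frames.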

Your goal is to launch the AR-Drone, search for the orange landing target on the ground, and then land on it. James has provided a git repository that includes the libraries you need, a world with the landing pad, and example code to get you started. It also contains documentation that you might find useful.
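
One common way to find the orange target in the camera imagery is simple color thresholding followed by a centroid computation. Here is a minimal sketch in plain Python that assumes the frame is a list of rows of (r, g, b) tuples; the RGB thresholds are rough guesses for "orange" that you would tune against the pad's actual color, and a real implementation would more likely use OpenCV on the image stream:

```python
def find_target(frame, min_pixels=10):
    """Return the (x, y) pixel centroid of orange-ish pixels, or None.

    frame: list of rows, each row a list of (r, g, b) tuples in 0-255.
    min_pixels guards against a few stray pixels triggering a false detection.
    """
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if r > 180 and 60 < g < 160 and b < 100:  # crude "orange" test
                xs.append(x)
                ys.append(y)
    if len(xs) < min_pixels:  # too few hits: treat as no detection
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

Thresholding in HSV space is usually more robust to lighting than raw RGB, but the structure of the detector stays the same: mask the target color, then take the centroid of the mask.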

You should implement some sort of searching algorithm to locate the target. In the given simulated world, the platform starts close to and directly in front of the AR-Drone, so simply flying forward until you see it is not a valid strategy. Whatever solution you come up with should work even if the target is moved to another location; you can move the target around in Gazebo to test your searching algorithm.
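
One search pattern that does not depend on where the target sits is an expanding square spiral out from the takeoff point, flying leg by leg until the vision code reports a detection. A sketch of the waypoint generation in plain Python; the step size and number of legs are illustrative tuning parameters, not values prescribed by the lab:

```python
def spiral_waypoints(step=1.0, legs=6):
    """Generate (x, y) waypoints of an expanding square spiral from the origin.

    Leg lengths grow as 1, 1, 2, 2, 3, 3, ... steps, turning 90 degrees
    after each leg, so coverage expands outward from the takeoff point.
    """
    x = y = 0.0
    dx, dy = step, 0.0
    pts = []
    for leg in range(legs):
        length = leg // 2 + 1      # 1, 1, 2, 2, 3, 3, ...
        x += dx * length
        y += dy * length
        pts.append((x, y))
        dx, dy = -dy, dx           # rotate heading 90 degrees
    return pts
```

The control loop would then fly toward the next waypoint, checking the camera on every frame, and switch to the centering/landing behavior as soon as the target is detected.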

Hand in
In Blackboard, please submit the following:
  1. A text file with:
    • Your name
    • The names of any people you worked with (optional)
    • Any on-line references / websites you found particularly useful
    • A short description of your searching algorithm for locating the target
  2. A video of your AR-Drone searching for and landing on the target in 2 different locations.
  3. The code you wrote for the assignment and any instructions needed to execute it.

Grading
  1. Text file content: 30
  2. Code for searching algorithm: 30
  3. Code for landing on target: 20
  4. Video of the quadrotor searching for and landing on the target in 2 different locations: 20