Lab #6: Vision and Control

Introduction

The main objective of this lab is to leverage what we've learned so far in order to control an ARDrone. I recommend you work in small groups, but you may also work alone if you prefer.

Objectives

Upon successful completion of this lab, you will be able to:

  1. Successfully control the ARDrone using ROS.
  2. Estimate distances from the ARDrone based on landmarks.
  3. Demonstrate basic reactive control.
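For objective 2, a common way to estimate range from a known landmark is the pinhole camera model: distance = (real width × focal length in pixels) / apparent width in pixels. A minimal sketch follows; the focal length and marker size in the example are made-up values, so calibrate against your own drone's camera before trusting the numbers.

```python
def estimate_distance(real_width_m, pixel_width, focal_length_px):
    """Pinhole-camera range estimate: distance = W * f / w.

    real_width_m    -- known physical width of the landmark (meters)
    pixel_width     -- apparent width of the landmark in the image (pixels)
    focal_length_px -- camera focal length expressed in pixels
    """
    if pixel_width <= 0:
        raise ValueError("landmark not visible (pixel_width must be > 0)")
    return real_width_m * focal_length_px / pixel_width

# Assumed example values: a 0.2 m marker appearing 50 px wide through a
# lens with a 700 px focal length works out to 2.8 m of range.
print(estimate_distance(0.2, 50, 700))  # -> 2.8
```

You can recover the focal length in pixels by holding a marker of known width at a known distance and inverting the same formula.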
Warning

The drones can be fickle. I suggest you keep the code so that the spacebar triggers an emergency stop; you really don't want to hit something or someone. If a drone breaks, we'll be able to fix it, but I'd prefer to keep them intact! Please always use the foam bodies on the drones when flying them. They make the drones less likely to break if you hit something, and easier to grab. Remember that you can always turn off a drone by grabbing it and turning it upside down --- just don't grab the rotor!

Assignment
  • By the end of this lab, your robot should:
    1. Take off at some point within 10 feet of a target (the markers on the middle door of Dana 3, or another of your choosing).
    2. Rotate until you find your target.
    3. Move towards the target.
    4. Land three feet (horizontally) from the target location.
  • To get credit, please submit all modified code and a video via Angel.
  • To get full credit, your video should show your robot landing 1-5 feet from the target (you can show a tape measure in the video to verify this).
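The four steps above amount to a small reactive state machine: rotate until the target is seen, keep it centered while moving toward it, and land once close enough. A sketch of just the decision logic in plain Python is below; the tag observation inputs and the ~0.9 m (3 ft) landing threshold are assumptions, and in your real code the returned action would be translated into a geometry_msgs/Twist and published with rospy.

```python
LAND_DISTANCE_M = 0.9  # ~3 feet; assumed threshold, tune for your setup

def next_command(tag_visible, tag_offset_x, tag_distance_m):
    """Decide the next action from one tag observation.

    tag_visible    -- was the target detected in this frame?
    tag_offset_x   -- horizontal offset of the tag from image center, in [-1, 1]
    tag_distance_m -- estimated distance to the tag (meters)

    Returns one of 'rotate', 'turn_left', 'turn_right', 'forward', 'land'.
    """
    if not tag_visible:
        return 'rotate'        # step 2: spin in place until the target appears
    if tag_distance_m <= LAND_DISTANCE_M:
        return 'land'          # step 4: close enough, set down
    if tag_offset_x < -0.1:
        return 'turn_left'     # keep the target centered while approaching
    if tag_offset_x > 0.1:
        return 'turn_right'
    return 'forward'           # step 3: target centered, move in

print(next_command(False, 0.0, 0.0))  # -> rotate
print(next_command(True, 0.0, 2.5))   # -> forward
print(next_command(True, 0.0, 0.5))   # -> land
```

Keeping the decision function free of ROS calls like this also makes it easy to test against recorded data (see the rosbag hint below).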
Hints
  1. You can choose to use your vision code from the previous lab, or you could consider using the drone's built-in tag detection. This tutorial http://robohub.org/up-and-flying-with-the-ar-drone-and-ros-handling-feedback/ is part of a series of tutorials on using ROS and the ARDrone. It mentions keyboard_controller_with_tags.launch, which you may find useful, and its discussion of controlling the drone is also worth reading.
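If you use the built-in detector, the tag's position arrives in the drone's navdata message; I believe tags_xc reports the tag's horizontal coordinate on a 0-1000 scale across the image (500 = centered), but verify this against the ardrone_autonomy documentation. A simple proportional yaw controller on top of that reading might look like the sketch below; the gain is an assumed value you would tune experimentally.

```python
K_YAW = 0.002  # assumed proportional gain; tune experimentally

def yaw_rate_from_tag(tags_xc):
    """Proportional yaw command from the tag's horizontal coordinate.

    tags_xc -- tag x-coordinate as reported in navdata, assumed to be on
               a 0-1000 scale across the image (500 = centered); check
               the ardrone_autonomy docs to confirm the scale.

    Returns a signed yaw rate: zero when centered, larger the further
    the tag is from image center.
    """
    error = 500 - tags_xc  # positive when the tag is left of center
    return K_YAW * error

print(yaw_rate_from_tag(500))  # -> 0.0 (centered: no turn)
print(yaw_rate_from_tag(250))  # -> 0.5
```

In your node, the returned value would go into the angular.z field of the Twist you publish to cmd_vel.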
  2. If you haven't played with rosbag (http://wiki.ros.org/rosbag) yet, you should consider it. You can record some vision data and then replay it via ROS, which lets you test your code without having to worry about controlling the robot.
Turn in
  1. Submit any python code you modified via Angel.
  2. In half a page or less, in either plain text or .pdf, list:
    • Name of group member(s)
    • High-level approach to the project
    • What you could have done differently to make things easier on yourselves
    • What extra credit assignments you did, if any
    • A video showing a successful run, including any extra credit parts.
  3. Possible extra credit:
    • Show that your robot can successfully find the target, and land in front of it, from multiple starting positions.
    • Show that you can "kidnap" your robot. Once the robot has located the target, grab it with your hand and change its position/angle so that it has to re-identify the target.
    • A short second video is sufficient to show that your code works. However, if you're able to make a longer video that's informative and interesting (e.g., explaining what you did and how you did it to another computer science major), you will get some extra credit.
    • There will also be an in-class competition. The person who gets closest to the correct landing position wins!