Autonomous Drone Solution
The Red Balloon Finder project was written to enable a companion computer such as a Raspberry Pi or Intel Aero to control an ArduCopter-based quadcopter to find and pop a 1 m red balloon. The Python code that runs on the board can be found in the Autonomous section of the git repository.
In vision-based systems, there are many types of hardware/software configurations tailored for specific applications: Visual Servoing, Visual Odometry and Visual Simultaneous Localization And Mapping (SLAM). In this project we use the first type, Visual Servoing, which is designed for:
Take-off and landing
Obstacle avoidance/tracking
Position and attitude control
Stabilization over a target
The main idea of Visual Servoing is to regulate the pose {Cξ,T} (position and orientation) of a robotic platform relative to a target, using a set of visual features {f} extracted from the sensors.
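The regulation idea above can be sketched as a simple proportional control law: the command is proportional to the error between the measured features and the desired features. This is a minimal illustration, not the project's actual controller; the feature values and gain are made-up examples.

```python
# Proportional visual-servoing sketch: drive the error between measured
# features f and desired features f_star to zero with a gain.

def servo_velocity(f, f_star, gain=0.8):
    """Return a velocity command proportional to the feature error."""
    return [-gain * (fi - fsi) for fi, fsi in zip(f, f_star)]

# Example: the target blob is seen at pixel (400, 300) but we want it at
# the image centre (320, 240).
cmd = servo_velocity([400, 300], [320, 240])
print(cmd)  # [-64.0, -48.0]: move so the blob drifts towards the centre
```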
Randy's Target Tracking is an Image-Based Visual Servoing (IBVS) system, in which 2D image features are used directly to calculate the control values. We exploit a hybrid method where the size of the object is known a priori, making the estimation of distance along the Z axis possible. In the example below, where the system follows a moving target at a fixed distance, we can relate the target position to the camera's projection plane. In this tracker, we apply colour and shape (blob) filtering in order to extract a location on the camera plane.
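Because the balloon's real diameter (about 1 m) is known a priori, the pinhole camera model lets us recover the distance along the Z axis from the blob's apparent size in pixels. The sketch below illustrates the idea; the focal length in pixels is a made-up example value, not the project's camera calibration.

```python
# Pinhole model: Z = f_pixels * real_diameter / pixel_diameter

def distance_from_blob(pixel_diameter, real_diameter=1.0, f_pixels=700.0):
    """Estimate distance (m) to an object of known size from its blob width.

    f_pixels is the camera focal length expressed in pixels (assumed value).
    """
    return f_pixels * real_diameter / pixel_diameter

print(distance_from_blob(70))  # a 70 px wide balloon is ~10 m away
```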
The Pixhawk retains all the regular ArduCopter functionality but, in addition, accepts commands from an external navigation computer (the onboard companion board) when it is in the Guided flight mode, or when it is in AUTO and running the newly created MAVLink NAV_GUIDED mission command. This NAV_GUIDED command takes some special arguments, including a timeout (in seconds), a minimum and maximum altitude, and a maximum distance. If the Odroid takes too long to find the balloon, or gives bad commands that would cause the copter to fly away, the Pixhawk retakes control.
While the board is in control, it first requests that the Pixhawk rotate the copter slowly while it pulls images from the integrated camera and uses OpenCV to search for blobs of red in the images. During the search it keeps track of the largest red blob it has seen, so after the copter has rotated 360 degrees it sends commands to point the vehicle in the direction of that blob, and then sends 3D velocity requests five times per second to guide the copter towards the top of the balloon.
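The velocity requests can be derived from the blob's pixel position: the offset from the image centre gives a direction in the camera frame, which is normalised and scaled to a desired speed. This is a hedged sketch of that geometry, not the project's code; the image size and field of view are assumed example values.

```python
import math

# Assumed camera parameters (example values, not from the project).
IMG_W, IMG_H = 640, 480
HFOV = math.radians(70.0)                   # assumed horizontal field of view
FOCAL = (IMG_W / 2) / math.tan(HFOV / 2)    # focal length in pixels

def velocity_towards(px, py, speed=1.0):
    """3D velocity (camera frame: x forward, y right, z down) towards a pixel."""
    x = FOCAL                 # forward component
    y = px - IMG_W / 2        # right of centre -> positive y
    z = py - IMG_H / 2        # below centre -> positive z (down)
    norm = math.sqrt(x * x + y * y + z * z)
    return (speed * x / norm, speed * y / norm, speed * z / norm)

vx, vy, vz = velocity_towards(320, 240)
print(vx, vy, vz)  # blob dead centre -> fly straight ahead: 1.0 0.0 0.0
```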
To start the project we need to:
| Code | Function |
| --- | --- |
| attitude_history.py | Provides delayed attitude and location information. |
| balloon_config.py | Handles configuration for the balloon_finder project. |
| balloon_strategy.py | The main script that initialises all the classes. |
| balloon_utils.py | Utility functions for the balloon_finder project. |
| balloon_video.py | Initialises the camera and the output video. |
| colour_finder.py | Helps find the min and max hue, saturation and brightness of a desired object. |
| find_balloon.py | Finds the pixel location of red balloons. |
| linux_run_strategy.sh | Initialises the drone on ArduCopter. |
| position_vector.py | Holds a 3-axis position offset from home, in metres. |
Python's multiprocessing is used to allow the image capture and image processing to run in separate processes (so they run in parallel).