# Autonomous Drone Solution

The Red Balloon Finder project enables a Raspberry Pi or Intel Aero companion computer to control an ArduCopter-based quadcopter so that it can find and pop a 1 m red balloon. The Python code that runs on the companion board can be found in the Autonomous section of the Git repository.

### Object detection theory

In vision-based systems there are many hardware/software configurations tailored to specific applications: Visual Servoing, Visual Odometry, and Visual Simultaneous Localization And Mapping (SLAM). This project uses the first of these, Visual Servoing, which is designed for tasks such as:

* Take-off and landing
* Obstacle avoidance/tracking
* Position and attitude control
* Stabilization over a target

The main idea of Visual Servoing is to regulate the pose (position and orientation) of a robotic platform relative to a target, using a set of visual features {f} extracted from the sensors.
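As a minimal illustration of that idea, the sketch below applies a proportional control law that drives the feature error (current minus desired features) toward zero. The function name and gain are hypothetical, not taken from the project code:

```python
def servo_velocity(features, desired_features, gain=0.5):
    """Proportional visual-servoing step: command a velocity
    proportional to the negative feature error, so the error
    (current - desired) is driven toward zero."""
    return [-gain * (f - f_star)
            for f, f_star in zip(features, desired_features)]

# Example: the target appears 40 px right of and 20 px below the
# desired image position, so we command a corrective velocity.
vx, vy = servo_velocity([40.0, 20.0], [0.0, 0.0])
```

Iterating this step while re-measuring the features each frame is what keeps the platform locked onto the target.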

![](https://450959332-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-Lj1MxEFF6NBjAkcTWn7%2F-Lj1N-trFp8FOuVMhQJp%2F-Lj1NVsiwvFn6D_F7DrI%2FScreenshot%20from%202018-04-02%2014-06-06.png?generation=1562334341898530\&alt=media)\
Randy's Target Tracking is an Image-Based Visual Servoing (IBVS) scheme, where 2D image features are used directly in the control calculations. We exploit a hybrid method in which the size of the object is known a priori, making the estimation of distance along the Z axis possible. In the example below, where the system follows a moving target at a fixed distance, we can relate the target position to the camera's projected plane. In this tracker, we apply colour and shape (blob) filtering in order to extract the target's location on the camera plane.

![](https://450959332-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-Lj1MxEFF6NBjAkcTWn7%2F-Lj1N-trFp8FOuVMhQJp%2F-Lj1NVsk7dQePoxrfugC%2FScreenshot%20from%202018-04-02%2014-06-24.png?generation=1562334342251741\&alt=media)
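A minimal, pure-Python sketch of the colour-filtering step is shown below. The real tracker uses OpenCV (HSV thresholding and blob extraction), so the function, thresholds, and image representation here are illustrative only:

```python
def red_blob_centroid(hsv_image, h_max=10, s_min=100, v_min=100):
    """Return the (row, col) centroid of pixels whose HSV values fall
    in a crude 'red' range, or None if no pixel matches.
    hsv_image is a nested list of (h, s, v) tuples, with hue in
    0-179 (OpenCV's convention)."""
    row_sum, col_sum, count = 0, 0, 0
    for r, row in enumerate(hsv_image):
        for c, (h, s, v) in enumerate(row):
            if h <= h_max and s >= s_min and v >= v_min:
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None
    return row_sum / count, col_sum / count
```

The centroid on the image plane is the feature {f} that the servoing loop then tries to keep at the desired position.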

### Communication

The Pixhawk retains all the regular ArduCopter functionality, but in addition it accepts commands from an external navigation computer on board when it is in the Guided flight mode, or in AUTO mode while running the newly created MAVLink NAV\_GUIDED mission command. This NAV\_GUIDED command takes some special arguments: a timeout (in seconds), a minimum and maximum altitude, and a maximum distance. If the companion board takes too long to find the balloon, or issues bad commands that cause the copter to fly away, the Pixhawk retakes control.
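The failsafe logic described above can be sketched as a simple limit check. The names below are illustrative only, not the actual MAVLink field names:

```python
from dataclasses import dataclass


@dataclass
class NavGuidedLimits:
    """Hypothetical container for the NAV_GUIDED safety arguments:
    timeout, altitude fence, and maximum distance from the start point."""
    timeout_s: float
    min_alt_m: float
    max_alt_m: float
    max_dist_m: float


def pixhawk_retakes_control(limits, elapsed_s, alt_m, dist_m):
    """Return True when any safety limit is breached, i.e. when the
    Pixhawk should abort guided control and resume the mission."""
    return (elapsed_s > limits.timeout_s
            or alt_m < limits.min_alt_m
            or alt_m > limits.max_alt_m
            or dist_m > limits.max_dist_m)
```

For example, with a 20 s timeout the copter is allowed to search freely within the altitude and distance fence, and control reverts the moment any one bound is exceeded.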

While the board is in control, it first requests that the Pixhawk rotate the copter slowly around while it pulls images from the integrated camera and uses OpenCV to search for red blobs in them. During the search it keeps track of the largest red blob it has seen, so after the copter has rotated 360 degrees it sends commands to point the vehicle in the direction of that blob, and then sends 3D velocity requests five times per second to guide the copter towards the top of the balloon.
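Those velocity requests can be pictured as converting the blob's bearing and elevation into a NED (north-east-down) velocity vector. This is a hedged sketch of the geometry, not the project's actual code:

```python
import math


def velocity_towards(bearing_deg, pitch_deg, speed=1.0):
    """Convert a bearing (degrees clockwise from north) and a pitch
    angle (degrees, positive up) into a NED velocity request
    (vx, vy, vz).  In NED, vz is positive downwards."""
    b = math.radians(bearing_deg)
    p = math.radians(pitch_deg)
    vx = speed * math.cos(p) * math.cos(b)   # north component
    vy = speed * math.cos(p) * math.sin(b)   # east component
    vz = -speed * math.sin(p)                # down component (negative = climb)
    return vx, vy, vz
```

Recomputing this vector from the latest blob position at 5 Hz is what steers the copter onto the balloon.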

### MultiProcessing

The project uses Python's [MultiProcessing](https://docs.python.org/dev/library/multiprocessing.html) module to allow the image capture and image processing to run in separate processes (so they run in parallel).
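A minimal sketch of this producer/consumer split is shown below, with stand-in functions replacing the real camera capture and blob search:

```python
import multiprocessing as mp


def capture(frames):
    """Stand-in for the camera-capture process: pushes frames into a queue."""
    for i in range(3):
        frames.put(f"frame-{i}")
    frames.put(None)                 # sentinel: signal end of stream


def process(frames, results):
    """Stand-in for the image-processing process: consumes frames as they arrive."""
    while True:
        frame = frames.get()
        if frame is None:
            break
        results.put(frame.upper())   # pretend this is the blob search


def run_pipeline():
    """Run capture and processing in two parallel processes,
    connected by queues, and collect the processed results."""
    frames, results = mp.Queue(), mp.Queue()
    workers = [mp.Process(target=capture, args=(frames,)),
               mp.Process(target=process, args=(frames, results))]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    return [results.get() for _ in range(3)]


if __name__ == "__main__":
    run_pipeline()
```

Because the two stages only share data through queues, a slow processing step never blocks the capture loop from grabbing the next frame.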

### Start Project

#### Simulation

To start the project we need to:

```
$ cd Xiaomin/Scripts/Autonomous/simulator
$ ./start.sh
```

#### Drone

```
$ workon cv
$ cd Xiaomin/Scripts/Autonomous/scripts/
$ python balloon_video.py
```

### Code Structure

| Code                    | Function                                                                        |
| ----------------------- | ------------------------------------------------------------------------------- |
| attitude\_history.py    | Provides delayed attitude and location information.                             |
| balloon\_config.py      | Handles configuration for the balloon\_finder project.                          |
| balloon\_strategy.py    | The main script that initializes all the classes.                               |
| balloon\_utils.py       | Utility functions for the balloon\_finder project.                              |
| balloon\_video.py       | Initializes the camera and the output video.                                    |
| colour\_finder.py       | Helps find the min and max hue, saturation and brightness of a desired object.  |
| find\_balloon.py        | Finds the pixel location of red balloons.                                       |
| linux\_run\_strategy.sh | Initializes the drone on ArduCopter.                                            |
| position\_vector.py     | Holds a 3-axis position offset from home, in metres.                            |

