
Special Engineering Project:

UAV Vision and Control System

3rd Year Project


Final Report

Author: Richard Lane


1456687
ee41rl@surrey.ac.uk

Project Supervisor: Dr. Richard Bowden


University of Surrey
School of Electronics & Physical Sciences

Department of Electronic Engineering

Final Year Project Dissertation

I confirm that the project dissertation I am submitting is entirely my own work and that any
material used from other sources has been clearly identified and properly acknowledged
and referenced. In submitting this final version of my report to the JISC anti-plagiarism
software resource, I confirm that my work does not contravene the university regulations
on plagiarism as described in the Student Handbook. In so doing I also acknowledge that I
may be held to account for any particular instances of uncited work detected by the JISC
anti-plagiarism software, or as may be found by the project examiner or project organiser. I
also understand that if an allegation of plagiarism is upheld via an Academic Misconduct
Hearing, then I may forfeit any credit for this module or a more severe penalty may be
agreed.

Project Title: UAV Vision and Control System

Student Name: Richard Lane

Supervisor: Dr. Richard Bowden

Date: 04 / 05 / 2008

Abstract
This report details the development of a UAV Vision and Control System, as a contribution
towards the second year of the Special Engineering Project. The eventual aim of the project was to
have a UAV hovering over a set of fiducials on top of the Special Engineering Project's UGV. The
vision side of the project progressed well throughout the year, with software completed to the
point where it can track fiducials and calculate orientation. The control side of the project
suffered greatly from incompatibility issues; however, a working control interface prototype has
been developed.


Acknowledgements
The author would like to thank the following people for their help and support through the
duration of the project. Dr. Richard Bowden for selecting me to work on such an exciting project,
and for the help and guidance given to me throughout the project. Adam Lindsay for his swift
development of the USB to analogue voltage microcontroller. Liam Ellis for the various times he
has answered my queries, and for lending me several pieces of equipment. Shiv Tailor for taking
some excellent photos for the SEP website. Lastly, Ben Abbott, Adam Lindsay, Affan Shaukat, and
Dr. Philip Jackson for their valuable ideas and input at project meetings.


Table of Contents

1 Introduction
1.1 The Special Engineering Project
1.2 Project Objectives
1.3 Specifications
1.4 Breakdown of Report
2 Background and Literature
2.1 Unmanned Aerial Vehicles
2.1.1 Example UAV System: "md 4-200"
2.2 Existing Vision and Control Systems
2.3 Remote Controlled Aircraft
2.4 Quadrocopters
2.5 OpenCV
2.5.1 CvCam
2.5.2 HighGUI
2.6 Network API
2.6.1 Bcastmanager
3 Vision and Control System
3.1 System Overview
3.2 Source of Hardware and Software
3.3 Modifications to the Original Plan
4 Hardware
4.1 UAV
4.1.1 X-UFO Aerial Platform
4.1.2 Wireless Camera
4.2 UGV
4.2.1 Target Fiducials
4.2.2 Superbright LEDs
4.3 External Laptop
4.3.1 Laptop Specifications
4.3.2 Resource Usage
4.4 Control Interface
4.4.1 Investigated Control Methods
4.4.2 USB-PPM Interface Method 1
4.4.3 USB-PPM Interface Method 2
4.4.4 USB-Voltage Interface Method
5 Software
5.1 Software Overview
5.2 Fiducial Detection
5.2.1 Video Acquisition
5.2.2 The Process of Fiducial Detection
5.2.3 Relevant Pixel Detection
5.2.4 Pixel Grouping
5.2.5 Possible LED Sorting
5.3 Decision System
5.3.1 Orientation Calculations
5.3.2 UAV to UGV Communications
5.3.3 UGV Emulation Software
5.3.4 Error Handling
5.4 Control Interface
5.4.1 Control Integration
5.4.2 Visualisation
5.4.3 Remote Control Test Software
6 Live Test Results
6.1 Hardware
6.1.1 Manual Flight with Camera Mounted
6.1.2 Control Interface
6.2 Software
6.2.1 Fiducial Detection
6.2.2 Orientation Calculations
7 Additional Work
7.1 Project Planning
7.1.1 Gantt Charts
7.1.2 Costing
7.1.3 Hardware Sourcing
7.2 Project Website
7.3 Project Meetings
8 Conclusions
9 Discussions
9.1 Gantt Charts
9.2 Project Objectives
9.3 Future Work


List of Figures

Figure 1: Picture of example UAV system "md 4-200"
Figure 2: Example Vision Target using Moiré Patterns
Figure 3: Examples of available Remote Controlled Aircraft
Figure 4: Quadrotor Torque Diagram
Figure 5: Final System Overview Block Diagram
Figure 6: Original System Overview Block Diagram
Figure 7: Picture showing power socket modifications to X-UFO
Figure 8: Picture of wireless camera mounted to the X-UFO
Figure 9: Screenshot showing Video Options for WinDVR
Figure 10: Completed LED Boards attached to Scale Layout
Figure 11: Pictures of LED board and Main Supply board
Figure 12: LED Boards Circuit Diagram
Figure 13: Superbright LED test board
Figure 14: Unreliable X-UFO Control Method using Hitec Laser 6 Transmitter
Figure 15: Picture of the Original USB-Voltage Interface Prototype
Figure 16: RS-232 Level Converter Circuit Diagram
Figure 17: X-UFO Transmitter Modification Circuit Diagram
Figure 18: Pictures showing modifications to X-UFO Transmitter
Figure 19: Microcontroller-Transmitter Interface Circuit Diagram
Figure 20: Screenshots showing the effect of likelihood LED sorting algorithms
Figure 21: Histogram plots showing distribution of the amount of pixels in possible LEDs
Figure 22: Fiducial Numbering for Orientation Calculations
Figure 23: Orientation Data shown on a test frame
Figure 24: Control Visualisation shown on a test frame
Figure 25: Screenshot of Remote Control Test software
Figure 26: Screenshots from video showing UAV mobility with camera mounted
Figure 27: Screenshots showing Fiducial Detection in Low and Normal Light Conditions
Figure 28: Screenshots showing Orientation Calculations in Low and Normal Light Conditions

List of Tables

Table 1: Cost Breakdown of Hardware Ordered
Table 2: Part number and Source of Hardware Ordered


List of Appendices

Appendix A: Original Gantt Chart
Appendix B: Revised Gantt Chart
Appendix C: Vision and Control System Software Code
Appendix D: Remote Control Test Software Code
Appendix E: UGV Emulation Software Code
Appendix F: X-UFO Technical Information
Appendix G: SC-8000 Technical Information
Appendix H: Wireless Camera Technical Information

Glossary of Terms
API Application Programming Interface
CCTV Closed-Circuit Television
CVSSP Centre for Vision, Speech and Signal Processing (at The University of Surrey)
DAC Digital-to-Analogue Converter
DFT Discrete Fourier Transform
IC Integrated Circuit
IP Internet Protocol
KLT Kanade-Lucas-Tomasi feature tracker
PC Personal Computer
PCB Printed Circuit Board
PPM Pulse Position Modulation
R/C Radio Control
RS-232 Recommended Standard 232 (a serial data standard)
SEP Special Engineering Project
TCP Transmission Control Protocol
UAV Unmanned Aerial Vehicle
UGV Unmanned Ground Vehicle
USB Universal Serial Bus


1 Introduction

1.1 The Special Engineering Project


The Special Engineering Project (SEP) was started in 2006 by Dr. Richard Bowden as a memorable
third year project, the idea being to create a group project which could be continued and
improved upon in years to come. Third year projects are individually marked, so care is taken to
allocate four independent projects each year, which can be integrated to produce a project which
is more meaningful than could be achieved by one student in the same time span.

The original four team members, Ahmed Aichi, Edward Cornish, Peter Helland, and Martin
Nicholson, succeeded in producing a robotic platform capable of navigating itself to a certain
extent. Some parts of the project were deemed either less successful than hoped or not useful to
the continuation of the project, and have therefore not been used again this year.

This year the project has been continued with four new team members, responsible for four more
individual projects to be integrated into SEP.
- The author is responsible for producing a vision and control system capable of hovering an
airborne platform (referred to as the UAV in the rest of this report), above the original
robotic platform (referred to as the UGV).
- Ben Abbott is responsible for the continuation and improvement of the high-level control and
onboard vision system for the UGV, as well as project management aspects such as the budget
and chairing meetings.
- Adam Lindsay is responsible for producing a generic wired and wireless sensor interface to be
polled by the decision system, including sensors useful to SEP such as odometers and
gyroscopes.
- Affan Shaukat is responsible for producing a sound source localisation system capable of
detecting significant noises and relaying a direction to the decision system.

The UGV was previously designed to navigate around the CVSSP department. It was decided to
push the project in the direction of security applications, which led to the above four projects. The
vision system and generic sensor interface are to give the decision system feedback about the
location and movement of the UGV, with audio sensing used to detect interesting events for the
UGV to navigate to. The UAV is intended to provide a hovering platform, or "disembodied head",
for the UGV, on which a camera could in future be mounted to take pictures of people found in
the CVSSP corridors.


1.2 Project Objectives


Primary Objectives:
1. To produce a robust vision system capable of tracking a set of fixed fiducials.
2. To use information provided by the vision system to establish the camera’s location relative
to the fiducials.
3. To produce a control and decision system to interface between the vision system and UAV.
4. To stabilise a UAV above a set of fiducials fixed to the UGV, by using a vision system for the
control feedback.

Secondary Objectives:
1. Enhance the vision system so it is possible to stabilise a UAV above the UGV without the
fixed fiducials, by tracking the UGV itself.
2. Enhance the vision system so it is possible to stabilise a UAV above any object, by training the
system to recognise and track a specific object.

The secondary objectives are where the project could progress to, if the primary objectives are
achieved earlier than anticipated.

1.3 Specifications
1. The aerial platform must be capable of carrying a wireless camera.
2. The vision system must work at the frame rate of the camera (25-30Hz).
3. The vision system must be robust enough to keep tracking under stable conditions, and
achieve tracking for 50% of the time under unstable conditions.
4. The control system must provide full software control of the aerial platform.
5. The aerial platform must be at least as stable under automated flight as during manual flight.

1.4 Breakdown of Report


The remainder of the report starts by explaining background information and research relevant to
the project, followed by details of the modules within the author's project, giving technical
information along with their progress, testing, and results. Live test results and information on
additional project work and planning then follow. Finally, there is a conclusion and a discussion of
the final progress and results of the project, and of the future work which could be carried out.


2 Background and Literature

2.1 Unmanned Aerial Vehicles


UAVs are aerial vehicles which are capable of unmanned (remote controlled) or unpiloted
(automated) flight. They are used for a wide range of activities from remote controlled toys to
police surveillance and army reconnaissance.

2.1.1 Example UAV System: “md 4-200”

Figure 1: Picture of example UAV system "md 4-200" [1]

The "md 4-200" developed by a German company called "microdrones GmbH" is described as a
vertical take off and landing, autonomous unmanned micro vehicle. It is capable of manual flight
using a traditional style R/C transmitter, as well as automated waypoint navigation and automated
software controlled flight.

Specification:
Important specification points include: [1]
- The integration of accelerometers, gyroscopes, magnetometer, air pressure, humidity and
temperature sensors to allow reliable, stable flight.
- GPS for autonomous waypoint navigation or position hold.
- Up to 20 minutes of flight time.
- 200g payload limit.
A USB R/C interface is also available: the "PPM9_USART", also developed by
"microdrones GmbH", allows the emulation of a standard R/C transmitter via software [2].

Mobile Police CCTV:


British police have adopted the use of these small quadrocopter UAVs to take advantage of their
mobile capabilities. They were first trialled at a large public event (the V Festival) in August 2007
[3], where the UAV was equipped with a high-resolution, night-vision-capable camera. The UAV was
used primarily to monitor large car parks and was controlled by a single operator wearing video
goggles. More recently a similar model of UAV has been adopted by the British Transport Police
to "crack down on metal thefts on the railways" [4]. As well as carrying a camera, it has the ability
to shoot "SmartWater" at subjects allowing them to later be identified by police.


2.2 Existing Vision and Control Systems


There are several existing research projects into vision and control systems for UAVs. The most
notable is described in the paper “Estimation and Control of a Quadrotor Vehicle Using
Monocular Vision and Moiré Patterns” by Tournier, Valenti, How, and Feron [5].

They succeeded in creating a vision based estimation system using a single camera mounted
downwards on a UAV, with a tracking target in the field of view of the camera. The tracking target
used is described as “a novel target that incorporates the use of moiré patterns” [6]. The target
pattern consists of 5 red patches and two orthogonal moiré patterns as shown in Figure 2.

Figure 2: Example Vision Target using Moiré Patterns [7]

The red dots shown in Figure 2 are used to calculate the yaw and distance of the camera relative
to the target, as well as to locate the moiré patterns. Moiré patterns in this case are created using
two sets of gratings, one set a fixed distance above the other. The result is "a near constant
apparent wavelength on the image plane" [8], independent of the distance of the camera from the
target. Relative position, pitch and roll are calculated using DFT calculations on the information
provided by the moiré patterns. More detailed information on the position estimation can also be
found in the thesis entitled “Six Degree of Freedom Estimation Using Monocular Vision and Moiré
Patterns” by Tournier [9].

Although the method applied by Tournier, Valenti, How, and Feron for orientation calculations
was successful and accurate, it is far beyond the complexity which could be achieved within the
author's project time scale. However, a similar method of calculating yaw and distance has been
employed in the author’s project, using five red fiducials. The same five fiducials are then also used
to provide some relatively simple and therefore less accurate position, pitch and roll values.

2.3 Remote Controlled Aircraft


Several types of remote controlled aircraft were researched and considered for use as the
project's aerial platform. It was decided that a quadrocopter would be the most suitable for the
task, mainly due to its natural stability and power-to-weight advantages; the advantages and
disadvantages of the options available are given below.


Miniature helicopters
These are meant for indoor flying so are fairly small, but most have limited control and themselves
weigh less than a wireless camera and battery, making them unable to lift such a payload and
therefore useless to the project. The example shown in Figure 3a is the "Heli-Q". Its size is shown
in relation to a cigarette.

Mid-sized helicopters
These are generally used outdoors; they are fairly large and should be able to hold the weight of a
wireless camera and battery. A large downside is the size of the main rotors, which are typically
over 600mm in diameter and therefore not very safe for flying through tight corridors. Mid-sized
helicopters are also mostly for experienced R/C flyers and are notoriously hard to hover. The
example shown in Figure 3b is the "Walkera Dragonfly", which is a standard mid-sized helicopter.

Quadrocopters
These are a suitable size; although about 600mm wide like the mid-sized helicopters,
quadrocopters have four much smaller rotors, making them safer for use in corridors. They also
appear powerful enough to hold the wireless camera and battery, and are naturally stable due to
onboard gyro stabilisation. A small downside is that there are relatively few ready-to-use
quadrocopters commercially available. The example shown in Figure 3c is the "X3D-BL UFO",
which is a high-end quadrocopter.

Blimps
These would be by far the easiest option to control, but they are bulky and are made to be very
light, so it is possible a blimp would not be able to support any extra weight. The example shown
in Figure 3d is the “Tri-Turbofan” blimp.

Figure 3: Examples of available Remote Controlled Aircraft
a) Miniature Helicopter: "Heli-Q" [10]
b) Mid-sized Helicopter: "Walkera Dragonfly" [11]
c) Quadrocopter: "X3D-BL UFO" [12]
d) Blimp: "Tri-Turbofan" [13]


2.4 Quadrocopters
Quadrocopters are a form of helicopter with four fixed pitch rotor blades. Two pairs of rotors
spin in opposite directions to create counter torque as shown in Figure 4. Directional motion is
achieved by independently varying the speed of the four rotors; in the descriptions below the
motors are numbered as shown in Figure 4.
- Height is controlled by varying the speed of all four rotors. Increasing the speed of rotors 1-4
increases height.
- Yaw is controlled by speeding up two opposite rotors and slowing the other two. Speeding up
1&3 and slowing 2&4 would result in turning right.
- Pitch is controlled by increasing and decreasing the speed of two opposite rotors. Assuming 1
is the forward rotor, speeding up 3 and slowing 1 will result in forward motion.
- Roll is controlled by increasing and decreasing the speed of two opposite rotors. Assuming 1
is the forward rotor, speeding up 2 and slowing 4 will result in rolling left.
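
These four rules amount to a simple linear mixing of the height, yaw, pitch and roll commands
onto the four rotor speeds. The sketch below illustrates the idea; it is not the X-UFO's onboard
control code, and the names and sign conventions (yaw > 0 = right, pitch > 0 = forward,
roll > 0 = left) are purely illustrative.

    // Illustrative mixer for a quadrotor numbered as in Figure 4
    // (1 = front, 3 = back; 1 & 3 spin opposite to 2 & 4).
    struct RotorSpeeds { double r1, r2, r3, r4; };

    RotorSpeeds mix(double throttle, double yaw, double pitch, double roll)
    {
        RotorSpeeds s;
        s.r1 = throttle + yaw - pitch; // slowing 1 while speeding up 3 pitches forward
        s.r3 = throttle + yaw + pitch;
        s.r2 = throttle - yaw + roll;  // speeding up 2 while slowing 4 rolls left
        s.r4 = throttle - yaw - roll;
        return s; // in practice each value would be clamped to the motors' valid range
    }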

Figure 4: Quadrotor Torque Diagram [14]

2.5 OpenCV
“OpenCV (Open Source Computer Vision) is a library of programming functions mainly aimed at real
time computer vision. Example applications of the OpenCV library are Human-Computer Interaction
(HCI); Object Identification, Segmentation and Recognition; Face Recognition; Gesture Recognition;
Motion Tracking, Ego Motion, Motion Understanding; Structure From Motion (SFM); and Mobile
Robotics.” [15]

OpenCV was originally developed by Intel. Several parts of OpenCV are used within the author's
project: CvCam and HighGUI, which are described below, as well as common mathematical
functions, drawing functions, and data structures.

2.5.1 CvCam

“CvCam is a universal cross-platform module for processing video stream from digital video cameras.
It is implemented as a dynamic link library (DLL) for Windows and as a shared object library (so) for
linux systems and provides a simple and convenient Application Programming Interface (API) for

reading and controlling video stream, processing its frames and rendering the results. CvCam is
distributed as a part of Intel’s OpenCV project under the same license and uses some functionality of
the Open Source Computer Vision Library.” [16]

CvCam is used as the method for initialising the wireless camera, streaming video and then
accessing individual video frames for processing.

2.5.2 HighGUI

“While OpenCV is intended and designed for being used in production level applications, HighGUI is
just an addendum for quick software prototypes and experimentation setups. The general idea
behind its design is to have a small set of directly useable functions to interface your computer vision
code with the environment.” [17]

HighGUI is used to create a window for displaying video frames from CvCam; it is also used for
some very basic user event handling.

2.6 Network API


The Network API is "A generic API for peer to peer interprocess communication over a network" [18],
developed by Ahmed Aichi during the first year of SEP. It is an extremely useful tool for the
project, as it can be used with very little knowledge of its internal workings and provides a simple
communication method between as many different software modules as necessary over standard
TCP networking. The Network API is used in the author's project to communicate to and from
the UAV decision system and UGV decision system over a wireless network.

2.6.1 Bcastmanager

Bcastmanager is an additional part of the Network API which allows the use of broadcast names
when establishing a connection, instead of specific IP addresses. This helps when using the
Network API to communicate between two pieces of software on different PCs, as the IP address
allocated to a PC on one day may be different on another day, making it impractical to use IP
addresses directly. The Bcastmanager works by running on both target PCs and broadcasting each
PC's name and IP address across the network. One PC's name and IP address is picked up by the
other PC and this information is passed on to the Network API. The Network API on each PC
will then know the current IP address of the other PC.
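
The Bcastmanager's own code is not reproduced here, but the underlying idea, periodically
broadcasting a name so that peers can learn the sender's IP address, can be sketched as below.
This sketch uses POSIX UDP sockets; the real system ran on Windows, and the port number and
message format shown are invented for illustration.

    // Sketch of broadcast name announcement (not the actual Bcastmanager code).
    // A receiver listening on the same port learns the sender's IP address
    // from the source address of each datagram.
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <cstring>

    int main()
    {
        int sock = socket(AF_INET, SOCK_DGRAM, 0);
        int broadcast = 1; // allow sending to the broadcast address
        setsockopt(sock, SOL_SOCKET, SO_BROADCAST, &broadcast, sizeof(broadcast));

        sockaddr_in dest;
        std::memset(&dest, 0, sizeof(dest));
        dest.sin_family = AF_INET;
        dest.sin_port = htons(9999); // illustrative port
        dest.sin_addr.s_addr = inet_addr("255.255.255.255");

        const char name[] = "UAV_PC"; // this PC's broadcast name (illustrative)
        for (int i = 0; i < 60; ++i) { // announce once a second for a minute
            sendto(sock, name, sizeof(name), 0, (sockaddr*)&dest, sizeof(dest));
            sleep(1);
        }
        close(sock);
        return 0;
    }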


3 Vision and Control System

3.1 System Overview


The UAV Vision and Control System consists of several purchased or designed pieces of hardware
interacting with the project software, which runs on an external laptop. Figure 5 contains a block
diagram showing these interactions; each block is described in more detail in later sections.

Figure 5: Final System Overview Block Diagram

3.2 Source of Hardware and Software


Software designed and coded by the author:
- Image Processing
- Decision System
- Control Interface
Hardware designed and built by the author:
- Fiducials
- Interface Circuits
- Modifications to X-UFO Transmitter
Hardware designed and programmed by Adam Lindsay:
- Microcontroller


Hardware purchased from budget:


- X-UFO and Transmitter
- Wireless Camera and Video Receiver
- USB to RS-232 Converter
Legacy hardware from 2006-07:
- Robot Platform

3.3 Modifications to the Original Plan


The original plan for the UAV Vision and Control System contained slightly different hardware
blocks from the final design. The original system (shown in Figure 6) involved using a USB-PPM Box
to control the R/C Transmitter with software. This interface suffered from many incompatibility
issues, which are described in detail in later sections. The USB-PPM Box was eventually replaced
by a USB to RS-232 Converter, Microcontroller, and several interface circuits and modifications to
the X-UFO Transmitter (as shown in Figure 5). These changes are also described in detail in later
sections.

Figure 6: Original System Overview Block Diagram


4 Hardware

4.1 UAV

4.1.1 X-UFO Aerial Platform

The X-UFO is an R/C (remote controlled) quadrocopter made by a company called Silverlit (see
Appendix F for a specification and picture). It has an onboard mechanical gyro and control circuit,
which automatically handles speed control for the four motors and compensates for most
unwanted pitch and roll. Having four automatically regulated propellers makes the X-UFO
naturally stable; because of this, hovering requires minimal effort compared to traditional
helicopters, as the user need only control the height and compensate for drifting. The X-UFO is
also capable of lifting the extra weight of a wireless camera and 9V battery, but care needs to be
taken when mounting extra weight to maintain the centre of gravity.

Figure 7: Picture showing power socket modifications to X-UFO

The X-UFO is supplied by a 12V 350mAh battery, which causes a problem during development:
8 hours of charging time for the battery results in less than 5 minutes of flying/development time.
To solve this problem a 12V 5A power supply was purchased and modified with a 10 metre long
lightweight cable. An extra socket for the power supply was also added to the X-UFO (shown in
Figure 7), giving the flexibility to use either the power supply or the original battery.
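
These figures imply a heavy average current draw, which is consistent with the choice of a 5A
supply:

    average current ≈ battery capacity / flight time = 350mAh / (5/60 hours) ≈ 4.2A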

4.1.2 Wireless Camera

Mounting
A wireless camera is to be attached to the underside of the X-UFO to provide visual feedback to
the decision systems by referencing a set of fiducials on top of the UGV. The camera itself is small
and light, and can easily be mounted to the X-UFO's foam body; however, the 9V battery required
to power it is relatively bulky. This may cause some future issues with the centre of balance if not
mounted correctly. Tests have shown that stability is not an issue if the battery is held within the
X-UFO's battery cradle; however, weight is an issue and the X-UFO struggles to gain much height
with the camera and battery mounted (shown in Figure 8). See Appendix H for pictures and
technical specifications of the wireless camera.

Figure 8: Picture of wireless camera mounted to the X-UFO

Interference
The camera transmits on a set frequency channel at around 2.4GHz, which could cause two issues.
Firstly, it could interfere with local wireless computer networking (which is also at 2.4GHz). Tests
have already shown that if a wireless network tries to use the same channel the network will
be completely blocked; however, on different channels both systems can be used together.
Secondly, the RF link for the wireless sensor system being developed by Adam Lindsay also uses
channels at around 2.4GHz. This should be less of a problem, as a feature of the wireless sensor
system link is to be able to channel-hop and automatically pick a noise-free channel. During testing
no interference has been detected between the wireless camera transmitter and the wireless
sensor system.

WinDVR Setup
InterVideo WinDVR is the software which comes with the wireless camera receiver. Once
installed, options need to be set before use with the Vision and Control System software. The
setting under Setup → Option → Device → Standard should be changed to "PAL_D". The options
under Setup → Option → Display → Video should be set as shown in Figure 9.


Figure 9: Screenshot showing Video Options for WinDVR

4.2 UGV

4.2.1 Target Fiducials

The target fiducials to be used for the vision tracking system consists of 5 ultra bright diffused red
LEDs. Each of the five small LED and resistor stripboards can be bolted on top of already existing
mount points for the UGV’s ultrasound sensors, or to mount points on this years extended
platform. The LEDs are set out with one in each corner and one in the middle front centre, as
shown in Figure 10. The picture shown in Figure 10 is cardboard template with the LED
stripboards attached to it. The template, which is to the same scale as the UGV mount points, is
used to hold the LEDs in the correct orientation for testing and demonstration purposes.

Figure 10: Completed LED Boards attached to Scale Layout (420mm × 280mm; UGV forward
direction marked)

It was decided that the best power supply for the LEDs was one of the UGV laptop’s USB ports,
as this would save weight and cost by not requiring extra batteries. Each small LED board plugs
separately into a main supply board (both are shown in Figure 11), to facilitate easy and
neat attachment to the UGV. The supply board can then be hidden underneath the UGV's laptop
platform.
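
As a worked example of choosing the series resistor for each LED (the actual component values
are those given in Figure 12; the forward voltage and current below are typical assumed values for
a red LED, not measured ones):

    R = (Vsupply - Vf) / If = (5.0V - 2.0V) / 20mA = 150Ω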

Figure 11: Pictures of LED board and Main Supply board

4.2.2 Superbright LEDs

To provide the best visibility of the LEDs on video, large superbright 8mm LEDs were purchased,
as tests using standard-brightness 5mm LEDs showed that they could not be easily distinguished at
distances over a metre. A test board (shown in Figure 13) containing four superbright LEDs was
built to find how many LEDs would be required for each fiducial. Tests showed that just one
superbright LED is easily visible to the wireless camera at distances of over 3 metres.

Figure 12: LED Boards Circuit Diagram


Figure 13: Superbright LED test board

4.3 External Laptop

4.3.1 Laptop Specifications

The laptop used for development and testing of the project code was an Apple MacBook Pro,
running Windows XP Pro. The image processing functions of the project code can be very
processor-intensive depending on the operating conditions. For future reference, in case of code
re-use on a different laptop, some relevant specifications of the laptop used are listed below;
these were taken from the Mac OS System Profiler. Resource usage data described in
section 4.3.2 should also be taken into account.

Model Identifier: MacBookPro3,1


Processor Details: 2.4GHz, Intel Core 2 Duo, 4MB L2 Cache, 800MHz Frontside Bus
Memory Details: 2GB, 667MHz DDR2 SDRAM
Graphics Chipset: 256MB GDDR3 VRAM, NVIDIA GeForce 8600M GT
USB Ports: 2 x USB 2.0
Network Card: AirPort Extreme, 802.11 a/b/g/n

4.3.2 Resource Usage

Below, some average CPU usage figures for the Vision and Control System software are given,
with increasing amounts of the Vision and Control System software running on the laptop
described in section 4.3.1.

Video Rendering Only : 20%


+ Image Processing : 25%
+ Orientation Calculations : 28%
+ Network API : 30%

4.4 Control Interface

4.4.1 Investigated Control Methods

During the course of the project three control interface methods were investigated. These are
referred to as USB-PPM Interface Method 1, USB-PPM Interface Method 2, and USB-Voltage
Interface Method, and are described in detail in the following sections. The USB-Voltage Interface
Method was chosen as the final control interface, as it was the only method to prove reliable.

4.4.2 USB-PPM Interface Method 1

Method
The original plan was to attempt to use the USB-PPM box to control the X-UFO via the original
transmitter’s buddy port. Assuming they were compatible this would allow a very tidy control
method for the X-UFO from software.

USB-PPM Interface Box


The USB-PPM box (SC-8000; see Appendix G for a picture and technical specifications) provides
an interface between software and an R/C transmitter. At the software side it acts as a PC comm
port, with an interface library which allows PPM values to be sent to a compatible (up to 8-
channel) R/C transmitter. PPM servo values range from 8000 to 22000; in terms of the throttle,
8000 would be none, 22000 would be maximum, and 15000 would be midway. The box is connected to
a transmitter via the transmitter’s buddy/trainer port. This is a port which allows two transmitters
to be connected together for the purpose of teaching; if a learner loses control the trainer’s
transmitter can override the learner’s controls. The USB-PPM box essentially pretends to be
another transmitter and can therefore be used to override the controls of the connected
transmitter. The manufacturer of the box provides USB drivers and a simple-to-use software API.
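
As an illustration of this value range, the small helper below maps a normalised control input onto
SC-8000 servo values; the function is illustrative and is not part of the manufacturer's API.

    // Map a normalised control input in [0, 1] onto the SC-8000's
    // 8000-22000 PPM servo value range.
    int toPpmValue(double normalised)
    {
        if (normalised < 0.0) normalised = 0.0; // clamp out-of-range input
        if (normalised > 1.0) normalised = 1.0;
        return 8000 + (int)(normalised * 14000.0 + 0.5);
    }
    // toPpmValue(0.0) == 8000 (none), toPpmValue(0.5) == 15000 (midway),
    // toPpmValue(1.0) == 22000 (maximum)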

14
The University of Surrey UAV Vision and Control System Richard Lane

Initial Results
After receiving the X-UFO and USB-PPM box, they proved to be incompatible. It was found that
the X-UFO transmitter does not use a standard PPM signal for its buddy control. Instead it uses a
very different pulse system, which is converted to a PPM signal just before transmission. This
result had been anticipated, as the X-UFO transmitter is intended as a toy rather than a
professional R/C transmitter. The planned solution was to try using a professional R/C transmitter
instead; see section 4.4.3.

Possible Modifications to X-UFO Transmitter


After finding the X-UFO transmitter and USB-PPM box to be incompatible, the transmitter was
dismantled in the hope of finding a way of connecting the PPM signal directly into the transmitter
without using the buddy port. A signal very similar to the PPM signal output from the USB-PPM
box was located inside the transmitter. It was considered that a solution could be to amplify the
PPM signal to the same amplitude as the signal found inside the transmitter, and to find the correct
place to insert it. However, without a schematic diagram of the transmitter this could have
caused damage, so the idea was put on hold while other possible solutions were explored.

4.4.3 USB-PPM Interface Method 2

Method
This secondary plan was to attempt to use the USB-PPM box to control the X-UFO via a third
party R/C transmitter’s buddy port. This however relies on both the USB-PPM box being
compatible with the transmitter and the transmitter being compatible with the X-UFO. But again if
compatible this would allow a very tidy control method for the X-UFO from software.

USB-PPM Interface Box


The same USB-PPM interface box is used as described in section 4.4.2.

Hitec Laser 6 Transmitter Results


Liam Ellis, a postgraduate also at The University of Surrey, was using the USB-PPM box (SC-8000)
for a research project with a compatible Hitec Laser 6 R/C transmitter. The Laser 6 transmitter,
however, is not directly compatible with the X-UFO. After borrowing this transmitter, several
solutions were investigated with varying success.

Solution one was to try and use the Laser 6 transmitter (which has a 35MHz frequency) to control
the X-UFO by changing the X-UFO’s receiver crystal to 35MHz. This failed presumably because
the rest of the X-UFO’s receiver circuitry is only compatible with 26MHz channels. Two new
aerials of the correct length for receiving 35MHz R/C were also made and tested with the X-UFO,
with no success. The calculations used for the required aerial lengths are shown below.

wavelength = speed of light / frequency = (3 × 10^8) / (35 × 10^6) = 8.57 metres

1/8 wave aerial length = wavelength / 8 = 8.57 / 8 = 1.07 metres
1/16 wave aerial length = wavelength / 16 = 8.57 / 16 = 0.54 metres

Solution two was to try and use the Hitec R/C transmitter to control the X-UFO (which has a
26MHz frequency) by changing the transmitter’s crystal to 26MHz. This also failed presumably
because the rest of the transmitter’s circuitry is only compatible with 35MHz channels. However it

15
The University of Surrey UAV Vision and Control System Richard Lane

was noticed that the X-UFO did respond correctly to the transmitter when the transmitter’s
aerial was rested on the X-UFO’s mains power cable. The only possible explanation for this is that
the RF signal from the transmitter was propagating up the power cable (this could be seen using an
oscilloscope) and was being picked up by the receiver. This could be made to work 50% of the
time by securing the power cable alongside the transmitter's aerial (as shown in Figure 14). This
method could have been used as a temporary solution had it been more reliable; obviously an
unreliable control interface is extremely undesirable with aerial vehicles. It did, however, prove the
important concept that the X-UFO could be controlled using the signal from the USB-PPM box.

Figure 14: Unreliable X-UFO Control Method using Hitec Laser 6 Transmitter

Hitec Optic 6 Transmitter Results


After some research on German R/C forums, it was found that several people were using third
party transmitters to control the X-UFO. A commonly used combination of hardware was the
Hitec Optic 6 transmitter along with a 27MHz RF transmitter module. The Optic 6 transmitter is
the model up from the previously tested Laser 6, which, importantly, was USB-PPM box
compatible. The 27MHz RF transmitter module (Futaba FP-TJ-FM) clips into the back of the
Optic 6 and allows any 26-27MHz crystals to be used. After obtaining both pieces of hardware
there was, however (for unknown reasons), no success manually controlling the X-UFO. With
project time starting to run short, this solution was put on hold in favour of a completely different
idea, which is described in section 4.4.4.

4.4.4 USB-Voltage Interface Method

Method
After little success with the previously investigated USB-PPM interface methods, it was decided to
take a different approach. The original X-UFO transmitter was dismantled to find that each channel is
essentially driven by a voltage from a potentiometer (attached to the control sticks). This means
that the transmitter could be modified to be driven by a software controlled voltage source. It was
decided that Adam Lindsay would strip down the current code for the wireless sensor system
microcontrollers, so that they would take a servo value via RS-232 and output the servo value as a
voltage from the microcontrollers’ DACs. Four channels are required, which can be achieved
using two microcontrollers with two DACs each. The original prototype for this system is shown
in Figure 15. The voltages from the microcontrollers can then be amplified and used to drive the
original X-UFO transmitter. Several modifications are required to the X-UFO transmitter, which
are described in later sections along with the intermediate circuits required.

Figure 15: Picture of the Original USB-Voltage Interface Prototype

USB to RS-232 Converter


A USB to RS-232 converter is required to connect the laptop to the microcontrollers. The
converter uses a "Prolific" serial converter chip, for which drivers and an API are readily available.
It was originally purchased as part of the wireless sensor system project but proved too slow for
that purpose.

RS-232 Level Converter


A level converter is required between the USB to RS-232 Converter (which works at 5V) and the
Microcontrollers (which work at 3V). Adam Lindsay had already designed a circuit for this purpose
for the wireless sensor system, which uses an optocoupler IC. This circuit was stripped down to
just the Rx channel, as only one-way communication is required. The resulting circuit is shown in
Figure 16.


Figure 16: RS-232 Level Converter Circuit Diagram

RS-232 to Voltage Microcontrollers


The RS-232 to voltage microcontrollers were designed and coded by Adam Lindsay and therefore
full details of their implementation are included in the SEP: Wireless Sensor Project report. The
microcontrollers are the same as used for the wireless sensor system; the debug boards obtained
by Adam Lindsay were used for the prototyping. Each microcontroller handles two R/C channels,
a channel number and servo value for each channel is sent via RS-232 to the microcontroller’s Rx
input (pin 46). The microcontrollers will then take the servo values and set a voltage at the
appropriate DAC (pins 5-6). More information on which microcontroller and pin relates to which
channel can be found in later sections.

Modified X-UFO Transmitter


Modifications were made to the X-UFO transmitter to allow individual switching for each channel
between software-controlled input (switch up, towards the socket) and manual input using the
sticks (switch down, towards the sticks). A 2-way switch was inserted at the output of the channel
control potentiometer for each channel, as shown in Figure 17.

Figure 17: X-UFO Transmitter Modification Circuit Diagram


Holes were drilled in the front of the transmitter for the switches, as well as a hole for the five-pin
voltage input socket. These modifications are shown in Figure 18. Care should be taken when the
X-UFO is powered on: if the switches are in the software control position but no input is present,
this will lead to unpredictable behaviour of the X-UFO.

Figure 18: Pictures showing modifications to X-UFO Transmitter (showing the VIN socket, the
switches for channels 1-4, and the potentiometer plugs)

Microcontroller-Transmitter Interface Circuit


An interface circuit (shown in Figure 19) is required between the output DAC pins on the
microcontrollers and the transmitter to improve the voltage range at the transmitter. The voltage
range at the output of the DAC (VINi) is approximately 0.5-1.5V and the range required at the
transmitter (VOUTi) is approximately 0-3.5V. Firstly, the minimum voltage at the input to the op-amp
needs to be 0V, so a diode is used to drop 0.5V from VINi. Secondly, a gain of just over 3 is
required; this is achieved using a non-inverting op-amp circuit with a gain of 3.2. Practical results
show a range of 0.2-3.3V at VOUTi. The circuit is repeated four times, once for each R/C channel.
The corresponding inputs and outputs are listed below.

VIN1 = Microcontroller 1 (MC1) pin 5 → VOUT1 = Transmitter Channel 1
VIN2 = Microcontroller 1 (MC1) pin 6 → VOUT2 = Transmitter Channel 2
VIN3 = Microcontroller 2 (MC2) pin 5 → VOUT3 = Transmitter Channel 3
VIN4 = Microcontroller 2 (MC2) pin 6 → VOUT4 = Transmitter Channel 4
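
For reference, the behaviour of the non-inverting stage can be checked with the standard gain
formula. The resistor values below are illustrative ones giving the stated gain of 3.2; they are not
necessarily those used in Figure 19.

    VOUTi = (VINi - 0.5V) × (1 + Rf/Rg)

    e.g. Rf = 22kΩ, Rg = 10kΩ: gain = 1 + 22/10 = 3.2
    VINi = 0.5V → VOUTi = 0V (ideal; measured minimum 0.2V)
    VINi = 1.5V → VOUTi = 3.2V (measured maximum 3.3V)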


Figure 19: Microcontroller-Transmitter Interface Circuit Diagram

Results
This method provides a working interface between software and the X-UFO aerial platform, albeit
not as simple as the original plan. Servo values can be sent via software to the modified X-UFO
transmitter and the input for each channel can be selected using the switches on the front of the
transmitter.


5 Software

5.1 Software Overview

Process
For each frame of video loaded by CvCam a callback function is called; most of the Vision and
Control System code runs within this callback. Video is rendered at 30Hz, which means that 30Hz
is also the frequency of the system code. The overall working of the system is described below as
pseudocode, and each part is described in detail in later sections. The full software code can
be found in Appendix C.

Callback Function Pseudocode Representation

callback loop
    if Network API communications are connected
        if height message received
            set acknowledgement bit
            set new target height
    process odd frame
    choose 5 reference points
    get decision commands from reference points
    if commands are valid
        adapt values for grouping thresholds
        send control values
    else
        set error bits
    show control visualisation
    if message bits set
        send message to UGV decision system
loop

5.2 Fiducial Detection

5.2.1 Video Acquisition

Video acquisition is achieved by using CvCam, part of the OpenCV library. During initialisation a
callback function is passed to CvCam; the callback function is then called by CvCam for every
video frame taken from the wireless camera. Most of the project code is then run from the
callback function, which has access to the most current image data structure every time it is called.
Video is then rendered to a window using HighGUI, another part of the OpenCV library.
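
A minimal initialisation sketch, following the standard cvcam usage pattern rather than reproducing
the project code from Appendix C, is shown below; the window name is illustrative.

    // Register a per-frame callback with cvcam and render into a HighGUI window.
    #include "cv.h"
    #include "cvcam.h"
    #include "highgui.h"

    void callback(IplImage* image) // called by cvcam for every captured frame
    {
        // all per-frame vision and control processing goes here
    }

    int main()
    {
        if (cvcamGetCamerasCount() < 1) return -1; // no capture device found

        int enable = 1;
        cvcamSetProperty(0, CVCAM_PROP_ENABLE, &enable);   // select camera 0
        cvcamSetProperty(0, CVCAM_PROP_RENDER, &enable);   // render its video
        cvcamSetProperty(0, CVCAM_PROP_CALLBACK, (void*)callback);

        cvNamedWindow("UAV Vision", CV_WINDOW_AUTOSIZE);
        int window = (int)cvGetWindowHandle("UAV Vision");
        cvcamSetProperty(0, CVCAM_PROP_WINDOW, &window);   // render into this window

        cvcamInit();
        cvcamStart(); // frames now arrive in the callback at the camera's frame rate
        cvWaitKey(0); // run until a key is pressed
        cvcamStop();
        cvcamExit();
        return 0;
    }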

5.2.2 The Process of Fiducial Detection

Fiducial detection starts by scanning through the pixels in a video frame and grouping them
together if they meet the required criteria. These grouped pixels, or possible LEDs, are sorted
based on the likelihood of them actually being an LED. Likelihood is calculated using shape, size
and orientation as the criteria; the five possible LEDs with the highest likelihood are then assumed
to be the five fiducials.


5.2.3 Relevant Pixel Detection

Process
Pixels are examined one at a time: from left to right along each line, and line by line from top to
bottom of the frame. Each pixel is tested to see if it meets one of two criteria, bright white or
strong red, as these are the best criteria available to describe the pixels making up an LED in the
image data structure. If a pixel matches either criterion it is grouped into an existing or new
possible LED object; the pixel grouping algorithm is described in a later section. For easy
visualisation of relevant pixels, bright white pixels are changed to pure white, strong red pixels are
changed to pure red, and all other pixels are changed to black within the processed field.

Criteria
Each pixel is stored in the image data structure as three component colour values: red, green and
blue. For a pixel to be classed as bright white, the sum of the red, green and blue values must be
greater than the adaptive white threshold. For a pixel to be classed as strong red, the green and
blue values are deducted from the red value; the result must then be greater than the adaptive red
threshold.

Pixel Detection Pseudocode Representation

for each line in a frame
    for each pixel in a line
        if pixel is bright white
            pixel is relevant so group pixel
            mark pixel by changing colour to pure white
        else if pixel is strong red
            pixel is relevant so group pixel
            mark pixel by changing colour to pure red
        else
            pixel not useful
            mark pixel by changing colour to black
    loop
loop

Adaptive Pixel Thresholds

Adaptive pixel thresholds are required for accurate relevant pixel detection in a range of lighting conditions. Whilst scanning through pixels during relevant pixel detection, peak bright white and strong red values are recorded. The bright white value is the sum of the red, green and blue values; the white threshold is then calculated as 5% of the peak value. The strong red value is the green and blue values deducted from the red value; the red threshold is then calculated as 30% of the peak value. The calculated threshold values are then used during relevant pixel detection for the next frame.
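A minimal sketch of how this per-frame adaptation might look is given below. The 5% and 30% factors come from the description above; the function and variable names are assumptions for illustration only.

// Hypothetical sketch: track peak values while scanning, then derive the
// next frame's thresholds as fixed fractions of those peaks.
struct Pixel { unsigned char b, g, r; };

int peakWhite = 0, peakRed = 0;

void updatePeaks(const Pixel& p)
{
    int white = p.r + p.g + p.b;    // bright-white measure
    int red   = p.r - p.g - p.b;    // strong-red measure
    if (white > peakWhite) peakWhite = white;
    if (red > peakRed) peakRed = red;
}

// Called once per frame, after all pixels have been scanned.
void adaptThresholds(int& whiteThreshold, int& redThreshold)
{
    whiteThreshold = peakWhite * 5 / 100;    // 5% of peak (from the text)
    redThreshold   = peakRed * 30 / 100;     // 30% of peak (from the text)
    peakWhite = peakRed = 0;                 // reset ready for the next frame
}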

5.2.4 Pixel Grouping

Process
Once a pixel is determined as relevant it needs to be grouped; the result is a list of possible LEDs, each containing an averaged [x,y] position in the frame, the number of pixels, and a likelihood factor which is used later on. A relevant pixel's [x,y] position is compared to the position of each possible LED. The pixel is added to the closest possible LED, as long as the distance is below the adaptive distance threshold; the possible LED's position is averaged with the pixel position and its pixel count is increased. If the pixel is not close enough to any current possible LED, a new possible LED is created.

Pixel Grouping Pseudocode Representation

for each possible LED in list
    if pixel distance from possible LED is less than threshold
        check if shortest distance
loop
if a shortest distance has been found
    average position of possible LED with position of pixel
    increase pixel count of possible LED
else
    create new possible LED with position of pixel

Adaptive Distance Threshold

The adaptive distance threshold is used to determine if a pixel is close enough to an existing possible LED to be grouped with it. The threshold is adapted to ensure that it stays at about half the minimum distance between two fiducials. If the threshold is too large, the pixels from more than one fiducial could be grouped together; this could happen if the camera is moved far away from the fiducials. If the threshold is too small, the pixels from one fiducial could be added to several groups; this could happen if the camera is moved very close to the fiducials. If tracking is completely lost, the thresholds are set to a default value in an attempt to recover.
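A compact sketch of the grouping step is shown below. The container and field names are assumptions for illustration (the real structure lives in the project code in Appendix C), and a weighted running average is used where the text simply says the positions are averaged.

#include <vector>
#include <cmath>

// Hypothetical possible-LED record: running-average position and pixel count.
struct PossibleLED { double x, y; int pixels; };

// Group one relevant pixel into the nearest possible LED within the adaptive
// distance threshold, or start a new possible LED if none is close enough.
void groupPixel(std::vector<PossibleLED>& leds, int px, int py, double threshold)
{
    int best = -1;
    double bestDist = threshold;
    for (int i = 0; i < (int)leds.size(); ++i)
    {
        double dx = leds[i].x - px, dy = leds[i].y - py;
        double dist = std::sqrt(dx * dx + dy * dy);
        if (dist < bestDist) { bestDist = dist; best = i; }
    }
    if (best >= 0)
    {
        PossibleLED& led = leds[best];
        // Weighted running average keeps the stored position central.
        led.x = (led.x * led.pixels + px) / (led.pixels + 1);
        led.y = (led.y * led.pixels + py) / (led.pixels + 1);
        led.pixels += 1;
    }
    else
    {
        PossibleLED fresh = { (double)px, (double)py, 1 };
        leds.push_back(fresh);
    }
}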

5.2.5 Possible LED Sorting

Process
Once a list of possible LEDs is obtained, the list requires sorting to decide which of the possible LEDs are most likely to be the target fiducials. This is done by assigning each possible LED a likelihood value, representing how likely it is to be one of the five fiducials. This value is then modified using four criteria: Group Pixel Threshold, Shape Analysis, Size Histogram, and Relative Orientation. For each criterion a likelihood is calculated for each possible LED and multiplied with the possible LED's current likelihood. The five possible LEDs with the highest likelihood are then taken to be the five fiducials.

Group Pixel Threshold

The number of pixels making up each possible LED is compared with a group pixel threshold. All possible LEDs below the threshold are assigned a starting likelihood of 0.0, effectively discarding them; all above the threshold are assigned 1.0. This weeds out small groups of pixels caused by noise straight away, and therefore reduces future computational complexity.

Shape Analysis
The shape of the target fiducials in a frame will generally be circular, which means possible LEDs found not to be circular can be given a lower likelihood. The shape analysis works by starting at the centre point of each possible LED and working outwards one pixel at a time, checking that each pixel is not black (a non-useful pixel). This is done in eight directions, four perpendicular and four diagonal, and the distances between the centre and the edge of the shape are recorded. The differences between distances are then summed: perpendicular first, then diagonal, and then both. These sums are used to produce a likelihood that the possible LED is circular.

Size Histogram
Each of the target fiducials in a frame will generally be represented by a similar number of pixels. Therefore if a histogram is taken and five possible LEDs fall in the same histogram group, there is a strong likelihood that they are the target fiducials. It is also very likely that the five possible LEDs will fall into two adjacent groups; for this reason the histogram likelihood process is repeated twice with different histogram widths for improved accuracy. Once each histogram has been produced, likelihoods for each possible LED are calculated from the number of items in the histogram group to which the possible LED belongs.
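A hedged sketch of one histogram pass follows; the bin-width and cap parameters, and all names, are illustrative assumptions (the report only states that values are capped at a set limit and that the pass is run twice with different widths).

#include <vector>

// Bin possible-LED pixel counts into a histogram of the given bin width, then
// scale each LED's likelihood by how full its bin is. Five fiducials of
// similar size should land in, or near, the same bin.
void sizeHistogramLikelihood(const std::vector<int>& pixelCounts,
                             std::vector<double>& likelihoods,
                             int binWidth, int cap)
{
    std::vector<int> bins(1, 0);
    for (size_t i = 0; i < pixelCounts.size(); ++i)
    {
        int bin = pixelCounts[i] / binWidth;
        if (bin >= (int)bins.size()) bins.resize(bin + 1, 0);
        bins[bin] += 1;
    }
    // Cap each bin so a large cluster of noise blobs cannot dominate.
    for (size_t b = 0; b < bins.size(); ++b)
        if (bins[b] > cap) bins[b] = cap;
    // Multiply each LED's likelihood by a value proportional to its bin count.
    for (size_t i = 0; i < pixelCounts.size(); ++i)
        likelihoods[i] *= (double)bins[pixelCounts[i] / binWidth] / cap;
}

Running the routine twice with different binWidth values, as described above, rewards groups of similar-sized blobs even when they straddle a bin boundary in one of the two passes.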

Relative Orientation
Possible LEDs which form right angles or straight lines with each other are more likely to be the target fiducials because of the target fiducial layout. To start the orientation likelihood process, the angle between two possible LEDs relative to another possible LED is calculated for every combination of possible LEDs which do not already have a likelihood of 0.0. If a combination is found which is within a set threshold of 90° or 180°, it is stored in a list for later reference. The list contains details of the reference possible LED and the product of all three possible LEDs' likelihoods. Once the list is complete, the maximum product of likelihoods for each reference possible LED is used as the likelihood that the reference possible LED is a fiducial. This results in a large likelihood for combinations where all three possible LEDs already have a high likelihood; combinations which happen by chance will not have all three possible LEDs with high likelihood, and will therefore have a small product of likelihoods.
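The combination search might look like the sketch below. The structure and names are assumptions for illustration, and the angle threshold is left as a parameter.

#include <vector>
#include <cmath>

struct Candidate { double x, y, likelihood; };

const double RAD_TO_DEG = 180.0 / 3.14159265358979;

// For every pair of candidates measured about every origin candidate, test
// whether the enclosed angle is close to 90 or 180 degrees; each origin keeps
// the best product of the three likelihoods found for it.
void orientationLikelihood(std::vector<Candidate>& c, double thresholdDeg)
{
    std::vector<double> best(c.size(), 0.0);
    for (size_t o = 0; o < c.size(); ++o)
    {
        if (c[o].likelihood == 0.0) continue;          // already discarded
        for (size_t i = 0; i < c.size(); ++i)
        {
            if (i == o || c[i].likelihood == 0.0) continue;
            for (size_t j = i + 1; j < c.size(); ++j)
            {
                if (j == o || c[j].likelihood == 0.0) continue;
                double a1 = std::atan2(c[i].y - c[o].y, c[i].x - c[o].x);
                double a2 = std::atan2(c[j].y - c[o].y, c[j].x - c[o].x);
                double ang = std::fabs(a1 - a2) * RAD_TO_DEG;
                if (ang > 180.0) ang = 360.0 - ang;    // enclosed angle, 0-180
                if (std::fabs(ang - 90.0) < thresholdDeg ||
                    std::fabs(ang - 180.0) < thresholdDeg)
                {
                    double p = c[o].likelihood * c[i].likelihood * c[j].likelihood;
                    if (p > best[o]) best[o] = p;
                }
            }
        }
    }
    for (size_t o = 0; o < c.size(); ++o)
        c[o].likelihood *= best[o];
}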

Shape Likelihood Pseudocode Representation

for each useful possible LED in list
    calculate distance from centre of possible LED to outer border,
        for four perpendicular and four diagonal directions
    average distances
    sum differences between perpendicular distances and divide by average
    sum differences between diagonal distances and divide by average
    sum differences between perpendicular and diagonal distances
        and divide by average
    calculate a likelihood that the shape is circular
        based on the three sums
    possible LED's likelihood becomes its current likelihood multiplied
        by the shape likelihood
loop


Size Likelihood Pseudocode Representation

repeat twice with different histogram widths
    for each useful possible LED in list
        find histogram value for possible LED
        keep track of highest value
        add possible LED's value to histogram
    loop
    for each histogram value
        cap value at set limit if required
    loop
    for each possible LED in list
        possible LED's likelihood becomes its current likelihood multiplied
            by the value proportional to possible LED's histogram value
    loop
loop

Orientation Likelihood Pseudocode Representation

for each useful possible LED in list
    use this LED as origin
    for each other useful possible LED in list
        use this LED as LED1
        for each other useful possible LED in list
            use this LED as LED2
            find angle between LED1 and LED2 with respect to origin
            if angle is within threshold of 90° or 180°
                multiply likelihoods of three LEDs and keep for later
        loop
    loop
loop
for each saved likelihood
    find max likelihood for each origin LED
    origin LED's likelihood becomes its current likelihood multiplied
        by the max combination of likelihoods found for that origin
loop

Field Based versus Frame Based Processing

It was noticed that shape analysis had a negative effect on fiducial detection when the camera moved forwards or backwards quickly. This is because the camera transmits an interlaced signal, which means that under fast movement circles are no longer circles: the circles created by the LEDs in a video frame become stretched out, with an interlaced combing effect on either side. It was decided that to counter this effect, field based processing would be used rather than frame based. This solves the problem but halves the vertical resolution, which can lead to inaccurate tracking at larger distances. An advantage, however, is that if required the second field could be processed separately from the first, effectively giving a tracking rate of 60Hz, twice the frame rate.

Results
Results for the possible LED sorting are shown in several screenshots in Figure 20. The screenshots show the progressive improvement of sorting as each likelihood algorithm is added. Possible LEDs are highlighted by green rings, where the size of the ring represents the likelihood that the possible LED is actually a target fiducial. Part e) of Figure 20 shows the final result, with the five largest rings around the five fiducials, indicating that all five fiducials would be correctly picked out from the original possible LEDs.

Figure 20: Screenshots showing the effect of likelihood LED sorting algorithms

a) Example exaggerated noisy frame created to test the likelihood method of sorting possible LEDs. The frame contains shapes which could be mistaken for target fiducials, some of which are fairly common in strongly lit locations.

b) The result of possible LED sorting after applying the Group Size Threshold.

c) The result of possible LED sorting after applying the Group Size Threshold, with the Shape Likelihood algorithm.

d) The result of possible LED sorting after applying the Group Size Threshold, with the Shape and Size Likelihood algorithms.

e) The result of possible LED sorting after applying the Group Size Threshold, with the Shape, Size and Orientation Likelihood algorithms.

The results of the pixel size histogram (using the example frame shown in Figure 20) are shown in
Figure 21. For the two histograms performed there are two clear groups that are most likely to
contain possible LEDs which are target fiducials.

Figure 21: Histogram plots showing distribution of the amount of pixels in possible LEDs

5.3 Decision System

5.3.1 Orientation Calculations

Process
Once five significant fiducials have been discovered in a frame, the camera's orientation relative to the fiducials can be calculated. Figure 22 shows the fiducials as they should be numbered; the steps listed below describe how the fiducials are given the correct number and how orientation is calculated.


1) The average position of the five fiducials is calculated.
2) The distances between each fiducial and each of the other four fiducials are calculated. The two shortest distances will have a fiducial in common; this will always be fiducial 0.
3) The angles from the direction of fiducial 0 to each of the other four fiducials are calculated, working clockwise. Fiducial 1 will be between 0°-90°, fiducial 2 between 90°-180°, fiducial 3 between 180°-270°, and fiducial 4 between 270°-360°.
4) The direction the camera is facing compared to the fiducials can then be calculated; this is shown as the angle marked in Figure 22.
5) Assuming the camera is level, the location of the camera relative to the fiducials is then the difference between the average position of the five fiducials and the centre point of the video frame.
6) If the camera, when attached to the UAV, is pitching or rolling drastically, this could affect the results from step 5). To compensate, pitch and roll can be calculated by looking at the distances between fiducials 1, 2, 3 and 4. If distance[1,4] is larger than distance[3,2] then it is pitching forwards, else it is pitching backwards. If distance[1,2] is larger than distance[4,3] then it is rolling right, else it is rolling left.
7) The information gained at step 6) can then be taken into account to calculate a modified value instead of the centre point of the video frame.
8) The height of the camera above the fiducials can be estimated by looking at the average of distance[1,3] and distance[2,4]. Tests have shown that this distance is approximately 300 pixels at a height of 1 metre; therefore the height in metres is taken as 300 divided by the average distance, as in the sketch below.
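A minimal sketch of the height estimate from step 8), assuming the measured constant of 300 pixels at 1 metre (the constant and function names are hypothetical):

// Height estimate from the two diagonal fiducial distances, in pixels.
const double PIXELS_AT_ONE_METRE = 300.0;   // measured at a height of 1 metre

double estimateHeight(double dist13, double dist24)
{
    double average = (dist13 + dist24) / 2.0;    // average of the two diagonals
    return PIXELS_AT_ONE_METRE / average;        // e.g. 150 pixels -> 2.0 metres
}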

Figure 22: Fiducial Numbering for Orientation Calculations (the figure also marks the forward direction of the camera and the average of the five points)

Orientation Pseudocode Representation

find the centre point
    for each fiducial
        sum x and y locations in frame
    loop
    divide by number of fiducials

sort points to identify fiducial number
    for each point
        for each other point
            find and record distance between two points
        loop
    loop
    for each recorded distance
        find the two least values...
        if distance < least1
            if least1 < least2
                recorded distance becomes new value of least2
            else
                recorded distance becomes new value of least1
        else if distance < least2
            recorded distance becomes new value of least2
    loop
    fiducial 0 is the point common to both least distances
    calculate angle of fiducial 0
    for each point which is not fiducial 0
        calculate angle of point
        angle is calculated angle minus angle of fiducial 0
        if angle is between 0° and 90° then
            point is fiducial 1
        else if angle is between 90° and 180° then
            point is fiducial 2
        else if angle is between 180° and 270° then
            point is fiducial 3
        else if angle is between 270° and 360° then
            point is fiducial 4
    loop

find direction of camera
    calculate angle of point 0

find pitch and roll
    roll is distance[1,2] minus distance[3,4]
    if roll is positive then
        rolling left
    if roll is negative then
        rolling right
    pitch is distance[3,2] minus distance[1,4]
    if pitch is positive then
        pitching forward
    if pitch is negative then
        pitching backward

find height of camera above fiducials
    take average of distance[1,3] and distance[2,4]
    divide the average at 1 metre by the calculated average


Angle Calculations
The following calculation is used to find the angle of one point relative to a reference point. When used in the orientation calculations, the reference point is the average of the five fiducial locations. Resulting angles are relative to the camera forward direction shown in Figure 22; this direction is taken as 0°.

TempX = RefX - PointX
TempY = RefY - PointY
if TempY = 0 and TempX >= 0 then
    Angle = 0
else if TempY = 0 and TempX < 0 then
    Angle = 180
else if TempX = 0 and TempY < 0 then
    Angle = 90
else if TempX = 0 and TempY > 0 then
    Angle = 270
else
    Angle = arctan(TempY / TempX)
    Angle = Angle * 180 / PI
    if TempY < 0 and TempX > 0 then Angle = 0 - Angle
    if TempY > 0 and TempX > 0 then Angle = 360 - Angle
    if TempY > 0 and TempX < 0 then Angle = 180 - Angle
    if TempY < 0 and TempX < 0 then Angle = 180 - Angle
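The case analysis above exists because arctan alone cannot distinguish quadrants. For reference, an equivalent C++ sketch using the standard atan2 function is given below; the 0°-360° convention is inferred from the special cases above, and the function name is hypothetical.

#include <cmath>

// Angle of a point about a reference point, in degrees, matching the
// convention of the pseudocode above (0 along the camera forward direction).
double pointAngle(double refX, double refY, double pointX, double pointY)
{
    double tempX = refX - pointX;
    double tempY = refY - pointY;
    // atan2 handles all four quadrants and the axis cases in one call;
    // negating tempY matches the rotation direction of the pseudocode.
    double angle = std::atan2(-tempY, tempX) * 180.0 / 3.14159265358979;
    if (angle < 0.0) angle += 360.0;    // shift from (-180, 180] to [0, 360)
    return angle;
}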

Results
Figure 23 shows a screenshot of orientation data displayed on the same noisy test frame used to demonstrate likelihoods. Four lines should connect the fiducials in a rectangular shape; this shows that the fiducials are being numbered correctly. These lines can be red or green: red is normal, and green visualises the direction of pitch and roll. Blue lines show the forward directions of the camera and fiducials. The green line shown between points 3-4 in this example means that the UAV would be rolling to the left; the green line shown between points 1-4 means that the UAV would be pitching forwards.


Figure 23: Orientation Data shown on a test frame

5.3.2 UAV to UGV Communications

Process
A protocol has been implemented for communications between the UAV decision system and the
UGV decision system using Network API and Bcastmanager, which were developed by Ahmed
Aichi during the first year of SEP. The Bcastmanager is used to broadcast an address string
across a network, which removes the need to know the target program’s IP address.
Bcastmanager then finds the IP address of the target program which is also broadcasting an
address string and establishes a connection for Network API. Once connected, the UAV
decision system checks for a received message before each video frame is processed. After each
video frame is processed any messages ready to be sent out to the UGV decision system are
transmitted, for example to acknowledge a received message and/or report an error.

Messages from UGV to UAV

Only one message is required: a height request. One double value is sent to represent the height in metres at which the UGV decision system wishes the UAV to fly. This should then be acknowledged, and possibly rejected, by the UAV decision system.


Messages from UAV to UGV

One char is sent containing 8 status bits; each bit has a different meaning, as described below. For example, to acknowledge a command the hex byte 0x01 would be sent, or to report a low battery and complete loss of tracking the hex byte 0x28 would be sent (0x20 + 0x08). Error bits are set during processing of a video frame.

Bit 0  Acknowledge that a command has been received
Bit 1  UAV decision system error
Bit 2  Partial loss of tracking
Bit 3  Complete loss of tracking
Bit 4  Invalid command
Bit 5  UAV battery low
Bit 6  Reserved
Bit 7  Reserved
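These bits correspond to the ERRORBIT_* constants defined in aerial_platform.h (Appendix C). A small self-contained example of building and testing a status byte:

#include <cstdio>

// Masks as defined in aerial_platform.h (Appendix C).
const char ERRORBIT_COMPLETETRACKING = 8;    // 0x08
const char ERRORBIT_BATTERYLOW       = 32;   // 0x20

int main()
{
    char status = 0;
    status |= ERRORBIT_BATTERYLOW;          // low battery detected
    status |= ERRORBIT_COMPLETETRACKING;    // complete loss of tracking
    std::printf("status byte: 0x%02X\n", status);    // prints 0x28

    // The receiving side tests each bit individually:
    if (status & ERRORBIT_BATTERYLOW)
        std::printf("UAV battery low\n");
    return 0;
}

Using |= rather than += to set a bit is the safer choice, as it cannot corrupt the byte if the same bit is set twice.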

Network API and Bcastmanager Variables

When setting up the Network API and Bcastmanager several variables are required at both ends. The following variables were decided by the author and Ben Abbott for UAV to UGV communications.

Local Port: 6200 - port number for UAV decision system
Remote Port: 6000 - port number for UGV decision system
Local Address: "uav_system" - Bcastmanager name for UAV decision system
Remote Address: "DECISION" - Bcastmanager name for UGV decision system
Channel from UGV: "ugv_to_uav"
Channel to UGV: "uav_to_ugv"

5.3.3 UGV Emulation Software

Purpose
For the purpose of testing UAV to UGV communications, an emulation program was written which pretends to be the UGV decision system (the code is included in Appendix E). Once running, it establishes a connection with the UAV decision system using Network API and Bcastmanager, in the same way that the UGV decision system would. It then picks up error messages sent from the UAV decision system and displays their meaning in a console window; this provides an excellent way of checking when error messages are being sent and that they are being sent correctly. The UGV emulator also sends a height request to the UAV decision system every few seconds. The height sent increases by 0.1 metres with each message until an "invalid command" message is received back from the UAV decision system, and then decreases with each message until an "invalid command" message is received again. This is used to test the UAV decision system's response to height requests which could be out of the allowed range, and to check that it accepts a valid height as its new target height.


UGV Emulation Pseudocode Representation

enter state machine loop starting at Init state
    state Init:
        initiate Network API and Bcastmanager
        wait for connection to be established
        change to Wait state
    state Wait:
        sleep for 20ms to avoid over using CPU
        if time since last message sent < send interval then
            change to Check state
        else
            change to Send state
    state Check:
        check with Network API for new message
        if message then
            decipher message and display meaning
            if invalid command message received
                swap height change to increment or decrement
        change to Wait state
    state Send:
        increment or decrement next height
        send height message with Network API
        change to Wait state
    state Error:
        pause and advise program is closed
loop

5.3.4 Error Handling

There are several types of error which could occur during use of the Vision and Control System;
anticipated errors and how they are handled are described below.

Video Frame Glitches

Video frame glitches can occur due to the wireless camera having poor reception or suffering from interference; the effect is a video frame filled with noisy lines. These lines should not be picked up as fiducials because of the implemented fiducial detection algorithm. If they are, however, checks are in place to make sure the tracking points have not jumped drastically across the frame since the last frame. If the frame is labelled as a glitch frame it is not used for further calculations.

Partial Loss of Tracking

This is an error which is reported to the UGV decision system by sending a "Partial loss of tracking" message in the event of a partial loss of fiducial tracking. Partial loss of tracking is defined as not having a complete set of suitable fiducials in five consecutive video frames; the set of fiducials could be unsuitable if one or more fiducials are missing, or in the event of a frame glitch. Upon receiving a "Partial loss of tracking" message the UGV decision system should react by stopping or re-tracing its movements. Assuming that the problem is that the UGV has moved partially out of view of the UAV's camera, this is an attempt to bring the UGV's fiducials back into view. This error is only partially implemented: in the event of this error it will be reported to the UGV, but the ideal behaviour for the UAV would be to attempt to cope with previous position information, or with the fiducials still available, until tracking is restored. This would be part of the automated control implementation, which is incomplete.

Complete Loss of Tracking

This is an error which is reported to the UGV decision system by sending a "Complete loss of tracking" message in the event of a complete loss of fiducial tracking. Complete loss of tracking is defined as not being able to find any suitable fiducials in five consecutive video frames. Upon receiving a "Complete loss of tracking" message the UGV decision system should react by re-tracing its movements. Assuming that the problem is that the UGV has moved out of view of the UAV's camera, this is an attempt to bring the UGV's fiducials back into view. This error is only partially implemented: in the event of this error it will be reported to the UGV, but the ideal behaviour for the UAV would be to slowly lower itself towards the ground until tracking is regained or the floor is reached safely. This would be part of the automated control implementation, which is incomplete.

Low Battery Warning

This is an error which is reported to the UGV decision system by sending a "UAV battery low" message in the event of a low battery being detected. A low battery error should be defined as the UAV being unable to reach a target height within 10 seconds at full throttle. Since 10 seconds is more than enough time to rise from the minimum to the maximum height limit at full throttle, it is safe to assume that if the height is not reached, the battery does not have enough power to keep the UAV in the air. Although this error is defined, it is not yet implemented, as it would be part of the automated control implementation, which is incomplete.

Invalid Requested Height

This is an error which is reported to the UGV decision system by sending an "Invalid command" message in the event of an invalid height request being received from the UGV decision system. The minimum height is set at 0.5 metres above the UGV fiducials; flying any lower than this could result in unreliable tracking. The maximum height is set at 2.0 metres above the UGV fiducials; flying any higher could result in unreliable tracking or collision with average height ceilings. Upon receiving an "Invalid command" message just after sending a height request, the UGV decision system should re-send the request with a value within the agreed limits.

Decision System Error

This is an error which is reported to the UGV decision system by sending a "UAV decision system error" message in the event of an unknown decision system error. This should be considered by the UGV decision system as a serious error; there is nothing that the UGV decision system can do, as the UAV will most likely have crashed.

5.4 Control Interface

5.4.1 Control Integration

Progress
Due to the number of issues encountered getting the hardware control interface working, as described in previous sections, the control integration of the project software remains largely incomplete. The actual software-hardware communications are in place for both control methods, the USB-PPM box and the USB-Voltage microcontroller; servo values can be sent to either piece of hardware, but the calculation of the servo values is incomplete.

5.4.2 Visualisation

Process
A small visualisation box is drawn in the corner of the video window. It gives a visual representation of the control servo values being sent out of the interface: four lines represent the four control sticks on the R/C transmitter, and the lengths of the lines represent the servo values. Figure 24 shows a screenshot of a test frame containing the control visualisation.
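A minimal sketch of such an overlay, using the OpenCV 1.x drawing API that the project already links against; the layout constants, normalisation assumption and function name are illustrative only.

#include "cv.h"

// Draw four horizontal bars whose lengths represent the four servo values
// (assumed here to be normalised to the range 0.0-1.0).
void drawControlBox(IplImage* image, const double servo[4])
{
    CvPoint base = cvPoint(20, 20);    // corner of the visualisation box
    for (int i = 0; i < 4; ++i)
    {
        int length = (int)(servo[i] * 40.0);    // scale value to pixels
        CvPoint start = cvPoint(base.x, base.y + i * 10);
        CvPoint end   = cvPoint(base.x + length, base.y + i * 10);
        cvLine(image, start, end, CV_RGB(255, 255, 0), 2);
    }
}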

Figure 24: Control Visualisation shown on a test frame (the four lines represent Pitch, Roll, Yaw and Throttle)

5.4.3 Remote Control Test Software

Purpose
A small Remote Control Test program was written to provide a simplified method of testing the control interface between the software and the R/C transmitter, without running the full vision software. It is compatible with both interface methods, the USB-PPM Box and the USB-Voltage microcontroller; the method used is set via a compile option. The program allows the keyboard to be used to change the servo values sent out by the selected interface: with each key press the appropriate servo value is incremented or decremented by a set step size. Instructions are shown on screen in the form of a transmitter (shown in Figure 25); pressing space resets the servo values to default, and pressing esc closes the program.

Figure 25: Screenshot of Remote Control Test software

Remote Control Pseudocode Representation

initialise interface method
    either SC-8000 servo controller or USB-RS232 serial port
if initialised correctly
    initiate command values
    while esc key not pressed
        calculate servo values from command values
        print servo values
        send servo values
        print instructions
        get key from keyboard
        adjust command values according to key pressed
        check command values are within limits
        clear screen
    loop
    set servo values to neutral
    send servo values
    close interface connection


6 Live Test Results

6.1 Hardware

6.1.1 Manual Flight with Camera Mounted

Live tests were conducted to assess the mobility of the X-UFO as a UAV platform. The first test was performed using the mains supply umbilical, with the camera and 9V battery mounted on the X-UFO. Upward acceleration was found to be fairly sluggish, gaining height at a rate of about 0.5 metres/sec. As a result of the extra weight the rotors have to spin faster to keep the UAV in flight, so turning speed is also reduced because the maximum difference between rotor speeds is reduced. Overall, however, the mobility of the UAV is still acceptable: as shown in Figure 26, the UAV was capable of taking off and manoeuvring around the test area from about 0.5 metres up to the ceiling at 3.0 metres. It is also worth noting that with the extra weight centrally distributed there were no visible negative effects on platform stability.

Figure 26: Screenshots from video showing UAV mobility with camera mounted

The second test used the mains supply umbilical with the camera mounted on the X-UFO, but without the 9V camera battery. Overall mobility was greatly improved, with upward speeds of around 1.0 metres/sec. Therefore if greater mobility is required in the future, it is recommended that the wireless camera also be powered from the same supply as the X-UFO.

The third test used the X-UFO battery to supply the X-UFO, with the camera and 9V battery also mounted. The extra weight from the camera and 9V battery almost prevents the X-UFO from taking off: height can barely be gained and, due to the short flight time of the X-UFO battery, flight was only achieved for 30 seconds before the battery could not keep up with the demand of full power to the motors. This test was repeated without the 9V camera battery and results similar to the first test were achieved; however, flight time was limited to about 4 minutes before it was impossible for the UAV to gain height. Therefore if a battery powered UAV is required, several steps could be taken.
- A reduction of weight on the UAV, such as the camera using the same battery as the X-UFO, and/or the removal of safety precautions such as the foam padding surrounding the rotors.
- An upgrade to the X-UFO battery. Batteries with higher capacity are readily available in professional R/C stores.
- An upgrade to the UAV platform. There are several higher rated quadrocopters available commercially; this option was considered at the start of the project, however due to their price they were not suitable.

6.1.2 Control Interface

Tests were conducted using the USB-Voltage interface method (described in section 4.4.4) and the Remote Control test software (described in section 5.4.3). These tests proved that the control interface works, with no noticeable lag between pressing a key on the laptop and the X-UFO reacting appropriately. As the Remote Control software provides servo values in 10 steps, it is an impractical control method for actual flight; however, whilst holding the top of the X-UFO to prevent collisions, it was clear that each channel functions correctly whilst varying the values from minimum to maximum.

6.2 Software

6.2.1 Fiducial Detection

Live tests were conducted to assess the fiducial tracking algorithm. The screenshots in Figure 27 show successful fiducial detection, with the likelihood of possible LEDs being the fiducials represented by the size of the green circle marking them. The normal light condition screenshot shows some possible LEDs which have been discarded; they are marked as tiny green dots due to the extremely low likelihood of them being the fiducials. The normal light condition screenshot shown in Figure 28 is also a good example of fiducial detection working correctly: small white reflections off a radiator can be seen, and although these would have been picked up as possible LEDs, they have been successfully discarded.

Despite coping with noise extremely well compared with previous detection algorithms, detection problems still occur in some situations. Where the LEDs are very close to reflective surfaces, the reflection can merge with the light from the LED. This results in a larger mass of light in roughly the same position as the LED, which will be discarded by the fiducial detection because the shape will not be circular or the same size as the other fiducials. In cases where there happens to be noisy light in the frame which is the same shape and size as a fiducial, the algorithm can also fail; if the noise also happens to lie at right angles to the other fiducials, this can lead to the noise being picked over an actual fiducial.


Figure 27: Screenshots showing Fiducial Detection in Low and Normal Light Conditions

6.2.2 Orientation Calculations

Live tests were conducted to assess the orientation calculations in live conditions. Due to the relatively simple nature of the orientation algorithm, orientation calculations are successful 100% of the time as long as fiducial detection is successful. Screenshots showing successful orientation calculations are shown in Figure 28.

Figure 28: Screenshots showing Orientation Calculations in Low and Normal Light Conditions


7 Additional Work

7.1 Project Planning

7.1.1 Gantt Charts

A Gantt chart was produced at the start of the project (shown in Appendix A) which included a breakdown of the expected tasks during the project. The time allocated to some of the tasks proved to be overly optimistic, and did not take into account busy times of the year or the four weeks between semesters; there were also some unanticipated tasks which took up several weeks. At the time of writing the Interim Report the Gantt chart was revised, and it has since been updated again with the time taken for each task and the addition of extra tasks (shown in Appendix B).

In the Gantt charts provided in Appendix A and Appendix B, orange squares show the planned time scales for each task and green lines show the actual time taken to complete tasks. If a task has a green line but no orange square, it is an unexpected task added into the Gantt chart after it was first produced. In the revised Gantt chart in Appendix B, lighter orange squares show the time plan carried over from the first Gantt chart for comparison.

7.1.2 Costing

The total budget for the SEP project was £1000; a breakdown of hardware costs for the UAV Vision and Control project is shown in Table 1. The original planned costs for the author's project totalled £299.48, but with the need for a mains supply for the X-UFO the last two items in the table were ordered, raising the total to £321.45. The total may be more than a quarter of the SEP budget, but the other three projects obtained a large amount of hardware for free.

Table 1: Cost Breakdown of Hardware Ordered

7.1.3 Hardware Sourcing

A list of the part numbers and sources of hardware ordered from the SEP budget is shown in Table 2. For future reference, flyonthewall.uk.com was a particularly unreliable source: it took over a month to receive the wireless camera ordered from them, which hampered progress on the project whilst waiting for the camera (a major component of the project). Tom's RC is based in the USA, but the SC-8000 was received within a week. Most of the small components were ordered from Rapid through lab services as it is a preferred source; however, parts may be found cheaper elsewhere.

Table 2: Part number and Source of Hardware Ordered

7.2 Project Website

As with the 2006/07 SEP, a project website has also been produced for 2007/08. The author produced the UAV project page, located at http://www.ee.surrey.ac.uk/Projects/SEP/0708/uav/. This provides an overview of the progress made on the UAV Vision and Control System project and the results achieved. It also has a photo gallery containing development photos taken during the project and some development videos, and provides download links for the project code, resources, and this report. The author also produced the main 2007/08 introduction page, located at http://www.ee.surrey.ac.uk/Projects/SEP/0708/. This provides an introduction to this year's projects, and a group photo of the 2007/08 SEP members.

7.3 Project Meetings

During the course of the project, weekly meetings were held to discuss progress, problems and targets. The meetings were chaired by Ben Abbott as the project leader, and meeting minutes were taken each week by the other team members; the author participated in minute taking every three weeks or less, and also chaired a meeting once. The result of the meetings was that all members of SEP had time to sit down, discuss the issues they were having, and decide on the targets to be achieved by the next week's meeting.


8 Conclusions
Over the course of the project the most progress was made on the vision fiducial tracking system. It can successfully pick out five fiducials from a video frame in a range of lighting conditions, and can handle noise to a certain extent. Therefore objective 1, "to produce a robust vision system capable of tracking a set of fixed fiducials", can be considered achieved. It could be argued that the robustness can be improved further; there will always be some situations where it would be extremely difficult to pick fiducials out of a frame using just the current methods employed in the project. Some improvements considered are described in section 9.3.

The orientation calculations of the UAV relative to the UGV are performed correctly. The exact location calculations have not been implemented due to the delay in getting the control interface working; without an interface there was no need for the location information. Therefore objective 2, "to use information provided by the vision system to establish the camera's location relative to the fiducials", can only be considered partially achieved.

The control interface, despite taking a large amount of the project time to produce, is in a working prototype state. It would have been extremely beneficial to the rest of the project objectives if the originally planned interface had worked as it should first time. Despite being a prototype, objective 3, "to produce a control and decision system to interface between the vision system and UAV", can be considered achieved. A generic black box final product would have been an ideal solution; however, the final solution only works with the modified X-UFO transmitter. If a more professional platform were to be purchased, the originally planned interface is also in place.

Unfortunately objective 4, "to stabilise a UAV above a set of fiducials fixed to the UGV, by using a vision system for the control feedback", has not been achieved. Due to the lack of a working control interface for almost the entire project duration, it has not been possible to start work on the control feedback system. None of the secondary objectives were attempted, due to the time constraints on the project.

The project management aspects of the project have had varied success. Regular weekly meetings helped keep the SEP team motivated and focused; however, the author's efforts to plan task timings using a Gantt chart failed several weeks into the project. Too many small tasks were overlooked, and the whole plan was affected drastically by the constant delays in producing the control interface.


9 Discussions

9.1 Gantt Charts

The original Gantt chart (shown in Appendix A) from the start of the project now seems very optimistic. I started off well but soon fell behind as extra tasks, which were not originally anticipated, started to emerge. The fact that I was unlikely to get much project work completed during the exam period also passed me by. The revised Gantt chart (shown in Appendix B) seemed more realistic, as it also takes into account the time available during holidays. However, after several weeks and countless more problems with the control interface, it was clear that tasks were falling far behind schedule.

9.2 Project Objectives

I am reasonably happy with the overall progress made on my project; obviously I would like to have achieved more, but I feel that largely completing three out of four objectives in such a large project is very good progress. I am really happy with the vision system I have produced: I came up with a lot of the ideas for its implementation without the need for advice, and my programming skills have improved greatly over the year.

Achieving a working control interface took too much time during the project. If I could change anything, it would be the early decisions about the control interface: two better possibilities would have been to buy a more expensive aerial platform, one with a professional R/C transmitter which is not intended as a toy, or to have decided to modify the internals of the X-UFO transmitter at an earlier stage of the project.

Failing to achieve automated flight is a great disappointment to me, as this would have been the most exciting part of the project to get working. However, with the number of problems encountered with the control interface, it became clear late on in the project that this would not be implemented.

Towards the end of the project it began to feel like I was working on two projects. If the original
interface plan had worked first time I feel I would have been able to achieve automated flight in
some form and a more reliable tracking system. In hindsight it would have made more sense to
split my project between two students.

9.3 Future Work

Although the fiducial detection works reasonably reliably, there are several thresholds and constants within the software header files which, given time, could be optimised to produce greater reliability.

There are still extra reliability methods I did not have time to implement, such as temporal checking of possible LEDs during fiducial detection; an example of this is assigning a higher likelihood to possible LEDs which are close to the locations of fiducials in the previous frame.

There are two other methods of tracking I considered during the project, to be included alongside the current fiducial detection. The first is using edge detection to pick out the edges of the UGV: after finding the edges, corners can be found, and the positions of these corners could be compared to the positions of possible LEDs during fiducial detection to improve reliability. Secondly, I considered implementing KLT tracking, again to improve reliability as well as to provide additional positional information. The remnants of testing these ideas can be found within my project code; however, there was not enough time to experiment further.

A method of coping with the merging of noisy light and fiducial light could be to detect circular edges using available edge detection algorithms. This could further improve fiducial detection by being able to recognise at least half of the circular fiducial shape.

I spent some time trying to find ready-made code for camera pose estimation. There is most likely something available; however, this was put to one side until the control interface was working. Implementing an established pose estimation algorithm would produce more accurate values to feed to the control feedback system.

A large amount of work is needed on the control feedback system, which is nearly non-existent due to the lack of time remaining after a working control interface was finally produced late in the project. It would also be good to see a finished black box version of the control interface with a surface-mount PCB.


References

1. microdrones GmbH: MD 4-200 System Features
   http://www.microdrones.com/md4-200.html, 2008-05-01

2. microdrones GmbH: Description of PPM9_USART
   http://www.microdrones.com/documents/md_ppm9_usart_r1.0.pdf, 2007-10

3. Daily Mail: Police use remote-controlled drone to spy on crowd at V festival
   http://www.dailymail.co.uk/pages/live/articles/technology/technology.html?in_article_id=476748&in_page_id=1965, 2007-08-21

4. Telegraph: 'Microdrone', the police's tiny eye in the sky
   http://www.telegraph.co.uk/news/uknews/1580986/%27Microdrone%27%2C-the-police%27s-tiny-eye-in-the-sky.html, 2008-03-08

5. "Estimation and Control of a Quadrotor Vehicle Using Monocular Vision and Moiré Patterns"
   Glenn P. Tournier, Mario Valenti, Jonathan P. How, and Eric Feron,
   http://acl.mit.edu/papers/GNC06_TournierHow.pdf, 2006-08
6. Abstract (page 1)
7. Figure 3 (page 3)
8. Moiré Target Overview (page 2)

9. "Six Degree of Freedom Estimation Using Monocular Vision and Moiré Patterns"
   Glenn P. Tournier, http://vertol.mit.edu/papers/tournier_thesis.pdf, 2006-06

10. Picture of the "Heli-Q" miniature helicopter
    http://www.nahpco.com/images/8003.jpg

11. Picture of the "Walkera Dragonfly" helicopter
    http://www.airbornerc.com.au/images/dragonfly1a_big.jpg

12. Picture of the "X3D-BL UFO" quadrocopter
    http://www.xufo-shop.de/.media/048902790336.png

13. Picture of the "Tri-Turbofan" blimp
    http://www.modelspot.com/tripage-blimp.jpg

14. Wikipedia: Quadrotor Flight Control
    http://en.wikipedia.org/wiki/Image:Quadrotor_yaw_torque.png, 2007-01-15

15. OpenCV Wiki: What is Open CV?
    http://opencvlibrary.sourceforge.net/Welcome, 2007-10-12

16. CvCam Manual: Introduction
    cvcam.rtf distributed with OpenCV, 2007

17. HighGUI Reference Manual: HighGUI overview
    http://opencvlibrary.sourceforge.net/HighGui, 2007-04-11

18. Ahmed Aichi's SEP Homepage
    http://www.ee.surrey.ac.uk/Projects/SEP/0607/network/, 2007

19. Hammacher Schlemmer
    http://www.hammacher.com/publish/72612.asp?promo=xsells#, 2008-05-03

20. Silverlit Toys Manufactory Ltd
    http://www.silverlit-flyingclub.com/xufo.htm, 2007

21. Tom's RC
    http://www.tti-us.com/rc/sc8000tech.htm, 2007

22. flyonthewall.uk.com
    http://www.flyonthewall.uk.com, 2007


Appendix A: Original Gantt Chart


Appendix B: Revised Gantt Chart


Appendix C: Vision and Control System Software Code


aerial_platform.h

//--------------------------------------------------------------------------//
// aerial_platform.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains various bits for aerial_platform project
//--------------------------------------------------------------------------//

#ifndef __AERIAL_PLATFORM_H__
#define __AERIAL_PLATFORM_H__

// Compile options
//#define CONNECT_NETAPI
//#define USE_UGV_EMULATION
#define BCAST_LOCATION "C:/SEP_code/netAPI/"
#define EMULATOR_EXE "C:/SEP_code/ugv_emulation/release/ugv_emulation.exe"
//#define SEP_REPORT_ERRORS
//#define PAUSE_ON_ENTER

// Use to test individual frame


//#define PROCESS_FRAME_TEST "C:/SEP_code/aerial_platform/test_frames/bw_320x240_led_perfect.bmp"
//#define PROCESS_FRAME_TEST "C:/SEP_code/aerial_platform/test_frames/bw_320x240_led_speckled.bmp"
//#define PROCESS_FRAME_TEST "C:/SEP_code/aerial_platform/test_frames/bw_320x240_led_noisy.bmp"
//#define PROCESS_FRAME_TEST "C:/SEP_code/aerial_platform/test_frames/bw_320x240_3xled_noisy.bmp"
//#define PROCESS_FRAME_TEST "C:/SEP_code/aerial_platform/test_frames/col_320x240_5xled_speckled.bmp"
//#define PROCESS_FRAME_TEST "C:/SEP_code/aerial_platform/test_frames/col_320x240_led_real.jpg"
//#define PROCESS_FRAME_TEST "C:/SEP_code/aerial_platform/test_frames/col_320x240_5xled_realish.jpg"
//#define PROCESS_FRAME_TEST "C:/SEP_code/aerial_platform/test_frames/bw_720x576_5xled_very_noisy.bmp"
//#define PROCESS_FRAME_TEST "C:/SEP_code/aerial_platform/test_frames/bw_720x576_odd_5xled_very_noisy.bmp"
//#define PROCESS_FRAME_TEST "C:/SEP_code/aerial_platform/test_frames/bw_720x576_even_5xled_very_noisy.bmp"
//#define PROCESS_FRAME_TEST "C:/SEP_code/aerial_platform/test_frames/bw_720x576_noisy_shapes.bmp"

// KLT feature selection


//#define PROCESS_KLT_FEATURE "C:/SEP_code/aerial_platform/test_frames/feature_320x240_testledboard_close.jpg"
#define PROCESS_KLT_FEATURE "C:/SEP_code/aerial_platform/test_frames/feature_320x240_testledboard_mid.jpg"

#define CTRLMETHOD_USB2VOLTAGE
//#define CTRLMETHOD_SC8000

#define COMM_STRING "\\\\.\\COM13"


#define SC8000_COMM_PORT 3

#define CONTROL_CAMERA 1
#define CONTROL_CAMERA_WIDTH 720
#define CONTROL_CAMERA_HEIGHT 576
#define CONTROL_CAMERA_FPS 30.0


#define PROCESS_LEDS
//#define PROCESS_EDGES
//#define PROCESS_KLT

#define CALL_OK 1
#define CALL_ERROR 0

#define ERRORBIT_ACKNOWLEDGE 1 // 0x01


#define ERRORBIT_DECISIONERROR 2 // 0x02
#define ERRORBIT_PARTIALTRACKING 4 // 0x04
#define ERRORBIT_COMPLETETRACKING 8 // 0x08
#define ERRORBIT_INVALIDCOMMAND 16 // 0x10
#define ERRORBIT_BATTERYLOW 32 // 0x20
#define ERRORBIT_RESERVED1 64 // 0x40
#define ERRORBIT_RESERVED2 128 // 0x80

typedef enum {
STATE_CLOSE,
STATE_ERROR,
STATE_INIT,
STATE_PAUSED,
STATE_RESUME,
STATE_START,
STATE_STOPPED
} state_machine;

#endif //__AERIAL_PLATFORM_H__

aerial_platform.cpp

//--------------------------------------------------------------------------//
// aerial_platform.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains main() and procesing_callback() for control of the aerial
// platform
//--------------------------------------------------------------------------//

#include "stdafx.h"
#include "aerial_platform.h"
#include "sep_uavcomms.h"
#include "sep_cvcam.h"
#include "sep_commands.h"
#include "sep_uavcontrol.h"
#include "sep_orientation.h"
#include "cvcam.h"
#include "highgui.h"

#ifdef PROCESS_LEDS
#include "sep_leds.h"
#endif //ifdef PROCESS_LEDS
#ifdef PROCESS_EDGES
#include "sep_edges.h"
#endif //ifdef PROCESS_EDGES
#ifdef PROCESS_KLT
#include "sep_klttracker.h"
#endif //ifdef PROCESS_KLT

void processing_callback(IplImage* image)
{
#ifdef PROCESS_LEDS
#ifdef CONNECT_NETAPI
static sepUAVComms callbackComms;
if (callbackComms.commsConnected())
{
#endif // ifdef CONNECT_NETAPI

bool gotcommands(false);
static sepLEDs callbackProcess;
static sepOrientation callbackOrientation;
static sepUAVControl callbackUAVControl;
// error status bits
static char callbackErrorBits(0);

#ifdef CONNECT_NETAPI
// check for target height message
static double tempheight;
if (callbackComms.commsCheck(tempheight))
{
callbackErrorBits += ERRORBIT_ACKNOWLEDGE;
callbackOrientation.SetTargetHeight(tempheight, callbackErrorBits);
}
#endif // ifdef CONNECT_NETAPI

// 5 reference points from processing


static CvPoint refpoints[5];
// commands from orientation
// [0] Forward(+)/Backward(-), [1] Left(+)/Right(-)
// [2] Up(+)/Down(-), [3] TurnLeft(-180->0)/TurnRight(0->180)
static double controldirections[4];
// do processing and get ref points from first field
callbackProcess.ProcessField(image, FIELD_ODD);

callbackProcess.Get5RefPoints(refpoints);
// calc orientation and get commands
gotcommands = callbackOrientation.GetCommands(image, refpoints,
controldirections);
// if got valid commands proceed and send commands to UAV
if (gotcommands)
{
// adapt values for grouping and distance thresholds
callbackProcess.AdaptValues(callbackOrientation.GetMinDistance());
// send commands to uav
// callbackUAVControl.SendCommands(controldirections);
}
else
{
// check orientation class for reportable errors
callbackOrientation.GetErrors(callbackErrorBits);
}
// show control visualisation in video window
callbackUAVControl.RefreshDisplay(image);

#ifdef CONNECT_NETAPI
// transmit status if needed
if (callbackErrorBits)
{
callbackComms.commsSend(callbackErrorBits);
callbackErrorBits = 0;
}
}
#endif //ifdef CONNECT_NETAPI
#endif //ifdef PROCESS_LEDS

#ifdef PROCESS_EDGES
static sepEdges callbackProcess;
callbackProcess.ProcessFrame(image);
#endif //ifdef PROCESS_EDGES


#ifdef PROCESS_KLT
static sepKLTTracker callbackProcess;
callbackProcess.ProcessFrame(image);
#endif //ifdef PROCESS_KLT
}

int main(int argc, char** argv)
{
#ifndef PROCESS_FRAME_TEST

VidFormat controlformat = {CONTROL_CAMERA_WIDTH, CONTROL_CAMERA_HEIGHT, CONTROL_CAMERA_FPS};
sepCVCAM ControlCamera(CONTROL_CAMERA, controlformat, processing_callback);

// enter main state machine


state_machine current_state = STATE_INIT, next_state = STATE_INIT;
while(current_state != STATE_CLOSE)
{
switch (current_state)
{
// init state
case STATE_INIT:
cvNamedWindow("control camera", CV_WINDOW_AUTOSIZE);
if (ControlCamera.camInit(
(HWND)cvGetWindowHandle("control camera")) == CALL_OK)
{
#ifdef PAUSE_ON_ENTER
next_state = STATE_STOPPED;
#else //ndef PAUSE_ON_ENTER
next_state = STATE_START;
#endif //PAUSE_ON_ENTER
}
else
{
next_state = STATE_ERROR;
}
break;

// stopped state
case STATE_STOPPED:
command_line_in(current_state, next_state);
break;

// start state
case STATE_START:
ControlCamera.camStart();
#ifdef PAUSE_ON_ENTER
next_state = STATE_PAUSED;
#else //ndef PAUSE_ON_ENTER
next_state = STATE_CLOSE;
#endif //PAUSE_ON_ENTER
break;

// paused state
case STATE_PAUSED:
command_line_in(current_state, next_state);
break;

// resume state
case STATE_RESUME:
ControlCamera.camResume();
next_state = STATE_PAUSED;
break;

// close state
case STATE_CLOSE:
#ifdef PAUSE_ON_ENTER
ControlCamera.camKill();
cvDestroyWindow("control camera");
#else //ndef PAUSE_ON_ENTER
cvDestroyWindow("control camera");
#endif //PAUSE_ON_ENTER
break;

// error state
case STATE_ERROR:
command_line_in(current_state, next_state);
break;

// default
default:
#ifdef SEP_REPORT_ERRORS
cerr << "state machine entered unknown state" << endl;
#endif // SEP_REPORT_ERRORS
next_state = STATE_ERROR;
}
current_state = next_state;
}

#else //PROCESS_FRAME_TEST

IplImage* frame_in = cvLoadImage(PROCESS_FRAME_TEST, CV_LOAD_IMAGE_COLOR);


processing_callback(frame_in);
cvNamedWindow("frame", CV_WINDOW_AUTOSIZE);
cvShowImage("frame",frame_in);
cvWaitKey(0);

#endif //PROCESS_FRAME_TEST

return 0;
}

sep_commands.h

//--------------------------------------------------------------------------//
// sep_commands.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains various bits for aerial_platform project
//--------------------------------------------------------------------------//

#ifndef __SEP_COMMANDS_H__
#define __SEP_COMMANDS_H__

#include "aerial_platform.h"

int command_line_in(state_machine current_state, state_machine& next_state);

#endif //__SEP_COMMANDS_H__

sep_commands.cpp

//--------------------------------------------------------------------------//
// sep_commands.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains command_line_in() for reading user commands and changing
// state accordingly
//--------------------------------------------------------------------------//

#include "stdafx.h"
#include "sep_commands.h"

int command_line_in(state_machine current_state, state_machine& next_state)
{
char command_in;
switch (current_state)
{
// in stopped state
case STATE_STOPPED:
cout << "input command (r = run, c = close)...";
cin >> command_in;
switch (command_in)
{
case 'c':
next_state = STATE_CLOSE;
break;
case 'r':
next_state = STATE_START;
break;
default:
#ifdef SEP_REPORT_ERRORS
cerr << "invalid input" << endl;
#endif // SEP_REPORT_ERRORS
return CALL_ERROR;
}
break;

// in paused state
case STATE_PAUSED:
cout << "input command (r = resume, c = close)...";
cin >> command_in;
switch (command_in)
{
case 'c':
next_state = STATE_CLOSE;
break;
case 'r':
next_state = STATE_RESUME;
break;
default:
#ifdef SEP_REPORT_ERRORS
cerr << "invalid input" << endl;
#endif // SEP_REPORT_ERRORS
return CALL_ERROR;
}
break;

// in error state
case STATE_ERROR:
cout << "input command (c = close)...";
cin >> command_in;
switch (command_in)
{
case 'c':
next_state = STATE_CLOSE;
break;
default:
#ifdef SEP_REPORT_ERRORS
cerr << "invalid input" << endl;
#endif // SEP_REPORT_ERRORS
return CALL_ERROR;
}
break;
}
return CALL_OK;
}

sep_cvcam.h

//--------------------------------------------------------------------------//
// sep_cvcam.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for sepCVCAM class
//--------------------------------------------------------------------------//

#ifndef __SEP_CVCAM_H__
#define __SEP_CVCAM_H__

#include "cvcam.h"
#include "highgui.h"

class sepCVCAM
{
public:
sepCVCAM(int number, VidFormat format, void* callback);
~sepCVCAM();
int camInit(HWND window);
int camStart();
int camPause();
int camResume();
int camKill();
private:
sepCVCAM(){}
int cam_number;
VidFormat cam_format;
HWND cam_window;
void* cam_callback;
};

#endif //__SEP_CVCAM_H__
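
A sketch of the intended lifecycle of the class; vid_format, window_handle and processing_callback are placeholders for the values set up in aerial_platform.cpp:

// illustrative only: drive one camera through init, run and shutdown
sepCVCAM camera(0, vid_format, (void*)processing_callback);
if (camera.camInit(window_handle) == CALL_OK)
{
camera.camStart(); // with PAUSE_ON_ENTER defined this pauses on enter
camera.camKill();
}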

sep_cvcam.cpp

//--------------------------------------------------------------------------//
// sep_cvcam.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the sepCVCAM class for using cvcam for video input
//--------------------------------------------------------------------------//

#include "stdafx.h"
#include "sep_cvcam.h"
#include "aerial_platform.h"

sepCVCAM::sepCVCAM(int number, VidFormat format, void* callback)
{
// keep cvcam variables
cam_number = number;
cam_format = format;
cam_callback = callback;
}

sepCVCAM::~sepCVCAM()
{
}

int sepCVCAM::camInit(HWND window)
{
// set window handle
cam_window = window;

// check camera(s) exist
if (!cvcamGetCamerasCount())
{
cerr << "no cameras detected" << endl;
return CALL_ERROR;
}
// set cvcam properties
cvcamSetProperty(cam_number, CVCAM_PROP_ENABLE, CVCAMTRUE);
cvcamSetProperty(cam_number, CVCAM_PROP_CALLBACK, cam_callback);
cvcamSetProperty(cam_number, CVCAM_PROP_WINDOW, &cam_window);
cvcamSetProperty(cam_number, CVCAM_PROP_SETFORMAT, &cam_format);

// check cvcam initiated
if (!cvcamInit())
{
cerr << "error initiating cvcam" << endl;
return CALL_ERROR;
}
else
{
cout << "cvcam initiated" << endl;
return CALL_OK;
}
}

int sepCVCAM::camStart()
{
// start cvcam
cvcamStart();
#ifdef PAUSE_ON_ENTER
cout << "cvcam started, press enter to pause" << endl;
cvWaitKey(0);
cvcamPause();
cout << "cvcam paused" << endl;
#else //ndef PAUSE_ON_ENTER
cout << "cvcam started" << endl;
cvWaitKey(0);
cvcamStop();
cvcamExit();
cout << "cvcam killed" << endl;
#endif //PAUSE_ON_ENTER
return CALL_OK;
}

int sepCVCAM::camPause()
{
// pause cvcam
cvcamPause();
cout << "cvcam paused" << endl;
return CALL_OK;
}

int sepCVCAM::camResume()
{
// resume cvcam and wait for keys
cvcamResume();
cout << "cvcam resumed, press enter to pause" << endl;
cvWaitKey(0);
cvcamPause();
cout << "cvcam paused" << endl;
return CALL_OK;
}

int sepCVCAM::camKill()
{
// stop and close cvcam
cvcamStop();
cvcamExit();
cout << "cvcam killed" << endl;
return CALL_OK;
}

sep_edges.h

//--------------------------------------------------------------------------//
// sep_edges.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for sepEdges class
//--------------------------------------------------------------------------//

#ifndef __SEP_EDGES_H__
#define __SEP_EDGES_H__

#include "cv.h"
#include "highgui.h"
#include "aerial_platform.h"

class sepEdges
{
public:
sepEdges();
~sepEdges();
void ProcessFrame(IplImage* frameimage);
private:
CvSize imagesize;
CvMemStorage* storage;
IplImage* originalimage;
IplImage* bwcopyimage;
};

#endif //ndef __SEP_EDGES_H__

sep_edges.cpp

//--------------------------------------------------------------------------//
// sep_edges.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the sepEdges class for detecting and tracking edges of UGV
//--------------------------------------------------------------------------//

#include "stdafx.h"
#include "sep_edges.h"

// edges constructor
sepEdges::sepEdges()
{
imagesize = cvSize(CONTROL_CAMERA_WIDTH,CONTROL_CAMERA_HEIGHT);
bwcopyimage = cvCreateImage(imagesize,IPL_DEPTH_8U,1);
storage = cvCreateMemStorage(0);
}

// edges deconstructor
sepEdges::~sepEdges()
{
cvReleaseImage(&bwcopyimage);
cvReleaseMemStorage(&storage);
}

// processes image
void sepEdges::ProcessFrame(IplImage *frameimage)
{
originalimage = frameimage;
// convert to gray scale
cvCvtColor(originalimage, bwcopyimage, CV_BGR2GRAY);
// use cvcanny
CvSeq* lines = 0;
int linecount;
cvCanny(bwcopyimage, bwcopyimage, 50, 200, 3);
// use houghlines
lines = cvHoughLines2(bwcopyimage, storage, CV_HOUGH_PROBABILISTIC,
1, CV_PI/180, 80, 30, 10);
for(linecount = 0; linecount < lines->total; linecount++)
{
CvPoint* line = (CvPoint*)cvGetSeqElem(lines, linecount);
cvLine(originalimage, line[0], line[1], CV_RGB(0,0,255), 2, 8);
}
}
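
A hypothetical single-frame usage of the class; "ugv_frame.jpg" is a placeholder file name, and the frame is assumed to match CONTROL_CAMERA_WIDTH by CONTROL_CAMERA_HEIGHT:

// illustrative only: overlay detected lines on one test image
sepEdges edges;
IplImage* frame = cvLoadImage("ugv_frame.jpg", CV_LOAD_IMAGE_COLOR);
if (frame)
{
edges.ProcessFrame(frame); // draws the Hough lines in blue
cvNamedWindow("edges", CV_WINDOW_AUTOSIZE);
cvShowImage("edges", frame);
cvWaitKey(0);
cvReleaseImage(&frame);
}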

sep_klttracker.h

//--------------------------------------------------------------------------//
// sep_klttracker.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for sepKLTTracker class
//--------------------------------------------------------------------------//

#ifndef __SEP_KLTTRACKER_H__
#define __SEP_KLTTRACKER_H__

#include "cv.h"
#include "highgui.h"
#include "aerial_platform.h"

#define FEATURES_TO_TRACK 500

class sepKLTTracker
{
public:
sepKLTTracker();
~sepKLTTracker();
void ProcessFrame(IplImage* frameimage);
private:
void ProcessFeature(void);
CvSize imagesize;
IplImage* originalimage;
IplImage* bworiginalimage;
IplImage* bwfeatureimage;
IplImage* eigimage;
IplImage* tempimage;
IplImage* pyramid1;
IplImage* pyramid2;
CvPoint2D32f featurefeatures[FEATURES_TO_TRACK];
int numfeatures;
CvPoint2D32f framefeatures[FEATURES_TO_TRACK];
CvSize opticalflowwindow;
CvTermCriteria opticalflowterminationcriteria;
char opticalflowfoundfeature[FEATURES_TO_TRACK];
float opticalflowfeatureerror[FEATURES_TO_TRACK];
};

#endif //ndef __SEP_KLTTRACKER_H__

sep_klttracker.cpp

//--------------------------------------------------------------------------//
// sep_klttracker.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the sepKLTTracker class for KLT feature tracking via optical flow
//--------------------------------------------------------------------------//

#include "stdafx.h"
#include "sep_klttracker.h"

// opticalflow constructor
sepKLTTracker::sepKLTTracker():
imagesize(cvSize(CONTROL_CAMERA_WIDTH, CONTROL_CAMERA_HEIGHT)),
opticalflowwindow(cvSize(2,2)),
opticalflowterminationcriteria(
cvTermCriteria(CV_TERMCRIT_ITER | CV_TERMCRIT_EPS, 20, 0.3)),
numfeatures(FEATURES_TO_TRACK)
{
// create required images
bworiginalimage = cvCreateImage(imagesize, IPL_DEPTH_8U, 1);
eigimage = cvCreateImage(imagesize, IPL_DEPTH_32F, 1);
tempimage = cvCreateImage(imagesize, IPL_DEPTH_32F, 1);
pyramid1 = cvCreateImage(imagesize, IPL_DEPTH_8U, 1);
pyramid2 = cvCreateImage(imagesize, IPL_DEPTH_8U, 1);
ProcessFeature();
}

// opticalflow deconstructor
sepKLTTracker::~sepKLTTracker()
{
cvReleaseImage(&bwfeatureimage);
cvReleaseImage(&bworiginalimage);
cvReleaseImage(&eigimage);
cvReleaseImage(&tempimage);
cvReleaseImage(&pyramid1);
cvReleaseImage(&pyramid2);
}

// processes the feature frame
void sepKLTTracker::ProcessFeature(void)
{
// find good features to track in feature image
bwfeatureimage = cvLoadImage(PROCESS_KLT_FEATURE,0);
cvGoodFeaturesToTrack(bwfeatureimage, eigimage, tempimage,
featurefeatures, &numfeatures, 0.01, 0.01, NULL, 9, 0, 0.04);
}

// processes image
void sepKLTTracker::ProcessFrame(IplImage *frameimage)
{
// convert image to gray scale
originalimage = frameimage;
cvCvtColor(originalimage, bworiginalimage, CV_BGR2GRAY);
// run optical flow algorithm
cvCalcOpticalFlowPyrLK(bwfeatureimage, bworiginalimage, pyramid1, pyramid2,
featurefeatures, framefeatures, numfeatures, opticalflowwindow, 5,
opticalflowfoundfeature, opticalflowfeatureerror,
opticalflowterminationcriteria, 0 );
// visualise features in frame
int featurecount;
for(featurecount = 0; featurecount < numfeatures; featurecount++)
{
// if Pyramidal Lucas Kanade didn't really find the feature, skip it
if (opticalflowfoundfeature[featurecount] == 0 ) continue;

int linethickness(1);
CvScalar linecolor(CV_RGB(255,0,0));

// get arrow for feature
CvPoint p,q;
double angle, hypotenuse;
p.x = (int) featurefeatures[featurecount].x;
p.y = (int) featurefeatures[featurecount].y;
q.x = (int) framefeatures[featurecount].x;
q.y = (int) framefeatures[featurecount].y;
angle = atan2( (double) p.y - q.y, (double) p.x - q.x );
hypotenuse = sqrt( (double)((p.y - q.y)*(p.y - q.y) +
(p.x - q.x)*(p.x - q.x)) );

// draw main line of arrow
cvLine(originalimage, p, q, linecolor, linethickness, CV_AA, 0);

// draw tip of arrow
p.x = (int) (q.x + 9 * cos(angle + CV_PI / 4));
p.y = (int) (q.y + 9 * sin(angle + CV_PI / 4));
cvLine(originalimage, p, q, linecolor, linethickness, CV_AA, 0);
p.x = (int) (q.x + 9 * cos(angle - CV_PI / 4));
p.y = (int) (q.y + 9 * sin(angle - CV_PI / 4));
cvLine(originalimage, p, q, linecolor, linethickness, CV_AA, 0);
}
}
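
The tracker is driven one frame at a time; a sketch, with frame_in as in the aerial_platform listing:

// illustrative only: reference features are loaded once from
// PROCESS_KLT_FEATURE in the constructor
sepKLTTracker tracker;
tracker.ProcessFrame(frame_in); // draws per-feature motion arrows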

sep_leds.h

//--------------------------------------------------------------------------//
// sep_leds.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for sepLEDs class
//--------------------------------------------------------------------------//

#ifndef __SEP_LEDS_H__
#define __SEP_LEDS_H__

#include "cv.h"
#include "highgui.h"
#include "aerial_platform.h"
#include <map>

//#define DEBUG_SHOW_HISTOGRAM_INFO
//#define DEBUG_SHOW_ORIENTATION_INFO
//#define DEBUG_SHOW_LIKELIHOOD_INFO
//#define DEBUG_SHOW_THRESHOLD_VALUES

#define SHOW_USEFUL_ONLY
#define LIKELIHOOD_CIRCLE_WEIGHT 0.05
#define SIZE_HISTOGRAM_STEPS_START 25
#define SIZE_HISTOGRAM_STEP_JUMP 10
#define HISTOGRAM_REPETITIONS 2
#define ORIENTATION_ANGLE_VARIATION 20
#define PERCENT_OF_RED_PEAK 0.3 // 30%
#define PERCENT_OF_WHITE_PEAK 0.05 // 5%
#define DEFAULT_LED_REDTHRESHOLD 50
#define DEFAULT_LED_WHITETHRESHOLD 700
#define DEFAULT_LED_DISTTHRESHOLD 20
#define DEFAULT_LED_GROUPTHRESHOLD 20
#define LED_NUM_USED 5 // must be 5 or less

#define FIELD_ODD 0
#define FIELD_EVEN 1
#define R_OFFSET 2
#define G_OFFSET 1
#define B_OFFSET 0

typedef struct _Led {
CvPoint Position;
int pixelcount;
double likelihood;
} LedDetails;

typedef struct _LedOrientation {
int LEDorigin;
int LED1;
int LED2;
double likelihood;
double angle;
} LedOrientation;

class sepLEDs
{
public:
sepLEDs();
~sepLEDs();
void ProcessField(IplImage* frame_image, int fieldin);
void ClearField(int fieldin);
void Get5RefPoints(CvPoint* ref_points);
void AdaptValues(int min_distance);
private:
void ProcessFrame_ScanPixels(void);
double ProcessFrame_PixelDist(int x_compare, int y_compare, int led_compare);
void ProcessFrame_PickLeds(void);
double TestCircle(CvPoint TestPoint);
void SizeHistogram(void);
void TestOrientation(void);
double PointAngle(CvPoint Origin, CvPoint Point1, CvPoint Point2);
int PixelValue(const char& charin);
bool QuestionPixel_Useful(char& r, char& g, char& b);
bool QuestionPixel_StrongRed(int& r, int& g, int& b);
bool QuestionPixel_BrightWhite(int& r, int& g, int& b);
char* AccessPixel(CvPoint pnt, IplImage* img);
int processtype;
IplImage* ImageData;
int field;
int peak_red;
int peak_white;
int adaptled_redthreshold;
int adaptled_whitethreshold;
int adaptled_distthreshold;
int adaptled_groupthreshold;
LedDetails PrevLEDsFound[LED_NUM_USED];
LedDetails LEDsFound[LED_NUM_USED];
map<int,LedDetails> PossibleLeds;
};

inline int sepLEDs::PixelValue(const char& charin) {
// convert from two's complement values
if (charin < 0) return (charin + 256);
else return (charin); }

inline bool sepLEDs::QuestionPixel_StrongRed(int& r, int& g, int& b) {
if ((r-b-g) > adaptled_redthreshold)
return true;
else
return false; }

inline bool sepLEDs::QuestionPixel_BrightWhite(int& r, int& g, int& b) {
if ((r+b+g) > adaptled_whitethreshold)
return true;
else
return false; }

inline char* sepLEDs::AccessPixel(CvPoint pnt, IplImage* img) {
int pos = ((img->widthStep)*pnt.y) + ((img->nChannels)*pnt.x);
return &img->imageData[pos]; }

#endif //__SEP_LEDS_H__
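
For concreteness, with the default thresholds a pixel of (r,g,b) = (200, 40, 30) gives r-b-g = 130 > 50 and is classed as strong red, while (250, 245, 240) gives r+b+g = 735 > 700 and is classed as bright white; a pixel failing both tests is blanked to black when SHOW_USEFUL_ONLY is defined.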

sep_leds.cpp

//--------------------------------------------------------------------------//
// sep_leds.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the sepLEDs class for processing frames from the
// cvcam callback
//--------------------------------------------------------------------------//

#include "stdafx.h"
#include "sep_leds.h"
#include "sep_utils.h"
using namespace sep_utils;

// processing constructor
sepLEDs::sepLEDs()
: adaptled_redthreshold(DEFAULT_LED_REDTHRESHOLD),
adaptled_whitethreshold(DEFAULT_LED_WHITETHRESHOLD),
adaptled_distthreshold(DEFAULT_LED_DISTTHRESHOLD),
adaptled_groupthreshold(DEFAULT_LED_GROUPTHRESHOLD)
{
}

// processing deconstructor
sepLEDs::~sepLEDs(void)
{
}

// chooses process method
void sepLEDs::ProcessField(IplImage* frame_image, int fieldin)
{
field = fieldin;
ImageData = frame_image;
ProcessFrame_ScanPixels();
ProcessFrame_PickLeds();
}

void sepLEDs::Get5RefPoints(CvPoint* ref_points)
{
// pass ref points out of sepLEDs class
int refcount;
for (refcount = 0; refcount < 5; refcount++)
{
ref_points[refcount] = LEDsFound[refcount].Position;
// mark ref points on frame
cvCircle(ImageData, LEDsFound[refcount].Position, 10,
cvScalar(0,255,0,0), 2);
}
}

void sepLEDs::ClearField(int fieldin)
{
int linepos;
CvPoint testpoint;
field = fieldin;
// scan each line
for (testpoint.y = field; testpoint.y < ImageData->height-1+field;
testpoint.y += 2)
{
testpoint.x = 0;
// scan each pixel in this line
for (linepos = (ImageData->widthStep)*testpoint.y;
linepos < (ImageData->widthStep)*(testpoint.y+1);
linepos += ImageData->nChannels)
{
if ((PixelValue(AccessPixel(testpoint, ImageData)[R_OFFSET]) != 255) &&
(PixelValue(AccessPixel(testpoint, ImageData)[G_OFFSET]) != 255) &&
(PixelValue(AccessPixel(testpoint, ImageData)[B_OFFSET]) != 255))
{
AccessPixel(testpoint, ImageData)[R_OFFSET] = 0;
AccessPixel(testpoint, ImageData)[G_OFFSET] = 0;
AccessPixel(testpoint, ImageData)[B_OFFSET] = 0;
}
testpoint.x++;
}
}
}

void sepLEDs::ProcessFrame_ScanPixels(void)
{
int linepos, ledcount(0), closest[2], distance;
CvPoint testpoint;
bool pixeladded;
PossibleLeds.clear();
// scan each line
for (testpoint.y = field; testpoint.y < ImageData->height-1+field;
testpoint.y += 2)
{
testpoint.x = 0;
// scan each pixel in this line
for (linepos = (ImageData->widthStep)*testpoint.y;
linepos < (ImageData->widthStep)*(testpoint.y+1);
linepos += ImageData->nChannels)
{
if (QuestionPixel_Useful(AccessPixel(testpoint, ImageData)[R_OFFSET],
AccessPixel(testpoint, ImageData)[G_OFFSET],
AccessPixel(testpoint, ImageData)[B_OFFSET]))
{
// check for close points already found
closest[1] = adaptled_distthreshold + 1;
for (ledcount = 0; ledcount < PossibleLeds.size(); ledcount++)
{
distance = ProcessFrame_PixelDist(testpoint.x,
testpoint.y, ledcount);
if ((distance <= adaptled_distthreshold) &&
(distance < closest[1]))
{
closest[0] = ledcount;
closest[1] = distance;
}
}
// if close enough to one of the possibleleds
if (closest[1] != adaptled_distthreshold + 1)
{
PossibleLeds[closest[0]].Position.x += testpoint.x;
PossibleLeds[closest[0]].Position.y += testpoint.y;
PossibleLeds[closest[0]].pixelcount += 1;
}
// if no points close enough add new
else
{
// first line below automatically creates a
// new entry in PossibleLeds
PossibleLeds[PossibleLeds.size()].Position.x =
testpoint.x;
// size has now therefore changed hence -1
PossibleLeds[PossibleLeds.size()-1].Position.y =
testpoint.y;
PossibleLeds[PossibleLeds.size()-1].pixelcount = 1;
PossibleLeds[PossibleLeds.size()-1].likelihood = 0.0;
}
}
testpoint.x++;
}
}
}

bool sepLEDs::QuestionPixel_Useful(char& r, char& g, char& b)
{
int rcomp, gcomp, bcomp;
rcomp = PixelValue(r);
gcomp = PixelValue(g);
bcomp = PixelValue(b);
// check for peak values
if (peak_red < (rcomp-bcomp-gcomp)) peak_red = (rcomp-bcomp-gcomp);
if (peak_white < (rcomp+bcomp+gcomp)) peak_white = (rcomp+bcomp+gcomp);

if (QuestionPixel_BrightWhite(rcomp, gcomp, bcomp))
{
#ifdef SHOW_USEFUL_ONLY
r = 255;
g = 255;
b = 255;
#endif
return true;
}
else if (QuestionPixel_StrongRed(rcomp, gcomp, bcomp))
{
#ifdef SHOW_USEFUL_ONLY
r = 255;
g = 0;
b = 0;
#endif
return true;
}
else
{
#ifdef SHOW_USEFUL_ONLY
r = 0;
g = 0;
b = 0;
#endif
return false;
}
}

double sepLEDs::ProcessFrame_PixelDist(int x_compare, int y_compare, int led_compare)
{
// get average from current values
x_compare -= PossibleLeds[led_compare].Position.x /
PossibleLeds[led_compare].pixelcount;
y_compare -= PossibleLeds[led_compare].Position.y /
PossibleLeds[led_compare].pixelcount;
return DiagLength(y_compare, x_compare);
}

void sepLEDs::ProcessFrame_PickLeds(void)
{
int ledcount, pickedcount, leastlikely;
double lowestlikelihood;
// save previously picked LEDs
for (ledcount = 0; ledcount < LED_NUM_USED; ledcount++)
PrevLEDsFound[ledcount] = LEDsFound[ledcount];
// check some simple likelihood statistics for possible LEDs
for (ledcount = 0; ledcount < PossibleLeds.size(); ledcount++)
{
// average x and y location values
PossibleLeds[ledcount].Position.x /= PossibleLeds[ledcount].pixelcount;
PossibleLeds[ledcount].Position.y /= PossibleLeds[ledcount].pixelcount;
// check pixel count is over group threshold
if (PossibleLeds[ledcount].pixelcount > adaptled_groupthreshold)
PossibleLeds[ledcount].likelihood = 1.0;
// test to see if shape is circular
PossibleLeds[ledcount].likelihood *=
TestCircle(PossibleLeds[ledcount].Position);
}
// perform histogram on sizes
SizeHistogram();
// check orientation of points
TestOrientation();

#ifdef DEBUG_SHOW_LIKELIHOOD_INFO
for (ledcount = 0; ledcount < PossibleLeds.size(); ledcount++)
{
cout << "[" << PossibleLeds[ledcount].Position.x << "," <<
PossibleLeds[ledcount].Position.y
<< "] pixels:" << PossibleLeds[ledcount].pixelcount << " lhood:"
<< PossibleLeds[ledcount].likelihood << endl;
// mark possible points with size according to stats
if ((int)(PossibleLeds[ledcount].likelihood*30.0) < 1)
PossibleLeds[ledcount].likelihood = 0.05;
cvCircle(ImageData, PossibleLeds[ledcount].Position,
(int)(PossibleLeds[ledcount].likelihood*30.0), cvScalar(0,255,0,0),
2);
}
#endif //def DEBUG_SHOW_LIKELIHOOD_INFO

// reset picked pixel count
for (pickedcount = 0; pickedcount < LED_NUM_USED; pickedcount++)
LEDsFound[pickedcount].likelihood = 0;
// sort through possible leds
for (ledcount = 0; ledcount < PossibleLeds.size(); ledcount++)
{
// find lowest likelihood so far
lowestlikelihood = 1.1;
for (pickedcount = 0; pickedcount < LED_NUM_USED; pickedcount++)
{
if (LEDsFound[pickedcount].likelihood < lowestlikelihood)
{
lowestlikelihood = LEDsFound[pickedcount].likelihood;
leastlikely = pickedcount;
}
}
// check to find if led is more likely to be useful than least likely
if (PossibleLeds[ledcount].likelihood > lowestlikelihood)
{
LEDsFound[leastlikely] = PossibleLeds[ledcount];
}
}
}

double sepLEDs::TestCircle(CvPoint TestPoint)
{
double up(0.0), down(0.0), left(0.0), right(0.0);
double uleft(0.0), uright(0.0), dleft(0.0), dright(0.0);
double likelyhoodcalc(1.0), temp, templength, avg;
char* pixel;
CvPoint test;

int smallest;

// adjust TestPoint to decide on field to use
if ((field == FIELD_EVEN && (TestPoint.y % 2 == 0)) ||
(field == FIELD_ODD && (TestPoint.y % 2 != 0)))
{
TestPoint.y += 1;
up -= 1;
down += 1;
}

// scan up to find circle border
for (test = TestPoint; test.y >= 0; up+=2.0)
{
pixel = AccessPixel(test, ImageData);
if (PixelValue(pixel[R_OFFSET]) == 0) break;
test.y -= 2.0;
}
// scan down to find circle border
for (test = TestPoint; test.y < ImageData->height; down+=2.0)
{
pixel = AccessPixel(test, ImageData);
if (PixelValue(pixel[R_OFFSET]) == 0) break;
test.y += 2.0;
}
// scan left to find circle border
for (test = TestPoint; test.x >= 0; left+=1.0)
{
pixel = AccessPixel(test, ImageData);
if (PixelValue(pixel[R_OFFSET]) == 0) break;
test.x -= 1.0;
}
// scan right to find circle border
for (test = TestPoint; test.x < ImageData->width; right+=1.0)
{
pixel = AccessPixel(test, ImageData);
if (PixelValue(pixel[R_OFFSET]) == 0) break;
test.x += 1.0;
}
// scan up and left to find circle border
for (test = TestPoint; (test.y >= 0) && (test.x >= 0); uleft+=2.0)
{
pixel = AccessPixel(test, ImageData);
if (PixelValue(pixel[R_OFFSET]) == 0) break;
test.y -= 2.0;
test.x -= 2.0;
}
// scan up and right to find circle border
for (test = TestPoint; (test.y >= 0) && (test.x < ImageData->width); uright+=2.0)
{
pixel = AccessPixel(test, ImageData);
if (PixelValue(pixel[R_OFFSET]) == 0) break;
test.y -= 2.0;
test.x += 2.0;
}
// scan down and right to find circle border
for (test = TestPoint; (test.y < ImageData->height) &&
(test.x < ImageData->width); dright+=2.0)
{
pixel = AccessPixel(test, ImageData);
if (PixelValue(pixel[R_OFFSET]) == 0) break;
test.y += 2.0;
test.x += 2.0;
}
// scan down and left to find circle border
for (test = TestPoint; (test.y < ImageData->height) && (test.x >= 0); dleft+=2.0)
{
pixel = AccessPixel(test, ImageData);
if (PixelValue(pixel[R_OFFSET]) == 0) break;
test.y += 2.0;
test.x -= 2.0;
}
// calc true diag values
uright = DiagLength(uright,uright);
uleft = DiagLength(uleft,uleft);
dright = DiagLength(dright,dright);
dleft = DiagLength(dleft,dleft);
// calc average
avg = (up + down + left + right + uleft + uright + dleft + dright) / 8.0;
// check differences in up/right/left/down
temp = ModDifference(up, right);
temp += ModDifference(right, down);
temp += ModDifference(down, left);
temp += ModDifference(left, up);
temp += ModDifference(down, up);
temp += ModDifference(left, right);
temp /= avg;
likelyhoodcalc -= LIKELIHOOD_CIRCLE_WEIGHT*temp;
// check differences in uleft/uright/dleft/dright
temp = ModDifference(uright, uleft);
temp += ModDifference(dright, uright);
temp += ModDifference(dleft, dright);
temp += ModDifference(uleft, dleft);
temp += ModDifference(uleft, dright);
temp += ModDifference(uright, dleft);
temp /= avg;
likelyhoodcalc -= LIKELIHOOD_CIRCLE_WEIGHT*temp;
// compare up/right/left/down to diags
temp = ModDifference(uright, up);
temp += ModDifference(dright, right);
temp += ModDifference(dleft, down);
temp += ModDifference(uleft, left);
temp += ModDifference(dright, up);
temp += ModDifference(dleft, right);
temp += ModDifference(uleft, down);
temp += ModDifference(uright, left);
temp /= avg;
likelyhoodcalc -= 2*LIKELIHOOD_CIRCLE_WEIGHT*temp;

// make sure returned value is a min of 0
if (likelyhoodcalc < 0.0) likelyhoodcalc = 0.0;
return likelyhoodcalc;
}

void sepLEDs::SizeHistogram(void)
{
int ledcount, value, maxvalue, prevvalue, hist,
stepsize(SIZE_HISTOGRAM_STEPS_START);
map<int,int> histogram;
// perform two histograms
for (hist = 0; hist < HISTOGRAM_REPETITIONS; hist++)
{
maxvalue = 0;
prevvalue = 0;
histogram.clear();
// sort through possible leds and create histogram
for (ledcount = 0; ledcount < PossibleLeds.size(); ledcount++)
{
// ignore irrelevant possibleleds
if (PossibleLeds[ledcount].likelihood != 0)
{
// find the group value for current pixel
value = PossibleLeds[ledcount].pixelcount -
(PossibleLeds[ledcount].pixelcount % stepsize);
if (value > maxvalue) maxvalue = value;
// check it's not 0, ignoring 0 group
if (value)
{
// add to histogram
if (histogram.find(value) != histogram.end())
{
histogram[value]++;
}
else
{
histogram[value] = 1;
}
}
}
}
// scan through histogram
for (value = 0; value <= maxvalue; value += stepsize)
{
// check histogram entry exists
if (histogram.find(value) != histogram.end())
{
#ifdef DEBUG_SHOW_HISTOGRAM_INFO
cout << value << ": " << histogram[value] << endl;
#endif //def DEBUG_SHOW_HISTOGRAM_INFO
// cap histogram value at num of leds
if (histogram[value] > LED_NUM_USED)
histogram[value] = LED_NUM_USED;
}
}
// sort through possible leds and apply the histogram weighting
for (ledcount = 0; ledcount < PossibleLeds.size(); ledcount++)
{
// find the group value for current pixel
value = PossibleLeds[ledcount].pixelcount -
(PossibleLeds[ledcount].pixelcount % stepsize);
// check histogram entry exists
if (histogram.find(value) != histogram.end())
{
PossibleLeds[ledcount].likelihood *=
(double)histogram[value] / (double)LED_NUM_USED;
}
}
stepsize += SIZE_HISTOGRAM_STEP_JUMP;
}
}

void sepLEDs::TestOrientation(void)
{
int ledcount, led1count, led2count, testcount(0), prevorigin(-1), tempcount;
double tempangle, currentmax(0.0);
map<int, LedOrientation> orientations;
// sort through possible leds and calc orientations
// reference possible LEDs
for (ledcount = 0; ledcount < PossibleLeds.size(); ledcount++)
{
if (PossibleLeds[ledcount].likelihood > 0.0)
{
// possible LEDs 1
for (led1count = 0; led1count < PossibleLeds.size(); led1count++)
{
if ((PossibleLeds[led1count].likelihood > 0.0) &&
(ledcount != led1count))
{
// possible LEDs 2
for (led2count = led1count;
led2count < PossibleLeds.size(); led2count++)
{
if ((PossibleLeds[led2count].likelihood > 0.0) &&
(ledcount != led1count)
&& (ledcount != led2count))
{
// calculate angle between two points
// wrt reference point
tempangle = PointAngle(
PossibleLeds[ledcount].Position,
PossibleLeds[led1count].Position,
PossibleLeds[led2count].Position);
if (((tempangle >= 180 -
ORIENTATION_ANGLE_VARIATION) &&
(tempangle <= 180 +
ORIENTATION_ANGLE_VARIATION)) ||
((tempangle >= 90 -
ORIENTATION_ANGLE_VARIATION) &&
(tempangle <= 90 +
ORIENTATION_ANGLE_VARIATION)))
{
#ifdef DEBUG_SHOW_ORIENTATION_INFO
cvLine(ImageData,PossibleLeds[ledcount].Position,
PossibleLeds[led1count].Position,cvScalar(255,0,0),1);
cvLine(ImageData,PossibleLeds[ledcount].Position,
PossibleLeds[led2count].Position,cvScalar(255,0,0),1);
#endif //def DEBUG_SHOW_ORIENTATION_INFO
// keep calculations for later
orientations[testcount].LED1 =
led1count;
orientations[testcount].LED2 =
led2count;
orientations[testcount].LEDorigin =
ledcount;
orientations[testcount].angle =
tempangle;
orientations[testcount].likelihood
= 5 *
PossibleLeds[ledcount].likelihood *
PossibleLeds[led1count].likelihood
* PossibleLeds[led2count].likelihood;
testcount++;
}
}
}
}
}
}
}
// sort through orientations
for (testcount = 0; testcount < orientations.size(); testcount++)
{
if (prevorigin == orientations[testcount].LEDorigin)
{
if (orientations[testcount].likelihood > currentmax)
{
currentmax = orientations[testcount].likelihood;
tempcount = testcount;
}
}
else if (prevorigin < 0)
{
prevorigin = orientations[testcount].LEDorigin;
currentmax = orientations[testcount].likelihood;
tempcount = testcount;
}
else
{
#ifdef DEBUG_SHOW_ORIENTATION_INFO
cvLine(ImageData,
PossibleLeds[orientations[tempcount].LEDorigin].Position,
PossibleLeds[orientations[tempcount].LED1].Position,
cvScalar(255,0,0),1);
cvLine(ImageData,
PossibleLeds[orientations[tempcount].LEDorigin].Position,
PossibleLeds[orientations[tempcount].LED2].Position,
cvScalar(255,0,0),1);
cout << "[" << PossibleLeds[prevorigin].Position.x << "," <<
PossibleLeds[prevorigin].Position.y << "] angle:"
<< orientations[testcount].angle
<< " prev:" << PossibleLeds[prevorigin].likelihood << " new:"
<< (PossibleLeds[prevorigin].likelihood * currentmax) << endl;
#endif //def DEBUG_SHOW_ORIENTATION_INFO
PossibleLeds[prevorigin].likelihood =
PossibleLeds[prevorigin].likelihood * currentmax;
currentmax = orientations[testcount].likelihood;
prevorigin = orientations[testcount].LEDorigin;
}
}
}

double sepLEDs::PointAngle(CvPoint Origin, CvPoint Point1, CvPoint Point2)
{
int pointcount;
double doublex, doubley, doubleangle, doubleang[2]={0.0};
CvPoint Points[2];
Points[0] = Point1;
Points[1] = Point2;
// calculate angle for points
for (pointcount = 0; pointcount < 2; pointcount++)
{
doublex = (double)(Origin.x - Points[pointcount].x);
doubley = (double)(Origin.y - Points[pointcount].y);
if ((doubley == 0.0) && (doublex >= 0.0))
doubleang[pointcount] = 0.0;
else if ((doubley == 0.0) && (doublex < 0.0))
doubleang[pointcount] = 180.0;
else if ((doublex == 0.0) && (doubley < 0.0))
doubleang[pointcount] = 90.0;
else if ((doublex == 0.0) && (doubley > 0.0))
doubleang[pointcount] = 270.0;
else {
doubleang[pointcount] = atan(doubley/doublex);
doubleang[pointcount] = (doubleang[pointcount]*180)/PI;
if ((doubley < 0) && (doublex > 0))
doubleang[pointcount] = 0.0 - doubleang[pointcount];
else if ((doubley > 0) && (doublex > 0))
doubleang[pointcount] = 360.0 - doubleang[pointcount];
else if ((doubley > 0) && (doublex < 0))
doubleang[pointcount] = 180.0 - doubleang[pointcount];
else if ((doubley < 0) && (doublex < 0))
doubleang[pointcount] = 180.0 - doubleang[pointcount]; }
}
doubleangle = ModDifference(doubleang[0],doubleang[1]);
if (doubleangle > 180)
doubleangle = 360 - doubleangle;
return doubleangle;
}

void sepLEDs::AdaptValues(int min_distance)
{
// adapt distance and group thresholds
if (min_distance == -1)
{
adaptled_distthreshold = DEFAULT_LED_DISTTHRESHOLD;
adaptled_groupthreshold = DEFAULT_LED_GROUPTHRESHOLD;
}
else if (min_distance > 0)
{
//adaptled_distthreshold = min_distance / 3;
//adaptled_groupthreshold = (min_distance / 5)^2;
//if (adaptled_distthreshold < 5) adaptled_distthreshold = 5;
//if (adaptled_groupthreshold < 2) adaptled_groupthreshold = 2;
}
// adapt red and white thresholds
adaptled_redthreshold = peak_red -
(int)(peak_red * PERCENT_OF_RED_PEAK); // % of peak
adaptled_whitethreshold = peak_white -
(int)(peak_white * PERCENT_OF_WHITE_PEAK); // % of peak
peak_red = 0;
peak_white = 0;
#ifdef DEBUG_SHOW_THRESHOLD_VALUES
cout << "Red:" << adaptled_redthreshold;
cout << " White:" << adaptled_whitethreshold << endl;
cout << "Distance: " << adaptled_distthreshold;
cout << " Group: " << adaptled_groupthreshold <<endl;
#endif //def DEBUG_SHOW_THRESHOLD_VALUES
}

sep_orientation.h

//--------------------------------------------------------------------------//
// sep_orientation.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for sepOrientation class
//--------------------------------------------------------------------------//

#ifndef __SEP_ORIENTATION_H__
#define __SEP_ORIENTATION_H__

#include "cv.h"

//#define DEBUG_SHOW_PITCH_DATA

#define HEIGHT_FACTOR 300.0 // num of diagonal pixels for 1 metre height
#define MIN_FLYING_HEIGHT 0.5 // metres
#define MAX_FLYING_HEIGHT 2.0 // metres
#define NUM_REF_POINTS 5
#define NUM_PREVIOUS_POINTS 2
#define NUM_AVG_POINTS 50
#define GLITCH_THRESHOLD 50
#define GLITCH_PERCENT_THRESHOLD 20
#define PARTIAL_ERROR_THRESHOLD 5
#define COMPLETE_ERROR_THRESHOLD 5

typedef struct _Distance {
int distance;
int point1;
int point2;
} DistanceSort;

class sepOrientation
{
public:
sepOrientation();
~sepOrientation();
void SetTargetHeight(const double& target, char& errors);
bool GetCommands(IplImage* frame_image, CvPoint* ref_points_in,
double* directions_return);
int GetMinDistance(void);
void GetErrors(char& errors);
private:
int CheckRefData(void);
int CheckFrameGlitch(void);
void FindCentre(void);
void SortPoints(void);
void FindDirection(void);
void FindPitch(void);
void FindHeight(void);
void EstimateCentre(void);
void ReturnDirections(double* directions);
void FindMinDistance(void);
int PointDistance(CvPoint point1, CvPoint point2);
double PointAngle(CvPoint refpoint, CvPoint point1);
IplImage* ImageData;
CvPoint* RefData;
CvPoint CentrePoint;
int next_saved_point;
CvPoint PreviousPoints[NUM_PREVIOUS_POINTS];
CvPoint ActualCentrePoint[NUM_AVG_POINTS];
double ortn_direction;
double ortn_frontbackpitch; // (+)front (-)back
double ortn_leftrightpitch; // (+)left (-)right
double ortn_height;
double target_height;
int sortedpoints[NUM_REF_POINTS];
int min_distance;
int error_glitchcount;
int error_partialcount;
int error_completecount;
};

#endif //__SEP_ORIENTATION_H__

sep_orientation.cpp

//--------------------------------------------------------------------------//
// sep_orientation.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the sepOrientation class for calculating orientation of
// fixed points to the camera
//--------------------------------------------------------------------------//

#include "stdafx.h"
#include "math.h"
#include "aerial_platform.h"
#include "sep_orientation.h"
#include "sep_utils.h"
using namespace sep_utils;

sepOrientation::sepOrientation()
: next_saved_point(-NUM_PREVIOUS_POINTS), target_height(1.0)
{
}

sepOrientation::~sepOrientation()
{
}

void sepOrientation::SetTargetHeight(const double& target, char& errors)
{
// check height is within limits
if ((target < MIN_FLYING_HEIGHT) || (target > MAX_FLYING_HEIGHT))
// set invalid command error bit
errors += ERRORBIT_INVALIDCOMMAND;
else
// set new target height
target_height = target;
}

bool sepOrientation::GetCommands(IplImage* frame_image,
CvPoint* ref_points_in, double* directions_return)
{
ImageData = frame_image;
RefData = ref_points_in;

// check for valid ref data
if (CheckRefData() == CALL_OK)
{
FindCentre();
SortPoints();
FindDirection();
// check for frame glitch
if (CheckFrameGlitch() == CALL_OK)
{
FindPitch();
FindHeight();
//EstimateCentre();
ReturnDirections(directions_return);
return true;
}
else
{
return false;
}
}
else
{
FindMinDistance();
return false;
}
}

int sepOrientation::GetMinDistance(void)
{
return min_distance;
}

// set relevant error bits for loss of tracking
void sepOrientation::GetErrors(char& errors)
{
if (error_completecount >= COMPLETE_ERROR_THRESHOLD)
{
errors += ERRORBIT_COMPLETETRACKING;
}
else if ((error_partialcount + error_glitchcount)
>= PARTIAL_ERROR_THRESHOLD)
{
errors += ERRORBIT_PARTIALTRACKING;
}
}

// check input ref data is valid
int sepOrientation::CheckRefData(void)
{
int pointcount, datamissing(0);
bool dataout(false);
for (pointcount = 0; pointcount < NUM_REF_POINTS; pointcount++)
{
if ((RefData[pointcount].x == -1) || (RefData[pointcount].y == -1))
datamissing++;
else if ((RefData[pointcount].x < 0) || (RefData[pointcount].y < 0)
|| (RefData[pointcount].x > CONTROL_CAMERA_WIDTH)
|| (RefData[pointcount].y > CONTROL_CAMERA_HEIGHT))
dataout = true;
}
if (datamissing)
{
if (datamissing == 5) error_completecount++;
else error_partialcount++;
#if SEP_REPORT_ERRORS
cerr << datamissing << " ref point(s) missing" << endl;
#endif // SEP_REPORT_ERRORS
return CALL_ERROR;
}
else if (dataout)
{
#if SEP_REPORT_ERRORS
cerr << "ref point data out of bounds" << endl;
#endif // SEP_REPORT_ERRORS
return CALL_ERROR;
}
else
{
error_partialcount = 0;
error_completecount = 0;
return CALL_OK;
}
}

// return calculated direction commands
// [0] Forward(+)/Backward(-), [1] Left(+)/Right(-)
// [2] Up(+)/Down(-), [3] TurnLeft(-180->0)/TurnRight(0->180)
void sepOrientation::ReturnDirections(double* directions)
{
// set up/down
directions[2] = target_height - ortn_height;
// set turn left/right
if (ortn_direction > 180.0)
directions[3] = 360.0 - ortn_direction;
else
directions[3] = 0.0 - ortn_direction;
// set forward/backward

// set left/right
}

// check orientation data for frame glitch
int sepOrientation::CheckFrameGlitch(void)
{
// fill previous points for first time
if (next_saved_point < 0)
{
PreviousPoints[next_saved_point + NUM_PREVIOUS_POINTS] = CentrePoint;
next_saved_point++;
#ifndef PROCESS_FRAME_TEST
return CALL_ERROR;
#else
return CALL_OK;
#endif
}
// compare average of previous points to current point
else
{
// calculate average
CvPoint AvgPoint = {0,0};
int pointcount;
for (pointcount = 0; pointcount < NUM_PREVIOUS_POINTS; pointcount++)
{
AvgPoint.x += PreviousPoints[pointcount].x;
AvgPoint.y += PreviousPoints[pointcount].y;
}
AvgPoint.x /= NUM_PREVIOUS_POINTS;
AvgPoint.y /= NUM_PREVIOUS_POINTS;
// add new value to previous points array
PreviousPoints[next_saved_point++] = CentrePoint;
if (next_saved_point >= NUM_PREVIOUS_POINTS) next_saved_point = 0;
// compare
if ((PointDistance(AvgPoint, CentrePoint) < GLITCH_THRESHOLD) &&
PercentageThreshold(PointDistance(RefData[sortedpoints[1]],
CentrePoint), PointDistance(RefData[sortedpoints[4]], CentrePoint),
GLITCH_PERCENT_THRESHOLD) &&
PercentageThreshold(PointDistance(RefData[sortedpoints[2]],
CentrePoint), PointDistance(RefData[sortedpoints[3]], CentrePoint),
GLITCH_PERCENT_THRESHOLD))
{
return CALL_OK;
}
else
{
error_glitchcount++;
#if SEP_REPORT_ERRORS
cerr << "frame glitch detected" << endl;
#endif // SEP_REPORT_ERRORS
return CALL_ERROR;
}
}
}

// find centre of points and draw onto image
void sepOrientation::FindCentre(void)
{
int pointcount;
CentrePoint.x = 0;
CentrePoint.y = 0;
for (pointcount = 0; pointcount < NUM_REF_POINTS; pointcount++)
{
CentrePoint.x += RefData[pointcount].x;
CentrePoint.y += RefData[pointcount].y;
}
CentrePoint.x /= NUM_REF_POINTS;
CentrePoint.y /= NUM_REF_POINTS;
cvCircle(ImageData, CentrePoint, 5, cvScalar(255,0,0,0), 2);
}

void sepOrientation::SortPoints(void)
{
DistanceSort Distances[NUM_REF_POINTS*NUM_REF_POINTS], Least[2], Most[4];
int pointcount, pointcount2, sortcount(0);
double refangle, tempangle;

// work out distances between points
for (pointcount = 0; pointcount < NUM_REF_POINTS; pointcount++)
{
for (pointcount2 = pointcount+1; pointcount2 < NUM_REF_POINTS;
pointcount2++)
{
Distances[sortcount].distance = PointDistance(RefData[pointcount],
RefData[pointcount2]);
Distances[sortcount].point1 = pointcount;
Distances[sortcount].point2 = pointcount2;
sortcount++;
}
}

// find least distances and therefore point 0
Least[0].distance = CONTROL_CAMERA_WIDTH*CONTROL_CAMERA_HEIGHT;
Least[1].distance = CONTROL_CAMERA_WIDTH*CONTROL_CAMERA_HEIGHT;
for (pointcount = 0; pointcount < sortcount; pointcount++)
{
if (Distances[pointcount].distance < Least[0].distance)
{
if (Least[1].distance > Least[0].distance)
{
Least[1] = Distances[pointcount];
}
else
{
Least[0] = Distances[pointcount];
}
}
else if (Distances[pointcount].distance < Least[1].distance)
{
Least[1] = Distances[pointcount];
}
}
if ((Least[0].point1 == Least[1].point1) || (Least[0].point1 == Least[1].point2))
sortedpoints[0] = Least[0].point1;
else if ((Least[0].point2 == Least[1].point1) ||
(Least[0].point2 == Least[1].point2))
sortedpoints[0] = Least[0].point2;

min_distance = (Least[0].distance < Least[1].distance) ?
Least[0].distance : Least[1].distance;

// use angles to sort points 1 to 4
refangle = PointAngle(CentrePoint, RefData[sortedpoints[0]]);
for (pointcount = 0; pointcount < NUM_REF_POINTS; pointcount++)
{
if (pointcount != sortedpoints[0])
{
tempangle = PointAngle(CentrePoint, RefData[pointcount]);
tempangle -= refangle;
if (tempangle < 0) tempangle += 360;
if ((tempangle >= 180) && (tempangle < 270))
sortedpoints[3] = pointcount;
if ((tempangle >= 270) && (tempangle < 360))
sortedpoints[4] = pointcount;
if ((tempangle >= 0) && (tempangle < 90))
sortedpoints[1] = pointcount;
if ((tempangle >= 90) && (tempangle < 180))
sortedpoints[2] = pointcount;
}
}
}

// find angle direction from centre and draw line to point 0
void sepOrientation::FindDirection(void)
{
CvPoint MidLine = {0,0};
MidLine.y = CentrePoint.y;
ortn_direction = PointAngle(CentrePoint, RefData[sortedpoints[0]]);
cvLine(ImageData, CentrePoint, RefData[sortedpoints[0]], cvScalar(255,0,0,0), 2);
cvLine(ImageData, CentrePoint, MidLine, cvScalar(255,0,0,0), 2);
}

// find pitch of UAV relative to UGV
void sepOrientation::FindPitch(void)
{
CvScalar Green = {0,255,0,0}, Red = {0,0,255,0};
// left/right pitch
ortn_leftrightpitch = (double)
(PointDistance(RefData[sortedpoints[1]],RefData[sortedpoints[2]])
- PointDistance(RefData[sortedpoints[3]],RefData[sortedpoints[4]]));
if (ortn_leftrightpitch > 0.0)
{
cvLine(ImageData, RefData[sortedpoints[1]],
RefData[sortedpoints[2]], Red, 2);
cvLine(ImageData, RefData[sortedpoints[3]],
RefData[sortedpoints[4]], Green, 2);
}
else
{
cvLine(ImageData, RefData[sortedpoints[3]],
RefData[sortedpoints[4]], Red, 2);
cvLine(ImageData, RefData[sortedpoints[1]],
RefData[sortedpoints[2]], Green, 2);
}
// front/back pitch
ortn_frontbackpitch = (double)
(PointDistance(RefData[sortedpoints[3]],RefData[sortedpoints[2]])
- PointDistance(RefData[sortedpoints[1]],RefData[sortedpoints[4]]));
if (ortn_frontbackpitch > 0.0)
{
cvLine(ImageData, RefData[sortedpoints[3]],
RefData[sortedpoints[2]], Red, 2);
cvLine(ImageData, RefData[sortedpoints[1]],
RefData[sortedpoints[4]], Green, 2);
}
else
{
cvLine(ImageData, RefData[sortedpoints[1]],
RefData[sortedpoints[4]], Red, 2);
cvLine(ImageData, RefData[sortedpoints[3]],
RefData[sortedpoints[2]], Green, 2);
}
#ifdef DEBUG_SHOW_PITCH_DATA
//cout << "LR:" << ortn_leftrightpitch << " FB:" << ortn_frontbackpitch << endl;
#endif //def DEBUG_SHOW_PITCH_DATA
}

// calculate height from distance between points
void sepOrientation::FindHeight(void)
{
// calc avg diagonal distance
double avg;
avg = PointDistance(RefData[sortedpoints[1]],RefData[sortedpoints[3]]);
avg += PointDistance(RefData[sortedpoints[2]],RefData[sortedpoints[4]]);
avg /= 2;
// calc height using height factor
ortn_height = HEIGHT_FACTOR / avg;
}
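
// worked example of the height formula: with HEIGHT_FACTOR = 300.0 an
// average diagonal of 300 px gives ortn_height = 1.0 m and 150 px gives
// 2.0 m; this inverse relationship assumes a simple pinhole model over
// the 0.5-2.0 m flying range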

// estimate actual centre from avg point and pitch
void sepOrientation::EstimateCentre(void)
{
// add next point to array
static int nextpoint(0);
ActualCentrePoint[nextpoint].x = CentrePoint.x;// - 40*ortn_frontbackpitch;
ActualCentrePoint[nextpoint].y = CentrePoint.y;// - 40*ortn_leftrightpitch;
nextpoint++;
if (nextpoint == NUM_AVG_POINTS) nextpoint = 0;

// find average
CvPoint AvgCentrePoint = {0,0};
int pointcount;
for (pointcount = 0; pointcount < NUM_AVG_POINTS; pointcount++)
{
AvgCentrePoint.x += ActualCentrePoint[pointcount].x;
AvgCentrePoint.y += ActualCentrePoint[pointcount].y;
}
AvgCentrePoint.x /= NUM_AVG_POINTS;
AvgCentrePoint.y /= NUM_AVG_POINTS;

// draw circle
cvCircle(ImageData, AvgCentrePoint, 5, cvScalar(255,0,0,0), 2);
}

// calculates the distance between two points
int sepOrientation::PointDistance(CvPoint point1, CvPoint point2)
{
double doublex, doubley, doublesqrt;
doublex = (double)(point1.x - point2.x);
doubley = (double)(point1.y - point2.y);
doublesqrt = (doublex*doublex) + (doubley*doubley);
return (int)sqrt(doublesqrt);
}

// calculates the angle of one point to another
double sepOrientation::PointAngle(CvPoint refpoint, CvPoint point1)
{
double doublex, doubley, doubleang(0.0);
doublex = (double)(refpoint.x - point1.x);
doubley = (double)(refpoint.y - point1.y);
if ((doubley == 0.0) && (doublex >= 0.0))
{
doubleang = 0.0;
}
else if ((doubley == 0.0) && (doublex < 0.0))
{
doubleang = 180.0;
}
else if ((doublex == 0.0) && (doubley < 0.0))
{
doubleang = 90.0;
}
else if ((doublex == 0.0) && (doubley > 0.0))
{
doubleang = 270.0;
}
else
{
doubleang = atan(doubley/doublex);
doubleang = (doubleang*180)/PI;
if ((doubley < 0) && (doublex > 0)) doubleang = 0.0 - doubleang;
else if ((doubley > 0) && (doublex > 0)) doubleang = 360.0 - doubleang;
else if ((doubley > 0) && (doublex < 0)) doubleang = 180.0 - doubleang;
else if ((doubley < 0) && (doublex < 0)) doubleang = 180.0 - doubleang;
}
return doubleang;
}
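
// worked example: with refpoint (100,100) and point1 (50,150),
// doublex = 50 and doubley = -50, atan(-1) gives -45 degrees, and the
// (doubley < 0, doublex > 0) branch returns 0 - (-45) = 45 degrees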

void sepOrientation::FindMinDistance(void)
{
int pointcount, pointcount2, tempdistance, usefulpoints(0);
min_distance = CONTROL_CAMERA_WIDTH*CONTROL_CAMERA_HEIGHT;
// workout min distance between points
for (pointcount = 0; pointcount < NUM_REF_POINTS; pointcount++)
{
for (pointcount2 = pointcount+1; pointcount2 < NUM_REF_POINTS;
pointcount2++)
{
if ((RefData[pointcount].x >= 0) && (RefData[pointcount].y >= 0) &&
(RefData[pointcount2].x >= 0) && (RefData[pointcount2].y >= 0))
{
tempdistance = PointDistance(RefData[pointcount],
RefData[pointcount2]);
if (tempdistance < min_distance) min_distance = tempdistance;
}
}
if ((RefData[pointcount].x >= 0) && (RefData[pointcount].y >= 0))
usefulpoints++;
}
if (usefulpoints == 0) min_distance = -1;
if (usefulpoints == 1) min_distance = 0;
}

sep_uavcomms.h

//--------------------------------------------------------------------------//
// sep_uavcomms.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for sepUAVComms class
//--------------------------------------------------------------------------//

#ifndef __SEP_UAVCOMMS_H__
#define __SEP_UAVCOMMS_H__

#include "TCPLink.h"
#include "aerial_platform.h"

//#define DEBUG_SHOW_COMMS_TRANSLATION

#ifdef USE_UGV_EMULATION
#define LOCAL_PORT 2001
#define REMOTE_PORT 2000
#define CHANNEL_TO_UAV "to_uav"
#define CHANNEL_TO_UGV "to_ugv"
#define LOCAL_ADDRESS "uav_machine"
#define REMOTE_ADDRESS "ugv_machine"
#else //ifndef USE_UGV_EMULATION
#define LOCAL_PORT 6200
#define REMOTE_PORT 6000
#define CHANNEL_TO_UAV "ugv_to_uav"
#define CHANNEL_TO_UGV "uav_to_ugv"
#define LOCAL_ADDRESS "uav_system"
#define REMOTE_ADDRESS "DECISION"
#endif //USE_UGV_EMULATION

class sepUAVComms
{
public:
sepUAVComms();
~sepUAVComms(void);
bool commsConnected(void);
void commsSend(char statusbits);
bool commsCheck(double& passheight);
private:
bool commsInit(void);
bool startEmulator(void);
bool connected;
bool inittried;
bool initfailed;
int local_port;
int remote_port;
string channel_to_uav;
string channel_to_ugv;
string local_address;
string remote_address;
TCPLink* connection_to_ugv;
TCPLink* connection_from_ugv;
};

#endif //__SEP_UAVCOMMS_H__

sep_uavcomms.cpp

//--------------------------------------------------------------------------//
// sep_uavcomms.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the sepUAVComms class for communicating between UAV decision
// system and UGV decision systems, using netAPI
//--------------------------------------------------------------------------//

#include "stdafx.h"
#include "sep_uavcomms.h"
#include "util.h"

sepUAVComms::sepUAVComms(void)
: connected(false), inittried(false), initfailed(false),
connection_to_ugv(NULL), connection_from_ugv(NULL)
{
}

sepUAVComms::~sepUAVComms()
{
// if connections exist delete them
if (connection_to_ugv)
delete connection_to_ugv;
if (connection_from_ugv)
delete connection_from_ugv;
}

bool sepUAVComms::commsInit(void)
{
inittried = true;

#ifdef USE_UGV_EMULATION
// use values for ugv emulation
local_port = LOCAL_PORT;
remote_port = REMOTE_PORT;
channel_to_uav = CHANNEL_TO_UAV;
channel_to_ugv = CHANNEL_TO_UGV;
local_address = LOCAL_ADDRESS;
remote_address = REMOTE_ADDRESS;

#else //ifndef USE_UGV_EMULATION
char usedefault;
// ask if you want to use default testing values
cout << "would you like to use default values for netAPI (y/n)? ";
cin >> usedefault;
if ((usedefault == 'y') || (usedefault == 'Y'))
{
// use default values
local_port = LOCAL_PORT;
remote_port = REMOTE_PORT;
channel_to_uav = CHANNEL_TO_UAV;
channel_to_ugv = CHANNEL_TO_UGV;
local_address = LOCAL_ADDRESS;
remote_address = REMOTE_ADDRESS;
}
else
{
// input non default values
cout << "input setup values..." << endl;
cout << " local port: ";
cin >> local_port;
cout << " input channel name: ";
cin >> channel_to_ugv;
cout << " local address: ";
cin >> local_address;
cout << " remote port: ";
cin >> remote_port;
cout << " output channel name: ";
cin >> channel_to_uav;
cout << " remote address: ";
cin >> remote_address;
}
#endif //def USE_UGV_EMULATION

// initiate TCPLink
TCPLink::Load(local_port, local_address, _on, BCAST_LOCATION);
connection_to_ugv = new TCPLink(channel_to_ugv, remote_port, remote_address);
connection_from_ugv = new TCPLink(channel_to_uav, remote_port, remote_address);
if (connection_to_ugv && connection_from_ugv)
{
cout << "netAPI initilised, attempting to connect..." << endl;
#ifdef USE_UGV_EMULATION
#ifdef EMULATOR_EXE
return startEmulator();
#else
return true;
#endif //def EMULATOR_EXE
#else
return true;
#endif //def USE_UGV_EMULATION
}
else
{
initfailed = true;
cerr << "Error: unable to initialise netAPI" << endl;
return false;
}
}

bool sepUAVComms::startEmulator(void)
{
// run emulator automatically
if (util::ExecuteProcess(EMULATOR_EXE, "", 0, true))
{
cerr << "Error: unable to execute " << EMULATOR_EXE << endl;
return false;
}
else
{
cout << "UGV emulator started" << endl;
return true;
}
}

bool sepUAVComms::commsConnected(void)
{
if (connected)
return true;
else if (initfailed)
return false;
else if (!inittried)
{
commsInit();
return false;
}
else if ((*connection_to_ugv).isConnected() &&
(*connection_from_ugv).isConnected())
{
connected = true;
TCPLink::AllConnected();
cout << "connections established" << endl;
return true;
}
else
return false;
}

void sepUAVComms::commsSend(char statusbits)
{
(*connection_to_ugv) << statusbits << endl;
(*connection_to_ugv).Send();
#ifdef DEBUG_SHOW_COMMS_TRANSLATION
cout << "Message sent: " << endl;
if ((statusbits & ERRORBIT_ACKNOWLEDGE) == ERRORBIT_ACKNOWLEDGE)
cout << " - Acknowledgment" << endl;
if ((statusbits & ERRORBIT_DECISIONERROR) == ERRORBIT_DECISIONERROR)
cout << " - Decision error" << endl;
if ((statusbits & ERRORBIT_PARTIALTRACKING) == ERRORBIT_PARTIALTRACKING)
cout << " - Partial racking loss" << endl;
if ((statusbits & ERRORBIT_COMPLETETRACKING) == ERRORBIT_COMPLETETRACKING)
cout << " - Complete tracking loss" << endl;
if ((statusbits & ERRORBIT_INVALIDCOMMAND) == ERRORBIT_INVALIDCOMMAND)
cout << " - Invalid command" << endl;
if ((statusbits & ERRORBIT_BATTERYLOW) == ERRORBIT_BATTERYLOW)
cout << " - Battery low" << endl;
if ((statusbits & ERRORBIT_RESERVED1) == ERRORBIT_RESERVED1)
cout << " - Reserved bit1" << endl;
if ((statusbits & ERRORBIT_RESERVED2) == ERRORBIT_RESERVED2)
cout << " - Reserved bit1" << endl;
#endif //def DEBUG_SHOW_COMMS_TRANSLATION
}

bool sepUAVComms::commsCheck(double& passheight)
{
if ((*connection_from_ugv).Receipt())
{
(*connection_from_ugv) >> passheight;
cout << "Target height recieved: " << passheight << endl;
(*connection_from_ugv).FreeInputBuf();
return true;
}
else
return false;
}
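
A sketch of how the class is polled each pass of the main loop; orientation is assumed to be the sepOrientation instance from the aerial_platform listing:

// illustrative only: pick up a new target height, apply it, and
// acknowledge over the return channel
sepUAVComms comms;
double target_height_in;
if (comms.commsConnected() && comms.commsCheck(target_height_in))
{
char errors = 0;
orientation.SetTargetHeight(target_height_in, errors);
comms.commsSend(errors | ERRORBIT_ACKNOWLEDGE);
}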

sep_uavcontrol.h

//--------------------------------------------------------------------------//
// sep_uavcontrol.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for sepUAVControl class
//--------------------------------------------------------------------------//

#ifndef __SEP_UAVCONTROL_H__
#define __SEP_UAVCONTROL_H__

#include "cv.h"
#include "highgui.h"
#include "aerial_platform.h"

#ifdef CTRLMETHOD_SC8000
// set values for SC-8000
#define SC8_COMM_PORT 1
#define SC8_SERVO_MASK (unsigned char)15
#define SC8_DIO_MASK (char)'0'
// servo values for use with Hitec Laser 6 RC transmitter
#define AXIS_THROTTLE 2
#define AXIS_YAW 3
#define AXIS_PITCH 0
#define AXIS_ROLL 1
#define THROTTLE_OFF 8000
#define THROTTLE_MAX 22000
#define THROTTLE_STEP 1400
#define YAW_LEFTMAX 8000
#define YAW_MID 15000
#define YAW_RIGHTMAX 22000
#define YAW_STEP 1000
#define PITCH_BACKWARDMAX 8000
#define PITCH_MID 15000
#define PITCH_FORWARDMAX 22000
#define PITCH_STEP 1000
#define ROLL_LEFTMAX 8000
#define ROLL_MID 15000
#define ROLL_RIGHTMAX 22000
#define ROLL_STEP 1000
#endif //def CTRLMETHOD_SC8000

#ifdef CTRLMETHOD_USB2VOLTAGE
#include "Serial.h"
// servo values for use with Original X-UFO transmitter
// DAC range min=0x000, max=0xFFF
#define BIT_THROTTLE 0x20
#define BIT_YAW 0x10
#define BIT_PITCH 0x40
#define BIT_ROLL 0x80
#define AXIS_THROTTLE 1
#define AXIS_YAW 0
#define AXIS_PITCH 2
#define AXIS_ROLL 3
#define SERVO_STEPS 10
#define THROTTLE_OFF 0
#define THROTTLE_MAX 2500
#define THROTTLE_STEP (THROTTLE_MAX-THROTTLE_OFF)/SERVO_STEPS
#define YAW_LEFTMAX 0
#define YAW_MID (YAW_RIGHTMAX-YAW_LEFTMAX)/2
#define YAW_RIGHTMAX 2500
#define YAW_STEP (YAW_RIGHTMAX-YAW_LEFTMAX)/SERVO_STEPS
#define PITCH_BACKWARDMAX 0
#define PITCH_MID (PITCH_FORWARDMAX-PITCH_BACKWARDMAX)/2
#define PITCH_FORWARDMAX 2500
#define PITCH_STEP (PITCH_FORWARDMAX-PITCH_BACKWARDMAX)/SERVO_STEPS
#define ROLL_LEFTMAX 0
#define ROLL_MID (ROLL_RIGHTMAX-ROLL_LEFTMAX)/2
#define ROLL_RIGHTMAX 2500
#define ROLL_STEP (ROLL_RIGHTMAX-ROLL_LEFTMAX)/SERVO_STEPS
#endif //def CTRLMETHOD_USB2VOLTAGE

class sepUAVControl
{
public:
sepUAVControl(void);
~sepUAVControl();
void SendCommands(double* directions_in);
void RefreshDisplay(IplImage* MainImg);
inline bool InitOK(void){return initok;}
void controlInit(void);
private:
bool InterpretDirections(double* directions_in);
void ConvertServoValues(void);
void SendServoValues(void);
IplImage* ControlImage;
double control_throttle;
double control_yaw;
double control_pitch;
double control_roll;
bool initok;
#ifdef CTRLMETHOD_SC8000
unsigned short servo_values[4];
int boardnumber;
int commnumber;
#endif //def CTRLMETHOD_SC8000
#ifdef CTRLMETHOD_USB2VOLTAGE
CSerial serial;
int servo_values[4];
#endif //def CTRLMETHOD_USB2VOLTAGE
};

#endif //__SEP_UAVCONTROL_H__

sep_uavcontrol.cpp

//--------------------------------------------------------------------------//
// sep_uavcontrol.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the sepUAVControl class for sending control signals
//--------------------------------------------------------------------------//

#include "stdafx.h"
#include "sep_uavcontrol.h"

#ifdef CTRLMETHOD_SC8000
#define _USE_DLL // for TTiSC8VC
#define __WIN32__
#include "TTiSC8VC.h"
#endif //def CTRLMETHOD_SC8000

#ifdef CTRLMETHOD_USB2VOLTAGE
// send serial data out
int send_serial(CSerial& serial, int* valuesin, int axis, int axisbit)
{
char transmit[2];
long error;
// ready values
transmit[1] = valuesin[axis] & 0xFF; // lower byte
transmit[0] = (valuesin[axis] & 0xF00)/256; // upper byte
transmit[0] += axisbit; // channel number
//transmit header
error = serial.Write("iu");
if (error != ERROR_SUCCESS)
cerr << "Unable to send data" << endl;
else
{
//transmit data
error = serial.Write(transmit, 2);
if (error != ERROR_SUCCESS)
cerr << "Unable to send data" << endl;
}
if (error != ERROR_SUCCESS)
return 0;
else
return 1;
}
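
// worked example of the packing above: a 12-bit value of 1500 (0x5DC)
// on the throttle channel (BIT_THROTTLE = 0x20) is sent as the header
// "iu", then 0x25 (upper nibble 0x05 plus the channel bit), then 0xDC
// (the lower byte)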
#endif //def CTRLMETHOD_USB2VOLTAGE

sepUAVControl::sepUAVControl(void)
: initok(false) // default to not-initialised until controlInit() succeeds
{
#ifdef CTRLMETHOD_SC8000
boardnumber = 0;
commnumber = SC8000_COMM_PORT;
#endif //def CTRLMETHOD_SC8000

controlInit();
}

sepUAVControl::~sepUAVControl()
{
#ifdef CTRLMETHOD_SC8000
//SC8_CleanUP(boardnumber);
// NOTE: ^ should be using this but gives unresolved external
// symbol error during compile for some reason
#endif //def CTRLMETHOD_SC8000

#ifdef CTRLMETHOD_USB2VOLTAGE
// close comm port
serial.Close();
#endif //def CTRLMETHOD_USB2VOLTAGE
}

void sepUAVControl::SendCommands(double* directions_in)
{
// take directions in and interpret to control values
if (InterpretDirections(directions_in))
{
// continue to send servo values if the control interface initialised OK
if (initok)
{
// convert control values to servo values
ConvertServoValues();
// send servo values
SendServoValues();
}
}
}

void sepUAVControl::RefreshDisplay(IplImage* MainImg)
{
// draw control visualisation box
cvRectangle(MainImg, cvPoint(MainImg->width - 101, MainImg->height - 201),
cvPoint(MainImg->width - 1, MainImg->height - 1), cvScalar(0,0,100),
CV_FILLED);
cvRectangle(MainImg, cvPoint(MainImg->width - 101, MainImg->height - 201),
cvPoint(MainImg->width - 1, MainImg->height - 1), cvScalar(200,200,200),
2);
// draw circles
cvCircle(MainImg, cvPoint(MainImg->width - 51, MainImg->height - 151),
10, cvScalar(255, 255, 255), 1);
cvCircle(MainImg, cvPoint(MainImg->width - 51, MainImg->height - 51),
10, cvScalar(255, 255, 255), 1);
// draw lines
cvLine(MainImg, cvPoint(MainImg->width - 51, MainImg->height - 51),
cvPoint(MainImg->width - 51 - 5*control_throttle + 20, MainImg->height - 51),
cvScalar(255, 255, 255), 2);
cvLine(MainImg, cvPoint(MainImg->width - 51, MainImg->height - 51),
cvPoint(MainImg->width - 51, MainImg->height - 51 + 5*control_yaw),
cvScalar(255, 255, 255), 2);
cvLine(MainImg, cvPoint(MainImg->width - 51, MainImg->height - 151),
cvPoint(MainImg->width - 51 - 5*control_pitch, MainImg->height - 151),
cvScalar(255, 255, 255), 2);
cvLine(MainImg, cvPoint(MainImg->width - 51, MainImg->height - 151),
cvPoint(MainImg->width - 51, MainImg->height - 151 - 5*control_roll),
cvScalar(255, 255, 255), 2);
}

void sepUAVControl::controlInit(void)
{
#ifdef CTRLMETHOD_SC8000
boardnumber = SC8_Initialize(boardnumber, commnumber);
if (!boardnumber)
{
#if SEP_REPORT_ERRORS
cerr << "error initiating sc8" << endl;
#endif // SEP_REPORT_ERRORS
initok = false;
}
else
{
cout << "sc8 initiated" << endl;
initok = true;
}
#endif //def CTRLMETHOD_SC8000

#ifdef CTRLMETHOD_USB2VOLTAGE
// initiate serial control
long error;
// attempt to open the serial port
//error = serial.Open(_T(COMM_STRING),0,0,false);
error = serial.Open((COMM_STRING),0,0,false);
if (error != ERROR_SUCCESS)
{
#if SEP_REPORT_ERRORS
cerr << "Unable to open comm port " << COMM_STRING << endl;

#endif // SEP_REPORT_ERRORS
initok = false;
}
else
{
// setup the serial port (9600,N81) using hardware handshaking
error = serial.Setup(CSerial::EBaud9600,CSerial::EData8,
CSerial::EParNone,CSerial::EStop1);
if (error != ERROR_SUCCESS)
{
#if SEP_REPORT_ERRORS
cerr << "Unable to set comm port setting" << endl;
#endif // SEP_REPORT_ERRORS
initok = false;
}
else
{
// setup handshaking
error = serial.SetupHandshaking(CSerial::EHandshakeHardware);
if (error != ERROR_SUCCESS)
{
#if SEP_REPORT_ERRORS
cerr << "Unable to set comm port handshaking" << endl;
#endif // SEP_REPORT_ERRORS
initok = false;
}
else
{
cout << "serial comms initiated" << endl;
initok = true;
}
}
}
#endif //def CTRLMETHOD_USB2VOLTAGE
}

bool sepUAVControl::InterpretDirections(double* directions_in)
{
// [0] Forward(+)/Backward(-), [1] Left(+)/Right(-)
// [2] Up(+)/Down(-), [3] TurnLeft(-180->0)/TurnRight(0->180)
bool invalidvalue(false);
// interpret throttle directions
if (directions_in[2] < 0)
control_throttle = 0;
else
control_throttle = 10;
// interpret yaw directions
if (directions_in[3] < 0)
control_yaw = -5;
else
control_yaw = 5;
// interpret pitch directions
if (directions_in[0] < 0)
control_pitch = -5;
else
control_pitch = 5;
// interpret roll directions
if (directions_in[1] < 0)
control_roll = 5;
else
control_roll = -5;
// return correct value
if (invalidvalue)
return CALL_ERROR;
else
return CALL_OK;
}

void sepUAVControl::ConvertServoValues(void)
{
// calc servo values from control values
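// e.g. in the USB2VOLTAGE build, control_yaw = 5 gives
// YAW_MID + 5*YAW_STEP = 1250 + 5*250 = 2500 (full right, top of DAC range)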
servo_values[AXIS_THROTTLE] = THROTTLE_OFF + control_throttle * THROTTLE_STEP;
servo_values[AXIS_YAW] = YAW_MID + control_yaw * YAW_STEP;
servo_values[AXIS_PITCH] = PITCH_MID + control_pitch * PITCH_STEP;
servo_values[AXIS_ROLL] = ROLL_MID + control_roll * ROLL_STEP;
}

void sepUAVControl::SendServoValues(void)
{
#ifdef CTRLMETHOD_SC8000
SC8_SendPositions(0, SC8_SERVO_MASK, SC8_DIO_MASK, servo_values);
// NOTE: ^ the 0 should be sc8_board_num but sc8_board_num == 1
// and SC8_SendPositions only seems to work with 0
#endif //def CTRLMETHOD_SC8000

#ifdef CTRLMETHOD_USB2VOLTAGE
send_serial(serial, servo_values, AXIS_THROTTLE, BIT_THROTTLE);
send_serial(serial, servo_values, AXIS_YAW, BIT_YAW);
send_serial(serial, servo_values, AXIS_PITCH, BIT_PITCH);
send_serial(serial, servo_values, AXIS_ROLL, BIT_ROLL);
#endif //def CTRLMETHOD_USB2VOLTAGE
}
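
For reference, a minimal usage sketch of this class (the directions array layout follows the comment in InterpretDirections; the values are illustrative only):

double directions[4] = {0.0, 0.0, 1.0, 0.0}; // no pitch/roll/yaw, climb
sepUAVControl uav; // the constructor calls controlInit()
if (uav.InitOK())
uav.SendCommands(directions); // interpret, convert and send servo values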

sep_utils.h

//--------------------------------------------------------------------------//
// sep_utils.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains some useful utility functions
//--------------------------------------------------------------------------//

#ifndef __SEP_UTILS_H__
#define __SEP_UTILS_H__

#include <cmath> // for sqrt() used by DiagLength()

namespace sep_utils
{

#define PI 3.14159265

// returns the absolute (modulus) difference of two inputs
template<class T> T ModDifference(const T& num1, const T& num2)
{
if (num1 < num2)
return (num2 - num1);
else
return (num1 - num2);
}

// returns the diagonal length from x and y differences
template<class T> T DiagLength(T ydiff, T xdiff)
{
ydiff *= ydiff;
xdiff *= xdiff;
return sqrt((double)(ydiff + xdiff));
}

// returns true if num1 and num2 are within specified percentage
template<class T> bool PercentageThreshold(const T& num1, const T& num2,
int percent)
{

T nummax = (num1 < num2) ? num2 : num1;
T numdiff = ((num1 - num2) < 0) ? (num2 - num1) : (num1 - num2);
if (numdiff > (nummax * percent / 100))
return false;
else
return true;
}

} // namespace sep_utils

#endif //__SEP_UTILS_H__
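
For reference, a minimal usage sketch of these utilities (values chosen purely for illustration):

#include "sep_utils.h"
int diff = sep_utils::ModDifference(3, 7); // 4
double len = sep_utils::DiagLength(3.0, 4.0); // 5.0
bool close = sep_utils::PercentageThreshold(100, 95, 10); // true: diff of 5 is within 10% of 100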

stdafx.h

//--------------------------------------------------------------------------//
// stdafx.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// stdafx.h for aerial_platform project
//--------------------------------------------------------------------------//

// stdafx.h : include file for standard system include files,
// or project specific include files that are used frequently, but
// are changed infrequently

#pragma once

#define WIN32_LEAN_AND_MEAN // Exclude rarely-used stuff from Windows headers

#include <stdio.h>
#include <iostream>
using namespace std;

// cimg includes
//#include "CImg.h"
//using namespace cimg_library;

stdafx.cpp

//--------------------------------------------------------------------------//
// stdafx.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// stdafx.cpp for aerial_platform project
//--------------------------------------------------------------------------//

// stdafx.cpp : source file that includes just the standard includes
// aerial_platform.pch will be the pre-compiled header
// stdafx.obj will contain the pre-compiled type information

#include "stdafx.h"

Appendix D: Remote Control Test Software Code

remote_control.h

//--------------------------------------------------------------------------//
// remote_control.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains various bits for remote_control project
//--------------------------------------------------------------------------//

#ifndef __REMOTE_CONTROL_H__
#define __REMOTE_CONTROL_H__

#ifdef CTRLMETHOD_SC8000
// set values for SC-8000
#define SC8_COMM_PORT 1
#define SC8_SERVO_MASK (unsigned char)15
#define SC8_DIO_MASK (char)'0'
// servo values for use with Hitec Laser 6 RC transmitter
#define AXIS_THROTTLE 2
#define AXIS_YAW 3
#define AXIS_PITCH 0
#define AXIS_ROLL 1
#define THROTTLE_OFF 8000
#define THROTTLE_MAX 22000
#define THROTTLE_STEP 1400
#define YAW_LEFTMAX 8000
#define YAW_MID 15000
#define YAW_RIGHTMAX 22000
#define YAW_STEP 1000
#define PITCH_BACKWARDMAX 8000
#define PITCH_MID 15000
#define PITCH_FORWARDMAX 22000
#define PITCH_STEP 1000
#define ROLL_LEFTMAX 8000
#define ROLL_MID 15000
#define ROLL_RIGHTMAX 22000
#define ROLL_STEP 1000
#endif //def CTRLMETHOD_SC8000

#ifdef CTRLMETHOD_USB2VOLTAGE
// set serial interface values
#define COMM_STRING "\\\\.\\COM13"
// servo values for use with Original X-UFO transmitter
// DAC range min=0x000, max=0xFFF
#define BIT_THROTTLE 0x20
#define BIT_YAW 0x10
#define BIT_PITCH 0x40
#define BIT_ROLL 0x80
#define AXIS_THROTTLE 1
#define AXIS_YAW 0
#define AXIS_PITCH 2
#define AXIS_ROLL 3
#define SERVO_STEPS 10
#define THROTTLE_OFF 0
#define THROTTLE_MAX 2500
#define THROTTLE_STEP ((THROTTLE_MAX-THROTTLE_OFF)/SERVO_STEPS)
#define YAW_LEFTMAX 0
#define YAW_MID ((YAW_RIGHTMAX-YAW_LEFTMAX)/2)
#define YAW_RIGHTMAX 2500
#define YAW_STEP ((YAW_RIGHTMAX-YAW_LEFTMAX)/SERVO_STEPS)
#define PITCH_BACKWARDMAX 0
#define PITCH_MID ((PITCH_FORWARDMAX-PITCH_BACKWARDMAX)/2)
#define PITCH_FORWARDMAX 2500
#define PITCH_STEP ((PITCH_FORWARDMAX-PITCH_BACKWARDMAX)/SERVO_STEPS)
#define ROLL_LEFTMAX 0
#define ROLL_MID ((ROLL_RIGHTMAX-ROLL_LEFTMAX)/2)
#define ROLL_RIGHTMAX 2500
#define ROLL_STEP ((ROLL_RIGHTMAX-ROLL_LEFTMAX)/SERVO_STEPS)
#endif //def CTRLMETHOD_USB2VOLTAGE

// prints current values to console
template<class T> void print_servo_values(T* values)
{
cout << endl;
cout << endl;
cout << '\t' << "throttle: " << (char)9
<< values[AXIS_THROTTLE] << endl;
cout << '\t' << "yaw: " << (char)9 << (char)9
<< values[AXIS_YAW] << endl;
cout << '\t' << "pitch: " << (char)9 << (char)9
<< values[AXIS_PITCH] << endl;
cout << '\t' << "roll: " << (char)9 << (char)9
<< values[AXIS_ROLL] << endl;
}

#endif //__REMOTE_CONTROL_H__

remote_control.cpp

//--------------------------------------------------------------------------//
// remote_control.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains main() for test remote control app for the aerial platform
//--------------------------------------------------------------------------//

#include "stdafx.h"
#include <iostream>
using namespace std;

#include "remote_control.h"
#include "remote_keys.h"

#ifdef CTRLMETHOD_SC8000
#define _USE_DLL // for TTiSC8VC
#define __WIN32__
#include "TTiSC8VC.h"
#endif //def CTRLMETHOD_SC8000

#ifdef CTRLMETHOD_USB2VOLTAGE
#include "Serial.h"
// send serial data out
int send_serial(CSerial& serial, int* valuesin, int axis, int axisbit)
{
char transmit[2];
long error;
// ready values
transmit[1] = valuesin[axis] & 0xFF; // lower byte
transmit[0] = (valuesin[axis] & 0xF00)/256; // upper byte
transmit[0] += axisbit; // channel number
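// worked example: a throttle value of 2500 (0x9C4) with BIT_THROTTLE (0x20)
// goes out as "iu", then 0x29 (channel bit | upper nibble), then 0xC4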
//transmit header
error = serial.Write("iu");
if (error != ERROR_SUCCESS)
cerr << "Unable to send data" << endl;
else

Appendix D - ii
The University of Surrey UAV Vision and Control System Richard Lane

{
//transmit data
error = serial.Write(transmit, 2);
if (error != ERROR_SUCCESS)
cerr << "Unable to send data" << endl;
}
if (error != ERROR_SUCCESS)
return 0;
else
return 1;
}
#endif //def CTRLMETHOD_USB2VOLTAGE

void print_instructions(void)
{
// draw nice ascii art transmitter with control instructions
cout << endl;
cout << '\t' << " |" << endl;
cout << '\t' << " X-UFO |" << endl;
cout << '\t' << " /|\\" << endl;
cout << '\t' << "/---------------------\\" << endl;
cout << '\t' << "| " << THROTTLE_UP << " "
<< PITCH_FORWARD << " |" << endl;
cout << '\t' << "| | | |" << endl;
cout << '\t' << "| " << YAW_LEFT << "-- --" << YAW_RIGHT
<< " " << ROLL_LEFT << "-- --" << ROLL_RIGHT
<< " |" << endl;
cout << '\t' << "| | | |" << endl;
cout << '\t' << "| " << THROTTLE_DOWN << " "
<< PITCH_BACKWARD << " |" << endl;
cout << '\t' << "\\---------------------/" << endl;
cout << endl;
}

int main(int argc, char** argv)
{
#ifdef CTRLMETHOD_SC8000
// initiate SC-8000 control
int sc8_board_num(0);
sc8_board_num = SC8_Initialize(sc8_board_num, SC8_COMM_PORT);
if (sc8_board_num)
{
cout << "sc8 initiated" << endl;
print_instructions();

// initiate command values and remoteKeys class
int throttle(0), yaw(0), pitch(0), roll(0);
unsigned short axisvalues[4];
remoteKeys keysIn(throttle, yaw, pitch, roll);

// loop until exit key pressed (throttle set to THROTTLE_ESC)
while (throttle != THROTTLE_ESC)
{
// calc servo values from command values
axisvalues[AXIS_THROTTLE] = THROTTLE_OFF + throttle *
THROTTLE_STEP;
axisvalues[AXIS_YAW] = YAW_MID + yaw * YAW_STEP;
axisvalues[AXIS_PITCH] = PITCH_MID + pitch * PITCH_STEP;
axisvalues[AXIS_ROLL] = ROLL_MID + roll * ROLL_STEP;

// print and send values to SC-8000
print_servo_values(axisvalues);
SC8_SendPositions(0, SC8_SERVO_MASK, SC8_DIO_MASK, axisvalues);
// NOTE: ^ the 0 should be sc8_board_num but sc8_board_num == 1
// and SC8_SendPositions only seems to work with 0

// get new command values
keysIn.GetCommands();
system("cls");
}

// reset servo to neutral positions
axisvalues[AXIS_THROTTLE] = THROTTLE_OFF;
axisvalues[AXIS_YAW] = YAW_MID;
axisvalues[AXIS_PITCH] = PITCH_MID;
axisvalues[AXIS_ROLL] = ROLL_MID;
SC8_SendPositions(0, SC8_SERVO_MASK, SC8_DIO_MASK, axisvalues);
// NOTE: ^ the 0 should be sc8_board_num but sc8_board_num == 1
// and SC8_SendPositions only seems to work with 0

// clean up sc8 resources
//SC8_CleanUP(sc8_board_num);
}
else
{
cerr << "error initiating sc8" << endl;
}
#endif //def CTRLMETHOD_SC8000

#ifdef CTRLMETHOD_USB2VOLTAGE
// initiate serial control
CSerial serial;
long error;
// attempt to open the serial port
error = serial.Open(_T(COMM_STRING),0,0,false);
if (error != ERROR_SUCCESS)
cerr << "Unable to open comm port " << COMM_STRING << endl;
else
{
// setup the serial port (9600,N81) using hardware handshaking
error = serial.Setup(CSerial::EBaud9600,CSerial::EData8,
CSerial::EParNone,CSerial::EStop1);
if (error != ERROR_SUCCESS)
cerr << "Unable to set comm port setting" << endl;
else
{
// setup handshaking
error = serial.SetupHandshaking(CSerial::EHandshakeHardware);
if (error != ERROR_SUCCESS)
cerr << "Unable to set comm port handshaking" << endl;
else
{
// initiate command values and remoteKeys class
int throttle(0), yaw(0), pitch(0), roll(0), axisvalues[4];
remoteKeys keysIn(throttle, yaw, pitch, roll);

// loop until exit key pressed (throttle set to THROTTLE_ESC)
while (throttle != THROTTLE_ESC)
{
// calc servo values from command values
axisvalues[AXIS_THROTTLE] = THROTTLE_OFF + throttle*
THROTTLE_STEP;
axisvalues[AXIS_YAW] = YAW_MID + yaw*YAW_STEP;
axisvalues[AXIS_PITCH] = PITCH_MID + pitch*PITCH_STEP;
axisvalues[AXIS_ROLL] = ROLL_MID + roll*ROLL_STEP;

// print and send servo values via serial port
print_servo_values(axisvalues);
error = send_serial(serial, axisvalues,
AXIS_THROTTLE, BIT_THROTTLE);
error += send_serial(serial, axisvalues,
AXIS_YAW, BIT_YAW);
error += send_serial(serial, axisvalues,
AXIS_PITCH, BIT_PITCH);
error += send_serial(serial, axisvalues,
AXIS_ROLL, BIT_ROLL);
if (error < 4) break; // a send failed (send_serial returns 1 on success)

// get new command values
print_instructions();
keysIn.GetCommands();
system("cls");
}

// reset servo to neutral positions
axisvalues[AXIS_THROTTLE] = THROTTLE_OFF;
axisvalues[AXIS_YAW] = YAW_MID;
axisvalues[AXIS_PITCH] = PITCH_MID;
axisvalues[AXIS_ROLL] = ROLL_MID;
send_serial(serial, axisvalues, AXIS_THROTTLE, BIT_THROTTLE);
send_serial(serial, axisvalues, AXIS_YAW, BIT_YAW);
send_serial(serial, axisvalues, AXIS_PITCH, BIT_PITCH);
send_serial(serial, axisvalues, AXIS_ROLL, BIT_ROLL);
}
}
}
// close comm port
serial.Close();
#endif //def CTRLMETHOD_USB2VOLTAGE

// pause then exit
system("pause");
return 0;
}

remote_keys.h

//--------------------------------------------------------------------------//
// remote_keys.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for remoteKeys class
//--------------------------------------------------------------------------//

#ifndef __REMOTE_KEYS_H__
#define __REMOTE_KEYS_H__

// remote control keys
#define ESC_KEY 27
#define SPACE_KEY 32
#define THROTTLE_ESC -2
#define THROTTLE_UP 'w'
#define THROTTLE_DOWN 's'
#define YAW_LEFT 'a'
#define YAW_RIGHT 'd'
#define PITCH_FORWARD 'i'
#define PITCH_BACKWARD 'k'
#define ROLL_LEFT 'j'
#define ROLL_RIGHT 'l'

class remoteKeys
{
public:
remoteKeys(int& throttleref, int& yawref, int& pitchref, int& rollref);
~remoteKeys();
void GetCommands(void);
private:
char keyin;

int& throttle;
int& yaw;
int& pitch;
int& roll;
};

#endif //__REMOTE_KEYS_H__

remote_keys.cpp

//--------------------------------------------------------------------------//
// remote_keys.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the remoteKeys class for inputing keys and returning
// remote command values
//--------------------------------------------------------------------------//

#include "remote_keys.h"
#include <iostream>
using namespace std;
#include <conio.h>
//#include <windows.h>

// constructor
remoteKeys::remoteKeys(int& throttleref, int& yawref,
int& pitchref, int& rollref)
: throttle(throttleref), yaw(yawref), pitch(pitchref), roll(rollref)
{
}

// destructor
remoteKeys::~remoteKeys()
{
}

// get one key from keyboard and adjust servo values accordingly
void remoteKeys::GetCommands(void)
{
// get key from console without echo
keyin = _getch();
// check for escape key
if (keyin == ESC_KEY) throttle = THROTTLE_ESC;
// check for space/reset key
else if (keyin == SPACE_KEY){
throttle = 0;
yaw = 0;
pitch = 0;
roll = 0; }
// adjust throttle
else if (keyin == THROTTLE_UP) throttle++;
else if (keyin == THROTTLE_DOWN) throttle--;
// adjust yaw
else if (keyin == YAW_LEFT) yaw--;
else if (keyin == YAW_RIGHT) yaw++;
// adjust pitch
else if (keyin == PITCH_FORWARD) pitch++;
else if (keyin == PITCH_BACKWARD) pitch--;
// adjust roll
else if (keyin == ROLL_LEFT) roll--;
else if (keyin == ROLL_RIGHT) roll++;
// check values for limits
if (throttle == -1) throttle = 0;
else if (throttle > 10) throttle = 10;

if (yaw < -5) yaw = -5;
else if (yaw > 5) yaw = 5;
if (pitch < -5) pitch = -5;
else if (pitch > 5) pitch = 5;
if (roll < -5) roll = -5;
else if (roll > 5) roll = 5;
}

stdafx.h

//--------------------------------------------------------------------------//
// stdafx.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// stdafx.h for remote_control project
//--------------------------------------------------------------------------//

// stdafx.h : include file for standard system include files,
// or project specific include files that are used frequently, but
// are changed infrequently

#pragma once

//#define CTRLMETHOD_SC8000
#define CTRLMETHOD_USB2VOLTAGE

#define WIN32_LEAN_AND_MEAN // Exclude rarely-used stuff from Windows headers

#include <stdio.h>
#include <iostream>
#include <string>
using namespace std;

#ifdef CTRLMETHOD_USB2VOLTAGE
#include <tchar.h>
#include <crtdbg.h>
#include <windows.h>
#include <conio.h>
#endif //def CTRLMETHOD_USB2VOLTAGE

stdafx.cpp

//--------------------------------------------------------------------------//
// stdafx.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// stdafx.cpp for remote_control project
//--------------------------------------------------------------------------//

// stdafx.cpp : source file that includes just the standard includes
// remote_control.pch will be the pre-compiled header
// stdafx.obj will contain the pre-compiled type information

#include "stdafx.h"

Appendix E: UGV Emulation Software Code

ugv_emulation.h

//--------------------------------------------------------------------------//
// ugv_emulation.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains various bits for ugv_emulation project
//--------------------------------------------------------------------------//

#ifndef __UGV_EMULATION_H__
#define __UGV_EMULATION_H__

//#define EMULATION_DEBUG
#define POLL_FREQ 20 //ms
#define SEND_INTERVAL 200 //polls per send

#define ERRORBIT_ACKNOWLEDGE 1 // 0x01
#define ERRORBIT_DECISIONERROR 2 // 0x02
#define ERRORBIT_PARTIALTRACKING 4 // 0x04
#define ERRORBIT_COMPLETETRACKING 8 // 0x08
#define ERRORBIT_INVALIDCOMMAND 16 // 0x10
#define ERRORBIT_BATTERYLOW 32 // 0x20
#define ERRORBIT_RESERVED1 64 // 0x40
#define ERRORBIT_RESERVED2 128 // 0x80

typedef enum {
STATE_CHECK,
STATE_CLOSE,
STATE_ERROR,
STATE_HEIGHT,
STATE_INIT,
STATE_WAIT
} state_machine;
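// typical flow (see main() below): STATE_INIT -> STATE_WAIT, then
// STATE_CHECK or STATE_HEIGHT and back to STATE_WAIT; any failure
// drops into STATE_ERROR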

#endif //__UGV_EMULATION_H__
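
For reference, a short sketch of how a UAV status byte built from the error bits above decomposes (assumes ugv_emulation.h and <iostream> are included; the value is illustrative only):

char status = ERRORBIT_ACKNOWLEDGE | ERRORBIT_COMPLETETRACKING; // 0x01 | 0x08 = 0x09
if ((status & ERRORBIT_COMPLETETRACKING) == ERRORBIT_COMPLETETRACKING)
cout << "UAV reports complete tracking loss" << endl; // cf. commsCheck() in netapi_interface.cpp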

ugv_emulation.cpp

//--------------------------------------------------------------------------//
// ugv_emulation.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains main() for emulation of ugv comms
//--------------------------------------------------------------------------//

#include "stdafx.h"
#include "ugv_emulation.h"
#include "netapi_interface.h"

int main(int argc, char** argv)
{
netAPI_UGV UGVcomms;
int pollcount(0);

// enter main state machine
state_machine current_state(STATE_INIT), next_state(STATE_INIT);
while(current_state != STATE_CLOSE)
{

switch (current_state)
{
// check for message state
case STATE_CHECK:
#ifdef EMULATION_DEBUG
cout << "Poll: check for message" << endl;
#endif // EMULATION_DEBUG
if (UGVcomms.commsCheck())
{
next_state = STATE_WAIT;
}
else
{
cerr << "Error: could not check for messages"
<< endl;
next_state = STATE_ERROR;
}
break;

// close state
case STATE_CLOSE:
break;

// error state
case STATE_ERROR:
cout << "A FATAL ERROR HAS OCCURED, " <<
"PLEASE CLOSE UGV EMULATION" << endl;
Sleep(-1);
break;

// send new height state
case STATE_HEIGHT:
#ifdef EMULATION_DEBUG
cout << "Poll: send height" << endl;
#endif // EMULATION_DEBUG
if (UGVcomms.commsSend())
{
next_state = STATE_WAIT;
}
else
{
cerr << "Error: could not send height" << endl;
next_state = STATE_ERROR;
}
break;

// init state
case STATE_INIT:
if (UGVcomms.commsInit())
{

next_state = STATE_WAIT;
}
else
{
cerr << "Error: could not initiate netAPI" << endl;
next_state = STATE_ERROR;
}
break;

// wait state
case STATE_WAIT:
if (UGVcomms.commsConnected())
{
if (pollcount < SEND_INTERVAL)
{
pollcount++;

next_state = STATE_CHECK;
}
else
{
pollcount = 0;
next_state = STATE_HEIGHT;
}
}
// sleep to prevent CPU overuse
Sleep(POLL_FREQ);
break;

// default
default:
cerr << "Error: state machine entered unknown state"
<< endl;
next_state = STATE_ERROR;
}
current_state = next_state;
}

return 0;
}

netapi_interface.h

//--------------------------------------------------------------------------//
// netapi_interface.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for netAPI_UGV class
//--------------------------------------------------------------------------//

#ifndef __NETAPI_INTERFACE_H__
#define __NETAPI_INTERFACE_H__

#include "TCPLink.h"

#define USE_DEFAULTS
#define BCAST_LOCATION "C:/SEP_code/netAPI/"
#define LOCAL_PORT 2000
#define REMOTE_PORT 2001
#define CHANNEL_TO_UAV "to_uav"
#define CHANNEL_TO_UGV "to_ugv"
#define LOCAL_ADDRESS "ugv_machine"
#define REMOTE_ADDRESS "uav_machine"

#define STANDARD_HEIGHT 1.0 // metres
#define HEIGHT_JUMP 0.1 // metres

class netAPI_UGV
{
public:
netAPI_UGV(void);
~netAPI_UGV();
bool commsInit(void);
bool commsConnected(void);
bool commsCheck(void);
bool commsSend(void);
private:
double nextHeight(void);
double last_height;
bool height_error;
bool connected;

int local_port;
int remote_port;
string channel_to_uav;
string channel_to_ugv;
string local_address;
string remote_address;
TCPLink* connection_to_uav;
TCPLink* connection_from_uav;
};

#endif //__NETAPI_INTERFACE_H__

netapi_interface.cpp

//--------------------------------------------------------------------------//
// netapi_interface.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the netAPI_UGV class for interfacing with netAPI
//--------------------------------------------------------------------------//

#include "stdafx.h"
#include "netapi_interface.h"
#include "ugv_emulation.h"

// constructor
// (nulls the TCPLink pointers so the destructor is safe even if
// commsInit() is never called)
netAPI_UGV::netAPI_UGV(void)
: last_height(STANDARD_HEIGHT), height_error(false), connected(false),
connection_to_uav(NULL), connection_from_uav(NULL)
{
}

// destructor
netAPI_UGV::~netAPI_UGV()
{
// delete TCPLink objects
if (connection_to_uav)
delete connection_to_uav;
if (connection_from_uav)
delete connection_from_uav;
}

// initiates connections using TCPLink
bool netAPI_UGV::commsInit(void)
{
#ifndef USE_DEFAULTS
char usedefault;
// ask if you want to use default testing values
cout << "would you like to use default values for netAPI (y/n)? ";
cin >> usedefault;
if ((usedefault == 'y') || (usedefault == 'Y'))
{
// use default values
local_port = LOCAL_PORT;
remote_port = REMOTE_PORT;
channel_to_uav = CHANNEL_TO_UAV;
channel_to_ugv = CHANNEL_TO_UGV;
local_address = LOCAL_ADDRESS;
remote_address = REMOTE_ADDRESS;
}
else
{
// input non default values
cout << "input setup values..." << endl;
cout << " local port: ";

cin >> local_port;
cout << " input channel name: ";
cin >> channel_to_ugv;
cout << " local address: ";
cin >> local_address;
cout << " remote port: ";
cin >> remote_port;
cout << " output channel name: ";
cin >> channel_to_uav;
cout << " remote address: ";
cin >> remote_address;
}
#else //ifdef USE_DEFAULTS
// use default values
local_port = LOCAL_PORT;
remote_port = REMOTE_PORT;
channel_to_uav = CHANNEL_TO_UAV;
channel_to_ugv = CHANNEL_TO_UGV;
local_address = LOCAL_ADDRESS;
remote_address = REMOTE_ADDRESS;
#endif //ndef USE_DEFAULTS
// initiate TCPLink
TCPLink::Load(local_port, local_address, _on, BCAST_LOCATION);
connection_to_uav =
new TCPLink(channel_to_uav, remote_port, remote_address);
connection_from_uav =
new TCPLink(channel_to_ugv, remote_port, remote_address);
// check both connections have been created
if (connection_to_uav && connection_from_uav)
{
cout << "netAPI initilised, attempting to connect..." << endl;
return true;
}
else
return false;
}

// returns true if connected
bool netAPI_UGV::commsConnected(void)
{
// if known to be connected
if (connected)
return true;
// if just connected
else if ((*connection_to_uav).isConnected() &&
(*connection_from_uav).isConnected())
{
connected = true;
// report all connected so bcastmanager closes
TCPLink::AllConnected();
cout << "connections established" << endl;
return true;
}
else
return false;
}

// checks for a message, then interprets and prints anything received
bool netAPI_UGV::commsCheck(void)
{
if ((*connection_from_uav).Receipt())
{
char inputtemp;
// read message from TCPLink
(*connection_from_uav) >> inputtemp;
// interpret and print message
cout << "Message recieved: " << endl;

if ((inputtemp & ERRORBIT_ACKNOWLEDGE)
== ERRORBIT_ACKNOWLEDGE)
cout << " - Acknowledgment" << endl;
if ((inputtemp & ERRORBIT_DECISIONERROR)
== ERRORBIT_DECISIONERROR)
cout << " - Decision error" << endl;
if ((inputtemp & ERRORBIT_PARTIALTRACKING)
== ERRORBIT_PARTIALTRACKING)
cout << " - Partial racking loss" << endl;
if ((inputtemp & ERRORBIT_COMPLETETRACKING)
== ERRORBIT_COMPLETETRACKING)
cout << " - Complete tracking loss" << endl;
if ((inputtemp & ERRORBIT_INVALIDCOMMAND)
== ERRORBIT_INVALIDCOMMAND){
cout << " - Invalid command" << endl;
height_error = true;}
if ((inputtemp & ERRORBIT_BATTERYLOW)
== ERRORBIT_BATTERYLOW)
cout << " - Battery low" << endl;
if ((inputtemp & ERRORBIT_RESERVED1)
== ERRORBIT_RESERVED1)
cout << " - Reserved bit1" << endl;
if ((inputtemp & ERRORBIT_RESERVED2)
== ERRORBIT_RESERVED2)
cout << " - Reserved bit1" << endl;
// clear message from TCPLink buffer
(*connection_from_uav).FreeInputBuf();
}
return true;
}

// sends a height message using TCPLink
bool netAPI_UGV::commsSend(void)
{
// get height value to send
double heighttemp = nextHeight();
// put value in buffer and send message
(*connection_to_uav) << heighttemp << endl;
(*connection_to_uav).Send();
return true;
}

// return next height value to send
double netAPI_UGV::nextHeight(void)
{
static double last_height(STANDARD_HEIGHT);
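// NOTE: this static local shadows the last_height member declared in
// netapi_interface.h, so the member itself is never used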
static bool direction_up(true);
// switch direction down if error received
if (direction_up && height_error)
{
direction_up = false;
height_error = false;
last_height -= HEIGHT_JUMP;
}
// switch direction up if error received
else if (!direction_up && height_error)
{
direction_up = true;
height_error = false;
last_height += HEIGHT_JUMP;
}
// else increment values accordingly
else if (direction_up)
{
last_height += HEIGHT_JUMP;
}
else if (!direction_up)

{
last_height -= HEIGHT_JUMP;
}
return last_height;
}

stdafx.h

//--------------------------------------------------------------------------//
// stdafx.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// stdafx.h for ugv_emulation project
//--------------------------------------------------------------------------//

// stdafx.h : include file for standard system include files,
// or project specific include files that are used frequently, but
// are changed infrequently

#pragma once

#define WIN32_LEAN_AND_MEAN // Exclude rarely-used stuff from Windows headers

#include <stdio.h>
#include <windows.h>
#include <iostream>
#include <string>
using namespace std;

stdafx.cpp

//--------------------------------------------------------------------------//
// stdafx.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// stdafx.cpp for ugv_emulation project
//--------------------------------------------------------------------------//

// stdafx.cpp : source file that includes just the standard includes
// ugv_emulation.pch will be the pre-compiled header
// stdafx.obj will contain the pre-compiled type information

#include "stdafx.h"

Appendix F: X-UFO Technical Information

(Image: the X-UFO quad-rotor, reproduced from [19])

Manufacturer's Information: [20]

- Advanced mechanical gyro system helps stabilise flight
- Unique electronic pitch and roll control system
- Ultra-lightweight carbon fibre frame with durable EPP foam body
- 4 individual motors deliver smooth and constant power
- Digital proportional 4-channel radio control system with changeable transmitting crystal
- Default R/C crystal frequency: 26.590 MHz
- Battery pack: 12 V 350 mAh NiMH

Appendix G: SC-8000 Technical Information

Technical Specifications: [21]

- Number of servo channels: 8
- USB->Serial protocol: 9600-N-8-1
- Pulse width range: 600 to 2200 µs
- Pulse resolution: 0.1 µs
- PPM frame size: 22.5 ms
- Positive or negative shift PPM output
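
Note: given the 0.1 µs pulse resolution quoted above, the servo position values used throughout the control software (8000-22000, mid 15000) appear to be expressed in units of 0.1 µs, i.e. pulse widths of 800-2200 µs with the conventional 1500 µs servo centre; this is an inference from the figures rather than a documented fact.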

Appendix H: Wireless Camera Technical Information

Technical Specifications: [22]

- Image Sensor: 1/3” Colour CMOS
- Resolution: 380 TV lines
- Frames per Second: 30
- Standard Lens: 3.6 mm with 62° view
- Wide Angle Lens: 2.8 mm with 108° view
- Minimum Illumination: 1.0 Lux
- Frequency: ISM 2400-2483 MHz
- Power: 5 mW
- Power Supply: 8-9 V
- Current Consumption: 80 mA
- Camera Dimensions: 20 x 20 x 20 mm (with standard lens)
- Camera Weight: 20 g
