
ICCAIS 2013: Main Track ICCAIS 2013

Hybrid Terminal Sliding Mode Control and Quadrotor's Vision Based Ground Object Tracking
Hoang-The Pham, Chi-Tinh Dang, Thanh-Binh Pham and Nguyen-Vu Truong

Abstract — In this paper, we present a systematic approach to the vision based ground object tracking problem of a low cost AR.Drone quadrotor, using a Hybrid Terminal Sliding Mode (HTSM) controller based on the nonsingular terminal sliding mode (NTSM) and the higher order sliding mode (HOSM). In this approach, the NTSM manifold is used to provide fast convergence and better tracking precision, while the HOSM control law is utilized to eliminate the chattering phenomenon. Experimental results demonstrate that the proposed HTSM approach outperforms conventional PD and sliding mode tracking controllers in terms of response speed, tracking precision and robustness.

I. INTRODUCTION

In recent years, miniature UAVs such as quadrotors have gained much popularity in a wide range of research and applications, i.e. [1]-[15]. Micro-quadcopters such as the commercially available AR.Drone have been used as a common research platform in educational laboratories due to their low cost, small size, and stability, as well as their agility in small indoor environments [5].

The AR.Drone is equipped with a number of relevant on-board sensors and vision systems which enable the application of various vision based control and estimation algorithms [5]. However, its applications are rather more restricted than those normally encountered in micro UAV applications. The challenge here is the problem of localizing the robot purely from its sensors, and the capability to robustly navigate it under potential sensor loss. This is of even greater importance with low cost hardware having inaccurate actuators, noisy sensors, a low quality on-board vision system, significant delays and limited computational resources. It, in turn, raises a number of interesting research questions.

One of the most popular vision based problems that a UAV has to solve is tracking a target by processing images acquired by its integrated cameras, i.e. [6]-[15]. This can be accomplished by a number of methods, ranging from shape / pattern matching using a given target model, i.e. [9], to much simpler non-model based approaches such as color based detection using thresholding, i.e. [10], which is easy to run in real time thanks to its relatively low computational complexity. Here, due to the hardware limitations of the AR.Drone, a color based thresholding method is employed to identify and estimate the target's position in the relative reference of the image frames captured from its integrated vision aids.

The current scope of this paper is the systematic formulation of the tracking problem as a closed-loop control system design, which requires an explicit mathematical model of the AR.Drone's motion with respect to the target being tracked, together with an appropriate control algorithm. In this study, due to a number of factors, including the poor resolution of the AR.Drone's vision system and external environment noise and disturbance, the model of such a system contains significant uncertainty in its parameters, which makes it a promising setting for a sliding mode controller. In comparison with other methods, sliding mode control has been proven to be advantageous in terms of robustness to disturbance and low sensitivity to the system's parameters. However, the chattering phenomenon has been the major drawback of conventional sliding mode control: the switching control input causes undesirable oscillation with finite frequency and amplitude. In order to avoid, or at least reduce, this phenomenon, several attempts have been made. One of them is to use a saturation or sigmoid function in place of the discontinuous control within a boundary layer of the sliding mode manifold [16]. This method, however, cannot guarantee convergence of the tracking error to zero within such a boundary layer.

Another way to avoid chattering is to use higher order sliding mode (HOSM), since HOSM uses continuous signals instead of discontinuous switching signals [17]-[22]. In this paper, we combine the nonsingular terminal sliding mode manifold and HOSM to formulate the so-called hybrid terminal sliding mode (HTSM) control system [21]-[22] for the tracking problem of the AR.Drone. Within this set-up, the NTSM manifold is designed to improve the convergence speed and tracking precision, while the HOSM control law guarantees the system's stability. In this manner, the control signal is continuous instead of being a discontinuous switching signal as in conventional sliding mode control, thus providing smooth control over the quadrotor's object tracking problem.

Chi-Tinh Dang and Hoang-The Pham are students at the Department of Mechatronics Engineering, HCMC University of Technical Education, Vietnam. Email: {ctdang, htpham}@irobotics.ac.vn
Thanh-Binh Pham and Nguyen-Vu Truong are with the Institute of Applied Mechanics and Informatics, Vietnam Academy of Science and Technology. Email: {tbpham, nvtruong}@irobotics.ac.vn

The paper is organized as follows. Section 2 briefly describes the hardware platform (the AR.Drone Ver 1.0). The vision based target detection algorithm is presented in Section 3. Section 4 concerns the data based system

978-1-4799-0572-0/13/$31.00 2013 IEEE 334



identification and control system design exercises. The


experimental results are reported in Section 5. Finally,
Section 6 concludes the paper.
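As a side illustration of the boundary-layer remedy [16] discussed in the introduction, the sketch below contrasts the discontinuous sign term of conventional sliding mode control with a saturation that is linear inside a layer of width phi; the function names and the layer width are our own, for illustration only:

```python
import math

def sign(x: float) -> float:
    """Discontinuous switching term of conventional sliding mode control."""
    return math.copysign(1.0, x) if x != 0 else 0.0

def sat(x: float, phi: float = 0.1) -> float:
    """Boundary-layer approximation: linear inside |x| <= phi, saturated outside.
    This smooths chattering, but the tracking error is only guaranteed to stay
    inside the layer, not to converge to zero."""
    return x / phi if abs(x) <= phi else math.copysign(1.0, x)

# Near the manifold (small s), sat() gives a proportional, continuous command
# while sign() keeps switching at full amplitude.
print(sign(0.01), sat(0.01))   # full-amplitude 1.0 vs a small continuous correction
print(sign(-0.5), sat(-0.5))   # both saturated at -1.0
```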

II. HARDWARE PLATFORM


The AR.Drone (Figure 1) is a commercially available electric quadrotor which has been increasingly used in the education and research sectors due to its low cost, robustness to crashes, safety, and reliability for both indoor and outdoor use. Its structure is made of carbon fiber and plastics; it carries a battery and four BLDC motors, and is equipped with a 6-degree-of-freedom inertial measurement unit (3-axis gyroscope and accelerometer), a control board with ultrasonic sensors (Figure 2), and two cameras (one in the front, 320x240, and one at the bottom, 176x144). Users can directly set the yaw, pitch, roll and vertical speed, and the control board adjusts the motor speeds in return to match the requested state. The drone can achieve speeds of more than 5 m/s for a continuous flight of 13 minutes.

Fig. 1 AR.Drone quadrotor version 1.0

The AR.Drone is controlled by an ARM9 processor (CPU clock speed of 468 MHz) with 128 MB DDRAM at 200 MHz. Parrot, the manufacturer, provides a console via ad-hoc wireless networks to control the drone using iPad/iPhone or Android devices. Thanks to the supplied open-source SDK, several flight control parameters can be set via its API, which also provides access to the sensor data and images from the cameras.

Fig. 2 Control board with ultrasonic sensors.

III. VISION BASED TARGET DETECTION

This section describes vision based target detection using color thresholding, implemented on the images acquired on the host computer. Here, the target detection task is divided into three phases (Figure 3): (1) taking off, (2) following and (3) hovering.

Fig. 3 State transition diagram of the vision based target detection algorithm.

The AR.Drone is initially placed on the ground, ready for taking off. The global control system signals the taking-off command to start the drone's motors. The drone takes off after a time delay of t0 needed for the motors to reach their sufficient flying speeds.

A. Tracking

In the tracking phase, the developed algorithm is expected to solve three problems: (1) tracking initialization, (2) on-line tracking, and (3) error processing.

1. Tracking Initialization

The easiest way to extract and segment an object from an image is based on colors which differ significantly between the object and its background. The image acquired from the AR.Drone's camera is in RGB format (Figure 4). Here, in order to facilitate the color based image segmentation, the image is converted into the HSV color space.

Fig. 4 RGB image matrices
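As a rough single-pixel illustration of this conversion-and-thresholding step (a real implementation would call OpenCV's cvtColor and inRange on whole frames), the sketch below follows OpenCV's integer conventions (Hue 0-179, Saturation and Value 0-255) and the red-target HSV ranges used later in this section; the helper names are our own:

```python
import colorsys

def rgb_to_hsv_cv(r: int, g: int, b: int) -> tuple:
    """8-bit RGB -> OpenCV-style HSV: H in 0-179 (degrees/2), S and V in 0-255."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return int(round(h * 180)) % 180, int(round(s * 255)), int(round(v * 255))

def is_target(hsv,
              h_range=(170, 180), s_range=(160, 255), v_range=(60, 255)):
    """Binary thresholding: True when the pixel falls inside the target's HSV
    box (the red-target ranges used in this study)."""
    h, s, v = hsv
    return (h_range[0] <= h <= h_range[1] and
            s_range[0] <= s <= s_range[1] and
            v_range[0] <= v <= v_range[1])

# A saturated red pixel passes the threshold; a green one does not.
print(is_target(rgb_to_hsv_cv(200, 10, 20)))   # True
print(is_target(rgb_to_hsv_cv(10, 200, 20)))   # False
```

Applying this predicate per pixel yields the binary mask used by the on-line tracking module below.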


Similarly, the HSV color space consists of three matrices: 'Hue', 'Saturation' and 'Value'. In OpenCV, the value ranges for 'Hue', 'Saturation' and 'Value' are 0-179, 0-255 and 0-255 respectively. 'Hue' represents the color, while 'Saturation' and 'Value' refer to the amounts by which that color is mixed with white and black, correspondingly. In this study, the target (in red color, as in Figure 5) has HSV values between 170-180, 160-255 and 60-255 respectively, in which the Hue is unique for the specific color distribution of the object, while the Saturation and Value may vary according to the environment's lighting conditions.

Fig. 5 RGB image converted to HSV image.

2. On-line Tracking

This module converts the HSV image into a binary image using color based thresholding (Figure 6). In the binary space, the target within the frame's local coordinates is detected and estimated by finding the optimal probability distribution via a Gaussian filter.

Fig. 6 HSV image converted to binary space by thresholding

3. Error Processing

Backup strategies are necessary to deal with tracking failures, which may occur due to a number of contributing factors, including background noise, possible loss of the object, etc. When the object's location is out of the control area, the drone hovers around the position where the object was last detected, trying to search for it again.

B. Landing

Once it receives the landing command, the AR.Drone reduces its motor speeds and lands.

IV. CONTROL SYSTEM DESIGN

In order to systematically and properly design the tracking controllers for the AR.Drone, mathematical models of the system performing this task are required. Here, an empirical modeling approach, system identification, is employed using experimental input-output data acquired from the drone (sampled at 20 ms).

A. System Identification

The data collection exercise has been performed with (1) the inputs being the control commands Ux and Uy sent to the AR.Drone to make it move left-right or forward-backward, and (2) the outputs being the error, in pixels, of the object on the ground against the central point of each video frame acquired from its vertical camera. Note that the X direction in the 2D video frame is controlled by the roll angle, while the Y direction is controlled by the pitch angle (Figure 7).

Fig. 7 Definition of the roll, pitch and yaw angles

Twenty seconds worth of data collected in the X direction and Y direction, together with the corresponding control inputs Ux and Uy, are shown in Figures 8 and 9. The experiment set-up is shown in Figure 10.

There are a number of contributing factors, i.e. the poor quality of the integrated cameras and external disturbances such as sunlight and background conditions, which influence the accuracy of this data collection. Despite that, the input-output patterns shown in Figures 8 and 9 indicate that the system can be well represented by linear models, with the motions in the X direction and Y direction decoupled.

Fig. 8 Input (b) and output in pixel (a) data in the X direction
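The output signal defined above, the pixel error of the detected object relative to the frame centre, can be computed from the binary mask of Section 3 with a plain first-moment centroid (standing in for the Gaussian-filtered distribution estimate of Section 3; the function and the synthetic blob are our own illustration):

```python
def pixel_error(mask, width=176, height=144):
    """Centroid of a binary mask (list of rows of 0/1) minus the frame centre.
    Returns (ex, ey) in pixels, or None when no target pixel is detected, in
    which case the hover-and-search fallback of Section 3 applies. The default
    frame size is that of the AR.Drone's bottom camera (176x144)."""
    xs, ys, n = 0.0, 0.0, 0
    for y, row in enumerate(mask):
        for x, bit in enumerate(row):
            if bit:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None
    return xs / n - width / 2.0, ys / n - height / 2.0

# A 4-pixel blob centred at (10.5, 20.5) in a 176x144 frame:
mask = [[0] * 176 for _ in range(144)]
for x, y in [(10, 20), (11, 20), (10, 21), (11, 21)]:
    mask[y][x] = 1
print(pixel_error(mask))  # (-77.5, -51.5): target is left of and above centre
```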


Fig. 9 Input (b) and output in pixel (a) data in the Y direction

Fig. 10 Experiment set-up

Using the Prediction Error Minimization (PEM) method, the state space models for each direction of motion are as follows.

X direction:

\frac{dl_x}{dt} = A_x l_x + B_x u_x, \qquad x = C_x l_x \qquad (1)

Y direction:

\frac{dl_y}{dt} = A_y l_y + B_y u_y, \qquad y = C_y l_y \qquad (2)

in which \{l_x, x, u_x\} and \{l_y, y, u_y\} are the {state variable, measured position, control input} for the X direction and Y direction respectively, and

A_x = \begin{bmatrix} 0.97035 & -0.094315 \\ 0.024882 & 0.95547 \end{bmatrix}, \quad B_x = \begin{bmatrix} 0.0057989 \\ -0.0033453 \end{bmatrix}, \quad C_x = \begin{bmatrix} 391.43 & -7.2361 \end{bmatrix} \qquad (3)

A_y = \begin{bmatrix} 0.9468 & 0.11246 \\ -0.037014 & 0.91153 \end{bmatrix}, \quad B_y = \begin{bmatrix} -0.013124 \\ -0.0056251 \end{bmatrix}, \quad C_y = \begin{bmatrix} -185.89 & -3.2416 \end{bmatrix} \qquad (4)

Figures 11 and 12 show the comparison between the outputs predicted by models (1) and (2) (5-step prediction) and the actual measurements, demonstrating the effectiveness of (1) and (2) in modeling the AR.Drone tracking task in the X-Y coordinates.

Fig. 11 X direction: the model's predicted output (black) superimposed on the actual measurement (blue)

Fig. 12 Y direction: the model's predicted output (black) superimposed on the actual measurement (green)
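The identified models can be exercised directly. The sketch below integrates the X-direction model (1) with the parameters of (3) using a forward-Euler step at the 20 ms sampling period of the data collection; the integration scheme and the constant test command are our own choices, for illustration only:

```python
# X-direction state space model, parameters from (3)
AX = [[0.97035, -0.094315],
      [0.024882, 0.95547]]
BX = [0.0057989, -0.0033453]
CX = [391.43, -7.2361]
DT = 0.02  # 20 ms sampling period

def euler_step(l, u, dt=DT):
    """One forward-Euler step of dl/dt = A l + B u."""
    dl0 = AX[0][0] * l[0] + AX[0][1] * l[1] + BX[0] * u
    dl1 = AX[1][0] * l[0] + AX[1][1] * l[1] + BX[1] * u
    return [l[0] + dt * dl0, l[1] + dt * dl1]

def output(l):
    """Pixel position x = C l."""
    return CX[0] * l[0] + CX[1] * l[1]

# One second of simulated response to a constant roll command u = 0.2:
l = [0.0, 0.0]
for _ in range(50):
    l = euler_step(l, 0.2)
print(round(output(l), 3))  # predicted pixel displacement after 1 s
```

The same loop with the parameters of (4) simulates the Y direction.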


B. Conventional Sliding Mode Controller Design

In order to stabilize the drone so that it tracks the ground object (that is, to keep the ground object close to the center of the acquired image frame), a feedback control algorithm is needed. In this case, the sliding mode control law for the state space model

\frac{dl_i(t)}{dt} = A_i l_i(t) + B_i u_i(t), \qquad \theta_i(t) = C_i l_i(t) \qquad (5)

with l_i(t) \in \{l_x, l_y\}, i \in \{x, y\}, u_i(t) \in \{u_x, u_y\}, is designed as

u_i(t) = \underbrace{(C_i B_i)^{-1} (\dot{\theta}_{ref,i} - C_i A_i l_i)}_{u_{eq,i}(t)} + \underbrace{(C_i B_i)^{-1} K_i \,\mathrm{sign}(s_i)}_{u_{n,i}(t)} \qquad (6)

in which K_i > 0 and e_i = \theta_{ref,i} - \theta_i denotes the tracking error.

The conventional sliding mode control law (6) is robust with respect to both the internal parameter uncertainties and the external disturbances of the system. However, its discontinuous switching signal causes the undesirable chattering phenomenon: oscillation with finite frequency and amplitude. In order to eliminate this phenomenon, higher order sliding mode control based on the nonsingular terminal sliding mode (NTSM) manifold is applied to the system.

C. HTSM Design

For the system represented by (5), in order to achieve better performance in terms of fast convergence and tracking precision, the NTSM manifold [21]-[22] is designed as

s_i = e_i + \beta_i \dot{e}_i^{\,p/q} \qquad (8)

where \beta_i > 0, 1 < p/q < 2, and p, q are odd numbers.

This NTSM manifold (8) is used to realize second order sliding mode control and to eliminate the chattering phenomenon. When s_i reaches zero in finite time, both e_i and \dot{e}_i reach zero in finite time, and the system stays on the second order sliding mode e_i = \dot{e}_i = 0. Here, the HOSM control law is designed based on the following theorem.

Theorem 1. Consider the system represented by (5) with the NTSM manifold chosen as in (8). The control law is designed as

u_i(t) = u_{eq,i}(t) + u_{n,i}(t) \qquad (9)

with

u_{eq,i}(t) = (C_i B_i)^{-1} (\dot{\theta}_{ref,i} - C_i A_i l_i)

u_{n,i}(t) = (C_i B_i)^{-1} \int_0^t \left[ \frac{q}{p \beta_i} \dot{e}_i^{\,2 - p/q} + K_i \,\mathrm{sign}(s_i) + \eta_i s_i \right] d\tau \qquad (10)

where K_i > 0 and \eta_i > 0 are the design parameters.

Proof. Consider the following Lyapunov candidate function:

V_i = 0.5\, s_i^2

Differentiating V_i with respect to time t gives

\dot{V}_i = \dot{s}_i s_i = \left( \dot{e}_i + \beta_i \frac{p}{q} \dot{e}_i^{\,p/q - 1} \ddot{e}_i \right) s_i = \beta_i \frac{p}{q} \dot{e}_i^{\,p/q - 1} \left( \ddot{e}_i + \frac{q}{p \beta_i} \dot{e}_i^{\,2 - p/q} \right) s_i

where, using (5), (9) and (10),

\dot{e}_i = \dot{\theta}_{ref,i} - \dot{\theta}_i = \dot{\theta}_{ref,i} - C_i (A_i l_i + B_i u_i) = \dot{\theta}_{ref,i} - C_i A_i l_i - (\dot{\theta}_{ref,i} - C_i A_i l_i + C_i B_i u_{n,i}) = -C_i B_i u_{n,i}

\ddot{e}_i = -C_i B_i \dot{u}_{n,i} = -\left[ \frac{q}{p \beta_i} \dot{e}_i^{\,2 - p/q} + K_i \,\mathrm{sign}(s_i) + \eta_i s_i \right]

Hence

\dot{V}_i = \beta_i \frac{p}{q} \dot{e}_i^{\,p/q - 1} \left[ -K_i \,\mathrm{sign}(s_i) - \eta_i s_i \right] s_i = -\beta_i \frac{p}{q} \dot{e}_i^{\,p/q - 1} \left( K_i |s_i| + \eta_i s_i^2 \right)

Note that \dot{e}_i^{\,p/q - 1} = (\dot{e}_i^{\,1/q})^{p - q} \ge 0, since (p - q) is an even number. Therefore \dot{V}_i \le 0 for |s_i| \ne 0. There are two cases here. The first case is \dot{e}_i = 0 with s_i \ne 0; then \dot{V}_i = 0, but the state will not stay at the point (\dot{e}_i = 0, s_i \ne 0) and will continue to cross the axis \dot{e}_i = 0 in the phase plane (e_i, \dot{e}_i) [22]. The second case is \dot{e}_i \ne 0, for which

\dot{V}_i = -\beta_i \frac{p}{q} \dot{e}_i^{\,p/q - 1} \left( K_i |s_i| + \eta_i s_i^2 \right) < 0

The study in [22] has shown that the system state reaches the sliding manifold in finite time. Let t_{r,i} be the time when s_i reaches zero from s_i(0) \ne 0; s_i then stays at zero, and from (8) it can be seen that e_i and \dot{e}_i converge to zero within finite time. Thus, the total time from s_i(0) \ne 0 to e_i(t_{si}) = 0 is calculated as [22]:

t_{si} = t_{r,i} + \frac{p}{p - q}\, \beta_i^{\,q/p}\, | e_i(t_{r,i}) |^{(p - q)/p} \qquad (11)

This completes the proof.
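For illustration, the control law (9)-(10) can be implemented per axis as a small discrete-time update. Fractional powers of the signed velocity error are evaluated as sign(x)|x|^r (well defined here because p and q are odd), the integral in (10) is approximated by Euler summation at the 20 ms sampling period, and the gains are those reported with Fig. 15; the class layout, the choice p=5, q=3, and the scalar cb are our own sketch, not the authors' code:

```python
import math

def fpow(x: float, r: float) -> float:
    """Signed fractional power sign(x)*|x|**r, consistent with e^(p/q) for odd p, q."""
    return math.copysign(abs(x) ** r, x)

class HTSMAxis:
    """One axis of the hybrid terminal sliding mode law (9)-(10).

    p=5, q=3 is an example pair satisfying 1 < p/q < 2 with p, q odd; cb
    stands for the scalar C_i B_i of the identified model (1 here purely
    for illustration)."""

    def __init__(self, p=5, q=3, beta=1.0, K=50.0, eta=0.5, dt=0.02, cb=1.0):
        self.p, self.q = p, q
        self.beta, self.K, self.eta = beta, K, eta
        self.dt, self.cb = dt, cb
        self.integral = 0.0  # running Euler approximation of the integral in (10)

    def update(self, e: float, e_dot: float, u_eq: float = 0.0) -> float:
        """e, e_dot: tracking error and its derivative; u_eq: the equivalent
        control term (C B)^-1 (theta_dot_ref - C A l), supplied by the model."""
        r = self.p / self.q
        s = e + self.beta * fpow(e_dot, r)              # NTSM manifold (8)
        integrand = (self.q / (self.p * self.beta)) * fpow(e_dot, 2.0 - r) \
                    + self.K * math.copysign(1.0, s) \
                    + self.eta * s
        self.integral += integrand * self.dt
        return u_eq + self.integral / self.cb           # control law (9)-(10)

ctrl = HTSMAxis()
print(ctrl.update(e=1.0, e_dot=0.5) > 0)  # True: a continuous, non-switching command
```

Because the switching term sits under the integral, the command u itself stays continuous, which is exactly the chattering-elimination mechanism proved above.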

V. EXPERIMENTAL RESULTS

Fig. 13 Stationary ground target tracking using the HTSM controller: AR.Drone in blue, target in yellow lines.

Fig. 14 Stationary ground target tracking using the conventional sliding mode controller: AR.Drone in blue, target in yellow lines.

Here, in order to facilitate the comparison, both the HTSM and the conventional sliding mode controller are implemented in the tracking of both stationary and moving targets. As shown in Figures 13 and 14, HTSM control provides much better performance than conventional sliding mode control.

Figure 15 compares the tracking performance of the HTSM controller with those of the conventional sliding mode and classical PD controllers in the tracking of a moving ground target (where the ground target moves along a triangular trajectory).

Fig. 15 Tracking of a moving ground target: the target's trajectory in yellow; AR.Drone's trajectory with HTSM (\beta_x = \beta_y = 1, K_x = K_y = 50, \eta_x = \eta_y = 0.5) in blue, superimposed on the conventional SM and PD control performance [23] in red and green respectively.

During this exercise, in order to facilitate the comparison between the different methods, the Mean-Squared-Error (MSE) measure is used. The MSEs for HTSM, conventional SM and PD are 8.96 cm, 15.62 cm and 13.14 cm respectively. This quantifies the superiority and merits of HTSM over the other two controllers under study, in terms of both smooth and precise control performance and robustness to external disturbance and internal model parameter uncertainty.

VI. CONCLUSION

A systematic approach to the vision based ground target tracking problem of the AR.Drone, using the so-called Hybrid Terminal Sliding Mode control system, has been presented in this paper. The hardware constraints of the AR.Drone's integrated vision system, in turn, create a very interesting control problem which involves internal uncertainty, noise and external disturbance.

The experimental results prove the superiority of HTSM over the conventional SM and classical PD controllers in providing smooth, precise tracking performance and robustness against noise, disturbance and internal parameter uncertainty.

REFERENCES

[1] R. W. Beard, "Multiple UAV cooperative search under collision avoidance and limited range communication constraints," in Proc. 2003 IEEE Conference on Decision and Control (CDC), 2003, pp. 25-30.
[2] T. Tomic, K. Schmid, P. Lutz, A. Domel, M. Kassecker, E. Mair, I. L. Grixa, F. Ruess, M. Suppa, and D. Burschka, "Toward a fully autonomous UAV: research platform for indoor and outdoor urban search and rescue," IEEE Robotics & Automation Magazine, 2012, Vol. 19, No. 3, pp. 46-56.


[3] E. Altug, J. P. Ostrowski, and R. Mahony, "Control of a quadrotor helicopter using visual feedback," in Proc. IEEE International Conference on Robotics and Automation, 2002, pp. 72-77.
[4] J. H. Gillula and C. J. Tomlin, "Guaranteed safe online learning via reachability: tracking a ground target using a quadrotor," in Proc. 2012 IEEE International Conference on Robotics and Automation, 2012, pp. 2723-2730.
[5] T. Krajnik, V. Vonasek, D. Fiser, and J. Faigl, "AR-Drone as a platform for robotic research and education," in Proc. 2011 Communications in Computer and Information Science (CCIS), 2011, pp. 172-186.
[6] C. Bills, J. Chen, and A. Saxena, "Autonomous MAV flight in indoor environments using single image perspective cues," in Proc. 2011 IEEE Intl. Conf. on Robotics and Automation (ICRA), 2011, pp. 5776-5783.
[7] J. Engel, J. Sturm, and D. Cremers, "Camera-based navigation of a low-cost quadrocopter," in Proc. 2012 International Conference on Intelligent Robot Systems (IROS), 2012, pp. 2815-2821.
[8] T. Tomic, K. Schmid, P. Lutz, A. Domel, M. Kassecker, E. Mair, I. L. Grixa, F. Ruess, M. Suppa, and D. Burschka, "Toward a fully autonomous UAV: research platform for indoor and outdoor urban search and rescue," IEEE Robotics & Automation Magazine, 2012, Vol. 19, No. 3, pp. 46-56.
[9] M. Turpin, N. Michael, and V. Kumar, "Trajectory design and control for aggressive formation flight with quadrotors," Autonomous Robots, 2012, Vol. 33, No. 1, pp. 143-156.
[10] D. Comaniciu, V. Ramesh, and P. Meer, "Real-time tracking of non-rigid objects using mean shift," in Proc. 2000 IEEE Conference on Computer Vision and Pattern Recognition, 2000, pp. 142-149.
[11] J. Xiao, C. Yang, F. Han, H. Cheng, and Sarnoff Corporation, "Vehicle and person tracking in aerial videos," Multimodal Technologies for Perception of Humans, 2008, pp. 203-214.
[12] S. Zhang, "Object tracking in unmanned aerial vehicle (UAV) videos using a combined approach," in Proc. 2005 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2005, pp. 681-684.
[13] S. W. Zhang, X. A. Wang, Z. L. Liu, and M. V. Srinivasan, "Visual tracking of moving targets by freely flying honeybees," Visual Neuroscience, 1990, pp. 379-386.
[14] S. Saripalli, J. F. Montgomery, and G. S. Sukhatme, "Visually guided landing of an unmanned aerial vehicle," IEEE Transactions on Robotics and Automation, 2003, pp. 371-381.
[15] Y. Sheikh and M. Shah, "Object tracking across multiple independently moving airborne cameras," in Proc. 2005 IEEE International Conference on Computer Vision, 2005, pp. 10-17.
[16] K. Paponpen and M. Konghirun, "An improved sliding mode observer for speed sensorless vector control drive of PMSM," in Proc. IEEE IPEMC, Aug. 14-16, 2006, vol. 2, pp. 1-5.
[17] G. Bartolini, A. Ferrara, and E. Usani, "Chattering avoidance by second-order sliding mode control," IEEE Trans. Autom. Control, vol. 43, no. 2, pp. 241-246, Feb. 1998.
[18] A. Levant, "Higher-order sliding modes, differentiation and output-feedback control," Int. J. Control, vol. 76, no. 9/10, pp. 924-941, Jun./Jul. 2003.
[19] D. Traore, F. Plestan, A. Glumineau, and J. de Leon, "Sensorless induction motor: High-order sliding-mode controller and adaptive interconnected observer," IEEE Trans. Ind. Electron., vol. 55, no. 11, pp. 3818-3827, Nov. 2008.
[20] A. Pisano, A. Davila, L. Fridman, and E. Usai, "Cascade control of PM DC drives via second-order sliding-mode technique," IEEE Trans. Ind. Electron., vol. 55, no. 11, pp. 3846-3854, Nov. 2008.
[21] Y. Feng, J. Feng, X. Yu, and N. V. Truong, "Hybrid terminal sliding-mode observer design method for a permanent-magnet synchronous motor control system," IEEE Trans. Industrial Electronics, Vol. 56, No. 9, 2009, pp. 3424-3431.
[22] Y. Feng, X. H. Yu, and Z. H. Man, "Non-singular adaptive terminal sliding mode control of rigid manipulators," Automatica, vol. 38, no. 12, pp. 2159-2167, Dec. 2002.
[23] C. T. Dang, H. T. Pham, T. B. Pham, and N. V. Truong, "Vision based ground target tracking using AR.Drone quadrotor," to appear in Proc. 2nd International Conf. Control, Automation and Information Sciences (ICCAIS 2013), Nha Trang, Vietnam, November 2013.
