College of Civil Aviation, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China
School of Information and Communication, Guilin University of Electronic Technology, Guilin 541004, China
Received 24 January 2011; revised 30 March 2011; accepted 8 August 2011
Abstract
The lack of an autonomous aerial refueling capability is one of the greatest limitations of unmanned aerial vehicles. This paper discusses the vision-based estimation of the relative pose of a tanker and an unmanned aerial vehicle, which is a key issue in autonomous aerial refueling. The main task of this paper is to study relative pose estimation for a tanker and an unmanned aerial vehicle in the phase of commencing refueling and during refueling. The employed algorithm comprises an initialization of the orientation parameters and an orthogonal iteration algorithm that estimates the optimal rotation matrix and translation vector. In simulation experiments, because the rotation angle varies only slightly during aerial refueling, the method that takes the identity matrix as the initial rotation matrix is found to be the most stable and accurate among the tested methods. Finally, the paper discusses the effects of the number and configuration of feature points on the accuracy of the estimation results when using this method.
Keywords: unmanned aerial vehicles; aerial refueling; pose estimation; machine vision; orthogonal iteration algorithm
1. Introduction
Autonomous aerial refueling (AAR) is an important
direction in the future development of unmanned aerial vehicles (UAVs) [1]. Currently, there are two main
methods of aerial refueling corresponding to two
hardware configurations. The first is the use of a Boeing flying boom, employed only by the United States
Air Force, and the other is the use of a probe and drogue, which is the standard for the United States Navy
and the air forces of most other nations [2]. The
frameworks are similar for the two configurations: the UAV must approach the tanker, and before refueling begins, the UAV and the tanker must hold a stable relative configuration. For this purpose, sufficiently accurate and reliable sensors and corresponding measurement algorithms are needed to estimate the relative pose between the tanker and the UAV from the pre-contact phase to the contact phase and during refueling [3-4].
Although several sensors and corresponding approaches have been considered for estimating the relative pose between a UAV and a tanker, including a global positioning system (GPS) device and laser and infrared radar [5], there are limitations associated with their use, including insufficient accuracy and reliability [6-7]. For example, in UAV formation flying, a GPS device can provide relative measurements with an accuracy of 2 cm, but problems with lock-on, integer ambiguity, and low bandwidth make its application to in-flight refueling challenging [8].
in-flight refueling [8]. Previous works have demon-
can be computed from the coordinates of the projections of the feature points (FPs) in the image plane and the locations of the FPs on the surface of the tanker. Fig. 1 is a block diagram of the vision-based relative pose estimation algorithm. The algorithm comprises two parts: initialization of the orientation parameters, and the OI algorithm, which estimates the optimal rotation matrix R and translation vector t (see Fig. 2).
Fig. 1  Block diagram of the vision-based relative pose estimation algorithm.
Fig. 2  Flow of the OI algorithm.
Fig. 3  Definition of the reference frames.
Let the coordinates of the i-th FP in the world (tanker) reference frame and in the camera reference frame be

  P_i^w = [x_i^w  y_i^w  z_i^w]^T    (i = 1, 2, ..., n; n ≥ 6)    (1)

  Q_i^c = [x_i^c  y_i^c  z_i^c]^T    (i = 1, 2, ..., n; n ≥ 6)    (2)
The two coordinate vectors are related by the rigid transformation

  Q_i^c = R P_i^w + t    (3)

where the rotation matrix R is determined by the relative Euler angles (roll φ, pitch θ, yaw ψ), with entries

  R = [ cosθcosψ                    cosθsinψ                    −sinθ
        sinφsinθcosψ − cosφsinψ    sinφsinθsinψ + cosφcosψ    sinφcosθ
        cosφsinθcosψ + sinφsinψ    cosφsinθsinψ − sinφcosψ    cosφcosθ ]
The optimal R and t minimize the mean squared error of Eq. (3):

  e²(R, t) = (1/n) Σ_{i=1}^n || Q_i^c − (R P_i^w + t) ||²    s.t.  R R^T = I,  det(R) = 1    (4)

With the mean vectors P̄ and Q̄ of the two point sets, the optimal translation is

  t = Q̄ − R P̄    (5)

and the minimum residual is

  e² = σ_P² + σ_Q² − 2 tr(DS)    (6)

where σ_P² and σ_Q² are the variances around the mean vectors of {P_i^w} and {Q_i^c} respectively. Let the singular value decomposition of the covariance matrix Σ_PQ of the two point sets be U D V^T. When rank(Σ_PQ) ≥ m − 1 (here m = 3), the optimal rotation is determined uniquely as

  R = U S V^T    (7)

where

  S = I                       if det(Σ_PQ) ≥ 0
  S = diag(1, ..., 1, −1)     if det(Σ_PQ) < 0    (8)
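The closed-form solution of Eqs. (4)-(8) can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the function name and the n×3 array layout are assumptions, and the reflection test uses det(U)det(V), the usual sign-robust equivalent of the det(Σ_PQ) test in Eq. (8).

```python
import numpy as np

def absolute_orientation(P_w, Q_c):
    """Closed-form R, t minimizing sum ||Q_i - (R P_i + t)||^2, cf. Eqs. (4)-(8).

    P_w, Q_c: n x 3 arrays of corresponding points (world and camera frame)."""
    P_bar = P_w.mean(axis=0)                 # mean vector of {P_i^w}
    Q_bar = Q_c.mean(axis=0)                 # mean vector of {Q_i^c}
    # covariance matrix Sigma_PQ = (1/n) sum (Q_i - Q_bar)(P_i - P_bar)^T
    Sigma = (Q_c - Q_bar).T @ (P_w - P_bar) / len(P_w)
    U, D, Vt = np.linalg.svd(Sigma)
    # S corrects a possible reflection so that det(R) = +1  (Eq. (8))
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0
    R = U @ S @ Vt                           # Eq. (7)
    t = Q_bar - R @ P_bar                    # Eq. (5)
    return R, t
```

With noise-free correspondences this recovers the exact rigid transform; with noisy data it returns the least-squares optimum of Eq. (4).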
Write R = [r_1^T  r_2^T  r_3^T]^T, i.e. r_j denotes the j-th row of R. Under the pinhole model, the normalized image coordinates of the i-th FP are

  u_i = (r_1 P_i^w + t_x) / (r_3 P_i^w + t_z),   v_i = (r_2 P_i^w + t_y) / (r_3 P_i^w + t_z)    (9)

Hence the homogeneous image point is

  p_i = [u_i  v_i  1]^T = (1 / (r_3 P_i^w + t_z)) (R P_i^w + t)    (10)
The OI algorithm minimizes the object-space collinearity error

  E(R, t) = Σ_{i=1}^n || (I − V_i)(R P_i^w + t) ||²    (11)

where V_i is the line-of-sight projection matrix defined by the observed image point p_i:

  V_i = (p_i p_i^T) / (p_i^T p_i)    (12)

For a fixed R, the optimal translation has the closed form

  t(R) = (1/n) ( I − (1/n) Σ_{i=1}^n V_i )^{-1} Σ_{i=1}^n (V_i − I) R P_i^w    (13)

Each iteration computes t(R) from the current R, forms the points q_i = V_i (R P_i^w + t), and updates R by solving the absolute orientation problem with Eqs. (7)-(8). When pixel coordinates are measured, the image points are first normalized with the camera intrinsic matrix

  K = [ f_x  0    u_0
        0    f_y  v_0
        0    0    1  ]    (14)
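The OI loop of Eqs. (11)-(13) can be sketched as below. This is an illustrative sketch under the same assumptions as before (n×3 point arrays, normalized homogeneous image points); the helper names and the fixed iteration count are choices of this sketch, not of the paper, and the rotation update reuses the SVD step of Eqs. (7)-(8).

```python
import numpy as np

def absolute_rotation(P, Q):
    """Rotation part of the absolute orientation step, cf. Eqs. (7)-(8)."""
    Sigma = (Q - Q.mean(axis=0)).T @ (P - P.mean(axis=0))
    U, _, Vt = np.linalg.svd(Sigma)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0
    return U @ S @ Vt

def oi_pose(P_w, p_img, R0=None, n_iter=100):
    """Orthogonal iteration, cf. Eqs. (11)-(13).

    P_w: n x 3 world FPs; p_img: n x 3 normalized image points [u, v, 1]."""
    n = len(P_w)
    I = np.eye(3)
    # line-of-sight projection matrices V_i = p p^T / (p^T p)   (Eq. (12))
    V = np.stack([np.outer(p, p) / (p @ p) for p in p_img])
    F = np.linalg.inv(I - V.mean(axis=0)) / n    # prefactor of Eq. (13)

    def t_of_R(R):
        # closed-form optimal translation for fixed R   (Eq. (13))
        return F @ sum((Vi - I) @ (R @ Pi) for Vi, Pi in zip(V, P_w))

    R = I if R0 is None else R0                  # identity initialization
    for _ in range(n_iter):
        t = t_of_R(R)
        # back-projected camera-frame points q_i = V_i (R P_i + t)
        Q = np.stack([Vi @ (R @ Pi + t) for Vi, Pi in zip(V, P_w)])
        R = absolute_rotation(P_w, Q)            # rotation update
    return R, t_of_R(R)
```

With R0 = I and the small relative attitudes of the refueling scenario, the error of Eq. (11) decreases monotonically and the loop settles on the optimal pose.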
For the initialization, the projection matrix P = K[R  t] is estimated linearly from the FP correspondences. Let P^j denote the j-th row of P and X_i^w = [x_i^w  y_i^w  z_i^w  1]^T the homogeneous world coordinates of the i-th FP. Each FP with image coordinates (u_i, v_i) contributes

  [ (X_i^w)^T   0_{1×4}     −u_i (X_i^w)^T ]  p = 0    (15)
  [ 0_{1×4}     (X_i^w)^T   −v_i (X_i^w)^T ]

where p = [P^1  P^2  P^3]^T. Stacking the equations of all FPs gives the homogeneous linear system

  A p = 0    (16)

which is solved in the least-squares sense by the singular value decomposition of A. Since

  P = K [R  t] = [K R   K t]

we have

  K R = [p_1  p_2  p_3] = M    (17)

where p_1, p_2, p_3 are the first three columns of P; the initial rotation is then obtained from R = K^{-1} M, orthogonalized onto a rotation matrix.
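The DLT initialization of Eqs. (15)-(17) can be sketched as below, assuming noise-free pixel coordinates. The function name is illustrative, and the final step (projecting K^{-1}M onto the nearest rotation via SVD) is a standard orthogonalization whose details the text does not spell out.

```python
import numpy as np

def dlt_init(K, P_w, uv):
    """Linear initialization of R, cf. Eqs. (15)-(17).

    K: 3 x 3 intrinsic matrix; P_w: n x 3 world FPs; uv: n x 2 pixels, n >= 6."""
    rows = []
    for (x, y, z), (u, v) in zip(P_w, uv):
        X = np.array([x, y, z, 1.0])         # homogeneous world point
        Z = np.zeros(4)
        rows.append(np.hstack([X, Z, -u * X]))   # first row of Eq. (15)
        rows.append(np.hstack([Z, X, -v * X]))   # second row of Eq. (15)
    A = np.asarray(rows)
    # least-squares null vector of A: right singular vector of the
    # smallest singular value  (Eq. (16))
    p = np.linalg.svd(A)[2][-1]
    P_mat = p.reshape(3, 4)
    M = P_mat[:, :3]                         # M = K R up to scale  (Eq. (17))
    R_raw = np.linalg.inv(K) @ M
    # orthogonalize: nearest rotation to R_raw, fixing sign so det(R) = +1
    U, _, Vt = np.linalg.svd(R_raw)
    R = U @ Vt
    if np.linalg.det(R) < 0:
        R = -R
    return R
```

Because p is recovered only up to scale and sign, the determinant check at the end resolves the sign ambiguity.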
4. Simulation experiments

This section considers the relationship between the estimation error and the number of FPs, the accuracy and robustness of the pose estimation, and the relationship between the distribution of the FPs and the estimation results.
In the simulation experiment conducted in this paper, the reference frames are defined as those in Fig. 3
and the intrinsic matrix of the camera K is
  K = [ 571  0    254
        0    571  206
        0    0    1   ]
In Eqs. (4)-(8), the mean vectors, variances and covariance matrix of the two point sets are

  P̄ = (1/n) Σ_{i=1}^n P_i^w,   Q̄ = (1/n) Σ_{i=1}^n Q_i^c

  σ_P² = (1/n) Σ_{i=1}^n || P_i^w − P̄ ||²,   σ_Q² = (1/n) Σ_{i=1}^n || Q_i^c − Q̄ ||²

  Σ_PQ = (1/n) Σ_{i=1}^n (Q_i^c − Q̄)(P_i^w − P̄)^T

4.1. Error dependence on the number of FPs

Fig. 4

Table 1  World coordinates of the FPs

  FP    x_i^w /m    y_i^w /m    z_i^w /m
  1     3.70        -7.130      0.7
  2     3.70         7.130      0.7
  3     7.79        -2.689      0.7
  4     7.79         2.689      0.7
  5     7.96         3.189
Fig. 5
The estimation error does not change significantly as the number of FPs increases. Note that the FPs are the image-plane projections of the intensity output of beacons (such as LED beacons) fixed on the wings, horizontal tails and vertical tail of the tanker [14]. Therefore, increasing the number of FPs reduces the imperceptibility of the tanker; it also increases the computational complexity and further degrades real-time performance. The following simulation experiments therefore select only six FPs for pose estimation.
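The simulation setup described above can be reproduced approximately as below. The relative pose values and the 0.5-pixel noise level are hypothetical placeholders (the paper's exact values are not given in this excerpt), and only the four FPs of Table 1 whose coordinates survive intact are used.

```python
import numpy as np

# Camera intrinsic matrix from the simulation section
K = np.array([[571.0, 0.0, 254.0],
              [0.0, 571.0, 206.0],
              [0.0,   0.0,   1.0]])

# The four fully legible FPs of Table 1 (units: m)
P_w = np.array([[3.70, -7.130, 0.7],
                [3.70,  7.130, 0.7],
                [7.79, -2.689, 0.7],
                [7.79,  2.689, 0.7]])

def project(P_w, R, t, K, sigma_px=0.0, rng=None):
    """Project world FPs to pixel coordinates, optionally adding pixel noise."""
    Q_c = P_w @ R.T + t                  # FPs in the camera frame
    uvw = Q_c @ K.T                      # homogeneous pixel coordinates
    uv = uvw[:, :2] / uvw[:, 2:3]
    if sigma_px > 0.0:
        rng = rng or np.random.default_rng()
        uv = uv + rng.normal(scale=sigma_px, size=uv.shape)
    return uv

# Hypothetical relative attitude (within [-5 deg, 5 deg]) and stand-off distance
th = np.deg2rad(2.0)
R = np.array([[np.cos(th), -np.sin(th), 0.0],
              [np.sin(th),  np.cos(th), 0.0],
              [0.0,         0.0,        1.0]])
t = np.array([0.0, 0.0, 30.0])

uv_clean = project(P_w, R, t, K)
uv_noisy = project(P_w, R, t, K, sigma_px=0.5, rng=np.random.default_rng(0))
```

Feeding such noisy projections to the initialization and OI steps allows the error statistics of the experiments to be reproduced under the stated assumptions.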
4.2. Error dependence on the configuration of FPs
Fig. 6
Fig. 7
Fig. 8
Fig. 9
5. Conclusions
This paper describes an approach for pose estimation with an OI algorithm. The approach is implemented and tested within a simulation environment for
the machine vision-based AAR problem. The following conclusions are drawn.
(1) Because the relative rotation angles between the UAV and the tanker vary only within [−5°, 5°] in AAR, the OI-based method that takes the identity matrix as the initial rotation matrix R is the most stable and accurate among the tested algorithms.
(2) For a given number of FPs and noise level, the estimation error decreases as the separation between the FPs increases.
(3) The number of FPs has no significant effect on the precision of the pose estimation.
References
[1]
Campa G, Napolitano M R, Perhinschi M, et al. Addressing pose estimation issues for machine vision
based UAV autonomous serial refuelling. Aeronautical
Journal 2007; 111(1120): 389-396.
[2] Ro K, Basaran E, Kamman J W. Aerodynamic characteristics of paradrogue assembly in an aerial refueling
system. Journal of Aircraft 2007; 44(3): 963-970.
[3] Mammarella M, Campa G, Napolitano R M, et al.
Comparison of point matching algorithms for the UAV
aerial refueling problem. Machine Vision and Applications 2010; 21: 241-251.
[4] Tandale D M, Bowers R, Valasek J. Trajectory tracking
controller for vision-based probe and drogue autonomous aerial refueling. Journal of Guidance, Control,
and Dynamics 2006; 29(4): 846-857.
[5] Williamson W R, Glenn G J, Dang V T, et al. Sensor fusion applied to autonomous aerial refueling. Journal of Guidance, Control, and Dynamics 2009; 32(1): 262-275.
[6] Campa G, Seanor B, Perhinschi M, et al. Autonomous
aerial refueling for UAVs using a combined GPS-machine vision guidance. AIAA Guidance, Navigation,
and Control Conference. 2004; 1-11.
[7] Mati R, Pollini L, Lunghi A, et al. Vision-based
autonomous probe and drogue aerial refueling. 14th
Mediterranean Conference on Control and Automation.
2006; 1-6.
[8] Korbly R, Sensong L. Relative attitudes for automatic
docking. Journal of Guidance, Control, and Dynamics
1983; 6(3): 213-215.
[9] Campa G, Napolitano M R, Fravolini M. Simulation
environment for machine vision based aerial refueling
for UAVs. IEEE Transactions on Aerospace and Electronic Systems 2009; 45(1): 138-151.
[10] Ding M, Cao Y F, Wu Q X. Method of passive image
based crater autonomous detection. Chinese Journal of
Aeronautics 2009; 22(3): 301-306.
[11] Ding M, Cao Y F, Guo L. A method to recognize and
track runway in the image sequences based on template
matching. 1st International Symposium on Systems
and Control in Aerospace and Astronautics. 2006;
1218-1221.
[12] Valasek J, Gunnam K, Kimmett K, et al. Vision-based sensor and navigation system for autonomous air refueling.
Biographies:
DING Meng Born in 1981, he received B.S., M.S. and
Ph.D. degrees from Nanjing University of Aeronautics and
Astronautics (NUAA) in 2003, 2006 and 2010 respectively.
Currently, he is a lecturer at NUAA. His main research interests are machine vision, guidance, navigation and control.
E-mail: nuaa_dm@hotmail.com
WEI Li Born in 1982, she received B.S. degree from Nanjing University of Posts and Telecommunications in 2004.
Currently, she is a postgraduate student at Guilin University of Electronic Technology. Her main research interests are machine vision measurement and signal processing.
E-mail: spvivi@126.com
WANG Bangfeng Born in 1969, he received the Ph.D. degree from Northeastern University in 1999. Currently, he is a professor at NUAA. His main research interests are intelligent traffic, guidance, navigation and control.
E-mail: bfwang@nuaa.edu.cn