1. Introduction
The increased autonomy of robots is directly linked to
their capability to perceive their environment.
Simultaneous Localization and Mapping (SLAM)
techniques, which associate perception and movement,
are particularly interesting because they provide
advanced autonomy to vehicles such as robots. In
outdoor environments, climatic constraints or the operational context (dust, light conditions, rain, etc.) show the limits of usual perception systems. This work proposes radar as an alternative to existing perception systems for ground mobile robots. Its ability to work in all weather, day and night, makes radar very attractive for outdoor applications. The combined use of radar with SLAM techniques in the field of mobile ground robotics is the main contribution of the presented work.

Damien Vivet, Franck Grossier, Paul Checchin, Laurent Trassoudaine and Roland Chapuis:
Mobile Ground-Based Radar Sensor for Localization and Mapping: An Evaluation of two Approaches
Int. j. adv. robot. syst., 2013, Vol. 10, 307:2013
www.intechopen.com
per second monitoring around the vehicle, with an angular resolution of 3°, in the 3-100 m range. A general view of the radar is presented in Fig. 1 and its main characteristics are listed in Table 1. An example of two radar images is presented in Fig. 2. Variations of shading indicate variations of amplitude in the power spectra.
4. FMT-RSLAM: Fourier-Mellin transform based radar SLAM
In order to overcome the complexity of radar image analysis, a trajectory-oriented EKF-SLAM technique using data from the 360° field-of-view radar sensor has been developed. This process makes no landmark assumptions and avoids the data association problem. The method of ego-motion estimation makes use of the Fourier-Mellin transform for registering radar images in a sequence, from which the rotation and translation of the sensor motion can be estimated [31]. In the context of scan-matching SLAM, the use of the Fourier-Mellin transform is original and provides an accurate and efficient way of computing the rigid transformation between consecutive scans.
4.1 Radar scan matching SLAM
The formulation used in our SLAM approach is to estimate the vehicle trajectory defined by the estimated state

$x(k) = \left[ x_{v_0}^T, \ldots, x_{v_k}^T \right]^T,$

where $x_{v_i} = \left[ x_i, y_i, \theta_i \right]^T$ is the state vector describing the location and orientation of the vehicle at time $t_i$. There is no explicit map; rather, each pose estimate has an associated scan of raw sensed data that can next be aligned to form a global map.
Scan matching is the process of translating and rotating a radar scan such that a maximal overlap with another scan emerges. Assuming the distribution of alignment errors is approximately Gaussian, a new vehicle pose is added to the SLAM map by only adding the pose to the SLAM state vector.
Figure 1. The K2Pi FMCW radar

Carrier frequency F0: 24 GHz
Transmitter power Pt: 20 dBm
Antenna gain G: 20 dB
Bandwidth: 250 MHz
Scanning rate: 1 Hz
Angular resolution: 3°
Angular precision: 0.1°
Range min/max: 3 m / 100 m
Distance resolution: 0.6 m
Distance precision (canonical target at 100 m): 0.05 m
Size (length × width × height): 27 × 24 × 30 cm
Weight: 10 kg

Table 1. Characteristics of the K2Pi FMCW radar

Figure 2. Two consecutive radar images ((a) & (b)) obtained with the FMCW radar sensor (see Fig. 1)
$z_{i,j}(k+1|k) = h\left( x(k+1|k) \right) = x_{v_j}(k+1|k) \ominus x_{v_i}(k+1|k),$

where the operator $\ominus$ is the inverse transformation operator.
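For planar poses $(x, y, \theta)$, the inverse transformation operator and its composition counterpart can be sketched as follows. This is a minimal illustration of the standard operators, not the authors' implementation; the function names are hypothetical.

```python
import math

def ominus_compose(xi, xj):
    """Relative pose xj (-) xi: express pose xj in the frame of pose xi.
    Poses are (x, y, theta) tuples; this sketches the inverse
    transformation operator of the observation model."""
    dx, dy = xj[0] - xi[0], xj[1] - xi[1]
    c, s = math.cos(xi[2]), math.sin(xi[2])
    return (c * dx + s * dy, -s * dx + c * dy, xj[2] - xi[2])

def oplus_compose(xi, rel):
    """Pose composition xi (+) rel: applies a relative pose in the frame
    of xi; inverse of the operator above."""
    c, s = math.cos(xi[2]), math.sin(xi[2])
    return (xi[0] + c * rel[0] - s * rel[1],
            xi[1] + s * rel[0] + c * rel[1],
            xi[2] + rel[2])
```

Composing a pose with its relative transformation recovers the original pose, which is what lets each scan-matching result be chained into a trajectory.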
$\mathrm{Corr}(w_x, w_y) = \frac{\hat{I}_1^*(w_x, w_y)\, \hat{I}_2(w_x, w_y)}{\left| \hat{I}_1^*(w_x, w_y)\, \hat{I}_2(w_x, w_y) \right|} = e^{-i(w_x \Delta x + w_y \Delta y)}. \qquad (1)$
Taking the inverse Fourier transform:

$\mathrm{Corr}(x, y) = F^{-1}\left[ \mathrm{Corr}(w_x, w_y) \right] = \delta(x - \Delta x, y - \Delta y),$

which means that $\mathrm{Corr}(x, y)$ is non-zero only at $(\Delta x, \Delta y) = \arg\max_{(x,y)} \mathrm{Corr}(x, y)$. If the two images differ by a rotational movement $\theta_0$ with translation $(\Delta x, \Delta y)$, then

$I_2(x, y) = I_1(x \cos\theta_0 + y \sin\theta_0 - \Delta x,\; -x \sin\theta_0 + y \cos\theta_0 - \Delta y).$
Converting from rectangular coordinates to polar coordinates makes it possible to represent rotation as a shift: the Fourier transform in polar coordinates is

$\hat{I}_2(\rho, \theta) = e^{-i(w_x \Delta x + w_y \Delta y)}\, \hat{I}_1(\rho, \theta - \theta_0).$
Introducing the magnitudes of $\hat{I}_1$ and $\hat{I}_2$ (denoted $|\hat{I}_1|$ and $|\hat{I}_2|$ respectively), they are related by

$|\hat{I}_2|(\rho, \theta) = |\hat{I}_1|(\rho, \theta - \theta_0).$

The shift between the two images can now be resolved using (1).
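A quick 1D numerical check of (1), assuming circular shifts and numpy: the normalized cross-spectrum reduces to a pure phase ramp, and its inverse FFT is a delta located at the displacement.

```python
import numpy as np

# 1D check of (1): for s2(x) = s1(x - d) (circular), the normalized
# cross-spectrum conj(F1) * F2 reduces to the phase ramp e^{-i w d},
# and its inverse FFT is a delta located at x = d.
rng = np.random.default_rng(1)
s1 = rng.random(128)
d = 17
s2 = np.roll(s1, d)                   # s2(x) = s1(x - d), circularly

f1, f2 = np.fft.fft(s1), np.fft.fft(s2)
cross = np.conj(f1) * f2
cross /= np.abs(cross) + 1e-12        # keep only the phase term
corr = np.fft.ifft(cross).real        # delta-like peak at the shift
shift = int(np.argmax(corr))
```

The peak location `shift` recovers `d` exactly for integer circular shifts; sub-pixel shifts would require interpolating around the peak.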
4.3 Scan registration
In order to perform a scan registration algorithm, the Fourier-Mellin transform (FMT) has been chosen [34][31]. The FMT is a global method that takes the contributions from all points in the images into account, in order to provide a way to recover all rigid transformation parameters, i.e., rotation and translation. It is an efficient and accurate method to process a couple of images that are fairly similar (see Fig. 2). The steps of the scan registration algorithm are presented in Algorithm 1.
Input: Radar images $I_k$ and $I_{k+1}$
1. Apply the thresholding filter to eliminate the speckle noise in both images
2. Apply the FFT to images $I_k$ and $I_{k+1}$
3. Compute the magnitudes $|\hat{I}_k|$ and $|\hat{I}_{k+1}|$
4. Transform the resulting values from rectangular to polar coordinates: $|\hat{I}_k|(\rho, \theta)$, $|\hat{I}_{k+1}|(\rho, \theta)$
5. Apply the FFT to the polar images (a bilinear interpolation is used): $\hat{I}_k(w_\rho, w_\theta)$, $\hat{I}_{k+1}(w_\rho, w_\theta)$
6. Compute $\mathrm{Corr}(w_\rho, w_\theta)$ between $\hat{I}_k(w_\rho, w_\theta)$ and $\hat{I}_{k+1}(w_\rho, w_\theta)$ using (1)
7. Compute $\mathrm{Corr}(\rho, \theta) = F^{-1}\left[ \mathrm{Corr}(w_\rho, w_\theta) \right]$
8. Find the location of the maximum of $\mathrm{Corr}(\rho, \theta)$ and obtain the rotation value $\theta_0$
9. Construct $I^r_{k+1}$ by applying the reverse rotation to $I_{k+1}$
10. Apply the FFT to image $I^r_{k+1}$
11. Compute the correlation $\mathrm{Corr}(w_x, w_y)$ using (1)
12. Take the inverse FFT $\mathrm{Corr}(x, y)$ of $\mathrm{Corr}(w_x, w_y)$
13. return the values $(\Delta x, \Delta y)$ of the shift and the rotation value $\theta_0$

Algorithm 1. Scan registration algorithm
5. ROLAM: radar-only localization and mapping

In a robotics context, it is usually assumed that the scan of a range sensor is a collection of depth measurements taken from a single robot position. This can be done when working with lasers, which are much faster than radar sensors and can be considered instantaneous when compared with the dynamics of the vehicle. However, when the robot is moving at high speed, this assumption is most of the time unacceptable: important distortion phenomena appear and cannot be ignored. In the case of a laser rangefinder with a 75 Hz scanning rate, distortion exists but is usually ignored. This assumption is valid for low-speed applications; nevertheless, even when moving straight ahead at a speed of 5 m/s, a 7 cm distortion effect appears. At classical road vehicle speeds (in cities, on roads or motorways) more important distortions can be observed. Of course, the rotation of the vehicle itself during the measurement acquisition is another source of disturbance which cannot be neglected for high-speed displacements or with slow sensors (cf. Fig. 3). Finally, let us note that when the sensor is too slow, a stop & scan method is often applied [35].
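As a back-of-envelope check of these figures, the translational distortion over one scan is simply the distance travelled while the sensor completes a revolution:

```python
def scan_distortion(speed_mps, scan_rate_hz):
    """Distance travelled by the vehicle during one full scan: a rough
    bound on the translational distortion of that scan."""
    return speed_mps / scan_rate_hz

# 75 Hz laser at 5 m/s: about 6.7 cm per scan (the ~7 cm quoted above)
# 1 Hz radar at 30 km/h: about 8.3 m swept during a single acquisition
```

The 1 Hz radar case makes clear why distortion cannot be ignored here: several metres are swept during a single acquisition at typical road speeds.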
The distortion model is described by: A) the center position at time t, B) the center rotation and C) the detection position at time t:

$\begin{pmatrix} x(t) \\ y(t) \end{pmatrix} = \begin{pmatrix} \cos\theta_c(t) & -\sin\theta_c(t) \\ \sin\theta_c(t) & \cos\theta_c(t) \end{pmatrix} \begin{pmatrix} \rho \cos\theta \\ \rho \sin\theta \end{pmatrix} + \begin{pmatrix} x_c(t) \\ y_c(t) \end{pmatrix} \qquad (2)$

$\begin{pmatrix} x_c(t) \\ y_c(t) \end{pmatrix} = \frac{2V}{\omega} \sin\!\left(\frac{\omega t}{2}\right) \begin{pmatrix} \cos(\omega t / 2) \\ \sin(\omega t / 2) \end{pmatrix}, \qquad \theta_c(t) = \omega t. \qquad (3)$
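A minimal sketch of the constant-velocity model of (2) and (3), projecting a raw detection (ρ, θ) into the frame of the scan start; the helper names are hypothetical and the straight-line limit for ω → 0 is an added convenience, not part of the original equations.

```python
import math

def center_pose(V, w, t):
    """Vehicle centre after time t under constant V and w, eq. (3):
    a chord of length (2V/w) sin(wt/2) along heading wt/2."""
    if abs(w) < 1e-9:                       # straight-line limit
        return V * t, 0.0, 0.0
    c = (2.0 * V / w) * math.sin(w * t / 2.0)
    return c * math.cos(w * t / 2.0), c * math.sin(w * t / 2.0), w * t

def detection_world(V, w, t, rho, theta):
    """Eq. (2): rotate the polar detection (rho, theta) by the current
    heading theta_c(t) and translate by the centre position."""
    xc, yc, thc = center_pose(V, w, t)
    x = math.cos(thc) * rho * math.cos(theta) - math.sin(thc) * rho * math.sin(theta) + xc
    y = math.sin(thc) * rho * math.cos(theta) + math.cos(thc) * rho * math.sin(theta) + yc
    return x, y
```

For example, a half-turn (ωt = π) at V = 1 m/s places the centre at a chord of length 2V/ω perpendicular to the initial heading, as the arc geometry predicts.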
Based on this parametric distortion equation, the measurements can be distorted or undistorted. The equation is parameterized with respect to the linear and angular velocities V and ω. With a prior estimate of these parameters, a detection (ρ, θ) made at time t in the sensor frame can be transformed into the world frame based on (3) and (2) (cf. Fig. 5). The objective is to estimate this proprioceptive information, namely the velocities V and ω.
Figure 3. Distorted data from a real sensor: radar data (a) with distortion effect, (b) without any distortion effect

Figure 4. (a) Distortion phenomenon: when the vehicle is moving along the green trajectory, the sensor is scanning. The sensor beams are represented in red and blue during the first and second radar acquisition respectively. The first and the last beam of each acquisition do not measure the same thing. Each scan is distorted by movement. (b) represents the desired acquisition from the second position (or in the case of a stop & scan method) with corrected beam $z'^{j}_{t_i}$ from scan $j$ taken at timestamp $t_i$, and (c) the obtained measurements when moving, with equivalent uncorrected detection $z^{j}_{t_i}$

Figure 5. (a) Data obtained in the sensor frame without considering distortion. (b) Undistorted data based on the distortion formulation
The pose of each measurement is directly linked to the observation pose and to the angle of observation. This pose can be expressed with the constant velocity model of the vehicle and is only a function of the linear and angular speeds of the robot, see (3). Let $M_1$ and $M_2$ be the landmarks representing the same point in the world. $M_1$ and $M_2$ can be projected into the undistorted world frame by using the parameters (i.e., linear velocity V and angular velocity ω) and the distortion functions f and g (cf. Fig. 6). By comparing the different projected poses in each acquisition, the velocity parameters can be extracted. In order to achieve this task, the data association between images 1 and 2 is required. This correspondence is unknown because $g^{-1}$ cannot be calculated directly. Finally, each association can give new values of the velocity parameters.
Because of the Doppler effect, the detection $M_i$ actually measured by the sensor is shifted in range with respect to the real detection $M_d$:

$\mathbf{M}_i = \left( \sqrt{x_{M_d}^2 + y_{M_d}^2} + \frac{2 V \cos(\omega_{sensor} t_d)}{\lambda\, \alpha} \right) \begin{pmatrix} \cos(\omega_{sensor} t_d) \\ \sin(\omega_{sensor} t_d) \end{pmatrix}, \qquad (4)$

where λ is the wavelength of the radar signal, α a coefficient which links frequency and distance, and $\omega_{sensor}$ the rotating rate of the sensor. If no Doppler effect has to be considered, as is the case with laser sensors, simply note that $\mathbf{M}_i = \mathbf{M}_d$.
So, $M_1$ and $M_2$, detected in their respective scans, can be propagated in the world frame by their two respective propagation functions f and g:

$\mathbf{M}_w = f(\mathbf{M}_1, V, \omega) \quad \text{and} \quad \mathbf{M}_w = g(\mathbf{M}_2, V, \omega). \qquad (5)$
For the first radar image, the function f can be expressed as:

$\mathbf{M}_w = \begin{pmatrix} \cos(\theta_0 + \omega t_1) & -\sin(\theta_0 + \omega t_1) \\ \sin(\theta_0 + \omega t_1) & \cos(\theta_0 + \omega t_1) \end{pmatrix} \mathbf{M}_1 + \frac{2V}{\omega} \sin\!\left(\frac{\omega t_1}{2}\right) \begin{pmatrix} \cos(\theta_0 + \omega t_1 / 2) \\ \sin(\theta_0 + \omega t_1 / 2) \end{pmatrix} \qquad (6)$

with

$t_1 = \frac{\arctan(y_1, x_1)}{\omega_{sensor}}.$
Similarly, for the second scan, $\mathbf{M}_w = g(\mathbf{M}_2, V, \omega)$ can easily be deduced with

$t_2 = \frac{\arctan(y_2, x_2) + 2\pi}{\omega_{sensor}}.$

The function arctan is defined on $(-\pi; \pi]$.
$M_2$ has to be predicted from $M_1$ (in the first scan) onto the second scan. A minimization technique is applied in order to calculate the function $\mathbf{M}_2 = h(\mathbf{M}_1, V, \omega)$, because h cannot be calculated directly. The cost function S for one landmark is given by

$S = \left\| \mathbf{M}_{w_2} - \mathbf{M}_{w_1} \right\|^2,$

or: $S = \left\| g(h(\mathbf{M}_1, V, \omega), V, \omega) - f(\mathbf{M}_1, V, \omega) \right\|^2.$ The data association between landmark $M_1$ and landmark $M_2$ is then calculated based on a Mahalanobis distance criterion, taking into account the uncertainties of measurements and predictions.
As radar data are very noisy (cf. Fig. 2), both landmark extraction and data association can be false. For example, the speckle effect can lead to some ghost detections or false disappearances due to the different possible combinations of radar signals. Moreover, due to multiple reflections, radar data are not as accurate as laser data. Thus, all possible data associations have to be considered.
Figure 7. Trajectories obtained by the two methods. In black, the DGPS reference trajectory; in red (dot-dashed curve), the trajectory obtained by the FMT-RSLAM approach; and in blue (star-dashed curve), the ROLAM trajectory
Second, the vehicle equipped with the radar sensor is supposed to be moving at a constant velocity (V and ω) during two consecutive acquisitions. Actually, when the vehicle accelerates or decelerates, the estimated velocity obtained will be the mean speed of the vehicle.
Figure 8. Translational transformation norm and estimation of rotation for both approaches. (a) Norm of the translational displacement at each step, (b) norm of the translational displacement error, (c) value of rotation at each step, (d) error in rotation

Figure 9. (a) Aerial Google view. (b) Radar map obtained by the FMT-RSLAM process. (c) Radar map obtained by the ROLAM process
6. Experimental results

This section provides the experimental results of both radar-based approaches. The radar and the proprioceptive sensors were mounted on a utility car. The experimental run that is presented was conducted in an outdoor field, around the Auvergne Zenith car park (cf. aerial view in Fig. 9(a)), in a semi-structured environment (buildings, trees, roads, road signs, etc.). The K2Pi radar was on top of the vehicle, two metres above the ground. A 2.5 km trajectory was travelled at a mean speed of 30 km/h. In order to evaluate the performance of each approach, a DGPS system was used to provide a reference trajectory. Radar images as well as the data obtained from the other sensors were recorded and post-processed as explained previously.
By analysing these values, we can conclude that both the FMT-RSLAM and ROLAM algorithms allow a good estimate of the vehicle trajectory to be obtained, even if the FMT-RSLAM approach seems to provide more accurate displacement estimates while the ROLAM technique gives better results in rotation. Both methods produce interesting trajectory results which allow a radar map to be built by positioning each radar scan at its corresponding estimated position. A radar-mapping result based on the FMT-RSLAM process is given in Fig. 9(b). As illustrated in Fig. 9(c), a map can also be obtained by the ROLAM algorithm if the acquisition poses of each beam, which have been predicted, are recorded. In this case, each raw radar spectrum is positioned and plotted in order to build a map. In both cases, the maps reveal details of the environment, but a difference appears due to the fact that no filtering is applied in the second approach. Of course, post-processing could have been added in order to remove these different perturbations (speckle effect, Doppler, antenna aperture, etc.) and to improve the appearance of the map. We chose to present the results in this manner in order to highlight the difference between the two approaches: the first one is based on a scan-matching technique, while the second one deals with each radar beam as previously explained.