Abstract - This paper presents a lane detection and linear-parabolic lane tracking system based on the Kalman filtering method. First, the horizon is detected in a traffic scene to split the image into sky and road regions. The road region is further analyzed with an entropy method to remove road pixels, and lane boundaries are then extracted from the remaining region by lane-marking detection. The detected boundaries are tracked across consecutive video frames with a linear-parabolic tracking model whose parameters are updated by the Kalman filtering method. Error checking is performed iteratively to ensure the accuracy of the lane estimation model. Simulation results demonstrate the good performance of the proposed Kalman-based linear-parabolic lane tracking system with accurate parameter updates.
Keywords - Lane Detection, Lane Tracking, Intelligent Vehicle, and Driver Assistance System

I. INTRODUCTION
II. LANE DETECTION
The horizon line is first localized from the vertical mean distribution of the image rows, splitting the frame into sky and road regions. Within the road region, the gradient components D_x and D_y are computed at each pixel. The lane markings map is denoted as

    L_map(x, y) = |D_x| + |D_y|,

and the edge orientation is

    theta(x, y) = tan^-1(D_y / D_x).

Figure 2. (a) Vertical mean distribution, (b) Road region image Rroi(x,y).

Figure 4. The edge distribution function.
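As an illustration of the gradient computation above, the following minimal NumPy sketch assumes simple central-difference kernels (the paper does not specify the gradient operator) and uses arctan2 so the orientation is defined in all quadrants:

```python
import numpy as np

def lane_marking_map(gray):
    """Compute L_map = |Dx| + |Dy| and the edge orientation
    theta = atan2(Dy, Dx) for a grayscale road-region image."""
    g = gray.astype(float)
    dx = np.zeros_like(g)
    dy = np.zeros_like(g)
    # Central-difference approximations of the horizontal/vertical gradients.
    dx[:, 1:-1] = (g[:, 2:] - g[:, :-2]) / 2.0
    dy[1:-1, :] = (g[2:, :] - g[:-2, :]) / 2.0
    l_map = np.abs(dx) + np.abs(dy)
    theta = np.arctan2(dy, dx)
    return l_map, theta
```

Thresholding l_map and filtering theta against the expected lane orientations then yields the binary lane map of Fig. 3(a).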
III. LANE TRACKING
Figure 3. (a) Lane binary map Rbin(x,y), (b) Lane markings map Lmap(x,y).
B. Linear-Parabolic Model

The lane estimation model plays an important role in the lane tracking process. In this stage, the simple linear-parabolic tracking model proposed in [9, 10] is used to construct the tracked left and right lines. As shown in Fig. 5(a), the image is divided into a near-field and a far-field, where ym is the border between them. A linear model f(y) = a + by is applied for y > ym to follow the straight lane segment in the near-field, while a parabolic model f(y) = c + dy + ey^2 is applied for y <= ym in the far-field. Requiring the two parts to match in value and slope at ym gives

    c = (2a + ym(b - d)) / 2   and   e = (b - d) / (2 ym),

so that each far-field edge point (x_fj, y_fj) satisfies

    x_fj = a + (b / (2 ym)) (y_fj^2 + ym^2) - (d / (2 ym)) (y_fj - ym)^2,   j = 1 ... n.   (1)

Figure 5. (a) Detected lane boundaries, (b) Possible edges.

C. Kalman Filtering

The Kalman filtering method [13] is used to update the unknown state parameters of the left and right edges. The state vector collects the model parameters of both boundaries,

    A = [(a, b, d)_L, (a, b, d)_R]^T,

and the linear state and measurement equations are defined as

    A_{k+1} = F_{k+1,k} A_k + G_{k+1,k} u_k + w_k,   (2)

    x_k = H_k A_k + v_k,   (3)

where w_k and v_k denote the process and measurement noise, respectively. The measurement matrix H = [h_L^T, h_R^T]^T stacks, for each boundary, one row per near-field edge point and one row per far-field edge point:

    h = [ 1    y_n1                        0
          ...
          1    y_nm                        0
          1    (y_f1^2 + ym^2) / (2 ym)    -(y_f1 - ym)^2 / (2 ym)
          ...
          1    (y_fn^2 + ym^2) / (2 ym)    -(y_fn - ym)^2 / (2 ym) ].

The prediction step propagates the state estimate and its error covariance:

    A_k^- = F_{k,k-1} A_{k-1} + G_{k,k-1} u_{k-1},
    P_k^- = F_{k,k-1} P_{k-1} F_{k,k-1}^T + Q_k.   (4)

The correction step then updates the estimate with the measured edge positions x_k:

    K_k = P_k^- H_k^T (H_k P_k^- H_k^T + R_k)^{-1},
    A_k = A_k^- + K_k (x_k - H_k A_k^-),
    P_k = (I - K_k H_k) P_k^-.   (5)

IV. SIMULATION RESULTS
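To make the tracking update concrete, the following NumPy sketch implements Eqs. (1)-(5) for a single lane boundary with state (a, b, d). The transition matrix F, the noise covariances Q and R, and the zero control input u_k are illustrative assumptions; the paper does not specify them.

```python
import numpy as np

def h_rows(y_near, y_far, ym):
    """Stack one measurement row per edge point for a single lane boundary.
    Near-field rows encode x = a + b*y; far-field rows encode Eq. (1):
    x = a + b*(y^2 + ym^2)/(2*ym) - d*(y - ym)^2/(2*ym)."""
    rows = [[1.0, y, 0.0] for y in y_near]
    rows += [[1.0, (y**2 + ym**2) / (2.0 * ym), -((y - ym)**2) / (2.0 * ym)]
             for y in y_far]
    return np.array(rows)

def model_x(params, y, ym):
    """Evaluate the linear-parabolic lane model x = f(y) for (a, b, d)."""
    a, b, d = params
    if y > ym:                          # near-field: linear part
        return a + b * y
    c = (2.0 * a + ym * (b - d)) / 2.0  # continuity at ym
    e = (b - d) / (2.0 * ym)            # matching slopes at ym
    return c + d * y + e * y**2         # far-field: parabolic part

def kalman_step(A, P, F, Q, H, R, x_meas):
    """One predict/correct cycle of Eqs. (2)-(5), assuming u_k = 0."""
    # Prediction: propagate state estimate and error covariance.
    A_pred = F @ A
    P_pred = F @ P @ F.T + Q
    # Correction: K = P- H^T (H P- H^T + R)^-1, via a linear solve.
    S = H @ P_pred @ H.T + R
    K = np.linalg.solve(S, H @ P_pred).T
    A_new = A_pred + K @ (x_meas - H @ A_pred)
    P_new = (np.eye(len(A)) - K @ H) @ P_pred
    return A_new, P_new
```

Stacking the left- and right-boundary h matrices as H = [h_L^T, h_R^T]^T extends this directly to the full six-parameter state A.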
V. CONCLUSION
A vision-based lane detection and Kalman-based linear-parabolic tracking system has been presented in this paper. Each frame first undergoes horizon localization and lane region analysis, followed by lane-marking extraction to obtain the left and right lane boundaries. The detected boundaries are used to fit a linear-parabolic model, which is continuously predicted and corrected with the Kalman filtering method, and candidate edges are searched for in the vicinity of the estimated lane boundaries. No camera parameters are required by the proposed system. In addition, the lane model estimation further improves lane detection performance while allowing a faster processing speed.
REFERENCES
Figure 8(a)-(g). Lane tracking results for video sample #1.
Figure 9(a)-(g). Lane tracking results for video sample #2.
Figure 10(a)-(g). Lane tracking results using method [9] for video sample #2. Red circles denote the lost tracked model estimation.