Sensor Fusion
Fredrik Gustafsson
Content: Course overview. Estimation theory for linear models. Estimation theory for nonlinear models. Sensor networks and detection theory. Nonlinear filter theory. The Kalman filter. Filter banks. Kalman filter approximations for nonlinear models (EKF, UKF). The point-mass filter and the particle filter. Particle filter theory. The marginalized particle filter. Simultaneous localization and mapping (SLAM). Modeling and motion models. Sensors and filter validation.
Exercises: compendium sold by Bokakademin. Software: Signals and Systems Lab for Matlab.
Lecture 1, 2012
The weighted least squares (WLS) method. The maximum likelihood (ML) method.
The Cramér-Rao lower bound (CRLB).
Fusion algorithms.
Slides:
[Figure: positions of the sensor nodes in the field trial; both axes in meters.]
12 sensor nodes, each with a microphone, a geophone and a magnetometer. One moving target. Task: detect, localize and track/predict the target.
GPS gives good position. IMU gives accurate accelerations. Combine these to get even better position, velocity and acceleration.
% Two sensors observing the same target position
p1 = [0;0]; p2 = [2;0];               % sensor positions
x  = [1;1];                           % true target position
X1 = ndist(x, 0.1*[1 -0.8; -0.8 1]);  % Gaussian estimate from sensor 1
X2 = ndist(x, 0.1*[1  0.8;  0.8 1]);  % Gaussian estimate from sensor 2
plot2(X1, X2)                         % confidence ellipses (Signals and Systems Lab)
[Figure: confidence ellipses S1 and S2 of the two sensor estimates in the (x1, x2) plane.]
[Figure: LS and WLS fused estimates, together with the confidence ellipses S1 and S2 of the individual sensor estimates, in the (x1, x2) plane.]
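To make the fusion concrete, here is a minimal Python/NumPy sketch (numbers taken from the Matlab example above; variable names are illustrative) of WLS fusion by adding informations, contrasted with a plain LS average:

```python
import numpy as np

# the two sensor estimates from the Matlab example above
x1 = np.array([1.0, 1.0]); P1 = 0.1 * np.array([[1.0, -0.8], [-0.8, 1.0]])
x2 = np.array([1.0, 1.0]); P2 = 0.1 * np.array([[1.0,  0.8], [ 0.8, 1.0]])

# WLS (information) fusion: informations add, means are information-weighted
I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
P = np.linalg.inv(I1 + I2)          # fused covariance
xhat = P @ (I1 @ x1 + I2 @ x2)      # fused estimate

# LS fusion ignores the covariances and simply averages the estimates
xls = 0.5 * (x1 + x2)
```

Since the two estimates carry complementary correlation structure, the fused covariance is smaller than either individual covariance.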
Information loops (updating with the same sensor reading several times) give rise to optimistic covariances.
Safe fusion
Given two unbiased estimates \hat{x}_1, \hat{x}_2 with informations I_1 = P_1^{-1} and I_2 = P_2^{-1}, respectively (pseudo-inverses if the covariances are singular). Compute the following:
1. SVD: I_1 = U_1 D_1 U_1^T.
2. SVD: D_1^{-1/2} U_1^T I_2 U_1 D_1^{-1/2} = U_2 D_2 U_2^T.
3. Transformation matrix: T = U_2^T D_1^{1/2} U_1^T.
4. State transformation: \bar{x}_1 = T \hat{x}_1 and \bar{x}_2 = T \hat{x}_2. The covariances of these are Cov(\bar{x}_1) = I and Cov(\bar{x}_2) = D_2^{-1}, respectively.
5. For each i = 1, 2, ..., n_x, let
   \bar{x}^i = \bar{x}_1^i, D^{ii} = 1, if D_2^{ii} < 1,
   \bar{x}^i = \bar{x}_2^i, D^{ii} = D_2^{ii}, if D_2^{ii} > 1.
6. Finally, \hat{x} = T^{-1} \bar{x}, P = T^{-1} D^{-1} T^{-T}.
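The steps above can be sketched in Python/NumPy (a minimal sketch assuming nonsingular covariances; the function and example numbers are my own):

```python
import numpy as np

def safe_fusion(x1, P1, x2, P2):
    # sketch of the safe fusion steps; assumes nonsingular covariances
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    U1, d1, _ = np.linalg.svd(I1)            # 1. I1 = U1 diag(d1) U1'
    M = np.diag(d1**-0.5) @ U1.T @ I2 @ U1 @ np.diag(d1**-0.5)
    U2, d2, _ = np.linalg.svd(M)             # 2. M = U2 diag(d2) U2'
    T = U2.T @ np.diag(d1**0.5) @ U1.T       # 3. transformation matrix
    xb1, xb2 = T @ x1, T @ x2                # 4. Cov(xb1) = I, Cov(xb2) = diag(d2)^-1
    D = np.maximum(d2, 1.0)                  # 5. keep the larger information ...
    xb = np.where(d2 > 1.0, xb2, xb1)        #    ... component-wise
    Ti = np.linalg.inv(T)
    return Ti @ xb, Ti @ np.diag(1.0 / D) @ Ti.T   # 6. transform back

# hypothetical example: each estimate is informative in one direction
xf, Pf = safe_fusion(np.zeros(2), np.diag([1.0, 4.0]),
                     np.zeros(2), np.diag([4.0, 1.0]))
```

In this example safe fusion keeps the better variance in each transformed direction, without ever producing an optimistic covariance as an information loop would.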
Transformation steps
[Figure: the transformation steps illustrated on the confidence ellipses of \hat{x}_1 and \hat{x}_2.]
Sequential WLS
The WLS estimate can be computed recursively in the space/time sequence y_k. Suppose \hat{x}_{k-1} with covariance P_{k-1} is the estimate based on the observations y_{1:k-1}. A new observation y_k is fused using

\hat{x}_k = \hat{x}_{k-1} + P_{k-1} H_k^T (H_k P_{k-1} H_k^T + R_k)^{-1} (y_k - H_k \hat{x}_{k-1}),
P_k = P_{k-1} - P_{k-1} H_k^T (H_k P_{k-1} H_k^T + R_k)^{-1} H_k P_{k-1}.
The information fusion formula can be used as an alternative; in fact, the derivation above follows from the information fusion formula by applying the matrix inversion lemma.
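A minimal Python/NumPy sketch of one sequential WLS update, applied twice with a diffuse initialization (hypothetical numbers; names are illustrative):

```python
import numpy as np

def wls_update(xhat, P, y, H, R):
    # one sequential WLS measurement update
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # gain
    return xhat + K @ (y - H @ xhat), P - K @ H @ P

# hypothetical example: two direct measurements of a 2D state
x0, P0 = np.zeros(2), 1e8 * np.eye(2)   # diffuse initialization
H, R = np.eye(2), np.eye(2)
xa, Pa = wls_update(x0, P0, np.array([0.0, 0.0]), H, R)
xb, Pb = wls_update(xa, Pa, np.array([2.0, 4.0]), H, R)
```

With a diffuse P_0 and two direct measurements of equal quality, the recursion ends up close to the batch WLS answer, the average of the two measurements, and the covariance shrinks with every update.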
The WLS loss function can be evaluated in two equivalent ways:

V_{WLS}(\hat{x}_N) = \sum_{k=1}^{N} (y_k - H_k \hat{x}_N)^T R_k^{-1} (y_k - H_k \hat{x}_N)
= \sum_{k=1}^{N} (y_k - H_k \hat{x}_{k-1})^T (H_k P_{k-1} H_k^T + R_k)^{-1} (y_k - H_k \hat{x}_{k-1}) - (\hat{x}_0 - \hat{x}_N)^T P_0^{-1} (\hat{x}_0 - \hat{x}_N).
The second expression should be used in decentralized sensor network implementations and in on-line algorithms. The last correction term, which de-fuses the influence of the initial values, is needed only when such an initialization is used.
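The equivalence of the two expressions can be checked numerically. A Python/NumPy sketch with hypothetical scalar measurements of a 2D state: run sequential WLS, accumulate the innovation-based sum, subtract the initialization correction, and compare with the batch loss at the final estimate:

```python
import numpy as np

# hypothetical setup: scalar measurements of a 2D state
H1, H2 = np.array([[1.0, 0.0]]), np.array([[0.0, 1.0]])
R = np.array([[0.5]])
obs = [(np.array([0.7]), H1), (np.array([-0.2]), H2), (np.array([0.4]), H1)]
x0, P0 = np.zeros(2), 10.0 * np.eye(2)

# sequential WLS, accumulating the innovation-based (second) expression
xhat, P, V2 = x0.copy(), P0.copy(), 0.0
for y, H in obs:
    e = y - H @ xhat                       # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    V2 += e @ np.linalg.solve(S, e)
    K = P @ H.T @ np.linalg.inv(S)
    xhat, P = xhat + K @ e, P - K @ H @ P
V2 -= (x0 - xhat) @ np.linalg.solve(P0, x0 - xhat)  # de-fuse the initialization

# the first (batch) expression, evaluated at the final estimate
V1 = sum((y - H @ xhat) @ np.linalg.solve(R, y - H @ xhat) for y, H in obs)
```

The two values agree to floating-point precision, which is what makes the innovation form usable on-line: it never needs the final estimate inside the sum.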
The likelihood of the observations factorizes as

p(y_{1:N}) = p(y_1) \prod_{k=2}^{N} p(y_k | y_{1:k-1}).

For the linear Gaussian model, the factors are the innovation densities from sequential WLS,

p(y_{1:N}) = \prod_{k=1}^{N} \mathcal{N}(y_k; H_k \hat{x}_{k-1}, H_k P_{k-1} H_k^T + R_k),

and the likelihood can equivalently be evaluated at the final estimate,

p(y_{1:N}) = \det(2\pi P_N)^{1/2} \mathcal{N}(\hat{x}_N; \hat{x}_0, P_0) \prod_{k=1}^{N} \mathcal{N}(y_k; H_k \hat{x}_N, R_k).
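The recursive and batch evaluations of the likelihood can be compared numerically; a Python/NumPy sketch with a hypothetical scalar-measurement model (all numbers illustrative):

```python
import numpy as np

def gauss_pdf(e, S):
    # multivariate normal density N(e; 0, S)
    n = len(e)
    return np.exp(-0.5 * e @ np.linalg.solve(S, e)) / np.sqrt((2*np.pi)**n * np.linalg.det(S))

# hypothetical linear Gaussian model: y_k = H x + e_k, prior x ~ N(x0, P0)
H = np.array([[1.0, 0.0]])
R = np.array([[0.5]])
x0, P0 = np.zeros(2), np.eye(2)
ys = [np.array([0.3]), np.array([-0.1]), np.array([0.8])]

# recursive evaluation: product of innovation densities
xhat, P, lik_rec = x0.copy(), P0.copy(), 1.0
for y in ys:
    S = H @ P @ H.T + R
    lik_rec *= gauss_pdf(y - H @ xhat, S)
    K = P @ H.T @ np.linalg.inv(S)
    xhat, P = xhat + K @ (y - H @ xhat), P - K @ H @ P

# batch evaluation at the final estimate
lik_batch = np.sqrt(np.linalg.det(2*np.pi*P)) * gauss_pdf(xhat - x0, P0)
for y in ys:
    lik_batch *= gauss_pdf(y - H @ xhat, R)
```

Both evaluations yield the same number; the recursive form is the one used in on-line ML estimation, while the batch form only needs the final estimate and its covariance.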