
Quantitative Techniques

Topic 9
Autocorrelation

Reading: GJ (Ch. 12);

Summary

The Nature of Autocorrelation
OLS Estimation in the Presence of Autocorrelation
Consequences of Autocorrelation
Detection of Autocorrelation
Remedial Measures

The Nature of Autocorrelation
Autocorrelation is a systematic pattern in the
errors, which can be either attracting (positive)
or repelling (negative). Autocorrelation refers
to the correlation of a series with its own past
values, lagged by a number of time units. The
terms autocorrelation and serial correlation are
used synonymously here.
[Figure: residuals u_t plotted against time t for three cases.
Positive autocorrelation: u_t crosses the zero line too seldom (attracting).
No autocorrelation: u_t crosses the zero line randomly.
Negative autocorrelation: u_t crosses the zero line too often (repelling).]
Positive vs. Negative
Autocorrelation

Regression Model
Y_t = β1 + β2·X_t + u_t

zero mean: E(u_t) = 0
homoskedasticity: var(u_t) = σ²
nonautocorrelation: cov(u_t, u_s) = 0 for t ≠ s
autocorrelation: cov(u_t, u_s) ≠ 0 for t ≠ s

The Nature of the Problem

1. Inertia
2. Specification bias: (i) excluded-variables bias, (ii) incorrect functional form
3. Lags
4. Cobweb phenomenon
5. Data manipulation
6. Data transformation
Order of Autocorrelation
Y_t = β1 + β2·X_t + u_t

1st order: u_t = ρ·u_{t−1} + ε_t
2nd order: u_t = ρ1·u_{t−1} + ρ2·u_{t−2} + ε_t
3rd order: u_t = ρ1·u_{t−1} + ρ2·u_{t−2} + ρ3·u_{t−3} + ε_t

We will assume first-order autocorrelation:
AR(1): u_t = ρ·u_{t−1} + ε_t
First Order Autocorrelation
Y_t = β1 + β2·X_t + u_t

u_t = ρ·u_{t−1} + ε_t where −1 < ρ < 1

E(ε_t) = 0,  var(ε_t) = σ_ε²,  cov(ε_t, ε_s) = 0 for t ≠ s

These assumptions about ε_t imply the following about u_t:
E(u_t) = 0
var(u_t) = σ_u² = σ_ε² / (1 − ρ²)
corr(u_t, u_{t+k}) = ρ^k for k > 0
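The AR(1) error process above is easy to simulate; the sketch below (Python with NumPy, not part of the original slides) checks the implied moments var(u_t) = σ_ε²/(1 − ρ²) and corr(u_t, u_{t+k}) = ρ^k numerically.

```python
# Minimal sketch (NumPy assumed): simulate AR(1) errors and check
# that the sample moments match the theoretical ones.
import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.7, 200_000          # |rho| < 1 keeps the process stationary
sigma_eps = 1.0

u = np.zeros(n)
eps = rng.normal(0.0, sigma_eps, n)
for t in range(1, n):
    u[t] = rho * u[t - 1] + eps[t]   # AR(1): u_t = rho*u_{t-1} + eps_t

# Theory: var(u_t) = sigma_eps^2 / (1 - rho^2), corr(u_t, u_{t+k}) = rho^k
print(u.var(), sigma_eps**2 / (1 - rho**2))
for k in (1, 2, 3):
    r_k = np.corrcoef(u[:-k], u[k:])[0, 1]
    print(k, r_k, rho**k)
```

With a large sample the empirical lag-k autocorrelations line up closely with ρ^k, which is the geometric decay AR(1) implies.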
OLS estimation in the presence of
autocorrelation
1. The least squares estimator is still linear
and unbiased, but it is no longer efficient.

2. The formulas normally used to compute
the least squares standard errors are no
longer correct, so confidence intervals and
hypothesis tests based on them will be wrong.
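A small Monte Carlo illustrates point 2. This is a hedged sketch under assumptions of my own (a trending regressor and ρ = 0.8, not values from the slides): the usual OLS standard-error formula understates the true sampling variability of the slope when the errors are positively autocorrelated.

```python
# Sketch: compare the true sampling SD of the OLS slope with the
# standard error the conventional formula reports, when errors are AR(1).
import numpy as np

rng = np.random.default_rng(1)
rho, n, reps = 0.8, 100, 2000
x = np.arange(n, dtype=float)           # trending regressor
X = np.column_stack([np.ones(n), x])

slopes, reported_se = [], []
for _ in range(reps):
    eps = rng.normal(size=n)
    u = np.zeros(n)
    for t in range(1, n):
        u[t] = rho * u[t - 1] + eps[t]  # AR(1) errors
    y = 1.0 + 2.0 * x + u
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    s2 = e @ e / (n - 2)                # conventional sigma^2 estimate
    cov = s2 * np.linalg.inv(X.T @ X)   # conventional OLS covariance
    slopes.append(b[1])
    reported_se.append(np.sqrt(cov[1, 1]))

print(np.std(slopes))          # true sampling SD of the slope estimate
print(np.mean(reported_se))    # what the usual formula reports
```

The slope estimates center on the true value 2 (unbiasedness survives), but the reported standard errors are markedly smaller than the actual spread of the estimates, so t-tests would reject too often.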
Detecting Autocorrelation

1. The Graphical Method

Durbin-Watson Test
H0: ρ = 0 vs. H1: ρ ≠ 0 (or ρ > 0, or ρ < 0)

The Durbin-Watson test statistic, d, is:

d = Σ_{t=2}^{n} (û_t − û_{t−1})² / Σ_{t=1}^{n} û_t²

Testing for Autocorrelation
The test statistic d is approximately related to ρ̂ as:

d ≈ 2(1 − ρ̂)

When ρ̂ = 0, the Durbin-Watson statistic is d ≈ 2.
When ρ̂ = 1, the Durbin-Watson statistic is d ≈ 0.
When ρ̂ = −1, the Durbin-Watson statistic is d ≈ 4.

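The statistic and the approximation can be checked numerically; the helper below is a sketch of my own, not code from the course.

```python
# Sketch: compute the Durbin-Watson statistic d from residuals and
# check the approximation d ≈ 2(1 - rho_hat).
import numpy as np

def durbin_watson(resid):
    """d = sum of squared successive differences over sum of squares."""
    resid = np.asarray(resid, dtype=float)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

rng = np.random.default_rng(2)
rho, n = 0.6, 5000
eps = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = rho * u[t - 1] + eps[t]   # AR(1) "residuals" for illustration

d = durbin_watson(u)
rho_hat = u[1:] @ u[:-1] / (u[:-1] @ u[:-1])
print(d, 2 * (1 - rho_hat))   # d should be near 2(1 - 0.6) = 0.8
```

Expanding the numerator of d shows why: apart from two end terms, d = 2(Σû_t² − Σû_tû_{t−1})/Σû_t² = 2(1 − ρ̂).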
Durbin-Watson Decision Rule

Assumptions underlying the Durbin-Watson Test

Detecting Autocorrelation: Runs Test

Detecting Autocorrelation: Breusch-Godfrey Test

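The Breusch-Godfrey test is not spelled out on the slide; the following is a minimal NumPy sketch of the usual LM version (regress the OLS residuals on the regressors plus p lagged residuals, then LM = n·R², asymptotically χ²(p) under H0). The function name is mine.

```python
# Hedged sketch of a Breusch-Godfrey LM test for AR(p) errors.
import numpy as np

def breusch_godfrey_lm(y, X, p=1):
    """LM = n * R^2 from regressing the OLS residuals on X and p lags
    of the residuals (missing initial lags padded with zeros)."""
    n = len(y)
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    lags = np.column_stack([np.r_[np.zeros(k), e[:-k]] for k in range(1, p + 1)])
    Z = np.column_stack([X, lags])           # auxiliary regressors
    g, *_ = np.linalg.lstsq(Z, e, rcond=None)
    r2 = 1 - np.sum((e - Z @ g) ** 2) / np.sum((e - e.mean()) ** 2)
    return n * r2   # asymptotically chi-square with p d.o.f. under H0

# Simulated example with genuinely AR(1) errors (rho = 0.5):
rng = np.random.default_rng(3)
n = 500
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
eps = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.5 * u[t - 1] + eps[t]
y = 1.0 + 2.0 * x + u

lm = breusch_godfrey_lm(y, X, p=1)
print(lm)   # compare with the chi-square(1) 5% critical value, 3.84
```

Unlike Durbin-Watson, this test remains valid with lagged dependent variables among the regressors and handles higher-order autocorrelation directly through p.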
Remedial Measures

Generalized Least Squares

Newey-West standard errors

Generalized Least Squares
AR(1): u_t = ρ·u_{t−1} + ε_t. Substitute this in for u_t:

Y_t = β1 + β2·X_t + u_t
Y_t = β1 + β2·X_t + ρ·u_{t−1} + ε_t

Now we need to get rid of u_{t−1}. From the original model,

u_t = Y_t − β1 − β2·X_t

Lag the errors once:

u_{t−1} = Y_{t−1} − β1 − β2·X_{t−1}

Substituting gives

Y_t = β1 + β2·X_t + ρ(Y_{t−1} − β1 − β2·X_{t−1}) + ε_t

Collecting terms:

Y_t − ρ·Y_{t−1} = β1(1 − ρ) + β2(X_t − ρ·X_{t−1}) + ε_t

Y*_t = β1* + β2·X*_t + ε_t

where Y*_t = Y_t − ρ·Y_{t−1}, X*_t = X_t − ρ·X_{t−1}, β1* = β1(1 − ρ)


Problems estimating this model with least squares:

1. One observation is used up in creating the
transformed (lagged) variables, leaving only
(n − 1) observations for estimating the model.

2. The value of ρ is not known. We must find
some way to estimate it.
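Assuming ρ is known, the quasi-differencing transform can be sketched as follows (simulated data; the variable names are mine). Note the estimated intercept must be divided by (1 − ρ) to recover β1, since the transformed model estimates β1* = β1(1 − ρ).

```python
# Sketch of the GLS (quasi-differencing) transform with rho known:
# regress Y*_t = Y_t - rho*Y_{t-1} on X*_t = X_t - rho*X_{t-1}.
import numpy as np

rng = np.random.default_rng(4)
rho, n = 0.7, 10_000
beta1, beta2 = 1.0, 2.0
x = rng.normal(size=n)
eps = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = rho * u[t - 1] + eps[t]
y = beta1 + beta2 * x + u

# Drop the first observation; quasi-difference both sides.
y_star = y[1:] - rho * y[:-1]
x_star = x[1:] - rho * x[:-1]
Xs = np.column_stack([np.ones(n - 1), x_star])
b_star, *_ = np.linalg.lstsq(Xs, y_star, rcond=None)

beta1_hat = b_star[0] / (1 - rho)   # undo beta1* = beta1*(1 - rho)
beta2_hat = b_star[1]
print(beta1_hat, beta2_hat)         # should be close to (1.0, 2.0)
```

The transformed errors ε_t are white noise, so OLS on the starred variables is efficient; the only cost is the lost first observation.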
Estimating the Unknown ρ
If we had values for the u_t's, we could estimate:

u_t = ρ·u_{t−1} + ε_t

First, use least squares to estimate the model:

Y_t = β1 + β2·X_t + u_t

The residuals from this estimation are:

û_t = Y_t − b1 − b2·X_t
Next, estimate the following by least squares:

û_t = ρ·û_{t−1} + ε̂_t

The least squares solution is:

ρ̂ = Σ_{t=2}^{n} û_t·û_{t−1} / Σ_{t=2}^{n} û_{t−1}²
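This estimator of ρ is just the no-intercept regression of û_t on û_{t−1}; a quick numerical sketch (helper name is mine):

```python
# Sketch: estimate rho by regressing the OLS residuals on their own
# first lag, exactly the formula on the slide.
import numpy as np

def estimate_rho(resid):
    """rho_hat = sum(u_t * u_{t-1}) / sum(u_{t-1}^2), t = 2..n."""
    resid = np.asarray(resid, dtype=float)
    return resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1])

rng = np.random.default_rng(5)
rho, n = 0.6, 20_000
x = rng.normal(size=n)
eps = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = rho * u[t - 1] + eps[t]
y = 1.0 + 2.0 * x + u

# First-stage OLS, then apply the formula to the residuals.
X = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ b
print(estimate_rho(e))   # should be close to the true rho, 0.6
```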
How to Estimate ρ

1. The first-difference method
2. The Durbin-Watson d statistic
3. The Cochrane-Orcutt method
4. The Hildreth-Lu method
5. The maximum likelihood method
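The Cochrane-Orcutt method alternates between estimating ρ from the residuals and re-running OLS on the quasi-differenced data until ρ̂ settles down; a hedged sketch (the function and tolerances are my own choices):

```python
# Sketch of the Cochrane-Orcutt iteration for a simple regression.
import numpy as np

def cochrane_orcutt(y, x, tol=1e-6, max_iter=50):
    """Iterate: residuals -> rho_hat -> quasi-differenced OLS -> repeat."""
    n = len(y)
    X = np.column_stack([np.ones(n), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)   # initial OLS fit
    rho = 0.0
    for _ in range(max_iter):
        e = y - X @ b
        rho_new = e[1:] @ e[:-1] / (e[:-1] @ e[:-1])
        ys = y[1:] - rho_new * y[:-1]            # quasi-difference
        xs = x[1:] - rho_new * x[:-1]
        Xs = np.column_stack([np.ones(n - 1), xs])
        bs, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        b = np.array([bs[0] / (1 - rho_new), bs[1]])  # recover beta1
        if abs(rho_new - rho) < tol:
            break
        rho = rho_new
    return b, rho

rng = np.random.default_rng(6)
n = 5000
x = rng.normal(size=n)
eps = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.7 * u[t - 1] + eps[t]
y = 1.0 + 2.0 * x + u

b, rho = cochrane_orcutt(y, x)
print(b, rho)   # estimates should be near (1.0, 2.0) and 0.7
```

Hildreth-Lu differs only in replacing the iteration with a grid search over ρ ∈ (−1, 1), picking the value that minimizes the transformed sum of squared residuals.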
