
Univariate Stationary Time Series Models

Applied Financial Econometrics


by

Sunil Paul
Madras School of Economics

05-08-2016


Recap: AR(1)

$Y_t = \phi Y_{t-1} + \varepsilon_t, \qquad \varepsilon_t \sim WN(0, \sigma^2)$

- The process is stable and stationary only if $|\phi| < 1$, i.e. the root of $(1 - \phi L) = 0$ has modulus greater than one.

- AR(1) to MA($\infty$):
  $(1 - \phi L) Y_t = \varepsilon_t \;\Rightarrow\; Y_t = (1 - \phi L)^{-1}\varepsilon_t = (1 + \phi L + \phi^2 L^2 + \dots)\varepsilon_t$

- Impulse response function ($\psi_j = \phi^j$):

  $\varepsilon$:  0   1   0        0         ...
  $y$:            0   1   $\phi$   $\phi^2$  ...

- Autocorrelations:

  $j$:        0   1        2         ...
  $\rho_j$:   1   $\phi$   $\phi^2$  ...
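As a quick illustration (not part of the original slides), a minimal numpy sketch that simulates an AR(1) with an assumed $\phi = 0.7$ and checks that the sample autocorrelations decay roughly like $\phi^j$:

```python
import numpy as np

rng = np.random.default_rng(0)
phi, sigma, n = 0.7, 1.0, 5000        # assumed example values

# simulate Y_t = phi * Y_{t-1} + eps_t
eps = rng.normal(0.0, sigma, n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]

# sample autocorrelations rho_j for j = 0..5
y0 = y - y.mean()
acf = np.array([np.sum(y0[j:] * y0[:n - j]) for j in range(6)]) / np.sum(y0 * y0)
print(np.round(acf, 3))               # should be close to phi**j
print(np.round(phi ** np.arange(6), 3))
```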


Recap: AR(2)

$Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \varepsilon_t, \qquad \varepsilon_t \sim WN(0, \sigma^2)$

- The process is stable and stationary only if the roots ($z_1$ and $z_2$) of $(1 - \phi_1 L - \phi_2 L^2) = 0$ lie outside the unit circle ($|z_i| > 1$); a numerical check follows below.

- Factoring $(1 - \phi_1 L - \phi_2 L^2) = (1 - \lambda_1 L)(1 - \lambda_2 L)$, we have $\phi_1 = \lambda_1 + \lambda_2$ and $\phi_2 = -\lambda_1 \lambda_2$.

- The stability condition requires $\phi_1 + \phi_2 < 1$, $\phi_2 - \phi_1 < 1$ and $|\phi_2| < 1$ (please verify this yourself).
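A small sketch of the root check, using the assumed coefficients $\phi_1 = 0.6$, $\phi_2 = 0.2$ (the example used later in these notes):

```python
import numpy as np

phi1, phi2 = 0.6, 0.2   # assumed example coefficients

# roots of the AR characteristic polynomial 1 - phi1*z - phi2*z^2 = 0
roots = np.roots([-phi2, -phi1, 1.0])      # coefficients in decreasing powers of z
print(roots, np.all(np.abs(roots) > 1))    # stationary if all roots lie outside the unit circle

# equivalent inequality check
print(phi1 + phi2 < 1 and phi2 - phi1 < 1 and abs(phi2) < 1)
```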


Recap: AR(2) to MA($\infty$)

$(1 - \lambda_1 L)(1 - \lambda_2 L) Y_t = \varepsilon_t \;\Rightarrow\; Y_t = (1 - \lambda_1 L)^{-1}(1 - \lambda_2 L)^{-1}\varepsilon_t = (1 + \psi_1 L + \psi_2 L^2 + \dots)\varepsilon_t$

- Impulse response function ($\psi_j = c_1 \lambda_1^j + c_2 \lambda_2^j$):

  $\varepsilon$:  0   1   0          0          ...
  $y$:            0   1   $\psi_1$   $\psi_2$   ...


Recap: Auto-covariance and autocorrelations of AR(2)

$\gamma_j = \begin{cases} \phi_1 \gamma_1 + \phi_2 \gamma_2 + \sigma^2 & \text{for } j = 0 \\ \phi_1 \gamma_{j-1} + \phi_2 \gamma_{j-2} & \text{for } j > 0 \end{cases}$

$\rho_j = \begin{cases} 1 & \text{for } j = 0 \\ \phi_1 \rho_{j-1} + \phi_2 \rho_{j-2} & \text{for } j > 0 \end{cases}$
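The same recursion in code, assuming $\phi_1 = 0.6$ and $\phi_2 = 0.2$; the starting value $\rho_1 = \phi_1/(1 - \phi_2)$ follows from the Yule-Walker equation at $j = 1$:

```python
import numpy as np

phi1, phi2 = 0.6, 0.2              # assumed example coefficients

# Yule-Walker at j = 1: rho_1 = phi1 + phi2*rho_1  =>  rho_1 = phi1 / (1 - phi2)
rho = np.zeros(10)
rho[0] = 1.0
rho[1] = phi1 / (1.0 - phi2)
for j in range(2, 10):
    rho[j] = phi1 * rho[j - 1] + phi2 * rho[j - 2]
print(np.round(rho, 4))
```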


AR(p) process

$Y_t = c + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \dots + \phi_p Y_{t-p} + \varepsilon_t$, or

$(1 - \phi_1 L - \phi_2 L^2 - \dots - \phi_p L^p) Y_t = c + \varepsilon_t$

- The AR lag polynomial of this process (substituting $L$ with $z$) is given by $\phi(z) = 1 - \phi_1 z - \phi_2 z^2 - \dots - \phi_p z^p$.

- The AR(p) process is stable and stationary only if the $p$ roots of the AR characteristic equation $\phi(z) = 0$ have modulus greater than one.


Moments of Stationary AR(p)

- Mean: $\mu = c/(1 - \phi_1 - \phi_2 - \dots - \phi_p)$

- Using the value of $c$ from the above equation, we can write the process in deviation form:

  $(Y_t - \mu) = \phi_1 (Y_{t-1} - \mu) + \phi_2 (Y_{t-2} - \mu) + \dots + \phi_p (Y_{t-p} - \mu) + \varepsilon_t$

- Multiplying both sides by $(Y_{t-j} - \mu)$ and taking expectations we get:

  $\gamma_j = \begin{cases} \phi_1 \gamma_1 + \phi_2 \gamma_2 + \dots + \phi_p \gamma_p + \sigma^2 & \text{for } j = 0 \\ \phi_1 \gamma_{j-1} + \phi_2 \gamma_{j-2} + \dots + \phi_p \gamma_{j-p} & \text{for } j > 0 \end{cases}$

  $\rho_j = \begin{cases} 1 & \text{for } j = 0 \\ \phi_1 \rho_{j-1} + \phi_2 \rho_{j-2} + \dots + \phi_p \rho_{j-p} & \text{for } j > 0 \end{cases}$

- Solving these $p$ Yule-Walker equations we can get the expressions for $\rho_j$ for $j = 1, 2, \dots, p$ (a numerical sketch follows below).
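A numerical sketch of solving the Yule-Walker system; `yule_walker_rho` is an illustrative helper name, not a library function:

```python
import numpy as np

def yule_walker_rho(phis):
    """Solve the p Yule-Walker equations rho_j = sum_k phi_k * rho_{|j-k|} for rho_1..rho_p."""
    p = len(phis)
    A = np.zeros((p, p))
    b = np.zeros(p)
    for j in range(1, p + 1):
        A[j - 1, j - 1] += 1.0
        for k in range(1, p + 1):
            m = abs(j - k)
            if m == 0:
                b[j - 1] += phis[k - 1]          # phi_k * rho_0, with rho_0 = 1
            else:
                A[j - 1, m - 1] -= phis[k - 1]   # phi_k * rho_m moved to the left-hand side
    return np.linalg.solve(A, b)

print(np.round(yule_walker_rho([0.6, 0.2]), 4))   # AR(2) example: rho_1 = 0.75, rho_2 = 0.65
```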

Wold Representation of Stationary AR(p) process

$\phi(L) Y_t = c + \varepsilon_t$, where $\phi(L) = (1 - \phi_1 L - \phi_2 L^2 - \dots - \phi_p L^p)$.

- Operating on both sides of this equation with $\psi(L) = \phi(L)^{-1}$ we get

  $Y_t = \mu + \psi(L)\varepsilon_t$

  where

  $\psi(L) = (1 - \lambda_1 L)^{-1}(1 - \lambda_2 L)^{-1}\dots(1 - \lambda_p L)^{-1} = \Big(\textstyle\sum_{j=0}^{\infty}\lambda_1^j L^j\Big)\Big(\sum_{j=0}^{\infty}\lambda_2^j L^j\Big)\dots\Big(\sum_{j=0}^{\infty}\lambda_p^j L^j\Big)$


AR(p) Process in vector form

Given $Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \dots + \phi_p Y_{t-p} + \varepsilon_t$, stack the last $p$ values of $Y$:

$\underbrace{\begin{bmatrix} Y_t \\ Y_{t-1} \\ Y_{t-2} \\ \vdots \\ Y_{t-p+1} \end{bmatrix}}_{Z_t \;(p \times 1)} = \underbrace{\begin{bmatrix} \phi_1 & \phi_2 & \dots & \phi_{p-1} & \phi_p \\ 1 & 0 & \dots & 0 & 0 \\ 0 & 1 & \dots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \dots & 1 & 0 \end{bmatrix}}_{F \;(p \times p)} \underbrace{\begin{bmatrix} Y_{t-1} \\ Y_{t-2} \\ Y_{t-3} \\ \vdots \\ Y_{t-p} \end{bmatrix}}_{Z_{t-1} \;(p \times 1)} + \underbrace{\begin{bmatrix} \varepsilon_t \\ 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}}_{v_t \;(p \times 1)}$

i.e. $Z_t = F Z_{t-1} + v_t$, with $F$ of dimension $p \times p$ and $Z_t$, $Z_{t-1}$, $v_t$ of dimension $p \times 1$.
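A sketch that builds the companion matrix $F$ for assumed coefficients and reads off $\psi_j$ as the (1,1) element of $F^j$ (the AR(3) example used later in these notes):

```python
import numpy as np

def companion(phis):
    """Companion matrix F of an AR(p) process with coefficients phi_1..phi_p."""
    p = len(phis)
    F = np.zeros((p, p))
    F[0, :] = phis
    F[1:, :-1] = np.eye(p - 1)
    return F

phis = [0.8, 0.6, -0.5]            # assumed AR(3) example
F = companion(phis)

# psi_j = (1,1) element of F^j
print([round(np.linalg.matrix_power(F, j)[0, 0], 2) for j in range(4)])  # [1.0, 0.8, 1.24, 0.97]
```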


Wold Representation

$Z_t = F Z_{t-1} + v_t$ can be solved by backward substitution as

$Z_t = F^{t+1} Z_{-1} + F^t v_0 + F^{t-1} v_1 + \dots + F v_{t-1} + v_t$

The first equation of this system is

$Y_t = f_{11}^{(t+1)} Y_{-1} + f_{12}^{(t+1)} Y_{-2} + \dots + f_{1p}^{(t+1)} Y_{-p} + f_{11}^{(t)} \varepsilon_0 + f_{11}^{(t-1)} \varepsilon_1 + \dots + f_{11}^{(1)} \varepsilon_{t-1} + \varepsilon_t$

where $f_{11}^{(j)}$ denotes the (1,1) element of $F^j$.


Wold Representation

We can also solve $Z_t = F Z_{t-1} + v_t$ by forward substitution:

$Z_{t+j} = F^{j+1} Z_{t-1} + F^j v_t + F^{j-1} v_{t+1} + \dots + F v_{t+j-1} + v_{t+j}$

The first equation of this system is

$Y_{t+j} = f_{11}^{(j+1)} Y_{t-1} + f_{12}^{(j+1)} Y_{t-2} + \dots + f_{1p}^{(j+1)} Y_{t-p} + f_{11}^{(j)} \varepsilon_t + f_{11}^{(j-1)} \varepsilon_{t+1} + \dots + f_{11}^{(1)} \varepsilon_{t+j-1} + \varepsilon_{t+j}$


Impulse Response functions

From the forward solution,

$\psi_1 = \dfrac{\partial Y_{t+1}}{\partial \varepsilon_t} = \dfrac{\partial Y_t}{\partial \varepsilon_{t-1}} = f_{11}^{(1)} = \phi_1$

$\psi_2 = \dfrac{\partial Y_{t+2}}{\partial \varepsilon_t} = \dfrac{\partial Y_t}{\partial \varepsilon_{t-2}} = f_{11}^{(2)} = \phi_1^2 + \phi_2$

- Higher order IRFs can also be derived in this way.

- But an easier way to compute the IRF is by simulation.


Computation of IRF: An example

- Consider $Y_t = 0.8 Y_{t-1} + 0.6 Y_{t-2} - 0.5 Y_{t-3} + \varepsilon_t$

- Let there be a one-unit shock in $\varepsilon_t$. Then the response of $Y_t$ can be captured by the IRF as follows (the recursion is also sketched in code below):

  $t$:      $\psi_0 = 1.00$
  $t+1$:    $\psi_1 = 0.8 \times 1.00 = 0.80$
  $t+2$:    $\psi_2 = 0.8 \times 0.80 + 0.6 \times 1.00 = 1.24$
  $t+3$:    $\psi_3 = 0.8 \times 1.24 + 0.6 \times 0.80 - 0.5 \times 1.00 = 0.97$
  $\vdots$
  $t+j$:    $\psi_j = 0.8 \psi_{j-1} + 0.6 \psi_{j-2} - 0.5 \psi_{j-3}$

- You may verify the results using $\psi_j = f_{11}^{(j)}$.
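The same recursion in code (coefficients taken from the example above):

```python
import numpy as np

phis = [0.8, 0.6, -0.5]                 # coefficients from the example above
n_irf = 8

psi = np.zeros(n_irf)
psi[0] = 1.0                            # one-unit shock at time t
for j in range(1, n_irf):
    # psi_j = sum_k phi_k * psi_{j-k}, treating psi with negative index as 0
    psi[j] = sum(phi * psi[j - k] for k, phi in enumerate(phis, start=1) if j - k >= 0)

print(np.round(psi[:4], 2))             # [1.0, 0.8, 1.24, 0.97]
```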

Analytical Characterization of IRF

- Since the IRF is related to the elements of $F^j$, the nature of the IRF can be better understood by analyzing the eigenvalues of $F$.

- Consider an AR(2) process: $Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \varepsilon_t$

  $\begin{bmatrix} Y_t \\ Y_{t-1} \end{bmatrix} = \begin{bmatrix} \phi_1 & \phi_2 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} Y_{t-1} \\ Y_{t-2} \end{bmatrix} + \begin{bmatrix} \varepsilon_t \\ 0 \end{bmatrix}$, i.e. $Z_t = F Z_{t-1} + v_t$

- The eigenvalues of $F$ are the values of $\lambda$ for which $|F - \lambda I| = 0$.


Analytical Characterization of IRF

For an AR(2),

$\begin{vmatrix} \phi_1 - \lambda & \phi_2 \\ 1 & -\lambda \end{vmatrix} = 0 \;\Rightarrow\; \lambda^2 - \phi_1 \lambda - \phi_2 = 0$

The eigenvalues of this second-order difference equation can be obtained as:

$\lambda_1 = \dfrac{\phi_1 + \sqrt{\phi_1^2 + 4\phi_2}}{2}, \qquad \lambda_2 = \dfrac{\phi_1 - \sqrt{\phi_1^2 + 4\phi_2}}{2}$

and $\lambda_i = z_i^{-1}$, where the $z_i$ are the roots of the AR lag polynomial.

(Note: the AR lag polynomial can be obtained by multiplying $\lambda^2 - \phi_1 \lambda - \phi_2$ by $z^2 = \lambda^{-2}$.)
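A quick numerical check, with assumed coefficients, that the eigenvalues of $F$ equal the reciprocals of the roots of the AR lag polynomial:

```python
import numpy as np

phi1, phi2 = 0.6, 0.2

# eigenvalues of the companion matrix F
F = np.array([[phi1, phi2],
              [1.0,  0.0]])
lam = np.linalg.eigvals(F)

# roots z_i of the AR lag polynomial 1 - phi1*z - phi2*z^2
z = np.roots([-phi2, -phi1, 1.0])

print(np.round(np.sort(lam), 4))        # eigenvalues lambda_i
print(np.round(np.sort(1.0 / z), 4))    # reciprocal roots: should coincide with lambda_i
```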


Analytical Characterization of IRF

This can be generalized to an AR(p) model.

Proposition. The eigenvalues of the matrix $F$ of an AR(p) process are the values of $\lambda$ that satisfy

$\lambda^p - \phi_1 \lambda^{p-1} - \phi_2 \lambda^{p-2} - \dots - \phi_{p-1}\lambda - \phi_p = 0$

Proof. See Hamilton.


Analytical Characterization of IRF

- The dynamic behavior of the system depends on the nature of the eigenvalues.

- Eigenvalues can be distinct or repeated, real or complex.


Solution with distinct eigenvalues

- If the eigenvalues are distinct, the $F$ matrix of an AR(p) process can be decomposed as follows:

  $F = T \Lambda T^{-1}$,

  where $T$ is a $p \times p$ nonsingular matrix and $\Lambda$ is a $p \times p$ matrix with the eigenvalues on the diagonal and zeros elsewhere.

- It can be proved that

  $F^j = T \Lambda^j T^{-1}$


Solution with distinct eigenvalues

Writing $T = [t_{ij}]$, $T^{-1} = [t^{ij}]$ and $\Lambda^j = \mathrm{diag}(\lambda_1^j, \lambda_2^j, \dots, \lambda_p^j)$, we have

$F^j = T \Lambda^j T^{-1} = \begin{bmatrix} t_{11} & t_{12} & \dots & t_{1p} \\ t_{21} & t_{22} & \dots & t_{2p} \\ \vdots & \vdots & \ddots & \vdots \\ t_{p1} & t_{p2} & \dots & t_{pp} \end{bmatrix} \begin{bmatrix} \lambda_1^j & 0 & \dots & 0 \\ 0 & \lambda_2^j & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & \lambda_p^j \end{bmatrix} \begin{bmatrix} t^{11} & t^{12} & \dots & t^{1p} \\ t^{21} & t^{22} & \dots & t^{2p} \\ \vdots & \vdots & \ddots & \vdots \\ t^{p1} & t^{p2} & \dots & t^{pp} \end{bmatrix}$

The (1,1) element of $F^j$:

$f_{11}^{(j)} = [t_{11} t^{11}] \lambda_1^j + [t_{12} t^{21}] \lambda_2^j + \dots + [t_{1p} t^{p1}] \lambda_p^j$

or

$f_{11}^{(j)} = c_1 \lambda_1^j + c_2 \lambda_2^j + \dots + c_p \lambda_p^j = \dfrac{\partial Y_{t+j}}{\partial \varepsilon_t}$


Solution with distinct eigenvalues

Proposition. If the eigenvalues $(\lambda_1, \lambda_2, \dots, \lambda_p)$ of the matrix $F$ are distinct, then

$c_i = \dfrac{\lambda_i^{p-1}}{\prod_{k=1, k \neq i}^{p} (\lambda_i - \lambda_k)}$

Proof. Please refer to Hamilton.

$\dfrac{\partial Y_{t+j}}{\partial \varepsilon_t} = f_{11}^{(j)} = \psi_j = c_1 \lambda_1^j + c_2 \lambda_2^j + \dots + c_p \lambda_p^j$


Examples

AR(1): $Y_t = \phi Y_{t-1} + \varepsilon_t$

$\lambda - \phi = 0 \;\Rightarrow\; \lambda_1 = \phi, \quad c_1 = 1$

$\dfrac{\partial Y_{t+j}}{\partial \varepsilon_t} = c_1 \lambda_1^j = \phi^j$

AR(2): $Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \varepsilon_t$, e.g. $Y_t = 0.6 Y_{t-1} + 0.2 Y_{t-2} + \varepsilon_t$

$\lambda^2 - \phi_1 \lambda - \phi_2 = 0 \;\Rightarrow\; \lambda^2 - 0.6\lambda - 0.2 = 0 \;\Rightarrow\; \lambda_1 = 0.84, \; \lambda_2 = -0.24$

$c_1 = \dfrac{\lambda_1}{\lambda_1 - \lambda_2}, \quad c_2 = \dfrac{\lambda_2}{\lambda_2 - \lambda_1} \;\Rightarrow\; c_1 = 0.778, \; c_2 = 0.222$

$\dfrac{\partial Y_{t+j}}{\partial \varepsilon_t} = c_1 \lambda_1^j + c_2 \lambda_2^j$
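A sketch reproducing the AR(2) numbers above and checking that $c_1\lambda_1^j + c_2\lambda_2^j$ matches the direct IRF recursion:

```python
import numpy as np

phi1, phi2 = 0.6, 0.2
lam = np.roots([1.0, -phi1, -phi2])          # lambda^2 - phi1*lambda - phi2 = 0
c = lam / (lam - lam[::-1])                  # c_i = lambda_i / (lambda_i - lambda_k), k != i

# psi_j = c1*lam1^j + c2*lam2^j, compared with the recursion psi_j = phi1*psi_{j-1} + phi2*psi_{j-2}
psi_eig = np.array([np.real(np.sum(c * lam**j)) for j in range(6)])
psi_rec = [1.0, phi1]
for j in range(2, 6):
    psi_rec.append(phi1 * psi_rec[-1] + phi2 * psi_rec[-2])

print(np.round(lam, 2), np.round(c, 3))      # approx [0.84, -0.24] and [0.778, 0.222]
print(np.round(psi_eig, 4))
print(np.round(psi_rec, 4))
```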

Complex roots

- The eigenvalues can be complex conjugates.

- Consider an AR(2) process with complex roots:

  $\lambda_1 = a + bi, \qquad \lambda_2 = a - bi$

  where $i = \sqrt{-1}$, $a = \dfrac{\phi_1}{2}$ and $b = \dfrac{1}{2}\sqrt{-(\phi_1^2 + 4\phi_2)}$

- In polar coordinate form:

  $\lambda_1 = R[\cos(\theta) + i\sin(\theta)] = R e^{i\theta}$
  $\lambda_2 = R[\cos(\theta) - i\sin(\theta)] = R e^{-i\theta}$

  where $R = \sqrt{a^2 + b^2}$, $\cos(\theta) = \frac{a}{R}$ and $\sin(\theta) = \frac{b}{R}$



Complex roots

$\dfrac{\partial Y_{t+j}}{\partial \varepsilon_t} = c_1 \lambda_1^j + c_2 \lambda_2^j$
$\qquad = c_1 R^j[\cos(j\theta) + i\sin(j\theta)] + c_2 R^j[\cos(j\theta) - i\sin(j\theta)]$
$\qquad = (c_1 + c_2) R^j \cos(j\theta) + (c_1 - c_2)\, i\, R^j \sin(j\theta)$
$\qquad = 2\alpha R^j \cos(j\theta) - 2\beta R^j \sin(j\theta)$

where $c_1 = \alpha + \beta i$ and $c_2 = \alpha - \beta i$, so the IRF is an oscillation, damped when $R < 1$.
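A sketch with assumed coefficients $\phi_1 = 1.0$, $\phi_2 = -0.5$ (which give complex eigenvalues), computing $R$ and $\theta$ and showing the oscillating IRF:

```python
import numpy as np

phi1, phi2 = 1.0, -0.5                 # assumed coefficients giving complex eigenvalues
a = phi1 / 2.0
b = 0.5 * np.sqrt(-(phi1**2 + 4 * phi2))
R, theta = np.hypot(a, b), np.arctan2(b, a)
print(R, theta)                        # modulus and frequency of the damped oscillation

# IRF from the recursion: it traces a damped oscillation when R < 1
psi = [1.0, phi1]
for j in range(2, 10):
    psi.append(phi1 * psi[-1] + phi2 * psi[-2])
print(np.round(psi, 3))
```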


Partial Autocorrelation Function

- The order of an MA process can be identified using the ACF.

- The ACF of an MA(q) process cuts off after lag q, but the ACF of an AR(p) dies off gradually.

- The order of an AR process can be identified using the PACF.

- The PACF is defined as the correlation between $Y_t$ and $Y_{t-j}$ after removing the effects of $Y_{t-1}$ through $Y_{t-j+1}$,

  i.e. $\phi_{jj} = \mathrm{Corr}(Y_t, Y_{t-j} \mid Y_{t-1}, Y_{t-2}, \dots, Y_{t-j+1})$


Partial Autocorrelation Function

Consider the following AR regressions, with $Y_t^* = (Y_t - \mu)$:

$Y_t^* = \phi_{11} Y_{t-1}^* + e_t$
$Y_t^* = \phi_{12} Y_{t-1}^* + \phi_{22} Y_{t-2}^* + e_t$
$Y_t^* = \phi_{13} Y_{t-1}^* + \phi_{23} Y_{t-2}^* + \phi_{33} Y_{t-3}^* + e_t$

- Here $\phi_{11}$ is the partial autocorrelation between $Y_t$ and $Y_{t-1}$.

- Similarly, $\phi_{22}$ is the partial autocorrelation between $Y_t$ and $Y_{t-2}$, and so on.

- $\phi_{22}$ measures the added contribution of $Y_{t-2}$ over $Y_{t-1}$.


Partial Autocorrelation Function

We can obtain the partial autocorrelations from the ACF as follows:

$\phi_{jj} = \begin{cases} \rho_1 & \text{for } j = 1 \\[4pt] \dfrac{\rho_j - \sum_{i=1}^{j-1} \phi_{j-1,i}\, \rho_{j-i}}{1 - \sum_{i=1}^{j-1} \phi_{j-1,i}\, \rho_i} & \text{for } j = 2, 3, \dots \end{cases}$

where $\phi_{ji} = \phi_{j-1,i} - \phi_{jj}\, \phi_{j-1,j-i}$, for $i = 1, 2, \dots, j-1$.
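A sketch of this recursion (the Durbin-Levinson algorithm); `pacf_from_acf` is an illustrative name. It is applied here to the theoretical ACF of the AR(2) example, where the PACF should cut off after lag 2:

```python
import numpy as np

def pacf_from_acf(rho):
    """Partial autocorrelations phi_jj from autocorrelations rho_1, rho_2, ... (Durbin-Levinson)."""
    m = len(rho)
    pacf = np.zeros(m)
    phi_prev = np.zeros(m)
    pacf[0] = phi_prev[0] = rho[0]                      # phi_11 = rho_1
    for j in range(2, m + 1):
        num = rho[j - 1] - np.sum(phi_prev[:j - 1] * rho[:j - 1][::-1])
        den = 1.0 - np.sum(phi_prev[:j - 1] * rho[:j - 1])
        phi_jj = num / den
        phi_new = phi_prev.copy()
        phi_new[:j - 1] = phi_prev[:j - 1] - phi_jj * phi_prev[:j - 1][::-1]
        phi_new[j - 1] = phi_jj
        pacf[j - 1] = phi_jj
        phi_prev = phi_new
    return pacf

# theoretical ACF of Y_t = 0.6 Y_{t-1} + 0.2 Y_{t-2} + eps_t (rho_1 = 0.75, then the AR(2) recursion)
rho = [0.75]
for _ in range(4):
    rho.append(0.6 * rho[-1] + 0.2 * (rho[-2] if len(rho) > 1 else 1.0))
print(np.round(pacf_from_acf(np.array(rho)), 3))        # PACF cuts off after lag 2
```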


Invertibility

- If an MA(1) process such as $Y_t = \mu + \varepsilon_t + \theta \varepsilon_{t-1}$ can be written as an AR($\infty$) process by inverting the MA lag operator, then the MA process is said to be invertible,

  i.e. $Y_t - \mu = (1 + \theta L)\varepsilon_t \;\Rightarrow\; (1 + \theta L)^{-1}(Y_t - \mu) = \varepsilon_t$,

  where $(1 + \theta L)^{-1} = \sum_{j=0}^{\infty} (-\theta)^j L^j$.

- Note that invertibility requires $|\theta| < 1$, or equivalently that the root of $(1 + \theta L) = 0$ lies outside the unit circle ($L = -\frac{1}{\theta}$).

- An MA(q) process is invertible only if the roots of $(1 + \theta_1 L + \theta_2 L^2 + \dots + \theta_q L^q) = 0$ all have modulus greater than one.
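A sketch of the root-based invertibility check for an assumed MA(2); note that the condition is on the roots, not on the individual coefficients:

```python
import numpy as np

thetas = [1.2, 0.5]   # assumed MA(2) coefficients: Y_t = eps_t + 1.2*eps_{t-1} + 0.5*eps_{t-2}

# roots of 1 + theta_1*L + theta_2*L^2 = 0 (highest power first for np.roots)
roots = np.roots(list(reversed([1.0] + thetas)))
print(roots, np.abs(roots))
print("invertible:", bool(np.all(np.abs(roots) > 1)))
```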


ARMA(p,q) process

- $Y_t = c + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \dots + \phi_p Y_{t-p} + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + \dots + \theta_q \varepsilon_{t-q}, \qquad \varepsilon_t \sim WN(0, \sigma^2)$

- or $(1 - \phi_1 L - \phi_2 L^2 - \dots - \phi_p L^p) Y_t = c + (1 + \theta_1 L + \theta_2 L^2 + \dots + \theta_q L^q)\varepsilon_t$

- Stationarity of an ARMA model depends only on the roots of the AR characteristic equation.

- The ARMA(p,q) process is stable and stationary only if the $p$ roots of the AR characteristic equation have modulus greater than one.

- If the stationarity condition is satisfied, the ARMA(p,q) process can be written in Wold form as: $Y_t = \phi(L)^{-1} c + \dfrac{\theta(L)}{\phi(L)}\, \varepsilon_t$


Moments of Stationary ARMA(p,q)

- Mean: $\mu = c/(1 - \phi_1 - \phi_2 - \dots - \phi_p)$

- Using the deviation form

  $(Y_t - \mu) = \phi_1(Y_{t-1} - \mu) + \phi_2(Y_{t-2} - \mu) + \dots + \phi_p(Y_{t-p} - \mu) + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + \dots + \theta_q \varepsilon_{t-q}$

  we can get the variance and autocovariances.

- For the autocovariances, multiply both sides by $(Y_{t-j} - \mu)$ and take expectations:

  $\gamma_j = \phi_1 \gamma_{j-1} + \phi_2 \gamma_{j-2} + \dots + \phi_p \gamma_{j-p}$ for $j = q+1, q+2, \dots$

- Divide both sides of $\gamma_j$ by $\gamma_0$ to get the Yule-Walker equations:

  $\rho_j = \phi_1 \rho_{j-1} + \phi_2 \rho_{j-2} + \dots + \phi_p \rho_{j-p}$ for $j = q+1, q+2, \dots$

- Calculation of $\gamma_1$ through $\gamma_q$ is more complicated, due to the correlation between $\theta_j \varepsilon_{t-j}$ and $Y_{t-j}$.


Example: ARMA(1,1)

- $Y_t = c + \phi_1 Y_{t-1} + \varepsilon_t + \theta_1 \varepsilon_{t-1}$

- $\mu = c/(1 - \phi_1)$

- Using the deviation form: $(Y_t - \mu) = \phi_1 (Y_{t-1} - \mu) + \varepsilon_t + \theta_1 \varepsilon_{t-1}$

- For the variance and autocovariances, multiply both sides by $(Y_{t-j} - \mu)$ and take expectations:

  $E[(Y_t - \mu)(Y_t - \mu)] = \phi_1 E[(Y_{t-1} - \mu)(Y_t - \mu)] + E[\varepsilon_t (Y_t - \mu)] + \theta_1 E[\varepsilon_{t-1}(Y_t - \mu)]$

  $\Rightarrow \gamma_0 = \phi_1 \gamma_1 + \sigma^2 + \theta_1(\phi_1 + \theta_1)\sigma^2$

  $E[(Y_t - \mu)(Y_{t-1} - \mu)] = \phi_1 E[(Y_{t-1} - \mu)(Y_{t-1} - \mu)] + E[\varepsilon_t (Y_{t-1} - \mu)] + \theta_1 E[\varepsilon_{t-1}(Y_{t-1} - \mu)]$

  $\Rightarrow \gamma_1 = \phi_1 \gamma_0 + \theta_1 \sigma^2$


Example: ARMA(1,1)

- $E[(Y_t - \mu)(Y_{t-j} - \mu)] = \phi_1 E[(Y_{t-1} - \mu)(Y_{t-j} - \mu)] + E[\varepsilon_t (Y_{t-j} - \mu)] + \theta_1 E[\varepsilon_{t-1}(Y_{t-j} - \mu)]$

  $\Rightarrow \gamma_j = \phi_1 \gamma_{j-1}$ for $j = 2, 3, 4, \dots$

- Autocorrelations:

  $\rho_1 = \dfrac{\gamma_1}{\gamma_0} = \dfrac{(1 + \phi_1 \theta_1)(\phi_1 + \theta_1)}{1 + \theta_1^2 + 2\phi_1\theta_1}$

  $\rho_j = \dfrac{\gamma_j}{\gamma_0} = \phi_1 \dfrac{\gamma_{j-1}}{\gamma_0} = \phi_1 \rho_{j-1}$ for $j = 2, 3, \dots$

- The ACF declines as the lag length increases.
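A sketch, with assumed parameters, that solves the two equations above for $\gamma_0$ and $\gamma_1$ and checks $\rho_1$ against the closed-form expression:

```python
import numpy as np

phi, theta, sigma2 = 0.5, 0.4, 1.0      # assumed example parameters

# gamma_0 = phi*gamma_1 + sigma2 + theta*(phi + theta)*sigma2
# gamma_1 = phi*gamma_0 + theta*sigma2
A = np.array([[1.0, -phi],
              [-phi, 1.0]])
b = np.array([sigma2 * (1.0 + theta * (phi + theta)),
              theta * sigma2])
gamma0, gamma1 = np.linalg.solve(A, b)

rho1 = gamma1 / gamma0
rho1_formula = (1 + phi * theta) * (phi + theta) / (1 + theta**2 + 2 * phi * theta)
print(round(rho1, 4), round(rho1_formula, 4))   # the two should agree

# higher-lag autocorrelations decay geometrically: rho_j = phi * rho_{j-1}
print(np.round(rho1 * phi ** np.arange(5), 4))
```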


General Behavior of the ACF and PACF for ARMA processes

- AR(p):  ACF decays toward zero (may oscillate); PACF cuts off after lag p.
- MA(q):  ACF cuts off after lag q; PACF decays toward zero (may oscillate).
- ARMA(p,q), $p \neq 0$ and $q \neq 0$:  ACF decays (directly or with oscillation) after lag q; PACF decays (directly or with oscillation) after lag p.

General Behavior of the ACF and PACF

- White noise:  ACF: all $\rho_j = 0$.  PACF: all $\phi_{jj} = 0$.
- AR(1), $\phi > 0$:  ACF: direct geometric decay, $\rho_j = \phi^j$.  PACF: $\phi_{11} = \rho_1$; $\phi_{jj} = 0$ for $j > 1$.
- AR(1), $\phi < 0$:  ACF: oscillating decay, $\rho_j = \phi^j$.  PACF: $\phi_{11} = \rho_1$; $\phi_{jj} = 0$ for $j > 1$.
- MA(1), $\theta > 0$:  ACF: positive spike at lag 1; $\rho_j = 0$ for $j > 1$.  PACF: oscillating decay, $\phi_{11} > 0$.
- MA(1), $\theta < 0$:  ACF: negative spike at lag 1; $\rho_j = 0$ for $j > 1$.  PACF: geometric decay, $\phi_{11} < 0$.
- ARMA(1,1), $\phi > 0$:  ACF: geometric decay after lag 1; sign of $\rho_1$ = sign($\phi + \theta$).  PACF: oscillating decay after lag 1, $\phi_{11} = \rho_1$.
- ARMA(1,1), $\phi < 0$:  ACF: oscillating decay after lag 1; sign of $\rho_1$ = sign($\phi + \theta$).  PACF: geometric decay after lag 1, $\phi_{11} = \rho_1$.
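A simulation sketch (assumed $\theta = 0.7$) of the MA(1) row of the table: the sample ACF should show a single spike at lag 1 (roughly $\theta/(1+\theta^2)$) and be near zero afterwards:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n = 0.7, 20000

eps = rng.normal(size=n + 1)
y = eps[1:] + theta * eps[:-1]          # MA(1): Y_t = eps_t + theta*eps_{t-1}

y0 = y - y.mean()
acf = np.array([np.sum(y0[j:] * y0[:n - j]) for j in range(5)]) / np.sum(y0**2)
print(np.round(acf, 3))                 # approx [1, theta/(1+theta^2), 0, 0, 0]
```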
