
Birla Institute of Technology and Science, Pilani (Raj.)
Second Semester, 2018-19
MATH F424 (Applied Stochastic Processes)
Mid Semester Examination (Closed Book)
Max. Marks: 60 Max. Time: 90 mins Date: 15 March, 2019 (Friday)
Solution
PART A (Short answers)
Each question carries 3 marks.
Q1. Explain when a stochastic process is wide sense stationary (weak sense stationary).
Ans. The process {Xt ; t ∈ T} is wide sense stationary (equivalently covariance-stationary or weak sense stationary) if
• the second moment of Xt is finite for all t, that is, E(Xt²) < ∞ ∀ t;
• the first moment of Xt is independent of t, i.e. constant: E(Xt) = μ ∀ t; and
• the covariance function CX(t1, t2) depends only on t1 - t2, that is, CX(t1, t2) = g(t1 - t2) ∀ t1, t2. [3]

Q2. Playing cards are selected at random, without replacement, from an ordinary deck. Consider the state space S = {Red, Black}. Calculate the chance of observing the sequence Red, Red, Black.
Ans. Let Xn denote the nth observation.
Observing a sequence Red, Red, Black means X0 =Red, X1 = Red, X2 = Black.
Pr[X0 = Red, X1 = Red, X2 = Black]
= Pr[X2 = Black, X1 = Red | X0 = Red] Pr[X0 = Red] [1]
= Pr[X2 = Black | X1 = Red, X0 = Red] Pr[X1 = Red | X0 = Red] Pr[X0 = Red]
= (26/52) × (25/51) × (26/50) = 13/102 ≈ 0.127 [2]
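The chain-rule product can be verified exactly with Python's `fractions` module (an illustrative check, not part of the required answer):

```python
from fractions import Fraction

# Exact probability of the sequence Red, Red, Black without replacement:
# 26/52 reds remain, then 25/51 reds, then 26/50 blacks.
prob = Fraction(26, 52) * Fraction(25, 51) * Fraction(26, 50)
print(prob)                   # 13/102
print(round(float(prob), 3))  # 0.127
```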

Q3. Imagine an office has only two telephones, so that at any point of time both phones are free (0), one phone is busy (1), or both phones are busy (2). The observation time is divided into equal-length intervals called slots, and we observe the system state at the slot boundaries. During each time slot, p is the chance that a call comes in, while q is the chance that a call in progress is completed. Write the one-step transition probability matrix for this system.
Ans.

P = [ 1-p       p                    0
      q(1-p)    (1-p)(1-q) + pq      (1-q)p
      0         q                    1-q    ]

(one mark for each row) [3]
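As a quick numerical sanity check (not part of the required answer; the helper name is illustrative), the matrix can be built for any p, q and its rows verified to sum to 1:

```python
import numpy as np

def two_phone_P(p, q):
    """One-step transition probability matrix over states 0, 1, 2
    (number of busy phones), as given in the solution."""
    return np.array([
        [1 - p,       p,                         0.0],
        [q * (1 - p), (1 - p) * (1 - q) + p * q, (1 - q) * p],
        [0.0,         q,                         1 - q],
    ])

P = two_phone_P(0.3, 0.4)
print(np.allclose(P.sum(axis=1), 1.0))  # True: each row is a distribution
```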
Q4. State ergodic theorem for reducible Markov chain.
Ans. Statement: Let {Xn, n ≥ 0} be a finite Markov chain with aperiodic states and a single closed class. Let P be the transition matrix of the m-state chain with state space S, and let P1 be the submatrix of transitions among the k (≤ m) members of the closed class C. Let V1 = (…, v1, …) be the stationary distribution corresponding to the stochastic submatrix P1, i.e. P1^n → eV1. If V = (V1, 0), then as n → ∞, P^n → eV. In other words, V is the stationary distribution corresponding to the matrix P. [3]
Q5. Let {N(t); t ∈ [0, ∞)} be a Poisson process with rate λ. Find the probability Pr[N(1) ≥ 1, N(3) ≤ 2].
Ans. Since N(1) ≤ N(3), we have
{N(1) ≥ 1, N(3) ≤ 2} = {N(1) = 1, N(3) ≤ 2} ∪ {N(1) = 2, N(3) ≤ 2}
= {N(1) = 1, N(3) - N(1) ≤ 1} ∪ {N(1) = 2, N(3) - N(1) = 0} [1]

Using independence of N(1) and N(3) – N(1),


Pr[N(1) ≥ 1, N(3) ≤ 2] = Pr[N(1) = 1]Pr[N(3) – N(1) ≤ 1] + Pr[N(1) = 2]Pr[N(3) – N(1) = 0].

Using N(1) ~ Poi(λ) and N(3) - N(1) ~ Poi(2λ), the required probability equals
Pr[N(1) ≥ 1, N(3) ≤ 2] = λ exp(-λ) [exp(-2λ) + 2λ exp(-2λ)] + (λ²/2) exp(-λ) exp(-2λ)
= (5λ²/2 + λ) exp(-3λ) [2]
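A deterministic cross-check of the closed form (the rate λ = 1.7 below is an arbitrary test value): sum the exact joint probabilities over the admissible pairs of N(1) and the increment N(3) - N(1).

```python
import math

lam = 1.7  # arbitrary test rate

def pois(k, mu):
    """Poisson pmf P(K = k) for K ~ Poi(mu)."""
    return math.exp(-mu) * mu**k / math.factorial(k)

# Independent increments: N(1) ~ Poi(lam), N(3) - N(1) ~ Poi(2*lam);
# the event requires j in {1, 2} and j + k <= 2.
direct = sum(pois(j, lam) * pois(k, 2 * lam)
             for j in (1, 2) for k in range(3 - j))
closed_form = (2.5 * lam**2 + lam) * math.exp(-3 * lam)
print(abs(direct - closed_form) < 1e-12)  # True
```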

Q6. A machine is subject to failure due to shocks arriving from two independent sources. The shocks from source 1 arrive according to a Poisson process with rate 3 per day, and those from source 2 arrive at rate 4 per day. What are the mean and variance of the total number of shocks from both sources over an 8-hour shift?

Ans. Given N1(t) ~ Poi(λ1 = 3/day); for an 8-hour shift the parameter is 3 × 8/24 = 1 per 8 hr,
and N2(t) ~ Poi(λ2 = 4/day); for an 8-hour shift the parameter is 4 × 8/24 = 4/3 per 8 hr. [1]
Let N(t) = N1(t) + N2(t), which is again a Poisson process. Then
E[N(t)] = E[N1(t)] + E[N2(t)] = 1 + 4/3 = 7/3, and since N(t) is Poisson, Var[N(t)] = E[N(t)] = 7/3. [2]
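A small simulation (illustrative, not part of the required answer) confirms that the superposed count over a shift has mean and variance both near 7/3:

```python
import numpy as np

# Superposition of independent Poisson streams is Poisson with the summed
# rate: (3 + 4) shocks/day = 7 * 8/24 = 7/3 per 8-hour shift.
rng = np.random.default_rng(0)
counts = rng.poisson(3 * 8 / 24, 200_000) + rng.poisson(4 * 8 / 24, 200_000)
print(abs(counts.mean() - 7 / 3) < 0.05)  # sample mean close to 7/3
print(abs(counts.var() - 7 / 3) < 0.05)   # sample variance close to 7/3
```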

PART B

Q1. Consider a stochastic process {X(t)} such that X(t) = A cos(ωt + θ), where A and ω are constants and θ is a uniform random variable distributed in the interval (-π, π). Show that {X(t)} is a stationary process in the wide sense. [10]

Ans. Given X(t) = A cos(ωt + θ). Since θ is uniformly distributed in the interval (-π, π), the pdf of θ is
f(θ) = 1/(2π),  -π ≤ θ ≤ π. [1]
Consider
E{X(t)} = E[A cos(ωt + θ)] = ∫_{-π}^{π} A cos(ωt + θ) (1/2π) dθ
= (A/2π) [sin(ωt + θ)]_{-π}^{π} = 0, [2]
so E{X(t)} is constant. Also
E{X²(t)} = E[A² cos²(ωt + θ)] = ∫_{-π}^{π} A² cos²(ωt + θ) (1/2π) dθ = A²/2 < ∞. [2]
Now consider the covariance
C_X(t1, t2) = E{X(t1)X(t2)} - E{X(t1)}E{X(t2)} [1]
= E{A cos(ωt1 + θ) A cos(ωt2 + θ)} - 0
= A² E{cos(ωt1 + θ) cos(ωt2 + θ)}
= (A²/2) E{cos(ω(t1 - t2)) + cos(ω(t1 + t2) + 2θ)} [1]
= (A²/2) [∫_{-π}^{π} cos(ω(t1 - t2)) (1/2π) dθ + ∫_{-π}^{π} cos(ω(t1 + t2) + 2θ) (1/2π) dθ]
= (A²/2) cos(ω(t1 - t2)) + (A²/8π) [sin(ω(t1 + t2) + 2θ)]_{-π}^{π}
= (A²/2) cos(ω(t1 - t2)). [2]
The covariance is a function of (t1 - t2) only; together with the constant mean and finite second moment, X(t) is wide sense stationary. [1]
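The autocovariance formula can be checked numerically (A, ω, t1, t2 below are arbitrary test values; the midpoint rule is exact here because the integrand is a trigonometric polynomial in θ):

```python
import math

# Check that E[X(t1) X(t2)] = (A^2/2) cos(w (t1 - t2)) for
# X(t) = A cos(w t + theta), theta ~ Uniform(-pi, pi).
A, w, t1, t2 = 2.0, 1.3, 0.7, 2.9
n = 1_000
cov = 0.0
for k in range(n):
    th = -math.pi + (k + 0.5) * 2 * math.pi / n   # midpoint nodes in (-pi, pi)
    cov += A * math.cos(w * t1 + th) * A * math.cos(w * t2 + th)
cov /= n   # midpoint estimate of (1/2pi) * integral over theta
print(abs(cov - (A**2 / 2) * math.cos(w * (t1 - t2))) < 1e-9)  # True
```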
Q2. Prove that in an irreducible Markov chain all the states are of the same type: they are either all transient, all persistent null, or all persistent non-null. Also show that the states are either all aperiodic or all periodic with the same period. [10]
Ans. See page 87

Q3. Let {Nt; t ∈ [0, ∞)} be a Poisson process with rate λ. Fix two time points 0 < s < t.
(a) Find E(Ns Nt). [Hint: Ns Nt = Ns² + Ns(Nt - Ns).]
(b) Find the covariance between Ns and Nt. [6]

Ans. (a) E(Ns Nt) = E(Ns²) + E(Ns(Nt - Ns))
= ((λs)² + λs) + (λs × λ(t - s))   (Ns and Nt - Ns are counts over non-overlapping intervals and hence independent) [1]
= λs(1 + λt). [2]

(b) Cov(Ns, Nt) = E(Ns Nt) - E(Ns)E(Nt) = λs(1 + λt) - λs·λt = λs = λ min(s, t). [3]
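Both identities can be checked by Monte Carlo using the independent-increments property (λ, s, t below are arbitrary test values):

```python
import numpy as np

# N_s ~ Poi(lam*s) and N_t - N_s ~ Poi(lam*(t - s)) are independent, so a
# sample of N_t is a sample of N_s plus an independent increment.
rng = np.random.default_rng(42)
lam, s, t, n = 2.0, 1.0, 3.0, 500_000
Ns = rng.poisson(lam * s, n)
Nt = Ns + rng.poisson(lam * (t - s), n)

# E(Ns Nt) = lam*s*(1 + lam*t) = 14 and Cov(Ns, Nt) = lam*min(s, t) = 2 here.
print(abs((Ns * Nt).mean() - lam * s * (1 + lam * t)) < 0.1)
print(abs(np.cov(Ns, Nt)[0, 1] - lam * s) < 0.05)
```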

Q4. Let B(t) be a pure birth process with parameter λn and B(0) = 0; the birth rate is λn = nλ + ν. Write the differential-difference equations for pn(t) = Pr{B(t) = n}. Without solving these equations, use them to show that m(t) = E[B(t)] = Σ_{n=0}^{∞} n pn(t) satisfies m′(t) = λ m(t) + ν. [8]
Ans. Given that B(t) is a birth process with birth rate λn = nλ + ν. Therefore the differential-difference equations are
p′n(t) = -(nλ + ν) pn(t) + ((n-1)λ + ν) pn-1(t),  n ≥ 1
p′0(t) = -ν p0(t),  n = 0 [2]

Given m(t) = E[B(t)] = Σ_{n=0}^{∞} n pn(t), i.e. m′(t) = Σ_{n=0}^{∞} n p′n(t). [2]
Now multiply the differential-difference equation by n and sum from 0 to ∞:
Σ_{n=0}^{∞} n p′n(t) = -Σ_{n=0}^{∞} n(nλ + ν) pn(t) + Σ_{n=1}^{∞} n((n-1)λ + ν) pn-1(t) [2]
= -Σ_{n=0}^{∞} (n²λ + nν) pn(t) + Σ_{n=0}^{∞} (n+1)(nλ + ν) pn(t)
= Σ_{n=0}^{∞} ((-n² + n² + n)λ - nν + (n+1)ν) pn(t)
= Σ_{n=0}^{∞} nλ pn(t) + Σ_{n=0}^{∞} ν pn(t) = λ m(t) + ν.
Hence m′(t) = λ m(t) + ν. [2]
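As a direct check (not part of the asked derivation), one can integrate a truncated version of the differential-difference equations with Euler steps and compare m(t) = Σ n pn(t) against the solution of the ODE m′ = λm + ν with m(0) = 0, namely m(t) = (ν/λ)(e^{λt} - 1). The parameters and truncation level are illustrative choices.

```python
import math

# Euler integration of p'_n = -(n*lam + nu) p_n + ((n-1)*lam + nu) p_{n-1},
# truncated at N states; the tail mass is negligible for these parameters.
lam, nu, T, dt, N = 0.5, 1.0, 1.0, 1e-4, 60
p = [1.0] + [0.0] * N                       # p_0(0) = 1 because B(0) = 0
for _ in range(int(T / dt)):
    dp = [-(n * lam + nu) * p[n]
          + (((n - 1) * lam + nu) * p[n - 1] if n > 0 else 0.0)
          for n in range(N + 1)]
    p = [p[n] + dt * dp[n] for n in range(N + 1)]

m_numeric = sum(n * pn for n, pn in enumerate(p))
m_exact = (nu / lam) * math.expm1(lam * T)  # (nu/lam)(e^{lam T} - 1)
print(abs(m_numeric - m_exact) < 1e-3)      # True
```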

Q5. Let {Sn : n ≥ 0} be a simple random walk with S0 = 0, show that Xn = |Sn| defines a Markov chain;
find the transition probabilities of this chain. [8]

Ans. Simple random walk: {Sn : n ≥ 0} is a simple random walk if a particle at position i at the nth observation moves one step forward with probability p and one step back with probability q, where p + q = 1; i.e. at the (n+1)th step the particle is at i+1 with probability p or at i-1 with probability q. The state space of Sn is the set of all integers, and the system starts at 0 (S0 = 0). Mathematically,
pi,i+1 = p,  pi,i-1 = q,  -∞ < i < ∞. [2]

Now Xn = |Sn| takes values in {0, 1, 2, …}, and Xn+1 = |Sn+1| equals Xn + 1 or Xn - 1. The probabilities of these two moves, however, depend on the sign of Sn, which Xn does not reveal: if Sn = i > 0 then Xn+1 = i+1 with probability p, whereas if Sn = -i then Xn+1 = i+1 with probability q. We must therefore show that, conditional on the past of X, these probabilities still depend only on i.
Hence,
Pr(Xn+1 = i +1|Xn = i, B) = Pr(Xn+1 = i +1|Sn = i, B)Pr(Sn = i |Xn = i, B)
+ Pr(Xn+1 = i +1|Sn = - i, B)Pr(Sn = - i|Xn = i, B) [2]
where B = {Xr = ir for 0 ≤ r < n} and i0, i1, i2,… in-1 are integers.
Clearly

Pr(Xn+1 = i +1|Sn = i, B) = p and Pr(Xn+1 = i +1|Sn = - i, B) = q.

Let l be the time of the last visit to 0 prior to time n, l = max{r : ir = 0}. During the time interval (l, n], the path lies entirely in either the positive integers or the negative integers. In the former case it must follow the route prescribed by the event B ∩ {Sn = i}, and in the latter the route prescribed by B ∩ {Sn = -i}. The absolute probabilities of these two routes are
π1 = p^((n-l+i)/2) q^((n-l-i)/2)  and  π2 = p^((n-l-i)/2) q^((n-l+i)/2). [2]

Therefore,
Pr(Sn = i | Xn = i, B) = π1/(π1 + π2) = p^i/(p^i + q^i) = 1 - Pr(Sn = -i | Xn = i, B),
and hence
Pr(Xn+1 = i+1 | Xn = i, B) = (p^(i+1) + q^(i+1))/(p^i + q^i) = 1 - Pr(Xn+1 = i-1 | Xn = i, B).

These transition probabilities depend only on i and not on the previous states (the event B), so {Xn} is a Markov chain. [2]
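The derived transition probability can be checked empirically by simulating many walks and pooling all visits to a fixed state i (the walk parameters and i below are arbitrary test choices):

```python
import numpy as np

# Empirical check of Pr(X_{n+1} = i+1 | X_n = i) = (p^{i+1} + q^{i+1}) / (p^i + q^i).
rng = np.random.default_rng(1)
p, i = 0.6, 2
steps = np.where(rng.random((100_000, 40)) < p, 1, -1)   # +1 w.p. p, -1 w.p. q
S = np.concatenate([np.zeros((100_000, 1), dtype=int),
                    steps.cumsum(axis=1)], axis=1)        # S_0 = 0, then partial sums
X = np.abs(S)                                             # X_n = |S_n|

at_i = X[:, :-1] == i          # all (walk, time) pairs with X_n = i
went_up = X[:, 1:] == i + 1    # did the chain move to i + 1 at the next step?
empirical = went_up[at_i].mean()
predicted = (p**(i + 1) + (1 - p)**(i + 1)) / (p**i + (1 - p)**i)
print(abs(empirical - predicted) < 0.01)  # True
```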

****END****
