1. (20) Let X0, X1, . . . be a Markov chain with state space {0, 1, 2}, initial distribution
a = ( __ , 0.4, 0.5), and transition matrix

            | 0.4  __   0.4 |            | 0.34  0.26  0.40 |
        P = | 0.3  __   0.2 | ,    P^2 = | 0.33  0.35  0.32 |
            | __   0.2  0.5 |            | 0.33  0.26  0.41 |
(a) (2) Fill in the missing entries of P and a.
(b) (1) Find P(X9 = 2 | X8 = 2).
(d) (4) Find P(X1 = 0).
(e) (4) Find P(X0 = 2 | X1 = 2).
(f) (5) Find E[X2 | X0 = 2].
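The blanks in P (one per row) are pinned down by the fact that each row of a transition matrix sums to 1. A quick sketch, assuming the blanks are filled that way, checking that the completed P is consistent with the P^2 given in the problem:

```python
# Transition matrix with the blanks filled in via the row-sum-to-1 constraint.
P = [
    [0.4, 0.2, 0.4],   # 0.2 fills the blank in row 0
    [0.3, 0.5, 0.2],   # 0.5 fills the blank in row 1
    [0.3, 0.2, 0.5],   # 0.3 fills the blank in row 2
]

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

for row in matmul(P, P):
    print([round(x, 2) for x in row])
# Matches the P^2 stated in the problem:
# [0.34, 0.26, 0.4]
# [0.33, 0.35, 0.32]
# [0.33, 0.26, 0.41]
```

The same row-sum argument gives 0.1 for the blank first entry of a.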
2. (20) Let (Xn)n≥1 be a Markov chain with state space {1, 2, 3, 4, 5} with transition matrix

            | 0.25  0.75   0     0     0   |
            | 0.5   0.5    0     0     0   |
        P = |  0    0.25   0    0.75   0   |
            |  0     0    0.5    0    0.5  |
            |  0     0     0     0     1   |
(a) (3) Draw the transition diagram.
(b) (6) Specify the classes and determine whether they are transient or recurrent.
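The classification in part (b) can be cross-checked mechanically: two states communicate iff each is reachable from the other, and (in a finite chain) a class is recurrent iff it is closed. A minimal sketch using the 5-state matrix from this problem:

```python
# Classify the communicating classes of the 5-state chain (0-based indices).
P = [
    [0.25, 0.75, 0,    0,    0  ],
    [0.5,  0.5,  0,    0,    0  ],
    [0,    0.25, 0,    0.75, 0  ],
    [0,    0,    0.5,  0,    0.5],
    [0,    0,    0,    0,    1  ],
]
n = len(P)

def reachable(i):
    """States reachable from i (including i) via positive-probability steps."""
    seen, stack = {i}, [i]
    while stack:
        u = stack.pop()
        for v in range(n):
            if P[u][v] > 0 and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

reach = [reachable(i) for i in range(n)]
# i and j are in the same class iff i reaches j and j reaches i.
classes = {frozenset(j for j in reach[i] if i in reach[j]) for i in range(n)}
for c in sorted(classes, key=min):
    closed = all(reach[i] <= c for i in c)   # closed class => recurrent (finite chain)
    print(sorted(s + 1 for s in c), "recurrent" if closed else "transient")
# Prints: {1,2} recurrent, {3,4} transient, {5} recurrent
```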
3. (14) Suppose you are working as an independent consultant for a company. Your performance is
evaluated at the end of each month and the evaluation falls into one of three categories: 1, 2, and
3. Your evaluation can be modeled as a discrete-time Markov chain and its transition diagram is
as follows.
Denote your evaluation at the end of the nth month by Xn and assume that X0 = 2. Your monthly
salary is determined by the evaluation of each month in the following way:
Salary is
$ 1000 when your evaluation is 1
$ 4000 when your evaluation is 2
$ 8000 when your evaluation is 3
(a) (4) What are the state space, transition probability matrix, and initial distribution of Xn?
(b) (4) Does a stationary distribution exist? If so, what is it? (Set up the
equations; you don't need to solve them.)
(c) (2) What is the long-run fraction of time when your evaluation is either 2 or 3?
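The transition diagram for this problem did not survive extraction, so the matrix below is purely hypothetical; it only illustrates the method for parts (b) and (c): solve the stationary equations πP = π with the entries summing to 1 (here by power iteration), then the long-run fraction of time in {2, 3} is π2 + π3, and the long-run average salary is the π-weighted salary.

```python
# Q is a HYPOTHETICAL 3-state evaluation matrix standing in for the one
# defined by the (missing) transition diagram; states are 1, 2, 3.
Q = [
    [0.5, 0.5, 0.0],
    [0.2, 0.5, 0.3],
    [0.0, 0.4, 0.6],
]
salary = {1: 1000, 2: 4000, 3: 8000}   # salaries from the problem statement

# Power iteration: repeatedly push a distribution through Q until it settles
# at the stationary distribution (Q here is irreducible and aperiodic).
pi = [1 / 3] * 3
for _ in range(10000):
    pi = [sum(pi[i] * Q[i][j] for i in range(3)) for j in range(3)]

print("stationary distribution:", [round(x, 4) for x in pi])
print("long-run fraction in {2,3}:", round(pi[1] + pi[2], 4))
print("long-run average salary:", round(sum(pi[s - 1] * salary[s] for s in (1, 2, 3)), 2))
```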
4. (20) Consider the reflected random walk on state space {0, 1, 2, . . .} with the following transition
probabilities: p00 = q + r, p01 = p, and pi,i-1 = q, pi,i = r, pi,i+1 = p for i ≥ 1, where p + q + r = 1
and p, q, r > 0.
(a) (2) Is the Markov chain periodic or aperiodic? Explain, and if it is periodic, also give the
period.
(c) (8) Find the stationary distribution when p = 0.2, q = 0.4, r = 0.4. Is the stationary distribution unique?
(d) (4) Is the Markov chain positive recurrent when p = 0.5, q = 0.2, r = 0.3? If so, why? If
not, why not? (You don't need to prove it.)
(e) (4) For the probabilities given in part (c), is the Markov chain positive recurrent? If so, why?
If not, why not?
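For part (c), the chain is a birth-death chain, so a stationary distribution can be read off the detailed-balance relations πi p = πi+1 q, giving πi = (1 - p/q)(p/q)^i when p < q; with p = 0.2, q = 0.4 this is πi = (1/2)^(i+1). A sketch checking that this candidate satisfies the full balance equations:

```python
# Candidate stationary distribution for p = 0.2, q = 0.4, r = 0.4:
# detailed balance pi_i * p = pi_{i+1} * q  =>  pi_i = (1 - p/q) * (p/q)**i.
p, q, r = 0.2, 0.4, 0.4
rho = p / q                           # = 0.5, so pi_i = (1/2)**(i+1)
pi = lambda i: (1 - rho) * rho**i

# Full balance at state 0 uses p00 = q + r and the down-step from state 1.
assert abs(pi(0) - (pi(0) * (q + r) + pi(1) * q)) < 1e-12

# Full balance at interior states: inflow from i-1 (prob p), i (prob r), i+1 (prob q).
for j in range(1, 50):
    inflow = pi(j - 1) * p + pi(j) * r + pi(j + 1) * q
    assert abs(pi(j) - inflow) < 1e-12

# The masses sum to 1 in the limit; check a long partial sum.
print(sum(pi(i) for i in range(200)))   # ~1.0
```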
5. (12 + Bonus 10) A call center is staffed by two agents, Mary and John. Mary's service times are
iid, exponentially distributed with mean 4 minutes, and John's service times are iid, exponentially
distributed with mean 6 minutes. Suppose that when you arrive, both Mary and John are busy,
and you are the only one waiting in line.
(a) (4) What is the probability that John will serve you?
(b) (4) What is the expected waiting time until you can be answered?
(c) (4) What is the probability that you will be answered within 4 minutes?
(d) (Bonus 10) What is your expected time in the system (waiting plus service)?
Note: Bonus question has no partial credit.
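By memorylessness, when you arrive the two residual service times are fresh independent exponentials with rates 1/4 and 1/6 per minute, and you are taken by whichever agent finishes first. A sketch of the standard competing-exponentials facts used in parts (a)-(c):

```python
import math

# Residual service times: Exp(mu_M) and Exp(mu_J), independent, by memorylessness.
mu_M = 1 / 4    # Mary's rate (mean service time 4 min)
mu_J = 1 / 6    # John's rate (mean service time 6 min)

# (a) The minimum of independent exponentials is achieved by John
#     with probability mu_J / (mu_M + mu_J).
p_john = mu_J / (mu_M + mu_J)
print(round(p_john, 4))            # 0.4

# (b) The minimum itself is Exp(mu_M + mu_J), so the expected wait is its mean.
expected_wait = 1 / (mu_M + mu_J)
print(round(expected_wait, 4))     # 2.4 minutes

# (c) P(wait <= 4) for an Exp(mu_M + mu_J) waiting time.
p_within_4 = 1 - math.exp(-(mu_M + mu_J) * 4)
print(round(p_within_4, 4))        # 1 - e^(-5/3) ≈ 0.8111
```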
7. (Bonus 10) A professor continually gives exams to her students. She can give three possible types
of exams, and her class is graded as either having done well or badly. Let pi denote the probability
that the class does well on a type i exam, and suppose that p1 = 0.3, p2 = 0.6, p3 = 0.9. If the
class does well on an exam, then the next exam is equally likely to be any of the three types. If the
class does badly, then the next exam is always type 1. Identify a state space, model it as a DTMC
and show how to determine the proportion of exams that are type i, i = 1, 2, 3.
(Note: Bonus question has no partial credit.)
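One way to set up problem 7: take the state to be the type of the current exam, so from state i the next exam is uniform over {1, 2, 3} with probability pi and type 1 with probability 1 - pi; the long-run proportion of type-i exams is then the stationary probability πi. A sketch (power iteration used here as one way to solve πP = π):

```python
# State = type of the current exam (1-based in the problem, 0-based here).
# From state i, next exam is uniform over {1,2,3} w.p. p[i] (class did well),
# and type 1 w.p. 1 - p[i] (class did badly).
p = [0.3, 0.6, 0.9]
P = [[p[i] / 3 + (1 - p[i]) * (j == 0) for j in range(3)] for i in range(3)]
# Rows: [0.8, 0.1, 0.1], [0.6, 0.2, 0.2], [0.4, 0.3, 0.3]

# Power iteration for the stationary distribution; the chain is irreducible
# and aperiodic, so any starting distribution converges to the unique pi.
pi = [1 / 3] * 3
for _ in range(1000):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

print([round(x, 4) for x in pi])   # [0.7143, 0.1429, 0.1429]  (= 5/7, 1/7, 1/7)
```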