
ISYE 3232C Spring 2015 Midterm 2

I, ______________________, do swear that I abide by the Georgia Tech Honor Code. I understand
that any honor code violation will result in an F for this course.

Signature: ______________________
You will have 1 hour and 15 minutes.
Turn off your cell phone and leave it on the desk facing down.
This midterm is closed book and closed notes. Calculators are NOT allowed. No scrap paper is
allowed. If you need scrap paper, use the front and back pages of the test sheet.
If you need extra space, use the back of the page and indicate that you have done so.
Do not remove any page from the original staple; otherwise, 25 points will be deducted.
Show your work. If you do not show your work for a problem, you will receive zero points for
that problem even if your answer is correct.
We will not select among several answers. Make sure it is clear which part of your work you
want graded. If two answers are given, zero points will be given for the problem.
Some guidelines for writing answers on the answer sheet:
1. Our grading is based on your answer and work in the answer sheet.
2. When a numerical answer is requested, your answer must be a real or fractional
number (e.g., 0.356, 100, 3 per hr, 3/8, etc.).
3. If a numerical answer is not requested, it is better to leave your answer in terms of mathematical
operations to avoid losing points due to calculation errors (e.g., 345 × 50 + 500 + 3(10 + 20 +
30)/49).
4. When a numerical answer is not expected, you will receive full credit if someone with no
understanding of calculus, probability, or statistics could simplify your answer to obtain a
correct numerical answer with a basic calculator.
5. If you are asked to show work, you must do so on the answer sheet. No points will be given
if your work is not shown on the answer sheet.

PLEASE DO NOT TURN THE PAGE UNTIL INSTRUCTED TO DO SO!

1. (20) Let X0, X1, . . . be a Markov chain with state space {0, 1, 2}, initial distribution a =
( __ , 0.4, 0.5), and transition matrix

         0.4   __   0.4                 0.34  0.26  0.40
    P =  0.3   __   0.2  ,      P^2 =   0.33  0.35  0.32
         __    0.2  0.5                 0.33  0.26  0.41

(a) (2) Fill in the missing entries of P and a.
(b) (1) P(X9 = 2 | X8 = 2)

(c) (4) P(X8 = 2, X10 = 1 | X7 = 0)

(d) (4) P(X1 = 0)

(e) (4) P(X0 = 2 | X1 = 2)

(f) (5) E[X2 | X0 = 2]
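
For self-checking, a minimal numpy sketch of parts (b) through (f). It assumes the blanks sit where shown above, in which case they are forced by each row of P (and the vector a) summing to 1, and the completed P reproduces the given P^2:

    import numpy as np

    # Completed P (blanks: 0.2, 0.5, 0.3) and a (blank: 0.1).
    P = np.array([[0.4, 0.2, 0.4],
                  [0.3, 0.5, 0.2],
                  [0.3, 0.2, 0.5]])
    a = np.array([0.1, 0.4, 0.5])
    P2 = P @ P                              # matches the given P^2

    print(P[2, 2])                          # (b) P(X9 = 2 | X8 = 2)
    print(P[0, 2] * P2[2, 1])               # (c) P(X8=2 | X7=0) * P(X10=1 | X8=2)
    print((a @ P)[0])                       # (d) P(X1 = 0)
    print(a[2] * P[2, 2] / (a @ P)[2])      # (e) Bayes' rule
    print(P2[2] @ np.array([0, 1, 2]))      # (f) E[X2 | X0 = 2] = sum_j j * P2[2, j]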

2. (20) Let (Xn)n≥1 be a Markov chain with state space {1, 2, 3, 4, 5} with transition matrix

         0.25  0.75  0    0     0
         0.5   0.5   0    0     0
    P =  0     0.25  0    0.75  0
         0     0     0.5  0     0.5
         0     0     0    0     1
(a) (3) Draw the transition diagram.

(b) (6) Specify the classes and determine whether they are transient or recurrent.

(c) (1) Identify the period of state 4.

(d) (10) Calculate lim P^n as n → ∞ and fill in the matrix (the entry 2/5 is given):

    lim P^n =
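
For part (d), the limit can be sanity-checked numerically by raising P to a large power (a minimal sketch, assuming numpy is available):

    import numpy as np

    P = np.array([[0.25, 0.75, 0,   0,    0  ],
                  [0.5,  0.5,  0,   0,    0  ],
                  [0,    0.25, 0,   0.75, 0  ],
                  [0,    0,    0.5, 0,    0.5],
                  [0,    0,    0,   0,    1  ]])

    # Rows of P^n converge; e.g., rows 1 and 2 approach (2/5, 3/5, 0, 0, 0),
    # the stationary distribution of the closed class {1, 2}.
    print(np.linalg.matrix_power(P, 200))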
3. (14) Suppose you are working as an independent consultant for a company. Your performance is
evaluated at the end of each month and the evaluation falls into one of three categories: 1, 2, and
3. Your evaluation can be modeled as a discrete-time Markov chain and its transition diagram is
as follows. [Transition diagram not reproduced in this copy.]

Denote your evaluation at the end of the nth month by Xn and assume that X0 = 2. Your monthly
salary is determined by each month's evaluation in the following way:
Salary is
$1000 when your evaluation is 1,
$4000 when your evaluation is 2,
$8000 when your evaluation is 3.
(a) (4) What are the state space, transition probability matrix, and initial distribution of Xn?

(b) (4) Does the stationary distribution exist? If so, what is the stationary distribution? (Set up
the equations; you don't need to solve them.)

(c) (2) What is the long-run fraction of time when your evaluation is either 2 or 3?

(d) (4) What is the long-run average monthly salary?
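
Parts (b) through (d) reduce to solving pi = pi P together with sum(pi) = 1. A minimal sketch; P_example is a hypothetical placeholder, since the actual entries come from the transition diagram:

    import numpy as np

    P_example = np.array([[0.5, 0.3, 0.2],    # hypothetical entries; replace
                          [0.2, 0.5, 0.3],    # with the diagram's probabilities
                          [0.1, 0.4, 0.5]])

    # Solve pi (P - I) = 0 together with the normalization sum(pi) = 1.
    A = np.vstack([P_example.T - np.eye(3), np.ones(3)])
    pi, *_ = np.linalg.lstsq(A, np.array([0, 0, 0, 1.0]), rcond=None)

    print(pi[1] + pi[2])                      # (c) long-run fraction in state 2 or 3
    print(pi @ np.array([1000, 4000, 8000]))  # (d) long-run average monthly salary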



4. (20) Consider the reflected random walk on state space {0, 1, 2, . . .} with the following transition
probabilities: p00 = q + r, p01 = p, and pi,i−1 = q, pii = r, pi,i+1 = p for i ≥ 1, where p + q + r = 1
and p, q, r > 0.
(a) (2) Is the Markov chain periodic or aperiodic? Explain, and if it is periodic, also give the
period.

(b) (2) Is the Markov chain irreducible? Explain.

(c) (8) Find the stationary distribution when p = 0.2, q = 0.4, r = 0.4. Is the stationary distribution unique?

(d) (4) Is the Markov chain positive recurrent when p = 0.5, q = 0.2, r = 0.3? If so, why? If
not, why not? (You don't need to prove it.)

(e) (4) For the probabilities given in part (c), is the Markov chain positive recurrent? If so, why?
If not, why not?
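
For part (c), the chain is a birth-death chain, so the flow-balance guess pi_{i+1} = (p/q) pi_i gives pi_i = (1 - p/q)(p/q)^i. A truncated numerical check (a sketch in plain Python):

    # Verify pi_i = pi_{i-1} p + pi_i r + pi_{i+1} q for interior states
    # of a truncated version of the chain.
    p, q, r = 0.2, 0.4, 0.4
    rho = p / q
    N = 60                                    # truncation level; tail mass ~ 2**-60
    pi = [(1 - rho) * rho**i for i in range(N)]

    for i in range(1, N - 1):
        assert abs(pi[i] - (pi[i-1]*p + pi[i]*r + pi[i+1]*q)) < 1e-12
    print("balance equations hold; total mass =", sum(pi))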

5. (12 + Bonus 10) A call center is staffed by two agents, Mary and John. Mary's service times are
iid exponentially distributed with mean 4 minutes, and John's service times are iid exponentially
distributed with mean 6 minutes. Suppose that when you arrive, both Mary and John are busy
and you are the only one waiting in line.
(a) (4) What is the probability that John will serve you?

(b) (4) What is your expected waiting time until you are answered?

(c) (4) What is the probability that you will be answered within 4 minutes?

(d) (Bonus 10) What is your expected time in the system (waiting plus service)?
Note: Bonus question has no partial credit.
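
All four parts rest on the memorylessness of the exponential distribution, so they can be sanity-checked by simulation (a sketch; rates are per minute, and the commented values are the analytical targets under that model):

    import random

    n = 10**6
    john_serves = wait_sum = within4 = total = 0.0
    for _ in range(n):
        mary = random.expovariate(1/4)        # Mary's residual service time
        john = random.expovariate(1/6)        # John's residual service time
        wait = min(mary, john)                # you start service when one frees up
        wait_sum += wait
        within4 += (wait <= 4)
        john_serves += (john < mary)
        server_rate = 1/6 if john < mary else 1/4
        total += wait + random.expovariate(server_rate)

    print(john_serves / n)                    # (a) ~ (1/6)/(1/4 + 1/6) = 2/5
    print(wait_sum / n)                       # (b) ~ 1/(1/4 + 1/6) = 12/5 min
    print(within4 / n)                        # (c) ~ 1 - exp(-(5/12)*4)
    print(total / n)                          # (d) ~ 36/5 min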

6. (14+Bonus 5) True or False.


(1)(True/False) A transient state is accessible from any recurrent state.
(1)(True/False) A recurrent state is accessible from all states in its class, and it is also accessible from recurrent states in other classes.
(1)(True/False) If a DTMC is periodic with period d ≥ 2, then the limiting probabilities exist.
(1)(True/False) At least one, possibly more, recurrent states are accessible from a given
transient state.
(1)(True/False) In a Markov chain with a finite state space, all states can be transient.
(1)(True/False) An irreducible DTMC with a finite state space can have an infinite number of
stationary distributions.
(1)(True/False) A system can be in state 0, 1, or 2. Let Xn be the state of the system at time
n. We know P(Xn+1 = 1 | Xn = 0, Xn−1 = 2) = 0.4 and P(Xn+1 = 1 | Xn = 0, Xn−1 = 1) = 0.5.
Then {Xn, n ≥ 0} cannot be a Markov chain.
(2)(True/False) Assume a random variable X always has a finite value, i.e., P(X < ∞) = 1.
Then E[X] < ∞.
(2)(True/False) A discrete-time Markov chain with state space S = {1, 2, 3, 4} has the
steady-state probabilities (0.2, 0.4, 0.3, 0.1). In steady state, on average it would take 2.5 time
steps to return to state 2, given that the system is currently in state 2.
Suppose that whether or not it rains today depends on previous weather conditions through
the last two days. Specifically, suppose that if it has rained for the past two days, then it
will rain tomorrow with probability 0.7; if it rained today but not yesterday, then it will rain
tomorrow with probability 0.5; if it rained yesterday but not today, then it will rain tomorrow
with probability 0.4; if it has not rained in the past two days, then it will rain tomorrow with
probability 0.2.
(2)(True/False) If we let Xn be the state at time n, which depends only on whether or not it is
raining at time n, then {Xn} is a Markov chain.
(1)(True/False) This problem cannot be modeled as a Markov chain.
(Bonus 5) If you think you can model this problem as a Markov chain, indicate the
state space and transition probabilities. (Note: Bonus question has no partial credit.)
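
For the bonus, the standard remedy is to enlarge the state to the weather on the last two days. A minimal sketch building the resulting 4-state chain from the probabilities stated above:

    import numpy as np

    # State = (today, yesterday), R = rain, D = dry.
    states = [("R", "R"), ("R", "D"), ("D", "R"), ("D", "D")]
    rain = {("R", "R"): 0.7, ("R", "D"): 0.5, ("D", "R"): 0.4, ("D", "D"): 0.2}

    P = np.zeros((4, 4))
    for i, (today, yesterday) in enumerate(states):
        pr = rain[(today, yesterday)]         # P(rain tomorrow | last two days)
        for j, (tomorrow, today_next) in enumerate(states):
            if today_next == today:           # tomorrow's "yesterday" must be today
                P[i, j] = pr if tomorrow == "R" else 1 - pr
    print(P)                                  # each row sums to 1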

7. (Bonus 10) A professor continually gives exams to her students. She can give three possible types
of exams, and her class is graded as either having done well or badly. Let pi denote the probability
that the class does well on a type i exam, and suppose that p1 = 0.3, p2 = 0.6, p3 = 0.9. If the
class does well on an exam, then the next exam is equally likely to be any of the three types. If the
class does badly, then the next exam is always type 1. Identify a state space, model it as a DTMC,
and show how to determine the proportion of exams that are of type i, i = 1, 2, 3.
(Note: Bonus question has no partial credit.)
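
A sketch of one consistent model: take the state to be the type of the current exam, so each row of P mixes a p_i/3 share for "did well" with a point mass on type 1 for "did badly"; the long-run proportions are then the stationary distribution (variable names are illustrative):

    import numpy as np

    p = np.array([0.3, 0.6, 0.9])             # P(class does well | type-i exam)
    P = np.zeros((3, 3))
    for i in range(3):
        P[i, :] = p[i] / 3                    # did well: next type uniform on {1, 2, 3}
        P[i, 0] += 1 - p[i]                   # did badly: next exam is type 1

    # Long-run proportion of type-i exams = stationary distribution of P.
    A = np.vstack([P.T - np.eye(3), np.ones(3)])
    pi, *_ = np.linalg.lstsq(A, np.array([0, 0, 0, 1.0]), rcond=None)
    print(pi)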
