Department of Electrical Engineering & Computer Science 6.041/6.431: Probabilistic Systems Analysis (Spring 2006)
Recitation 1
February 9, 2006

1. Problem 1.2, page 52 of text. Let A and B be two sets.
(a) Show the following two equalities:

A^c = (A^c ∩ B) ∪ (A^c ∩ B^c),    B^c = (A ∩ B^c) ∪ (A^c ∩ B^c)

(b) Show that
(A ∩ B)^c = (A^c ∩ B) ∪ (A^c ∩ B^c) ∪ (A ∩ B^c)
(c) Consider rolling a six-sided die. Let A be the set of outcomes where the roll is an odd number. Let B be the set of outcomes where the roll is less than 4. Calculate the sets on both sides of the equality in part (b), and verify that the equality holds.

2. Problem 1.5, page 53 of text. Out of the students in a class, 60% are geniuses, 70% love chocolate, and 40% fall into both categories. Determine the probability that a randomly selected student is neither a genius nor a chocolate lover.

3. Example 1.5, page 13 of text. Romeo and Juliet have a date at a given time, and each will arrive at the meeting place with a delay between 0 and 1 hour, with all pairs of delays being equally likely. The first to arrive will wait for 15 minutes and will leave if the other has not yet arrived. What is the probability that they will meet?
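As a quick check on problem 3, here is a small Monte Carlo sketch (not part of the original handout): both delays are uniform on [0, 1], and they meet iff the delays differ by at most 1/4.

    import random

    # Estimate P(meet) = P(|delay_R - delay_J| <= 1/4) by simulation.
    trials = 1_000_000
    meet = sum(abs(random.random() - random.random()) <= 0.25 for _ in range(trials))
    print(meet / trials)  # should be close to the exact answer 1 - (3/4)**2 = 7/16

The empirical frequency should approach the exact answer 7/16 = 0.4375.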
We may also find the solution through a simpler method:

P(Fischer wins | someone wins) = p / (p + q).
[Tree diagram: each game is won by Fischer with probability p, won by Spassky with probability q, or drawn with probability 1 - p - q, in which case another game is played.]
(b) P(the match lasted no more than 5 games)
= (p + q) + (p + q)(1 - p - q) + (p + q)(1 - p - q)^2 + (p + q)(1 - p - q)^3 + (p + q)(1 - p - q)^4
= (p + q)[1 - (1 - p - q)^5] / [1 - (1 - p - q)]
= 1 - (1 - p - q)^5.

P(Fischer wins in the first game ∩ the match lasted no more than 5 games) = p.

Therefore,

P(Fischer wins in the first game | the match lasted no more than 5 games)
= P(Fischer wins in the first game ∩ the match lasted no more than 5 games) / P(the match lasted no more than 5 games)
= p / [1 - (1 - p - q)^5].
Therefore,

P(Fischer wins | the match lasted no more than 5 games)
= P(Fischer wins ∩ the match lasted no more than 5 games) / P(the match lasted no more than 5 games)
= [p (1 - (1 - p - q)^5) / (p + q)] / [1 - (1 - p - q)^5]
= p / (p + q),

where the numerator is P(Fischer wins ∩ the match lasted no more than 5 games) = Σ_{n=1}^{5} (1 - p - q)^{n-1} p = p [1 - (1 - p - q)^5] / (p + q).

(d) P(Fischer wins at or before the 5th game | Fischer wins)
= P(Fischer wins at or before the 5th game ∩ Fischer wins) / P(Fischer wins)
= [p (1 - (1 - p - q)^5) / (p + q)] / [p / (p + q)]
= 1 - (1 - p - q)^5.

This part may also be solved by observing that the events {Fischer wins} and {the match lasted no more than 5 games} are independent (we know this from parts (a) and (c)):

P(the match lasted no more than 5 games | Fischer wins) = P(the match lasted no more than 5 games) = 1 - (1 - p - q)^5.
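The independence observed in parts (a) and (c) is easy to sanity-check by simulation. The sketch below uses arbitrarily chosen values p = 0.3, q = 0.2 (any values with p + q < 1 would do):

    import random

    p, q = 0.3, 0.2
    trials = 200_000
    fischer = short = fischer_and_short = 0
    for _ in range(trials):
        games = 0
        while True:  # play games until one is decisive
            games += 1
            u = random.random()
            if u < p:
                fischer_won = True
                break
            if u < p + q:
                fischer_won = False
                break
        fischer += fischer_won
        short += games <= 5
        fischer_and_short += fischer_won and (games <= 5)

    print(fischer / trials)           # ~ p/(p+q) = 0.6
    print(short / trials)             # ~ 1 - (1-p-q)**5 = 0.969
    print(fischer_and_short / short)  # ~ p/(p+q) again: the two events are independent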
Σ_i P(A_i | C) P(B | A_i).

Σ_i P(A_i | C) P(B | A_i) = Σ_i [P(A_i ∩ C) / P(C)] · [P(B ∩ A_i) / P(A_i)],

where the last line is ONLY TRUE if the events A_i ∩ C and B ∩ A_i are independent of each other.
Note also that for the expression to be true, i = 1 and A_1 has to be the entire sample space, i.e. P(A_1) = 1. Therefore, the given expression only holds if A_i ∩ C and B ∩ A_i are independent and i = 1.
There is a total of k dots put in n groups. Think of there being a separator mark between groups, so that there are n - 1 separator marks:

    • • | • | • • • | … | • •
     N_1   N_2   N_3    …   N_n

This gives a grand total of k + n - 1 dots and marks. The number of solutions is the number of ways to place k dots in k + n - 1 locations: C(k + n - 1, k).

(c) If we know that X_1 = ℓ, then applying the result of the previous part to the remaining k - ℓ balls and the remaining n - 1 draws from the urn gives C((k - ℓ) + (n - 1) - 1, k - ℓ) as the desired number. Since this is just a way of breaking down the problem of the previous part, we have

Σ_{ℓ=0}^{k} C(k - ℓ + n - 2, k - ℓ) = C(k + n - 1, k).
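The identity can be confirmed by brute force for small n and k; the sketch below counts the nonnegative integer solutions of x_1 + … + x_n = k directly and compares with the binomial coefficients:

    from itertools import product
    from math import comb

    n, k = 4, 6
    # direct enumeration of all nonnegative solutions of x1 + ... + xn = k
    count = sum(1 for x in product(range(k + 1), repeat=n) if sum(x) == k)
    print(count, comb(k + n - 1, k))                              # both 84
    print(sum(comb(k - l + n - 2, k - l) for l in range(k + 1)))  # also 84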
3. (a) Students might say they are equal (both being the average number of students per bus), or they may have the correct intuition. (b) Make sure to define the PMFs of X and Y. Then

E[X] = (40/148)(40) + (33/148)(33) + (25/148)(25) + (50/148)(50) ≈ 39.3,
E[Y] = (1/4)(40) + (1/4)(33) + (1/4)(25) + (1/4)(50) = 37.
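A two-line check of the arithmetic (a sketch, not from the original solutions):

    sizes = [40, 33, 25, 50]
    total = sum(sizes)
    EX = sum(n * (n / total) for n in sizes)  # bus of a randomly chosen student
    EY = sum(n * (1 / 4) for n in sizes)      # bus of a randomly chosen driver
    print(EX, EY)                             # ~ 39.28 and 37.0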
E[X] = Σ_{i=1}^{n} P(A_i) E[X | A_i]

whenever A_1, A_2, …, A_n is a partition of the sample space.

3. Suppose a discrete random variable X can have only non-negative integer values. Show that

E[X] = Σ_{k=0}^{∞} P(X > k).
E[X] = Σ_y p_Y(y) E[X | Y = y],

where we had better only include in the summation those y such that P(Y = y) > 0.

3. The result follows by rewriting the expectation summation in the following manner:

E[X] = Σ_{k=0}^{∞} k p_X(k)
= Σ_{k=1}^{∞} k p_X(k)
= Σ_{k=1}^{∞} Σ_{ℓ=1}^{k} p_X(k)
= Σ_{ℓ=1}^{∞} Σ_{k=ℓ}^{∞} p_X(k)
= Σ_{ℓ=1}^{∞} P(X ≥ ℓ)
= Σ_{k=0}^{∞} P(X > k).

The manipulations could look unmotivated, but if you sketch the (k, ℓ) plane, the interchange of summations is clear.
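A quick numerical check of the tail-sum formula, using a binomial PMF as an arbitrary example of a non-negative integer random variable (a sketch):

    from math import comb

    n, p = 10, 0.3
    pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
    mean = sum(k * pmf[k] for k in range(n + 1))
    # sum over k of P(X > k); terms with k >= n are zero
    tail_sum = sum(sum(pmf[j] for j in range(k + 1, n + 1)) for k in range(n + 1))
    print(mean, tail_sum)  # both ~ n*p = 3.0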
f_X(x) = λ e^{-λx} for x ≥ 0, and f_X(x) = 0 otherwise.

(a) Calculate E[X] and var(X), and find P(X ≥ E[X]). Hint: P(X ≥ k) = ∫_k^∞ f_X(x) dx.
(b) Find P(X > t + k | X > t).

2. You are allowed to take a certain test three times, and your final score will be the maximum of the test scores. Your score in test i, where i = 1, 2, 3, takes one of the values from i to 10 with equal probability 1/(11 - i), independently of the scores in the other tests. What is the PMF of the final score?

3. Wanting to browse the net, Oscar uses his high-speed 300-baud modem to connect through his Internet Service Provider. The modem transmits bits in such a fashion that -1 is sent if a given bit is zero and +1 is sent if a given bit is one. The telephone line has additive zero-mean Gaussian (normal) noise with variance σ² (so the receiver on the other end gets a signal which is the sum of the transmitted signal and the channel noise). The value of the noise is assumed to be independent of the encoded signal value.
[Diagram: a bit (0 or 1) is encoded as -1 or +1, passes through the channel, where noise ~ N(0, σ²) is added, and the receiver/decoder makes the decision (0 or 1?).]
We assume that the probability of the modem sending -1 is p and the probability of sending +1 is 1 - p. (a) Suppose we conclude that an encoded signal of -1 was sent when the value received on the other end of the line is less than a (where -1 < a < +1), and conclude that +1 was sent when the value is more than a. What is the probability of making an error? (b) Answer part (a) assuming that p = 2/5, a = 1/2 and σ² = 1/4.
1. (a) E[X] = 1/λ, var(X) = 1/λ², and P(X ≥ E[X]) = ∫_{1/λ}^{∞} λ e^{-λx} dx = e^{-1} = 1/e.
(b) P(X > t + k | X > t) = e^{-λk}. Note: the exponential random variable is memoryless.

2. We first compute the CDF F_X(x) and then obtain the PMF as follows:

p_X(k) = F_X(k) - F_X(k - 1) if k = 3, …, 10, and 0 otherwise.

We have

F_X(k) = 0 for k < 3;  (k/10)((k - 1)/9)((k - 2)/8) for 3 ≤ k ≤ 10;  1 for 10 ≤ k.
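The PMF obtained from this CDF can be checked against a direct enumeration of the three independent test scores; a sketch:

    from fractions import Fraction
    from itertools import product

    def F(k):  # CDF of the maximum score, from the formula above
        return Fraction(k, 10) * Fraction(k - 1, 9) * Fraction(k - 2, 8) if k >= 3 else Fraction(0)

    pmf_formula = {k: F(k) - F(k - 1) for k in range(3, 11)}

    pmf_direct = {k: Fraction(0) for k in range(3, 11)}
    for s1, s2, s3 in product(range(1, 11), range(2, 11), range(3, 11)):
        pmf_direct[max(s1, s2, s3)] += Fraction(1, 10 * 9 * 8)

    print(pmf_formula == pmf_direct)  # True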
3. (a) P(error) = P(R_1 | S_0) P(S_0) + P(R_0 | S_1) P(S_1)
= P(Z - 1 > a) p + P(Z + 1 < a)(1 - p)
= p [1 - Φ((a + 1)/σ)] + (1 - p) Φ((a - 1)/σ)
= p [1 - Φ((a + 1)/σ)] + (1 - p) [1 - Φ((1 - a)/σ)],

where Z ~ N(0, σ²) and Φ is the standard normal CDF.

(b) P(error) = 0.4 [1 - Φ((3/2)/(1/2))] + 0.6 [1 - Φ((1/2)/(1/2))] = 0.4 [1 - Φ(3)] + 0.6 [1 - Φ(1)] ≈ 0.096.
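Numerically (a sketch building Φ from the error function in Python's math module):

    from math import erf, sqrt

    def Phi(x):  # standard normal CDF
        return 0.5 * (1 + erf(x / sqrt(2)))

    p, a, sigma = 0.4, 0.5, 0.5
    p_error = p * (1 - Phi((a + 1) / sigma)) + (1 - p) * (1 - Phi((1 - a) / sigma))
    print(p_error)  # ~ 0.4*(1 - Phi(3)) + 0.6*(1 - Phi(1)) ~ 0.096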
[Figure: the joint PDF f_{X,Y}(x, y), uniform over a region sketched in the original handout, with axes marked at 1.0 and 2.0.]
(a) Prepare neat, fully labeled sketches of f_X(x), f_Y(y), f_{Y|X}(y|x), and f_{X|Y}(x|y). (b) Are X and Y independent? (c) Find f_{X,Y|A}(x, y), where the event A corresponds to points (x, y) within the unit circle centered at the origin. (d) Find E[X|Y = y] and var(X|Y = y).

2. Alexei is vacationing in Monte Carlo. The amount X (in dollars) he takes to the casino each evening is a random variable with a PDF of the form
f_X(x) = ax if 0 ≤ x ≤ 40, and f_X(x) = 0 otherwise.
At the end of each night, the amount Y that he has when leaving the casino is uniformly distributed between zero and twice the amount that he came with. (a) Determine the joint PDF f_{X,Y}(x, y). (b) What is the probability that on a given night Alexei makes a positive profit at the casino? (c) Find the PDF of Alexei's profit Y - X on a particular night, and also determine its expected value.
[Figure 1: Marginal PDFs f_X(x) and f_Y(y), obtained by integrating the joint PDF along the y and x axes respectively.]

The conditional PDFs are as shown in the figure below.

(b) X and Y are NOT independent, since f_{X,Y}(x, y) ≠ f_X(x) f_Y(y). Also, from the figures we have f_{X|Y}(x|y) ≠ f_X(x).

(c) f_{X,Y|A}(x, y) = f_{X,Y}(x, y) / P(A) for (x, y) ∈ A, and 0 otherwise.

(d) Since f_{X,Y}(x, y) is uniform, each conditional PDF f_{X|Y}(x|y) is uniform over an interval, so E[X|Y = y] is the midpoint of that interval, and
[Figure: sketches of the conditional PDFs f_{X|Y}(x|y) for representative values of y (for example -1 < y ≤ 1) and f_{Y|X}(y|x) for representative values of x; each is uniform over an interval, with heights such as 1/2, 1/3, and 1/4.]
var(X|Y = y) = (length of that interval)² / 12, the variance of a uniform distribution.
2. (a) We have a = 1/800, so that

f_{X,Y}(x, y) = 1/1600 if 0 ≤ x ≤ 40 and 0 ≤ y ≤ 2x, and 0 otherwise.

(b) P(Y > X) = 1/2.

(c) Let Z = Y - X. We have

f_Z(z) = (1/1600) z + 1/40 if -40 ≤ z ≤ 0;  1/40 - (1/1600) z if 0 ≤ z ≤ 40;  0 otherwise,

and E[Z] = 0.
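These answers are easy to confirm by simulation; the sketch below samples X by inverse transform (F_X(x) = x²/1600, so X = 40√U) and Y uniformly on [0, 2X]:

    import random

    trials = 500_000
    profit_positive = 0
    total_profit = 0.0
    for _ in range(trials):
        x = 40 * random.random() ** 0.5   # inverse-transform sample of f_X(x) = x/800
        y = random.uniform(0, 2 * x)      # Y | X uniform on [0, 2X]
        z = y - x
        profit_positive += z > 0
        total_profit += z

    print(profit_positive / trials)  # ~ 1/2
    print(total_profit / trials)     # ~ 0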
[The PDFs f_X(x) and f_Y(y) of the two race times were given as sketches in the original handout.]
(a) Determine P(A), the probability that Al wins the race. (b) Determine the probability that Al wins a total of exactly 7 of the next 10 races. Assume all races are independent. You may use P(A) symbolically in your answer. (As long as your answer is explicit, compact, and fully explained, it need not be simplified.) (c) Determine, carefully sketch, and label the PDF for W, the elapsed time for the winner of the race. Fully explain each step of your work.

2. Random variables X and Y are independent and have PDFs as shown below.
[Figure: the PDFs f_X(x) and f_Y(y), each supported on [0, 1] (horizontal ticks at 0.2, 0.4, 0.6, 0.8, 1.0; vertical ticks 1 through 5).]
Let W = X + Y, and find f_W(w) using a graphical argument.

3. Alice and Bob flip biased coins independently. Alice's coin comes up heads with probability 1/4, while Bob's coin comes up heads with probability 3/4. Each stops as soon as they get a head; that is, Alice stops when she gets a head, and Bob stops when he gets a head. What is the PMF of the total number of flips until both stop? (That is, what is the PMF of the combined total number of flips for both Alice and Bob until they stop?)
1. (b) The number of races Al wins out of 10 is binomial with parameters n = 10 and P(A), so the desired probability is C(10, 7) P(A)^7 (1 - P(A))^3, which with P(A) = 7/8 equals C(10, 7) (7/8)^7 (1/8)^3.
fW (w) =
fX (x)fY (w x)dx
for w = x + y and x, y independent. This operation is called the convolution of fX (x) and fY (y).
Carrying out the graphical convolution yields a piecewise-linear (trapezoidal) PDF:

f_W(w) = 5(w - 1.0) for 1.0 ≤ w ≤ 1.1;  0.5 for 1.1 ≤ w ≤ 1.9;  5(2.0 - w) for 1.9 ≤ w ≤ 2.0;  0 otherwise.
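The graphical convolution can also be mimicked numerically by sampling the PDFs on a fine grid; the sketch below uses stand-in example PDFs (uniform and triangular on [0, 1]), not the ones from the handout's figure:

    import numpy as np

    dx = 0.001
    x = np.arange(0, 1, dx)
    f_x = np.ones_like(x)             # uniform PDF on [0, 1]
    f_y = 2 * x                       # triangular PDF on [0, 1]
    f_w = np.convolve(f_x, f_y) * dx  # sampled PDF of W = X + Y on [0, 2]
    print(f_w.sum() * dx)             # ~ 1.0: the convolution is again a PDF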
3. Let X and Y be the number of flips until Alice and Bob stop, respectively. Thus, X + Y is the total number of flips until both stop. The random variables X and Y are independent geometric random variables with parameters 1/4 and 3/4, respectively. By convolution, we have
p_{X+Y}(j) = Σ_{k=1}^{j-1} p_X(k) p_Y(j - k)
= Σ_{k=1}^{j-1} (3/4)^{k-1} (1/4) · (1/4)^{j-k-1} (3/4)
= (1/4^j) Σ_{k=1}^{j-1} 3^k
= (1/4^j) · 3 (3^{j-1} - 1) / (3 - 1)
= (3/2) (3^{j-1} - 1) / 4^j,   for j = 2, 3, …
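A short numerical check that the closed form agrees with the defining convolution sum (a sketch):

    def p_total(j):  # closed form: (3/2) * (3**(j-1) - 1) / 4**j, j >= 2
        return 1.5 * (3 ** (j - 1) - 1) / 4 ** j

    def p_direct(j):  # direct convolution of the two geometric PMFs
        return sum(((3 / 4) ** (k - 1) * (1 / 4)) * ((1 / 4) ** (j - k - 1) * (3 / 4))
                   for k in range(1, j))

    for j in range(2, 8):
        print(j, p_total(j), p_direct(j))  # the two columns agree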
(a) Explain why one of the two could not possibly be its transform, and indicate which one is the true transform. (b) Find P (X = 0).
a) M_X(r) = E[e^{rX}] = ∫_a^b e^{rx} / (b - a) dx = (e^{rb} - e^{ra}) / (r(b - a)).

b) To find the mean and the variance we use the moment-generating properties of the transform, namely:

E[X^n] = (d^n/dr^n) E[e^{rX}] |_{r=0}.

Thus we have:

E[X] = (d/dr) E[e^{rX}] |_{r=0}
= [ (b e^{rb} - a e^{ra}) / ((b - a) r) - (e^{rb} - e^{ra}) / ((b - a) r²) ] |_{r=0}
(L'Hôpital) = (b² - a²) / (2(b - a)) = (b + a)/2.

To find the variance we need to find E[X²], and thus we need to take the second derivative of the transform and evaluate it at r = 0:

E[X²] = (d²/dr²) E[e^{rX}] |_{r=0}
= [ (b² e^{rb} - a² e^{ra}) / ((b - a) r) - 2 (b e^{rb} - a e^{ra}) / ((b - a) r²) + 2 (e^{rb} - e^{ra}) / ((b - a) r³) ] |_{r=0}
(L'Hôpital) = (b³ - a³) / (3(b - a)) = (b² + ab + a²)/3.

Therefore,

var(X) = E[X²] - (E[X])² = (b² + ab + a²)/3 - ((b + a)/2)² = (b - a)²/12.
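The same derivatives and L'Hôpital limits can be reproduced symbolically; a sketch using sympy:

    import sympy as sp

    r = sp.Symbol('r')
    a, b = sp.symbols('a b', positive=True)
    M = (sp.exp(r * b) - sp.exp(r * a)) / (r * (b - a))  # transform of uniform [a, b]

    EX = sp.limit(sp.diff(M, r), r, 0)        # the limit plays the role of L'Hopital
    EX2 = sp.limit(sp.diff(M, r, 2), r, 0)
    print(sp.simplify(EX))                    # (a + b)/2
    print(sp.factor(sp.simplify(EX2 - EX**2)))  # (a - b)**2/12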
2. The transform for nonnegative integer-valued random variables is defined as:

p_X^T(z) = Σ_i z^{x_i} P(X = x_i) = E[z^X].
b) E[X] = (d/dz) E[z^X] |_{z=1} = 1/2 + 1/2 + 3/4 = 7/4.
c) Direct computation thankfully produces the same results.

3. a) Note that by the definition of the transform,

M_X(s) = Σ_x e^{sx} p_X(x),

and therefore, when evaluated at s = 0, the transform should equal 1. We see that only the second option satisfies this requirement.

b) It is observed that the transform is that of a Poisson random variable with parameter λ = 2. Hence the PMF is given as follows:

p_X(k) = e^{-λ} λ^k / k!,   so   p_X(0) = e^{-2}.
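Both facts are easy to confirm symbolically, assuming the standard Poisson transform M_X(s) = e^{λ(e^s - 1)} with λ = 2; a sketch:

    import sympy as sp

    s = sp.Symbol('s')
    lam = 2
    M = sp.exp(lam * (sp.exp(s) - 1))  # Poisson(lambda) transform, lambda = 2
    print(M.subs(s, 0))                # 1, as any transform must give at s = 0
    print(sp.exp(-2), sp.N(sp.exp(-2)))  # p_X(0) = e^{-2} ~ 0.135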
k/6 + k²/12.

E[D] = N - N e^{-N}.
The number of widgets in any carton, K, is a random variable with PMF

p_K(k) = λ^k e^{-λ} / k!,   k = 0, 1, 2, ….

The number of cartons in a crate, N, is a random variable with PMF

p_N(n) = p^{n-1} (1 - p),   n = 1, 2, 3, ….

Random variables X, K, and N are mutually independent. Determine (a) The probability that a randomly selected crate contains exactly one widget. (b) The expected value and variance of the number of widgets in a crate. (c) The transform of the PDF for the total weight of the widgets in a crate. (d) The expected value and variance of the total weight of the widgets in a crate.

2. Using a fair three-sided die (construct one, if you dare), we will decide how many times to spin a fair wheel of fortune. The wheel of fortune is calibrated infinitely finely and has numbers between 0 and 1. The die has the numbers 1, 2, and 3 on its faces. Whichever number results from our throw of the die, we will spin the wheel of fortune that many times and add the results to obtain the random variable Y. (a) Determine the expected value of Y. (b) Determine the variance of Y. (A simulation check follows below.)
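For problem 2, the law of iterated expectation predicts E[Y] = E[N] E[X] = 2 · 1/2 = 1, and the law of total variance predicts var(Y) = E[N] var(X) + var(N) (E[X])² = 2 · 1/12 + (2/3)(1/4) = 1/3. A simulation sketch (not part of the original handout) agrees:

    import random

    trials = 500_000
    samples = []
    for _ in range(trials):
        n = random.randint(1, 3)                          # the three-sided die
        samples.append(sum(random.random() for _ in range(n)))  # n wheel spins

    mean = sum(samples) / trials
    var = sum((y - mean) ** 2 for y in samples) / trials
    print(mean, var)  # ~ 1.0 and ~ 0.3333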
1. (a) Since the noise W is uniformly distributed between -1 and 1 and independent of X,

f_{Y|X}(y|x) = 1/2 if x - 1 ≤ y ≤ x + 1, and 0 otherwise,

so that

f_{X,Y}(x, y) = f_X(x) f_{Y|X}(y|x) = 1/10 if x - 1 ≤ y ≤ x + 1 and 5 ≤ x ≤ 10, and 0 otherwise.

[Figure: the support of f_{X,Y}(x, y), a band of vertical width 2 around the line y = x, for 5 ≤ x ≤ 10.]
We now compute E[X|Y] by first determining f_{X|Y}(x|y). This can be done by looking at a horizontal line crossing the compound PDF. Since f_{X,Y}(x, y) is uniformly distributed in the defined region, f_{X|Y}(x|y) is uniformly distributed as well. Therefore,

g(y) = E[X|Y = y] = (5 + (y + 1))/2 if 4 ≤ y < 6;  y if 6 ≤ y ≤ 9;  (10 + (y - 1))/2 if 9 < y ≤ 11.

The plot of g(y) is shown here.
[Figure: plot of g(y₀) versus y₀ for 4 ≤ y₀ ≤ 11: g rises from 5 at y₀ = 4, equals y₀ for 6 ≤ y₀ ≤ 9, and reaches 10 at y₀ = 11.]
(b) The linear least squares estimator has the form

g_L(Y) = E[X] + (cov(X, Y)/σ_Y²)(Y - E[Y]),

where cov(X, Y) = E[(X - E[X])(Y - E[Y])]. We compute E[X] = 7.5, E[Y] = E[X] + E[W] = 7.5, σ_X² = (10 - 5)²/12 = 25/12, σ_W² = (1 - (-1))²/12 = 4/12 and, using the fact that X and W are independent, σ_Y² = σ_X² + σ_W² = 29/12. Furthermore,

cov(X, Y) = E[(X - E[X])(Y - E[Y])]
= E[(X - E[X])(X - E[X] + W - E[W])]
= E[(X - E[X])(X - E[X])] + E[(X - E[X])(W - E[W])]
= σ_X² + E[X - E[X]] E[W - E[W]]
= σ_X² = 25/12.

Note that we use the fact that (X - E[X]) and (W - E[W]) are independent and E[X - E[X]] = 0 = E[W - E[W]]. Therefore,

g_L(Y) = 7.5 + (25/29)(Y - 7.5).
The linear estimator g_L(Y) is compared with g(Y) in the following figure. Note that g(Y) is piecewise linear in this problem.
[Figure: g(y₀) and g_L(y₀) plotted together for 4 ≤ y₀ ≤ 11; the linear predictor g_L(y₀) closely tracks the piecewise-linear g(y₀).]
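The comparison can also be made by simulation; the sketch below estimates the mean squared error of both estimators (g(Y) = E[X|Y] should do at least as well as g_L(Y)):

    import random

    def g(y):  # conditional expectation from part (a)
        if y < 6:
            return (5 + y + 1) / 2
        if y <= 9:
            return y
        return (10 + y - 1) / 2

    def gL(y):  # linear least squares estimator from part (b)
        return 7.5 + (25 / 29) * (y - 7.5)

    mse_g = mse_gL = 0.0
    trials = 200_000
    for _ in range(trials):
        x = random.uniform(5, 10)
        y = x + random.uniform(-1, 1)
        mse_g += (x - g(y)) ** 2
        mse_gL += (x - gL(y)) ** 2
    print(mse_g / trials, mse_gL / trials)  # E[X|Y] achieves the smaller error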
P(X₁² + X₂² ≤ 1) = ∫∫_{x₁²+x₂² ≤ 1} (1/(2π)) e^{-(x₁²+x₂²)/2} dx₁ dx₂
= ∫₀^{2π} ∫₀^1 (1/(2π)) e^{-r²/2} r dr dθ
= 1 - e^{-1/2}.
ρ(X₁, X₂) = Cov(X₁, X₂) / (σ_{X₁} σ_{X₂}).

With A = W + X and B = X + Y, where W, X, Y, Z are independent with zero mean and common variance σ²,

Var(A) = Var(B) = 2σ²,   Cov(A, B) = E[AB] = E[X²] = σ²,

and therefore

ρ(A, B) = σ² / (√(2σ²) √(2σ²)) = 1/2.

We proceed as above to find the correlation of A and C, where C = Y + Z:

Cov(A, C) = E[AC] - E[A]E[C] = E[WY + WZ + XY + XZ] = 0,

and therefore

ρ(A, C) = 0.
2. Solution is in the text, pp. 264–265. 3. Solution is in the text, pp. 267–268.
For p = 0.05 and a particular value of d, Joe uses the Chebyshev inequality to conclude that n must be at least 50,000. Determine the new minimum value for n if: (a) the value of d is reduced to half of its original value. (b) the probability p is reduced to half of its original value, or p = 0.025. 3. Let X1 , X2 , . . . be a sequence of independent random variables that are uniformly distributed between 0 and 1. For every n, we let Yn be the median of the values of X1 , X2 , . . . , X2n+1 . [That is, we order X1 , . . . , X2n+1 in increasing order and let Yn be the (n + 1)st element in this ordered sequence.] Show that the sequence Yn converges to 1/2, in probability.
Zᵢ = 1 if Xᵢ ≥ 0.5 + ε, and 0 otherwise.

{Z₁, Z₂, …} are i.i.d. random variables with E[Zᵢ] = P(Zᵢ = 1) = P(Xᵢ ≥ 0.5 + ε) = 0.5 - ε. For the event {Yₙ ≥ 0.5 + ε} to occur, we must have at least n + 1 of the {Zᵢ} take the value 1, so

P(Yₙ ≥ 0.5 + ε) = P( Σ_{i=1}^{2n+1} Zᵢ ≥ n + 1 )
= P( (Σ_{i=1}^{2n+1} Zᵢ)/(2n + 1) ≥ (n + 1)/(2n + 1) )
≤ P( (Σ_{i=1}^{2n+1} Zᵢ)/(2n + 1) ≥ 0.5 ).

Note that P(Zᵢ = 1) = 0.5 - ε. By the weak law of large numbers, the sequence (Z₁ + ⋯ + Z_{2n+1})/(2n + 1) converges to 0.5 - ε. To show that P((Z₁ + ⋯ + Z_{2n+1})/(2n + 1) ≥ 0.5) converges to zero, we need to show that for any given δ > 0, there exists N such that for all n > N, P((Z₁ + ⋯ + Z_{2n+1})/(2n + 1) ≥ 0.5) < δ. The fact that the sequence (Z₁ + ⋯ + Z_{2n+1})/(2n + 1) converges to 0.5 - ε ensures the existence of such an N. Since P(Yₙ ≥ 0.5 + ε) is bounded above by P((Z₁ + ⋯ + Z_{2n+1})/(2n + 1) ≥ 0.5), it also converges to zero. A symmetric argument shows that P(Yₙ ≤ 0.5 - ε) converges to zero as well, so Yₙ converges to 1/2 in probability.
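A simulation sketch of the convergence (the fraction of runs with |Yₙ - 1/2| ≥ 0.05 should shrink as n grows):

    import random

    for n in [5, 50, 500]:
        trials = 2000
        eps = 0.05
        hits = 0
        for _ in range(trials):
            xs = sorted(random.random() for _ in range(2 * n + 1))
            hits += abs(xs[n] - 0.5) >= eps   # xs[n] is the median
        print(n, hits / trials)  # P(|Y_n - 1/2| >= 0.05) shrinks toward 0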
(a) Find the expected time until the first failure. (b) Find the probability that there are no bulb failures before time t. (c) Given that there are no failures until time t, determine the conditional probability that the first bulb used is a type-A bulb. (d) Find the variance of the time until the first bulb failure. (e) Find the probability that the 12th bulb failure is also the 4th type-A bulb failure. (f) Up to and including the 12th bulb failure, what is the probability that a total of exactly 4 type-A bulbs have failed? (g) Determine either the PDF or the transform associated with the time until the 12th bulb failure. (h) Determine the probability that the total period of illumination provided by the first two type-B bulbs is longer than that provided by the first type-A bulb.

3. (Problem 5.16) Consider a Poisson process. Given that a single arrival occurred in a given interval [0, t], show that the PDF of the arrival time is uniform over [0, t].
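For problem 3, the claim is easy to check by simulation: conditioned on exactly one arrival in [0, t], the arrival time behaves like a uniform [0, t] sample. A sketch with arbitrarily chosen λ = 1.5 and t = 2:

    import random

    lam, t = 1.5, 2.0
    times = []
    while len(times) < 20_000:
        # generate the arrivals in [0, t] for one realization of the process
        arrivals, s = [], random.expovariate(lam)
        while s < t:
            arrivals.append(s)
            s += random.expovariate(lam)
        if len(arrivals) == 1:          # condition on exactly one arrival
            times.append(arrivals[0])

    print(sum(times) / len(times))      # ~ t/2 = 1.0, as for a uniform [0, t] sample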
r_ij(n) = Σ_k r_ik(n - 1) p_kj,

starting with r_ij(1) = p_ij, where

[p_ij] =
  [ 1    0    0
    0.4  0.6  0
    0.3  0.4  0.3 ]

Plugging into the above formula gives:

[r_ij(2)] =
  [ 1     0     0
    0.64  0.36  0
    0.55  0.36  0.09 ]

Similarly,

[r_ij(3)] =
  [ 1      0      0
    0.784  0.216  0
    0.721  0.252  0.027 ]

[r_ij(5)] =
  [ 1      0      0
    0.922  0.078  0
    0.897  0.100  0.003 ]

[r_ij(10)] =
  [ 1      0      0
    0.994  0.006  0
    0.992  0.008  0.000 ]

Eventually the spider will catch the fly, thus:

lim_{n→∞} [r_ij(n)] =
  [ 1  0  0
    1  0  0
    1  0  0 ]

(These powers are reproduced numerically below.)

3. Problem 6.4 in text, page 354.
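Since [r_ij(n)] is just the n-th power of [p_ij], the tables above are easy to reproduce; a numpy sketch:

    import numpy as np

    P = np.array([[1.0, 0.0, 0.0],
                  [0.4, 0.6, 0.0],
                  [0.3, 0.4, 0.3]])

    for n in [2, 3, 5, 10]:
        print(n)
        print(np.linalg.matrix_power(P, n).round(3))  # matches [r_ij(n)] above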
1. (Problem 6.9) A professor gives tests that are hard, medium, or easy. If she gives a hard test, her next test will be either medium or easy, with equal probability. However, if she gives a medium or easy test, there is a 0.5 probability that her next test will be of the same difficulty, and a 0.25 probability of each of the other two levels of difficulty. Construct an appropriate Markov chain and find the steady-state probabilities. (A numerical check follows at the end of this list.)

2. (Problem 6.10) Alvin likes to sail each Saturday to his cottage on a nearby island off the coast. Alvin is an avid fisherman, and enjoys fishing off his boat on the way to and from the island, as long as the weather is good. Unfortunately, the weather is good on the way to or from the island with probability p, independently of what the weather was on any past trip (so the weather could be nice on the way to the island, but poor on the way back). Now, if the weather is nice, Alvin will take one of his n fishing rods for the trip, but if the weather is bad, he will not bring a fishing rod with him. We want to find the probability that on a given leg of the trip to or from the island the weather will be nice, but Alvin will not fish because all his fishing rods are at his other home. (a) Formulate an appropriate Markov chain model with n + 1 states and find the steady-state probabilities. (b) What is the steady-state probability that on a given trip, Alvin sails with nice weather but without a fishing rod?

3. (Problem 6.13) Ehrenfest model of diffusion. We have a total of n balls, some of them black, some white. At each time step, we either do nothing, which happens with probability ε, where 0 < ε < 1, or we select a ball at random, so that each ball has probability (1 - ε)/n > 0 of being selected. In the latter case, we change the color of the selected ball (if white it becomes black, and vice versa), and the process is repeated indefinitely. What is the steady-state distribution of the number of white balls?
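For problem 1, the steady-state probabilities solve πP = π with Σπᵢ = 1. A numpy sketch, assuming states ordered (hard, medium, easy), which should yield π = (0.2, 0.4, 0.4):

    import numpy as np

    P = np.array([[0.0, 0.5, 0.5],
                  [0.25, 0.5, 0.25],
                  [0.25, 0.25, 0.5]])

    # stack the balance equations (P^T - I) pi = 0 with the normalization sum(pi) = 1
    A = np.vstack([P.T - np.eye(3), np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(pi)  # [0.2, 0.4, 0.4]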
1. Josephina is currently a 6-1 student. On each day that she is a 6-1 student, she has a probability of 1/2 of being a course 6-1 student the next day. Otherwise, she has an equally likely chance of becoming a 6-2 student, a 6-3 student, a course 9 student, or a course 15 student the next day. On any day she is a 6-3 student, she has a probability of 1/4 of switching to course 9, a probability of 3/8 of switching to 6-1, and a probability of 3/8 of switching to 6-2 the next day. On any day she is a 6-2 student, she has a probability of 1/2 of switching to course 15, a probability of 3/8 of switching to 6-1, and a probability of 1/8 of switching to 6-3 the next day. In answering the questions below, assume Josephina will be a student forever. Also assume, for parts (a)–(f), that if Josephina switches to course 9 or course 15, she will stay there and will not change her course again. (a) What is the probability that she eventually will leave course 6? (b) What is the probability that she will eventually be in course 15? (c) What is the expected number of days until she leaves course 6? (d) Every time she switches into 6-1 from 6-2 or 6-3, she buys herself an ice cream cone at Tosci's. She can only afford so much ice cream, so after she's eaten 2 ice cream cones, she stops buying herself ice cream. What is the expected number of ice cream cones she buys herself before she leaves course 6? (e) Her friend Oscar started out just like Josephina. He is now in course 15. You don't know how long it took him to switch. What is the expected number of days it took him to switch to course 15? [Hint: He had no particular aversion to course 9.] (f) Josephina decides that course 15 is not in her future. Accordingly, when she is a course 6-1 student, she stays 6-1 for another day with probability 1/2, and otherwise she has an equally likely chance of becoming any of the other options. When she is 6-2, her probabilities of entering 6-1 or 6-3 are in the same proportion as before. What is the expected number of days until she is in course 9? (g) Suppose that if she is in course 9 or course 15, she has probability 1/8 of returning to 6-1, and otherwise she remains in her current course. What is the expected number of days until she is 6-1 again? (Notice that we know today she is 6-1, so if tomorrow she is still 6-1, then the number of days until she is 6-1 again is 1.)
[Markov chain diagram: transient states 6-1, 6-2, 6-3 and absorbing states 9 and 15. From 6-1: probability 1/2 to itself and 1/8 to each of 6-2, 6-3, 9, 15. From 6-2: 1/2 to 15, 3/8 to 6-1, 1/8 to 6-3. From 6-3: 1/4 to 9, 3/8 to 6-1, 3/8 to 6-2.]
(a) By inspection, the states 6-1, 6-2, and 6-3 are all transient, since they each have paths leading to either state 9 or state 15, from which there is no return. Therefore she eventually leaves course 6 with probability 1.

(b) This is simply the absorption probability for the recurrent class consisting of the state course-15. Let us denote the probability of being absorbed by state 15, conditioned on being in state i, as a_i. Then

a_15 = 1
a_9 = 0
a_{6-1} = (1/2) a_{6-1} + (1/8)(1) + (1/8)(0) + (1/8) a_{6-2} + (1/8) a_{6-3}
a_{6-2} = (1/2)(1) + (3/8) a_{6-1} + (1/8) a_{6-3}
a_{6-3} = (1/4)(0) + (3/8) a_{6-1} + (3/8) a_{6-2}

Solving this system of equations yields

a_{6-1} = 105/184 ≈ 0.571.

We will keep the other a_i's around as well; they will be useful later:

a_{6-2} = 0.77717,   a_{6-3} = 0.50543.

(c) Let μ_i denote the expected number of days until she leaves course 6, starting from state i. Then

μ_{6-1} = 1 + (1/2) μ_{6-1} + (1/8) μ_{6-2} + (1/8) μ_{6-3}
μ_{6-2} = 1 + (3/8) μ_{6-1} + (1/8) μ_{6-3}
μ_{6-3} = 1 + (3/8) μ_{6-1} + (3/8) μ_{6-2}

and solving this system yields μ_{6-1} = 81/23 ≈ 3.52 days.
(d) The student buys one ice cream cone every time she goes from 6-2 to 6-1 or from 6-3 to 6-1, and buys no more than 2 ice cream cones. Let us denote by v_i(j) the probability that she transitions from 6-2 to 6-1 or from 6-3 to 6-1 exactly j times before leaving course 6, conditioned on being in state i. Then we are interested in the expected value of the random variable N, which denotes the number of cones bought before leaving course 6 and takes on the values 0, 1, or 2. So

E[N] = (0) v_{6-1}(0) + (1) v_{6-1}(1) + (2)(1 - v_{6-1}(0) - v_{6-1}(1)).

We use the total probability theorem, conditioning on the next day, to yield the following set of recursive equations:

v_15(0) = 1
v_9(0) = 1
v_{6-1}(0) = (1/2) v_{6-1}(0) + (1/8) v_{6-2}(0) + (1/8) v_{6-3}(0) + (1/8)(1) + (1/8)(1)
v_{6-2}(0) = (3/8)(0) + (1/8) v_{6-3}(0) + (1/2)(1)
v_{6-3}(0) = (3/8)(0) + (3/8) v_{6-2}(0) + (1/4)(1)

Solving this system of equations yields:

v_{6-1}(0) = 46/61 ≈ 0.754.

We still need to find v_{6-1}(1), and we do this by again conditioning on the following day:

v_{6-1}(1) = (1/2) v_{6-1}(1) + (1/8) v_{6-2}(1) + (1/8) v_{6-3}(1) + (1/8)(0) + (1/8)(0)
v_{6-2}(1) = (3/8) v_{6-1}(0) + (1/8) v_{6-3}(1) + (1/2)(0)
v_{6-3}(1) = (3/8) v_{6-1}(0) + (3/8) v_{6-2}(1) + (1/4)(0)
Finally, we can solve for the expected number of cones:

E[N] = (0) v_{6-1}(0) + (1) v_{6-1}(1) + (2)(1 - v_{6-1}(0) - v_{6-1}(1))
= 690/3721 + 2 (225/3721)
= 1140/3721 ≈ 0.306.

(e) We want to find the expected time to absorption conditioned on the event that the student eventually ends up in state 15, which we will call A. So

P_{i,j|A} = P(X_{n+1} = j | X_n = i, X_∞ = 15)
= P(X_∞ = 15 | X_{n+1} = j) P(X_{n+1} = j | X_n = i) / P(X_∞ = 15 | X_n = i)
= a_j P_{i,j} / a_i,

where a_k is the absorption probability of eventually ending up in state 15 conditioned on being in state k, which we found in part (b). So we may modify our chain with these new conditional probabilities and calculate the expected time to absorption on the new chain. Note that state 9 now disappears. Also, note that P_{j,j|A} = P_{j,j}, but P_{i,j|A} ≠ P_{i,j} for i ≠ j, which means that we may not simply renormalize the transition probabilities in a uniform fashion after conditioning on this event. Let us denote the new expected time to absorption, conditioned on being in state i, as τ_i. Our system of equations now becomes

τ_15 = 0
τ_{6-1} = 1 + (1/2)(a_{6-1}/a_{6-1}) τ_{6-1} + (1/8)(a_{6-2}/a_{6-1}) τ_{6-2} + (1/8)(a_{6-3}/a_{6-1}) τ_{6-3}
τ_{6-2} = 1 + (3/8)(a_{6-1}/a_{6-2}) τ_{6-1} + (1/8)(a_{6-3}/a_{6-2}) τ_{6-3}
τ_{6-3} = 1 + (3/8)(a_{6-1}/a_{6-3}) τ_{6-1} + (3/8)(a_{6-2}/a_{6-3}) τ_{6-2}
Solving this system of equations yields τ_{6-1} = 1763/483 ≈ 3.65.

(f) The new Markov chain is shown below.
[Markov chain diagram for part (f): from 6-1, probability 1/2 to itself and 1/6 to each of 6-2, 6-3, and 9; from 6-2, probability 3/4 to 6-1 and 1/4 to 6-3; from 6-3, probability 3/8 to 6-1, 3/8 to 6-2, and 1/4 to 9.]
This is another expected-time-to-absorption question, on the new chain. Let us define κ_k to be the expected number of days it takes the student to go from state k to state 9 in this new Markov chain:

κ_{6-1} = 1 + (1/2) κ_{6-1} + (1/6) κ_{6-2} + (1/6) κ_{6-3} + (1/6)(0)
κ_{6-2} = 1 + (3/4) κ_{6-1} + (1/4) κ_{6-3}
κ_{6-3} = 1 + (3/8) κ_{6-1} + (3/8) κ_{6-2} + (1/4)(0)

Solving this system of equations yields:

κ_{6-1} = 86/13 ≈ 6.615.
(g) The corresponding Markov chain is the same as the one in part (a) except that p_{9,6-1} = 1/8, p_{9,9} = 7/8, p_{15,6-1} = 1/8, p_{15,15} = 7/8, instead of p_{9,9} = 1, p_{15,15} = 1. We can consider state 6-1 as an absorbing state. Let t_k be the expected number of transitions to be absorbed if we start at state k:

t_9 = (1/8)(1) + (7/8)(1 + t_9),   so t_9 = 8
t_15 = (1/8)(1) + (7/8)(1 + t_15),   so t_15 = 8
t_{6-3} = (3/8)(1) + (3/8)(1 + t_{6-2}) + (1/4)(1 + t_9)
t_{6-2} = (3/8)(1) + (1/8)(1 + t_{6-3}) + (1/2)(1 + t_15)

which yields t_{6-2} = 344/61, t_{6-3} = 312/61. Finally, starting today from 6-1,

E[days until she is 6-1 again] = (1/2)(1) + (1/8)(1 + t_{6-2}) + (1/8)(1 + t_{6-3}) + (1/8)(1 + t_9) + (1/8)(1 + t_15)
= 1 + (1/8)(344/61 + 312/61 + 8 + 8) = 265/61 ≈ 4.34.
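All of the absorption-probability and expected-time systems above are small linear systems, so they are easy to check numerically. As an illustration, here is a sketch solving the part (b) system, with unknowns ordered (a_{6-1}, a_{6-2}, a_{6-3}):

    import numpy as np

    # a61 = 1/2 a61 + 1/8 a62 + 1/8 a63 + 1/8, etc., rewritten as A x = b
    A = np.array([[1 - 1/2, -1/8, -1/8],
                  [-3/8, 1.0, -1/8],
                  [-3/8, -3/8, 1.0]])
    b = np.array([1/8, 1/2, 0.0])
    print(np.linalg.solve(A, b))  # [0.5707..., 0.7772..., 0.5054...], i.e. 105/184, ...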
1. (Example 7.8) We load on a plane 100 packages whose weights are independent random variables that are uniformly distributed between 5 and 50 pounds. What is the probability that the total weight will exceed 3000 pounds? Find an approximate answer using the Central Limit Theorem. (A numerical sketch follows at the end of this list.)

2. (Problem 7.6) Before starting to play the roulette in a casino, you want to look for biases that you can exploit. You therefore watch 100 rounds that result in a number between 1 and 36, and count the number of rounds for which the result is odd. If the count exceeds 55, you decide that the roulette is not fair. Assuming that the roulette is fair, find an approximation for the probability that you will make the wrong decision.

3. (Problem 7.7) During each day, the probability that your computer's operating system crashes at least once is 5%, independent of every other day. You are interested in the probability of at least 45 crash-free days out of the next 50 days. (a) Find the probability of interest by using the normal approximation to the binomial. (b) Repeat part (a), this time using the Poisson approximation to the binomial.
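For problem 1, the sum S of the 100 weights has E[S] = 100 · 27.5 = 2750 and var(S) = 100 · (50 - 5)²/12 = 16875, so the CLT approximation is P(S > 3000) ≈ 1 - Φ(250/√16875). A sketch of the evaluation:

    from math import erf, sqrt

    def Phi(x):  # standard normal CDF
        return 0.5 * (1 + erf(x / sqrt(2)))

    print(1 - Phi((3000 - 2750) / sqrt(16875)))  # ~ 0.027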
1. See solution in text, page 390. 2. See online solutions. 3. See online solutions.