
MTH6141 Random Processes, Spring 2012

Solutions to Exercise Sheet 7


1. (a) Under assumption (i) the fact that I arrive at a random time means that
the time I wait (in minutes) is a random variable uniformly distributed
on [0, 10]. Such an r.v. has expectation 5. So I expect to wait 5
minutes. Under assumption (ii) we have that, by the lack of memory of
the Poisson process, the buses arrive as a Poisson process (even given
that I arrive at a random time). The time I wait is T_1 (or S_1) for the
process starting from when I arrive. So (in minutes) the waiting time
is an Exp(6/60) = Exp(1/10) random variable. The expectation of this
r.v. is 10. So I can expect to wait 10 minutes.
(b) This result appears paradoxical since in both cases the average interval
between buses is 10 minutes. Intuitively the average time I wait should
be half this. This argument is correct in case (i). However in the Poisson
process case I am more likely to arrive during a long interval between
buses (because the long intervals take up more time) and so the average
time I wait is longer.
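The contrast between the two answers can be checked numerically. Below is a minimal simulation sketch, assuming a rate of 6 buses per hour (1/10 per minute); the arrival time 50 used in case (ii) is an arbitrary illustrative choice.

```python
import random

# Assumed parameters: 6 buses per hour, i.e. rate 1/10 per minute.
rate = 1 / 10
random.seed(0)

# Case (i): buses exactly every 10 minutes, so arriving at a uniformly
# random time means waiting a U[0, 10] amount.
waits_fixed = [random.uniform(0, 10) for _ in range(100_000)]

# Case (ii): Poisson buses.  Arrive at a fixed time t and wait for the
# first arrival after t; by lack of memory this wait is Exp(1/10).
def poisson_wait(t):
    """Time from t until the first Poisson arrival after t."""
    s = 0.0
    while True:
        s += random.expovariate(rate)
        if s > t:
            return s - t

waits_poisson = [poisson_wait(50.0) for _ in range(100_000)]

print(sum(waits_fixed) / len(waits_fixed))      # close to 5
print(sum(waits_poisson) / len(waits_poisson))  # close to 10
```

The sample means land near 5 and 10 minutes respectively, matching parts (a) and (b).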
2. Applying the same calculation for general n ≥ 1:
\[
F_{T_n}(t) = \Pr(X(t) \ge n) = 1 - \Pr(X(t)=0) - \Pr(X(t)=1) - \cdots - \Pr(X(t)=n-1)
\]
\[
= 1 - \left( 1 + \lambda t + \frac{(\lambda t)^2}{2!} + \cdots + \frac{(\lambda t)^{n-2}}{(n-2)!} + \frac{(\lambda t)^{n-1}}{(n-1)!} \right) e^{-\lambda t}.
\]
Differentiating,
\[
f_{T_n}(t) = -\left( \lambda + \lambda^2 t + \cdots + \frac{\lambda^{n-2} t^{n-3}}{(n-3)!} + \frac{\lambda^{n-1} t^{n-2}}{(n-2)!} \right) e^{-\lambda t} + \lambda \left( 1 + \lambda t + \cdots + \frac{(\lambda t)^{n-2}}{(n-2)!} + \frac{(\lambda t)^{n-1}}{(n-1)!} \right) e^{-\lambda t}
\]
\[
= \frac{\lambda^n t^{n-1}}{(n-1)!} e^{-\lambda t}.
\]
(Notice how all the terms bar one cancel out in pairs.)
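As a sanity check on the density just derived, here is an illustrative sketch (λ = 2 and n = 4 are assumed values, not from the question): T_n is the sum of n independent Exp(λ) interarrival times, so its sample mean should match the mean n/λ of this Gamma density.

```python
import random

# Assumed parameters for illustration only.
lam, n = 2.0, 4
random.seed(1)

# T_n is the sum of n i.i.d. Exp(lam) interarrival times.
samples = [sum(random.expovariate(lam) for _ in range(n))
           for _ in range(200_000)]

mean = sum(samples) / len(samples)
print(mean)  # close to n/lam = 2.0
```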
3. As usual with continuous random variables (and as the hint suggests), the
cdf is easier to work out than the pdf.
\[
F_{T_1 \mid X(t)=n}(u) = \Pr(T_1 \le u \mid X(t) = n) = 1 - \Pr(T_1 > u \mid X(t) = n) = 1 - \Pr(X(u) = 0 \mid X(t) = n) = 1 - \left( 1 - \frac{u}{t} \right)^n
\]
for 0 < u ≤ t, since the conditional distribution of X(u), given X(t) = n, is Bin(n, u/t).
Differentiating this with respect to u gives the pdf:
\[
f_{T_1 \mid X(t)=n}(u) = \frac{n}{t} \left( 1 - \frac{u}{t} \right)^{n-1} \quad \text{for } 0 < u \le t.
\]
The expectation can be found in the usual way:
\[
E(T_1 \mid X(t) = n) = \int_0^t x \cdot \frac{n}{t} \left( 1 - \frac{x}{t} \right)^{n-1} dx
= \left[ -x \left( 1 - \frac{x}{t} \right)^n \right]_{x=0}^{x=t} + \int_0^t \left( 1 - \frac{x}{t} \right)^n dx \quad \text{(integrating by parts)}
\]
\[
= 0 - \left[ \frac{t}{n+1} \left( 1 - \frac{x}{t} \right)^{n+1} \right]_{x=0}^{x=t} = \frac{t}{n+1}.
\]
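The answer t/(n+1) can be checked by rejection sampling, without assuming the binomial fact: simulate the Poisson process on [0, t], keep only runs with exactly n arrivals, and average the first arrival time. A sketch with assumed values λ = 1, t = 4, n = 3 (so the target is t/(n+1) = 1):

```python
import random

# Assumed values for illustration only.
lam, t, n = 1.0, 4.0, 3
random.seed(2)

first_arrivals = []
while len(first_arrivals) < 30_000:
    # Simulate arrival times on [0, t] via i.i.d. Exp(lam) gaps.
    times, s = [], 0.0
    while True:
        s += random.expovariate(lam)
        if s > t:
            break
        times.append(s)
    if len(times) == n:            # condition on X(t) = n
        first_arrivals.append(times[0])

mean = sum(first_arrivals) / len(first_arrivals)
print(mean)  # close to t/(n+1) = 1.0
```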
4. In each part of this question we are told that X(t) = n for some t and n. We consider random variables U_i for 1 ≤ i ≤ n, where the U_i are independent and each is distributed uniformly on [0, t]. We know from lectures (Theorem 2.5) that if s is a symmetric function of the T_i (as all the functions given are) then
\[
E(s(T_1, \ldots, T_n) \mid X(t) = n) = E(s(U_1, \ldots, U_n)).
\]
(a) By the above, and linearity of expectation,
\[
E(T_1 + T_2 + T_3 \mid X(4) = 3) = E(U_1 + U_2 + U_3) = E(U_1) + E(U_2) + E(U_3),
\]
where U_i ~ U[0, 4]. Then E(U_i) = 2, and
\[
E(T_1 + T_2 + T_3 \mid X(4) = 3) = 2 + 2 + 2 = 6.
\]
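Theorem 2.5 makes this easy to check numerically: a minimal sketch that simulates the three U[0, 4] variables directly (the rate λ plays no role in the answer).

```python
import random

# Per Theorem 2.5, conditioned on X(4) = 3 the arrival times behave
# like three independent U[0, 4] variables for symmetric functions.
random.seed(3)
sums = [sum(random.uniform(0, 4) for _ in range(3))
        for _ in range(100_000)]

mean_sum = sum(sums) / len(sums)
print(mean_sum)  # close to 2 + 2 + 2 = 6
```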
(b) By linearity of expectation,
\[
E(T_1 + T_2 + T_3 + T_4 \mid X(4) = 3) = E(T_1 + T_2 + T_3 \mid X(4) = 3) + E(T_4 \mid X(4) = 3).
\]
We saw in part (a) that the first term is 6. Now T_4 is the time of the first arrival after time 4, and so T_4 − 4 is distributed Exp(λ). So
\[
E(T_4 \mid X(4) = 3) = 4 + \frac{1}{\lambda} \quad \text{and} \quad E(T_1 + T_2 + T_3 + T_4 \mid X(4) = 3) = 10 + \frac{1}{\lambda}.
\]
(c) As with part (a),
\[
E(T_1^2 + T_2^2 + T_3^2 + T_4^2 + T_5^2 \mid X(2) = 5) = E(U_1^2 + U_2^2 + U_3^2 + U_4^2 + U_5^2) = E(U_1^2) + E(U_2^2) + E(U_3^2) + E(U_4^2) + E(U_5^2),
\]
where U_i ~ U[0, 2]. Then
\[
E(U_i^2) = \int_0^2 \frac{1}{2} x^2 \, dx = \left[ \frac{1}{6} x^3 \right]_0^2 = \frac{4}{3}
\]
and
\[
E(T_1^2 + T_2^2 + T_3^2 + T_4^2 + T_5^2 \mid X(2) = 5) = \frac{20}{3}.
\]
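A quick numerical check of E(U_i^2) = 4/3 for U_i ~ U[0, 2], and hence of the final answer (an illustrative sketch):

```python
import random

# Estimate the second moment of U ~ U[0, 2] by simulation.
random.seed(4)
samples = [random.uniform(0, 2) ** 2 for _ in range(200_000)]

second_moment = sum(samples) / len(samples)
print(second_moment)       # close to 4/3
print(5 * second_moment)   # close to 20/3
```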
5. Processing occurs every T minutes and costs k each time. This gives a cost of k/T per minute.
If there are n requests waiting at time T then the total waiting cost they have incurred is a random variable with expectation
\[
E\left( \sum_{i=1}^n c(T - T_i) \;\middle|\; X(T) = n \right).
\]
We can use the same trick as in Question 4 (replacing T_i with U_i ~ U[0, T]) to get that
\[
E\left( \sum_{i=1}^n c(T - T_i) \;\middle|\; X(T) = n \right) = E\left( \sum_{i=1}^n c(T - U_i) \right) = \frac{ncT}{2}.
\]
Now, conditioning on the number of requests waiting at time T, the expected total waiting cost is
\[
\sum_{n \ge 0} E\left( \sum_{i=1}^n c(T - T_i) \;\middle|\; X(T) = n \right) \Pr(X(T) = n) = \frac{cT}{2} \sum_{n \ge 0} n e^{-\lambda T} \frac{(\lambda T)^n}{n!} = \frac{c\lambda T^2}{2}
\]
(where the last identity comes from the fact that the expectation of a Po(λT) random variable is λT).
So the expectation of the total waiting cost per minute is cλT/2. The total expected cost per minute (processing and waiting) is
\[
\frac{k}{T} + \frac{c\lambda T}{2}
\]
as required.
Sketching the graph of this function against T we see that it has a single minimum. Differentiating, we get that the minimum is the solution to
\[
\frac{c\lambda}{2} - \frac{k}{T^2} = 0.
\]
So we should take T = \sqrt{2k/(c\lambda)}.
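The closed form can be confirmed by minimising the cost function numerically. A sketch with hypothetical parameter values k = 9, c = 0.5, λ = 2 (these are not from the question):

```python
import math

# Hypothetical values chosen for illustration only.
k, c, lam = 9.0, 0.5, 2.0   # processing cost, waiting cost rate, arrival rate

def cost_per_minute(T):
    """Total expected cost per minute: processing plus waiting."""
    return k / T + c * lam * T / 2

# Grid search for the minimiser, versus the closed form sqrt(2k/(c*lam)).
grid = [i / 1000 for i in range(1, 20_001)]
T_numeric = min(grid, key=cost_per_minute)
T_formula = math.sqrt(2 * k / (c * lam))
print(T_numeric, T_formula)  # both close to sqrt(18)
```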