Variance:
Consider $Z = X + Y$. Then
$$\operatorname{Var}[Z] = E\big[\big((X - E[X]) + (Y - E[Y])\big)^2\big] = \operatorname{Var}[X] + \operatorname{Var}[Y] + 2\operatorname{Cov}(X, Y).$$
In general, the covariance $\operatorname{Cov}(X, Y)$ is not equal to
zero, and thus, the variance of a sum is not necessarily
equal to the sum of the individual variances.
For the sum of $n$ random variables, $S_n = X_1 + X_2 + \cdots + X_n$,
$$E[S_n] = \sum_{k=1}^{n} E[X_k]$$
and
$$\operatorname{Var}[S_n] = \sum_{k=1}^{n} \operatorname{Var}[X_k] + \sum_{j=1}^{n} \sum_{\substack{k=1 \\ k \ne j}}^{n} \operatorname{Cov}(X_j, X_k).$$
If the $X_k$ are independent random variables, the covariance terms vanish and
$$\sigma_{S_n}^2 = \sigma_1^2 + \sigma_2^2 + \cdots + \sigma_n^2.$$
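The additivity of variance for independent variables, and the extra covariance contribution for dependent ones, can be checked by simulation. A minimal Python sketch (the particular distributions and variances below are illustrative choices, not from the original notes):

```python
import random
import statistics

random.seed(1)
trials = 200_000

def var(v):
    return statistics.pvariance(v)

# Independent case: variances add.
sds = [1.0, 2.0, 3.0]  # individual variances 1, 4, 9
indep = [sum(random.gauss(0, s) for s in sds) for _ in range(trials)]
print(round(var(indep), 1))  # close to 1 + 4 + 9 = 14

# Dependent case: Y = X, so Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) = 4 Var(X).
xs = [random.gauss(0, 1) for _ in range(trials)]
dep = [x + x for x in xs]
print(round(var(dep), 1))  # close to 4, not 1 + 1 = 2
```

The dependent case shows why dropping the covariance term is only valid under independence.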
For independent $X$ and $Y$, the characteristic function of $Z = X + Y$ factors:
$$\Phi_Z(\omega) = E[e^{j\omega(X+Y)}] = E[e^{j\omega X}]\, E[e^{j\omega Y}].$$
Therefore,
$$\Phi_Z(\omega) = \Phi_X(\omega)\, \Phi_Y(\omega).$$
The above results can be generalized to the sum of $n$
independent random variables. Let
$$S = X_1 + X_2 + \cdots + X_n.$$
Then
$$\Phi_S(\omega) = E[e^{j\omega(X_1 + X_2 + \cdots + X_n)}] = E[e^{j\omega X_1}] \cdots E[e^{j\omega X_n}] = \Phi_{X_1}(\omega) \cdots \Phi_{X_n}(\omega).$$
Thus, the characteristic function of the sum of
independent random variables can be expressed as a
product of the characteristic functions of the individual
random variables.
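The product rule for characteristic functions can be checked numerically by comparing the empirical characteristic function of a sum against the product of exact ones. A minimal sketch using exponential variables (the rate and evaluation frequency are illustrative):

```python
import cmath
import random

random.seed(2)
lam, w, n = 2.0, 1.5, 100_000   # illustrative rate and frequency

xs = [random.expovariate(lam) for _ in range(n)]
ys = [random.expovariate(lam) for _ in range(n)]

def emp_cf(samples, w):
    """Empirical characteristic function: sample average of exp(j*w*X)."""
    return sum(cmath.exp(1j * w * s) for s in samples) / len(samples)

phi_exact = lam / (lam - 1j * w)                       # exact cf of Exponential(lam)
phi_sum = emp_cf([x + y for x, y in zip(xs, ys)], w)   # estimated cf of X + Y
err = abs(phi_sum - phi_exact ** 2)
print(err)  # small: cf of the sum matches the product of the individual cfs
```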
Example: sum of independent Gaussian random variables.
Let $X_k$, $k = 1, 2, \ldots, n$, be independent Gaussian random
variables with means $m_k$ and variances $\sigma_k^2$.
The characteristic functions of these individual Gaussian
random variables are given by
$$\Phi_{X_k}(\omega) = \exp\{j m_k \omega - \sigma_k^2 \omega^2 / 2\}, \quad k = 1, 2, \ldots, n.$$
The characteristic function of $S = X_1 + X_2 + \cdots + X_n$ is therefore
$$\Phi_S(\omega) = \prod_{k=1}^{n} \exp\{j m_k \omega - \sigma_k^2 \omega^2 / 2\}
= \exp\{j(m_1 + m_2 + \cdots + m_n)\omega - (\sigma_1^2 + \sigma_2^2 + \cdots + \sigma_n^2)\omega^2 / 2\}
= \exp\{j m \omega - \sigma^2 \omega^2 / 2\}.$$
This is the characteristic function of a
Gaussian random variable with mean $m = m_1 + \cdots + m_n$ and
variance $\sigma^2 = \sigma_1^2 + \cdots + \sigma_n^2$. Hence $S$ has the Gaussian pdf
$$f_S(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x - m)^2 / (2\sigma^2)}.$$
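The result that a sum of independent Gaussians is Gaussian with summed means and summed variances can be spot-checked by simulation. A minimal sketch (the three means and standard deviations are illustrative values):

```python
import random
import statistics

random.seed(3)
means = [1.0, -2.0, 0.5]
sds = [1.0, 2.0, 0.5]
trials = 200_000

# Each trial draws one value from each Gaussian and sums them.
sums = [sum(random.gauss(m, s) for m, s in zip(means, sds)) for _ in range(trials)]
m_hat = statistics.fmean(sums)
v_hat = statistics.pvariance(sums)
print(round(m_hat, 2), round(v_hat, 2))  # near -0.5 = sum of means, 5.25 = sum of variances
```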
Example: sum of iid exponential random variables.
Let $X_k$, $k = 1, \ldots, n$, be iid exponential random variables with parameter $\lambda$. Their characteristic functions are
$$\Phi_{X_k}(\omega) = \frac{\lambda}{\lambda - j\omega}, \quad k = 1, 2, \ldots, n.$$
Therefore, the characteristic function of $S = X_1 + \cdots + X_n$ is
$$\Phi_S(\omega) = \left(\frac{\lambda}{\lambda - j\omega}\right)^n.$$
Thus, $S$ is an n-Erlang random variable, with its pdf given by
$$f_S(x) = \frac{\lambda (\lambda x)^{n-1}}{(n-1)!}\, e^{-\lambda x}, \quad x > 0.$$
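An n-Erlang variable with rate $\lambda$ has mean $n/\lambda$ and variance $n/\lambda^2$, which a simulation of summed exponentials should reproduce. A minimal sketch (the rate and the number of terms are illustrative):

```python
import random
import statistics

random.seed(4)
lam, k = 3.0, 4          # illustrative rate and number of summed exponentials
trials = 100_000

# Each trial sums k iid Exponential(lam) draws, giving one k-Erlang sample.
sums = [sum(random.expovariate(lam) for _ in range(k)) for _ in range(trials)]
m_hat = statistics.fmean(sums)
v_hat = statistics.pvariance(sums)
print(round(m_hat, 3), round(v_hat, 3))  # near k/lam = 1.333 and k/lam**2 = 0.444
```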
For sums of integer-valued random variables, the probability generating function plays the role of the characteristic function. If the $X_k$ are independent, then
$$E[z^S] = E[z^{X_1}] \cdots E[z^{X_n}].$$
Therefore, the probability generating function of $S$:
$$G_S(z) = G_{X_1}(z) \cdots G_{X_n}(z).$$
Sum of a random number of random variables: let $N$ be a random variable independent of the iid sequence $X_1, X_2, \ldots$, and let
$$S = \sum_{k=1}^{N} X_k.$$
Since
$$E[S \mid N = n] = E\Big[\sum_{k=1}^{n} X_k\Big] = n\, E[X],$$
we have
$$E[S \mid N] = N\, E[X].$$
Therefore,
$$E[S] = E\big[E[S \mid N]\big] = E[N]\, E[X].$$
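The identity $E[S] = E[N]\,E[X]$ can be checked by simulating a random sum directly. A minimal sketch (the geometric parameter and exponential rate are illustrative; the count $N$ is taken on $\{0, 1, 2, \ldots\}$ with $P[N = 0] = p$):

```python
import random
import statistics

random.seed(5)
p, lam = 0.25, 2.0      # illustrative: geometric N, Exponential(lam) terms
trials = 200_000

def random_sum():
    # Draw N as the number of failures before the first success (P[N = 0] = p),
    # then sum N iid Exponential(lam) variables.
    n = 0
    while random.random() >= p:
        n += 1
    return sum(random.expovariate(lam) for _ in range(n))

s_mean = statistics.fmean(random_sum() for _ in range(trials))
# E[N] = (1 - p)/p = 3 and E[X] = 1/lam = 0.5, so E[S] = E[N] E[X] = 1.5
print(round(s_mean, 2))
```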
Characteristic function of $S = \sum_{k=1}^{N} X_k$:
Conditioning on $N = n$,
$$E[e^{j\omega S} \mid N = n] = E[e^{j\omega(X_1 + X_2 + \cdots + X_n)}] = \Phi_X(\omega)^n.$$
Therefore,
$$E[e^{j\omega S} \mid N] = \Phi_X(\omega)^N.$$
Hence,
$$\Phi_S(\omega) = E\big[\Phi_X(\omega)^N\big] = E[z^N]\big|_{z = \Phi_X(\omega)} = G_N(\Phi_X(\omega)).$$
Thus, the characteristic function of $S$ is
obtained by evaluating the probability generating function of $N$
at $z = \Phi_X(\omega)$.
Example
The number $N$ of jobs submitted to a computer
in an hour is a geometric random variable with
parameter $p$, and the job execution times are
independent exponentially distributed random
variables with mean $1/\lambda$. Find the pdf for the sum
$$S = \sum_{k=1}^{N} X_k.$$
A job execution time is exponentially distributed:
$$f_X(x) = \lambda e^{-\lambda x}, \quad x \ge 0.$$
Its characteristic function is given by
$$\Phi_X(\omega) = \frac{\lambda}{\lambda - j\omega}.$$
Take $N$ geometric on $\{0, 1, 2, \ldots\}$ with $P[N = k] = p(1-p)^k$, so that $P[N = 0] = p$ and
$$G_N(z) = \frac{p}{1 - (1-p)z}.$$
Therefore, the characteristic function of $S$ is given by
$$\Phi_S(\omega) = G_N(\Phi_X(\omega)).$$
$$\Phi_S(\omega) = \frac{p}{1 - (1-p)\dfrac{\lambda}{\lambda - j\omega}}
= \frac{p(\lambda - j\omega)}{p\lambda - j\omega}
= p + (1-p)\, \frac{p\lambda}{p\lambda - j\omega}.$$
Therefore, inverting the characteristic function term by term,
$$f_S(x) = p\, \delta(x) + (1-p)\, p\lambda\, e^{-p\lambda x}, \quad x \ge 0.$$
Interpretation:
1. With probability $p$, there are no job arrivals, and
hence, the total execution time is zero.
2. With probability $(1 - p)$, there are one or more
arrivals and the execution time is an exponential
random variable with mean $1/(p\lambda)$.
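The two-part form of this pdf, a point mass at zero plus an exponential tail, can be checked by simulating the compound sum. A minimal sketch (the values of $p$ and $\lambda$ are illustrative; the geometric count has $P[N = 0] = p$ as in the example):

```python
import random
import statistics

random.seed(6)
p, lam = 0.4, 2.0       # illustrative geometric parameter and exponential rate
trials = 200_000

def total_time():
    # N = number of jobs (failures before the geometric "stop"), then sum
    # N iid Exponential(lam) execution times; N = 0 gives total time 0.
    n = 0
    while random.random() >= p:
        n += 1
    return sum(random.expovariate(lam) for _ in range(n))

samples = [total_time() for _ in range(trials)]
zero_frac = sum(1 for s in samples if s == 0.0) / trials
pos_mean = statistics.fmean(s for s in samples if s > 0.0)
print(round(zero_frac, 2), round(pos_mean, 2))  # near p = 0.4 and 1/(p*lam) = 1.25
```

The positive samples follow an exponential law with rate $p\lambda$, matching the interpretation above.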
Sample mean
Let $X$ be a random variable with mean $E[X] = \mu$.
Let $X_1, X_2, \ldots, X_n$ be independent repeated measurements of $X$;
the $X_j$ are iid random variables.
The sample mean of the sequence is used to estimate $E[X]$:
$$M_n = \frac{1}{n} \sum_{j=1}^{n} X_j.$$
Properties of an estimator:
(i) It should give the correct value of the parameter
being estimated on average, i.e., $E[M_n] = \mu$.
(ii) It should not vary too much about the correct
value of the parameter, i.e., $E[(M_n - \mu)^2]$ is
small.
Mean of the sample mean:
$$E[M_n] = E\Big[\frac{1}{n}\sum_{j=1}^{n} X_j\Big] = \frac{1}{n}\sum_{j=1}^{n} E[X_j] = \frac{1}{n}\, n\mu = \mu,$$
so $M_n$ is an unbiased estimator of $\mu$.
Variance of the sample mean: since $M_n = \frac{1}{n}(X_1 + X_2 + \cdots + X_n)$ and the $X_j$ are independent with variance $\sigma^2$,
$$\operatorname{Var}[M_n] = \frac{1}{n^2}\operatorname{Var}[X_1 + X_2 + \cdots + X_n] = \frac{1}{n^2}\, n\sigma^2 = \frac{\sigma^2}{n}.$$
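The $\sigma^2/n$ scaling of the sample-mean variance is easy to verify by simulating many sample means. A minimal sketch (the population standard deviation and sample size are illustrative):

```python
import random
import statistics

random.seed(7)
sigma, n, trials = 2.0, 25, 50_000   # illustrative: sigma**2 = 4, samples of size 25

# Each trial computes one sample mean of n iid Gaussian measurements.
m_vals = [statistics.fmean(random.gauss(0, sigma) for _ in range(n)) for _ in range(trials)]
v_hat = statistics.pvariance(m_vals)
print(round(v_hat, 3))   # near sigma**2 / n = 0.16
```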
Using the Chebyshev inequality,
$$P[|M_n - \mu| \ge \varepsilon] \le \frac{\operatorname{Var}[M_n]}{\varepsilon^2} = \frac{\sigma^2}{n\varepsilon^2},$$
and hence
$$P[|M_n - \mu| < \varepsilon] \ge 1 - \frac{\sigma^2}{n\varepsilon^2}.$$
Thus, the probability that the sample mean is close to the true mean
approaches 1 as $n$ becomes very large. If we require $P[|M_n - \mu| < \varepsilon] \ge 1 - \delta$, then choosing
$$n \ge \frac{\sigma^2}{\delta\varepsilon^2}$$
ensures that the sample mean is within $\varepsilon$ of the true mean with
probability $1 - \delta$ or greater.
Example 1
A voltage of constant but unknown value $v$ is to be
measured. Each measurement $X_j$ is actually the
sum of the desired voltage $v$ and a noise voltage $N_j$
of zero mean and standard deviation of 1:
$$X_j = v + N_j, \quad j = 1, 2, \ldots$$
Assume that the noise voltages are independent
random variables. How many measurements are
required so that the probability that $M_n$ is within
$\varepsilon = 0.5$ of the true mean is at least 0.99?
Here $\delta = 1 - 0.99 = 0.01$, $\sigma = 1$, and $\varepsilon = 0.5$, so
$$n = \frac{\sigma^2}{\delta\varepsilon^2} = \frac{1}{(0.01)(0.5)^2} = 400.$$
Thus, if we were to repeat the measurements
400 times and compute the sample mean, on
the average, at least 99 times out of 100, the
resulting sample mean will be within 0.5 of
the true mean.
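The Chebyshev bound is conservative, so a simulation of this experiment should show coverage well above 0.99. A minimal sketch (the true voltage value is an arbitrary illustrative choice):

```python
import random
import statistics

random.seed(8)
v_true, n, eps, trials = 5.0, 400, 0.5, 5_000   # v_true is illustrative

hits = 0
for _ in range(trials):
    # One experiment: average 400 noisy measurements X_j = v + N_j, N_j ~ N(0, 1).
    m_n = statistics.fmean(v_true + random.gauss(0, 1) for _ in range(n))
    if abs(m_n - v_true) < eps:
        hits += 1

coverage = hits / trials
print(coverage)  # far above the 0.99 guaranteed by the Chebyshev bound
```

In fact $M_n - v$ here is Gaussian with standard deviation $1/\sqrt{400} = 0.05$, so the true coverage is essentially 1; Chebyshev only guarantees 0.99.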
Central limit theorem: let $S_n = X_1 + \cdots + X_n$ be the sum of $n$ iid random variables with mean $\mu$ and variance $\sigma^2$, and let
$$Z_n = \frac{S_n - n\mu}{\sigma\sqrt{n}}.$$
Then
$$\lim_{n \to \infty} P[Z_n \le z] = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{z} e^{-x^2/2}\, dx.$$
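The central limit theorem can be illustrated numerically: standardized sums of uniform random variables should already track the standard normal cdf closely for moderate $n$. A minimal sketch (the choice of Uniform(0, 1) terms, $n = 30$, and the evaluation point $z = 1$ are all illustrative):

```python
import math
import random

random.seed(9)
n, trials = 30, 100_000
mu, sd = 0.5, math.sqrt(1 / 12)       # mean and std dev of Uniform(0, 1)

count = 0
for _ in range(trials):
    s = sum(random.random() for _ in range(n))
    z = (s - n * mu) / (sd * math.sqrt(n))   # standardized sum Z_n
    if z <= 1.0:
        count += 1

frac = count / trials
print(round(frac, 3))  # near Phi(1), approximately 0.841
```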