In this chapter, we introduce the expected value, or mean, of a random variable. First we define expectation for discrete random variables and then for general random variables. Finally, we introduce the notion of conditional expectation using conditional probabilities.
A random variable $X$ on $(\Omega, \mathcal{F}, P)$ is said to be discrete if it admits a representation
$$X = \sum_{n=1}^{\infty} a_n I_{A_n},$$
where $\{A_n\}$ is a countable partition of $\Omega$ and the values $a_n$ may be repeated.

Proof. Let $\{x_1, x_2, \dots\}$ denote the range of $X$. Here, since $X$ is discrete, we have that the range is at most countable. Set
$$A_n = \{X = x_n\}, \quad n \geq 1.$$
Then the sets $A_n$ are pairwise disjoint. Now define $X$ on each $A_n$ by its constant value $x_n$. Then $\{A_n\}$ is a partition of $\Omega$ and
$$X = \sum_{n=1}^{\infty} x_n I_{A_n}.$$
Remark 6.0.6 If $X$ is a discrete random variable, then the 'effective' range of $X$ is at most countable. Here 'effective' range means those values taken with positive probability. This leads to the name 'discrete' random variable.
Remark 6.0.7 If $X$ is a discrete random variable, then one can assume without loss of generality that the values $a_n$ in the representation are distinct and that $P(A_n) > 0$ for each $n$. For if $a_n = a_m$ for some $n \neq m$, then set $A = A_n \cup A_m$ and use the single value $a_n$ on $A$; terms with $P(A_n) = 0$ may simply be dropped.

Let $X = \sum_n a_n I_{A_n}$ be a discrete random variable, where $\{A_n\}$ is a countable partition of $\Omega$, and suppose $\sum_n |a_n| P(A_n) < \infty$. Then the expected value of $X$, denoted by $EX$, is defined as
$$EX = \sum_{n=1}^{\infty} a_n P(A_n).$$
In particular, if $X$ has range $\{x_1, x_2, \dots\}$, then
$$EX = \sum_{n=1}^{\infty} x_n P\{X = x_n\}.$$
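The defining sum can be evaluated mechanically for any finite pmf. A minimal sketch, with an illustrative pmf of my own (the helper name is not from the text):

```python
def expected_value(pmf):
    """EX = sum over the range of x * P{X = x}, for a pmf given as a dict."""
    return sum(x * p for x, p in pmf.items())

# X takes values -1, 0, 2 with the probabilities below (illustrative)
pmf = {-1: 0.25, 0: 0.25, 2: 0.5}
mean = expected_value(pmf)  # -1*0.25 + 0*0.25 + 2*0.5 = 0.75
```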
Let $X$ be a Bernoulli($p$) random variable. Then
$$EX = 0 \cdot (1-p) + 1 \cdot p = p.$$
Hence $EX = p$.

Let $X$ be a Binomial($n, p$) random variable. Then
$$EX = \sum_{k=0}^{n} k \binom{n}{k} p^k (1-p)^{n-k} = np.$$
Hence $EX = np$.

Let $X$ be a Poisson($\lambda$) random variable. Then
$$EX = \sum_{k=0}^{\infty} k\, e^{-\lambda} \frac{\lambda^k}{k!} = \lambda.$$
Hence $EX = \lambda$.

Let $X$ be a Geometric($p$) random variable with $P\{X = k\} = p(1-p)^{k-1}$, $k = 1, 2, \dots$. Then
$$EX = \sum_{k=1}^{\infty} k\, p(1-p)^{k-1} = \frac{1}{p}.$$
Hence $EX = \frac{1}{p}$.
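As a quick numerical check of the four means above, the following sketch sums each pmf directly, truncating the infinite Poisson and Geometric sums; the helper names are illustrative, not from the text, and the Geometric pmf uses the support $\{1, 2, \dots\}$ convention:

```python
from math import comb, exp, factorial

def bernoulli_mean(p):
    # EX = 0*(1-p) + 1*p
    return 0 * (1 - p) + 1 * p

def binomial_mean(n, p):
    # EX = sum_k k * C(n,k) p^k (1-p)^(n-k)
    return sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

def poisson_mean(lam, terms=100):
    # EX = sum_k k * e^(-lam) lam^k / k!, truncated at `terms`
    return sum(k * exp(-lam) * lam**k / factorial(k) for k in range(terms))

def geometric_mean(p, terms=10000):
    # EX = sum_k k * p (1-p)^(k-1) over support {1, 2, ...}, truncated
    return sum(k * p * (1 - p)**(k - 1) for k in range(1, terms + 1))
```

Each truncated sum should agree with the closed forms $p$, $np$, $\lambda$, and $1/p$ to high accuracy.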
Theorem. The value of $EX$ does not depend on the representation of $X$, and $X \geq 0$ implies $EX \geq 0$.

Proof. (i) Let $X = \sum_n a_n I_{A_n}$ and $X = \sum_m b_m I_{B_m}$ be two representations of $X$. Then $A_n \cap B_m \neq \emptyset$ implies $a_n = b_m$, for all $n, m$. Therefore
$$\sum_n a_n P(A_n) = \sum_{n,m} a_n P(A_n \cap B_m) = \sum_{n,m} b_m P(A_n \cap B_m) = \sum_m b_m P(B_m).$$
(ii) Let $X \geq 0$. Then $X$ has a representation $X = \sum_n a_n I_{A_n}$ with each $a_n \geq 0$ (after dropping any $A_n$ with $P(A_n) = 0$). Now, by setting each term $a_n P(A_n) \geq 0$ and summing, we get
$$EX = \sum_n a_n P(A_n) \geq 0.$$
Hence the claim follows.
Definition 6.3. (Simple random variable) A random variable is said to be simple if it is discrete and its distribution function has only finitely many discontinuities.

Theorem 6.0.26 Let $X$ be a random variable on $(\Omega, \mathcal{F}, P)$ satisfying $X \geq 0$. Then there exists a sequence $\{X_n\}$ of simple random variables such that $X_n \uparrow X$.

Proof. For $n \geq 1$, define $X_n$ as follows:
$$X_n = \sum_{k=0}^{n2^n - 1} \frac{k}{2^n}\, I_{\left\{\frac{k}{2^n} \leq X < \frac{k+1}{2^n}\right\}} + n\, I_{\{X \geq n\}}.$$
Then each $X_n$ is simple and $X_n \uparrow X$.
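The dyadic construction in this proof can be illustrated numerically. The sketch below (with an illustrative function name of my own) evaluates the $n$-th approximation at a single value $x = X(\omega)$ and shows that the approximations increase to $x$:

```python
def simple_approx(x, n):
    """n-th dyadic simple-function approximation of a value x = X(omega) >= 0.

    Returns k/2^n where k/2^n <= x < (k+1)/2^n, capped at n when x >= n.
    """
    if x >= n:
        return float(n)
    k = int(x * 2**n)  # floor(x * 2^n)
    return k / 2**n

# X_n(omega) is non-decreasing in n and converges to X(omega)
values = [simple_approx(3.14159, n) for n in range(1, 8)]
```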
Proof. Since $\{EX_n\}$ is a non-decreasing sequence, $\lim_{n\to\infty} EX_n$ exists (possibly infinite). Also, since the limit should dominate $EY$ for every simple $Y \leq X$, it suffices to prove the following. Let $Y$ be simple and $Y \leq X$. Let
$$Y = \sum_{k=1}^{m} b_k I_{B_k},$$
where $\{B_1, \dots, B_m\}$ is a partition of $\Omega$. Fix $\varepsilon > 0$, and set for $n \geq 1$
$$B_k^n = \{\omega \in B_k : X_n(\omega) \geq b_k - \varepsilon\}.$$
Since $X_n \uparrow X$ and $X \geq Y$, we have $B_k^n \uparrow B_k$ as $n \to \infty$. Also
$$X_n \geq \sum_{k=1}^{m} (b_k - \varepsilon)\, I_{B_k^n}.$$
Hence, comparing the expectations of these simple random variables, we have
$$EX_n \geq \sum_{k=1}^{m} (b_k - \varepsilon)\, P(B_k^n). \qquad (6.0.1)$$
Now let $n \to \infty$ in (6.0.1); we get
$$\lim_{n\to\infty} EX_n \geq \sum_{k=1}^{m} b_k P(B_k) - \varepsilon = EY - \varepsilon.$$
Since $\varepsilon > 0$ is arbitrary, we get $\lim_{n\to\infty} EX_n \geq EY$.

For a non-negative random variable $X$, the expected value $EX$ is defined as
$$EX = \lim_{n\to\infty} EX_n, \qquad (6.0.2)$$
where $\{X_n\}$ is a sequence of simple random variables such that $X_n \uparrow X$.
Remark 6.0.9 Let $X \geq 0$ be a discrete random variable with range $\{x_1, x_2, \dots\}$. Then the definition (6.0.2) is consistent with the earlier definition of $EX$ for discrete random variables.

Proof. By using the simple functions given in the proof of Theorem 6.0.25, we get
$$EX = \lim_{n\to\infty} EX_n = \sum_{k=1}^{\infty} x_k P\{X = x_k\}, \qquad (6.0.3)$$
where $\{X_n\}$ is the dyadic approximating sequence. Hence the two definitions agree.
A random variable $X$ is said to have finite mean if $EX^+ < \infty$ and $EX^- < \infty$, in which case
$$EX = EX^+ - EX^-,$$
where $X^+ = \max\{X, 0\}$ and $X^- = \max\{-X, 0\}$. Note that $X = X^+ - X^-$ and $|X| = X^+ + X^-$.
Theorem 6.0.28
Let $X$ and $Y$ be random variables with finite mean. Then
$$E[X + Y] = EX + EY.$$
Proof. First let $U$ and $V$ be non-negative random variables, and let $U_n \uparrow U$ and $V_n \uparrow V$ be simple. Hence $U_n + V_n$ is simple and $U_n + V_n \uparrow U + V$. Then
$$E[U_n + V_n] = EU_n + EV_n \qquad (6.0.4)$$
and
$$E[U + V] = \lim_{n\to\infty} E[U_n + V_n]. \qquad (6.0.5)$$
The last equality follows by the arguments from the proof of Theorem 6.0.26. Combining (6.0.4) and (6.0.5), we get $E[U + V] = EU + EV$ for non-negative $U$ and $V$.

Now set $Z = X + Y$. Then $Z^+ - Z^- = X^+ - X^- + Y^+ - Y^-$, i.e.,
$$Z^+ + X^- + Y^- = Z^- + X^+ + Y^+,$$
and both sides are sums of non-negative random variables. Taking expectations of both sides using the additivity just proved and rearranging, as in the proof of Theorem 6.0.26, we complete the proof.

We state the following useful properties of expectation. The proofs follow by approximation arguments using the corresponding properties of simple random variables.
(ii) For $a, b \in \mathbb{R}$, $E[aX + bY] = aEX + bEY$.
(iii) Let $X$ and $Y$ have finite mean with $X \leq Y$. Then $EX \leq EY$.
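For finite discrete random variables, the linearity property can be checked mechanically. The sketch below computes expectations from a small joint pmf of my own (the values and helper names are illustrative, not from the text):

```python
# Joint pmf of (X, Y) on a small grid (illustrative values)
joint = {(0, 1): 0.2, (1, 1): 0.3, (0, 2): 0.1, (1, 2): 0.4}

def E(g):
    """Expectation of g(X, Y) under the joint pmf."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

a, b = 2.0, -3.0
lhs = E(lambda x, y: a * x + b * y)                   # E[aX + bY]
rhs = a * E(lambda x, y: x) + b * E(lambda x, y: y)   # aEX + bEY
```

Both sides agree, as property (ii) asserts.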
In the context of Riemann integration, one can recall the following convergence theorem: ``If $\{f_n\}$ is a sequence of continuous functions defined on $[a, b]$ such that $f_n \to f$ uniformly in $[a, b]$, then
$$\lim_{n\to\infty} \int_a^b f_n(x)\,dx = \int_a^b f(x)\,dx.$$''
i.e., to take a limit inside the integral, one needs uniform convergence of the functions. In many situations it is highly unlikely to get uniform convergence. In fact, uniform convergence is not required to take a limit inside an integral. This is illustrated in the following couple of theorems; their proofs are beyond the scope of this course.
(Monotone Convergence Theorem) Let $0 \leq X_n \uparrow X$. Then
$$\lim_{n\to\infty} EX_n = EX.$$
(Dominated Convergence Theorem) Let $X_n \to X$ a.s. and $|X_n| \leq Y$ for all $n$, where $EY < \infty$. Then
$$\lim_{n\to\infty} EX_n = EX.$$
[Here $X_n \to X$ a.s. means $P\{\omega : X_n(\omega) \to X(\omega)\} = 1$.]
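A standard illustration (my own, not from the text) is $f_n(x) = x^n$ on $[0, 1]$: the sequence converges to $0$ pointwise on $[0, 1)$ but not uniformly, is dominated by the integrable function $g(x) = 1$, and its integrals $1/(n+1)$ converge to $0$ as dominated convergence predicts. A numerical sketch:

```python
def riemann_integral(f, a, b, steps=20000):
    # midpoint-rule approximation of the integral of f over [a, b]
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

# integrals of f_n(x) = x**n for growing n; exact values are 1/(n+1) -> 0
integrals = [riemann_integral(lambda x, n=n: x**n, 0.0, 1.0) for n in (1, 5, 25, 125)]
```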
Definition 6.6. (Higher Order Moments) Let $X$ be a random variable and $k \geq 1$. Then $E[X^k]$, when it exists, is called the $k$th moment of $X$, and $E[(X - EX)^k]$ is called the $k$th central moment of $X$. The second central moment $E[(X - EX)^2]$ is called the variance. Now we state the following theorem whose proof is beyond the scope of this course.
Let $X$ be a continuous random variable with pdf $f$, and let $g : \mathbb{R} \to \mathbb{R}$ be such that $g(X)$ has finite mean. Then
$$E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\,dx.$$
The above theorem is generally referred to as the ``Law of the unconscious statistician,'' since users often treat the formula above as a definition itself.

Now we define the conditional expectation, denoted by $E[Y|X]$, of the random variable $Y$ given the information about the random variable $X$. If $Y$ is a Bernoulli($p$) random variable and $X$ is any discrete random variable, then we expect $E[Y|X = x]$ to be $P\{Y = 1 \mid X = x\}$, since we know that $EY = p = P\{Y = 1\}$; i.e., we expect
$$E[Y|X = x] = P\{Y = 1 \mid X = x\}.$$
Since conditional expectation should be linear and any discrete random variable can be written as a linear combination of Bernoulli random variables, we get the following definition.

Let $X$ and $Y$ be discrete random variables, and let $Y$ have finite mean. Then
$$E[Y \mid X = x] = \sum_{y} y\, P\{Y = y \mid X = x\},$$
for each $x$ with $P\{X = x\} > 0$.
Example 6.0.41
. Calculate Set
Let
For
Now
Therefore
i.e.,
Now
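Conditional expectations of this discrete type can be computed mechanically from a joint pmf. The sketch below uses an illustrative joint pmf of my own (not the one from Example 6.0.41):

```python
# Illustrative joint pmf of (X, Y)
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def cond_exp_Y_given_X(x):
    """E[Y | X = x] = sum_y y * P{Y = y | X = x}."""
    px = sum(p for (xx, _), p in joint.items() if xx == x)   # marginal P{X = x}
    return sum(y * p for (xx, y), p in joint.items() if xx == x) / px
```

For this pmf, $E[Y \mid X = 0] = 0.2/0.3 = 2/3$ and $E[Y \mid X = 1] = 0.4/0.7 = 4/7$.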
When $X$ and $Y$ are discrete random variables, $E[Y|X]$ is defined using the conditional pmf of $Y$ given $X$. Hence we define $E[Y|X]$ when $X$ and $Y$ are continuous random variables with joint pdf $f$ in a similar way, as follows.
Let $X$ and $Y$ be continuous random variables with joint pdf $f$. Then the conditional expectation of $Y$ given $X = x$ is
$$E[Y \mid X = x] = \int_{-\infty}^{\infty} y\, f_{Y|X}(y|x)\,dy,$$
where $f_{Y|X}(y|x) = \dfrac{f(x, y)}{f_X(x)}$ whenever $f_X(x) > 0$.
Remark 6.0.10 One can extend the definition of $E[Y|X]$ to the case where $X$ is any random variable (discrete, continuous or mixed) and $Y$ is any random variable with finite mean. But this is beyond the scope of this course.

Theorem 6.0.33 (i) Let $X, Y$ be discrete random variables with joint pmf $f$ and marginal pmfs $f_X$ and $f_Y$ respectively. If $Y$ has finite mean, then
$$EY = \sum_{x} E[Y \mid X = x]\, f_X(x).$$
(ii) Let $X, Y$ be continuous random variables with joint pdf $f$ and marginal pdfs $f_X$ and $f_Y$ respectively. If $Y$ has finite mean, then
$$EY = \int_{-\infty}^{\infty} E[Y \mid X = x]\, f_X(x)\,dx.$$
Find
. Note that
and
Also
elsewhere. Therefore
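A conditional expectation of the continuous type can be computed directly from the definition. The sketch below uses an assumed joint pdf $f(x, y) = x + y$ on the unit square (my own illustrative choice, not the pdf from the example above), for which $E[Y \mid X = x] = \dfrac{x/2 + 1/3}{x + 1/2}$:

```python
def cond_exp(joint_pdf, x, y_lo=0.0, y_hi=1.0, steps=20000):
    """E[Y | X = x] = (integral of y f(x,y) dy) / (integral of f(x,y) dy),
    approximated by the midpoint rule."""
    h = (y_hi - y_lo) / steps
    num = den = 0.0
    for i in range(steps):
        y = y_lo + (i + 0.5) * h
        fxy = joint_pdf(x, y)
        num += y * fxy * h
        den += fxy * h
    return num / den

f = lambda x, y: x + y     # assumed joint pdf on the unit square
value = cond_exp(f, 0.5)   # exact answer at x = 0.5 is 7/12
```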