
Chapter 6: Expectation and Conditional Expectation (Lectures 24 - 30)

In this chapter, we introduce the expected value, or mean, of a random variable. First we define expectation for discrete random variables and then for general random variables. Finally, we introduce the notion of conditional expectation using conditional probabilities.

Definition 6.1. Two random variables $X$ and $Y$ defined on a probability space $(\Omega, \mathcal{F}, P)$ are said to be equal almost surely (in short a.s.) if
$$P\{\omega \in \Omega \mid X(\omega) = Y(\omega)\} = 1.$$
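For instance (an illustration added here, not in the original notes): take $\Omega = [0,1]$ with the uniform probability. If $X(\omega) = 0$ for all $\omega$, and $Y(\omega) = 1$ for $\omega = 1/2$ and $Y(\omega) = 0$ otherwise, then $X$ and $Y$ differ at exactly one point, and $P\{\omega \mid X(\omega) = Y(\omega)\} = P([0,1] \setminus \{1/2\}) = 1$, so $X = Y$ a.s.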

Now we give a useful characterization of discrete random variables.

Theorem 6.0.23 Let $X$ be a discrete random variable defined on a probability space $(\Omega, \mathcal{F}, P)$. Then there exists a partition $\{A_n \mid n \geq 0\}$ of $\Omega$ and real numbers $\{a_n\}$ such that
$$X = \sum_{n} a_n I_{A_n} \quad \text{a.s.},$$
where the index set may be finite or countably infinite.

Proof. Let $F$ be the distribution function of $X$ and let $D = \{a_1, a_2, \dots\}$ be the set of all discontinuities of $F$. Here $D$ may be finite or countably infinite. Since $X$ is discrete, we have
$$P\{X \in D\} = 1.$$
Set
$$A_n = \{X = a_n\}, \quad n \geq 1.$$
Then $\{A_n \mid n \geq 1\}$ is pairwise disjoint and
$$P\Big(\bigcup_{n \geq 1} A_n\Big) = P\{X \in D\} = 1.$$
Now define
$$A_0 = \Big(\bigcup_{n \geq 1} A_n\Big)^{c}, \qquad a_0 = 0.$$
Then $\{A_n \mid n \geq 0\}$ is a partition of $\Omega$ and
$$X = \sum_{n \geq 0} a_n I_{A_n} \quad \text{a.s.}$$
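For instance (an illustration added here, not in the original notes): if $X$ is the outcome of a fair die, then taking $A_n = \{X = n\}$ and $a_n = n$ for $n = 1, \dots, 6$ gives the representation $X = \sum_{n=1}^{6} n\, I_{A_n}$.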

Remark 6.0.6 If $X$ is a discrete random variable on a probability space $(\Omega, \mathcal{F}, P)$, then the 'effective' range of $X$ is at most countable. Here 'effective' range means those values which are taken by $X$ with positive probability. This leads to the name 'discrete' random variable.

Remark 6.0.7 If $X$ is a discrete random variable, then one can assume without loss of generality that the $a_n$'s in the representation $X = \sum_n a_n I_{A_n}$ are distinct. Since if $a_n = a_m$ for some $n \neq m$, then set $\tilde{A}_n = A_n \cup A_m$ and $\tilde{A}_k = A_k$ for $k \neq n, m$.

Theorem 6.0.24 Let $X = \sum_n a_n I_{A_n} = \sum_m b_m I_{B_m}$ be such that each of $\{A_n\}$ and $\{B_m\}$ is a countable partition of $\Omega$. Then if $\sum_n |a_n| P(A_n) < \infty$,
$$\sum_n a_n P(A_n) = \sum_m b_m P(B_m).$$

Proof. For each $n, m$, set $C_{nm} = A_n \cap B_m$. Then clearly
$$A_n = \bigcup_m C_{nm}, \qquad B_m = \bigcup_n C_{nm}.$$
Also if $C_{nm} \neq \emptyset$, then $a_n = b_m$. Therefore
$$\sum_n a_n P(A_n) = \sum_n \sum_m a_n P(C_{nm}) = \sum_m \sum_n b_m P(C_{nm}) = \sum_m b_m P(B_m).$$
This completes the proof.

Definition 6.2. Let $X$ be a discrete random variable represented by $X = \sum_n a_n I_{A_n}$. Then the expectation of $X$, denoted by $EX$, is defined as
$$EX = \sum_n a_n P(A_n),$$
provided the right hand side series converges absolutely.

Remark 6.0.8 In view of Remark 6.0.7, if $X$ has range $\{a_1, a_2, \dots\}$, then
$$EX = \sum_n a_n P\{X = a_n\}.$$

Example 6.0.37 Let $X$ be a Bernoulli($p$) random variable. Then
$$P\{X = 1\} = p, \qquad P\{X = 0\} = 1 - p.$$
Hence
$$EX = 1 \cdot p + 0 \cdot (1 - p) = p.$$

Example 6.0.38 Let $X$ be a Binomial($n, p$) random variable. Then
$$P\{X = k\} = \binom{n}{k} p^k (1-p)^{n-k}, \quad k = 0, 1, \dots, n.$$
Hence
$$EX = \sum_{k=1}^{n} k \binom{n}{k} p^k (1-p)^{n-k} = np \sum_{k=1}^{n} \binom{n-1}{k-1} p^{k-1} (1-p)^{n-k} = np.$$
Here we used the identity $k \binom{n}{k} = n \binom{n-1}{k-1}$.

Example 6.0.39 Let $X$ be a Poisson($\lambda$) random variable. Then
$$P\{X = k\} = e^{-\lambda} \frac{\lambda^k}{k!}, \quad k = 0, 1, 2, \dots$$
Hence
$$EX = \sum_{k=1}^{\infty} k\, e^{-\lambda} \frac{\lambda^k}{k!} = \lambda e^{-\lambda} \sum_{k=1}^{\infty} \frac{\lambda^{k-1}}{(k-1)!} = \lambda.$$

Example 6.0.40 Let $X$ be a Geometric($p$) random variable. Then
$$P\{X = k\} = (1-p)^{k-1} p, \quad k = 1, 2, \dots$$
Hence
$$EX = p \sum_{k=1}^{\infty} k (1-p)^{k-1} = \frac{p}{p^2} = \frac{1}{p}.$$
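As a quick sanity check (a sketch added here, not part of the original notes), the following Python snippet verifies the four means numerically by truncated pmf sums; the parameter values and the truncation point N = 200 are arbitrary choices.

```python
import math

def mean(pmf_pairs):
    # Expectation as the (truncated) series sum of k * P{X = k}.
    return sum(k * prob for k, prob in pmf_pairs)

p, n, lam, N = 0.3, 10, 2.5, 200

bernoulli = [(0, 1 - p), (1, p)]
binomial = [(k, math.comb(n, k) * p**k * (1 - p)**(n - k)) for k in range(n + 1)]
poisson = [(k, math.exp(-lam) * lam**k / math.factorial(k)) for k in range(N)]
geometric = [(k, (1 - p)**(k - 1) * p) for k in range(1, N)]

print(mean(bernoulli))  # ~ p = 0.3
print(mean(binomial))   # ~ n*p = 3.0
print(mean(poisson))    # ~ lam = 2.5
print(mean(geometric))  # ~ 1/p = 3.333...
```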

Theorem 6.0.25 (Properties of expectation) Let $X$ and $Y$ be discrete random variables with finite means. Then
(i) If $X \geq 0$ a.s., then $EX \geq 0$.
(ii) For $a, b \in \mathbb{R}$, $E[aX + bY] = aEX + bEY$.

Proof. (i) Let $X = \sum_n a_n I_{A_n}$ be a representation of $X$. Then $X \geq 0$ a.s. implies $a_n \geq 0$ for all $n$ with $P(A_n) > 0$. Hence
$$EX = \sum_n a_n P(A_n) \geq 0.$$
(ii) Let $X$ have a representation $X = \sum_n a_n I_{A_n}$ and $Y$ a representation $Y = \sum_m b_m I_{B_m}$. Now by setting $C_{nm} = A_n \cap B_m$, one can use the same partition $\{C_{nm}\}$ for $X$ and $Y$. Therefore
$$aX + bY = \sum_{n,m} (a\, a_n + b\, b_m) I_{C_{nm}}.$$
Hence
$$E[aX + bY] = \sum_{n,m} (a\, a_n + b\, b_m) P(C_{nm}) = a \sum_n a_n P(A_n) + b \sum_m b_m P(B_m) = aEX + bEY.$$

Definition 6.3. (Simple random variable) A random variable is said to be simple if it is discrete and its distribution function has only finitely many discontinuities.

Theorem 6.0.26 Let $X$ be a random variable on $(\Omega, \mathcal{F}, P)$ satisfying $X \geq 0$. Then there exists a sequence $\{X_n\}$ of simple random variables such that
(i) For each $\omega \in \Omega$, $X_n(\omega) \leq X_{n+1}(\omega)$ for all $n \geq 1$;
(ii) For each $\omega \in \Omega$, $X_n(\omega) \to X(\omega)$ as $n \to \infty$.

Proof. For $n \geq 1$, define the simple random variable $X_n$ as follows:
$$X_n = \sum_{k=1}^{n 2^n} \frac{k-1}{2^n}\, I_{\left\{\frac{k-1}{2^n} \leq X < \frac{k}{2^n}\right\}} + n\, I_{\{X \geq n\}}.$$
Then the $X_n$'s satisfy (i) and (ii) above.
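As an aside (a sketch added here, not part of the original notes), the following Python snippet evaluates this dyadic approximation $X_n(\omega)$ pointwise; the value x stands for $X(\omega)$ and is an arbitrary choice for illustration.

```python
def simple_approx(x, n):
    # Dyadic approximation of a nonnegative value x = X(omega):
    # round x down to the grid of width 1/2**n, capping at level n.
    if x >= n:
        return float(n)
    k = int(x * 2**n)  # k / 2**n <= x < (k + 1) / 2**n
    return k / 2**n

x = 2.71828
for n in range(1, 8):
    print(n, simple_approx(x, n))  # nondecreasing in n, converges to x
```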

Lemma 6.0.3 Let $X$ be a non negative random variable and $\{X_n\}$ be a sequence of simple random variables satisfying (i) and (ii) of Theorem 6.0.26. Then $\lim_{n \to \infty} EX_n$ exists and is given by
$$\lim_{n \to \infty} EX_n = \sup\{EY \mid Y \text{ simple},\ 0 \leq Y \leq X\}.$$

Proof. Since $X_n \leq X_{n+1}$, we have $EX_n \leq EX_{n+1}$ (see exercise). Hence $\lim_{n \to \infty} EX_n$ exists. Also, since the $X_n$'s are simple and $0 \leq X_n \leq X$, clearly
$$\lim_{n \to \infty} EX_n \leq \sup\{EY \mid Y \text{ simple},\ 0 \leq Y \leq X\}.$$
Hence to complete the proof, it suffices to show that for $Y$ simple and $0 \leq Y \leq X$,
$$EY \leq \lim_{n \to \infty} EX_n.$$
Let
$$Y = \sum_{i=1}^{k} b_i I_{B_i},$$
where $\{B_1, \dots, B_k\}$ is a partition of $\Omega$. Fix $\varepsilon > 0$ and set, for $n \geq 1$ and $1 \leq i \leq k$,
$$B_{in} = \{\omega \in B_i \mid X_n(\omega) > b_i - \varepsilon\}.$$
Since $X_n \uparrow X \geq Y$, we have for each $i$,
$$B_{in} \subseteq B_{i\,n+1} \quad \text{and} \quad \bigcup_{n \geq 1} B_{in} = B_i.$$
Also
$$X_n \geq \sum_{i=1}^{k} (b_i - \varepsilon) I_{B_{in}}.$$
Hence, from the definition of the expectation of simple random variables, we have
$$EX_n \geq \sum_{i=1}^{k} (b_i - \varepsilon) P(B_{in}). \tag{6.0.1}$$
Using the continuity property of probability, we have
$$\lim_{n \to \infty} P(B_{in}) = P(B_i).$$
Now letting $n \to \infty$ in (6.0.1), we get
$$\lim_{n \to \infty} EX_n \geq \sum_{i=1}^{k} (b_i - \varepsilon) P(B_i) = EY - \varepsilon.$$
Since $\varepsilon > 0$ is arbitrary, we get
$$\lim_{n \to \infty} EX_n \geq EY.$$
This completes the proof.

Definition 6.4. The expectation of a non negative random variable $X$ is defined as
$$EX = \lim_{n \to \infty} EX_n, \tag{6.0.2}$$
where $\{X_n\}$ is a sequence of simple random variables as in Theorem 6.0.26.

Remark 6.0.9 One can define the expectation of a non negative random variable $X$ as
$$EX = \sup\{EY \mid Y \text{ simple},\ 0 \leq Y \leq X\}.$$
But we use Definition 6.4., since it is more handy.

Theorem 6.0.27 Let $X$ be a continuous non negative random variable with pdf $f$. Then
$$EX = \int_0^\infty x f(x)\,dx.$$

Proof. By using the simple functions given in the proof of Theorem 6.0.26, we get
$$EX_n = \sum_{k=1}^{n 2^n} \frac{k-1}{2^n} P\Big\{\frac{k-1}{2^n} \leq X < \frac{k}{2^n}\Big\} + n P\{X \geq n\} = \sum_{k=1}^{n 2^n} \frac{k-1}{2^n} f(\xi_k^n) \frac{1}{2^n} + n \int_n^\infty f(x)\,dx, \tag{6.0.3}$$
where $\xi_k^n \in \big[\frac{k-1}{2^n}, \frac{k}{2^n}\big]$ is the point given by the mean value theorem. Hence, letting $n \to \infty$,
$$EX = \lim_{n \to \infty} EX_n = \int_0^\infty x f(x)\,dx.$$
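As a numerical illustration (a sketch added here, not from the notes; the choice of the Exp(1) distribution, whose mean is $\int_0^\infty x e^{-x}\,dx = 1$, is arbitrary), one can watch the staircase approximations $EX_n$ increase toward $\int_0^\infty x f(x)\,dx$ by Monte Carlo:

```python
import numpy as np

# X ~ Exp(1), so EX = 1.  The dyadic approximations E X_n should
# increase toward this value as n grows.
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)

for n in range(1, 8):
    # X_n: floor X to the dyadic grid of width 1/2**n and cap at n.
    xn = np.minimum(np.floor(x * 2**n) / 2**n, n)
    print(n, xn.mean())  # increases toward EX = 1
```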

Definition 6.5. Let $X$ be a random variable on $(\Omega, \mathcal{F}, P)$. The expectation $EX$ is said to exist if either $EX^+$ or $EX^-$ is finite. In this case the mean or expectation of $X$ is defined as
$$EX = EX^+ - EX^-,$$
where
$$X^+ = \max\{X, 0\}, \qquad X^- = \max\{-X, 0\}.$$
Note that $X^+$ is the positive part and $X^-$ is the negative part of $X$.

Theorem 6.0.28 Let $X$ be a continuous random variable with finite mean and pdf $f$. Then
$$EX = \int_{-\infty}^{\infty} x f(x)\,dx.$$

Proof. Set
$$X_n^+ = \sum_{k=1}^{n 2^n} \frac{k-1}{2^n}\, I_{\left\{\frac{k-1}{2^n} \leq X^+ < \frac{k}{2^n}\right\}} + n\, I_{\{X^+ \geq n\}}.$$
Then $\{X_n^+\}$ is a sequence of simple random variables such that $X_n^+ \uparrow X^+$. Similarly, set
$$X_n^- = \sum_{k=1}^{n 2^n} \frac{k-1}{2^n}\, I_{\left\{\frac{k-1}{2^n} \leq X^- < \frac{k}{2^n}\right\}} + n\, I_{\{X^- \geq n\}}.$$
Then $X_n^- \uparrow X^-$. Now
$$EX^+ = \lim_{n \to \infty} EX_n^+ = \int_0^\infty x f(x)\,dx \tag{6.0.4}$$
and
$$EX^- = \lim_{n \to \infty} EX_n^- = -\int_{-\infty}^0 x f(x)\,dx. \tag{6.0.5}$$
The last equalities follow by the arguments from the proof of Theorem 6.0.27. Combining (6.0.4) and (6.0.5), we get
$$EX = EX^+ - EX^- = \int_0^\infty x f(x)\,dx + \int_{-\infty}^0 x f(x)\,dx = \int_{-\infty}^{\infty} x f(x)\,dx.$$
This completes the proof.

We state the following useful properties of expectation. The proofs follow by an approximation argument using the corresponding properties of simple random variables.

Theorem 6.0.29 Let $X$ and $Y$ be random variables with finite mean. Then
(i) If $X \geq 0$ a.s., then $EX \geq 0$.
(ii) For $a, b \in \mathbb{R}$, $E[aX + bY] = aEX + bEY$.
(iii) Let $Z$ be a random variable such that $|Z| \leq X$ a.s. Then $Z$ has finite mean and $|EZ| \leq EX$.

In the context of Riemann integration, one can recall the following convergence theorem: ``If $\{f_n\}$ is a sequence of continuous functions defined on the interval $[a, b]$ such that $f_n \to f$ uniformly in $[a, b]$, then
$$\lim_{n \to \infty} \int_a^b f_n(x)\,dx = \int_a^b f(x)\,dx.\text{''}$$
i.e., to take a limit inside the integral, one needs uniform convergence of the functions. In many situations it is highly unlikely to get uniform convergence. In fact, uniform convergence is not required to take a limit inside an integral. This is illustrated in the following couple of theorems, whose proofs are beyond the scope of this course.
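For instance (an illustrative example added here, not in the original notes): on $[0,1]$, $f_n(x) = x^n$ converges pointwise, but not uniformly, to the function $f$ with $f(x) = 0$ for $0 \leq x < 1$ and $f(1) = 1$; nevertheless,
$$\lim_{n \to \infty} \int_0^1 x^n\,dx = \lim_{n \to \infty} \frac{1}{n+1} = 0 = \int_0^1 f(x)\,dx.$$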

Theorem 6.0.30 (Monotone convergence theorem) Let $\{X_n\}$ be an increasing sequence of nonnegative random variables such that $X_n \uparrow X$. Then
$$\lim_{n \to \infty} EX_n = EX.$$
[Here $X_n \uparrow X$ means $X_n(\omega) \leq X_{n+1}(\omega)$ for all $n$ and $\lim_{n \to \infty} X_n(\omega) = X(\omega)$ for each $\omega \in \Omega$.]

Theorem 6.0.31 (Dominated Convergence Theorem) Let $X, X_1, X_2, \dots$ be random variables such that
(i) $X_n \to X$ a.s.;
(ii) $|X_n| \leq Y$ a.s. for all $n$;
(iii) $Y$ has finite mean.
Then
$$\lim_{n \to \infty} EX_n = EX.$$
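A standard cautionary example (added for illustration, not from the notes) shows that the domination hypothesis cannot be dropped: take $\Omega = (0,1)$ with the uniform probability and $X_n = n\, I_{(0, 1/n)}$. Then $X_n \to 0$ a.s., but $EX_n = n \cdot \frac{1}{n} = 1$ for every $n$, so $\lim_n EX_n = 1 \neq 0 = E[\lim_n X_n]$; here no random variable $Y$ with finite mean dominates all the $X_n$.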

Definition 6.6. (Higher Order Moments) Let $X$ be a random variable. Then $E[X^n]$ is called the $n$th moment of $X$ and $E[(X - EX)^n]$ is called the $n$th central moment of $X$. The second central moment is called the variance.
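A useful identity, recorded here for convenience (it follows in one line from linearity of expectation, Theorem 6.0.29, and is not in the original text): writing $\mu = EX$,
$$\mathrm{Var}(X) = E[(X - \mu)^2] = E[X^2] - 2\mu\, EX + \mu^2 = E[X^2] - (EX)^2.$$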

Now we state the following theorem, whose proof is beyond the scope of this course.

Theorem 6.0.32 Let $X$ be a continuous random variable with pdf $f$ and let $g$ be a continuous function such that the integral $\int_{-\infty}^{\infty} |g(x)| f(x)\,dx$ is finite. Then
$$E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\,dx.$$

The above theorem is generally referred to as the ``Law of the unconscious statistician'', since users often treat the above as a definition itself. Now we define the conditional expectation, denoted by $E[Y|X]$, of the random variable $Y$ given the information about the random variable $X$. If $Y$ is a Bernoulli($p$) random variable and $X$ is any discrete random variable, then we expect $E[Y|X = x]$ to be $P\{Y = 1 \mid X = x\}$, since we know that $EY = p = P\{Y = 1\}$. i.e.,
$$E[Y \mid X = x] = 1 \cdot f_{Y|X}(1|x) + 0 \cdot f_{Y|X}(0|x),$$
where $f_{Y|X}$ is the conditional pmf of $Y$ given $X$. Now since we expect conditional expectation to be linear, and any discrete random variable can be written as a linear combination of Bernoulli random variables, we get the following definition.

Definition 6.7. Let $X, Y$ be discrete random variables with conditional pmf $f_{Y|X}(y|x)$ of $Y$ given $X = x$. Then the conditional expectation of $Y$ given $X = x$ is defined as
$$E[Y \mid X = x] = \sum_y y\, f_{Y|X}(y|x).$$

Example 6.0.41 Let $X_1, X_2$ be independent random variables with geometric distribution of parameter $p$, where $P\{X_i = k\} = (1-p)^{k-1} p$, $k = 1, 2, \dots$. Calculate $E[X_1 \mid X_1 + X_2 = n]$. Set $Z = X_1 + X_2$. For $1 \leq k \leq n - 1$,
$$f_{X_1|Z}(k|n) = \frac{P\{X_1 = k,\, Z = n\}}{P\{Z = n\}} = \frac{P\{X_1 = k\}\, P\{X_2 = n - k\}}{P\{Z = n\}}.$$
Now
$$P\{Z = n\} = \sum_{k=1}^{n-1} P\{X_1 = k\} P\{X_2 = n - k\} = \sum_{k=1}^{n-1} (1-p)^{k-1} p\, (1-p)^{n-k-1} p = (n-1)\, p^2 (1-p)^{n-2}.$$
Therefore
$$f_{X_1|Z}(k|n) = \frac{p(1-p)^{k-1}\, p(1-p)^{n-k-1}}{(n-1)\, p^2 (1-p)^{n-2}} = \frac{1}{n-1}, \quad 1 \leq k \leq n-1.$$
i.e., given $X_1 + X_2 = n$, $X_1$ is uniformly distributed over $\{1, \dots, n-1\}$. Now
$$E[X_1 \mid X_1 + X_2 = n] = \sum_{k=1}^{n-1} k \cdot \frac{1}{n-1} = \frac{n}{2}.$$
When $X$ and $Y$ are discrete random variables, $E[Y|X]$ is defined using the conditional pmf of $Y$ given $X$. Hence we define $E[Y|X]$, when $X$ and $Y$ are continuous random variables with joint pdf $f$, in a similar way as follows.

Definition 6.8. Let $X, Y$ be continuous random variables with conditional pdf $f_{Y|X}(y|x)$ of $Y$ given $X = x$. Then the conditional expectation of $Y$ given $X = x$ is defined as
$$E[Y \mid X = x] = \int_{-\infty}^{\infty} y\, f_{Y|X}(y|x)\,dy.$$

Remark 6.0.10 One can extend the definition of $E[Y|X]$ to the case when $X$ is any random variable (discrete, continuous or mixed) and $Y$ is any random variable with finite mean. But it is beyond the scope of this course.

Theorem 6.0.33 (i) Let $X, Y$ be discrete random variables with joint pmf $f$ and marginal pmfs $f_X$ and $f_Y$ respectively. Then if $Y$ has finite mean,
$$EY = \sum_x E[Y \mid X = x]\, f_X(x).$$
(ii) Let $X, Y$ be continuous random variables with joint pdf $f$ and marginal pdfs $f_X$ and $f_Y$ respectively. Then if $Y$ has finite mean,
$$EY = \int_{-\infty}^{\infty} E[Y \mid X = x]\, f_X(x)\,dx.$$

Proof. We only prove (ii).
$$\int_{-\infty}^{\infty} E[Y \mid X = x]\, f_X(x)\,dx = \int_{-\infty}^{\infty} \Big( \int_{-\infty}^{\infty} y\, \frac{f(x,y)}{f_X(x)}\,dy \Big) f_X(x)\,dx = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y\, f(x,y)\,dy\,dx = EY.$$
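As a numerical illustration of part (i) (a sketch added here, not from the notes; the joint pmf below is an arbitrary made-up table), both sides of the identity agree exactly:

```python
import numpy as np

# Small joint pmf for (X, Y) with X in {0, 1} and Y in {0, 1, 2}.
joint = np.array([[0.10, 0.20, 0.10],   # row x = 0
                  [0.25, 0.15, 0.20]])  # row x = 1
y_vals = np.array([0.0, 1.0, 2.0])

f_X = joint.sum(axis=1)                        # marginal pmf of X
cond_mean = (joint * y_vals).sum(axis=1) / f_X  # E[Y | X = x]

print((cond_mean * f_X).sum())  # sum_x E[Y|X=x] f_X(x)
print((joint * y_vals).sum())   # EY -- the two agree
```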

Example 6.0.42 Let $X, Y$ be continuous random variables with a given joint pdf $f$. Find $E[X|Y]$ and hence calculate $EX$. Note that the marginal pdf of $Y$ is
$$f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx$$
for $y$ in the support of $Y$, and $f_Y(y) = 0$ elsewhere. Hence, for $y$ with $f_Y(y) > 0$,
$$f_{X|Y}(x|y) = \frac{f(x, y)}{f_Y(y)},$$
and $f_{X|Y}(x|y) = 0$ elsewhere. Therefore
$$E[X \mid Y = y] = \int_{-\infty}^{\infty} x\, f_{X|Y}(x|y)\,dx, \qquad EX = \int_{-\infty}^{\infty} E[X \mid Y = y]\, f_Y(y)\,dy.$$
