
Moments and Moment Generating Functions

The kth moment of a random variable X is given by $E[X^k]$. The kth central moment of a random variable X is given by $E[(X - E[X])^k]$. The moment generating function of X is given by:

$$M_X(t) = E\left[e^{tX}\right] \qquad (9)$$

If X is non-negative, we can define its Laplace transform:

$$L_X(s) = E\left[e^{-sX}\right] \qquad (10)$$

Taking the power series expansion of $e^{tX}$ yields:

$$e^{tX} = \sum_{k=0}^{\infty} \frac{(tX)^k}{k!} \qquad (11)$$

Taking the expectation yields:

$$M_X(t) = \sum_{k=0}^{\infty} \frac{t^k E[X^k]}{k!} \qquad (12)$$

We can then find the kth moment of X by taking the kth derivative of the moment generating function and setting $t = 0$:

$$E[X^k] = \left.\frac{d^k}{dt^k} M_X(t)\right|_{t=0} \qquad (13)$$

For the Laplace transform, the moments can be found using:

$$E[X^k] = (-1)^k \left.\frac{d^k}{ds^k} L_X(s)\right|_{s=0} \qquad (14)$$
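To make equation (13) concrete, here is a minimal SymPy sketch (an illustration added here, not part of the original notes) that recovers the first two moments of a Normal(μ, σ²) variable from its standard mgf $e^{\mu t + \sigma^2 t^2/2}$:

```python
# Sketch: recover E[X] and E[X^2] from the mgf by differentiating at t = 0,
# per equation (13). Assumes X ~ Normal(mu, sigma^2), whose mgf is standard.
import sympy as sp

t, mu, sigma = sp.symbols('t mu sigma', positive=True)
M = sp.exp(mu*t + sigma**2 * t**2 / 2)          # mgf of Normal(mu, sigma^2)

EX  = sp.diff(M, t, 1).subs(t, 0)               # first moment
EX2 = sp.expand(sp.diff(M, t, 2).subs(t, 0))    # second moment

print(EX)    # mu
print(EX2)   # mu**2 + sigma**2
```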

Example: Let X be an exponential random variable with parameter $\lambda$, so that $f_X(x) = \lambda e^{-\lambda x}$ for $x \ge 0$. Its Laplace transform is:

$$L_X(s) = E\left[e^{-sX}\right] \qquad (15)$$

$$= \int_0^{\infty} e^{-sx}\, \lambda e^{-\lambda x}\, dx \qquad (16)$$

$$= \lambda \int_0^{\infty} e^{-(s+\lambda)x}\, dx \qquad (17)$$

$$= \left[-\frac{\lambda}{s+\lambda}\, e^{-(s+\lambda)x}\right]_0^{\infty} \qquad (18)$$

$$= \frac{\lambda}{s+\lambda} \qquad (19)$$

Applying (14) gives the moments:

$$E[X] = -\left.\frac{d}{ds}\frac{\lambda}{s+\lambda}\right|_{s=0} = \frac{1}{\lambda} \qquad (20)$$

$$E[X^2] = \left.\frac{d^2}{ds^2}\frac{\lambda}{s+\lambda}\right|_{s=0} = \frac{2}{\lambda^2} \qquad (21)$$

$$\mathrm{Var}(X) = E[X^2] - (E[X])^2 = \frac{1}{\lambda^2} \qquad (22)$$
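The computation above can be reproduced mechanically; here is a small SymPy sketch (an added check, not part of the original notes):

```python
# Sketch: Laplace transform of an exponential(lam) variable and its moments
# via E[X^k] = (-1)^k d^k/ds^k L_X(s) at s = 0, per equation (14).
import sympy as sp

s, x, lam = sp.symbols('s x lam', positive=True)
L = sp.integrate(sp.exp(-s*x) * lam * sp.exp(-lam*x), (x, 0, sp.oo))

EX  = -sp.diff(L, s, 1).subs(s, 0)      # 1/lam
EX2 =  sp.diff(L, s, 2).subs(s, 0)      # 2/lam**2
var = sp.simplify(EX2 - EX**2)          # 1/lam**2

print(sp.simplify(L))                   # lam/(lam + s)
print(EX, EX2, var)
```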

For X non-negative, integer-valued, and discrete, we can define the z-transform:

$$G_X(z) = E\left[z^X\right] = \sum_{k=0}^{\infty} z^k \, P(X = k) \qquad (23)$$

The first and second moments can be found as follows:

$$E[X] = \left.\frac{d}{dz} G_X(z)\right|_{z=1} \qquad (24)$$

$$E[X^2] = \left.\frac{d^2}{dz^2} G_X(z)\right|_{z=1} + E[X] \qquad (25)$$
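For a concrete discrete case (an added illustration, assuming X ~ Poisson(λ)), the z-transform and equations (24)-(25) can be checked with SymPy:

```python
# Sketch: z-transform (probability generating function) of a Poisson(lam)
# variable, and its first two moments per equations (24)-(25).
import sympy as sp

z, lam = sp.symbols('z lam', positive=True)
k = sp.symbols('k', integer=True, nonnegative=True)

G = sp.summation(z**k * sp.exp(-lam) * lam**k / sp.factorial(k), (k, 0, sp.oo))

EX  = sp.diff(G, z, 1).subs(z, 1)       # E[X] = lam
EX2 = sp.diff(G, z, 2).subs(z, 1) + EX  # E[X^2] = G''(1) + G'(1)

print(sp.simplify(G))                   # exp(lam*(z - 1))
print(sp.simplify(EX), sp.expand(EX2))  # lam, lam**2 + lam
```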

A property of transforms, known as the convolution theorem, is stated as follows: Let $X_1, \ldots, X_n$ be mutually independent random variables, and let $Y = X_1 + \cdots + X_n$. If $M_{X_i}(t)$ exists for all $i$, then $M_Y(t)$ exists, and:

$$M_Y(t) = \prod_{i=1}^{n} M_{X_i}(t) \qquad (26)$$

Example: Let X1 and X2 be independent exponentially distributed random variables with parameters $\lambda_1$ and $\lambda_2$ respectively (with $\lambda_1 \neq \lambda_2$). Let Y = X1 + X2. Find the distribution of Y.

The Laplace transforms for X1 and X2 are:

$$L_{X_1}(s) = \frac{\lambda_1}{s+\lambda_1} \qquad (27)$$

$$L_{X_2}(s) = \frac{\lambda_2}{s+\lambda_2} \qquad (28)$$

By the convolution theorem:

$$L_Y(s) = L_{X_1}(s)\, L_{X_2}(s) = \frac{\lambda_1 \lambda_2}{(s+\lambda_1)(s+\lambda_2)} \qquad (29)$$

Expanding this into partial fractions:

$$L_Y(s) = \frac{A}{s+\lambda_1} + \frac{B}{s+\lambda_2} \qquad (30)$$

where:

$$A = \frac{\lambda_1 \lambda_2}{\lambda_2 - \lambda_1} \qquad (31)$$

$$B = \frac{\lambda_1 \lambda_2}{\lambda_1 - \lambda_2} \qquad (32)$$

Taking the inverse Laplace transform yields:

$$f_Y(y) = \frac{\lambda_1 \lambda_2}{\lambda_2 - \lambda_1}\left(e^{-\lambda_1 y} - e^{-\lambda_2 y}\right), \qquad y \ge 0 \qquad (33)$$
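The partial-fraction step can be verified symbolically; the sketch below (an added check, not from the original notes) recovers A and B and hence the density in (33):

```python
# Sketch: verify the partial-fraction constants (31)-(32) for
# L_Y(s) = lam1*lam2 / ((s + lam1)(s + lam2)) with distinct rates.
import sympy as sp

s, lam1, lam2 = sp.symbols('s lam1 lam2', positive=True)
LY = lam1*lam2 / ((s + lam1)*(s + lam2))             # equation (29)

A = sp.cancel((s + lam1)*LY).subs(s, -lam1)          # lam1*lam2/(lam2 - lam1)
B = sp.cancel((s + lam2)*LY).subs(s, -lam2)          # lam1*lam2/(lam1 - lam2)
print(A, B)

# Since 1/(s + lam) is the Laplace transform of exp(-lam*y), A and B give the
# density f_Y(y) = A*exp(-lam1*y) + B*exp(-lam2*y) of equation (33) directly.
```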

Lesson 10: Moment Generating Functions


Introduction

The expected values E(X), E(X²), E(X³), ..., E(X^r) are called moments. As you have already experienced in some cases, the mean, μ = E(X), and the variance, σ² = Var(X) = E(X²) − μ², which are functions of moments, are sometimes difficult to find. Special functions, called moment-generating functions, can sometimes make finding the mean and variance of a random variable simpler. In this lesson, we'll first learn what a moment-generating function is, and then we'll learn how to use moment-generating functions (abbreviated "m.g.f."):

- to find moments and functions of moments, such as μ and σ²
- to identify which probability mass function a random variable X follows

Objectives

- To learn the definition of a moment-generating function.
- To find the moment-generating function of a binomial random variable.
- To learn how to use a moment-generating function to find the mean and variance of a random variable.
- To learn how to use a moment-generating function to identify which probability mass function a random variable X follows.
- To understand the steps involved in each of the proofs in the lesson.
- To be able to apply the methods learned in the lesson to new problems.

What is an MGF?
Definition. Let X be a discrete random variable with probability mass function f(x) and support S. Then:

$$M(t) = E\left(e^{tX}\right) = \sum_{x \in S} e^{tx} f(x)$$

is the moment generating function of X as long as the summation is finite for some interval of t around 0. That is, M(t) is the moment generating function ("m.g.f.") of X if there is a positive number h such that the above summation exists and is finite for −h < t < h.

Example
What is the moment generating function of a binomial random variable X? For X binomial with parameters n and p, the definition gives:

$$M(t) = \sum_{x=0}^{n} e^{tx} \binom{n}{x} p^x (1-p)^{n-x} = \sum_{x=0}^{n} \binom{n}{x} (pe^t)^x (1-p)^{n-x} = \left[(1-p) + pe^t\right]^n$$

by the binomial theorem, and the sum is finite for all real t. Once we find the moment generating function of a random variable, we can use it to... tada!... generate moments!
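As a check (an added illustration with a concrete n), SymPy reproduces the closed form and the familiar mean np:

```python
# Sketch: binomial mgf from the definition, checked against (1 - p + p*e^t)^n
# for a concrete n; the first derivative at t = 0 returns the mean n*p.
import sympy as sp

t, p = sp.symbols('t p', positive=True)
x = sp.symbols('x', integer=True, nonnegative=True)
n = 5                                                # concrete size for the check

M = sp.summation(sp.exp(t*x) * sp.binomial(n, x) * p**x * (1 - p)**(n - x), (x, 0, n))

print(sp.simplify(M - (1 - p + p*sp.exp(t))**n))     # 0
print(sp.simplify(sp.diff(M, t).subs(t, 0)))         # 5*p, i.e. n*p
```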

Moment generating function of a random variable


Prerequisites and complementary topics: Random variables and univariate probability distributions, Expected value, Moments of a random variable

Moment generating function - Definition


We start this lecture by giving a definition of moment generating function.

Definition. Let X be a random variable. If the expected value

$$E\left[e^{tX}\right]$$

exists and is finite for all real numbers t belonging to a closed interval $[-h, h]$, with $h > 0$, then we say that X possesses a moment generating function, and the function $M_X : [-h, h] \to \mathbb{R}$ defined by

$$M_X(t) = E\left[e^{tX}\right]$$

is called the moment generating function (or mgf) of X.

Moment generating function - Example


The following example shows how the moment generating function of an exponential random variable is calculated.

Example. Let X be an exponential random variable with parameter $\lambda$. Its support is the set of positive real numbers:

$$R_X = [0, \infty)$$

and its probability density function is:

$$f_X(x) = \begin{cases} \lambda e^{-\lambda x} & \text{if } x \in R_X \\ 0 & \text{otherwise} \end{cases}$$

Its moment generating function is computed as follows:

$$M_X(t) = E\left[e^{tX}\right] = \int_0^{\infty} e^{tx}\, \lambda e^{-\lambda x}\, dx = \lambda \int_0^{\infty} e^{-(\lambda - t)x}\, dx = \frac{\lambda}{\lambda - t} \quad \text{for } t < \lambda$$

Note that the above integral is finite for $t \in [-h, h]$ for any $0 < h < \lambda$, so that X possesses a moment generating function:

$$M_X(t) = \frac{\lambda}{\lambda - t}$$

Deriving moments with the mgf


The moment generating function takes its name from the fact that it can be used to derive the moments of X, as stated in the following proposition.

Proposition. If a random variable X possesses a moment generating function $M_X(t)$, then, for any $n \in \mathbb{N}$, the n-th moment of X (denote it by $\mu_n$) exists and is finite. Furthermore:

$$\mu_n = E[X^n] = \left.\frac{d^n M_X(t)}{dt^n}\right|_{t=0}$$

where $\left.\frac{d^n M_X(t)}{dt^n}\right|_{t=0}$ is the n-th derivative of $M_X(t)$ with respect to t, evaluated at the point $t = 0$. Proving the above proposition is quite complicated, because a lot of analytical details must be taken care of (see e.g. Pfeiffer, P. E. (1978) Concepts of probability theory, Courier Dover Publications). The intuition, however, is straightforward: since the expected value is a linear operator and differentiation is a linear operation, under appropriate conditions one can differentiate through the expected value, as follows:

$$\frac{d^n M_X(t)}{dt^n} = \frac{d^n}{dt^n} E\left[e^{tX}\right] = E\left[\frac{d^n}{dt^n} e^{tX}\right] = E\left[X^n e^{tX}\right]$$

which, evaluated at the point $t = 0$, yields:

$$\left.\frac{d^n M_X(t)}{dt^n}\right|_{t=0} = E\left[X^n e^{0}\right] = E[X^n] = \mu_n$$

Example. Continuing the example above, the moment generating function of an exponential random variable is:

$$M_X(t) = \frac{\lambda}{\lambda - t}$$

The expected value of X can be computed by taking the first derivative of the moment generating function:

$$\frac{dM_X(t)}{dt} = \frac{\lambda}{(\lambda - t)^2}$$

and evaluating it at $t = 0$:

$$E[X] = \left.\frac{dM_X(t)}{dt}\right|_{t=0} = \frac{\lambda}{\lambda^2} = \frac{1}{\lambda}$$

The second moment of X can be computed by taking the second derivative of the moment generating function:

$$\frac{d^2 M_X(t)}{dt^2} = \frac{2\lambda}{(\lambda - t)^3}$$

and evaluating it at $t = 0$:

$$E[X^2] = \left.\frac{d^2 M_X(t)}{dt^2}\right|_{t=0} = \frac{2\lambda}{\lambda^3} = \frac{2}{\lambda^2}$$

And so on for the higher moments.
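These two derivatives are easy to confirm mechanically; a small SymPy check (added here as an aside):

```python
# Sketch: confirm E[X] = 1/lam and E[X^2] = 2/lam^2 for M(t) = lam/(lam - t).
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)
M = lam / (lam - t)                       # mgf of exponential(lam)

print(sp.diff(M, t, 1).subs(t, 0))        # 1/lam      -> E[X]
print(sp.diff(M, t, 2).subs(t, 0))        # 2/lam**2   -> E[X^2]
```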

Characterization of a distribution via the mgf


The most important property of the moment generating function is the following.

Proposition (Equality of distributions). Let X and Y be two random variables. Denote by $F_X(x)$ and $F_Y(y)$ their distribution functions and by $M_X(t)$ and $M_Y(t)$ their moment generating functions. X and Y have the same distribution (i.e., $F_X(x) = F_Y(x)$ for any x) if and only if they have the same moment generating functions (i.e., $M_X(t) = M_Y(t)$ for any t).

While proving this proposition is beyond the scope of this introductory exposition, it must be stressed that it is extremely important and relevant from a practical viewpoint: in many cases where we need to prove that two distributions are equal, it is much easier to prove equality of the moment generating functions than equality of the distribution functions. Also note that equality of the distribution functions can be replaced in the proposition above by equality of the probability mass functions (if X and Y are discrete random variables) or by equality of the probability density functions (if X and Y are absolutely continuous random variables).

Moment generating function - More details


Moment generating function of a linear transformation
Let X be a random variable possessing a moment generating function $M_X(t)$. Define:

$$Y = a + bX$$

where $a, b \in \mathbb{R}$ are two constants and $b \neq 0$. Then the random variable Y possesses a moment generating function $M_Y(t)$, and:

$$M_Y(t) = e^{at} M_X(bt)$$

Proof: by the definition of moment generating function,

$$M_Y(t) = E\left[e^{t(a+bX)}\right] = e^{at}\, E\left[e^{(bt)X}\right] = e^{at} M_X(bt)$$
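A quick numerical sanity check of this rule (an added sketch with placeholder values, assuming X is exponential with rate 2, so that $M_X(t) = \lambda/(\lambda - t)$):

```python
# Sketch: Monte Carlo check of M_Y(t) = exp(a*t) * M_X(b*t) for Y = a + b*X.
import numpy as np

rng = np.random.default_rng(0)
lam, a, b, t = 2.0, 1.0, 0.5, 0.3        # placeholder parameters; b*t < lam
X = rng.exponential(scale=1/lam, size=1_000_000)

emp    = np.mean(np.exp(t * (a + b*X)))  # empirical M_Y(t)
theory = np.exp(a*t) * lam / (lam - b*t) # exp(a*t) * M_X(b*t), closed form
print(emp, theory)                       # agree up to Monte Carlo error
```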

Moment generating function of a sum of mutually independent random variables


Let $X_1$, ..., $X_n$ be mutually independent random variables. Let Z be their sum:

$$Z = \sum_{i=1}^{n} X_i$$

Then, the moment generating function of Z is the product of the moment generating functions of $X_1$, ..., $X_n$:

$$M_Z(t) = \prod_{i=1}^{n} M_{X_i}(t)$$

This is easily proved using the definition of moment generating function and the properties of mutually independent variables (mutual independence via expectations):

$$M_Z(t) = E\left[e^{tZ}\right] = E\left[e^{t\sum_{i=1}^{n} X_i}\right] = E\left[\prod_{i=1}^{n} e^{tX_i}\right] = \prod_{i=1}^{n} E\left[e^{tX_i}\right] = \prod_{i=1}^{n} M_{X_i}(t)$$
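A numerical illustration of the product rule (an added sketch; the two summands are arbitrary independent variables whose mgfs are finite near 0):

```python
# Sketch: Monte Carlo check that the empirical mgf of Z = X1 + X2 matches the
# product of the two empirical mgfs when X1 and X2 are independent.
import numpy as np

rng = np.random.default_rng(1)
t, N = 0.2, 1_000_000
X1 = rng.exponential(scale=1.0, size=N)              # rate-1 exponential
X2 = rng.gamma(shape=2.0, scale=0.5, size=N)         # independent gamma draw

mZ   = np.mean(np.exp(t * (X1 + X2)))                # empirical M_Z(t)
prod = np.mean(np.exp(t * X1)) * np.mean(np.exp(t * X2))
print(mZ, prod)                                      # close, since X1, X2 independent
```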
