Administrative details

STAT/MTHE 353: Multiple Random Variables
Instructor: Tamas Linder
Email: linder@mast.queensu.ca
Office: Jeffery 401
Phone: 613-533-2417
Office hours: Wednesday 2:30–3:30 pm
Queen's University, Winter 2012

Class web site: http://www.mast.queensu.ca/~stat353
All homework and solutions will be posted here. Check frequently for new announcements.
Text: Fundamentals of Probability with Stochastic Processes, 3rd ed., by S. Ghahramani, Prentice Hall.

Lecture slides will be posted on the class web site. The slides are not self-contained; they only cover parts of the material.

Homework: 9 HW assignments, due Friday before noon in my mailbox (Jeffery 401). No late homework will be accepted!

Evaluation: the better of
  Homework 20%, midterm 20%, final exam 60%
  Homework 20%, final exam 80%

Midterm Exam: Thursday, February 16, in class (9:30–10:30 am)

Review

S is the sample space. P is a probability measure on S: P is a function from a collection of subsets of S (called the events) to [0, 1]. P satisfies the axioms of probability.

A random variable is a function X : S → R. The distribution of X is the probability measure associated with X:

    P(X ∈ A) = P({s : X(s) ∈ A}),   for any "reasonable" A ⊂ R.
Here are the usual ways to describe the distribution of X:

Distribution function (cdf): F : R → [0, 1] defined by F(x) = P(X ≤ x).

Joint Distributions
Marginal joint probability mass functions

Let {i_1, ..., i_k} ⊂ {1, ..., n}. Then the marginal joint pmf of (X_{i_1}, ..., X_{i_k}) can be obtained from p_X(x) = p_{X_1,...,X_n}(x_1, ..., x_n) as

    p_{X_{i_1},...,X_{i_k}}(x_{i_1}, ..., x_{i_k})
      = P(X_{i_1} = x_{i_1}, ..., X_{i_k} = x_{i_k})
      = P(X_{i_1} = x_{i_1}, ..., X_{i_k} = x_{i_k}, X_{j_1} ∈ R, ..., X_{j_{n-k}} ∈ R)
        where {j_1, ..., j_{n-k}} = {1, ..., n} \ {i_1, ..., i_k}
      = Σ_{x_{j_1}} ··· Σ_{x_{j_{n-k}}} p_{X_1,...,X_n}(x_1, ..., x_n)

Thus the joint pmf of X_{i_1}, ..., X_{i_k} is obtained by summing p_{X_1,...,X_n} over all possible values of the complementary variables x_{j_1}, ..., x_{j_{n-k}}.

Example: In an urn there are n_i objects of type i for i = 1, ..., r. The total number of objects is n_1 + ··· + n_r = N. We randomly draw n objects (n ≤ N) without replacement. Let X_i = # of objects of type i drawn. Find the joint pmf of (X_1, ..., X_r). Also find the marginal distribution of each X_i, i = 1, ..., r.

Solution: . . .
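A small numerical sketch of the marginalization identity above, using the urn example with assumed values r = 3, counts (3, 4, 5), and n = 4 (these specific numbers are illustrative, not from the slides). The joint pmf is multivariate hypergeometric, and summing out the complementary variables recovers the univariate hypergeometric marginal:

```python
from math import comb
from itertools import product

# Urn example with assumed values: r = 3 types with counts (3, 4, 5),
# N = 12 objects in total, draw n = 4 without replacement.
counts = (3, 4, 5)
N, n = sum(counts), 4

# Joint pmf of (X1, X2, X3): p(k1,k2,k3) = C(3,k1)C(4,k2)C(5,k3)/C(12,4)
# supported on k1 + k2 + k3 = 4.
joint = {}
for ks in product(*(range(c + 1) for c in counts)):
    if sum(ks) == n:
        joint[ks] = (comb(counts[0], ks[0]) * comb(counts[1], ks[1])
                     * comb(counts[2], ks[2])) / comb(N, n)

assert abs(sum(joint.values()) - 1.0) < 1e-12   # a valid pmf

# Marginal pmf of X1 by summing out the complementary variables x2, x3 ...
marg = {}
for ks, p in joint.items():
    marg[ks[0]] = marg.get(ks[0], 0.0) + p

# ... agrees with the univariate hypergeometric pmf of X1.
for k in range(counts[0] + 1):
    direct = comb(counts[0], k) * comb(N - counts[0], n - k) / comb(N, n)
    assert abs(marg[k] - direct) < 1e-12
```

The same summing-out pattern works for any discrete joint pmf stored as a dictionary keyed by value tuples.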
Marginal joint probability density functions

Let X_1, ..., X_n be jointly continuous with pdf f_X = f. As before, let {i_1, ..., i_k} ⊂ {1, ..., n} and {j_1, ..., j_{n-k}} = {1, ..., n} \ {i_1, ..., i_k}. Let B ⊂ R^k. Then

    P((X_{i_1}, ..., X_{i_k}) ∈ B) = ∫_B ( ∫_{R^{n-k}} f(x_1, ..., x_n) dx_{j_1} ··· dx_{j_{n-k}} ) dx_{i_1} ··· dx_{i_k}

In conclusion, for {i_1, ..., i_k} ⊂ {1, ..., n}, the marginal joint pdf of (X_{i_1}, ..., X_{i_k}) is

    f_{X_{i_1},...,X_{i_k}}(x_{i_1}, ..., x_{i_k}) = ∫_{R^{n-k}} f_{X_1,...,X_n}(x_1, ..., x_n) dx_{j_1} ··· dx_{j_{n-k}}

i.e., it is obtained by integrating out the complementary variables.

Similarly, the marginal joint cdf is obtained by taking limits:

    F_{X_{i_1},...,X_{i_k}}(x_{i_1}, ..., x_{i_k})
      = P(X_{i_1} ≤ x_{i_1}, ..., X_{i_k} ≤ x_{i_k})
      = P(X_{i_1} ≤ x_{i_1}, ..., X_{i_k} ≤ x_{i_k}, X_{j_1} < ∞, ..., X_{j_{n-k}} < ∞)
      = lim_{x_{j_1}→∞} ··· lim_{x_{j_{n-k}}→∞} F_{X_1,...,X_n}(x_1, ..., x_n)

That is, we let the variables complementary to x_{i_1}, ..., x_{i_k} converge to ∞.

Example: Find the marginal pdfs of X_i, i = 1, 2, 3, and the marginal joint pdfs of (X_i, X_j), i ≠ j.

Solution: . . .

Example: With X_1, X_2, X_3 as in the previous problem, consider the quadratic equation

    X_1 y^2 + X_2 y + X_3 = 0

in the variable y. Find the probability that both roots are real.

Solution: . . .
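The event "both roots are real" is the event that the discriminant X_2^2 − 4 X_1 X_3 is nonnegative. The joint pdf of (X_1, X_2, X_3) from the previous problem is not reproduced here, so the Monte Carlo sketch below *assumes* X_1, X_2, X_3 are i.i.d. Uniform(0, 1) purely for illustration:

```python
import random

# Monte Carlo sketch: estimate P(both roots of X1 y^2 + X2 y + X3 = 0 are
# real) = P(X2^2 - 4 X1 X3 >= 0), under the ASSUMED distribution
# X1, X2, X3 i.i.d. Uniform(0, 1). For this case the known exact value is
# 5/36 + ln(2)/6 ≈ 0.2544.
random.seed(0)
trials = 200_000
hits = 0
for _ in range(trials):
    x1, x2, x3 = random.random(), random.random(), random.random()
    if x2 * x2 - 4.0 * x1 * x3 >= 0.0:   # nonnegative discriminant
        hits += 1
est = hits / trials
print(f"P(real roots) ≈ {est:.4f}")
```

With a different joint pdf only the sampling line changes; the discriminant test stays the same.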
Independence

Definition: The random variables X_1, ..., X_n are independent if for all "reasonable" A_1, ..., A_n ⊂ R,

    P(X_1 ∈ A_1, ..., X_n ∈ A_n) = P(X_1 ∈ A_1) ··· P(X_n ∈ A_n)

(iii) Suppose g_i : R → R, i = 1, ..., n are "reasonable" functions. If X_1, ..., X_n are independent, then so are g_1(X_1), ..., g_n(X_n).

Proof: For A_1, ..., A_n ⊂ R,

    P(g_1(X_1) ∈ A_1, ..., g_n(X_n) ∈ A_n) = P(X_1 ∈ g_1^{-1}(A_1), ..., X_n ∈ g_n^{-1}(A_n))
      = P(X_1 ∈ g_1^{-1}(A_1)) ··· P(X_n ∈ g_n^{-1}(A_n))
      = P(g_1(X_1) ∈ A_1) ··· P(g_n(X_n) ∈ A_n)   □
Theorem 1
Let F be the joint cdf of the random variables X_1, ..., X_n. Then X_1, ..., X_n are independent if and only if F is the product of the marginal cdfs of the X_i, i.e., for all (x_1, ..., x_n) ∈ R^n,

    F(x_1, ..., x_n) = F_{X_1}(x_1) F_{X_2}(x_2) ··· F_{X_n}(x_n)

The converse, that F(x_1, ..., x_n) = Π_{i=1}^n F_{X_i}(x_i) for all (x_1, ..., x_n) ∈ R^n implies independence, is beyond the scope of this class. □

Theorem 2
Let X_1, ..., X_n be discrete r.v.'s with joint pmf p. Then X_1, ..., X_n are independent if and only if p is the product of the marginal pmfs of the X_i, i.e., for all (x_1, ..., x_n) ∈ R^n,

    p(x_1, ..., x_n) = p_{X_1}(x_1) p_{X_2}(x_2) ··· p_{X_n}(x_n)
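Theorem 2 gives a finite check for independence of discrete random variables: compute the marginals and test whether the joint pmf factors. A sketch for two variables (the pmfs below are illustrative values, not from the slides):

```python
from itertools import product

def marginals(joint, supports):
    """Marginal pmf of each coordinate, by summing out the other coordinates."""
    return [{v: sum(p for x, p in joint.items() if x[i] == v) for v in sup}
            for i, sup in enumerate(supports)]

def is_independent(joint, supports, tol=1e-12):
    """Theorem 2 check for TWO variables: does p(x1,x2) = p_X1(x1) p_X2(x2)?"""
    m1, m2 = marginals(joint, supports)
    return all(abs(joint.get((a, b), 0.0) - m1[a] * m2[b]) < tol
               for a, b in product(*supports))

sup = ([0, 1], [0, 1])
# Independent Bernoulli(1/2) pair: the joint pmf is the product of marginals.
indep = {(a, b): 0.25 for a in sup[0] for b in sup[1]}
# Perfectly correlated pair: same marginals, but the pmf does not factor.
dep = {(0, 0): 0.5, (1, 1): 0.5}

print(is_independent(indep, sup), is_independent(dep, sup))  # True False
```

The same idea extends to n variables by checking the full n-fold product, at a cost exponential in n.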
Proof cont'd: Conversely, suppose that p(x_1, ..., x_n) = Π_{i=1}^n p_{X_i}(x_i) for any x_1, ..., x_n. Then, for any A_1, A_2, ..., A_n ⊂ R,

    P(X_1 ∈ A_1, ..., X_n ∈ A_n) = Σ_{x_1 ∈ A_1} ··· Σ_{x_n ∈ A_n} p(x_1, ..., x_n)
      = Σ_{x_1 ∈ A_1} ··· Σ_{x_n ∈ A_n} p_{X_1}(x_1) ··· p_{X_n}(x_n)
      = ( Σ_{x_1 ∈ A_1} p_{X_1}(x_1) ) ( Σ_{x_2 ∈ A_2} p_{X_2}(x_2) ) ··· ( Σ_{x_n ∈ A_n} p_{X_n}(x_n) )
      = P(X_1 ∈ A_1) P(X_2 ∈ A_2) ··· P(X_n ∈ A_n)

Thus X_1, ..., X_n are independent. □

Theorem 3
Let X_1, ..., X_n be jointly continuous r.v.'s with joint pdf f. Then X_1, ..., X_n are independent if and only if f is the product of the marginal pdfs of the X_i, i.e., for all (x_1, ..., x_n) ∈ R^n,

    f(x_1, ..., x_n) = f_{X_1}(x_1) f_{X_2}(x_2) ··· f_{X_n}(x_n).

Proof: Assume f(x_1, ..., x_n) = Π_{i=1}^n f_{X_i}(x_i) for any x_1, ..., x_n. Then for any A_1, A_2, ..., A_n ⊂ R,

    P(X_1 ∈ A_1, ..., X_n ∈ A_n) = ∫_{A_1} ··· ∫_{A_n} f(x_1, ..., x_n) dx_1 ··· dx_n
      = ∫_{A_1} ··· ∫_{A_n} f_{X_1}(x_1) ··· f_{X_n}(x_n) dx_1 ··· dx_n
      = ( ∫_{A_1} f_{X_1}(x_1) dx_1 ) ··· ( ∫_{A_n} f_{X_n}(x_n) dx_n )

so X_1, ..., X_n are independent.
Proof cont'd: Conversely, suppose X_1, ..., X_n are independent. Since

    F(x_1, ..., x_n) = P(X_1 ≤ x_1, X_2 ≤ x_2, ..., X_n ≤ x_n)
      = ∫_{-∞}^{x_1} ··· ∫_{-∞}^{x_n} f(t_1, ..., t_n) dt_1 ··· dt_n

we have

    ∂^n/(∂x_1 ··· ∂x_n) F(x_1, ..., x_n) = f(x_1, ..., x_n)

Hence, using Theorem 1,

    f(x_1, ..., x_n) = ∂^n/(∂x_1 ··· ∂x_n) F(x_1, ..., x_n)
      = ∂^n/(∂x_1 ··· ∂x_n) F_{X_1}(x_1) ··· F_{X_n}(x_n)
      = f_{X_1}(x_1) ··· f_{X_n}(x_n)   □

Example: . . .
Expectations Involving Multiple Random Variables

Recall that the expectation of a random variable X is

    E(X) = Σ_x x p(x)              if X is discrete
    E(X) = ∫_{-∞}^{∞} x f(x) dx    if X is continuous

if the sum or the integral exists in the sense that Σ_x |x| p(x) < ∞ or ∫_{-∞}^{∞} |x| f(x) dx < ∞.

Example: . . .

If X = (X_1, ..., X_n)^T is a random vector, we sometimes use the notation

    E(X) = ( E(X_1), ..., E(X_n) )^T

For X_1, ..., X_n discrete, we still have E(X) = Σ_x x p(x) with the understanding that

    Σ_x x p(x) = Σ_{(x_1,...,x_n)} (x_1, ..., x_n)^T p(x_1, ..., x_n)
      = ( Σ_{x_1} x_1 p_{X_1}(x_1), Σ_{x_2} x_2 p_{X_2}(x_2), ..., Σ_{x_n} x_n p_{X_n}(x_n) )^T
      = ( E(X_1), ..., E(X_n) )^T

Similarly, for jointly continuous X_1, ..., X_n,

    E(X) = ∫_{R^n} x f(x) dx
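A quick numerical sketch of the identity above on an assumed joint pmf of (X_1, X_2) (illustrative values, not from the slides): summing x p(x) over the joint pmf gives the same vector as computing each E(X_i) from the corresponding marginal sum.

```python
# Assumed joint pmf of (X1, X2), stored as {(x1, x2): probability}.
joint = {(0, 1): 0.2, (1, 1): 0.3, (1, 2): 0.5}

# E(X) computed directly as sum_x x p(x) over the joint pmf ...
EX = [sum(x[i] * p for x, p in joint.items()) for i in range(2)]

# ... equals the vector of componentwise expectations:
EX1 = sum(x[0] * p for x, p in joint.items())   # E(X1) = 0.3 + 0.5 = 0.8
EX2 = sum(x[1] * p for x, p in joint.items())   # E(X2) = 0.2 + 0.3 + 2(0.5) = 1.5
print(EX)
```

No new machinery is needed: the "vector expectation" is just n scalar expectations computed together.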
Since X_1, ..., X_n only take a countable number of values with positive probability, the same is true for

    (Y_1, ..., Y_k)^T = Y = g(X)

so Y_1, ..., Y_k are discrete random variables, and

    E(g(X)) = Σ_x g(x) p(x)   □

Example: Linearity of expectation:

    E(a_0 + a_1 X_1 + ··· + a_n X_n) = a_0 + a_1 E(X_1) + ··· + a_n E(X_n)
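Linearity can be verified exactly on any small joint pmf: compute the left side by the formula E(g(X)) = Σ_x g(x) p(x) and the right side from the marginal means. The pmf and coefficients below are illustrative values, not from the slides.

```python
# Assumed joint pmf of (X1, X2) and arbitrary coefficients.
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}
a0, a1, a2 = 2.0, 3.0, -1.0

# Left side: E(a0 + a1 X1 + a2 X2) via E(g(X)) = sum_x g(x) p(x).
lhs = sum((a0 + a1 * x1 + a2 * x2) * p for (x1, x2), p in joint.items())

# Right side: a0 + a1 E(X1) + a2 E(X2).
E1 = sum(x1 * p for (x1, x2), p in joint.items())   # E(X1) = 0.3 + 0.4 = 0.7
E2 = sum(x2 * p for (x1, x2), p in joint.items())   # E(X2) = 0.2 + 0.4 = 0.6
rhs = a0 + a1 * E1 + a2 * E2

assert abs(lhs - rhs) < 1e-12   # both equal 2 + 3(0.7) - 0.6 = 3.5
```

Note that no independence is assumed anywhere: linearity holds for any joint pmf.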
Transformation of Multiple Random Variables

For B ⊂ R^n,

    P(Y ∈ B) = P((X_1, ..., X_n) ∈ A) = ∫ ··· ∫_A f(x_1, ..., x_n) dx_1 ··· dx_n

where A = {x ∈ R^n : h(x) ∈ B} = h^{-1}(B) = g(B). The multivariate change of variables formula for x = g(y) implies that

    P((X_1, ..., X_n) ∈ A) = ∫ ··· ∫_{g(B)} f(x_1, ..., x_n) dx_1 ··· dx_n
      = ∫ ··· ∫_B f( g_1(y_1, ..., y_n), ..., g_n(y_1, ..., y_n) ) |J_g(y_1, ..., y_n)| dy_1 ··· dy_n
      = ∫ ··· ∫_B f(g(y)) |J_g(y)| dy

This implies the following:

Theorem 5 (Transformation of Multiple Random Variables)
Suppose X_1, ..., X_n are jointly continuous with joint pdf f(x_1, ..., x_n). Let h : R^n → R^n be a continuously differentiable and one-to-one function with continuously differentiable inverse g. Then the joint pdf of Y = (Y_1, ..., Y_n)^T = h(X) is

    f_Y(y_1, ..., y_n) = f( g_1(y_1, ..., y_n), ..., g_n(y_1, ..., y_n) ) |J_g(y_1, ..., y_n)|.
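Theorem 5 can be checked numerically in a case where the transformed density is also known in closed form. The sketch below (an assumed setup, not from the slides) takes X = (X_1, X_2) i.i.d. standard normal and Y = h(X) = AX with A invertible, so g(y) = A^{-1}y and |J_g(y)| = 1/|det A|; the theorem's f_Y(y) = f(A^{-1}y)/|det A| must match the N(0, AA^T) density evaluated directly:

```python
import math

A = [[2.0, 1.0], [0.5, 3.0]]                           # an invertible 2x2 matrix
detA = A[0][0] * A[1][1] - A[0][1] * A[1][0]           # det A = 5.5
Ainv = [[A[1][1] / detA, -A[0][1] / detA],
        [-A[1][0] / detA, A[0][0] / detA]]

def f_X(x1, x2):                                       # standard normal pdf on R^2
    return math.exp(-0.5 * (x1 * x1 + x2 * x2)) / (2.0 * math.pi)

def f_Y_thm5(y1, y2):                                  # Theorem 5: f(g(y)) |J_g(y)|
    x1 = Ainv[0][0] * y1 + Ainv[0][1] * y2
    x2 = Ainv[1][0] * y1 + Ainv[1][1] * y2
    return f_X(x1, x2) / abs(detA)

def f_Y_direct(y1, y2):                                # N(0, Sigma) pdf, Sigma = A A^T
    s11 = A[0][0] ** 2 + A[0][1] ** 2
    s22 = A[1][0] ** 2 + A[1][1] ** 2
    s12 = A[0][0] * A[1][0] + A[0][1] * A[1][1]
    det_s = s11 * s22 - s12 * s12                      # = (det A)^2
    q = (s22 * y1 * y1 - 2 * s12 * y1 * y2 + s11 * y2 * y2) / det_s
    return math.exp(-0.5 * q) / (2.0 * math.pi * math.sqrt(det_s))

y = (0.7, -1.2)
assert abs(f_Y_thm5(*y) - f_Y_direct(*y)) < 1e-12      # the two densities agree
```

This is exactly the Y = AX example below with the general f specialized to a Gaussian, where the answer can be cross-checked.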
Example: Suppose X = (X_1, ..., X_n)^T has joint pdf f and let Y = AX, where A is an invertible n × n (real) matrix. Find f_Y.

Solution: . . .

Often we are interested in the pdf of just a single function of X_1, ..., X_n, say Y_1 = h_1(X_1, ..., X_n).

(1) Define Y_i = h_i(X_1, ..., X_n), i = 2, ..., n in such a way that the mapping h = (h_1, ..., h_n) satisfies the conditions of the theorem (h has an inverse g which is continuously differentiable). Then the theorem gives the joint pdf f_Y(y_1, ..., y_n) and we obtain f_{Y_1}(y_1) by "integrating out" y_2, ..., y_n:

    f_{Y_1}(y_1) = ∫ ··· ∫_{R^{n-1}} f_Y(y_1, ..., y_n) dy_2 ··· dy_n

A common choice is Y_i = X_i, i = 2, ..., n.

(2) Often it is easier to directly compute the cdf of Y_1:

    F_{Y_1}(y) = P(Y_1 ≤ y) = P( h_1(X_1, ..., X_n) ≤ y )
      = P( (X_1, ..., X_n) ∈ A_y )   where A_y = {(x_1, ..., x_n) : h_1(x_1, ..., x_n) ≤ y}
      = ∫ ··· ∫_{A_y} f(x_1, ..., x_n) dx_1 ··· dx_n

Differentiating F_{Y_1} we obtain the pdf of Y_1.

Example: Let X_1, ..., X_n be independent with common distribution Uniform(0, 1). Determine the pdf of Y = min(X_1, ..., X_n).

Solution: . . .
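A sketch of the cdf approach (2) for the minimum example, with a Monte Carlo cross-check (the derivation is standard; the simulation parameters are illustrative). Independence gives P(Y > y) = P(X_1 > y) ··· P(X_n > y) = (1 − y)^n, so F_Y(y) = 1 − (1 − y)^n and f_Y(y) = n(1 − y)^(n−1) on (0, 1):

```python
import random

# Check F_Y(y) = 1 - (1 - y)^n for Y = min(X1, ..., Xn),
# X1, ..., Xn i.i.d. Uniform(0, 1), here with n = 5.
random.seed(1)
n, trials = 5, 100_000
samples = [min(random.random() for _ in range(n)) for _ in range(trials)]

for y in (0.1, 0.3, 0.5):
    emp = sum(s <= y for s in samples) / trials     # empirical cdf at y
    exact = 1.0 - (1.0 - y) ** n                    # derived cdf at y
    assert abs(emp - exact) < 0.01                  # agree up to Monte Carlo error
```

The same complement trick gives max(X_1, ..., X_n) directly: F_max(y) = y^n on (0, 1).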