MIE1650 Lecture 1
SYLLABUS
TENTATIVE SCHEDULE
Dates           Topic
12/09 - 19/09   Probability Review
26/09 - 10/10   Discrete Time Markov Chains
17/10 - 24/10   Poisson Processes
30/10           Midterm Exam, 5.15 - 7.30 PM
31/10 - 07/11   Continuous Time Markov Chains
14/11 - 21/11   Renewal Processes
28/11 - 05/12   Brownian Motion and Martingales
12/12           Final Exam, 9.00 - 11.30 PM
OUTLINE
1 Course Overview
2 Probability Basics
Introduction
Random variables
Sum of Independent RVs
Functions of RVs
3 Limit theorems
4 Generating Functions
Moment Generating Functions
Probability Generating Functions
5 Random Sums
6 Simple Branching Process
7 Simple Random Walk
PROBABILITY SPACE
Ex: F = {∅, {1, 2, 3, 4}, {1, 2}, {3, 4}, {1, 3}, {2, 4}} is not a σ-field,
because {1, 2} ∪ {1, 3} = {1, 2, 3} is not in F.
RANDOM VARIABLES
Examples:
◦ Discrete RV
◦ Continuous RV
◦ Mixed RV
DISTRIBUTION FUNCTIONS
BAYES' THEOREM
P(A|B) = P(A ∩ B)/P(B) = P(B|A)P(A)/P(B)
Example:
In a city, 51% of the adults are males. Also, 9.5% of males smoke
cigars, whereas 1.7% of females smoke cigars. If a randomly
selected adult smokes cigars, what's the probability that the
selected subject is male?
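Plugging the slide's numbers into Bayes' theorem gives the answer directly (a quick sketch; the variable names are mine):

```python
# Bayes' theorem for the cigar-smoking example (values from the slide).
p_male = 0.51          # P(M)
p_smoke_male = 0.095   # P(S | M)
p_smoke_female = 0.017 # P(S | F)

# Total probability: P(S) = P(S|M)P(M) + P(S|F)P(F)
p_smoke = p_smoke_male * p_male + p_smoke_female * (1 - p_male)

# P(M | S) = P(S|M)P(M) / P(S)
p_male_given_smoke = p_smoke_male * p_male / p_smoke
print(round(p_male_given_smoke, 3))  # → 0.853
```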
JOINT DISTRIBUTION
MARGINAL DISTRIBUTION
p_{X|Y}(x|y) = P(X = x | Y = y) = P(X = x, Y = y)/P(Y = y) = p(x, y)/p_Y(y)

f_{X|Y}(x|y) = f(x, y)/f_Y(y)
INDEPENDENCE
If A ⊥ B ⇒ A ⊥ B̄
CONDITIONAL INDEPENDENCE
A and B being independent does not imply that A and B are
conditionally independent. Likewise, conditional independence of
A and B does not imply that they are independent.
P(A ∩ B ∩ C) =?
MEAN (EXPECTATION)
Joint distributions:
E[aX] = ∫_R ∫_R a x f(x, y) dy dx = a ∫_R x f_X(x) dx = aE[X]
E[aX + bY] = ∫_R ∫_R (ax + by) f(x, y) dy dx = aE[X] + bE[Y]
MEAN (EXPECTATION)
Moments of a RV:
The rth moment of X is E[X^r]
The rth central moment of X is E[(X − E[X])^r]
CONDITIONAL EXPECTATION
Solution:
E[T] = E[T | R1 < R2]P(R1 < R2) + E[T | R2 < R1]P(R2 < R1) = · · · = 3/(λ1 + λ2)
COVARIANCE
COVARIANCE PROPERTIES
Cov(cX, Y) = cCov(X, Y)
Var(cX) = c2 Var(X)
COVARIANCE PROPERTIES
Ex: Flip a fair coin 3 times. Let X be the number of heads in the
first 2 flips and let Y be the number of heads on the last 2 flips (so
there is overlap on the middle flip). Compute Cov(X, Y).
Solution: 1/4
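The coin-flip covariance can be checked by enumerating the 8 equally likely outcomes (a quick sketch):

```python
from itertools import product

# Enumerate all 2^3 equally likely outcomes of 3 fair coin flips (1 = heads).
outcomes = list(product([0, 1], repeat=3))

E = lambda g: sum(g(w) for w in outcomes) / len(outcomes)

X = lambda w: w[0] + w[1]  # heads in the first two flips
Y = lambda w: w[1] + w[2]  # heads in the last two flips

# Cov(X, Y) = E[XY] - E[X]E[Y]; only the shared middle flip contributes.
cov = E(lambda w: X(w) * Y(w)) - E(X) * E(Y)
print(cov)  # → 0.25
```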
STANDARD DEVIATION / CV
Standard deviation of the RV X is σ(X) = √Var(X)
⇒ Standard deviation and variance are measures of dispersion
about the mean
CORRELATION
Ex: A box contains red, white and black balls. We draw balls from
the box n times where at each draw we note the ball color and then
return it to the box. Let X1 and X2 be the number of red balls and
white balls drawn, respectively. Find ρ(X1, X2).
Solution: ρ(X1, X2) = −n p1 p2 / (√(n p1(1 − p1)) · √(n p2(1 − p2)))
Beware!
If X and Y are uncorrelated, this does not imply they are
independent
Ex:
◦ X = sum of 2 coin flips, and Y = difference of the 2 flips
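This example can be verified by enumeration (a sketch; 0/1 encode tails/heads):

```python
from itertools import product

# X = sum and Y = difference of two fair coin flips (0 = tails, 1 = heads).
outcomes = list(product([0, 1], repeat=2))
E = lambda g: sum(g(w) for w in outcomes) / len(outcomes)

X = lambda w: w[0] + w[1]
Y = lambda w: w[0] - w[1]

cov = E(lambda w: X(w) * Y(w)) - E(X) * E(Y)
print(cov)  # → 0.0  (uncorrelated)

# Yet X and Y are dependent: P(X = 2, Y = 0) != P(X = 2) P(Y = 0)
p_joint = sum(1 for w in outcomes if X(w) == 2 and Y(w) == 0) / 4
p_prod = (sum(1 for w in outcomes if X(w) == 2) / 4) * \
         (sum(1 for w in outcomes if Y(w) == 0) / 4)
print(p_joint, p_prod)  # → 0.25 0.125
```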
BINOMIAL RV
GEOMETRIC RV
NEGATIVE BINOMIAL RV
POISSON RV
◦ Var(X) = λ
UNIFORM RV (DISCRETE)
Parameters a, b ∈ Z, b ≥ a.
pmf: P(X = k) = 1/(b − a + 1), k ∈ {a, a + 1, . . . , b}
cdf: F(k; a, b) = (⌊k⌋ − a + 1)/(b − a + 1)
E[X] = (a + b)/2, Var(X) = ((b − a + 1)² − 1)/12
Ex: Roll a die. E[X] = 7/2, Var(X) = 35/12.
NORMAL (GAUSSIAN) RV
f(x) = (1/√(2πσ²)) exp{−(x − µ)²/(2σ²)}, x ∈ R
F(x) = (1/2)[1 + erf((x − µ)/(σ√2))], x ∈ R
pdf of standard normal distribution: φ(x) = (1/√(2π)) e^{−x²/2}
⇒ pdf of X ∼ N(µ, σ²): f_X(x) = (1/σ) φ((x − µ)/σ)
cdf of standard normal distribution: Φ(x) = (1/√(2π)) ∫_{−∞}^{x} e^{−t²/2} dt
⇒ cdf of X ∼ N(µ, σ²): F_X(x) = Φ((x − µ)/σ)
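The erf form of the cdf can be evaluated directly with the standard library (a quick sketch):

```python
import math

# Normal cdf via the error function: F(x) = 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))
def normal_cdf(x, mu=0.0, sigma=1.0):
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

print(round(normal_cdf(0), 3))                    # → 0.5
print(round(normal_cdf(1.96), 3))                 # → 0.975
print(round(normal_cdf(10, mu=10, sigma=2), 3))   # → 0.5
```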
M Cevik MIE1605 - Probability Review 35 / 96
Probability Basics Random variables
EXPONENTIAL RV
ERLANG RV
f(x; k, λ) = λ^k x^{k−1} e^{−λx} / (k − 1)!, 0 ≤ x < ∞
F(x) = 1 − e^{−λx} Σ_{n=0}^{k−1} (λx)^n / n!, 0 ≤ x < ∞
E[X] = k/λ, Var(X) = k/λ²
If X ∼ Erlang(k, λ) ⇒ aX ∼ Erlang(k, λ/a)
If X ∼ Erlang(k1, λ) and Y ∼ Erlang(k2, λ) are independent
⇒ X + Y ∼ Erlang(k1 + k2, λ)
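One standard way to sanity-check these moments is to simulate an Erlang(k, λ) RV as a sum of k iid Exponential(λ) RVs (a sketch; the parameter values are arbitrary):

```python
import random

# Erlang(k, lam) as a sum of k iid Exponential(lam) RVs: check the mean and
# variance against the closed forms E[X] = k/lam, Var(X) = k/lam^2 by simulation.
random.seed(1)
k, lam, n = 3, 2.0, 100_000

samples = [sum(random.expovariate(lam) for _ in range(k)) for _ in range(n)]
mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n

print(round(mean, 2), round(var, 2))  # close to k/lam = 1.5 and k/lam^2 = 0.75
```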
GAMMA RV
f(x; α, β) = (β^α / Γ(α)) x^{α−1} e^{−βx}, 0 < x < ∞
F(x) = (1/Γ(α)) ∫_0^{βx} t^{α−1} e^{−t} dt, 0 < x < ∞
E[X] = α/β, Var(X) = α/β²
GAMMA RV
If Xi ∼ Gamma(αi, β), i = 1, 2, . . . , N are independent
⇒ Σ_{i=1}^{N} Xi ∼ Gamma(Σ_{i=1}^{N} αi, β)
If X ∼ Gamma(1, β) ⇒ X ∼ Expo(β)
BETA RV
f(x; α, β) = x^{α−1}(1 − x)^{β−1} / B(α, β), 0 ≤ x ≤ 1
F(x) = B(x; α, β) / B(α, β), 0 ≤ x ≤ 1
Beta functions: B(a, b) = ∫_0^1 t^{a−1}(1 − t)^{b−1} dt, B(x; a, b) = ∫_0^x t^{a−1}(1 − t)^{b−1} dt
E[X] = α/(α + β), Var(X) = αβ/[(α + β)²(α + β + 1)]
UNIFORM RV (CONTINUOUS)
pdf:
f(x) = 1/(b − a), if x ∈ [a, b]; 0, otherwise
cdf:
F(x) = 0, if x < a; (x − a)/(b − a), if x ∈ [a, b]; 1, if x ≥ b
MEMORYLESS PROPERTY
P(X > m + n | X ≥ n) = P(X > m + n, X ≥ n)/P(X ≥ n)
                     = P(X > m + n)/P(X ≥ n) = P(X > m)
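For the exponential distribution (the canonical continuous memoryless RV), the property can be checked numerically (a sketch; the values of λ, m, n are arbitrary):

```python
import math

# Memoryless check for Exponential(lam): P(X > m + n | X >= n) equals P(X > m).
lam, m, n = 0.5, 2.0, 3.0
surv = lambda x: math.exp(-lam * x)   # P(X > x) for the exponential

lhs = surv(m + n) / surv(n)  # P(X > m+n) / P(X >= n); ties have probability 0
rhs = surv(m)
print(round(lhs, 6) == round(rhs, 6))  # → True
```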
SUM OF INDEPENDENT DISCRETE RVS
Soln:
X + Y ∼ Poisson(λ + µ)
SUM OF INDEPENDENT DISCRETE RVS
Soln:
X + Y ∼ Binom(n + m, p)
SUM OF INDEPENDENT CONTINUOUS RVS
P(X + Y ≤ a) = ∫_0^∞ P(X ≤ a − y) f_Y(y) dy : CDF of X + Y
f_{X+Y}(a) = (d/da) F_{X+Y}(a) = ∫_0^∞ f_X(a − y) f_Y(y) dy : pdf of X + Y
SUM OF INDEPENDENT CONTINUOUS RVS
Soln:
f_Z(z) = z, if 0 ≤ z ≤ 1; 2 − z, if 1 < z < 2; 0, otherwise
⇒ pdf of a triangular RV
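The triangular pdf can be recovered numerically from the convolution formula (a sketch using a midpoint Riemann sum):

```python
# Numeric check of the convolution formula for Z = X + Y with independent
# X, Y ~ Uniform(0, 1): f_Z(z) = ∫ f_X(z - y) f_Y(y) dy, where f_Y = 1 on [0, 1].
def f_uniform(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def f_Z(z, steps=10_000):
    # midpoint Riemann sum over y in [0, 1]
    dy = 1.0 / steps
    return sum(f_uniform(z - (i + 0.5) * dy) * dy for i in range(steps))

a, b = f_Z(0.5), f_Z(1.5)
print(round(a, 3), round(b, 3))  # → 0.5 0.5, matching z and 2 - z
```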
Ex: If X and Y have joint density function f, find the density function
of U = XY.
MODES OF CONVERGENCE
CONVERGENCE IN PROBABILITY
Alternative terminology:
◦ Xn → X almost everywhere, written Xn → X a.e.
◦ Xn → X with probability 1, written Xn → X w.p.1
CONVERGENCE IN DISTRIBUTION
MARKOV'S INEQUALITY
⇒ If t > 0, P(X ≥ t) ≤ E[X]/t
Scaling Markov's inequality: for t > 0,
P(X ≥ tE[X]) ≤ E[X]/(tE[X]) = 1/t
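An empirical look at the inequality (a sketch; X ~ Exponential(1) is an arbitrary nonnegative choice, with E[X] = 1):

```python
import random

# Empirical check of Markov's inequality P(X >= t) <= E[X]/t for a nonnegative RV.
random.seed(0)
n, t = 100_000, 3.0
samples = [random.expovariate(1.0) for _ in range(n)]

p_tail = sum(1 for x in samples if x >= t) / n  # empirical P(X >= t)
bound = (sum(samples) / n) / t                  # empirical E[X]/t

print(p_tail <= bound)  # → True
```

The bound is loose here: the true tail is e⁻³ ≈ 0.05 while the bound is about 1/3, which is typical of Markov's inequality.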
CHEBYSHEV'S INEQUALITY
Remarks:
◦ If n is large, then X̄n ≈ Nor(µ, σ²/n)
◦ Xi's need not be normally distributed
◦ Usually n ≥ 30 suffices for a good approximation (fewer
observations are needed when the Xi's come from a symmetric distribution)
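A simulation sketch of the X̄n ≈ Nor(µ, σ²/n) remark, using Uniform(0,1) draws (an arbitrary non-normal choice with µ = 1/2, σ² = 1/12):

```python
import random, math

# CLT sketch: the standardized sample mean of n iid Uniform(0,1) draws
# should be approximately standard normal already at n = 30.
random.seed(0)
n, reps = 30, 50_000
mu, sigma = 0.5, math.sqrt(1 / 12)

z = [(sum(random.random() for _ in range(n)) / n - mu) / (sigma / math.sqrt(n))
     for _ in range(reps)]

# Compare the empirical P(Z <= 1.96) with Phi(1.96) ≈ 0.975
p = sum(1 for v in z if v <= 1.96) / reps
print(round(p, 3))  # close to 0.975
```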
LIMIT THEOREMS
GENERATING FUNCTIONS
Fourier Transform: E[e^{iXt}] = ∫_R e^{ixt} f_X(x) dx
Laplace Transform: E[e^{−Xt}] = ∫_R e^{−xt} f_X(x) dx
Moment Generating Functions: E[e^{Xt}] = ∫_R e^{xt} f_X(x) dx
Probability Generating Functions: E[s^X] = Σ_x s^x p_X(x)
φ_X(0) = 1
∂φ_X(t)/∂t = E[(∂/∂t) e^{Xt}] = E[X e^{Xt}]
⇒ φ′_X(0) = E[X], φ″_X(0) = E[X²], etc.
Sum of RVs:
φ_{Σ_i Xi}(t) = E[e^{Σ_i Xi t}] = E[Π_i e^{Xi t}]
If the Xi's are independent ⇒ E[Π_i e^{Xi t}] = Π_i E[e^{Xi t}] = Π_i φ_{Xi}(t)
Bernoulli Distribution:
φ_X(t) = E[e^{Xt}] = pe^t + (1 − p)
Binomial Distribution:
φ_X(t) = E[e^{Xt}] = E[e^{Σ_i Xi t}] = (E[e^{X1 t}])^n
       = (pe^t + (1 − p))^n → since the Xi's are iid Bernoulli RVs.
Note that φ_X(t) gives a hint about the distribution of a RV. If we
recognize something like (pe^t + (1 − p))^n, then we can say that it's
a Binomial(n, p) RV.
Exponential Distribution:
φ_X(t) = E[e^{Xt}] = λ/(λ − t), t < λ
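As a sketch, the derivative property φ′(0) = E[X] can be checked numerically for the exponential MGF above (λ = 2 is an arbitrary choice, so E[X] = 1/λ = 0.5):

```python
# MGF sketch: for X ~ Exponential(lam), phi(t) = lam / (lam - t) for t < lam.
# A central difference of phi at 0 recovers E[X] = 1/lam.
lam = 2.0
phi = lambda t: lam / (lam - t)

h = 1e-6
mean_from_mgf = (phi(h) - phi(-h)) / (2 * h)   # ≈ phi'(0) = E[X]
print(round(mean_from_mgf, 4))  # → 0.5
```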
Ex: Poisson RV
P(s) = Σ_{k=0}^∞ s^k e^{−λ} λ^k / k! = e^{−λ} Σ_{k=0}^∞ (λs)^k / k! = e^{λ(s−1)}
Ex: Geometric RV
P(s) = Σ_{k=0}^∞ s^k (1 − p)^k p = p / (1 − s(1 − p)), (s < 1/(1 − p))
Then, p_k = P^{(k)}(0) / k!, k = 0, 1, 2, . . .
P^{(n)}(1) = E[X(X − 1) · · · (X − n + 1)]
Var(X) = E[X²] − (E[X])² = P″(1) + P′(1) − (P′(1))²
Ex:
◦ P(s) = p/(1 − qs)
◦ P′(s) = qp/(1 − qs)²
⇒ P′(0) = (1 − p)p = P(X = 1)
⇒ P′(1) = q/p = E[X]
⇒ P″(s) = 2q²p/(1 − qs)³ ⇒ P″(0) = 2q²p
Then, P(X = 2) = (1/2!)P″(0) = (1 − p)²p
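As a sketch, the p_k = P^{(k)}(0)/k! property can be checked numerically for this PGF (p = 0.3 is an arbitrary choice; a central difference approximates the second derivative):

```python
# PGF sketch for a geometric RV with P(X = k) = q^k * p, k = 0, 1, 2, ...
# Numerically differentiate P(s) = p / (1 - q s) at s = 0 to recover P(X = 2).
p = 0.3
q = 1 - p
P = lambda s: p / (1 - q * s)

# Second derivative at 0 via central differences: P''(0) = 2 q^2 p
h = 1e-4
d2 = (P(h) - 2 * P(0) + P(-h)) / h**2

print(round(d2 / 2, 4), round(q**2 * p, 4))  # both ≈ P(X = 2) = q^2 p
```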
RANDOM SUMS
Let X1, X2, X3, . . . be iid (non-negative integer valued) RVs with
P(Xi = k) = p_k, k = 0, 1, 2, . . .
Let N be a non-negative integer valued RV which is independent
of {X1, X2, . . .} with P(N = k) = α_k, k ≥ 0. Define
◦ S0 = 0
◦ Sn = Σ_{i=1}^n Xi, n = 1, 2, . . .
RANDOM SUMS
E[S_N] = E[N]E[X1] ≠ Σ_{i=1}^{E[N]} E[Xi] → E[N] may not be an integer!
E[S_N] = Σ_{k=0}^∞ E[Σ_{i=1}^k Xi] α_k = Σ_{k=0}^∞ kE[X1] α_k = E[X1]E[N]
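The identity E[S_N] = E[N]E[X1] can be illustrated by simulation (a sketch; the distributions of N and the X_i below are arbitrary toy choices):

```python
import random

# Random-sum sketch: N ~ Uniform{0,...,4} (E[N] = 2), X_i ~ Uniform{0,1,2}
# (E[X1] = 1), with N independent of the X_i, so E[S_N] = E[N] E[X1] = 2.
random.seed(0)
reps = 100_000
total = 0
for _ in range(reps):
    n = random.randint(0, 4)                      # draw N
    total += sum(random.randint(0, 2) for _ in range(n))  # S_N for this replication

mean = total / reps
print(round(mean, 1))  # ≈ 2.0
```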
P_{S_N}(s) = Σ_{k=0}^∞ α_k Σ_{j=0}^∞ s^j P(S_k = j) = Σ_{k=0}^∞ α_k (P_{X1}(s))^k
P(s) = P1(s) = q + ps
P(Z_{n+1} = 0) = q + pq + p²q + · · · + p^n q
P(Z_{n+1} = 1) = p^{n+1}
P(Z_{n+1} = 2) = 0
lim_{n→∞} P(Z_{n+1} = 0) = q Σ_{i=0}^∞ p^i = q/(1 − p) = 1 ⇒ this family will go extinct!
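A simulation sketch of this offspring distribution (each individual has 1 offspring w.p. p, else 0, matching P(s) = q + ps; p = 0.8 is an arbitrary choice) shows extinction with probability 1:

```python
import random

# Branching process sketch: offspring count is Bernoulli(p), so m = p <= 1
# and extinction is certain. Estimate the extinction probability by simulation.
random.seed(0)
p, generations, reps = 0.8, 200, 10_000

extinct = 0
for _ in range(reps):
    z = 1  # start from a single ancestor
    for _ in range(generations):
        z = sum(1 for _ in range(z) if random.random() < p)  # next generation
        if z == 0:
            break
    extinct += (z == 0)

print(extinct / reps)  # → 1.0
```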
Theorem
Suppose 0 < p0 < 1.
◦ If m = E[Z1] ≤ 1, then π = 1.
◦ If m > 1, then π < 1 is the unique solution of s = P(s) in [0, 1).
φ_n = P(N = n), n ≥ 0
φ_0 = 0
φ_1 = p
φ_n = q Σ_{j=1}^{n−2} φ_j φ_{n−j−1}
Φ(s) = Σ_{n=0}^∞ s^n φ_n
E[N] = ?