
Chapter 1.

Experiments, Models, & Probabilities


Goals of this course:
1. to introduce the logic of probability theory
Definitions of new concepts (to establish the logic for stochastic reasoning)
Axioms (facts accepted without proof)
there are only three axioms of probability theory
Theorems (statements that can be proved logically from the definitions and axioms)
2. to develop intuition into how the theory applies to practical situations
problem solving, practicing examples, HWs, quizzes, etc.
3. to apply probability theory to solving engineering problems
The aim is achieved by studying random signals in LTI systems
The website http://www.wiley.com/college/yates provides solutions to the quizzes and
MATLAB code for certain random variables and their distributions.

Probability and Stochastic Processes, 2/E by Roy D. Yates and David J. Goodman
Copyright 2005 by John Wiley & Sons, Inc. All rights reserved.

Table 1.1 (p. 9) The terminology of set theory and probability.


Note: Set theory forms the mathematical basis of probability theory;
there exists a one-to-one mapping between the concepts of the two.

A set is a collection of things or elements


A subset establishes a containment (belonging) relationship between two sets.
A universal set is the set of all things under consideration in a given context.
A null set contains no elements.
Set algebra or set operations and the Venn diagrams:
union, intersection, complement, and difference
mutually exclusive vs. collectively exhaustive
and De Morgan's laws: relating the three basic set operations
Correspondingly, in probability theory, we have such concepts as the sample space, an
event, and the outcomes of a random experiment.
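These set operations and De Morgan's laws can be checked directly with Python's built-in `set` type; the universal set and events below are illustrative choices, not from the text:

```python
# Set algebra in Python, mirroring the Venn-diagram operations.
S = {1, 2, 3, 4, 5, 6}   # universal set (e.g., outcomes of a die roll)
A = {1, 2, 3}
B = {2, 4, 6}

union = A | B            # A union B
intersection = A & B     # A intersect B
complement_A = S - A     # complement of A, relative to S
difference = A - B       # elements of A not in B

# De Morgan's laws relate union, intersection, and complement:
# (A u B)^c = A^c n B^c   and   (A n B)^c = A^c u B^c
assert S - (A | B) == (S - A) & (S - B)
assert S - (A & B) == (S - A) | (S - B)

# Mutually exclusive: empty intersection; collectively exhaustive: union is S.
C, D = {1, 2, 3}, {4, 5, 6}
assert C & D == set()    # mutually exclusive
assert C | D == S        # collectively exhaustive
```

The same identities underlie the sample-space/event correspondence discussed above.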

Figure 1.1 (p. 11): Illustrations of sample space and events using Venn diagrams.
In this example of Theorem 1.2, the event space is B = {B1, B2, B3, B4}.
Note: the collection {B1, B2, B3, B4} is collectively exhaustive (its union is the whole sample space).
Note also that B1, B2, B3, B4 are mutually exclusive, so they form a partition.
With Ci = A ∩ Bi for i = 1, ..., 4, it should be apparent that A = C1 ∪ C2 ∪ C3 ∪ C4.
Since the Ci = A ∩ Bi for i = 1, ..., 4 are mutually exclusive,
P(A) = P(C1 ∪ C2 ∪ C3 ∪ C4) = P(C1) + P(C2) + P(C3) + P(C4).
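A minimal numeric sketch of this decomposition, using hypothetical values for the P(Ci) (the numbers are illustrative, not from the figure):

```python
from fractions import Fraction as F

# Hypothetical probabilities for the four mutually exclusive pieces
# Ci = A ∩ Bi of the partition; illustrative values only.
P_C = [F(1, 8), F(1, 4), F(1, 16), F(1, 16)]

# Since the Ci are mutually exclusive and their union is A,
# P(A) is the sum of the pieces (Axiom 3, applied finitely).
P_A = sum(P_C)
print(P_A)  # 1/2
```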

Figure 1.2 (p. 25): The sequential tree for Example 1.24. Independence
between the two stages of the sequential experiment means that the probability of
an outcome is the product of the probabilities from each stage.
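The multiplication of stage probabilities can be sketched as follows; the stage labels and probabilities are illustrative, not those of Example 1.24:

```python
# Two-stage experiment with independent stages: the probability of a leaf
# in the sequential tree is the product of the branch probabilities
# along its path. Stage outcomes/probabilities are illustrative.
stage1 = {"A": 0.3, "B": 0.7}
stage2 = {"H": 0.5, "T": 0.5}

leaves = {(s1, s2): p1 * p2
          for s1, p1 in stage1.items()
          for s2, p2 in stage2.items()}

print(leaves[("A", "H")])              # 0.15
print(round(sum(leaves.values()), 12)) # 1.0 -- leaf probabilities sum to one
```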


Figure 1.3: Reliability analysis of systems with different configurations of sub-systems.

Assumption: each sub-system is operational with probability p, independent of the others.
One can use a Bernoulli trial indicator

    Wi = 1 with probability p;  Wi = 0 with probability 1 - p

to represent whether sub-system i is operational or not.

P[series system is operational] = P[all sub-systems are operational]
    = P[W1 = 1 ∩ W2 = 1 ∩ W3 = 1]
    = P[W1 = 1] P[W2 = 1] P[W3 = 1] = p^3

P[parallel system is operational] = 1 - P[parallel system is NOT operational]
    = 1 - P[all sub-systems are NOT operational]
    = 1 - P[W1 = 0 ∩ W2 = 0 ∩ W3 = 0]
    = 1 - P[W1 = 0] P[W2 = 0] P[W3 = 0] = 1 - (1 - p)^3
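The two formulas can be evaluated for a sample component reliability; the value p = 0.9 below is an illustrative choice:

```python
# Series vs. parallel reliability with three components, each operational
# with probability p, independently (as in Figure 1.3).
p = 0.9  # illustrative component reliability

p_series = p ** 3                # series: all three must work
p_parallel = 1 - (1 - p) ** 3    # parallel: fails only if all three fail

print(round(p_series, 6))        # 0.729
print(round(p_parallel, 6))      # 0.999
```

As expected, the parallel configuration is far more reliable than any single component, while the series configuration is less reliable.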

Figure 1.4: The operation described in Example 1.44. On the left is the original
operation. On the right is the equivalent operation with each pair of series
components replaced by an equivalent component.

Assumption: each sub-system is operational with probability p, independent of the others,
with Bernoulli trial indicator Wi = 1 with probability p and Wi = 0 with probability 1 - p.

For each equivalent series pair:
P[W5 = 0] = 1 - P[W1 = 1 ∩ W2 = 1] = 1 - P[W1 = 1] P[W2 = 1] = 1 - p^2
P[W6 = 0] = 1 - P[W3 = 1 ∩ W4 = 1] = 1 - P[W3 = 1] P[W4 = 1] = 1 - p^2

P[system is operational] = 1 - P[neither sub-system W5 nor W6 is operational]
    = 1 - P[W5 = 0 ∩ W6 = 0]
    = 1 - P[W5 = 0] P[W6 = 0]
    = 1 - (1 - p^2)^2
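A quick numeric check of the series-parallel reduction, again with an illustrative p:

```python
# Series-parallel reduction (Figure 1.4): components W1..W4, each operational
# with probability p independently. W1,W2 in series form W5; W3,W4 form W6;
# W5 and W6 operate in parallel.
p = 0.9  # illustrative component reliability

p_w5 = p ** 2   # series pair: both components must work
p_w6 = p ** 2
p_system = 1 - (1 - p_w5) * (1 - p_w6)  # parallel: fails only if both pairs fail

print(round(p_system, 6))  # 0.9639
```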

Application Example: prior probability, conditional probability, and posterior probability

The prior probabilities:
Given: P{X=2} = 2 P{X=1} and P{X=3} = 3 P{X=1}.
At the Tx, we have: P{X=1} + P{X=2} + P{X=3} = 1.
Therefore, the priors can be found as
P{X=1} = 1/6;  P{X=2} = 2/6;  P{X=3} = 3/6.

The posterior probabilities:
P{X=1 | Y=1} = ?  P{X=2 | Y=1} = ?  P{X=3 | Y=1} = ?
More generally, P{X=m | Y=n} = ?, with m, n = 1, 2, 3.

By the definition of conditional probability,

    P{X=1 | Y=1} = P{X=1, Y=1} / P{Y=1},

where P{X=1, Y=1} = P{Y=1 | X=1} P{X=1} = (1 - ε)/6,
and, by the law of total probability,

    P{Y=1} = Σ_{n=1}^{3} P{Y=1 | X=n} P{X=n}
           = (1 - ε)(1/6) + (ε/2)(2/6) + (ε/2)(3/6) = (1 + 1.5ε)/6.

Here ε is the channel error probability: each symbol is received correctly with
probability 1 - ε, and each of the two incorrect symbols is received with probability ε/2.

Solution:
P{X=1 | Y=1} = (1 - ε)/(1 + 1.5ε);
similarly, P{X=2 | Y=1} = ε/(1 + 1.5ε) and P{X=3 | Y=1} = 1.5ε/(1 + 1.5ε).
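The posterior computation can be verified with exact rational arithmetic; the value of ε below is an illustrative choice:

```python
from fractions import Fraction as F

# Posteriors for the channel example: priors P{X=1}=1/6, P{X=2}=2/6,
# P{X=3}=3/6; correct reception with probability 1-eps, each of the two
# incorrect symbols with probability eps/2. eps is an illustrative value.
eps = F(1, 10)

prior = {1: F(1, 6), 2: F(2, 6), 3: F(3, 6)}
lik = {1: 1 - eps, 2: eps / 2, 3: eps / 2}   # P{Y=1 | X=m}

# Law of total probability, then Bayes' rule.
p_y1 = sum(lik[m] * prior[m] for m in (1, 2, 3))
post = {m: lik[m] * prior[m] / p_y1 for m in (1, 2, 3)}

assert p_y1 == (1 + F(3, 2) * eps) / 6           # (1 + 1.5*eps)/6
assert post[1] == (1 - eps) / (1 + F(3, 2) * eps)
assert sum(post.values()) == 1                   # posteriors sum to one
print(post[1])  # 18/23
```

Note how observing Y=1 raises the probability of X=1 from its prior 1/6 to nearly 4/5: the observation is strong evidence when ε is small.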

