
COMPLETE BUSINESS STATISTICS

by
AMIR D. ACZEL
&
JAYAVEL SOUNDERPANDIAN

6th edition
Chapter 2

Probability
* Using Statistics
* Basic Definitions: Events, Sample Space, and Probabilities
* Basic Rules for Probability
* Conditional Probability
* Independence of Events
* Combinatorial Concepts
* The Law of Total Probability and Bayes' Theorem
* Joint Probability Table
* Using the Computer

LEARNING OBJECTIVES

After studying this chapter, you should be able to:
* Define probability, sample space, and event.
* Distinguish between subjective and objective probability.
* Describe the complement of an event, and the intersection and union of two events.
* Compute probabilities of various types of events.
* Explain the concept of conditional probability and how to compute it.
* Describe permutations and combinations and their use in certain probability computations.
* Explain Bayes' theorem and its applications.
2-1 Probability is:

* A quantitative measure of uncertainty
* A measure of the strength of belief in the occurrence of an uncertain event
* A measure of the degree of chance or likelihood of occurrence of an uncertain event
* Measured by a number between 0 and 1 (or between 0% and 100%)

  

Types of Probability

* Objective or Classical Probability
  - based on equally-likely events
  - based on long-run relative frequency of events
  - not based on personal beliefs
  - is the same for all observers (objective)
  - examples: toss a coin, throw a die, pick a card

   

Types of Probability (Continued)

* Subjective Probability
  - based on personal beliefs, experiences, prejudices, intuition: personal judgment
  - different for all observers (subjective)
  - examples: Super Bowl, elections, new product introduction, snowfall

2-2 Basic Definitions: Events, Sample Space, and Probabilities

* Set: a collection of elements or objects of interest
  - Empty set (denoted by ∅): a set containing no elements
  - Universal set (denoted by S): a set containing all possible elements
  - Complement (Not): the complement of A, denoted Ā, is a set containing all elements of S not in A

Complement of a Set

[Venn diagram illustrating the complement of an event]

Basic Definitions (Continued)

* Intersection (And), denoted A ∩ B: a set containing all elements in both A and B
* Union (Or), denoted A ∪ B: a set containing all elements in A or B or both

Sets: A Intersecting with B

[Venn diagram of A ∩ B]

Sets: A Union B

[Venn diagram of A ∪ B]

Basic Definitions (Continued)

* Mutually exclusive or disjoint sets: sets having no elements in common, having no intersection, whose intersection is the empty set
* Partition: a collection of mutually exclusive sets which together include all possible elements, whose union is the universal set

Mutually Exclusive or Disjoint Sets

[Venn diagram of two sets having no elements in common]

Sets: Partition

[Venn diagram of a partition of the universal set]

Experiment

* Process that leads to one of several possible outcomes, e.g.:
  - Coin toss: Heads, Tails
  - Throw die: 1, 2, 3, 4, 5, 6
  - Pick a card: A, K, Q, ...
  - Introduce a new product
* Each trial of an experiment has a single observed outcome.
* The precise outcome of a random experiment is unknown before a trial.
* An outcome is also called a basic outcome, elementary event, or simple event.

Events: Definition

* Sample Space or Event Set
  - Set of all possible outcomes (universal set) for a given experiment
  - E.g.: roll a regular six-sided die: S = {1,2,3,4,5,6}
* Event
  - Collection of outcomes having a common characteristic
  - E.g.: even number: A = {2,4,6}
  - Event A occurs if an outcome in the set A occurs
* Probability of an event
  - Sum of the probabilities of the outcomes of which it consists
  - E.g.: P(A) = P(2) + P(4) + P(6)

Equally-likely Probabilities (Hypothetical or Ideal Experiments)

* For example:
  - Throw a die
    - Six possible outcomes: {1,2,3,4,5,6}
    - If each is equally likely, the probability of each is 1/6 = 0.1667 = 16.67%
    - The probability of each equally-likely outcome is 1 divided by the number of possible outcomes:

      P(e) = 1/n(S)

  - Event A (even number)
    - P(A) = P(2) + P(4) + P(6) = 1/6 + 1/6 + 1/6 = 1/2
    - P(A) = Σ P(e) for e in A, so

      P(A) = n(A)/n(S) = 3/6 = 1/2
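The equally-likely counting rule above can be checked with a short Python sketch (not part of the original slides); exact fractions avoid rounding:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}            # sample space for one throw of a fair die
A = {e for e in S if e % 2 == 0}  # event A: an even number

p_e = Fraction(1, len(S))         # P(e) = 1/n(S) for each equally-likely outcome
p_A = Fraction(len(A), len(S))    # P(A) = n(A)/n(S)

print(p_e)   # 1/6
print(p_A)   # 1/2
```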

Pick a Card: Sample Space

[Diagram: the 52-card deck laid out by suit and rank; event 'Ace' is the column of four aces, event 'Heart' is the row of 13 hearts]

Union of events 'Heart' and 'Ace':

P(Heart ∪ Ace) = n(Heart ∪ Ace)/n(S) = 16/52 = 4/13

Event 'Heart':

P(Heart) = n(Heart)/n(S) = 13/52 = 1/4

The intersection of the events 'Heart' and 'Ace' comprises the single point circled twice: the ace of hearts.

P(Heart ∩ Ace) = n(Heart ∩ Ace)/n(S) = 1/52
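The card counts can be verified by enumerating the deck (a Python sketch, not from the slides):

```python
from fractions import Fraction
from itertools import product

ranks = ["A", "K", "Q", "J", "10", "9", "8", "7", "6", "5", "4", "3", "2"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = set(product(ranks, suits))  # the 52-card sample space

hearts = {c for c in deck if c[1] == "hearts"}  # event 'Heart'
aces = {c for c in deck if c[0] == "A"}         # event 'Ace'

p_union = Fraction(len(hearts | aces), len(deck))  # P(Heart ∪ Ace) = 16/52 = 4/13
p_heart = Fraction(len(hearts), len(deck))         # P(Heart) = 13/52 = 1/4
p_inter = Fraction(len(hearts & aces), len(deck))  # P(Heart ∩ Ace) = 1/52
print(p_union, p_heart, p_inter)
```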

2-3 Basic Rules for Probability

* Range of Values: 0 ≤ P(A) ≤ 1

* Complements - Probability of not A:

  P(Ā) = 1 − P(A)

* Intersection - Probability of both A and B:

  P(A ∩ B) = n(A ∩ B)/n(S)

  - Mutually exclusive events (A and C): P(A ∩ C) = 0

Basic Rules for Probability (Continued)

* Union - Probability of A or B or both (rule of unions):

  P(A ∪ B) = n(A ∪ B)/n(S) = P(A) + P(B) − P(A ∩ B)

  - Mutually exclusive events: if A and B are mutually exclusive, then

    P(A ∩ B) = 0, so P(A ∪ B) = P(A) + P(B)

Sets: P(A ∪ B)

[Venn diagram shading the union A ∪ B]

2-4 Conditional Probability

* Conditional Probability - Probability of A given B:

  P(A|B) = P(A ∩ B)/P(B), where P(B) ≠ 0

  - Independent events:

    P(A|B) = P(A)
    P(B|A) = P(B)

   

Conditional Probability (Continued)

Rules of conditional probability:

P(A|B) = P(A ∩ B)/P(B)   so   P(A ∩ B) = P(A|B) P(B)
                                       = P(B|A) P(A)

If events A and D are statistically independent:

P(A|D) = P(A)
P(D|A) = P(D)   so   P(A ∩ D) = P(A) P(D)

Contingency Table - Example 2-2

Counts
                      AT&T   IBM   Total
Telecommunication       40    10      50
Computers               20    30      50
Total                   60    40     100

Probabilities
                      AT&T   IBM   Total
Telecommunication      .40   .10     .50
Computers              .20   .30     .50
Total                  .60   .40    1.00

Probability that a project is undertaken by IBM given it is a telecommunications project:

P(IBM|T) = P(IBM ∩ T)/P(T) = 0.10/0.50 = 0.2
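The conditional probability can be read straight off the joint probabilities (a Python sketch, not from the slides; the row/column labels follow the reconstruction of the garbled table above and are assumptions):

```python
# Joint probabilities from the contingency table (Example 2-2);
# the "AT&T"/"telecom" labels are assumed reconstructions.
joint = {
    ("telecom", "AT&T"): 0.40, ("telecom", "IBM"): 0.10,
    ("computers", "AT&T"): 0.20, ("computers", "IBM"): 0.30,
}

# Marginal P(T): sum the telecom row.
p_T = sum(p for (kind, _), p in joint.items() if kind == "telecom")

p_IBM_and_T = joint[("telecom", "IBM")]   # P(IBM ∩ T) = 0.10
p_IBM_given_T = p_IBM_and_T / p_T         # P(IBM | T) = 0.10/0.50 = 0.2
print(round(p_IBM_given_T, 2))
```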

2-5 Independence of Events

Conditions for the statistical independence of events A and B:

P(A|B) = P(A)
P(B|A) = P(B)
P(A ∩ B) = P(A) P(B)

For the card example:

P(Ace|Heart) = P(Ace ∩ Heart)/P(Heart) = (1/52)/(13/52) = 1/13 = P(Ace)

P(Heart|Ace) = P(Heart ∩ Ace)/P(Ace) = (1/52)/(4/52) = 1/4 = P(Heart)

P(Ace ∩ Heart) = (4/52)(13/52) = 1/52 = P(Ace) P(Heart)
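The three independence conditions for 'Ace' and 'Heart' can be checked exactly (a Python sketch, not from the slides):

```python
from fractions import Fraction

p_ace = Fraction(4, 52)            # P(Ace)
p_heart = Fraction(13, 52)         # P(Heart)
p_ace_and_heart = Fraction(1, 52)  # P(Ace ∩ Heart)

# Independence: the joint probability equals the product of the marginals,
# and each conditional probability equals the corresponding marginal.
assert p_ace_and_heart == p_ace * p_heart
assert p_ace_and_heart / p_heart == p_ace    # P(Ace|Heart) = 1/13
assert p_ace_and_heart / p_ace == p_heart    # P(Heart|Ace) = 1/4
```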

Independence of Events - Example 2-5

Events T and B are assumed to be independent, with P(T) = 0.04 and P(B) = 0.06.

a) P(T ∩ B) = P(T) P(B) = 0.04 * 0.06 = 0.0024

b) P(T ∪ B) = P(T) + P(B) − P(T ∩ B) = 0.04 + 0.06 − 0.0024 = 0.0976
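Both parts of Example 2-5 follow from the product and union rules (a Python sketch, not from the slides; the event names T and B match the reconstruction above):

```python
p_T, p_B = 0.04, 0.06           # P(T) and P(B), assumed independent events

p_both = p_T * p_B              # (a) P(T ∩ B) = 0.0024
p_either = p_T + p_B - p_both   # (b) P(T ∪ B) = 0.0976

print(round(p_both, 4), round(p_either, 4))
```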

Product Rules for Independent Events

The probability of the intersection of several independent events is the product of their separate individual probabilities:

P(A1 ∩ A2 ∩ A3 ∩ ... ∩ An) = P(A1) P(A2) P(A3) ... P(An)

The probability of the union of several independent events is 1 minus the product of the probabilities of their complements:

P(A1 ∪ A2 ∪ A3 ∪ ... ∪ An) = 1 − P(Ā1) P(Ā2) P(Ā3) ... P(Ān)

Example 2-7:

P(A1 ∪ A2 ∪ ... ∪ A10) = 1 − P(Ā1) P(Ā2) ... P(Ā10)
                       = 1 − 0.90^10 = 1 − 0.3487 = 0.6513
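Example 2-7's union of ten independent events can be verified numerically (a Python sketch, not from the slides; the per-event probability 0.10 is implied by the 0.90 complement factors):

```python
# P(at least one of n independent events) = 1 - product of complement probabilities.
p = 0.10   # probability of each individual event (assumed, from the 0.90 factors)
n = 10

p_at_least_one = 1 - (1 - p) ** n   # 1 - 0.90**10
print(round(p_at_least_one, 4))     # 0.6513
```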

2-6 Combinatorial Concepts

Consider a pair of six-sided dice. There are six possible outcomes from throwing the first die {1,2,3,4,5,6} and six possible outcomes from throwing the second die {1,2,3,4,5,6}. Altogether, there are 6*6 = 36 possible outcomes from throwing the two dice.

In general, if there is a sequence of n events and event i can occur in Ni possible ways, then the number of ways in which the sequence of n events may occur is N1 * N2 * ... * Nn.

* Pick 5 cards from a deck of 52, with replacement
  - 52*52*52*52*52 = 52^5 = 380,204,032 different possible outcomes
* Pick 5 cards from a deck of 52, without replacement
  - 52*51*50*49*48 = 311,875,200 different possible outcomes
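The two card-drawing counts above can be confirmed directly (a Python sketch, not from the slides):

```python
import math

with_replacement = 52 ** 5                    # 52*52*52*52*52 = 380,204,032
without_replacement = 52 * 51 * 50 * 49 * 48  # 311,875,200

# Without replacement, this is the number of permutations of 52 items taken 5 at a time.
assert without_replacement == math.perm(52, 5)
print(with_replacement, without_replacement)
```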

More on Combinatorial Concepts: A Tree Diagram

Order the letters A, B, and C:

[Tree diagram: pick the first letter (3 ways), then the second (2 ways); the third is forced, yielding the six orderings ABC, ACB, BAC, BCA, CAB, CBA]


Factorial

How many ways can you order the 3 letters A, B, and C?
There are 3 choices for the first letter, 2 for the second, and 1 for the last, so there are 3*2*1 = 6 possible ways to order the three letters A, B, and C.

How many ways are there to order the 6 letters A, B, C, D, E, and F? (6*5*4*3*2*1 = 720)

Factorial: for any positive integer n, we define n factorial as

n! = n(n − 1)(n − 2) ... (1)

We denote n factorial as n!. The number n! is the number of ways in which n objects can be ordered. By definition, 1! = 1.

Permutations (Order is important)

What if we chose only 3 out of the 6 letters A, B, C, D, E, and F?
There are 6 ways to choose the first letter, 5 ways to choose the second letter, and 4 ways to choose the third letter (leaving 3 letters unchosen). That makes 6*5*4 = 120 possible orderings, or permutations.

Permutations are the possible ordered selections of r objects out of a total of n objects. The number of permutations of n objects taken r at a time is denoted nPr, where

nPr = n! / (n − r)!

For example:

6P3 = 6! / (6 − 3)! = 6! / 3! = (6*5*4*3*2*1) / (3*2*1) = 6*5*4 = 120
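The permutation formula maps directly to the standard library (a Python sketch, not from the slides):

```python
import math

# nPr = n!/(n-r)! — the slides' 6P3 example
n, r = 6, 3
nPr = math.factorial(n) // math.factorial(n - r)

# math.perm computes the same quantity directly.
assert nPr == math.perm(n, r) == 120
print(nPr)
```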

Combinations (Order is not important)

Suppose that when we pick 3 letters out of the 6 letters A, B, C, D, E, and F, we choose BCD, or BDC, or CBD, or CDB, or DBC, or DCB. (These are the 6 = 3! permutations or orderings of the 3 letters B, C, and D.) But these are all orderings of the same combination of 3 letters. How many combinations of 6 different letters, taking 3 at a time, are there?

Combinations are the possible selections of r items from a group of n items regardless of the order of selection. The number of combinations is denoted nCr and is read as "n choose r." We define the number of combinations of r out of n elements as:

nCr = n! / (r!(n − r)!)

For example:

6C3 = 6! / (3!(6 − 3)!) = 6! / (3!3!) = (6*5*4*3*2*1) / ((3*2*1)(3*2*1)) = (6*5*4) / (3*2*1) = 120/6 = 20
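The combination count likewise (a Python sketch, not from the slides):

```python
import math

# nCr = n!/(r!(n-r)!) — the slides' 6C3 example
n, r = 6, 3
nCr = math.factorial(n) // (math.factorial(r) * math.factorial(n - r))

# math.comb computes the same quantity directly.
assert nCr == math.comb(n, r) == 20
print(nCr)
```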

Example: Template for Calculating Permutations and Combinations

[Excel template screenshot]

2-7 The Law of Total Probability and Bayes' Theorem

The law of total probability:

P(A) = P(A ∩ B) + P(A ∩ B̄)

In terms of conditional probabilities:

P(A) = P(A ∩ B) + P(A ∩ B̄)
     = P(A|B) P(B) + P(A|B̄) P(B̄)

More generally (where B1, B2, ..., Bn make up a partition):

P(A) = Σi P(A ∩ Bi) = Σi P(A|Bi) P(Bi)


The Law of Total Probability - Example 2-9

Event U: Stock market will go up in the next year
Event W: Economy will do well in the next year

P(U|W) = .75
P(U|W̄) = .30
P(W) = .80, so P(W̄) = 1 − .80 = .20

P(U) = P(U ∩ W) + P(U ∩ W̄)
     = P(U|W) P(W) + P(U|W̄) P(W̄)
     = (.75)(.80) + (.30)(.20)
     = .60 + .06 = .66
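Example 2-9 is one line of arithmetic once the conditioning events are laid out (a Python sketch, not from the slides):

```python
# Law of total probability for Example 2-9
p_U_given_W = 0.75     # P(U|W): market up given economy does well
p_U_given_notW = 0.30  # P(U|W̄): market up given economy does not do well
p_W = 0.80             # P(W)

p_U = p_U_given_W * p_W + p_U_given_notW * (1 - p_W)
print(round(p_U, 2))   # 0.66
```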


Bayes' Theorem

* Bayes' theorem enables you, knowing just a little more than the probability of A given B, to find the probability of B given A.
* Based on the definition of conditional probability and the law of total probability.

P(B|A) = P(A ∩ B) / P(A)

       = P(A ∩ B) / [P(A ∩ B) + P(A ∩ B̄)]             (applying the law of total probability to the denominator)

       = P(A|B) P(B) / [P(A|B) P(B) + P(A|B̄) P(B̄)]    (applying the definition of conditional probability throughout)


Bayes' Theorem - Example 2-10

* A medical test for a rare disease (affecting 0.1% of the population [P(I) = 0.001]) is imperfect:
  - When administered to an ill person, the test will indicate so with probability 0.92 [P(Z|I) = .92, so P(Z̄|I) = .08]. The event (Z|I) is a true positive.
  - When administered to a person who is not ill, the test will erroneously give a positive result (a false positive) with probability 0.04 [P(Z|Ī) = 0.04, so P(Z̄|Ī) = 0.96]. The event (Z|Ī) is a false positive.

Example 2-10 (continued)

P(I) = 0.001      P(Ī) = 0.999
P(Z|I) = 0.92     P(Z|Ī) = 0.04

P(I|Z) = P(I ∩ Z) / P(Z)
       = P(I ∩ Z) / [P(I ∩ Z) + P(Ī ∩ Z)]
       = P(Z|I) P(I) / [P(Z|I) P(I) + P(Z|Ī) P(Ī)]
       = (.92)(0.001) / [(.92)(0.001) + (0.04)(.999)]
       = 0.00092 / (0.00092 + 0.03996)
       = 0.00092 / .04088
       = .0225
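The posterior computation in Example 2-10 can be reproduced step by step (a Python sketch, not from the slides):

```python
# Bayes' theorem for Example 2-10
p_I = 0.001            # prior: P(I), proportion of the population that is ill
p_Z_given_I = 0.92     # P(Z|I): true positive rate
p_Z_given_notI = 0.04  # P(Z|Ī): false positive rate

numer = p_Z_given_I * p_I                       # P(Z ∩ I) = 0.00092
denom = numer + p_Z_given_notI * (1 - p_I)      # P(Z) = 0.00092 + 0.03996
p_I_given_Z = numer / denom                     # P(I|Z) ≈ 0.0225

print(round(p_I_given_Z, 4))
```

Even with a positive result, the posterior probability of illness stays small because the disease is so rare: most positives come from the large healthy population.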

Example 2-10 (Tree Diagram)

Prior           Conditional       Joint
Probabilities   Probabilities     Probabilities

P(I) = 0.001    P(Z|I) = 0.92     P(Z ∩ I) = (0.001)(0.92) = .00092
                P(Z̄|I) = 0.08     P(Z̄ ∩ I) = (0.001)(0.08) = .00008
P(Ī) = 0.999    P(Z|Ī) = 0.04     P(Z ∩ Ī) = (0.999)(0.04) = .03996
                P(Z̄|Ī) = 0.96     P(Z̄ ∩ Ī) = (0.999)(0.96) = .95904

Bayes' Theorem Extended

Given a partition of events B1, B2, ..., Bn:

P(B1|A) = P(A ∩ B1) / P(A)

        = P(A ∩ B1) / Σi P(A ∩ Bi)                (applying the law of total probability to the denominator)

        = P(A|B1) P(B1) / Σi P(A|Bi) P(Bi)        (applying the definition of conditional probability throughout)

Bayes' Theorem Extended - Example 2-11

* An economist believes that during periods of high economic growth, the U.S. dollar appreciates with probability 0.70; in periods of moderate economic growth, the dollar appreciates with probability 0.40; and during periods of low economic growth, the dollar appreciates with probability 0.20.
* During any period of time, the probability of high economic growth is 0.30, the probability of moderate economic growth is 0.50, and the probability of low economic growth is 0.20.
* Suppose the dollar has been appreciating during the present period. What is the probability we are experiencing a period of high economic growth?

Events: A - Appreciation
H - High growth       P(H) = 0.30    P(A|H) = 0.70
M - Moderate growth   P(M) = 0.50    P(A|M) = 0.40
L - Low growth        P(L) = 0.20    P(A|L) = 0.20

Example 2-11 (continued)

P(H|A) = P(H ∩ A) / P(A)
       = P(H ∩ A) / [P(H ∩ A) + P(M ∩ A) + P(L ∩ A)]
       = P(A|H) P(H) / [P(A|H) P(H) + P(A|M) P(M) + P(A|L) P(L)]
       = (0.70)(0.30) / [(0.70)(0.30) + (0.40)(0.50) + (0.20)(0.20)]
       = 0.21 / (0.21 + 0.20 + 0.04) = 0.21/0.45
       = 0.467
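The extended Bayes computation generalizes cleanly to any partition (a Python sketch, not from the slides):

```python
# Extended Bayes' theorem for Example 2-11
priors = {"H": 0.30, "M": 0.50, "L": 0.20}       # P(growth state)
likelihoods = {"H": 0.70, "M": 0.40, "L": 0.20}  # P(A | growth state)

# Law of total probability: P(A) = sum over states of P(A|state) P(state)
p_A = sum(priors[s] * likelihoods[s] for s in priors)   # 0.45

# Posterior for high growth: P(H|A) = P(A|H) P(H) / P(A)
p_H_given_A = priors["H"] * likelihoods["H"] / p_A      # 0.21/0.45

print(round(p_H_given_A, 3))   # 0.467
```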

Example 2-11 (Tree Diagram)

Prior           Conditional       Joint
Probabilities   Probabilities     Probabilities

P(H) = 0.30     P(A|H) = 0.70     P(A ∩ H) = (0.30)(0.70) = 0.21
                P(Ā|H) = 0.30     P(Ā ∩ H) = (0.30)(0.30) = 0.09
P(M) = 0.50     P(A|M) = 0.40     P(A ∩ M) = (0.50)(0.40) = 0.20
                P(Ā|M) = 0.60     P(Ā ∩ M) = (0.50)(0.60) = 0.30
P(L) = 0.20     P(A|L) = 0.20     P(A ∩ L) = (0.20)(0.20) = 0.04
                P(Ā|L) = 0.80     P(Ā ∩ L) = (0.20)(0.80) = 0.16

2-8 The Joint Probability Table

* A joint probability table is similar to a contingency table, except that it has probabilities in place of frequencies.
* The joint probability table for Example 2-11 is shown on the next slide.
* The row totals and column totals are called marginal probabilities.


The Joint Probability Table - Example 2-11

The joint probability table for Example 2-11 is summarized below.

                      Dollar        Dollar does
                      appreciates   not appreciate   Total
High growth (H)          0.21           0.09          0.30
Moderate growth (M)      0.20           0.30          0.50
Low growth (L)           0.04           0.16          0.20
Total                    0.45           0.55          1.00

Marginal probabilities are the row totals and the column totals.

2-9 Using the Computer

[Excel templates for the probability computations in this chapter, including the probability of at least one success and the Bayesian revision of probabilities]