
Probability and Statistics (MATH F113)

Pradeep Boggarapu
Department of Mathematics
BITS PILANI K K Birla Goa Campus, Goa

January 18, 2018

Pradeep Boggarapu (Dept. of Maths) Probability and Statistics January 18, 2018 1 / 24
Text Book:
Introduction to Probability and Statistics, ‘Principles and
applications for engineering and the computing sciences’ by J. S.
Milton and J. C. Arnold, 4th ed., Tata McGraw-Hill Pub.
References:
1 An Introduction to Probability Theory and Its Applications, Vol. 1 & 2, by W. Feller, 3rd ed., John Wiley & Sons, 2008.
2 A First Course in Probability by Sheldon M. Ross, 7th ed., Prentice Hall, 2002.
3 Miller & Freund's Probability & Statistics for Engineers by Richard A. Johnson, 6th ed., Pearson Education Inc., First Indian Reprint, 2001.
4 Introduction to Mathematical Statistics by Hogg, R. V. and
Craig, A, Pearson Education, 2005.
Teachers:
1 Dr. Pradeep Boggarapu (CC-113)
Lecture for Sec. L2 and Tutorial for Sec. T5.
2 Dr. Gauranga Charan Samanta (CC-115)
Lecture for Sec. L2 and Tutorial for Sec. T4.
3 Godinho Aloysius Querobino / Jai Tusharr
Tutorial for Sec. T6.
Evaluation Scheme:

Components          Duration        Weightage   Nature
1. Mid-term         1 Hr. 30 Min.   90          CB
2. Compre.          3 Hrs.          130         CB
3. Surprise tests*  30 Min.         80          OB

CB: Closed Book, OB: Open Book

Miscellaneous

Chamber consultation hour:
Tuesday and Thursday, 11 AM to 12:30 PM (CC-113).

Mail me at "pradeepb@goa.bits-pilani.ac.in" to find me.

All notices regarding the course MATH F113 will be
displayed on the online course platform (Moodle/LMS).

Introduction to Probability

Outline

1 Basic terminology

2 Axioms of probability and further properties

3 Conditional probability

4 Bayes’ theorem

5 Independent events

Basic Terminology

What is Probability?
Ans. The measure of the chance that an event occurs in an experiment.
Random Experiment. An experiment or a process for which the outcome cannot be predicted with certainty. Although the outcome of the experiment will not be known in advance, the set of all possible outcomes is known.
Sample Space. The set of all possible outcomes of a
random experiment is known as the sample space of
the experiment and is denoted by S.
Basic Terminology

Events. Any subset E of the sample space S of a random experiment is known as an event.
Algebra of Events. Unions and intersections of finitely many events are events. The complement of an event is an event.
Mutually exclusive events. The collection of events {E1, E2, E3, · · · } is said to be mutually exclusive if Ei ∩ Ej = ∅ for all i ≠ j.

Axioms of Probability
Probability is a function $P : 2^S \to \mathbb{R}$ satisfying:
Axiom 1.
0 ≤ P(E ) ≤ 1

Axiom 2.
P(S) = 1

Axiom 3. For any sequence of mutually exclusive events E1, E2, . . .,
$$P\Bigl(\bigcup_{j=1}^{n} E_j\Bigr) = \sum_{j=1}^{n} P(E_j),$$
for any positive integer n.


Axioms of Probability

Definition 0.1.
Classical Formula. Let S be a finite sample space of a random experiment having equally likely outcomes. Then for any event E ⊂ S,
$$P(E) = \frac{n(E)}{n(S)}.$$

Example 1. A committee of size 5 is to be selected from a group of 6 men and 9 women. If the selection is made randomly, what is the probability that the committee consists of 3 men and 2 women?
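The classical formula reduces Example 1 to a count of combinations. A quick numerical check (a sketch in Python, using the standard library's `math.comb`):

```python
from math import comb

# n(E): choose 3 of 6 men and 2 of 9 women; n(S): choose any 5 of 15 people
p = comb(6, 3) * comb(9, 2) / comb(15, 5)
print(p)  # 720/3003 ≈ 0.2398
```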
Further Properties

Let E and F be two events from a sample space S. Then
1 P(∅) = 0.
2 P(E^c) = 1 − P(E).
3 P(E ∪ F) = P(E) + P(F) − P(E ∩ F).
4 If E ⊂ F, then P(E) ≤ P(F) and P(F \ E) = P(F) − P(E).

Problems

1 Example 2. A total of 28 percent of American males smoke cigarettes, 7 percent smoke cigars, and 5 percent smoke both cigars and cigarettes. What percentage of males smoke neither cigars nor cigarettes?
2 Example 3. The probability that a dealer will sell at least 20 televisions in a day is 0.45 and the probability that he will sell fewer than 24 televisions is 0.74. What is the probability that he will sell 20, 21, 22, or 23 televisions during the day?
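For Example 3, one line of reasoning: the events "sells at least 20" and "sells fewer than 24" together cover the whole sample space, so inclusion-exclusion gives the probability of their intersection, which is exactly "sells 20, 21, 22, or 23". A sketch of the arithmetic:

```python
p_at_least_20 = 0.45
p_less_than_24 = 0.74

# P(A ∩ B) = P(A) + P(B) - P(A ∪ B), and here P(A ∪ B) = 1
p_20_to_23 = p_at_least_20 + p_less_than_24 - 1.0
print(p_20_to_23)  # 0.19 (up to floating-point rounding)
```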

Problems

Example 4. Why does it pay to bet consistently on seeing a 6 at least once in 4 throws of a die, but not on seeing a double six at least once in 24 throws of two dice?
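Example 4 is the classical de Méré problem, and the complement rule from the properties above settles it. A sketch:

```python
# P(at least one six in 4 throws) = 1 - P(no six in 4 throws)
single = 1 - (5 / 6) ** 4
# P(at least one double six in 24 throws) = 1 - P(no double six)
double = 1 - (35 / 36) ** 24
print(single, double)  # ≈ 0.5177 and ≈ 0.4914: only the first bet is favourable
```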
Example 5. The probability of a horse A winning a race is 1/5 and the probability of another horse B winning the race is 1/4. What is the probability that (i) either of them will win, (ii) neither of them will win?
Example 6. The sum of two non-negative quantities is
equal to 2n. Find the chance that their product is not
less than 3/4 times their greatest product.
Conditional Probability

Definition 0.2.
Let E and F be events such that P(F) ≠ 0. The conditional probability of E given F, denoted by P(E | F), is defined as
$$P(E \mid F) = \frac{P(E \cap F)}{P(F)}.$$

Example 7. Roll a die and observe the number. Let E be the event that the die shows an odd number and F the event that the die shows at least 4. What is P(E | F)?
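Example 7 can be checked by enumerating the six equally likely outcomes; `fractions.Fraction` keeps the answer exact. A sketch:

```python
from fractions import Fraction

S = set(range(1, 7))
E = {s for s in S if s % 2 == 1}   # die shows an odd number
F = {s for s in S if s >= 4}       # die shows at least 4

# With equally likely outcomes, P(E|F) = |E ∩ F| / |F|
p = Fraction(len(E & F), len(F))
print(p)  # 1/3
```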

Total Probability Rule

Note that P(E ∩ F ) = P(F )P(E |F ). In general,

P(E1 E2 E3 · · · En )
= P(E1 )P(E2 |E1 )P(E3 |E1 E2 ) · · · P(En |E1 E2 · · · En−1 ),

which is known as ‘multiplication rule’.

Theorem 0.3 (Total Probability Rule).
Let E1, E2, E3, . . . , En be a collection of mutually exclusive events whose union is the sample space S, with P(Ej) > 0 for each j. Let E be any event; then
$$P(E) = \sum_{j=1}^{n} P(E \mid E_j)\, P(E_j).$$

Bayes’ Theorem

Theorem 0.4 (Bayes' Theorem).
Let E1, E2, E3, . . . , En be a collection of mutually exclusive events whose union is the sample space S. Let E be any event such that P(E) ≠ 0. Then for any event Ek, k = 1, 2, 3, . . . , n,
$$P(E_k \mid E) = \frac{P(E \mid E_k)\, P(E_k)}{\displaystyle\sum_{j=1}^{n} P(E \mid E_j)\, P(E_j)}.$$

Problems

Example 8. Bag I contains 4 white and 6 black balls, while Bag II contains 4 white and 3 black balls. One ball is drawn at random from one of the bags. (i) What is the probability that the ball drawn is white? (ii) If the drawn ball is found to be black, what is the probability that it was drawn from Bag I?
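Example 8 combines the total probability rule with Bayes' theorem. A sketch, assuming each bag is chosen with probability 1/2 (the usual reading of "at random from one of the bags"):

```python
from fractions import Fraction

p_bag = Fraction(1, 2)                       # assumed: bags equally likely
p_white = {1: Fraction(4, 10), 2: Fraction(4, 7)}
p_black = {1: Fraction(6, 10), 2: Fraction(3, 7)}

# (i) total probability rule
p_w = p_bag * p_white[1] + p_bag * p_white[2]
# (ii) Bayes' theorem
p_b = p_bag * p_black[1] + p_bag * p_black[2]
p_bag1_given_black = p_bag * p_black[1] / p_b
print(p_w, p_bag1_given_black)  # 17/35 7/12
```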
Problem 9. A sign reads "ARKANSAS". Three letters are removed and put back into the three empty spaces at random. What is the probability that the sign still reads "ARKANSAS"?

Independent Events

Definition 0.5 (Independent Events).
Two events E and F from a sample space S are said to be 'independent' if P(EF) = P(E)P(F). Two events E and F are said to be 'dependent' if they are not independent.
Example 10. A card is selected at random from an ordinary deck of 52 playing cards. If E is the event that the selected card is an ace and F is the event that it is a spade, then E and F are independent.
Note that if E and F are independent, then
P(E | F) = P(E) and P(F | E) = P(F).
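The ace/spade claim in Example 10 can be verified by enumerating the 52 equally likely cards; a sketch:

```python
from fractions import Fraction
from itertools import product

ranks = range(1, 14)                         # 1 stands for the ace
suits = ("spade", "heart", "diamond", "club")
deck = list(product(ranks, suits))           # 52 equally likely cards

E = {c for c in deck if c[0] == 1}           # ace
F = {c for c in deck if c[1] == "spade"}     # spade
pE = Fraction(len(E), len(deck))
pF = Fraction(len(F), len(deck))
pEF = Fraction(len(E & F), len(deck))
print(pEF == pE * pF)  # True, so E and F are independent
```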
Independent Events

Definition 0.6.
The three events E , F and G are said to be (mutually)
independent if
P(EFG ) = P(E )P(F )P(G )
P(EF ) = P(E )P(F )
P(FG ) = P(F )P(G )
P(GE ) = P(G )P(E ).

We may also extend the definition of independence to


more than three events.

Independent Events

For instance, the events E1 , E2 , E3 , . . . , En are said to


be independent if, for any collection Er1 , Er2 , . . . , Erk ,
k ≤ n from these events

P(Er1 Er2 · · · Erk ) = P(Er1 )P(Er2 ) · · · P(Erk ).

Theorem 0.7.
Let E , F and G be three events from a sample space S.
1 If E and F are independent, then so are E and F c .
2 If E , F and G are independent, then E is independent
of F ∪ G .
Problems

Example 11. An infinite sequence of independent trials is


to be performed. Each trial results in a success with
probability p and a failure with probability 1 − p. What is
the probability that
1 at least 1 success occurs in the first n trials;
2 exactly k successes occur in the first n trials;
3 all trials result in successes?
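The standard closed forms for the first two parts of Example 11 follow from independence (for the third, the probability is the limit of p^n as n grows, which is 0 whenever p < 1). A sketch of the first two:

```python
from math import comb

def at_least_one_success(n, p):
    # complement of "no successes in n independent trials"
    return 1 - (1 - p) ** n

def exactly_k_successes(n, k, p):
    # choose which k of the n trials succeed
    return comb(n, k) * p**k * (1 - p) ** (n - k)

print(at_least_one_success(4, 1 / 6))  # the favourable die bet of Example 4
```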

Problems

Example 12. In a lecture hall there are 4 class I boys, 6 class I girls and 6 class II boys. How many class II girls must be present in the lecture hall if the events 'boy' and 'class I' are to be independent when a student is selected at random? Assume that there are only two types of students in the lecture hall, namely class I and class II.

Thank you for your attention

Random Variables. Discrete Random Variables

Outline

1 Definitions of random variable and discrete random variable.
2 Density function and cumulative distribution of a RV.
3 Expectation and distribution parameters (variance, standard deviation, moments and the moment generating function).
Random Variables

Definition 0.1.
Random Variable. A random variable is a real-valued function defined on a sample space S. We use uppercase letters to denote a random variable and lowercase letters to denote the numerical values taken by the random variable (rv).

Example 1. Suppose that our experiment consists of


tossing 3 fair coins. If we let Y denote the number of
heads that appear, then Y is a random variable taking
one of the values 0, 1, 2, and 3.

Examples for RV

Example 2. Consider the experiment of throwing two dice. Let X denote the sum of the numbers shown by the dice. Then X is a random variable which takes the values 2, 3, 4, . . . , 12.
Notation: P[X ∈ I] = P[{s ∈ S : X(s) ∈ I}], the probability that the rv X takes values in I ⊂ ℝ.
Example 3. Three balls are to be randomly selected without replacement from an urn containing 20 balls numbered 1 through 20. If we bet that at least one of the balls drawn has a number as large as or larger than 17, what is the probability that we win the bet?
Discrete random variable
Definition 0.2 (Discrete random variable).
A random variable is discrete if it can assume at most a finite or countably infinite number of possible values.

The random variables discussed in the above examples are


discrete random variables.
Definition 0.3 (Probability density function or mass function).
For a discrete random variable X , we define the
probability density function f (x) of X by

f (x) = P(X = x)

where x is a real number.


Probability density function

Remark 0.4.
A real-valued function f(x) is a probability density function for a discrete random variable if and only if
1 f(x) ≥ 0 for all x,
2 $\sum_{\text{all } x} f(x) = 1$.
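The two conditions of Remark 0.4 are easy to check mechanically on a finite support; a small helper sketch:

```python
def is_pmf(f, support, tol=1e-12):
    # Remark 0.4: nonnegative everywhere and total mass 1
    values = [f(x) for x in support]
    return all(v >= 0 for v in values) and abs(sum(values) - 1) < tol

print(is_pmf(lambda x: 1 / 6, range(1, 7)))  # True for the fair-die density
```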

Problems

Example 4. Write down the probability mass functions and verify the above remark for the random variables defined in Example 1, Example 2 and Example 3.
Example 5. Five distinct numbers are randomly distributed to players numbered 1 through 5. Whenever two players compare their numbers, the one with the higher number is declared the winner. Initially, players 1 and 2 compare their numbers; the winner then compares with player 3, and so on. Let X denote the number of times player 1 is a winner. Find P(X = i) for i = 0, 1, 2, 3, 4.
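Example 5 yields to brute force: with 5 distinct numbers only the 5! = 120 equally likely rankings matter, so the pmf of X can be tabulated exactly. A sketch:

```python
from collections import Counter
from fractions import Fraction
from itertools import permutations

counts = Counter()
for perm in permutations(range(5)):          # perm[i] = number held by player i+1
    wins = 0
    current = 0 if perm[0] > perm[1] else 1  # winner of players 1 vs 2
    if current == 0:
        wins += 1
    for nxt in range(2, 5):                  # the winner meets players 3, 4, 5 in turn
        current = current if perm[current] > perm[nxt] else nxt
        if current == 0:
            wins += 1
    counts[wins] += 1

pmf = {i: Fraction(counts[i], 120) for i in range(5)}
for i in range(5):
    print(i, pmf[i])  # 1/2, 1/6, 1/12, 1/20, 1/5
```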

Cumulative distribution function- Discrete

Definition 0.5 (Cumulative distribution function (cdf)).
Let X be a discrete random variable with density f. The cumulative distribution function for X, denoted by F, is defined by
$$F(x) = P[X \le x] = \sum_{a \le x} f(a), \qquad x \in \mathbb{R}.$$

Example 6. Find the cdf for the random variables defined in Example 1 and Example 2.

Expectation of a random variable

Definition 0.6 (Expected value of a random variable X).
Let X be a discrete random variable with density function f(x). The expectation or expected value of X, denoted by E[X], is defined by
$$E[X] = \sum_{\text{all } x} x\, f(x).$$

Example 7. Find E[X] where X is the outcome when we roll a fair die.
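Example 7 is a direct application of the definition; a sketch with exact arithmetic:

```python
from fractions import Fraction

f = Fraction(1, 6)                  # density of a fair die: each face 1/6
E = sum(x * f for x in range(1, 7))
print(E)  # 7/2
```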

Expectation of a random variable

Note that the expected value is also known as the mean, and sometimes we use 'µ' to denote the expectation (expected value or mean).
Let X be a discrete random variable with density function f(x), and let H(X) be a real-valued function of X. Then H(X) is a random variable and its expectation is given by
$$E[H(X)] = \sum_{\text{all } x} H(x)\, f(x).$$

Variance and Standard deviation of random variable

Definition 0.7 (Variance and standard deviation).
Let X be a discrete random variable with mean µ.
1 The variance of X, denoted by Var[X] or σ², is defined by
Var[X] = σ² = E[(X − µ)²].
2 The standard deviation of X, denoted by σ, is defined by
$$\sigma = \sqrt{\mathrm{Var}[X]}.$$

Rules for expectations

Theorem 0.8.
Let X and Y be two discrete random variables and c be any real number.
1 E[c] = c.
2 E[cX] = cE[X].
3 E[X + Y] = E[X] + E[Y].

Corollary 0.9.

Var [X ] = E [X 2 ] − (E [X ])2 .

Rules for variance

Theorem 0.10.
Let X and Y be two discrete random variables and c be any real number.
1 Var[c] = 0.
2 Var[cX] = c² Var[X].
3 Var[X + Y] = Var[X] + Var[Y], provided X and Y are independent random variables.

Problem

Example 8. A fair die is tossed. Let X be the random variable denoting 'twice the number appearing' and let Y be the random variable that takes the value 1 or 3 according as an odd or an even number appears. Find the pmf, expectation and variance of the random variables X, Y, Z = X + Y and W = X · Y. Also, verify the following:
1 E[Z] = E[X] + E[Y].
2 Var[Z] ≠ Var[X] + Var[Y].
3 E[W] ≠ E[X] · E[Y].
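Example 8 can be worked mechanically from the definitions above. The sketch below tabulates each variable over the six equally likely die faces and checks the three claims (X and Y here are not independent, which is why the variance rule fails):

```python
from fractions import Fraction

def stats(values):
    # values: list of (value, probability) pairs of a discrete rv
    mean = sum(v * p for v, p in values)
    second = sum(v * v * p for v, p in values)
    return mean, second - mean * mean       # E and Var

def y(n):
    return 1 if n % 2 else 3                # Y: 1 on odd faces, 3 on even

p = Fraction(1, 6)
faces = range(1, 7)
X = [(2 * n, p) for n in faces]
Y = [(y(n), p) for n in faces]
Z = [(2 * n + y(n), p) for n in faces]      # Z = X + Y on the same outcome
W = [(2 * n * y(n), p) for n in faces]      # W = X · Y on the same outcome

(EX, VX), (EY, VY) = stats(X), stats(Y)
(EZ, VZ), (EW, VW) = stats(Z), stats(W)
print(EZ == EX + EY, VZ != VX + VY, EW != EX * EY)  # True True True
```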

Moments and moment generating function (mgf)

Definition 0.11 (Moments and mgf).
Let X be a discrete random variable with density function f(x).
1 The kth moment of X is defined as E[X^k].
2 The moment generating function for X is denoted by m_X(t) and is defined by
$$m_X(t) = E[e^{tX}],$$
provided this expectation is finite for all real numbers t in some open interval (−h, h).

Moments and moment generating function (mgf)
Example 9. Two balls are randomly chosen from an urn containing 2 white, 2 red, and 4 black balls. Suppose that we win Rs. 1 for each white ball selected and lose Rs. 1 for each red ball selected. If we let X denote our total winnings from the experiment, find the first and second moments of X and the mgf of X.
Theorem 0.12.
If m_X(t) is the moment generating function for a random variable X, then the kth moment of X is given by
$$E[X^k] = \left.\frac{d^k m_X(t)}{dt^k}\right|_{t=0}.$$
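Theorem 0.12 can be sanity-checked numerically: a central difference of the mgf at t = 0 should reproduce the first moment. A sketch using the fair-die mgf:

```python
from math import exp

def mgf(t):
    # mgf of a fair-die outcome: (1/6) Σ e^{tx}, x = 1..6
    return sum(exp(t * x) for x in range(1, 7)) / 6

h = 1e-6
first_moment = (mgf(h) - mgf(-h)) / (2 * h)  # central difference ≈ m'(0) = E[X]
print(first_moment)  # ≈ 3.5
```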

Thank you for your attention

Standard Examples for Discrete Random Variables

Outline

1 Bernoulli and Binomial random variables.


2 Geometric random variable.
3 Poisson random variable.
4 Hypergeometric random variable.

Bernoulli random variable
Definition 0.1 (Bernoulli trial).
A random experiment or trial whose outcome can be classified as either a success or a failure is called a Bernoulli trial.
In a Bernoulli trial, define a random variable X by X = 1 when the outcome is a success and X = 0 when it is a failure; then X is called a Bernoulli random variable.
If p is the probability that the trial is a success, then the probability mass function is given by
$$f(x) = p^{x} (1-p)^{1-x}, \qquad x = 0, 1.$$
Binomial random variable

Consider n Bernoulli trials which are independent and identical, in the sense that the outcome of one trial has no effect on the outcome of any other and the probability of success, p, 0 ≤ p ≤ 1, remains the same from trial to trial.
If X denotes the number of successes that occur in the n trials, X is said to be a binomial random variable with parameters (n, p).
The pmf of a binomial random variable with parameters (n, p) is given by
$$f(x) = \binom{n}{x} p^{x} (1-p)^{n-x}, \qquad x = 0, 1, 2, \ldots, n.$$
Examples

Example 1. Five fair coins are flipped. If the outcomes are assumed independent, find the probability mass function of the number of heads obtained. Also find the probability that at least two heads are obtained.
Example 2. It is known that disks produced by a certain
company will be defective with probability 0.01
independently of each other. The company sells the disks
in packages of 10 and offers a money-back guarantee that
at most 1 of the 10 disks is defective. (i) What proportion
of packages is returned? (ii) If someone buys three
packages, what is the probability that exactly one of them
will be returned?
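Example 2 is a binomial computation at two levels: a package is returned when more than 1 of its 10 disks is defective, and the three packages are then three Bernoulli trials with the return probability as success probability (assuming packages behave independently). A sketch:

```python
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# (i) a package is returned when more than 1 of its 10 disks is defective
p_return = 1 - binom_pmf(0, 10, 0.01) - binom_pmf(1, 10, 0.01)
# (ii) exactly one of three independent packages is returned
p_one_of_three = binom_pmf(1, 3, p_return)
print(p_return, p_one_of_three)
```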
Mean, Variance and Mgf of Binomial RV

Theorem 0.2.
If X is a binomial random variable with parameters (n, p), then
1 E(X) = np,
2 Var(X) = np(1 − p),
3 the mgf of X is given by m_X(t) = (pe^t + 1 − p)^n.

Proof. Note that the pmf of X is given by
$$f_X(x) = \binom{n}{x} p^{x} (1-p)^{n-x}, \qquad x = 0, 1, 2, \ldots, n.$$

Mean, Variance and Mgf of Binomial RV
$$\begin{aligned}
E(X) &= \sum_{x=0}^{n} x \binom{n}{x} p^{x} (1-p)^{n-x}
     = \sum_{x=0}^{n} x\,\frac{n!}{x!\,(n-x)!}\, p^{x} (1-p)^{n-x} \\
     &= np \sum_{x=1}^{n} \frac{(n-1)!}{(x-1)!\,(n-x)!}\, p^{x-1} (1-p)^{n-x} \\
     &= np \sum_{j=0}^{n-1} \frac{(n-1)!}{j!\,(n-1-j)!}\, p^{j} (1-p)^{n-1-j} \qquad (j = x-1) \\
     &= np\,(p + 1 - p)^{n-1} = np.
\end{aligned}$$

Mean, Variance and Mgf of Binomial RV
$$\begin{aligned}
E(X^2) &= \sum_{x=0}^{n} x^{2} \binom{n}{x} p^{x} (1-p)^{n-x}
       = \sum_{x=0}^{n} x^{2}\,\frac{n!}{x!\,(n-x)!}\, p^{x} (1-p)^{n-x} \\
       &= np \sum_{x=1}^{n} (x-1+1)\,\frac{(n-1)!}{(x-1)!\,(n-x)!}\, p^{x-1} (1-p)^{n-x} \\
       &= np \sum_{x=1}^{n} (x-1)\,\frac{(n-1)!}{(x-1)!\,(n-x)!}\, p^{x-1} (1-p)^{n-x} \\
       &\qquad + np \sum_{x=1}^{n} \frac{(n-1)!}{(x-1)!\,(n-x)!}\, p^{x-1} (1-p)^{n-x} \\
       &= n(n-1)p^{2} + np = np(1-p) + n^{2} p^{2}.
\end{aligned}$$

Therefore, Var(X) = E[X²] − (E[X])² = np(1 − p), since E[X] = np.
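The closed forms E(X) = np and Var(X) = np(1 − p) derived above can be confirmed with exact arithmetic on a small case; a sketch:

```python
from fractions import Fraction
from math import comb

n, p = 8, Fraction(1, 3)
f = {x: comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(n + 1)}

mean = sum(x * fx for x, fx in f.items())
var = sum(x * x * fx for x, fx in f.items()) - mean**2
print(mean == n * p, var == n * p * (1 - p))  # True True
```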

Mean, Variance and Mgf of Binomial RV

The moment generating function is given by
$$m_X(t) = E[e^{tX}] = \sum_{x=0}^{n} e^{tx} \binom{n}{x} p^{x} (1-p)^{n-x}
= \sum_{x=0}^{n} \binom{n}{x} (pe^{t})^{x} (1-p)^{n-x} = (pe^{t} + 1 - p)^{n}.$$

Cumulative Distribution Function of Binomial RV

Remark 0.3.
The cdf of a binomial random variable X with parameters (n, p) is given by
$$F(x) = \begin{cases}
0, & x < 0, \\
\displaystyle \sum_{j=0}^{\lfloor x \rfloor} \binom{n}{j} p^{j} (1-p)^{n-j}, & 0 \le x < n, \\
1, & x \ge n.
\end{cases}$$
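Remark 0.3 translates directly into code: the cdf is a step function, constant between consecutive integers. A sketch:

```python
from math import comb, floor

def binom_cdf(x, n, p):
    # piecewise cdf of a binomial(n, p) random variable, as in Remark 0.3
    if x < 0:
        return 0.0
    if x >= n:
        return 1.0
    return sum(comb(n, j) * p**j * (1 - p) ** (n - j) for j in range(floor(x) + 1))

# the cdf only jumps at integers
print(binom_cdf(1.7, 10, 0.3) == binom_cdf(1, 10, 0.3))  # True
```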

Thank you for your attention
