
The American Heritage Dictionary defines Probability Theory as the branch of Mathematics that
studies the likelihood of occurrence of random events in order to predict the behavior of defined
systems.
Starting with this definition, it would (probably :-) be right to conclude that Probability
Theory, being a branch of Mathematics, is an exact, deductive science that studies uncertain
quantities related to random events. This might seem a strange marriage of mathematical
certainty and the uncertainty of randomness. On second thought, though, most people will agree
that a newly conceived baby has a 50-50 chance (an exact but, likely, inaccurate estimate) of
being, say, a girl rather than a boy.
Interestingly, a recent book by Marilyn vos Savant dealing with people's perception of
probability and statistics is titled The Power of Logical Thinking. My first problems will be
drawn from this book.
As with other mathematical problems, it's often helpful to experiment with a problem in order to
gain an insight into what the correct answer might be. By necessity, probabilistic experiments
require computer simulation of random events. It may sound like an oxymoron - a computer (i.e., a
deterministic device) producing random events - random numbers, in our case, to be exact. See if
you can convince yourself that your computer can credibly handle this task as well. A knowledgeable
reader would probably note that it is a program (albeit a deterministic one), and not the computer
itself, that does the random number simulation. That's right. It is I, and not your computer, who
is to blame if the simulation below does not exactly produce random numbers.
When you press the "Start" button below, the program will begin selecting at random. Every second
it will pick one of the three numbers 1, 2, or 3. You can terminate the process at any time by
pressing the "Stop" button. The frequencies of the selections appear in the corresponding input
boxes. Do they look random?

Remark
Actually, the process of selection includes no selection at all. As the mathematician Robert
Coveyou of the Oak Ridge National Laboratory put it, "The generation of random numbers
is too important to be left to chance." Instead, I have a function that is invoked every second.
Each time it's invoked, it produces one of the three numbers 1, 2, 3. This is how the function
works.
I start with an integer seed = 0. When a new random number is needed, the seed is replaced with
the result of the following operation
seed = (7621 × seed + 1) mod 9999
In other words, in order to get a new value of seed, multiply the old value by 7621, add 1, and,
finally, take the result modulo 9999. Now, assume, as in the example above, we need a random
selection from the triple 1, 2, 3. That is, we seek a random integer n satisfying 1 ≤ n ≤ 3. The
formula is
n = [3 × seed/9999] + 1.
Taking it step by step, dividing seed by 9999 produces a nonnegative real number between 0 and
1. This times 3 gives a real number between 0 and 3. The brackets denote the floor function: they
reduce the latter to the greatest integer not exceeding the number itself. The result is a
nonnegative integer that is less than 3. Adding 1 makes it one of the three numbers 1, 2, or 3.

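The generator just described is a linear congruential generator, and it is easy to reproduce. Here is a minimal sketch in Python (the function and variable names are mine, not part of the original applet):

```python
def next_seed(seed):
    # seed is replaced by (7621 * seed + 1) mod 9999
    return (7621 * seed + 1) % 9999

def pick(seed):
    # n = [3 * seed / 9999] + 1; the brackets are the floor,
    # which integer division already performs
    return 3 * seed // 9999 + 1

seed = 0
draws = []
for _ in range(12):
    seed = next_seed(seed)
    draws.append(pick(seed))

print(draws)  # every entry is 1, 2, or 3
```

Starting from seed = 0, the first few seeds are 1, 7622, 3072, ..., giving the selections 1, 3, 1, ... The sequence is completely determined by the starting seed, which is exactly the point of the Remark above.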
Teaching activities
Introduction
Hold up a 1-6 die. Tell the class you are going to throw the die 30 times and see how many
times each number is rolled. Draw a tally chart ready for the results and ask the class to
predict the outcome. Choose five children to roll the die 6 times each and record the results.
Were they as expected? Explain that all the numbers have an equal chance of being rolled. Could a
7 be rolled? How likely is this? Why? Write the following probability terms down on the board or a
poster for revision: impossible, unlikely, equal chance, likely, certain. Show the children the
coloured sticker dice. Using the probability terms on the board, ask the children questions. What
are the chances of rolling a yellow (unlikely)? What are the chances of rolling a red (likely)?
What are the chances of rolling a purple (impossible)?
Activity
Display the online activity on the class whiteboard. Demonstrate the first part of the activity and
then show the children how to use the lever on the probability machine. Small groups or pairs can
then work through the remainder of the activity on the class computers. While awaiting their
turn, the remaining groups can take a set of number cards 1-10, put them in a bag, and then draw
and replace a card 30 times. They should draw up a tally chart of their results, then convert this
to a tally chart of odd and even numbers. What are the chances of drawing a 5 (1 in 10, or one
tenth)? How many 5s do they predict should be drawn (3)? Is this the case? What are the chances of
drawing an odd number (1 in 2, or one half)? How many odd numbers do they predict should be
drawn (15)? Is this the case?
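The bag-of-cards activity can also be run as a quick simulation, for teachers who want to show a computer tally alongside the physical one (illustrative Python, not part of the lesson plan):

```python
import random
from collections import Counter

random.seed(1)  # fixed seed so the demonstration is repeatable

# Draw a card numbered 1-10 from the bag, replace it, 30 times in all
draws = [random.randint(1, 10) for _ in range(30)]

tally = Counter(draws)
fives = tally[5]                            # predicted about 30 * 1/10 = 3
odd = sum(1 for d in draws if d % 2 == 1)   # predicted about 30 * 1/2 = 15
print(tally, fives, odd)
```

As in the classroom, individual runs will wobble around the predicted counts; that wobble is itself worth discussing.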
Plenary
Ask a pair of children to talk to the class about their number card tally charts and results. Was it
as they expected?
Extension
Children can work through the online quiz or complete the Worksheet.
Homework
Ask children to roll a 1-6 die 50 times. What patterns do they notice? What are the chances of
rolling an odd number? An even number? A multiple of 4? A multiple of 2?

100 Prisoners and a Light Bulb


Some time ago, Ilia Denotkine posted the following problem on the CTK Exchange:
There are 100 prisoners in solitary cells. There's a central living room
with one light bulb; this bulb is initially off. No prisoner can see the light
bulb from his or her own cell. Every day, the warden picks a prisoner
equally at random, and that prisoner visits the living room. While there,
the prisoner can toggle the bulb if he or she wishes. Also, the prisoner
has the option of asserting that all 100 prisoners have been to the living
room by now. If this assertion is false, all 100 prisoners are shot.
However, if it is indeed true, all prisoners are set free and inducted into
MENSA, since the world could always use more smart people. Thus, the
assertion should only be made if the prisoner is 100% certain of its
validity. The prisoners are allowed to get together one night in the
courtyard, to discuss a plan. What plan should they agree on, so that
eventually, someone will make a correct assertion?

He then added a background to his question:

I have seen this problem on the forums, and here are some of the best solutions (in
my opinion):
1. At the beginning, the prisoners select a leader. Whenever a person (with the
exception of the leader) comes into a room, he turns the lights on (but he
does this only once). If the lights are already on, he does nothing. When the
leader goes into the room, he turns off the lights. When he has turned
off the lights 99 times, he is 100% sure that everyone has been in the room.
2. Wait 3 years, and with great probability say that everyone has been in the
room.
Does anyone know the optimal solution?
I have taken this problem from the www.ocf.berkeley.edu site, but I believe
that you can find it on many others.
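Solution 1 (the leader protocol) is easy to check by simulation. The sketch below is my own illustration in Python, not part of the original post; prisoner 0 plays the leader.

```python
import random

def leader_protocol(n, rng):
    """Simulate solution 1: every non-leader switches the light on
    once; the leader switches it off and counts to n - 1."""
    light = False
    used_switch = [False] * n   # non-leader i already used his one switch-on
    visited = [False] * n
    count = 0                   # leader's tally of switch-offs
    days = 0
    while True:
        days += 1
        p = rng.randrange(n)    # warden picks a prisoner equally at random
        visited[p] = True
        if p == 0:              # the leader
            if light:
                light = False
                count += 1
                if count == n - 1:
                    # the leader now asserts everyone has visited
                    return days, all(visited)
        elif not light and not used_switch[p]:
            light = True
            used_switch[p] = True

rng = random.Random(2024)
days, assertion_is_true = leader_protocol(100, rng)
print(days, assertion_is_true)
```

By construction the assertion is never premature, which is the point of the protocol; a single run also shows why it is slow, typically taking thousands of days.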

As I had a recollection of seeing this problem in [Winkler], I replied

The problem is indeed popular. It's even included in P. Winkler's Mathematical
Puzzles, which is a recommended book in any event. Winkler also lists a slew of
sources where the problem appeared, including ibm.com and a newsletter of the
MSRI.
The solution is this:
The prisoners select a fellow, say Alice, who will have a special responsibility. All
other prisoners behave according to the same protocol: each turns the light off twice,
i.e. they turn it off the first two times they find it on. They leave it untouched
thereafter. Alice turns the light on if it was off and, additionally, counts the number
of times she entered the room with the light off. When her count reaches 2n - 3 she
may claim with certainty that all n prisoners have been to the room.

As it happened, I was wrong. This may be immediately surmised from Stuart Anderson's
response. In my wonderment I contacted Peter Winkler who kindly set things straight for me.
The formulation in his book is somewhat different, but this difference proves to be of major
significance:

Each of n prisoners will be sent alone into a certain room, infinitely often, but in
some arbitrary order determined by their jailer. The prisoners have a chance to
confer in advance, but once the visits begin, their only means of communication will
be via a light in the room which they can turn on or off. Help them design a protocol
which will ensure that some prisoner will eventually be able to deduce that everyone
has visited the room.

Solution by Stuart Anderson


This would work, of course, but is it optimal? For instance, this would also work, I think:
Alice counts the times she finds the light on, and ensures that it is always off when she leaves the
room. Everyone else turns on the light the first time they find it off, and then never touches it
again. This way, between visits of Alice, at most one prisoner will turn on the light, and no
prisoner turns it on more than once. Therefore the number of times Alice finds the light on is no
more than the number of different prisoners that have entered the room. Each prisoner knows he
has been counted once he has turned the light on, since he is the only one who touched the switch
since Alice last visited. When Alice counts to n-1, she knows everyone has visited the room.
What does optimal mean here? It could only reasonably mean that the prisoners are freed in the
shortest time. So what is the expected time they must wait until Alice has counted to n-1? This is
a rather elaborate calculation in probability, so the prisoners turn to the actuary (who is in prison
for embezzlement) for some answers.
He explains that using Bayes' theorem,
P(X|Y)·P(Y) = P(X&Y) = P(Y|X)·P(X)
and the law of total expectation,
E(X|Y)·P(Y) + E(X|~Y)·P(~Y) = E(X)
you can calculate the expected time in prison like this:
Suppose Alice has just visited the room, and let K be the number of days that pass before her
next visit (so she visits again K+1 days from now), let n be the number of prisoners, let c be the
number of times she has found the light on so far, and let P(ON) and P(OFF) be the probabilities
that she finds the light on or off on her next visit. Then E(K) = n - 1, P(K = k) = (1/n)·((n-1)/n)^k, and
P(K = k & OFF) = (1/n)·(c/n)^k, which are fairly obvious.
Summing the last formula over all k gives P(OFF) = 1/(n-c). Bayes' theorem then gives
P(K = k | OFF) = (1 - c/n)·(c/n)^k, from which you can calculate E(K|OFF) = c/(n-c), and total
expectation gives E(K|ON) = ((n-1)(n-c) - c/(n-c))/(n-c-1).
Now let m be the number of times Alice visits and L be the number of days that pass before she
next finds the light on. Each time she finds it off, c does not change, so all the calculations
regarding the time until her next visit also do not change.
Therefore, the expected number of days until she next finds the light on is found by summing
over all possible m to get the expected total time wasted on visits where the light is off, plus the
expected time for the one visit where it was on. This gives
E(L) = (1 + E(K|ON))·P(ON) + Σ_m m·(1 + E(K|OFF))·P(OFF)^m
     = n·(1/(n-c-1) - 1/(n-c) + 1 - 1/(n-c)^2).
Now we know how long we expect to wait from count = c to count = c+1. Therefore, we must
sum this up from c = 0 to c = n-2 to find the total expected time E(T). The result is
E(T) = n^2 - n/(n-1) - a, where a = n·Σ(1/c^2) taken from c = 2 to n. Putting n = 100 into this
gives 9935.5 days, which is about 27.2 years.
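Summing the E(L) expression over c = 0, ..., n-2 is mechanical, so the 9935.5 figure is easy to check numerically. This short sketch (mine, not the actuary's) uses the E(L) formula exactly as stated above:

```python
def expected_total_days(n):
    # E(T) = sum of E(L) for c = 0 .. n-2, with
    # E(L) = n*(1/(n-c-1) - 1/(n-c) + 1 - 1/(n-c)^2)
    return sum(
        n * (1 / (n - c - 1) - 1 / (n - c) + 1 - 1 / (n - c) ** 2)
        for c in range(n - 1)
    )

et = expected_total_days(100)
print(round(et, 1))            # about 9935.5 days
print(round(et / 365.25, 1))   # about 27.2 years
```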
But (continues the actuary) this is absurdly long to wait. Simple probability shows that we can be
almost certain much sooner than this. The probability that on day d the count is c is P(c, d),
which is obviously equal to P(c-1,d-1)·(1-(c-1)/n) + P(c, d-1)·(c/n). Of course, P(0, 0) = P(1, 1) =
1 and P(1,0) = 0, so we can recursively calculate the probability P(n, d). It turns out that
P(100,1146) = 0.999, P(100,1375) = 0.9999, P(100,1604) = 0.99999, and P(100,1833) =
0.999999. That means that in 3.14 years, we have a less than 1/1000 chance of failing, and in
exactly 5 years and a week, we have less than one in a million chances of failing. I say we should
wait 5 years and then say "let us out, we've all seen the light."
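The recurrence for P(c, d) is easy to run; the following sketch (mine, not the actuary's) reproduces the quoted figures:

```python
def prob_count_full(n, max_day):
    """P(c, d) = P(c-1, d-1)*(1 - (c-1)/n) + P(c, d-1)*(c/n),
    with P(0, 0) = 1. Returns P(n, d) for d = 1 .. max_day."""
    p = [0.0] * (n + 1)
    p[0] = 1.0
    result = {}
    for d in range(1, max_day + 1):
        q = [0.0] * (n + 1)
        for c in range(1, n + 1):
            q[c] = p[c - 1] * (1 - (c - 1) / n) + p[c] * (c / n)
        p = q
        result[d] = p[n]   # probability that all n have visited by day d
    return result

probs = prob_count_full(100, 1833)
print(round(probs[1146], 3))   # 0.999
```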
As they are about to kill Alice (who was already a member of Mensa) for coming up with a crazy
plan to keep them in prison for 27 years, the game theorist (who is in prison for insider trading
on the stock market) steps in to point out that this is a losing move. If they kill her now she will
never go into the room, and the warden will keep them there forever.
In the happy ending, they let Alice live, and they all get out of prison in 5 years. Strangely, they
all decline to join Mensa, preferring to enter actuarial training.

Example:
At a car park there are 100 vehicles, 60 of which are cars, 30 are vans and the remainder are
lorries. If every vehicle is equally likely to leave, find the probability of:
a) a van leaving first.
b) a lorry leaving first.
c) a car leaving second if either a lorry or a van has left first.
Solution:
a) Let S be the sample space and A be the event of a van leaving first.
n(S) = 100
n(A) = 30
Probability of a van leaving first:
P(A) = n(A)/n(S) = 30/100 = 3/10
b) Let B be the event of a lorry leaving first.
n(B) = 100 - 60 - 30 = 10
Probability of a lorry leaving first:
P(B) = n(B)/n(S) = 10/100 = 1/10
c) If either a lorry or a van had left first, then there would be 99 vehicles remaining, 60 of which
are cars. Let T be the sample space and C be the event of a car leaving.
n(T) = 99
n(C) = 60
Probability of a car leaving after a lorry or van has left:
P(C) = n(C)/n(T) = 60/99 = 20/33

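The three answers can be checked with exact fractions; this is just a sketch of the arithmetic above in Python:

```python
from fractions import Fraction

cars, vans, lorries = 60, 30, 100 - 60 - 30   # lorries = 10
total = cars + vans + lorries                 # 100 vehicles

p_van_first = Fraction(vans, total)           # a) 30/100 = 3/10
p_lorry_first = Fraction(lorries, total)      # b) 10/100 = 1/10
p_car_second = Fraction(cars, total - 1)      # c) 60/99  = 20/33

print(p_van_first, p_lorry_first, p_car_second)
```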
Example:
A survey was taken of 30 classes at a school to find the total number of left-handed students in
each class. The table below shows the results:

No. of left-handed students:   0   1   2   3   4   5
Frequency (no. of classes):    1   2   5  12   8   2
A class was selected at random.
a) Find the probability that the class has 2 left-handed students.
b) What is the probability that the class has at least 3 left-handed students?
c) Given that the total number of students in the 30 classes is 960, find the probability that a
student randomly chosen from these 30 classes is left-handed.
Solution:
a) Let S be the sample space and
A be the event of a class having 2 left-handed students.
n(S) = 30
n(A) = 5
P(A) = n(A)/n(S) = 5/30 = 1/6
b) Let B be the event of a class having at least 3 left-handed students.
n(B) = 12 + 8 + 2 = 22
P(B) = n(B)/n(S) = 22/30 = 11/15
c) First find the total number of left-handed students:

No. of left-handed students, x:   0   1   2   3   4   5
Frequency, f (no. of classes):    1   2   5  12   8   2
fx:                               0   2  10  36  32  10

Total no. of left-handed students = 2 + 10 + 36 + 32 + 10 = 90
Here, the sample space is the total number of students in the 30 classes, which was given as 960.
Let T be the sample space and C be the event that a student is left-handed.
n(T) = 960
n(C) = 90
P(C) = n(C)/n(T) = 90/960 = 3/32

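The same table calculations can be scripted directly (a sketch of the arithmetic above, nothing more):

```python
from fractions import Fraction

x = [0, 1, 2, 3, 4, 5]       # no. of left-handed students
f = [1, 2, 5, 12, 8, 2]      # frequency (no. of classes)

classes = sum(f)                                     # 30 classes
p_two = Fraction(f[2], classes)                      # a) 5/30
p_at_least_3 = Fraction(sum(f[3:]), classes)         # b) 22/30
left_handed = sum(xi * fi for xi, fi in zip(x, f))   # total fx = 90
p_left = Fraction(left_handed, 960)                  # c) 90/960

print(p_two, p_at_least_3, p_left)
```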
Probability and Area


Example:
ABCD is a square. M is the midpoint of BC and N is the midpoint of CD. A point is selected at
random in the square. Calculate the probability that it lies in the triangle MCN.

Solution:
Let 2x be the side length of the square.

Area of square = 2x × 2x = 4x^2

Since M and N are midpoints, CM = CN = x, so the area of triangle MCN is
(1/2) × x × x = x^2/2.

Probability = (x^2/2) / (4x^2) = 1/8

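Geometric probabilities like this one lend themselves to a quick Monte Carlo check. In the sketch below (my own illustration) the square is taken as the unit square with C at (1, 1), so M = (1, 1/2), N = (1/2, 1), and a random point lies inside triangle MCN exactly when x + y >= 3/2:

```python
import random

random.seed(0)
trials = 200_000
# Count uniformly random points of the unit square that land in MCN
hits = sum(
    1 for _ in range(trials)
    if random.random() + random.random() >= 1.5
)
estimate = hits / trials
print(estimate)   # close to 1/8 = 0.125
```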
Drawing Marbles

Date: 02/01/97 at 18:53:26


From: Kayla
Subject: probability

A jar contains two red marbles, three blue marbles, and four green
marbles. Niki draws one marble from the jar, and then Tom draws a
marble from those remaining. What is the probability that Niki draws a
green marble and Tom draws a blue marble? Express your answer as a
common fraction.

Date: 02/03/97 at 21:28:42


From: Doctor Wallace
Subject: Re: probability

Hi Kayla!

The probability of some event happening is expressed as a fraction or
a decimal from 0 to 1. A probability of 0 means the event can never
happen. A probability of 1 means the event is certain to happen.

One useful rule is that to find a basic probability, with all outcomes
equally likely, we make a fraction like this:

   number of chances of our event
   -------------------------------
       number of total chances

For example, suppose we have a jar with 4 red marbles and 6 blue. We
want to find the probability of drawing a red one at random. So our
event is "drawing a red marble." The probability of this is:

   number of red marbles (the chances of our event)
   --------------------------------------------------
   total marbles in jar (the number of total chances)

In our example, this is 4/10 which is 2/5, reduced. So the
probability of drawing a red marble is 2/5. This is because all the
outcomes are equally likely. That is, any individual marble has the
same chance of being drawn. If we numbered all the marbles, what is
the probability of picking out no. 5? Well, there is only 1 number 5
marble, and still 10 marbles in the jar, so the answer is 1/10.

Now suppose we have 2 events. Let's say that Niki is going to draw 1
marble, and then Tom is going to draw one from the remaining marbles.
What is the probability that Niki gets a blue one? What is the
probability that Tom gets a red one?

Again, we use our fraction. When Niki draws, there are 10 marbles in
the jar, of which 6 are blue, so her probability of drawing a blue is
6/10 or 3/5. After she draws, it is Tom's turn. But now there are
only 9 marbles left. 4 of these are red, so his probability of
drawing a red marble is 4/9.

Now, it is important to distinguish in any probability problem how
many events you have. Here we have figured the probability for TWO
events. The first is that Niki draws a blue marble. The second is
that Tom draws a red one AFTER Niki has drawn.

But, suppose we want to know the probability of the ONE event: "Niki
draws a blue marble AND Tom draws a red one." It seems like the same
question, but it isn't. The reason is that now we have more than one
way this could happen. We could have:

(1) Niki draws a blue, then Tom draws a blue
(2) Niki draws a blue, then Tom draws a red
(3) Niki draws a red, then Tom draws a blue
(4) Niki draws a red, then Tom draws a red

These are the only 4 possibilities. They are not all equally likely,
however. When we have ONE event which is made up of two separate
events with the word AND, we multiply the individual probabilities to
get the answer.

So, the probability of (2) above, is:

Niki draws a blue = (3/5) times Tom draws a red = (4/9)

which is 12/45 or 4/15.

How about the probability of (1)?

Well, we already figured the probability of Niki drawing blue; it's
3/5. How about Tom drawing blue also? Well, after Niki draws blue,
there are 9 marbles left, and 5 blue, so it's 5/9. And so 3/5 times
5/9 is 15/45, or 1/3.

See how to do it?

You should be able to do your problem, now. I got a bit lengthy here,
since I can't tell from your question if it's meant to be just 1 event
with an "and." I think that it is. But, if not, you can also figure
out just the individual probabilities for the two marble draws, as
well.
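Doctor Wallace's two worked cases can be verified by brute-force enumeration of the jar from his example (4 red, 6 blue). This is a sketch of mine, not part of the original answer:

```python
from fractions import Fraction
from itertools import permutations

marbles = ['R'] * 4 + ['B'] * 6   # the example jar

# All ordered pairs (Niki's marble, Tom's marble), drawn without replacement
pairs = list(permutations(range(len(marbles)), 2))

def prob(first, second):
    favorable = sum(1 for i, j in pairs
                    if marbles[i] == first and marbles[j] == second)
    return Fraction(favorable, len(pairs))

print(prob('B', 'R'))   # case (2): 3/5 * 4/9 = 4/15
print(prob('B', 'B'))   # case (1): 3/5 * 5/9 = 1/3
```

The four cases listed above are exhaustive, so their probabilities add to 1, which the enumeration also confirms.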

Gambler's Fallacy
Date: 03/14/2003 at 14:13:18
From: Kevin
Subject: Lottery: Betting same number vs randomly selecting a number.

A current co-worker and I are in a friendly disagreement about the
probability of selecting the winning number in any lottery, say pick
5. He states that he would rather bet the same set of five numbers
every time for x period of time, but I insist that the probability is
the same if you randomly select any set five numbers for the same
period of time. The only assumption we make here is betting one set of
numbers on any given day. Who is correct?

I tried explaining to him that the probability of betting on day one
is the same for both of us. On day two it is the same. On day three it
is the same, etc. Therefore the sum of the cumulative probabilities
will be the same for both of us.

Thank you for your anticipated response.


Date: 03/15/2003 at 03:29:05
From: Doctor Wallace
Subject: Re: Lottery: Betting same number vs randomly selecting a
number.

Hello Kevin,

You are correct. If you have the computer randomly select a different
set of 5 numbers to bet on every day, and your friend selects the same
set of numbers to bet on every day, then you both have exactly the
same probability of winning.

Tell your friend to think of the lottery as drawing with tickets
instead of balls. If the lottery had a choice of, say, 49 numbers,
then imagine a very large hat containing 1 ticket for every possible
combination of 5 numbers. 1, 2, 3, 4, 5; 1, 2, 3, 4, 6; etc.

On the drawing day, ONE ticket is pulled from the hat. It is equally
likely to be any of the C(49,5) tickets in the hat. (There would be
1,906,884 tickets in the hat in this case.)

Since both you and your friend have only ONE ticket in the hat, you
both have the same chance of winning.

On the next drawing day for the lottery, ALL the tickets are replaced.
Each lottery draw is an event independent of the others. That is to
say, the probability of any combination winning today has absolutely
NO effect on the probability of that or any other combination winning
tomorrow. Each and every draw is totally independent of the others.

The reason your friend believes that he has a better chance of winning
with the same set of numbers is probably due to something called the
"gambler's fallacy." This idea is that the longer the lottery goes
without your friend's "special" set of numbers coming up, the more
likely it is to come up in the future. The same fallacy is believed by
a lot of people about slot machines in gambling casinos. They hunt for
which slot hasn't paid in a while, thinking that that slot is more
likely to pay out. But, as the name says, this is a fallacy; pure
nonsense. A pull of the slot machine's handle, like the lottery draw,
is completely independent of previous pulls. The slot machine has no
memory of what has come before, and neither has the lottery. You might
play a slot machine for 2 weeks without hitting the big jackpot, and
someone else can walk in and hit it in the first 5 minutes of play.
People wrongly attribute that to "it was ready to pay out." In
reality, it's just luck. That's why they call it gambling. :)

This used to be a "trick" question on old math tests:

"You flip a fair coin 20 times in a row and it comes up heads every
single time. You flip the coin one more time. What is the probability
of tails on this last flip?"

Most people will respond that the chance of tails is now very high.
(Ask your friend and see what he says.) However, the true answer is
that the probability is 1/2. It's 1/2 on EVERY flip, no matter what
results came before. Like the slot machine and the lottery, the coin
has no memory.
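The "no memory" claim is easy to test empirically. The sketch below (my illustration) simulates a million fair flips and looks only at flips that immediately follow a run of five heads; their heads frequency is still about one half:

```python
import random

random.seed(7)
n = 1_000_000
flips = [random.random() < 0.5 for _ in range(n)]   # True = heads

# Collect every flip that follows five heads in a row
after_streak = [flips[i] for i in range(5, n) if all(flips[i - 5:i])]
frac_heads = sum(after_streak) / len(after_streak)
print(round(frac_heads, 2))   # about 0.5
```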

Math Symbol for C

Date: 12/15/97 at 22:50:24


From: Andrew Valovcin
Subject: Re: Math symbol for C

This is my first time to your site and I find it very interesting and
enjoyable. I am puzzled by one symbol of typing math. What does the
upper case letter C mean? Like in (2 C 1) (3 C 1) / (47 C 2) = 6/1081.
I can't figure it out and before I get any more frustrated I thought
I'd better write to you.

Thanks for your help.

Date: 12/17/97 at 08:12:11


From: Doctor Anthony
Subject: Re: Math symbol for C

The symbol nCr stands for the number of combinations of r things that
can be formed from n different things.

If you look in any textbook on probability and statistics you will
find that permutations and combinations feature a good deal in
the early chapters. Modern textbooks use a different notation for
nCr: it is written on two lines, the n above the r, with
brackets around them but no line between. This notation is not
suitable for ASCII presentation, so we have to revert to the older nCr
notation. In ASCII we also use C(n,r) to mean the same thing, and
this has advantages if we are working with numbers, as the numbers
themselves stand out more clearly.

10 C 4 = C(10,4) = 10!/(4! 6!) = 210

and if we are dealing with permutations, or arrangements, of 4 things
that could be formed from 10 different things, then we write:

10 P 4 = P(10,4) = 10!/6! = 5040

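Python's standard library has these counts built in, which makes a handy check of the two examples (a sketch; math.comb and math.perm need Python 3.8 or later):

```python
from math import comb, perm, factorial

c = comb(10, 4)   # C(10,4) = 10!/(4! 6!) = 210
p = perm(10, 4)   # P(10,4) = 10!/6!      = 5040
print(c, p)

# The factorial definitions agree with the built-ins
assert c == factorial(10) // (factorial(4) * factorial(6))
assert p == factorial(10) // factorial(6)
```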
Odds Slang for Probability?


Date: 02/06/97 at 15:44:00
From: Anonymous
Subject: Odds and probability

What is the difference between odds and probability? Or is odds a
slang term for probability?

Date: 02/06/97 at 21:41:26


From: Doctor Wallace
Subject: Re: Odds and probability

No, odds is not slang for probability. There is a difference.

A probability is a number from 0 to 1 inclusive, usually expressed as
a fraction, which is the ratio of the number of chances of a specific
event to the total number of chances possible.

For example, if I have 4 marbles in a jar, 3 red and 1 blue, then the
probability of drawing the blue is 1/4. There is one chance of a blue
marble and 4 total chances (marbles).

Odds are expressed as the number of chances for (or against) versus
the number of chances against (or for). So, since there is 1 chance
of your picking the blue, and 3 chances of your picking red, the odds
are 3 to 1 AGAINST you picking the blue. For odds in favor, we just
reverse them. The odds are 1 to 3 IN FAVOR OF you picking the blue.

This can be a little confusing, so I'll say it again. If you express
odds as AGAINST, you put the number of chances against first, versus
the number of chances for. If you express odds as IN FAVOR OF, you
put the chances for the event happening first.

Note that this does NOT mean that the probability is 1/3 for or
against in the above example.

To convert odds to probability, we have to ADD the chances. So, if
the odds against a horse winning are 4 to 1, this means that, out of
5 (4 + 1) chances, the horse has 1 chance of winning. So the
PROBABILITY of the horse winning is 1/5 or 20 percent.
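The against-to-for bookkeeping can be captured in two tiny helper functions (hypothetical names; my own sketch, not Doctor Wallace's):

```python
from fractions import Fraction

def prob_from_odds_against(against, in_favor):
    # odds of "against to in_favor" AGAINST: add the chances first
    return Fraction(in_favor, against + in_favor)

def odds_against_from_prob(p):
    # returns (chances against, chances for)
    return (p.denominator - p.numerator, p.numerator)

print(prob_from_odds_against(4, 1))            # the horse: 1/5
print(odds_against_from_prob(Fraction(1, 4)))  # the blue marble: (3, 1)
```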

I hope this helps. If you need more assistance, don't hesitate to
write back!

Probability of Two Dice Summing to 5

Date: 09/22/2001 at 20:41:31


From: Yuxiao
Subject: Probabilities

I don't really understand how to do probabilities. For example:


If a person rolls two dice, what is the probability of getting a five
as the sum of the two dice?

Can you explain it step by step?

Date: 09/23/2001 at 07:33:40


From: Doctor Mitteldorf
Subject: Re: Probabilities

Dear Yuxiao,

It takes a lot of getting used to. The only way to get a feeling that
you really understand probabilities is to do lots and lots of
examples.

One important principle is multiplication. You're familiar with
multiplying numbers like 7*3 = 21, where the numbers get bigger as you
multiply them. But if the numbers are fractions less than 1, then
multiplying them together makes the result smaller.

All probabilities are less than or equal to 1, so multiplying them
together makes a smaller number. If the probability of one thing
happening is x and the probability of another thing happening is y,
you can multiply x times y to get a smaller number that is the
probability of both things happening.

Let's apply this to the two dice. You know that the probability of
getting a 1 on the first die is 1/6. The probability of getting a 4 on
the second die is also 1/6. So multiply these two together and you
find that the probability of getting BOTH a 1 on the first die AND a
4 on the second die is 1/36.

That's one of the ways you can get a 5 with two dice. So 1/36 is part
of the probability of rolling a 5, but not all of it. Can you list the
other ways?

First die   Second die
    1           4
    2           3
    3           2
    4           1

We've listed four ways to get a five, and that's all there are. Each
of these combinations has a probability of 1/36 of happening; so the
total probability of rolling a 5 is 4/36, which is 1/9.

A good next step for you would be to make a chart of all the possible sums,
2 through 12, and calculate the probabilities for each in the way I
just did for 5. You can check your chart when you're finished by
adding up the probabilities for all 11 sums: The probabilities
should add up to 1. That's because one of these sums HAS TO come
up, so the probability of getting some sum from 2 through 12 is 1.

- Doctor Mitteldorf, The Math Forum


http://mathforum.org/dr.math/

Date: 09/23/2001 at 07:47:13


From: Doctor Anthony
Subject: Re: probabilities

In your example, the probability is the ratio

Number of ways we can get a total of 5
--------------------------------------
  Total number of possible outcomes

From here you simply count the ways we can get 5.

1 + 4   probability of this is (1/6)(1/6) = 1/36
2 + 3   probability of this is (1/6)(1/6) = 1/36
3 + 2   probability of this is (1/6)(1/6) = 1/36
4 + 1   probability of this is (1/6)(1/6) = 1/36
                                 --------------
                Total probability = 4/36 = 1/9

Alternatively, there are 4 ways we can get a total of 5 and there are
36 possible outcomes when you roll two dice.

Required probability = 4/36 = 1/9

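Both doctors' counts can be confirmed by listing all 36 equally likely rolls; this sketch (mine) also builds the full chart Doctor Mitteldorf suggests:

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))   # all 36 outcomes

p5 = Fraction(sum(1 for a, b in rolls if a + b == 5), len(rolls))
print(p5)   # 4/36 = 1/9

# Chart of every possible sum; the probabilities must add to 1
chart = {s: Fraction(sum(1 for a, b in rolls if a + b == s), len(rolls))
         for s in range(2, 13)}
print(sum(chart.values()))   # the sums 2..12 exhaust all outcomes
```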
Probability: Permutations and Combinations


Date: 12/10/2002 at 20:27:26
From: Apurva Yeluru
Subject: Probability: Permutations and Combinations

Dr. Math,

I am in eighth grade. We are not doing probability but I am interested
in it. Last year in our school, we had a probability fair and I could
not understand it very well. Now I can! I know what probability is and
what permutations are and what combinations are. But this is the part
where I am stuck: How is probability related to permutations and
combinations?

I would be delighted if you replied.

Thanks!

Date: 12/11/2002 at 11:50:25


From: Doctor Ian

Hi Apurva,
The relationship is that when you want to compute the probability that
something will happen, you do this:

    how many ways can the thing happen?
p = -----------------------------------
    how many ways can anything happen?

Combinations and permutations are useful for computing the numerator
and denominator of this fraction. For example, if you want to know
the probability of drawing two pairs in poker, that's

    the number of combinations of 5 cards with 2 pairs
p = --------------------------------------------------
        the number of combinations of 5 cards

In other words, when you're computing probabilities, you often need to
compute the number of ways that things can be selected from a group
(combinations) or ordered within a group (permutations).

Does that make sense?
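For the two-pairs example, the numerator and denominator can each be computed with combinations. The counting below is my own sketch of the standard argument, not part of Doctor Ian's reply:

```python
from math import comb

# Numerator: choose 2 ranks for the pairs, 2 suits within each rank,
# then any one of the 44 cards of some third rank
two_pair_hands = comb(13, 2) * comb(4, 2) ** 2 * 44

# Denominator: all 5-card hands
all_hands = comb(52, 5)

p = two_pair_hands / all_hands
print(two_pair_hands, all_hands)   # 123552 2598960
print(round(p, 4))                 # about 0.0475
```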

Coin Flipping

Date: 01/26/98 at 05:19:38


From: Lee, Choong Loon
Subject: Probability

I wonder how I can figure out the chances of the following case:

Flipping a coin five times with the result a combination of T,T,T,H,H.
(T=Tail, H=Head)

To flip the same coin five times, what will be the chances of getting
the same combination (exact sequence) "right away"? I know that
1/5X1/5X1/5X1/5X1/5 is the formula to get 3T and 2H right away.
Somebody told me there is some method called "condition" that is special
for this kind of problem. It is like 1/2 X 1/3 X 1/4 X 1/5 X 1/6 but I am
not sure.

Please help me solve this case and if possible please explain in
detail. Thank you very much.

Lee Choong Loon

Date: 02/01/98 at 22:10:37


From: Doctor Wolf
Subject: Re: Probability

Let's begin by reviewing The Fundamental Principle of Counting, a
mainstay of probability theory:

If experiment A can result in N distinct outcomes, and experiment B
can result in M distinct outcomes, then experiment A followed by
experiment B can result in exactly N*M different outcomes. And of
course this can be generalized to more than two experiments.

Example: You roll a single die, then flip a coin. How many different
outcomes are possible? Let's see... 6 outcomes for the die, 2 for the
coin, so 6*2 or 12 different outcomes are possible. They may be
thought of as (3,T), (1,H), (6,H), etc.

Also, since the outcome of the die in no way affects the tossing of
the coin, each of the 12 possible outcomes will have probability
(1/6)*(1/2) or 1/12. This is called an "equiprobability space," and is
quite common.

Now, back to your problem. You will be performing the same experiment
5 times in succession; that is, flipping a coin. Each flip of the
coin can result in 2 distinct and equally likely outcomes, H or T.
Moreover, the result of any coin flip is not influenced by or
dependent upon any previous coin flip. That last statement regarding
independence of the coin flips is very important; it tells us that all
possible outcomes after 5 coin flips are equally likely, or have the
same probability.

By the Principle of Counting, there are 2*2*2*2*2, or 32 possible
outcomes to your problem. The one you are interested in is (TTTHH).
The chance of getting T(first flip), then T(second flip), then T(third
flip), H on the fourth flip and H on the fifth flip is:

(1/2)*(1/2)*(1/2)*(1/2)*(1/2) = 1/32.

In fact, every one of the 32 possible outcomes from flipping a coin 5
times has the same probability: 1/32.
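As a quick sanity check on that 1/32 (this snippet is an editorial addition, not part of Doctor Wolf's reply), Python can enumerate all 32 equally likely outcomes and count how many match TTTHH:

```python
from itertools import product

# All 2^5 = 32 equally likely sequences of five coin flips.
outcomes = list(product("HT", repeat=5))

# Exactly one of them is the sequence T,T,T,H,H.
target = ("T", "T", "T", "H", "H")
matches = outcomes.count(target)
print(matches, len(outcomes), matches / len(outcomes))  # 1 32 0.03125
```

The same enumeration confirms the independence argument: every sequence appears exactly once, so each has probability 1/32.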

Factorial

Date: 4/1/96 at 7:14:46
From: Leigh Ausband
Subject: Factorial

From Mrs. Fields' seventh grade math classes at
Robert Smalls Middle School, Beaufort, SC.

Dear Dr. Math,

What is factorial? For example: 4 factorial. What is it used for?

Eric Peevey
Date: 4/2/96 at 4:2:32
From: Doctor Jodi
Subject: Re: Factorial

Hi Eric!

4 factorial is written 4! and means 4*3*2*1

Say you were having a lottery and were picking 4 numbers out of 60 with
no repeats of numbers.

You'd find out the probability of picking a certain 4-number set (in a
certain order) by saying

60 choices for the first number,
59 for the second,
58 for the third,
57 for the fourth.

So there are 60*59*58*57 ordered sets of 4 numbers.

This can also be written as

60!
---
(60-4)!

which will give you 60! divided by 56!

Since the factors from 56 on down divide out, you're left with
60*59*58*57 as the number of ordered sets of 4 lottery numbers that
could be drawn.
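The factorial formula and the direct product really do agree; a few lines of Python (an editorial check, not part of Doctor Jodi's reply) verify it, using the standard-library `math.perm` as a shortcut:

```python
import math

# 60 choices, then 59, then 58, then 57: count ordered 4-number sets.
direct = 60 * 59 * 58 * 57
via_factorials = math.factorial(60) // math.factorial(60 - 4)

# math.perm(n, k) computes n! / (n - k)! directly.
assert direct == via_factorials == math.perm(60, 4)
print(direct)  # 11703240
```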

Getting Two Heads in Four Tosses of a Coin

Date: 05/17/2000 at 22:01:23
From: Melissa
Subject: Probability of two heads on four tosses

Dear Dr. Math,

The question I need to ask is: What is the probability of getting two
heads on four flips of an unbiased coin? I have looked at your other
answers, and think it would be 1/8 because:

1/2 + 1/2 + 1/2 + 1/2 = 1/8

Thanks so much for your help.

Melissa Dismukes
Date: 05/18/2000 at 13:42:43
From: Doctor TWE
Subject: Re: Probability of two heads on four tosses

Hi Melissa - thanks for writing to Dr. Math.

I think what you meant was (1/2)*(1/2)*(1/2)*(1/2) = 1/16, but that
would be the answer for getting 4 heads in 4 flips (or 0 heads in 4
flips).

Let's draw a probability tree. Each toss branches into T or H, so
after four tosses the tree has 2*2*2*2 = 16 equally likely branches.
Listing every branch (outcomes with exactly 2 heads are marked
with *):

    TTTT      THTT      HTTT      HHTT *
    TTTH      THTH *    HTTH *    HHTH
    TTHT      THHT *    HTHT *    HHHT
    TTHH *    THHH      HTHH      HHHH

Since it is an unbiased coin, each branch has a probability of 1/2,
each outcome is equiprobable (P = 1/16), and the probability of
tossing exactly 2 heads (outcomes marked with *) is 6/16 = 3/8.

For a more general solution to this type of problem, search Dr. Math
for "binomial probability" (without the quotation marks) using our
archive search engine at:

http://mathforum.org/mathgrepform.html
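The 6-out-of-16 count can also be verified programmatically (an editorial sketch, not part of Doctor TWE's reply):

```python
from itertools import product
from fractions import Fraction

# The 16 equally likely leaves of the four-toss probability tree.
outcomes = list(product("HT", repeat=4))
two_heads = [o for o in outcomes if o.count("H") == 2]

print(len(two_heads), Fraction(len(two_heads), len(outcomes)))  # 6 3/8
```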

History of Probability

Date: 04/07/97 at 21:44:45
From: Patty Liao
Subject: Probability's history

Who first researched probability? I haven't really found
anything on it. Any other historical facts about probability?
It's for a report for a project.
Date: 04/08/97 at 09:34:26
From: Doctor Statman
Subject: Re: Probability's history

Dear Patty,

In about 1654 Blaise Pascal started to investigate the
chances of getting different values for rolls of dice,
and his discussions with Pierre de Fermat are usually
considered the beginning of the study of probability.

I think if you look up these two famous mathematicians you
will be on the right track.

Independent vs. Dependent Events


Date: 06/09/2003 at 16:42:08
From: Sonny
Subject: What is the difference between "independent" and "dependent"
events?

This is from the probability of a compound event. Could you give me
an example?

Date: 06/09/2003 at 16:58:39
From: Doctor Ian
Subject: Re: What is the difference between "independent" and
"dependent" events?

Hi Sonny,

Two events are independent if the outcome of one has no effect on the
outcome of the other. The classic example would be rolling a pair of
dice. What happens with one die has no effect on what happens with the
other die.

Two events are dependent if the outcome of one has an effect on the
outcome of the other. The classic example would be drawing cards from
a deck without replacement. The probability of drawing an ace changes
depending on what other cards have already been drawn.

Probabilities for independent events often involve exponents, while
probabilities for dependent events often involve factorials.

How many ways are there to roll three dice? There are 6 ways to roll
the first, 6 ways to roll the second, and 6 ways to roll the third, so
the number of possible outcomes is

6*6*6 = 6^3

How many ways are there to draw three cards from a deck without
replacement? There are 52 ways to draw the first one; but now there
are only 51 ways to draw the second (because one card has been
removed); and only 50 ways to draw the third. So the number of
possible outcomes is

52 * 51 * 50 = 52! / (52 - 3)!
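Both counts are easy to verify (an editorial sketch, not part of Doctor Ian's reply); the independent case multiplies equal counts, while the dependent case shrinks the count with each draw:

```python
import math

# Independent: each die ignores the others, so counts multiply unchanged.
dice_outcomes = 6 ** 3

# Dependent: drawing without replacement removes one option per draw.
card_draws = 52 * 51 * 50
assert card_draws == math.factorial(52) // math.factorial(52 - 3)

print(dice_outcomes, card_draws)  # 216 132600
```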

Does this help?

- Doctor Ian, The Math Forum
http://mathforum.org/dr.math/

Infinite Probability: A Point on a Line


Date: 11/03/2004 at 19:46:12
From: Aman
Subject: probability and infinity

Imagine a line extending infinitely in both directions. A line
segment of length 10m has endpoints at point A and point B, both of
which are on the line. What is the probability that a randomly chosen
point on the line is on line segment AB?

I simply thought of this problem and was wondering how to solve it. I
became curious and decided to ask you. I was thinking something along
the lines of 1/infinity but I don't really know what to do.

Date: 11/04/2004 at 09:28:56
From: Doctor Vogler
Subject: Re: probability and infinity

Hi Aman,

Thanks for writing to Dr. Math. While it might not seem like it, your
question is really not so much a question of computing the probability
as it is one of understanding what probability is all about. So let me
ask you a different, but related, question:

Suppose you pick a positive integer at random. What is the
probability that the number is smaller than one million?

To give a reasonable answer to this question or your own, you need to
think hard about what you really mean by "choose at random." Before
going into more detail, I did a search on our archives for

probability infinity

and found some interesting reading. Doctor Wallace gave a very nice
and detailed description of a similar problem on:

Probability in the Infinite Plane
http://mathforum.org/library/drmath/view/62553.html

And the question about random positive integers is addressed at

Probability that a Random Integer...
http://mathforum.org/library/drmath/view/56540.html

where Doctor Tom gives a reasonable meaning for "choose a random
integer." If you use his idea, then the answer to my question to you
is zero. The answer to your question to me can be described in the
same way, and you would, again, get zero.

But now let's back up again and think about these random numbers.
Doctor Tom said to pick a random number up to M, and then calculate
the probability. Then you take a limit, which means to assume that M
is really, really big. And the bigger it gets, the closer your
probability is getting to.... Well, is it getting closer to some
number? It doesn't always, but it did in his problem, and it does in
ours. Suppose that we pick a random number from 1 to M. Then if M is
very large, the probability that our number will be less than a million is

   1000000
   -------
      M

If M is many, many millions, then this will be a small number. The
bigger M gets, the smaller this number gets, and the probability goes
to zero.

But what if we thought about this in a different way? Suppose we pick
a random positive integer. How many digits would it have, on average?
If you use the same logic as above, it would have, on average, about
as many digits as half of M. And this gets bigger and bigger as M
does, so a random integer has infinitely many digits.

Huh? That doesn't make much sense! And now we get to probability
theory. Mathematicians describe this in terms of measures, which you
would not be familiar with, so I'll describe the concepts and try to
be more understandable than precise.

A probability measure is a way of answering the question "What is the
probability that we choose THIS group of choices?" If there are only
finitely many total choices to choose from (like 300 choices), then
you can answer the question like this: The probability is the number
of choices in the group divided by 300. This is known as "uniform
probability." You can do something similar about choosing points on a
line. If the line has finite length, then the probability that your
choice lies on some portion of the line is the length of the portion
divided by the length of the whole line. This is also "uniform
probability."

This isn't the only way to answer the question, though. Suppose you
have a weighted coin that lands on heads two-thirds of the time, and
lands on tails only one-third of the time. That is also a probability
measure, but it is NOT uniform probability.

But now let's suppose that you have infinitely many choices. You
can't divide by infinity. So that means that it doesn't make sense to
use uniform probability. So we must have some kind of "weighted"
probability. There are many ways to do this. Here is one way:

Let's suppose we choose a random positive integer, and we will choose
the integer n with probability (1/2)^n. That is, 1/2 of the time we
choose the number 1. And 1/4 of the time we choose the number 2. And
1/8 of the time we choose the number 3. And so on. You'll notice
that the probabilities of all of the numbers add up to 1. (If
you are not familiar with infinite series, notice that each time you
add another probability it gets closer to 1. For more than this,
search our archives for "infinite sums" and "geometric series.")

Now we ask the question: What is the probability that I choose a
random number less than one million? And the answer is: very good.
In fact, the probability is more than 99.999999%. That's a lot better
than it was using the other way, and that is because this new
probability measure is heavily weighted toward small numbers. The
other one was uniform probability up to M, which means that high
numbers (near M) have the same probability as low numbers, but there
are more high numbers than low numbers.

And this is not the only way to decide the probabilities. There are
infinitely many ways. For example, you can choose the integer n with
probability 2/(3^n), or you can choose the integer n with
probability 6/(n*pi)^2. The only requirement is that all of the
probabilities together add up to 1.
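Exact fractions make it easy to watch these weighted measures approach 1 and to see how heavily the (1/2)^n measure favors small numbers (an editorial check, not part of Doctor Vogler's reply):

```python
from fractions import Fraction

# The (1/2)^n measure from the text: P(n) = 1/2^n for n = 1, 2, 3, ...
def weight(n):
    return Fraction(1, 2 ** n)

# Just the first 20 integers already carry almost all the probability,
# so P(n < 1,000,000) is overwhelmingly close to 1.
p_first_20 = sum(weight(n) for n in range(1, 21))
print(float(p_first_20))  # 0.99999904...

# The alternative measure P(n) = 2/3^n also sums toward 1.
alt_first_20 = sum(Fraction(2, 3 ** n) for n in range(1, 21))
print(float(alt_first_20))
```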

So that is why I say: What do you mean by "random point"? If all
points have equal probability, then you get 1/infinity, like you said,
which is zero. But when you are talking about infinite lines, then
all points having equal probability doesn't seem so reasonable any
more, but that leaves you to answer the question: What probability do
you want to use instead?

There is a lot to learn here, and perhaps you shouldn't try to learn
it all now. After all, most mathematicians don't learn about these
kinds of things until college. But you'll learn about these kinds of
things as you learn more math. In the meantime, if you have any
questions about this or need more help, please write back, and I will
try to explain more.

Math in Card Games

Date: 10/06/2000 at 10:11:25
From: Jack Joshi-Powell
Subject: Card games

How exactly is math involved in the playing, creating, and setting up
of card games?

I've checked many books and Web sites, but they talk about strategy,
not math.
Date: 10/06/2000 at 12:49:13
From: Doctor TWE
Subject: Re: Card games

Hi Jack - thanks for writing to Dr. Math.

Exactly what kind of card games are you talking about? I can think of
at least three categories: games using standard playing cards (4 suits
of 13 cards each) or similar decks (like Bridge, Cribbage, Poker,
etc.); games that use card decks specific to that game (like Old Maid,
Quartet, Set, etc.); and the new "customizable" or "collectible" card
games (like Pokemon, Magic: the Gathering, etc.) Whichever type of
card game you are referring to, they use math in many of the same
ways.

First, the designs of the cards are themselves an exercise in geometry
(along with a little bit of psychology and physiology). Designers must
consider the size of the cards. Too large or too small, and the cards
are impractical to hold in one hand.

The shape of the card is also a geometric consideration. Most cards
are rectangular with rounded corners, but what is a good
height-to-width ratio? Some games use other-shaped cards, like square or round
cards, instead. Are these better or worse in terms of handling during
a game?

Then there's the symmetry of the cards. The faces of the cards in some
games (particularly the "customizable" card games) have no symmetry,
while others (particularly standard playing cards) have two-way or
near two-way symmetry, and yet others have four-fold symmetry. My wife
doesn't like playing with standard playing cards because she's
left-handed and the corner symbols on most playing card decks are
designed for right-handers. She spreads her card hand "backward" and
thus sees the blank corner instead of the card symbol. The card backs
are frequently symmetric geometric patterns as well.

Math is also involved in the design of the game in terms of winning
and losing. In many games, the designer wants the probability of each
player winning (assuming equally good strategies) to be equal, or as
nearly equal as possible. But determining whether the first player has
an advantage or disadvantage is an exercise in probability. It would
not be a very interesting game if the first player could always win.
For casino games like Blackjack, the house has an advantage. But if
the advantage is too large, players won't play the game; they'd lose
their money too quickly. Of course, if the house doesn't have the
advantage, the casino loses money and will go out of business. So
determining the probability of winning is an important step in the
design of a game.

Determining the best playing strategy also involves math. Knowing how
to determine the probability of the occurrence of random events can
help a player determine the best strategy for winning. The play of
many card games also requires basic arithmetic skills. In many games,
you have to add or subtract points. (For example, in Blackjack you
need to add the values of your cards and subtract the total from 21.) Some
card games, like Twenty-Four, require the players to do mathematical
computations as part of winning the hand. (In the game Twenty-Four,
each card has 4 numbers on it. The first player to be able to make an
expression that equals 24 using the 4 numbers and basic arithmetic
operations wins the card.) Some "customizable" card games also have a
"casting cost" or equivalent requirement before a card can be put into
play. The player must determine what combination of cards (s)he can
afford to play on each turn.

Most card games also require some form of scorekeeping from round to
round or hand to hand. This often just involves simple arithmetic, but
that is math as well.

I hope this gives you some ideas as to where to start. Perhaps you can
then explore these areas in more depth for the particular game or type
of card game you're interested in.

Probability and the 'Ways Method'


Date: 10/25/2002 at 12:16:52
From: Rae
Subject: Investigation of games of chance

Suppose we roll one six-sided die. What are the possible outcomes?
What is the probability of rolling a 4?

If we have two dice how many outcomes are there? With two dice what is
the probability of rolling a 5?

I would appreciate it if you could lead me in the direction of getting
the correct answer.

Thanks.

Date: 10/25/2002 at 13:35:56
From: Doctor Achilles
Subject: Re: Investigation of games of chance

Hi Rae,

Thanks for writing to Dr. Math.

I like to think about these problems in terms of what I call the "ways
method" [Note: this is my own name for a common statistical device
that other people call other things.]

Here's how the "ways method" works:

Step 1: List all the possible outcomes.

For one die, that is easy to do; there are 6:

1
2
3
4
5
6

Step 2: List the number of equally likely ways that you can get each
outcome:

For 1, there is one way to get it (roll a 1)


For 2, there is one way to get it (roll a 2)
For 3, there is one way to get it (roll a 3)
For 4, there is one way to get it (roll a 4)
For 5, there is one way to get it (roll a 5)
For 6, there is one way to get it (roll a 6)

Step 3: add up the total number of ways for all outcomes:

1+1+1+1+1+1 = 6

Step 4: the probability of a given outcome is equal to the number of
ways you can get it divided by the total number of ways for all
outcomes.

So, for example, there is one way to get a 5 and there are 6 ways
total, so the probability of getting a 5 is equal to 1/6.

Let's apply the "ways method" to two dice.

Step 1: list the possible outcomes:

2
3
4
5
6
7
8
9
10
11
12

Step 2: list the number of ways to get each outcome:

For 2: There is only one way: you have to roll a 1 on the first die
and a 1 on the second die.

For 3: There are two ways: you can roll a 1 on the first die and a 2
on the second or a 2 on the first die and a 1 on the second.

For 4: There are 3 ways: you can roll a 1 on the first die and a 3
on the second, or a 2 on the first die and a 2 on the second, or a
3 on the first die and a 1 on the second.

For 5: There are 4 ways: you can roll a 1 on the first die and a 4
on the second, or a 2 on the first and a 3 on the second, or a 3 on
the first and a 2 on the second, or a 4 on the first and a 1 on the
second.

For 6: There are 5 ways: you can roll a 1 on the first and a 5 on
the second, or a 2 on the first and a 4 on the second, or a 3 on the
first and a 3 on the second, or a 4 on the first and a 2 on the
second, or a 5 on the first and a 1 on the second.

For 7: There are 6 ways: you can roll a 1 on the first and a 6 on
the second, or a 2 on the first and a 5 on the second, or a 3 on the
first and a 4 on the second, or a 4 on the first and a 3 on the
second, or a 5 on the first and a 2 on the second, or a 6 on the first
and a 1 on the second.

For 8: There are 5 ways: you can roll a 2 on the first and a 6 on
the second, or a 3 on the first and a 5 on the second, or a 4 on the
first and a 4 on the second, or a 5 on the first and a 3 on the
second, or a 6 on the first and a 2 on the second.

For 9: There are 4 ways: you can roll a 3 on the first and a 6 on
the second, or a 4 on the first and a 5 on the second, or a 5 on the
first and a 4 on the second, or a 6 on the first and a 3 on the
second.

For 10: There are 3 ways: you can roll a 4 on the first and a 6 on
the second, or a 5 on the first and a 5 on the second, or a 6 on the
first and a 4 on the second.

For 11: There are 2 ways: you can roll a 5 on the first and a 6 on
the second or a 6 on the first and a 5 on the second.

For 12: There is only 1 way: you have to roll a 6 on both.

To summarize:

2 has 1 way
3 has 2 ways
4 has 3 ways
5 has 4 ways
6 has 5 ways
7 has 6 ways
8 has 5 ways
9 has 4 ways
10 has 3 ways
11 has 2 ways
12 has 1 way

Step 3: add up the total number of ways:

1+2+3+4+5+6+5+4+3+2+1 = 36

Step 4: to calculate the probability of each outcome, take the number
of ways for that outcome divided by the total number of ways (36).

2 has a probability of 1/36
3 has a probability of 2/36 = 1/18
4 has a probability of 3/36 = 1/12
5 has a probability of 4/36 = 1/9
6 has a probability of 5/36
7 has a probability of 6/36 = 1/6
8 has a probability of 5/36
9 has a probability of 1/9
10 has a probability of 1/12
11 has a probability of 1/18
12 has a probability of 1/36
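The whole "ways method" table for two dice can be generated in a few lines (an editorial sketch, not part of Doctor Achilles' reply), by tallying the sum over all 36 ordered rolls:

```python
from itertools import product
from collections import Counter

# Tally the number of ways to reach each sum over all ordered rolls.
ways = Counter(a + b for a, b in product(range(1, 7), repeat=2))
total = sum(ways.values())  # 36 equally likely ordered rolls

for s in range(2, 13):
    print(f"{s:2d} has {ways[s]} way(s): probability {ways[s]}/{total}")
```

Changing `repeat=2` to `repeat=3` answers the three-dice question the same way.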

How can we use this method to find the probability with 3 dice?

One thing to note: this method works for all probability calculations,
but it is not necessarily always the best. It assumes that each "way"
is EQUALLY likely (which is true for dice, as long as they aren't
weighted). You have to do some complicated adjustments if the "ways"
aren't equally likely.

Also, the calculations become more difficult if the probabilities
change (for example, if events are not independent). For example, with
a deck of cards, the chances of getting a King of Spades in one card
is 1/52, but if you draw a card and it's not the King of Spades, then
your chances of getting the King of Spades in the second trial become
1/51. Again, this method can deal with that, but there may be others
that are more efficient.

I hope this helps. If you have other questions about this or you're
still stuck, please write back.

Probability of Even vs. Odd Sums

Date: 10/11/2000 at 17:32:15
From: Tara Hage
Subject: Probability of even vs. odd sums

Hi Dr. Math,

My second-grade son asked a question of his math teacher and was not
satisfied with her answer. She taught the class that the sum of two
even numbers will be even, the sum of two odd numbers will be even,
and the sum of one odd and one even number will be odd.

He asked her, since there are more ways to achieve an even sum, is it
more likely that an addition problem will have an even answer? She
said, "No, it depends on the addends." He understands that it depends
on the specific problems he is given, but still feels that overall,
addition problems are more likely to have even answers.

Can you provide a reasonable explanation for him as to why this is
true or not true?

Thank you in advance,
Tara Hage

Date: 10/12/2000 at 12:15:44
From: Doctor Anthony
Subject: Re: Probability of even vs. odd sums

You have to be a little careful here because you are dealing with
infinite sets. However, if we are limited to the set of numbers 1 to
100, there are 50 even numbers and 50 odd numbers.

We can choose 2 even numbers in 50 x 50 = 2500 ways
We can choose 2 odd numbers in 50 x 50 = 2500 ways
We can choose 1 odd and 1 even in 50 x 50 = 2500 ways
We can choose 1 even and 1 odd in 50 x 50 = 2500 ways

The sum of the two numbers is even on 2 x 2500 = 5000 occasions. The
sum of the two numbers is odd on 2 x 2500 = 5000 occasions. So there
is no bias in favor of an even sum.

Probability of Matching Times on a Clock


Date: 09/14/2004 at 12:17:07
From: Michael
Subject: What is the probability

What is the probability of two different times within the same hour
ending in the same last digit? Like 08:13 and 08:43?

I don't know the process of calculating the probability.

Date: 09/14/2004 at 12:59:58
From: Doctor Edwin
Subject: Re: What is the probability

Hi, Michael.

In general, the chance of an event occurring is

 # of ways the event could happen
--------------------------------------
total # of ways things could turn out

So for example my chance of rolling a one or a two on a six-sided die
is:

  # of ways I could roll 1 or 2
---------------------------------------
total # of ways the die could come up

or:

  2
 ---
  6

which is just 1/3.

Your question has an interesting twist. The most obvious answer is
1/10 or 10%. Let's ignore everything but the last digit. If I pick a
number between zero and nine, and you do the same, the chance that you
picked the same number I did is 1/10. So if I pick a time at random,
and you pick a time at random, the chance that the last digits will
match is still 1/10.

But your problem adds two additional pieces of information.
First, it says that the times must be different. So we can't both
pick 8:13, for example. The other piece is that the two times must
fall within the same hour.

In order to figure out the probability of your time event, we have to
figure out how many ways we could have the same number at the end,
while picking DIFFERENT times in the same hour.

Suppose like in your example I pick 8:13. Now, how many ways can you
pick a time that ends with the same digit as mine?

8:03
8:13
8:23
8:33
8:43
8:53

but one of those is the same time I picked, and you're not allowed to
pick that one, so you're down to 5 possible ways to pick the time.

So your probability is

5
----------------------------------------
total # of times you could have picked

So how many times could you have picked? There are 60 minutes in an
hour, and you're not allowed to pick one of them. Can you figure out
the probability from there? Write back if you're stuck.
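For readers who want to check the final step afterward (an editorial addition, so look away if you'd rather finish the exercise yourself), an exhaustive enumeration over minute pairs confirms the answer:

```python
from itertools import permutations

# Ordered pairs of two DIFFERENT minutes within one hour; the hour
# digits are fixed, so only the minute (0-59) decides the last digit.
pairs = list(permutations(range(60), 2))  # 60 * 59 = 3540 pairs
same_digit = [p for p in pairs if p[0] % 10 == p[1] % 10]

print(len(same_digit), len(pairs))  # 300 3540, which reduces to 5/59
```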

- Doctor Edwin, The Math Forum
http://mathforum.org/dr.math/

Probability of Picking Coins


Date: 01/12/2003 at 12:31:51
From: Kita
Subject: Probability

Max has 5 coins in his pocket that total 47 cents. What is the
probability that he will reach into his pocket and pull out a dime,
and then without replacing it reach in and pull out a quarter?

A. 1/20
B. 1/10
C. 1/25
D. 2/25

Could it be 1/10?

Date: 01/12/2003 at 22:02:38
From: Doctor Kastner
Subject: Re: Probability

Hi Kita -

Not only could it be 1/10, it is 1/10. But why?

Let's first think about the coins that Max has in his pocket. 5 coins,
47 cents. He has to have 2 pennies in order to make the 47, so that
takes care of two of the coins. Now we need to think how he could have
3 coins that make 45 cents. How could he get the 5 cents of the 45?
If he had a nickel, that would mean we need to get two coins to make
40 cents and that can't happen. Therefore, he has to have a quarter,
which leaves us with making 20 cents with two coins. That gives him
two dimes.

So in Max's pocket he has 2 pennies, 2 dimes, and 1 quarter.

Now we can finish the problem off. He has two chances of pulling out a
dime from his five coins, and then he has 1 chance of pulling out a
quarter from the coins he has left. Remember that these probabilities
need to be multiplied together and you'll not only get your answer but
will be able to explain how you got there.
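Once you know the pocket holds 2 pennies, 2 dimes, and 1 quarter, a brute-force enumeration (an editorial sketch, not part of Doctor Kastner's reply) confirms the 2/5 * 1/4 = 1/10 multiplication:

```python
from itertools import permutations
from fractions import Fraction

# The only 5-coin combination totaling 47 cents.
coins = ["penny", "penny", "dime", "dime", "quarter"]

# All ordered two-coin draws without replacement (5 * 4 = 20 of them).
draws = list(permutations(coins, 2))
hits = sum(1 for first, second in draws
           if first == "dime" and second == "quarter")

print(Fraction(hits, len(draws)))  # 1/10
```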

I hope this helps. Write back if you're still stuck, or if you have
other questions.

- Doctor Kastner, The Math Forum
http://mathforum.org/dr.math/

Probability of Divisibility
Date: 06/18/2002 at 19:19:18
From: Lisa Vinson
Subject: common fractions

What is the probability that a randomly selected three-digit number
is divisible by 5? Express your answer as a common fraction.
Date: 06/18/2002 at 20:45:19
From: Doctor Ian
Subject: Re: common fractions

Hi Lisa,

How many 3-digit numbers are there? Call that Q. (If you're not
sure how many there are, think about how many miles you could drive
when the odometer in your car reads between 100 and 999.)

How many of them are divisible by 5? Call that P. (If you're not
sure how many there are, start listing them. You should quickly see
a pattern.)

Once you've found P and Q, the probability you're looking for is
P/Q.
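If you want to check your counts afterward (an editorial addition; skip it if you'd rather work the hints first), Python can compute P and Q by brute force:

```python
from fractions import Fraction

# Q: how many three-digit numbers there are (100 through 999).
q = len(range(100, 1000))

# P: how many of them are divisible by 5.
p = len([n for n in range(100, 1000) if n % 5 == 0])

print(p, q, Fraction(p, q))  # 180 900 1/5
```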

Probability and Odds


Date: 01/05/2008 at 17:36:37
From: Alfredo
Subject: Probability vs. Odds Ratio

In layman's language, what is the difference between odds ratio and
probability?

I find this confusing because both are measures of chance. It also
confuses me how to interpret results. For example if you get a high
probability, say 80%, most likely the outcome of the odds ratio is
greater than 1, which is, as I understand it, interpreted as a higher
chance of occurrence. So I am confused if there is a significant
difference between probability and odds ratio.

I've tried researching the internet for the answers to my confusion
between the 2 measures of chances. I do understand that probability
is occurrence/whole while odds ratio is occurrence/non-occurrence.
But I see no difference in their interpretation. It just makes me
wonder if I am understanding the subject matter correctly or not.
Hope you could help me on this.

Date: 01/05/2008 at 23:00:31
From: Doctor Peterson
Subject: Re: Probability vs. Odds Ratio

Hi, Alfredo.

You know the definitions, but I'm not sure what more you mean by
"interpretation". Let's look at a simple example and explore the
differences; then you can tell me whether I've shown that the sort of
interpretation you have in mind is indeed different.

Suppose I roll one die, and consider whether I roll a six. I can
describe this event in three ways:

                          ways to succeed     1
  Probability of six    = ---------------  =  ---
                          total outcomes      6

                          ways to succeed     1
  Odds in favor of six  = ---------------  =  ---  =  1:5
                           ways to fail       5

                           ways to fail       5
  Odds against six      = ---------------  =  ---  =  5:1
                          ways to succeed     1

(Usually odds are expressed as a ratio, 1:5 or 5:1, rather than a
fraction; probability is expressed as a fraction, decimal, or
percentage. I should also add that the "ways" I'm talking about have
to be equally likely, as they are here with a fair die.)

Probability tells you what fraction of the time you can expect an
event to occur; you will roll a 6 about 1/6 of the time. This is
never greater than 1, but the higher it is, the more probable the
event is, with a probability of 1 representing (virtual) certainty,
and 0 representing (virtual) impossibility.

Odds tells you the ratio of time the event occurs to the time it
doesn't (or vice versa); you roll a 6 once for every 5 times you roll
something else (in the long run). Odds can be any (positive) ratio at
all, from 0:1 to 1:0. Something that never happens will have odds of
0:1 in favor, and something that always happens will have odds of 1:0
in favor (0:1 against), though we never express these cases as odds!
Odds of 1:1 are "fifty-fifty", equally likely to occur or not; this
corresponds to 50% probability.

The idea of odds comes from gambling, which is where probability
theory largely arose; the idea is that a bet in that ratio is fair.
If I bet $1 to your $5 that I will roll a 6, I will come out even in
the long run. Probability is easier to work with mathematically but
harder to apply to gambling. That's why we have two different ways to
express the concept.

I imagine your confusion lies in the fact that both probability and
odds in favor are higher when something is more likely, so they sound
at first like the same thing. But the meaning of "high" in each case
is different: a probability of 9/10 is pretty high, but odds of 9:10
are not high at all! In fact, in the latter case, you are less likely
to succeed than to fail. The odds corresponding to a 9/10 probability
would be 9:1. Now THAT'S a likely event!

Penny Toss

Date: 12/17/97 at 16:53:39
From: Dave Kugelstadt
Subject: Penny Toss

Dear Dr. Math:

My son is in the 7th grade. I try to help him grasp the concepts of
advanced math because I believe that a firm understanding of math will
do a great deal to advance his quality of life, as it has mine.
Usually I am very good but I am not quite sure about this one. His
"Problem of the week 6" goes as follows...

"Three people each toss a penny at the same time. What is the
probability that two people get the same side of the penny and the
other person gets the opposite side?"

Since there was no specification as to which individual got what, I
reasoned that there are only two possible combinations: all of them
get the same side, or two of them get one side and one gets the other.
With this I decided that once I determined the odds of all getting the
same side I could subtract that from 100% to get the chances of any
two getting the same side. Assuming all else is equal, there is a 1/2
chance that any one person gets any one particular side. Since there
are three people I calculated .5 x .5 x .5 = .125 or 12.5% chance that
all would get the same side. That leaves 87.5% chance that any two
people would get the same side if all three tossed their pennies at
the same time (or even not at the same time, I suppose).

Before I tell my son this is it, I would like to know if my reasoning
is sound.

Thanks,
Dave K.

Date: 12/17/97 at 19:25:35
From: Doctor Tom
Subject: Re: Penny Toss

Close, but not quite.

If there are only 3 pennies, it's easy just to list the possibilities:

HHH
HHT
HTH
HTT
THH
THT
TTH
TTT

where the first column represents the result for the first person,
etc. So there are 8 equally likely ways the experiment can come out,
6 of which have two faces the same. Thus the probability is 6/8 = 75%.

You were on the right track, but the .5*.5*.5 is the probability that
all three throw heads (and also the probability that all three throw
tails). So there is a 12.5% chance that all three throw heads, a 12.5%
chance that all throw tails, and hence, a 100% - 12.5% - 12.5% = 75%
chance that all three flips aren't the same.

I don't know if your kid knows about combinations (like "6 choose 2"
- the number of ways of picking 2 things from a set of 6), but if
so, that's a good way to work it.

There are 3 flips, so there are 2^3 = 8 (2 cubed = 8) possible
outcomes, and the favorable outcomes are if there is 1 head of the
three or 2 heads. (If there are 0 or 3 heads, it's an unfavorable
outcome.) The number of favorable outcomes is thus

(3 choose 1) + (3 choose 2) = 3 + 3 = 6 favorable out of 8 possible,


or a probability of 6/8, or 75%.

This last method is far more powerful in terms of answering more
complex problems, but you have to know how to count combinations.

For example, if the problem were, "7 people flip 1 penny each. What is
the probability that there are at least 4 heads tossed?"

Well, there are 2^7 = 128 outcomes, and there are:

(7 choose 4) + (7 choose 5) + (7 choose 6) + (7 choose 7)

favorable outcomes.

If you work it out, this is 35+21+7+1 = 64 favorable of 128, giving
a probability of 64/128, or 1/2.

-Doctor Tom, The Math Forum


Check out our web site! http://mathforum.org/dr.math/

Date: 12/17/97 at 19:28:16


From: Doctor Anthony
Subject: Re: Penny Toss

You are partly but not completely right.

The easiest way is to consider the probability of NOT getting two of
one and one of the other. This would be either HHH or TTT.

The chance of HHH is (1/2)^3 and of TTT is also (1/2)^3, so the
probability of one or the other is 1/8 + 1/8 = 1/4.

So the chance of two of one and one of the other is 3/4

-Doctor Anthony, The Math Forum


Check out our web site! http://mathforum.org/dr.math/
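Both doctors' counts can be verified mechanically. The sketch below (mine, not part of the original exchange) enumerates all eight equally likely outcomes, just as Doctor Tom's table does:

```python
from itertools import product

# All 2^3 equally likely outcomes for three simultaneous penny tosses.
outcomes = list(product("HT", repeat=3))

# "Two people match, one differs" means the outcome is neither HHH
# nor TTT, i.e. exactly two distinct faces appear.
favorable = [o for o in outcomes if len(set(o)) == 2]

print(len(favorable), len(outcomes))   # 6 8
print(len(favorable) / len(outcomes))  # 0.75
```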
Most Frequently Rolled Number

Date: 10/27/1999 at 21:54:54


From: Zoe Nesbitt
Subject: Determining odds

If I had two dice each with the following numbers on them: -3, -2,
-1, 0, 1, and 2, and rolled them 100 times, what sum would be rolled
most often and why? Can you help?

Date: 10/28/1999 at 03:29:20


From: Doctor Ian
Subject: Re: Determining odds

Hi Zoe,

Let's look at the case for regular dice, numbered 1-6. Here are the possible
sums:

second
die

+ 1 2 3 4 5 6
------------------
1 | 2 3 4 5 6 7
2 | 3 4 5 6 7 8
first 3 | 4 5 6 7 8 9
die 4 | 5 6 7 8 9 10
5 | 6 7 8 9 10 11
6 | 7 8 9 10 11 12

How often does each sum appear in the table?

Sum Appears
--- -------
2 1 time
3 2 times
4 3 times
5 4 times
6 5 times
7 6 times
8 5 times
9 4 times
10 3 times
11 2 times
12 1 time

Now, suppose you had 36 balls, and you marked them this way:

Mark Number of balls


---- ---------------
2 1
3 2
4 3
5 4
6 5
7 6
8 5
9 4
10 3
11 2
12 1

Put the balls in a hat, and draw one out. Toss it back in, and draw one out
again. If you keep doing this, which mark would you expect to draw most
often?

Do you see why the problem with the dice and the problem with the balls are
really just two versions of the same problem? Can you see how this relates
to _your_ problem?

- Doctor Ian, The Math Forum


http://mathforum.org/dr.math/
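Zoe's dice, numbered -3 through 2, can be tallied the same way as Doctor Ian's 6x6 table. A short sketch (my addition, assuming both dice are fair):

```python
from collections import Counter
from itertools import product

faces = [-3, -2, -1, 0, 1, 2]

# Tally every possible sum of two such dice, as in the table above.
counts = Counter(a + b for a, b in product(faces, repeat=2))

best_sum, times = counts.most_common(1)[0]
print(best_sum, times)   # -1 occurs 6 times out of 36: the analogue of 7
```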

Pie Graphs and Probability

Date: 09/21/2000 at 16:39:21


From: Kristine Pj
Subject: Pie graphs and probability

I don't know how to get the probability from a pie graph.

Date: 09/21/2000 at 17:45:09


From: Doctor Ian
Subject: Re: Pie graphs and probability

Hi Kristine,

A pie graph or pie chart tells you what fraction of the time something
happened, or is expected to happen, and that's what probability tells
you, too. In fact, a pie graph is just a way of drawing probabilities
instead of writing them down using numbers.

For example, suppose a pie graph has three areas, with people who will
vote for Gore (41%), people who will vote for Bush (44%), and people
who will vote for someone else (11%).

This is like having 100 balls in a jar, with 41 marked 'Gore', 44
marked 'Bush', and 11 marked 'other'.

So the probability that you'll pull a 'Bush' ball out of the jar is
44/100.

I hope this helps. Write back if I didn't quite answer your question,
or if you have other questions.

- Doctor Ian, The Math Forum


http://mathforum.org/dr.math/
Popsicle Probability
Date: 02/27/2003 at 19:22:41
From: Ashley
Subject: Probability/Combinations

There are 9 popsicles: 3 orange, 3 cherry, 3 grape. There are 4
children. What is the probability that all 4 children will get the
flavor of their choice?

We know the answer is 26/27 but we do not understand how to get to
that answer and why. We know that the first child has a 9/9 chance,
the second 8/8, the third 7/7, and the fourth 6/9 (2/3). We also know
that there are 126 different combinations of children and popsicles.

Date: 02/28/2003 at 14:32:59


From: Doctor Douglas
Subject: Re: Probability/Combinations

Hi, Ashley,

Thanks for submitting your question to the Math Forum.

Consider the 4 children. Each of them has a flavor preference, and so
there are 3x3x3x3 = 81 possibilities for what the kids want (we assume
that each kid does in fact want one of the three flavors: orange,
cherry, or grape).

Of these 81 possibilities, 3 of them mean that one child will be
unhappy: OOOO, CCCC, and GGGG (i.e., when all four of them want the
same flavor as the others). For all of the other possibilities (e.g.
OOCG, OOOC, OOCC,...) there are going to be enough popsicles of the
required flavors.

Assuming that each of the 81 possibilities is equally likely,

Pr(4 happy kids) = 1 - Pr(at least one unhappy kid)


= 1 - 3/81
= 1 - 1/27
= 26/27

A supply of four of each flavor would have guaranteed four happy kids,
but a supply of three of each flavor works in almost all (96%) of the
cases.

- Doctor Douglas, The Math Forum


http://mathforum.org/dr.math/
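Doctor Douglas's count of the 81 preference profiles is easy to check by brute force; here is a small sketch (mine) that enumerates them:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

flavors = "OCG"   # orange, cherry, grape
supply = 3        # popsicles available per flavor

profiles = list(product(flavors, repeat=4))   # 3^4 = 81 preference profiles

# A profile makes everyone happy iff no flavor is demanded by more
# kids than the supply of 3, i.e. not all four want the same flavor.
happy = sum(1 for p in profiles
            if all(n <= supply for n in Counter(p).values()))

print(happy, len(profiles))            # 78 81
print(Fraction(happy, len(profiles)))  # 26/27
```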
Probability of Getting your Hat Back

Date: Mon, 09 Oct 1995 16:36:05 -0400


From: chris mcmaken
Subject: Hat probability

There are 5 people with 5 hats. The hats are put into a box.
What is the probability that each person will get his or her hat?

Date: 5/30/96 at 14:32:10


From: Doctor Charles
Subject: Re: Hat probability

The probability that the first person will get the right hat is
1/5. Now there are four people and four hats so the probability
that the second person gets their own hat is 1/4. Similarly for
the third and fourth people. Of course, if the first four people
all get the correct hat then the last person must get the right
hat too as it is the only one left.

The total probability of everyone getting his own hat is found by
multiplying all these together.

That is (1/5)*(1/4)*(1/3)*(1/2)*1 = 1/(5*4*3*2*1) = 1/120.

Incidentally, the function that gives n*(n-1)*(n-2)* ... *3*2*1 is
called the factorial function and is written n!. In general n! is
the number of ways of placing n objects in a row. So if we had n
people with n hats then the probability would be 1/(n!).

-Doctor Charles, The Math Forum
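The 1/120 answer can also be confirmed by listing all 5! ways the hats might be handed back (a sketch of mine, not from the original answer):

```python
from fractions import Fraction
from itertools import permutations

n = 5
handouts = list(permutations(range(n)))   # all 5! = 120 ways to return hats

# Everyone gets his or her own hat only under the identity permutation.
exact = sum(1 for p in handouts if all(p[i] == i for i in range(n)))

print(exact, len(handouts))              # 1 120
print(Fraction(exact, len(handouts)))    # 1/120
```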

Probability Problem

Date: 4 Jun 1995 12:15:53 -0400


From: Anonymous
Subject: Probability

6th grade math problem: There's a box with 12 letters, one of which is a D,
and 2 are E's. I know that the probability of getting a D is therefore 1 out
of 12, and an E is 1 out of 6. However, how do I determine the probability
of getting a D and then an E, if the D is replaced after being selected?

Date: 6 Jun 1995 12:28:53 -0400


From: Dr. Ken
Subject: Re: probability

Hello there!
You can find out how many different favorable possibilities there are for
drawing, and then divide that by the total number of possibilities for
drawing. In this case, there are 2 different ways you can get a D and then
an E. You can draw the D and then the first E, or you can draw the D and
then the other E.

Now how many different ways are there you can draw 2 letters from the bag?
Well, since there are 12 possibilities for each draw, there are 12x12 = 144
possibilities. So the probability we get a D and then an E is 2/144, or
1 out of 72. Thanks for the question!

-Ken
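Ken's count of 2 favorable draws out of 144 ordered pairs can be replayed directly (an illustrative sketch; the '*' placeholders are mine and stand for the nine letters that are neither D nor E):

```python
from fractions import Fraction
from itertools import product

# A 12-letter box: one D, two E's, and nine other letters.
box = ["D", "E", "E"] + ["*"] * 9

# Draw, replace, draw again: 12 * 12 = 144 equally likely ordered pairs.
draws = list(product(box, repeat=2))
hits = sum(1 for first, second in draws if first == "D" and second == "E")

print(hits, len(draws))              # 2 144
print(Fraction(hits, len(draws)))    # 1/72
```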

Probability Question from a Math Test

Date: 3/8/96 at 21:9:23


From: Anonymous
Subject: Question

Dear Dr. Math,

I have a question about a math problem that came up on a math test
I took recently.

There are 15 homerooms in the school. There are 20 students in
each homeroom. If the principal selects 5 of the homerooms for a
pizza party, what is the probability of Mr. Smith's homeroom being
selected?

a. 1/3 b. 1/5 c. 15/20 d. 1/20

Can you help me? I feel the answer is a. 1/3, but why is it not
1/15 * 1/14 * 1/13 * 1/12 * 1/11? And is this a way to find the
probability of anything, and if so what would the question be?

Date: 3/21/96 at 21:4:52


From: Doctor Aaron
Subject: Re: Question

Your feeling is right: if we pick 5 out of 15 rooms the
probability is 1/3. You could be thinking about several things by
the multiplication 1/15 * 1/14 * 1/13 * 1/12 * 1/11. One
probability question that this would answer is:

What is the probability of the principal picking 5 specific rooms
in a specific order? The first time (s)he must pick one room out
of 15, then out of 14, and so on, so the probability of the
principal picking all 5 rooms in order is the product of the
probability of picking each room individually, i.e.,
1/15 * 1/14 * 1/13 * 1/12 * 1/11.

A more common question would ask: what is the chance of the
principal picking 5 specific rooms in no particular order? In this
case, the probability is that of picking one of the five out of a
total of 15, times picking one of the remaining four out of the
remaining total of 14, etc. This is (5 * 4 * 3 * 2 * 1)/(15 * 14 *
13 * 12 * 11), an expression of the form 1/(n!/(k!(n-k)!)) (where a!
= a * (a-1) * (a-2) * ... * 3 * 2 * 1), which comes up a lot in
probability and other areas of math.

One name that mathematicians use to refer to the number
n!/(k!(n-k)!) is n choose k, meaning that this represents the number
of different combinations of n elements taken k at a time. Here's an
example: If we let n = 5 and k = 3, we can think of how many
different ways we can take 3 elements from the set {1,2,3,4,5}.
We have (1,2,3), (1,2,4), (1,2,5) ... if we write all of them out,
we'll have

5!/(3! * 2!) = 10 different combinations.

So, had the principal chosen 5 rooms out of 15, (s)he would have
had 15 choose 5 choices, from which (s)he chose 1. Then the
probability of choosing that 1 set of five rooms would be
1/(15 choose 5) = (5 * 4 * 3 * 2 * 1)/(15 * 14 * 13 * 12 * 11).

I hope that this is helpful.

-Doctor Aaron, The Math Forum
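Doctor Aaron's two answers can be computed with binomial coefficients (a quick sketch of mine using Python's math.comb):

```python
from fractions import Fraction
from math import comb

rooms, chosen = 15, 5

# Of the comb(15, 5) equally likely selections, comb(14, 4) contain
# Mr. Smith's homeroom (choose the other 4 rooms from the remaining 14).
p_smith = Fraction(comb(rooms - 1, chosen - 1), comb(rooms, chosen))
print(p_smith)                             # 1/3

# Probability of one specific *set* of 5 rooms being picked:
print(Fraction(1, comb(rooms, chosen)))    # 1/3003
```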

Probability That a Sum Is Divisible by Three


Date: 11/04/2009 at 07:53:02
From: Sepha
Subject: Probabilities divisible by three

In how many ways can you choose three numbers from 1-100 whose sum is
divisible by three?

I don't get how to even start doing this. I tried to do it manually,
like finding all the numbers divisible by three, like 3, 6, 9 and 12,
etc. and seeing what three numbers can add up to them.

Date: 11/04/2009 at 09:19:38


From: Doctor Ian
Subject: Re: Probabilities divisible by three

Hi Sepha,

I'd probably start thinking about ways that I could just look at three
numbers, and see if the sum will be divisible by 3.

For example, as you've noted, if you choose all three numbers that are
individually divisible by 3, the sum will also be divisible by 3.
What if you choose two numbers that are divisible by 3, and a third
that is not? Try some examples, and convince yourself that this won't
work.

What about two numbers that aren't divisible by 3? Well, this CAN
work, but only if one is 1 less than a multiple of 3, and the other is
1 more than a multiple of 3.

And there is a fourth possibility--that none of the numbers is
divisible by 3. What can happen then? I'll leave that for you to
think about.

Now, suppose we have 100 balls, marked with the numbers from 1 to 100.
We color all the multiples of 3 green: 3, 6, 9, 12, ... 99.

The balls whose numbers are 1 MORE than a multiple of 3 are colored
blue: 1, 4, 7, 10, 13, ..., 100. (Recall that 0 is a multiple of 3,
which is why 1 goes in this group.)

The remaining balls must have numbers that are 1 LESS than a multiple
of 3: 2, 5, 8, 11, 14, ..., 98. We can color them red.

So now, every number has a color, right? If you can figure out which
color combinations will give you a multiple of 3, then your problem
becomes one about selecting colored balls out of a jar... which is
probably more like the problems you've dealt with in the past, right?

- Doctor Ian, The Math Forum


http://mathforum.org/dr.math/
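Doctor Ian's coloring argument can be pushed through to the final count and checked by brute force (my sketch, not part of the original reply):

```python
from itertools import combinations
from math import comb

# Brute force: count 3-element subsets of 1..100 with sum divisible by 3.
brute = sum(1 for trio in combinations(range(1, 101), 3)
            if sum(trio) % 3 == 0)

# The coloring: 33 greens (multiples of 3), 34 blues (1 more than a
# multiple), 33 reds (1 less). Valid: three of one color, or one of each.
green, blue, red = 33, 34, 33
by_color = comb(green, 3) + comb(blue, 3) + comb(red, 3) + green * blue * red

print(brute, by_color)   # both 53922
```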

Date: 11/05/2009 at 06:15:11


From: Sepha
Subject: Thank you (Probabilities divisible by three)

Thank you very much for your help! I was expecting that I would just
get an answer, but now that you've explained it, I know how to
actually do it! Thank you!

Probability Tree Diagrams

Date: 8/22/96 at 1:20:30


From: Philip Carter
Subject: Probabilities

I wonder if you would help to settle an argument by providing answers
to the three problems below:

1. A woman has two children. What are the odds that both are boys?

2. Charlie hits the target 80 times in 100 shots. Jim hits the target
90 times in 100 shots. What are the chances that the target will be
hit if each fires once?

3. In 1946, statistics showed that 2 percent of fruit boats arrived
with their cargoes ruined. If two boats arrived, what was the
probability that both cargoes were ruined?

Philip Carter

Date: 8/22/96 at 2:36:22


From: Doctor Paul
Subject: Re: Probabilities

Let's set up a tree diagram:

            50% /  \ 50%
             boy    girl
      50% /  \ 50%    50% /  \ 50%
        boy   girl      boy   girl

To find the odds, just multiply across the path that leads to two
boys:

.50 * .50 = .25 or 25 percent.

Had you wanted to know the odds of a boy and a girl, you would have
multiplied across the path that leads to a boy and then a girl, and
added that to the path that leads to a girl and then a boy. They both
satisfy the criteria so you add them.

2. Again, let's set up a tree diagram... assume that Charlie goes
first.

            80% /  \ 20%
             hits   misses
      90% /  \ 10%    90% /  \ 10%
        hits  misses    hits  misses

Let's see which branches lead to the target being hit:

if it is hit and then hit again, that counts; if it is hit and then
missed, that counts; if it is missed and then hit, that counts; if
both miss it, that doesn't count... so let's multiply along the first
three branches and then add:

(.8 * .9) + (.8 * .1) + (.2 * .9) = .72 + .08 + .18 = .98,
or 98 percent. Pretty good odds, eh?

3. Tree diagram again... (are you noticing a theme?)

            98% /  \ 2%
             good   bad
      98% /  \ 2%     98% /  \ 2%
        good   bad      good   bad

Both ruined: .02 * .02 = .0004 = .04 percent (pretty low)

I hope this helps you out. These problems are easy if you use tree
diagrams!

Regards,

-Doctor Paul, The Math Forum


Check out our web site! http://ma
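All three tree-diagram answers reduce to short exact computations. A sketch (my addition), using exact fractions to avoid floating-point noise:

```python
from fractions import Fraction

# 1. Two children, each a boy with probability 1/2.
half = Fraction(1, 2)
print(half * half)                     # 1/4

# 2. Target hit unless BOTH shooters miss.
charlie, jim = Fraction(8, 10), Fraction(9, 10)
print(1 - (1 - charlie) * (1 - jim))   # 49/50, i.e. 98 percent

# 3. Both cargoes ruined, each independently with probability 2%.
ruined = Fraction(2, 100)
print(ruined * ruined)                 # 1/2500, i.e. 0.04 percent
```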

Randomly Selecting a Card

Date: 02/20/98 at 14:49:29


From: Sandra Jordan
Subject: probability

A card is drawn at random from a standard deck of 52 playing cards.
Find the probability that the card is a king and a club.

Here's how I have tried:

I know there is 13/52 clubs and 4/52 kings, and 1 of the kings shares
the club, so that would make 12/52 clubs, and 3/52 kings, and 1/52
king of clubs.

What do I do from here?

Date: 02/20/98 at 16:17:08


From: Doctor Sam
Subject: Re: probability

Sandra,

I think you're making the problem harder than it needs to be. There
is only one king of clubs in an ordinary deck. So the probability of
picking it is 1/52.

A more difficult problem (which you are well on your way to solving)
is to find the probability that the card you pick is either a king OR
a club. Now the probability is 16/52.

You can get this answer the hard way, by listing all possibilities:

the clubs: A, 2, 3, 4, 5, 6, 7, 8, 9, 10, J, Q, K, and
the kings: of hearts, of spades, of diamonds.

There are sixteen possibilities, so this answer is 16/52.

But there might have been too many possibilities to list. Your method
can be used to answer this question. Many people get this question
WRONG by assuming that
P(king OR club) = P(king) + P(club)
= 4/52 + 13/52.

But this counts the king of clubs twice: once as a king and once as a
club. The correct method is to subtract the extra time this counts the
king of clubs:

P(king OR club) = P(king) + P(club) - P(king AND club)
                = 4/52 + 13/52 - 1/52
                = 16/52.

I hope that helps.

-Doctor Sam, The Math Forum


Check out our web site http://mathforum.org/dr.math/
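Doctor Sam's inclusion-exclusion count is easy to verify against an explicit deck (an illustrative sketch; the rank and suit labels are mine):

```python
from fractions import Fraction

ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["clubs", "diamonds", "hearts", "spades"]
deck = [(r, s) for r in ranks for s in suits]   # 52 cards

king_and_club = sum(1 for r, s in deck if r == "K" and s == "clubs")
king_or_club = sum(1 for r, s in deck if r == "K" or s == "clubs")

print(Fraction(king_and_club, len(deck)))   # 1/52
print(Fraction(king_or_club, len(deck)))    # 16/52 = 4/13
```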

Rolling a Number Cube

Date: 02/27/2002 at 19:34:22


From: Jennifer
Subject: Probability word problem

Here is a word problem my 4th grader brought home from school.
I can't figure out how to solve it.

A number cube is labeled with the numbers 3, 6, 9, 12, 15, and 18.
Describe an outcome of rolling the cube that has a probability of 3/6.

Date: 02/27/2002 at 21:12:04


From: Doctor Twe
Subject: Re: Probability word problem

Hi Jennifer - thanks for writing to Dr. Math.

Any characteristic that describes exactly 3 of the 6 numbers will do.
Here are several examples:

- rolling a number less than 10
- rolling a multiple of 6
- rolling an even number
- rolling a 3, 12, or 15
- rolling a number which, when divided by 3, yields a prime number

There are, of course, many other outcomes that have a probability of
3/6, but I hope that these examples clarify what is meant.

I hope this helps. If you have any more questions, write back.

- Doctor TWE, The Math Forum


http://mathforum.org/dr.math/

Socks

Date: 8/31/96 at 17:34:15


From: Anonymous
Subject: Socks

Jack's sock drawer contains 10 blue socks and 12 gray socks.
The room is dark and he cannot turn on the light. What is
the least number of socks he must take out of the drawer to
be certain of each of the following conditions?

a. He has a pair of the same color.
b. He has a pair of the blue socks.
c. He has a pair of gray socks.
d. He has one pair of each color.

I've just looked at it a lot and I cannot figure it out.
Thank you.

Date: 9/1/96 at 3:36:53


From: Doctor Mike
Subject: Re: Socks

>a. He has a pair of the same color.

Least number = 3, because if the first two do not make a pair,
then the third will match one of the first two.

>b. He has a pair of the blue socks

Least number = 14, because the worst case is he will pick all
the grey ones on the first 12 tries, and then he is guaranteed
that the next two will be blue.

>c. He has a pair of gray socks.

Least number = 12, because the worst case is he will pick all
the blue ones of the first 10 tries, and then he is guaranteed
that the next two will be grey.

>d. He has one pair of each color.

Least number = 14, following essentially the same logic as (b).
He could go for 12 tries and get only grey, and then the next
two would be blue. If you select 14 of them to take care of
this worst case scenario, then all the other situations would
be taken care of also.

Another answer would be to pick your socks out the previous evening
when you could turn the lights on, in which case the first three
answers would be 2 and the answer to d would be 4 (unless you are
color-blind).

I hope this helps.

-Doctor Mike, The Math Forum
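Doctor Mike's worst-case reasoning generalizes to any drawer: to force a pair of one color, an adversary first hands over every sock of the other colors. A sketch (the helper name and drawer dictionary are mine):

```python
def draws_for_pair_of(drawer, target):
    """Fewest draws guaranteeing a pair of `target` color: the worst
    case hands over every sock of the other colors first, then two of
    the target color."""
    others = sum(n for color, n in drawer.items() if color != target)
    return others + 2

drawer = {"blue": 10, "gray": 12}
print(draws_for_pair_of(drawer, "blue"))   # 14  (answer b)
print(draws_for_pair_of(drawer, "gray"))   # 12  (answer c)

# d) A pair of each color: the larger worst case covers both.
print(max(draws_for_pair_of(drawer, c) for c in drawer))   # 14

# a) A pair of SOME color: with 2 colors, 3 socks must repeat one.
print(len(drawer) + 1)   # 3
```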

Challenge Exercises: Unit 6
Probability Theory

Directions: Read each question below. Select your answer by clicking on its button.
Feedback to your answer is provided in the RESULTS BOX. If you make a mistake, choose
a different button. For some problems, the answers have been rounded to the nearest
percent.

1. Which of the following is the sample space when 2 coins
are tossed?

{H, T, H, T}

{H, T}

{HH, HT, TH, TT}

None of the above.

2. At Kennedy Middle School, 3 out of 5 students make
honor roll. What is the probability that a student does
not make honor roll?

65%

40%

60%

None of the above.

3. A large basket of fruit contains 3 oranges, 2 apples and
5 bananas. If a piece of fruit is chosen at random, what
is the probability of getting an orange or a banana?

None of the above.

4. A pair of dice is rolled. What is the probability of
getting a sum of 2?

None of the above.

5. In a class of 30 students, there are 17 girls and 13 boys.
Five are A students, and three of these students are
girls. If a student is chosen at random, what is the
probability of choosing a girl or an A student?

None of the above.

6. In the United States, 43% of people wear a seat belt
while driving. If two people are chosen at random, what
is the probability that both of them wear a seat belt?

86%

18%

57%

None of the above.

7. Three cards are chosen at random from a deck without
replacement. What is the probability of getting a jack, a
ten and a nine?

None of the above.

8. A city survey found that 47% of teenagers have a part
time job. The same survey found that 78% plan to
attend college. If a teenager is chosen at random, what is
the probability that the teenager has a part time job and
plans to attend college?

60%

63%

37%

None of the above.

9. In a school, 14% of students take drama and computer
classes, and 67% take drama class. What is the
probability that a student takes computer class given
that the student takes drama class?

81%

21%

53%

None of the above.

10. In a shipment of 100 televisions, 6 are defective. If a
person buys two televisions from that shipment, what is
the probability that both are defective?

None of the above.

Problems
Problem : If a die is rolled once, what is the probability that it will show an even
number? An odd number?
Solution for Problem 1 >>
1/2 , 1/2
Close

Problem : If a die is rolled once, what is the probability that it will show a prime number (1 is
not prime)?
Solution for Problem 2 >>
1/2
Close

Problem : If a die is rolled once, what is the probability that it will show a multiple of 3?
Solution for Problem 3 >>
1/3
Close

Problem : If a die is rolled once, what is the probability that it will show a multiple of 1? A
multiple of 7?
Solution for Problem 4 >>
1 , 0
Close

Problem : If a coin is flipped twice, what is the probability that it will land heads once and tails
once?
Solution for Problem 5 >>
1/2
Close

Problem : If a coin is flipped twice, what is the probability that it will land heads at least once?
Solution for Problem 6 >>
3/4
Close

Complementary Events
Two events are said to be complementary when one event occurs if and only if the other does
not. The probabilities of two complementary events add up to 1.

For example, rolling a 5 or greater and rolling a 4 or less on a die are complementary events,
because a roll is 5 or greater if and only if it is not 4 or less. The probability of rolling a 5 or
greater is 2/6 = 1/3, and the probability of rolling a 4 or less is 4/6 = 2/3. Thus, the total of their
probabilities is 1/3 + 2/3 = 3/3 = 1.

Example: If the probability of an event is p, what is the probability of its complement?

The probability of its complement is 1 - p.


Odds
The odds of an event are the ratio of the probability of the event to the probability of its
complement. In other words, they are the ratio of favorable outcomes to unfavorable outcomes.
We say the odds are "3 to 2," which means 3 favorable outcomes to every 2 unfavorable
outcomes, and we write 3 : 2. For example, the odds of rolling a 5 or greater are 2 : 4, which
reduces to 1 : 2.
Example 1: If we flip a coin two times, what are the odds for it landing heads at least once?

Favorable outcomes: 3 -- HH, HT, TH.
Unfavorable outcomes: 1 -- TT.

Thus, the odds for it landing heads at least once are 3 to 1, or 3 : 1.

Example 2: If the probability of an event happening is 2/7, what are the odds for that event?

Since the probability of the event is 2/7, the probability of its complement is 1 - 2/7 = 7/7 - 2/7 = 5/7.

Thus, the odds for that event are 2/7 : 5/7, which is equivalent to 2 : 5.

Example 3. If the odds for an event are 3 : 2, what is the probability of the event happening?

Favorable outcomes = 3.

Possible outcomes = favorable outcomes + unfavorable outcomes = 3 + 2 = 5.

Thus, the probability of the event happening is 3/5.
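The conversions used in these examples can be written as two small helpers (an illustrative Python sketch; the function names are mine):

```python
from fractions import Fraction

def odds_for(p):
    """Odds (favorable, unfavorable) for an event of probability p."""
    ratio = Fraction(p) / (1 - Fraction(p))   # p : (1 - p), reduced
    return ratio.numerator, ratio.denominator

def probability_from_odds(favorable, unfavorable):
    """Probability = favorable / (favorable + unfavorable)."""
    return Fraction(favorable, favorable + unfavorable)

print(odds_for(Fraction(3, 4)))        # (3, 1), as in Example 1
print(probability_from_odds(3, 2))     # 3/5, as in Example 3
```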

Probability
Probability is the branch of mathematics that studies the possible outcomes of given events
together with the outcomes' relative likelihoods and distributions. In common usage, the word
"probability" is used to mean the chance that a particular event (or set of events) will occur
expressed on a linear scale from 0 (impossibility) to 1 (certainty), also expressed as a percentage
between 0 and 100%. The analysis of events governed by probability is called statistics.
There are several competing interpretations of the actual "meaning" of probabilities. Frequentists
view probability simply as a measure of the frequency of outcomes (the more conventional
interpretation), while Bayesians treat probability more subjectively as a statistical procedure that
endeavors to estimate parameters of an underlying distribution based on the observed
distribution.
A properly normalized function that assigns a probability "density" to each possible outcome
within some interval is called a probability density function (or probability distribution function),
and its cumulative value (integral for a continuous distribution or sum for a discrete distribution)
is called a distribution function (or cumulative distribution function).
A variate is defined as the set of all random variables that obey a given probabilistic law. It is
common practice to denote a variate with a capital letter (most commonly X). The set of all
values that X can take is then called the range, denoted R_X (Evans et al. 2000, p. 5). Specific
elements in the range of X are called quantiles and denoted x, and the probability that a variate
X assumes the element x is denoted P(X = x).
Probabilities are defined to obey certain assumptions, called the probability axioms. Let a sample
space S contain the union (∪) of all possible events E_i, so

S = ∪_i E_i,                            (1)

and let E and F denote subsets of S. Further, let F' be the complement of F, so that

F' = S \ F.                             (2)

Then the set E can be written as

E = E ∩ S = (E ∩ F) ∪ (E ∩ F'),         (3)

where ∩ denotes the intersection. Then

P(E) = P(E ∩ F) + P(E ∩ F')             (4)
P(S) = 1                                (5)
0 <= P(E) <= 1                          (6)
P(E') = 1 - P(E)                        (7)
P(∅) = 0,                               (8)

where ∅ is the empty set.


Alice and Bob play a fair game repeatedly for one nickel each game. If originally
Alice has a nickels and Bob has b nickels, what are Alice's chances of winning all of
Bob's money, assuming the play goes on until one person has lost all her or his
money?

Let P(E|F) denote the conditional probability of E given that F has already occurred. Then

P(E|F) = P(E ∩ F)/P(F)                  (9)
P(F|E) = P(E ∩ F)/P(E)                  (10)
P(E ∩ F) = P(F) P(E|F)                  (11)
         = P(E) P(F|E)                  (12)
P(E'|F) = 1 - P(E|F)                    (13)
P(E) = P(E|F) P(F) + P(E|F') P(F').     (14)

The relationship

P(E ∩ F) = P(E) P(F)                    (15)

holds if E and F are independent events. A very important result states that

P(E ∪ F) = P(E) + P(F) - P(E ∩ F),      (16)

which can be generalized to the inclusion-exclusion principle for any number of events.

Let p(n) be Alice's chances of winning the total amount of a + b, provided she has n nickels in
her possession. Obviously p(0) = 0. If she is left with a non-zero capital, Alice may, at every
trial, win or lose one nickel, both with the probability of 1/2,
p(n) = p(n + 1)/2 + p(n - 1)/2, n > 0.
In other words, 2p(n) = p(n + 1) + p(n - 1), or p(n + 1) - p(n) = p(n) - p(n - 1). From here,
recursively,
p(n + 1) - p(n) = p(n) - p(n - 1)
= p(n - 1) - p(n - 2)
= p(n - 2) - p(n - 3)
...
= p(2) - p(1)
= p(1) - p(0)
= p(1).
It follows that p(n) = n p(1) and, since p(a + b) = 1, p(1) = 1 / (a + b). Therefore p(a) = a /
(a + b).
References
1. E. J. Barbeau, M. S. Klamkin, W. O. J. Moser, Five Hundred Mathematical Challenges,
MAA, 1995, #494
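The closed form p(a) = a/(a + b) can be checked against the recurrence 2p(n) = p(n + 1) + p(n - 1) directly (a sketch of mine, for one illustrative choice of a and b):

```python
from fractions import Fraction

def p_win(n, total):
    """Alice's chance of winning everything when she holds n of the
    `total` nickels, per the closed form derived above."""
    return Fraction(n, total)

a, b = 3, 7
total = a + b
values = [p_win(n, total) for n in range(total + 1)]

# Boundary conditions and the fair-game recurrence all hold exactly.
assert values[0] == 0 and values[total] == 1
for n in range(1, total):
    assert values[n] == (values[n + 1] + values[n - 1]) / 2

print(p_win(a, total))   # 3/10
```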

Problems

Problem : What is the probability of an event if the probability of its complement is ?
Solution for Problem 1 >>

Close

Problem : What is the probability of an event if its complement is impossible (has probability
0 )?
Solution for Problem 2 >>
1
Close

Problem : If the probability of an event is , what is the probability of its complement?


Solution for Problem 3 >>

Close

Problem : If a coin is flipped three times, the probability of getting all heads is 1/8. What is the
probability of getting tails at least once?
Solution for Problem 4 >>
7/8
Close

Problem : When flipping a coin, what are the odds for getting heads?
Solution for Problem 5 >>
1:1
Close

Problem : Paul has 1 green shirt, 5 red shirts, and 9 striped shirts. He randomly draws one out of
his drawer.

a) What are the odds for the shirt being green?
b) What are the odds for the shirt being red?
c) What are the odds for the shirt being striped?
d) What are the odds for the shirt not being striped?
Solution for Problem 6 >>
a) 1 : 14
b) 5 : 10 , or 1 : 2
c) 9 : 6 , or 3 : 2
d) 6 : 9 , or 2 : 3
Close

Problem : Using the previous problem:

a) What is the probability that the shirt is green?
b) What is the probability that the shirt is red?
c) What is the probability that the shirt is striped?
d) What is the probability that the shirt is not striped?
Solution for Problem 7 >>
a) 1/15
b) 5/15 , or 1/3
c) 9/15 , or 3/5
d) 6/15 , or 2/5
Close
Problem : If the probability of an event is 7/12, what are the odds for the event? The odds against
it (the odds for its complement)?
Solution for Problem 8 >>
7 : 5 , 5 : 7
Close

Problem : If the odds for an event are 4 : 5, what is the probability of the event? Of its
complement?
Solution for Problem 9 >>
4/9 , 5/9
Close