
5 Modeling the preference with utility

ZHAN, Wenjie (Professor)


School of Management, Huazhong
University of Science & Technology
Tel: 027-87556472
Email: wjzhan@mail.hust.edu.cn
5 Modeling the preference with utility
Objectives:
Know the difference between value and utility.
Know the EMV criterion and the EU criterion.
Know how to derive a utility function: the probability-equivalence approach.
Know how to interpret utility functions: risk attitude and risk premium.
Know utility theory and its development.
5 Modeling the preference with utility
5.1 Introduction
5.2 Limitations of the EMV criterion
5.3 How to derive a utility function
5.4 Interpreting utility functions
5.5 The axioms of utility
5.6 Development of utility theory
5.1 Introduction
In evaluating the consequences of possible actions, two major problems are encountered:
The first is that the values of the consequences may not have any obvious scale of measurement.
The second is that the scale may not reflect true "value" to the decision maker even when there is a clear scale (usually monetary) by which consequences can be evaluated.
The first problem in evaluating the
consequences of possible actions
The first is that the values of the
consequences may not have any
obvious scale of measurement.
Three examples
The first problem: no obvious scale
to measure the consequence
Example 1: prestige, customer goodwill, and
reputation are important to many businesses,
but it is not clear how to evaluate their
importance in a concrete way.
The first problem: no obvious scale
to measure the consequence
Example 2: if the alternatives are:
Apple
Orange
Banana

Preference refers to the set of assumptions related to the ordering of some alternatives, based on the degree of happiness, satisfaction, gratification, enjoyment, or utility they provide.
The first problem: no obvious scale
to measure the consequence
Example 3: if the alternatives are:

A system of preferences, or preference structure, refers to the set of qualitative relations between different alternatives of consumption.
The second problem in evaluating
the consequences of possible actions
The second is that the scale may not
reflect true "value" to the decision
maker even when there is a clear scale
(usually monetary) by which
consequences can be evaluated.
Three examples
The second problem: not reflect
true value to decision maker
Example 1: As an example, consider the value to you of money. Assume you have the opportunity to do a rather unpleasant task for $100. At your present income level, you might well value the $100 enough to do the task. If, on the other hand, you had first received one million dollars, the value to you of an additional $100 would be much less, and you would probably choose not to do the task. In other words, the value of $1,000,100 is probably not the same as the value of $1,000,000 plus the value of $100.

[Figure: a concave utility-of-money curve (utility plotted against money) illustrating the law of diminishing marginal utility]
The second problem: not reflect
true value to decision maker
Example 2: Suppose you are offered a choice
between the following two options:
A: receiving a gift of $10,000 for sure;
B: participating (for free) in a gamble wherein
you have a 50-50 chance of winning $0 or
$25,000.

Risk attitude is the chosen response of an individual or group to uncertainty that matters, driven by perception. Understanding risk attitude is a critical success factor that promotes effective decision making in risky situations.
Example 3: St. Petersburg
Game
Consider the following gamble, known as the
St. Petersburg game: You flip a coin repeatedly
until a tail first appears. The pot starts at $1
and doubles every time a head appears. You
win whatever is in the pot the first time you
throw tails and the game ends. For example:
Tail on the 1st toss: Win $1
Tail on the 2nd toss: Win $2
Tail on the 3rd toss: Win $4
Tail on the 4th toss: Win $8

How much would you be willing to pay to take part in this game?
Example 3: St. Petersburg
Game (cont.)
What is the expected value of the St. Petersburg game? First, what is the probability of throwing the first tail in a given round?
1st round: Pr(Tails) = 1/2
2nd round: Pr(Heads) × Pr(Tails) = 1/4
3rd round: Pr(Heads) × Pr(Heads) × Pr(Tails) = 1/8
kth round: 1/2^k
How much can you expect to win on average? With probability 1/2 you win $1, with 1/4 you win $2, with 1/8 you win $4, and with 1/16 you win $8.
Example 3: St. Petersburg
Game (cont.)
How much can you expect to win on average? With probability 1/2 you win $1, with 1/4 you win $2, with 1/8 you win $4, and with 1/16 you win $8.
So the EV = 1/2 × 1 + 1/4 × 2 + 1/8 × 4 + 1/16 × 8 + ...
= 1/2 + 1/2 + 1/2 + 1/2 + ... = ∞
Example 3: St. Petersburg
Game (cont.)
The expected value of the game is
infinite, and yet, few people would
be willing to pay more than $20 to
play it.
This is known as the St. Petersburg
Paradox.
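A minimal Monte Carlo sketch (a hypothetical simulation, not part of the original example) makes the paradox concrete: although the expected value is infinite, the sample average of actual plays stays modest for any feasible number of games.

```python
import random

def st_petersburg_payout(rng: random.Random) -> int:
    """Play one game: the pot starts at $1 and doubles on each head;
    you win the pot on the first tail."""
    pot = 1
    while rng.random() < 0.5:  # heads with probability 1/2
        pot *= 2
    return pot

def average_payout(n: int, seed: int = 0) -> float:
    """Average winnings over n independent games."""
    rng = random.Random(seed)
    return sum(st_petersburg_payout(rng) for _ in range(n)) / n

# The sample average creeps upward with n but remains far from infinite,
# consistent with few people being willing to pay more than $20 to play.
for n in (1_000, 100_000):
    print(n, average_payout(n))
```

Each doubling of the pot is exactly offset by a halving of its probability, so the average is dominated by rare long runs of heads that almost never occur in a finite sample.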
Example 3: St. Petersburg
Game (cont.)
The resolution, proposed by Daniel
Bernoulli in 1738, was that the value
of a gamble is not its monetary value.
Instead, people attach some subjective
value, or utility, to monetary outcomes.

Thus, people do not seek to maximize expected values, but instead maximize expected utility.
5.1 Introduction
To work mathematically with ideas of "value," it will be
necessary to assign numbers indicating how much
something is valued.
Such numbers are called utilities, and utility theory deals with
the development of such numbers.
Utility is the (perceived) ability of something to satisfy
needs or wants.
Utility is an important concept in economics and game theory,
because it represents satisfaction experienced by the
consumer of a good.
In economics, utility is a representation of preferences
over some set of goods and services.
5.2 Limitations of the EMV
criterion: an example

Example: Imagine that you own a high-technology company which has been given the task of developing a new component for a large engineering corporation. Two alternative, but untried, designs are being considered (for simplicity, we will refer to these as designs 1 and 2), and because of time and resource constraints only one design can be developed. Table 5.3 shows the estimated net returns which will accrue to your company if each design is developed, and the estimated probabilities of failure, partial success and total success for each design.
Example: Limitations of EMV
Limitation 1
Thus according to the EMV criterion you should develop
design 2, but would this really be your preferred course
of action?
There is a 30% chance that design 2 will fail and lead to a loss of $6 million. If your company is a small one or facing financial problems, then losses of this sort might put you out of business.
Design 1 has a smaller chance of failure, and if failure does occur
then the losses are also smaller.
Remember that this is a one-off decision, and there is therefore
no chance of recouping losses on subsequent repetitions of the
decision.
Clearly, the risks of design 2 would deter many people. The EMV
criterion therefore fails to take into account the attitude to risk of
the decision maker.
Limitation 2
A further limitation of the EMV criterion is that it
focuses on only one attribute: money.
In choosing the design in the problem we considered
above we may also wish to consider other attributes:
the effect on company image of successfully developing a sophisticated new design;
the spin-offs of enhanced skills and knowledge resulting from the development;
the time it would take to develop the designs.
All these attributes, like the monetary returns, would
probably have some risk associated with them.
Conclusion:
The EMV criterion would be appropriate only for a decision maker who is concerned solely with monetary rewards and whose decision is repeated a large number of times, so that a long-run average result would be of relevance to him.
5.3 How to derive a utility
function
Given the limitations of the EMV criterion, the concept of utility can be used to take into account the decision maker's attitude to risk (or risk preference).
The attitude to risk of a decision maker can
be assessed by eliciting a utility function.
This is to be distinguished from the value
functions we met in previous chapters. Value
functions are used in decisions that do not
involve any consideration of risk attitudes.
An Example
A business woman who is organizing a business
equipment exhibition in a provincial town has to choose
between two venues: the Luxuria Hotel and the Maxima
Center. To simplify her problem, she decides to estimate
her potential profit at these locations on the basis of two
scenarios: high attendance and low attendance at the
exhibition.
If she chooses the Luxuria Hotel, she reckons that she has a
60% chance of achieving a high attendance and hence a profit of
$30 000 (after taking into account the costs of advertising, hiring
the venue, etc.). There is, however, a 40% chance that
attendance will be low, in which case her profit will be just $11
000.
If she chooses the Maxima Center, she reckons she has a 50%
chance of high attendance, leading to a profit of $60 000, and a
50% chance of low attendance leading to a loss of $10 000.
Decision tree of the example
Decision by EMV criterion
Now, if we apply the EMV criterion to the decision, we find that the business woman's expected profits are:
EMV of Luxuria: 0.6 × $30 000 + 0.4 × $11 000 = $22 400
EMV of Maxima: 0.5 × $60 000 − 0.5 × $10 000 = $25 000
This suggests that she should choose the Maxima Center, but this is the riskiest option, offering high rewards if things go well but losses if things go badly.
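The EMV computation above can be sketched in a few lines of Python; this minimal example uses the figures from the venue problem.

```python
def emv(outcomes):
    """Expected monetary value: sum of probability * payoff over all outcomes."""
    return sum(p * x for p, x in outcomes)

# (probability, payoff) pairs from the exhibition example
luxuria = [(0.6, 30_000), (0.4, 11_000)]
maxima  = [(0.5, 60_000), (0.5, -10_000)]

print(round(emv(luxuria)))  # 22400
print(round(emv(maxima)))   # 25000
```

On EMV alone, the Maxima Center wins, which is exactly the recommendation the text goes on to question.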
Utility function
Let us now try to derive a utility function to represent the business woman's attitude to risk. We will use the notation u(·) to represent the utility of the sum of money which appears in the parentheses.
First, we rank all the monetary returns which appear on the tree from best to worst and assign a utility of 1.0 to the best sum of money and 0 to the worst sum.
Utility function (cont.)
Note: we could also use 0 and 100 for the utilities, but the use of 0 and 1 here will enable us to interpret what utilities actually represent. If other values were used, they could easily be transformed to a scale ranging from 0 to 1 without affecting the decision maker's preferences between the courses of action.
Utility function (cont.)
We now need to determine the business woman's utilities for the intermediate sums of money.
There are several approaches which can be adopted to elicit utilities. The most commonly used methods involve offering the decision maker a series of choices between receiving given sums of money for certain or entering hypothetical lotteries. The decision maker's utility function is then inferred from the choices that are made.
The method which we will demonstrate here is an
example of the probability-equivalence approach.
Probability-equivalence approach
Alternative 1 (the lottery): the best outcome (utility = 1) with probability p, or the worst outcome (utility = 0) with probability (1 − p).
Alternative 2 (the certainty): the other outcome (utility = ?).
At the point of indifference:
Expected utility of alternative 2 = expected utility of alternative 1
Utility of other outcome = p × (utility of best outcome, which is 1) + (1 − p) × (utility of worst outcome, which is 0)
Utility of other outcome = p × 1 + (1 − p) × 0 = p
Utility function (cont.)
To obtain the business woman's utility for $30 000 using this approach, we offer her a choice between receiving that sum for certain or entering a hypothetical lottery which will result in either the best outcome on the tree (i.e. a profit of $60 000) or the worst (i.e. a loss of $10 000) with specified probabilities.
These probabilities are varied until the
decision maker is indifferent between the
certain money and the lottery. At this point,
as we shall see, the utility can be calculated.
A typical elicitation session
Question 1: Which of the following would you prefer?
A: $30 000 for certain; or
B: a lottery ticket which will give you a 70% chance of $60 000 and a 30% chance of −$10 000?
Answer: A 30% chance of losing $10 000 is too risky; I'll take the certain money.
We therefore need to make the lottery more attractive by increasing the probability of the best outcome.
A typical elicitation session
(cont.)
Question 2: Which of the following would you prefer?
A: $30 000 for certain; or
B: a lottery ticket which will give you a 90% chance of $60 000 and a 10% chance of −$10 000?
Answer: I now stand such a good chance of winning the lottery that I think I'll buy the lottery ticket.
The point of indifference between the certain money and the lottery should therefore lie somewhere between a 70% chance of winning $60 000 (when the certain money was preferred) and a 90% chance (when the lottery ticket was preferred).
A typical elicitation session
(cont.)
Suppose that after trying several probabilities we pose the following question.
Question 3: Which of the following would you prefer?
A: $30 000 for certain; or
B: a lottery ticket which will give you an 85% chance of $60 000 and a 15% chance of −$10 000?
Answer: I am now indifferent between the certain money and the lottery ticket.
A typical elicitation session
(cont.)
We are now in a position to calculate the utility of $30 000. Since the business woman is indifferent between options A and B, the utility of $30 000 will be equal to the expected utility of the lottery. Thus:
u($30 000) = 0.85 × u($60 000) + 0.15 × u(−$10 000)
Since we have already allocated utilities of 1.0 and 0 to $60 000 and −$10 000, respectively, we have:
u($30 000) = 0.85 × 1.0 + 0.15 × 0 = 0.85
A typical elicitation session
(cont.)
Note: once we have found the point of indifference, the utility of the certain money is simply equal to the probability of the best outcome in the lottery.
Thus, if the decision maker had been indifferent between the options which we offered in the first question, her utility for $30 000 would have been 0.7.
A typical elicitation session
(cont.)
We now need to determine the utility of $11 000.
Suppose that after being asked a similar series of questions the business woman finally indicates that she would be indifferent between receiving $11 000 for certain and a lottery ticket offering a 60% chance of the best outcome ($60 000) and a 40% chance of the worst outcome (−$10 000).
This implies that u($11 000) = 0.6.
A typical elicitation session
(cont.)
We can now state the complete set of utilities, shown below:
−$10 000 → 0; $11 000 → 0.6; $30 000 → 0.85; $60 000 → 1.0
Decision by EU criterion
(cont.)
These results are now applied to the
decision tree by replacing the monetary
values with their utilities.
By treating these utilities in the same
way as the monetary values, we are
able to identify the course of action
which leads to the highest expected
utility (EU).
Decision by EU criterion (cont.)
[Figure: the decision tree with the monetary values replaced by their utilities]
Decision by EU criterion
(cont.)
Choosing the Luxuria Hotel gives an expected utility of:
0.6 × 0.85 + 0.4 × 0.6 = 0.75
Choosing the Maxima Center gives an expected utility of:
0.5 × 1.0 + 0.5 × 0 = 0.5
Thus the business woman should choose the Luxuria Hotel as the venue for her exhibition. Clearly, the Maxima Center would be too risky.
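As a minimal sketch, the elicited utilities and the expected-utility comparison can be reproduced in a few lines of Python, using the numbers from the exhibition example.

```python
# Utilities elicited via the probability-equivalence approach:
utility = {-10_000: 0.0, 11_000: 0.6, 30_000: 0.85, 60_000: 1.0}

def expected_utility(outcomes):
    """Expected utility: sum of probability * utility of each payoff."""
    return sum(p * utility[x] for p, x in outcomes)

luxuria = [(0.6, 30_000), (0.4, 11_000)]
maxima  = [(0.5, 60_000), (0.5, -10_000)]

print(round(expected_utility(luxuria), 2))  # 0.75
print(round(expected_utility(maxima), 2))   # 0.5
```

Note how the ranking flips relative to EMV: once the concave utilities replace raw money, the safer Luxuria Hotel comes out ahead.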
Expected Value
Definition: Suppose we associate a monetary payoff with
each possible event in a gamble. The expected value of
the gamble is the weighted average of the payoffs where
the weight for each event is its probability.

More formally, let there be n events, let p_i be the probability of event i, and let x_i be the payoff associated with event i. The following formula gives the expected value of the gamble:
EV = Σ (i = 1 to n) p_i · x_i
Expected Utility
Definition: Suppose we consider an individual as having a utility function over the possible payoffs from a gamble, u(x_i). The expected utility of the gamble is the expected value of the utility:
EU = Σ (i = 1 to n) p_i · u(x_i)
(compare with EV = Σ (i = 1 to n) p_i · x_i)
5.4 Interpreting utility functions
The business woman's utility function has been plotted on a graph in Figure 5.5.
If we selected any two points on this curve and drew a straight line between them, the curve would always lie above the line.
Utility functions having this concave shape provide evidence of risk aversion, which is consistent with the business woman's avoidance of the riskiest option.
Definitions: Risk attitudes
Risk aversion: a decision maker who may choose a lower expected value to avoid risk or a big loss.
Risk neutrality: a decision maker who uses expected values only, no matter how small or large the numbers involved.
Risk proneness: a decision maker who may choose a riskier option for the chance of a big win.
Risk attitudes
[Figure: three utility curves U(W) plotted against wealth W, comparing U(a) and U(b) at outcomes a < b]
Risk proneness: U′(W) > 0, U″(W) > 0 (convex)
Risk neutral: U′(W) > 0, U″(W) = 0 (linear)
Risk aversion: U′(W) > 0, U″(W) < 0 (concave)
Risk attitudes
(cont.)
P: risk proneness (also called risk seeking or risk loving)
N: risk neutrality
A: risk aversion
Risk Premium
Let s be the certainty equivalent of a gamble x, i.e. the sure sum the decision maker regards as equally attractive as the gamble. The risk premium is k = E(x) − s.
For a risk-averse decision maker: k = E(x) − s > 0.
For a risk-prone decision maker: k = E(x) − s < 0.
Mathematical expression of risk attitude
Suppose u(x) is a utility function, u′(x) is its first derivative, and u″(x) is its second derivative. We have:
u′(x) > 0, which means that u(x) is monotonically increasing. This implies that people always prefer a higher value to a lower one.
If the decision maker is risk averse, then u″(x) < 0;
if risk seeking, then u″(x) > 0;
if risk neutral, then u″(x) = 0.
Mathematical expression of risk attitude (cont.)
The local risk-aversion coefficient summarizes these cases:
r(x) = −u″(x) / u′(x)
r(x) > 0: risk aversion
r(x) = 0: risk neutrality
r(x) < 0: risk seeking
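The coefficient r(x) can be checked numerically with finite differences; this is a sketch, and the exponential utility with parameter a = 0.5 is an assumed example, not one from the lecture.

```python
import math

def local_risk_aversion(u, x, h=1e-4):
    """Estimate r(x) = -u''(x)/u'(x) via central differences."""
    u1 = (u(x + h) - u(x - h)) / (2 * h)           # first derivative
    u2 = (u(x + h) - 2 * u(x) + u(x - h)) / h**2   # second derivative
    return -u2 / u1

# Exponential utility: u(x) = 1 - exp(-a*x) has constant r(x) = a.
a = 0.5
u_cara = lambda x: 1 - math.exp(-a * x)
print(round(local_risk_aversion(u_cara, 1.0), 3))  # 0.5 -> positive, so risk averse

# A linear (risk-neutral) utility gives r(x) of essentially zero:
print(abs(local_risk_aversion(lambda x: 2 * x, 1.0)) < 1e-6)  # True
```

Because r(x) divides u″ by u′, it is unchanged by rescaling the utility (replacing u by a + b·u with b > 0), which is why it isolates risk attitude from the arbitrary 0-to-1 scaling chosen earlier.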
Convex Function and Concave Function
Definition of convex: let f(x) be a function defined on the interval [A, B]. If
[f(A) + f(B)]/2 > f[(A + B)/2],
then we say that f(x) is convex on the interval [A, B].
Definition of concave: let f(x) be a function defined on the interval [A, B]. If
[f(A) + f(B)]/2 < f[(A + B)/2],
then we say that f(x) is concave on the interval [A, B].
Convex Function and
Concave Function
Properties: let f′(x) be the first-order derivative of the function f(x), and f″(x) be the second-order derivative of f(x). Then:
(1) when f′(x) is monotonically increasing on an interval [A, B], f(x) is convex on this interval (f″(x) > 0);
(2) when f′(x) is monotonically decreasing on an interval [A, B], f(x) is concave on this interval (f″(x) < 0).
5.5 The axioms of utility
In the earlier elicitation session we saw that, in order to derive the utility function, the decision maker had to answer questions of the form "Which of the following would you prefer?"
What is the relationship between preference and utility?
Definition of preference
relation
If there are two lotteries X and Y, then:
X ≻ Y means that the decision maker prefers X to Y.
X ≺ Y means that the decision maker prefers Y to X.
X ~ Y means that the decision maker is indifferent between X and Y.
X ≽ Y means that the decision maker feels that Y is not preferred to X.
X ≼ Y means that the decision maker feels that X is not preferred to Y.
ordinal number & cardinal
number
ordinal number: A number indicating position
in a series or order. The ordinal numbers are
first (1st), second (2nd), third (3rd), and so
on.

A preference relation is ordinal, not cardinal.
For example, X ≻ Y means that X comes first in the decision maker's preference ordering and Y comes second.
ordinal number & cardinal
number
cardinal number: a number, such as 0.85 or 0.6, used to indicate quantity rather than just order.
A utility function is cardinal, not merely ordinal.
For example, if U(X) = 0.85 and U(Y) = 0.6, then U(X) > U(Y) since 0.85 > 0.6.
With a utility function we not only know the preference between X and Y (X ≻ Y), but also by how much X is preferred to Y: U(X) − U(Y) = 0.85 − 0.6 = 0.25.
From preference relation to
utility function
A utility function U(·) represents a preference relation if and only if:
x ≻ y ⟺ U(x) > U(y)
x ≺ y ⟺ U(x) < U(y)
x ~ y ⟺ U(x) = U(y)
From preference relation to
utility function (cont.)
Question: Can every preference relation be represented by a utility function?
Von Neumann and Morgenstern (1944) proved that if preferences obey the following four reasonable axioms, then they can be represented by a utility function.
These four axioms are known as the Axioms of Utility, or the Von Neumann–Morgenstern axioms.
Axiom 1: The complete
ordering axiom
If there are two lotteries X and Y, then exactly one of the following holds: X ≻ Y, X ~ Y, or X ≺ Y.
Axiom 2: The transitivity
axiom
If X ≻ Y and Y ≻ Z, then X ≻ Z.
The transitivity axiom implies that the decision maker's preferences contain no cycles.
Axiom 3: The continuity axiom
If X ≻ Y ≻ Z, then there exists a unique p ∈ (0, 1) such that:
p·X + (1 − p)·Z ~ Y.
The continuity axiom implies the existence of a continuous utility function representing the preference relation.
Axiom 4: The substitution
axiom
If X ≻ Y, then:
α·X + (1 − α)·Z ≻ α·Y + (1 − α)·Z for any 0 < α < 1.
Explanation:
X ≻ Y
⟹ α·X + (1 − α)·X ≻ α·Y + (1 − α)·Y
⟹ α·X + (1 − α)·Z ≻ α·Y + (1 − α)·Z
5.6 Development of utility
theory
5.6.1 Allais's paradox
5.6.2 Ellsberg's paradox
5.6.3 Prospect theory
5.6 Development of utility
theory
We have seen that utility theory is designed
to provide guidance on how to choose
between alternative courses of action under
conditions of uncertainty.
A rational decision maker always chooses the action which leads to the highest expected utility (EU).
Do people always make decisions like this?
5.6.1 Allais's paradox
Choice 1: A: $1 m for certain; or B: a lottery offering a 0.89 chance of $1 m, a 0.10 chance of $5 m, and a 0.01 chance of $0. Most people choose A.
Choice 2: X: a lottery offering a 0.10 chance of $5 m and a 0.90 chance of $0; or Y: a lottery offering a 0.11 chance of $1 m and a 0.89 chance of $0. Most people choose X.
5.6.1 Allais's paradox (cont.)
If we let u($5 m) = 1 and u($0 m) = 0, then selecting A suggests that:
u($1 m) > 0.89 × u($1 m) + 0.1 × u($5 m) + 0.01 × u($0 m)
⟹ u($1 m) > 0.89 × u($1 m) + 0.1
⟹ 0.11 × u($1 m) > 0.1
⟹ u($1 m) > 0.1/0.11
5.6.1 Allais's paradox (cont.)
However, choosing X implies that:
0.9 × u($0) + 0.1 × u($5 m) > 0.89 × u($0) + 0.11 × u($1 m)
⟹ 0.1 > 0.11 × u($1 m)
⟹ u($1 m) < 0.1/0.11
This contradicts u($1 m) > 0.1/0.11, so no utility function is consistent with both choices.


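The contradiction can also be checked mechanically; here is a minimal sketch that scans candidate values of u($1 m) for one consistent with both choices.

```python
# With u($5m) = 1 and u($0) = 0, write u for u($1m).
# Choosing A in the first pair requires   u > 0.89*u + 0.10  (i.e. u > 10/11).
# Choosing X in the second pair requires  0.10 > 0.11*u      (i.e. u < 10/11).
def consistent_with_A(u):
    return u > 0.89 * u + 0.10

def consistent_with_X(u):
    return 0.10 > 0.11 * u

# Scan a fine grid of utilities in [0, 1] for one satisfying both constraints.
both = [u / 1000 for u in range(1001)
        if consistent_with_A(u / 1000) and consistent_with_X(u / 1000)]
print(both)  # []
```

The empty result mirrors the algebra: the A-and-X pattern, common among real subjects, admits no expected-utility representation.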
5.6.2 Ellsberg's paradox
Suppose there are 100 balls in an opaque jar. We know that 33 of them are red and that the remaining balls are either black or yellow: the black and yellow balls together number 67, but we do not know the exact number of each kind.
Consider the following gambles:
G1: $1000 if red
G2: $1000 if black
G3: $1000 if red or yellow
G4: $1000 if black or yellow
5.6.2 Ellsberg's paradox (cont.)
Red Black Yellow
G1 $1000 $0 $0
G2 $0 $1000 $0
G3 $1000 $0 $1000
G4 $0 $1000 $1000
5.6.2 Ellsberg's paradox (cont.)
EU for G1: p(red) × u($1000)
EU for G2: p(black) × u($1000)
If people prefer G1 to G2, then we have:
p(red) × u($1000) > p(black) × u($1000)
⟹ p(red) > p(black)
5.6.2 Ellsberg's paradox (cont.)
EU for G3: [p(red) + p(yellow)] × u($1000)
Since p(red) + p(yellow) + p(black) = 1, we have p(red) + p(yellow) = 1 − p(black).
So EU for G3 = (1 − p(black)) × u($1000).
EU for G4: [p(black) + p(yellow)] × u($1000)
Since p(red) + p(yellow) + p(black) = 1, we have p(black) + p(yellow) = 1 − p(red).
So EU for G4 = (1 − p(red)) × u($1000).
If people prefer G4 to G3, then we have:
(1 − p(black)) × u($1000) < (1 − p(red)) × u($1000)
⟹ 1 − p(black) < 1 − p(red)
⟹ p(red) < p(black)
This contradicts p(red) > p(black) derived above.
Summary of paradoxes
Allais's paradox was put forward in 1953, and Ellsberg's paradox in 1961. Both paradoxes have stimulated much debate.
However, we should emphasize that utility theory does not attempt to describe the way in which people actually make decisions like those posed above.
Utility theory is intended as a normative theory, which indicates what a rational decision maker should do if he accepts the axioms of the theory.
However, people in the real world often behave as boundedly rational decision makers, not completely rational ones.
Remember that utility theory is designed simply as an aid to decision making, and if a decision maker wants to ignore its indications then that is his prerogative.
5.6.3 Prospect theory (PT)
EU theory is a normative decision theory:
U(X) = E[u(x)] = p1·u(x1) + p2·u(x2) + ... + pn·u(xn) = Σ (i = 1 to n) pi·u(xi)
PT is a descriptive decision theory. It replaces the utility function u(x) with a value function v(x), and each probability p with a decision weight π(p):
V(X) = E[v(x)] = π(p1)·v(x1) + π(p2)·v(x2) + ... + π(pn)·v(xn) = Σ (i = 1 to n) π(pi)·v(xi)
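The slides do not specify functional forms for v(x) and π(p); a common choice is the parametric family of Tversky and Kahneman (1992), sketched below with their estimated parameters (α = 0.88, λ = 2.25, γ = 0.61). These forms and numbers are assumptions drawn from that paper, not from this lecture.

```python
def value(x, alpha=0.88, lam=2.25):
    """Value function: concave for gains, convex and steeper for losses
    (loss aversion), per the Tversky-Kahneman (1992) parametric form."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def weight(p, gamma=0.61):
    """Inverse-S decision weight: overweights small probabilities."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def prospect_value(outcomes):
    """V(X) = sum of weight(p) * value(x) over (probability, payoff) pairs."""
    return sum(weight(p) * value(x) for p, x in outcomes)

# Losses loom larger than gains, and small probabilities are overweighted:
print(value(-100) < -value(100))  # True
print(weight(0.01) > 0.01)        # True
# Loss aversion makes even a fair coin flip for +/-$100 unattractive:
print(prospect_value([(0.5, 100), (0.5, -100)]) < 0)  # True
```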
5.6.3 Prospect theory (PT)
[Figure: the value function v(x) and the decision weight function π(p)]
5.6.3 Prospect theory
Kahneman and Tversky (1979, 1986) convincingly demonstrated that the EU model fails to accommodate the preference patterns exhibited by a large number of subjects in some well-defined situations.
Three effects identified by Kahneman and Tversky inform the development of prospect theory:
(1) the certainty effect
(2) the reflection effect
(3) the isolation effect
(1) Certainty effect
First, consider this example: which of the following
options do you prefer?
A. 33% chance to win $2500, 66% chance to win $2400, and
1% chance to win nothing;
B. 100% chance to win $2400

Now, consider this problem: which of the following options do you prefer?
C. 33% chance to win $2500 and 67% chance to win nothing
D. 34% chance to win $2400 and 66% chance to win nothing
In the first case, 18% of participants chose option A while 82% chose option B.
In the second case, 83% of participants chose option C while only 17% chose option D.
(1) Certainty effect
Paradox in the examples:
In the first case, we have U(A) <U(B). Then,
0.33U(2500)+0.66U(2400)<U(2400)
0.33U(2500) <0.34U(2400)
In the second case, we have U(C) >U(D). Then,
0.33U(2500) > 0.34U(2400)

Conclusion: when facing uncertainty (or risk), people often prefer the certain result; that is, they place a disproportionately high value on certainty.
(2) Reflection effect
Example 1: you must choose between one of the two gambles:
A: A 100% chance of losing $3000.
B: An 80% chance of losing $4000, and a 20% chance of losing
nothing.

Example 2: you must choose between:
C: A 100% chance of receiving $3000.
D: An 80% chance of receiving $4000, and a 20% chance of receiving nothing.
Kahneman and Tversky found that only 20% of people chose D, while 92% chose B. A similar pattern held for varying positive and negative prizes and probabilities. This led them to conclude that when decision problems involve not just possible gains but also possible losses, people's preferences over negative prospects are more often than not a mirror image of their preferences over positive prospects. Simply put: while they are risk averse over prospects involving gains, people become risk loving over prospects involving losses.
(3) isolation effect
Example1 : Consider the following two-stage
game: In the first stage, there is a 75%
chance you win nothing and the game ends,
and a 25% chance of moving into the second
stage. If you reach the second stage you
have a choice between:
A: An 80% chance of $4,000
B: $3,000 for sure
(3) isolation effect (cont.)
Example 2: Consider the following one-stage game:
C: A 20% chance of $4,000
D: A 25% chance of $3,000
(3) isolation effect (cont.)
Example 1:
EU for A: 0.25 × 0.8 × U($4000) = 0.20 × U($4000)
EU for B: 0.25 × 1 × U($3000) = 0.25 × U($3000)
Example 2:
EU for C: 0.20 × U($4000)
EU for D: 0.25 × U($3000)
(3) isolation effect (cont.)
People are thus much more risk averse
in the sequential presentation, even
though the gambles are identical.
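Viewed from the start of the two-stage game, the prospects are identical to those of the one-stage game; a quick check of the overall probabilities confirms it.

```python
# Two-stage game, evaluated before the first stage
# (25% chance of reaching stage 2):
prob_A = 0.25 * 0.80   # overall chance of winning $4,000 under A
prob_B = 0.25 * 1.00   # overall chance of winning $3,000 under B

# Equivalent one-stage game:
prob_C = 0.20          # chance of $4,000 under C
prob_D = 0.25          # chance of $3,000 under D

print(prob_A == prob_C, prob_B == prob_D)  # True True
```

Since A matches C and B matches D outcome for outcome, expected utility theory demands the same choice in both presentations; subjects nevertheless tend to pick B in the two-stage frame and C in the one-stage frame.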
Prospect Theory: Weighting Function
[Figure: the decision weight π(p) plotted against the probability p]
The attention given to an outcome depends not only on the probability of the outcome but also on the favorability of the outcome in comparison to the other possible outcomes.
Prospect Theory: Value Function
[Figure: the S-shaped value function plotted over gains and losses]
Value (utility) is measured relative to a reference point, usually current wealth; framing can influence the reference point.
Gains and losses are treated differently: risk aversion for gains, risk seeking for losses.
Homework 4:
A manager faces the following decision problem, with the decision tree below (payoffs in $ millions):
Option 1: Low (0.2) → $4; Medium (0.5) → $3; High (0.3) → $2
Option 2: Low (0.2) → $2; Medium (0.5) → $3; High (0.3) → $4
Option 3: Low (0.2) → $3; Medium (0.5) → $4; High (0.3) → $2
Homework 4 (cont.):
If his utility function is:
(1) U(x) = x;
(2) U(x) = 0.3·x²;
(3) U(x) = 2·x − 0.3·x²;
where x = profit in millions of dollars.
Q1: What is his choice according to the EMV criterion?
Q2: What are his choices according to the three utility functions (EU criterion)?
Q3: If the utility function is a linear function, such as U(x) = a + b·x (a ≥ 0, b > 0), do the values of the parameters a and b affect his choice?
The end
