
Introduction

What is a Markov model?


It is a mathematical modeling technique, derived from matrix algebra, that describes the transitions a cohort of patients makes among a number of mutually exclusive and exhaustive health states during a series of short intervals, or cycles.
Properties of a Markov model
 A patient is always in one of a finite number of health states.
 Events are modeled as transitions from one state to another.
 The contribution of utility to the overall prognosis depends on the length of time spent in each health state.
 During each cycle, patients may make transitions from one state to another.
Markov models are particularly useful for describing a wide variety of behaviors, such as consumer behavior patterns, mobility patterns, friendship formation, networks, voting patterns, environmental management, and patient movement between hospital stations, among innumerable other applications in the social sciences.
Markov models represent a variant of decision analysis for pharmacoeconomic evaluation where the treatment pathways and options may be both complex and repetitive. Markov models can also be used in situations where prevalence-based as well as incidence-based assessments are required.
They represent complex decision processes in a simple and convenient mathematical form.
Development planning involves a number of theories and planning models; the Markov model is one of the best of these planning models.
A Markov model is a stochastic model used to model randomly changing systems where it is assumed that future states depend only on the current state, not on the events that occurred before it (that is, it assumes the Markov property).
The Markov chain is one of the most powerful tools for analyzing complex stochastic systems.
Markov chain models have become popular in manpower planning systems.
Several researchers have adopted Markov chain models to clarify manpower policy issues.
What is a Markov chain model?
A stochastic model that describes the probabilities of transitions among the states of a system.
It is a random process that undergoes transitions from one state to another on a state space.
The change of state depends probabilistically only on the current state of the system.
It is required to possess a property that is usually characterized as "memorylessness": the probability distribution of the next state depends only on the current state and not on the sequence of events that preceded it.
A Markov model is defined by a set of states.
Some states emit symbols.
Other states (for example, the begin state) are silent.
Changes of state depend probabilistically on the current state of the system.
The Markov chain model makes the calculation of conditional probabilities easy.
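As a small illustration of this memoryless property (a minimal sketch, using a hypothetical two-state chain that is not part of the presentation), the distribution of the next state is read directly from the row of the current state, however the chain arrived there:

```python
import random

# Hypothetical two-state chain ("A", "B"); the numbers are illustrative only.
P = {"A": {"A": 0.9, "B": 0.1},
     "B": {"A": 0.5, "B": 0.5}}

def step(current):
    """Draw the next state using only the current state (Markov property)."""
    return "A" if random.random() < P[current]["A"] else "B"

# The conditional probability of the next state comes straight from the matrix:
# P(next = "B" | current = "A") = 0.1, regardless of the path taken to "A".
state = "A"
path = [state]
for _ in range(10):
    state = step(state)
    path.append(state)
print(" -> ".join(path))
```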
Markov assumptions
The probabilities of moving from a given state to all possible states (including remaining in it) sum to one.
The probabilities apply to all system participants.
The probabilities are constant over time.
Configuration of the Markov chain model
Markov systems deal with stochastic environments in which possible outcomes occur at the end of a well-defined, usually first, period.
This situation further involves a multi-period time frame, during which the consumer's transient behavior, for example, affects the stability of the firm's performance.
This transient behavior, whose future outcome is unknown but needs to be predicted, creates inter-period transition probabilities.
Such a stochastic process contains a special case in which the transition probabilities from one time period to another remain stationary; in this case the process is referred to as a Markov chain.

Principles of a Markov model


Example: note that in a Markov model we work with probabilities, so cohort rates or percentages are converted to probabilities.
[State-transition diagram, rates per 100 patients per cycle: from the well state, 20 move to sick and 5 to dead; from the sick state, 30 move to dead.]

Transition probability matrix

        well    sick    dead
well    0.75    0.2     0.05
sick    0       0.7     0.3
dead    0       0       1
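The same matrix can be written down and checked numerically; a minimal sketch (assuming NumPy is available) verifying the Markov assumption that the probabilities leaving each state sum to one:

```python
import numpy as np

# Transition probability matrix from the table above,
# rows and columns ordered (well, sick, dead).
P = np.array([
    [0.75, 0.20, 0.05],   # well -> well, sick, dead
    [0.00, 0.70, 0.30],   # sick -> well, sick, dead
    [0.00, 0.00, 1.00],   # dead is an absorbing state
])

# Each row must sum to one.
print(P.sum(axis=1))      # [1. 1. 1.]
```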

[State-transition diagram: well stays well with probability 0.75, moves to sick with 0.2 and to dead with 0.05; sick stays sick with 0.70 and moves to dead with 0.30; dead stays dead with probability 1. The probabilities leaving each state sum to one: 0.70 + 0.30 = 1 and 0.75 + 0.20 + 0.05 = 1.]

Cycle   well    sick    dead
0       1       0       0
1       0.75    0.2     0.05
2       0.56    0.29    0.15
Calculation of cycle 2:
  well = 0.75 x 0.75                            (well*well)
  sick = (0.75 x 0.2) + (0.2 x 0.7)             (well*sick) + (sick*sick)
  dead = (0.75 x 0.05) + (0.2 x 0.3) + 0.05     (well*dead) + (sick*dead) + dead
3       0.42    0.32    0.26
Calculation of cycle 3:
  well = 0.56 x 0.75
  sick = (0.56 x 0.20) + (0.29 x 0.7)
  dead = (0.56 x 0.05) + (0.29 x 0.3) + 0.15
4       0.32    0.31    0.38
Calculation of cycle 4:
  well = 0.42 x 0.75
  sick = (0.42 x 0.2) + (0.32 x 0.7)
  dead = (0.42 x 0.05) + (0.32 x 0.30) + 0.26
Cycle well sick dead
0 1 0 0
1 0.75 0.2 0.05
2 0.56 0.29 0.15
3 0.42 0.32 0.26
4 0.32 0.31 0.38
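The cohort trace above can be reproduced by multiplying the state distribution by the transition matrix once per cycle; a minimal sketch (assuming NumPy), whose unrounded results agree with the hand calculation to two decimal places:

```python
import numpy as np

# Transition probability matrix, states ordered (well, sick, dead).
P = np.array([[0.75, 0.20, 0.05],
              [0.00, 0.70, 0.30],
              [0.00, 0.00, 1.00]])

dist = np.array([1.0, 0.0, 0.0])     # cycle 0: the whole cohort is well

for cycle in range(5):
    print(cycle, np.round(dist, 2))  # 0 [1. 0. 0.], 1 [0.75 0.2 0.05], ...
    dist = dist @ P                  # distribution for the next cycle
```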

Please note that all the previous numbers are expressed in probability terms. If the problem states a population size or a specific cost, carry out all the previous steps and then multiply by the stated population or cost.
Construct a Markov model for a cohort of 5000 using the following:
[Same state-transition diagram as above, rates per 100 patients per cycle: 20 well patients become sick, 5 well patients die, and 30 sick patients die.]

Cycle   Well probability   Well number   Sick probability   Sick number   Dead probability   Dead number
0       1                  5000          0                  0             0                  0
1       0.75               3750          0.2                1000          0.05               250
2       0.56               2800          0.29               1450          0.15               750
3       0.42               2100          0.32               1600          0.26               1300
4       0.32               1600          0.31               1550          0.38               1900
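The same trace scaled to the cohort of 5000 can be sketched as follows (again assuming NumPy). Note that the hand calculation rounds the probabilities to two decimals each cycle, so the exact counts differ slightly from the table (for example, about 2812 rather than 2800 well patients at cycle 2):

```python
import numpy as np

P = np.array([[0.75, 0.20, 0.05],
              [0.00, 0.70, 0.30],
              [0.00, 0.00, 1.00]])

cohort = 5000
dist = np.array([1.0, 0.0, 0.0])     # all 5000 patients start in the well state

for cycle in range(5):
    counts = dist * cohort           # expected number of patients in each state
    print(cycle, np.round(counts).astype(int))
    dist = dist @ P
```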
Let's try to understand the Markov chain with a very simple example.

Weather:

If it is raining today, it will rain tomorrow with probability 60% and not rain with probability 40%.
If it is not raining today, it will rain tomorrow with probability 20% and not rain with probability 80%.

Stochastic finite state machine

[State diagram: Rain stays Rain with probability 0.6 and moves to No rain with 0.4; No rain stays No rain with probability 0.8 and moves to Rain with 0.2.]
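A minimal sketch of this weather chain (assuming NumPy), showing that the probability of rain two days from now, given rain today, follows from squaring the transition matrix:

```python
import numpy as np

# Weather transition matrix, states ordered (rain, no rain).
P = np.array([
    [0.6, 0.4],   # rain today    -> rain / no rain tomorrow
    [0.2, 0.8],   # no rain today -> rain / no rain tomorrow
])

# Two-step transition probabilities.
P2 = np.linalg.matrix_power(P, 2)
print(P2[0, 0])   # P(rain in two days | rain today) = 0.6*0.6 + 0.4*0.2 = 0.44
```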

Conclusion

The Markov chain is a simple concept that can explain highly complicated real-time processes; many artificial intelligence tools use this simple principle, the Markov chain, in some form.
This presentation illustrates how easy it is to understand this concept and some of its applications.