This content downloaded from 14.139.157.21 on Fri, 12 Oct 2018 09:55:48 UTC
All use subject to https://about.jstor.org/terms
The classical approach to market behavioral analysis rarely uses data provided by the transitional, or switching, habits of the consumer. In this article, the authors have taken the types of laundry powders purchased by a housewife to define the state space of a Markov chain. Using this model, future purchase behavior is predicted, and statistical inferences on the switching habits are made.
In recent years there has been a great deal of interest in the applications of stochastic processes in industry. In particular, Markov chain models have been tried in quite a few areas. The number of papers appearing in both advertising and market research journals on this subject has been on the increase in the past few years [3]. This paper is an illustration of certain concepts of discrete time parameter finite Markov chains, a Markov chain being a stochastic process in which the future state of the system is dependent only on the present state and is independent of past history.

Naturally, one of the main concerns of any producer of consumer goods is to get people to use his product. Also important, however, is to get people repurchasing his product once they have used it. That is, the producer wants his customers to be loyal customers. However, complete loyalty is seldom found, and thus it is useful for the producer to have information on the switching habits of buyers in the commodity market.

Each family's weekly purchases were classified in terms of four mutually exclusive and wholly exhaustive categories of buying habits. These were

(1) family buying detergent only,
(2) family buying soap powder only,
(3) family buying both detergent and soap powder together, and
(4) family buying no laundry powder at all.

These four categories will be referred to as "states" in the Markov analysis. A summary of purchase information is shown in Table 1. One can see that over the 26-week period, the purchases were divided up on the average as follows:

Table 1
DISTRIBUTION OF HOUSEWIVES' PURCHASES
[Entries not recovered; columns are Week Period Ending (k) and States 1–4, beginning in 1957.]
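The four categories can be encoded directly from a family's weekly purchase record. The following is a minimal sketch; the function name and its boolean arguments are illustrative, not taken from the article:

```python
def purchase_state(bought_detergent: bool, bought_soap: bool) -> int:
    """Classify one family-week into the four mutually exclusive,
    wholly exhaustive buying-habit states of the Markov analysis."""
    if bought_detergent and bought_soap:
        return 3  # both detergent and soap powder together
    if bought_detergent:
        return 1  # detergent only
    if bought_soap:
        return 2  # soap powder only
    return 4      # no laundry powder at all
```

Because every family-week yields exactly one of the four values, a sequence of such codes is a realization on the chain's state space.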
MARKOV CHAINS APPLIED TO MARKETING 51
… being applied here is said to be an rth order Markov chain.

[Table: χ² TEST RESULTS FOR ZERO VERSUS FIRST ORDER MARKOV CHAINS — entries not recovered.]
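The table compares a zero order chain, in which this week's state is independent of last week's, against a first order chain, in which it depends on the previous state. One standard way to make that comparison is a χ² test of independence on the 4 × 4 table of week-to-week transition counts; the counts below are invented for illustration and are not the article's data:

```python
# Hypothetical 4x4 table: rows = state this week, columns = state next week.
counts = [
    [120,  20,  5,  15],
    [ 25, 260, 10,  35],
    [  6,  12, 20,   8],
    [ 18,  40,  7, 180],
]

n = sum(map(sum, counts))
row_tot = [sum(row) for row in counts]
col_tot = [sum(counts[i][j] for i in range(4)) for j in range(4)]

# Under the zero order (independence) hypothesis, the expected count in
# cell (i, j) is row_tot[i] * col_tot[j] / n; a large chi2 statistic
# favors first order dependence.
chi2 = sum((counts[i][j] - row_tot[i] * col_tot[j] / n) ** 2
           / (row_tot[i] * col_tot[j] / n)
           for i in range(4) for j in range(4))
df = (4 - 1) * (4 - 1)  # 9 degrees of freedom for a 4x4 table
```

The heavy diagonal of such a table drives χ² far above the 5-percent critical value for 9 degrees of freedom (about 16.9), which is the pattern that loyalty-dominated transition matrices would produce.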
52 JOURNAL OF MARKETING RESEARCH, FEBRUARY 1964
As is indicated by the matrices in Table 3, the loyalty probability for each state is much higher than the switching probability between two different states. To consider the possibility of regularities throughout the 26-week period, one may graph the loyalty probabilities² for each state, as in Figure 1. One can see relatively stable loyalties, with the exception of state 3, where the wild fluctuations are clearly due to the small frequencies observed for this state.

This leads to the possibility that the transition probabilities are "stationary," that is, independent of time or purchase period. A system with stationary transition probabilities is called a homogeneous Markov chain. Denote the transition probability² from state i to state j at periods k and k + 1 respectively as pij(k), given by the i,j elements of the transition matrix; i, j = 1, 2, 3, 4; k = 1, 2, ..., 25. Then to test the hypothesis of stationary transition probabilities, consider the null hypothesis:

H0: pij(k) = pij for all k = 1, 2, ..., 25

against the composite alternative hypothesis:

H1: pij(k) dependent on the period k.

A likelihood ratio test can be used to test these hypotheses and is formulated in the appendix. Asymptotically one finds an equivalent standard normal variate as given by (3) in the appendix, which for the data under analysis proves to be 0.69. This is clearly not significant, since the corresponding significance level is over 24 percent. Hence one has insufficient evidence to reject the null hypothesis and may consider a homogeneous Markov chain model. Thus one represents the system by a single "stationary" transition probability matrix P, the maximum likelihood estimate of which is shown in Table 5. The switching pattern of the system is therefore taken to be independent of time.

LIMITING DISTRIBUTION

Inspection of P shows that it is possible to move from every state to every other state. No particular switch is impossible. Such a matrix defining a Markov chain therefore has the property that

lim (n → ∞) Pⁿ = E

where E is an idempotent matrix with all its rows the same, and each row adding to unity. A matrix is said to be idempotent when its square equals itself, i.e., E = E². The row defining E will be a vector of probabilities, and also the left-hand characteristic vector of P corresponding to its characteristic root of unity. If e denotes a column vector of unities, then one may write E = e l′, and l′P = l′, where l′ is the row of E. Thus if the Markov chain starts with probabilities as given in l′, it will always have these probabilities. One says that l′ defines the limiting, or stationary, distribution of the Markov chain. In this case the elements of l′ will contain the shares of the market attained by the various states if the switching pattern defined by P were to persist for a long period of time. It thus gives an indication of where the market is heading, which is a useful piece of information for the market strategist. Comparison of l′ with the current market shares will indicate how far from stationarity the current market distribution is.

In this case, one finds that

l′ = (21.14, 40.85, 6.14, 31.87) percent

while observed share vectors (from Table 1) are found to be

Week 1: (22.00, 44.00, 9.00, 25.00) percent
Week 26: (22.00, 37.00, 5.00, 36.00) percent
Average: (21.04, 41.23, 6.42, 31.31) percent

The closeness of l′ to the average market share indicates that the market distribution is approximately stationary throughout the 26 weeks considered. It has thus been found that the data from this survey fit fairly well to the model of a first order Markov chain with stationary transition probabilities and with, on the average, a stationary market share distribution.

PREDICTION

One of the possible further applications of Markov chains is prediction of future market positions. It assumes the next position primarily dependent on the present one …

² Maximum likelihood estimates of these are found by dividing the transition frequency (as shown in Table 3) by the initial period frequency (as shown in Table 1), and it is these that are used.
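The limiting vector can be computed from any regular transition matrix, either as the left characteristic vector for root one or, as below, by raising P to a high power and reading off a row of the limit E. The matrix here is invented for illustration; the article's estimated P (Table 5) is not reproduced in this excerpt:

```python
def limiting_distribution(P, iterations=200):
    """Approximate l' by forming a high power of P; for a regular chain
    every row of P^n converges to the same limiting distribution."""
    def matmul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
                 for j in range(len(B[0]))] for i in range(len(A))]
    Pn = P
    for _ in range(iterations):
        Pn = matmul(Pn, P)
    return Pn[0]  # any row of the (near-)idempotent limit E

# Hypothetical homogeneous transition matrix; each row sums to one.
P = [
    [0.70, 0.15, 0.03, 0.12],
    [0.08, 0.80, 0.02, 0.10],
    [0.10, 0.20, 0.50, 0.20],
    [0.08, 0.12, 0.02, 0.78],
]

l = limiting_distribution(P)
# Defining property of the limit: l'P = l', and l' is a probability vector.
lP = [sum(l[i] * P[i][j] for i in range(4)) for j in range(4)]
assert all(abs(a - b) < 1e-9 for a, b in zip(l, lP))
assert abs(sum(l) - 1.0) < 1e-9
```

Comparing such an l′ with the current share vector gives the distance-from-stationarity check described in the text.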
FIGURE 1
LOYALTY PROBABILITIES BY STATE
[Four panels plotting state loyalty (percentage, 0 to 100) against transitions (5 to 25); panels C and D are titled BOTH POWDER LOYALTY and NO POWDER LOYALTY.]
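The prediction application reduces to repeated multiplication of a row vector of current market shares by P: the shares expected k weeks ahead are s′Pᵏ. A sketch, using the Week 26 shares quoted in the text together with an invented transition matrix (the article's Table 5 estimate is not available in this excerpt):

```python
def predict(shares, P, weeks):
    """Project a row vector of market shares forward: s' -> s' P^k."""
    s = list(shares)
    for _ in range(weeks):
        s = [sum(s[i] * P[i][j] for i in range(len(s)))
             for j in range(len(P[0]))]
    return s

# Hypothetical transition matrix (rows sum to one).
P = [
    [0.70, 0.15, 0.03, 0.12],
    [0.08, 0.80, 0.02, 0.10],
    [0.10, 0.20, 0.50, 0.20],
    [0.08, 0.12, 0.02, 0.78],
]

week26 = [0.22, 0.37, 0.05, 0.36]  # Week 26 shares from the text
ahead4 = predict(week26, P, 4)     # predicted shares four weeks ahead
assert abs(sum(ahead4) - 1.0) < 1e-9  # still a probability vector
```

Because each multiplication pulls the share vector toward l′, long-range predictions from any starting shares approach the limiting distribution.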