
Dr. Tejinder Sharma
Department of Commerce
Kurukshetra University, Kurukshetra 136119
sharmatejinder@gmail.com
Measurement &
Questionnaire Design
• Agenda
– Measurement
– Scale development
– Questionnaire design
Defining Measurement
• An integrative process of determining the intensity (or amount) of information about constructs, concepts or objects
Construct : Conceptual
Framework
• Construct
– Something that is being measured
• Construct development
– An integrative process wherein researchers identify the subjective properties for which data should be collected to solve the defined research problem.
Construct Measurement
• Concrete (measurable) properties
• Abstract (subjective) properties
• Number of dimensions (concrete and abstract)
• Operationalization (measurability issues of constructs)
• Reliability & validity issues
Types of Data
• State-of-being (verifiable facts)
– Physical and/or demographic items
• State-of-mind (mental thoughts or emotional feelings)
– Attitudes, perceptions, beliefs, etc.
• State-of-behavior (past or current behaviors)
– In a typical week, how often do you...?
• State-of-intention (planned future behaviors)
– Likelihood to engage in a behavior in future
Scaling
• Assigning numbers to various degrees of opinion, attitude and other concepts
• Determining quantitative measures of subjective/abstract concepts
• Assigning numbers to the properties of objects
Basic Levels of Scales
• Nominal
• Ordinal
• Interval
• Ratio
Nominal Scale
• Lowest level of measurement
• Numbers are used only as labels to classify objects
• Used for demographic variables, types of products, stores, etc.
• e.g., PIN code 136118 for Kurukshetra
• Analysis – counting, frequency, percentage, mode, binomial test, chi-square test, etc.
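The permissible nominal-scale analyses listed above (counting, frequency, percentage, mode) can be sketched in Python; the slides themselves use SPSS, so this is only an illustration with hypothetical store-choice data.

```python
from collections import Counter

# Hypothetical nominal data: store type chosen by 10 respondents
stores = ["kirana", "mall", "kirana", "online", "mall",
          "kirana", "online", "kirana", "mall", "kirana"]

counts = Counter(stores)                    # frequency of each category
mode = counts.most_common(1)[0][0]          # most frequent category
percentages = {k: 100 * v / len(stores) for k, v in counts.items()}

print(counts)        # Counter({'kirana': 5, 'mall': 3, 'online': 2})
print(mode)          # kirana
print(percentages)   # {'kirana': 50.0, 'mall': 30.0, 'online': 20.0}
```

Note that the numbers play no arithmetic role here: only classification and counting are meaningful at this level.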
Ordinal Scale
• Provides information about the ordered relationship among objects
• Contains all the information of a nominal scale, plus whether an object has more or less of a characteristic than another object
• For example – ranking of preferences
• Statistical analysis – percentage, median, rank correlation, non-parametric tests
Example of Ordinal Scale
Respondent Adidas Nike Reebok
A 2 1 3
B 1 2 3
C 2 3 1
D 1 3 2
E 1 2 3
F 1 3 2
G 2 3 1
H 2 1 3
I 1 2 3
J 2 1 3
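The ranking table above can be analysed with ordinal-appropriate statistics (counts and medians, not means). A minimal Python sketch using the table's data:

```python
from statistics import median

# Ranks from the table above (1 = most preferred), respondents A-J
ranks = {
    "Adidas": [2, 1, 2, 1, 1, 1, 2, 2, 1, 2],
    "Nike":   [1, 2, 3, 3, 2, 3, 3, 1, 2, 1],
    "Reebok": [3, 3, 1, 2, 3, 2, 1, 3, 3, 3],
}

# Ordinal analysis: count first-place rankings and take the median rank
first_place = {brand: r.count(1) for brand, r in ranks.items()}
median_rank = {brand: median(r) for brand, r in ranks.items()}

print(first_place)   # {'Adidas': 5, 'Nike': 3, 'Reebok': 2}
print(median_rank)   # {'Adidas': 1.5, 'Nike': 2.0, 'Reebok': 3.0}
```

Adidas is ranked first most often and has the best median rank, which is the kind of conclusion an ordinal scale supports.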
Interval Scale
• Intervals are adjusted in terms of some rule that has been established as a basis for making units equal
• Units are equal only by assumption
• Zero point is arbitrary – there is no absolute zero or unique origin
• Statistics – mean, s.d., correlation, t-test, F-test
Ratio Scale
• Expresses values as meaningful ratios of a common unit
• Measured from an absolute zero
• Measures the actual amount of a variable
• Amenable to all statistical techniques, including geometric mean, harmonic mean, coefficient of variation (COV), etc.
• Example – price of a product (Rs./unit)
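The ratio-only statistics named above (geometric mean, harmonic mean, COV) can be computed directly, since an absolute zero makes ratios meaningful. A sketch in Python with hypothetical price data:

```python
import math

# Hypothetical ratio-scale data: price per unit in Rs.
prices = [2.0, 8.0, 4.0]
n = len(prices)
mean = sum(prices) / n

geometric = math.prod(prices) ** (1 / n)                 # nth root of the product
harmonic = n / sum(1 / p for p in prices)                # reciprocal of mean reciprocal
sd = math.sqrt(sum((p - mean) ** 2 for p in prices) / n) # population s.d.
cov = sd / mean                                          # coefficient of variation

print(round(geometric, 4))   # 4.0
print(round(harmonic, 4))
print(round(cov, 4))
```

These quantities would be meaningless on an interval scale, where the zero point is arbitrary.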
Attitude Measurement

• Trilogy Approach
– Cognitive component (beliefs,
perceptions or knowledge)
– Affective component (emotions or
feelings)
– Conative component (intended or actual
behaviors)
Scale Development
• Process of developing reliable instrument,
which measures the desired variables
correctly and accurately
• Systematic and scientific process of
identifying the item variables, statements
and the scales for measuring them
• Ascertaining whether the instrument
correctly measures the desired variables
sufficiently and accurately
Classification of Scales
• Subject Orientation
– Designed to measure characteristics of the respondents
– Stimuli – response to develop categories (smokers/non-
smokers, etc)
• Response Form
– Categorical and comparative scales
• Degree of Subjectivity
– Personal preferences
• Scale properties
– Nominal, interval, ordinal, ratio
• Number of Dimensions
– Uni-dimensional, multi-dimensional
Scale Construction
Techniques
• Arbitrary Approach
– Scales developed on an ad hoc basis
– Developed on presumption
• Consensus Approach (Thurstone Differential Scale)
– Panel of judges evaluate the items chosen for inclusion in the
instrument
• Item Analysis Approach (Likert Scale)
– Individual items are tested by a group of respondents
– Analysed on the basis of degree of discrimination
• Cumulative Scale (Guttman’s Scalogram)
– Conformity to some ranking of items in ascending or
descending order
• Factor Scales (Osgood’s Semantic Differential Scale)
– On the basis of intercorrelations, to identify the common factors
Important Scaling
Techniques
• Rating Scale
• Ranking Scale
Rating Scale
• Qualitative description of a limited number of aspects
• Judge in terms of specific criteria
– Like --- Dislike
– Above average, average, below average
• 3 to 7 point scales are used
• The more scale points, the greater the sensitivity
Rating Scale Types
• Graphical Rating
– Points are put on a continuum
– Indicate rating by a tick mark
Like Very Much – Like Somewhat – Neutral – Dislike Somewhat – Dislike Very Much
Rating Scale Types
• Itemised Rating
– Presents a series of statements
– Respondent selects the statement that best describes the object
• He is always involved in some friction with his fellow workers
• He is often at odds with one or more of his fellow workers
• He sometimes gets involved in friction
• He infrequently becomes involved in friction with others
• He almost never gets involved in friction with his fellow workers
Ranking Scale

• Make comparative/relative
judgments
• Approaches
– Method of paired comparison
– Method of rank order
Method of Paired
Comparison
• Respondent expresses the attitude by
making choice between two objects
• The number of comparisons (N) to be made depends upon the number of objects (n):
N = n(n-1)/2    If n = 10, N = 45
• Reduce possible comparisons by sample
survey
• Paired Comparison can be converted to
interval data by the Thurstone’s Law of
Comparative Judgment & Guilford’s
composite standard method
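The comparison count N = n(n-1)/2 is simply the number of unordered pairs, which Python's standard library can enumerate directly; the brand list below is hypothetical.

```python
from itertools import combinations

# Hypothetical objects to be compared pairwise
brands = ["Adidas", "Nike", "Reebok", "Puma", "Fila"]

pairs = list(combinations(brands, 2))   # every unordered pair appears exactly once
n = len(brands)
N = n * (n - 1) // 2                    # same count via the formula N = n(n-1)/2

print(len(pairs), N)   # 10 10
# For n = 10 objects the respondent would face:
print(10 * 9 // 2)     # 45
```

The quadratic growth of N is why the slide suggests reducing the number of comparisons when many objects are involved.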
Scale Construction

• Thurstone Scale
• Likert Scale
Differential Scale (Thurstone
Scale)
• Uses consensus approach
• Method used in measuring attitude
on single dimension
• Used to measure the issues like war,
religion, etc.
Differential Scale (Thurstone
Scale)
• Researcher gathers a large number of statements
to express a point of view
• Submitted to a panel of judges to arrange them in
11 groups ranging from one extreme to another
• Sorting by each judge yields composite position
of each item
• Items of disagreement are discarded
• The median position of each selected item is determined
• Attitude comparison made on the basis of these
median scores
Likert Scale (Summated
Scale)
• Evaluates each item on its ability to
discriminate between those with high score and
those with low score
• Respondent indicates degree of agreement or
disagreement with the statements in the
instrument
• Each response is given a numerical score,
indicating favourableness or unfavourableness
and total score represents the attitude
Likert Scale (Summated Scale)
Procedure
• Collect large number of statements relevant
to the attitude
• Collect diverse statements which express
favourableness or unfavourableness
• Administer it to a group of respondents
• Do the coding 1 for lowest and 5/7 for the
highest
Likert Scale (Summated Scale)
Procedure
• Compute the total score of each respondent
• Arrange the total scores to find out the discriminating power of each statement
• Identify the top 25% and bottom 25% of respondents by total score; these groups best express the attitudes
• Statements correlating with total score are
retained in the final instrument and rest
are discarded
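The Likert item-analysis procedure above can be sketched in Python (not part of the original slides); the 8-respondent, 4-statement response matrix is hypothetical, coded 1 (lowest) to 5 (highest).

```python
# Hypothetical responses: 8 respondents x 4 statements
responses = [
    [5, 4, 5, 3],
    [4, 5, 4, 2],
    [2, 1, 2, 4],
    [1, 2, 1, 5],
    [5, 5, 4, 2],
    [2, 2, 1, 3],
    [4, 4, 5, 4],
    [1, 1, 2, 3],
]

totals = [sum(r) for r in responses]
order = sorted(range(len(responses)), key=lambda i: totals[i])
k = len(responses) // 4                       # 25% of respondents
low_group, high_group = order[:k], order[-k:]

# Discriminating power: difference in mean item score between high and low scorers
discrimination = []
for item in range(4):
    hi_mean = sum(responses[i][item] for i in high_group) / k
    lo_mean = sum(responses[i][item] for i in low_group) / k
    discrimination.append(hi_mean - lo_mean)
    print(f"statement {item + 1}: discrimination = {hi_mean - lo_mean:.2f}")
```

In this toy data, statement 4 barely separates high from low scorers and would be a candidate for discarding, while statements 1-3 discriminate well and would be retained.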
Likert Scale (Summated
Scale)
• Advantages
– Easier than Thurstone Scale
– Without panel of judges
– More reliable as it considers each item statement
and respondent
• Limitations
– Just gives the difference in attitudes and does
not quantify the same
Criteria for good
measurement
• Reliability
• Validity
• Sensitivity
• Relevance
• Versatility
• Ease of response
Reliability
• Ability to obtain similar results by measuring an object, trait or construct with independent but comparable measures
• Example: Do both CAT and MAT scores measure the candidate's performance?
Assessing Reliability
• Stability: Measure the same objects or individuals at two different points in time and then correlate their scores. Also known as test-retest reliability
• Example: Correlation of your score on the ACT in your Junior year and your score on the ACT in your Senior year
Assessing Reliability
• Equivalence: Determined by calculating the internal consistency or homogeneity of the set of items forming the scale. One way to calculate equivalence reliability is to use coefficient alpha.
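Coefficient (Cronbach's) alpha can be computed from item and total-score variances: alpha = (k/(k-1))(1 - Σ s_i² / s_total²). A minimal Python sketch, with a hypothetical 3-item, 5-respondent data set (the slides use SPSS for this):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """items: one list of scores per scale item (columns of the data matrix)."""
    k = len(items)
    scores = [sum(vals) for vals in zip(*items)]      # total score per respondent
    item_var = sum(pvariance(col) for col in items)   # sum of item variances
    total_var = pvariance(scores)                     # variance of total scores
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical data: 3 items answered by 5 respondents
items = [
    [4, 5, 3, 2, 4],
    [4, 4, 3, 2, 5],
    [5, 5, 2, 1, 4],
]
print(round(cronbach_alpha(items), 3))   # 0.922
```

As a sanity check, perfectly redundant items (every item identical) yield alpha = 1, the theoretical maximum.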
Example
For each of the following items, circle the number that best represents how you feel about making a sales call/presentation.

            Definitely do not feel        Definitely feel
Jittery     1    2    3    4    5    6    7
Active      1    2    3    4    5    6    7
Intense     1    2    3    4    5    6    7
Energetic   1    2    3    4    5    6    7
Fearful     1    2    3    4    5    6    7
Vigorous    1    2    3    4    5    6    7
Lively      1    2    3    4    5    6    7
Tense       1    2    3    4    5    6    7
Steps for reliability in SPSS
• Analyse → Scale → Reliability Analysis
• Model: Alpha
• Statistics: Descriptives for Scale, Descriptives for Scale if item deleted, Inter-item Correlation
• OK
Reliability Statistics
Cronbach's Alpha   Cronbach's Alpha Based on Standardised Items   N of Items
0.706              0.708                                          6

Item-Total Statistics
Variable  Scale Mean if  Scale Variance if  Corrected Item-    Squared Multiple  Cronbach's Alpha
          Item Deleted   Item Deleted       Total Correlation  Correlation       if Item Deleted
V2        25.89          22.655             0.431              0.231             0.669
V3        25.76          22.641             0.523              0.380             0.641
V4        25.69          22.561             0.480              0.315             0.653
V5        25.90          24.268             0.425              0.185             0.671
V6        26.14          23.715             0.364              0.171             0.690
V7        26.28          23.273             0.411              0.182             0.675
Validity
• Degree to which our measures reflect true differences among individuals – and that we're measuring what we think we're measuring
Assessing Validity
• Pragmatic Validity: How well the measure actually predicts some other characteristic or behavior
– Predictive Validity: A measure is used to predict something in the future
– Concurrent Validity: A measure is used to predict something assessed at the same point in time
Assessing Validity
• Content Validity: The adequacy with
which the domain of the characteristic is
captured by the measure. Also called face
validity.
• Construct Validity: Assessment of how
well the instrument captures the
construct, concept, or trait it is supposed
to be measuring
• Item to Total Correlation
Relationship between
Reliability and Validity
• Reliability is a necessary but not
sufficient condition for validity
• A measure may be reliable and not
valid
Sensitivity
• It is the ability of a measurement to
indicate changes or differences
• Eg : Three ad campaigns showed similar
sales. Possible reasons could be:
– The ads were similar
– The period of sales was brief and insensitive to
changes
– Sales might not be the right test to measure
the effectiveness of the ads
Relevance
• Relevance to the decision being made
• The construct must be identical to the description of the items
Versatility
• Robustness of measurement for various statistical interpretations, especially validity
Ease of response
• How easily a person will supply the data
Comparison of three modes of data collection

Parameter          Interview                 Telephone                 Mail/Self
Literacy           Not required              Not required              Required
Respondent skills  Language & skills needed  Language & skills needed  Not needed
Response rate      Highest                   Medium                    Lowest
Privacy            Difficult                 Some                      Good; anonymity, no
                                                                      embarrassment in giving replies
Consent            Easy to convince &        Convincing is possible    Convincing is difficult
                   get consent
Questionnaire
. . . a prepared set of questions (or measures) to which respondents or interviewers record answers
Steps in Questionnaire
Design:
Step 1: Determine Specific Data to be sought
Step 2: Determine Interview Process
Step 3: Evaluate Question Content
Step 4: Determine Response Format
Step 5: Determine Wording
Step 6: Determine Questionnaire Structure
Step 7: Determine Physical Characteristics
Step 8: Pretest – Revise – Finalize Draft
Questionnaire Design –
Identify Information Needs
• Clarify the nature of the research problem and
objectives.
• Develop research questions to meet research
objectives.
• Identify Variables from Literature
• Develop Statements to measure each item
• Select the right scale
Questionnaire Design –
Clarification of Concepts:
• Ensure the concept(s) can be clearly defined.
• Select the variables/indicators to represent the concepts.
• Determine the level of measurement.
Information Needs
• Prepare the following documents:
– Research Purpose
– Information to be measured
– Draft analysis plan
Research Questions:
• What are the most important factors
influencing the purchase of a laptop
computer?
• Do employees in this organization support
diversity in the workplace?
• What is the customer’s consideration while
purchasing a mutual fund?
Determine Interview
Process
• Interview administered survey
• Self administered personal survey
• Informal interviewing
• Telephone interview
• Mail survey
Self-Completion or
Interviewer Assisted
Questionnaire?
Respondent capabilities:
• Educational background.
• Vocabulary level.
• Prior experience in completing
questionnaires.
• Age.
• Cultural issues.
Questionnaire Design –
Typology of a
Questionnaire:
• Determine the types of questions to include
and their order.
• Check the wording and coding of questions.
• Decide on the grouping of the questions
and the overall length of the questionnaire.
• Determine the structure and layout of the
questionnaire.
QUESTIONNAIRE DESIGN
Two types of questions:
1. Open-ended.
2. Closed-ended.

• Open-ended questions place no constraints on respondents, who are free to answer in their own words.
• Closed-ended questions give the respondent the option of choosing from a number of predetermined answers.
Open-Ended Questions
• Free response
• Probing
• Projective technique
• Associative technique
• Construction technique
Open-ended Questions
• Typically used in exploratory/qualitative studies.
• Typically used in personal interview surveys involving small samples.
• Allows respondent freedom of response.
• Respondent must be articulate and willing to spend time giving a full answer.
• Data is in narrative form, which can be time consuming and difficult to code and analyze.
• Possible researcher bias in interpretation.
• Narrative can be analyzed using content analysis. Software is available (e.g., NUD*IST).
Open-Ended Questions:
examples
1. What do you think about your health
insurance plan?
2. Which mutual funds have you been
investing in for the past year?
3. How are the funds you are investing in
performing?
4. What do you think of airport security?
Closed-end Questions:
• Single Answer.
• Multiple Answer.
• Rank Order.
• Numeric.
• Likert-Type.
• Semantic Differential
Closed-end Questions
• Typically used in quantitative studies.
• Assumption is researcher has knowledge to pre-specify
response categories.
• Data can be pre-coded and therefore in a form amenable
for use with statistical packages (e.g., SPSS, SAS) –
data capture therefore easier.
• More difficult to design but simplifies analysis.
• Used in studies involving large samples.
• Limited range of response options.
Broad Considerations
• Sequencing of questions.
• Identification of concepts.
• How many questions are required to capture
each concept?
• Question wording.
• Overall length of questionnaire.
• Placing of sensitive questions.
• Ability of respondents.
• Level of measurement.
• Open-ended versus closed-end questions.
Questionnaire Sections

• Opening Questions
• Research Topic Questions
• Classification Questions
Screening or Filter
Questions:
• . . . are used to ensure respondents included in the study are those that meet the pre-determined criteria of the target population.
• "Tonight we are talking with individuals who are 18 years of age or older and have 50 percent or more of the responsibility for banking decisions in your household. Are you that person?" __ Yes __ No
Rapport Questions:
• . . . are used to establish rapport with the respondent by gaining their attention and stimulating their interest in the topic.
• "Have you seen any good movies in the last month?" __ Yes __ No
• "What is your favorite seafood restaurant?"
Concept = a generic idea formed in the mind.
• Example Concept: "Customer Interaction"
– This customer was easy to talk to.
– This customer genuinely appreciated my helping him/her.
– This customer was friendly.
– This customer seemed interested in me, not only as a salesperson, but also as a person.
Concepts

Concept Identification.
• Conceptual definition – e.g., Service Quality.
As perceived by customers, it is the extent of
discrepancy between customers’ expectations
or desires and their perceptions.
Working Definition for Concept.
• Decompose definition into components.
• Search for items that are measurable.
Preparing and Presenting
Good Questions:
• Use simple words.
• Be brief.
• Avoid ambiguity.
• Avoid leading questions.
• Avoid double-barreled questions.
• Be careful about question order and
context effects.
• Check questionnaire layout.
• Prepare clear instructions.
Avoid Position Bias:
• Position Bias:
– "How important are flexible hours in evaluating job alternatives?"
– "What factors are important in evaluating job alternatives?"
• No Position Bias:
– "What factors are important in evaluating job alternatives?"
– "How important are flexible hours in evaluating job alternatives?"
Double-Barreled Questions:
To what extent do you agree or disagree with the following statements?
• "Airtel's employees are friendly and helpful."
• "Airtel's employees are courteous and knowledgeable."
Branching Questions:
• . . . are used to direct respondents to answer the right questions as well as questions in the proper sequence.
– "Have you seen or heard any advertisements for wireless telephone service in the past 30 days?"
– "If 'No', go to question #10."
– "If 'Yes', were the advertisements on radio or TV or both?"
– "If the advertisements were on TV or on both radio and TV, then go to question #6."
– "If the advertisements were on radio, then go to question #8."
• Following questions #6 and #8 the next question would be:
– "Were any of the advertisements for 'Mahindra Xylo'?"
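A skip pattern like the one above can be represented as a routing table mapping (question, answer) pairs to the next question. A minimal Python sketch; the question identifiers are hypothetical labels for the example's questions.

```python
# Hypothetical routing table for the branching example above
routes = {
    ("seen_ads", "No"): "q10",
    ("seen_ads", "Yes"): "ad_medium",
    ("ad_medium", "Radio"): "q8",
    ("ad_medium", "TV"): "q6",
    ("ad_medium", "Both"): "q6",
}

def next_question(current, answer):
    """Return the identifier of the next question to ask."""
    return routes[(current, answer)]

print(next_question("seen_ads", "No"))       # q10
print(next_question("ad_medium", "Radio"))   # q8
```

Computer-assisted interviewing systems implement branching this way, so respondents never see the skip instructions themselves.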


Self-Completion
Instructions
The following issues typically are considered:
• Introducing and explaining how to answer a series of questions on a particular topic.
• Transition statements from one section (topic) of the questionnaire to another.
• Which question to go to next (branching or skipping).
Self-Completion
Instructions
• How many answers are acceptable, e.g., “Check only
one response.” Or “Check as many as apply.”
• Whether respondents are supposed to answer the
question by themselves, or can consult another
person or reference materials.
• What to do when the questionnaire is completed,
e.g., “When finished, place this in the postage paid
envelope and mail it.”
Interviewer-Assisted
Instructions:
The following issues typically are considered:
• How to increase respondent participation?
• How to screen out respondents that are not wanted and still keep them happy?
• What to say when respondents ask how to answer a particular question?
• When concepts may not be easily understood, how to define them?
Interviewer-Assisted
Instructions:
• When are answer alternatives to be read to respondents (aided response) and when not to be read (unaided response)?
• How to follow branching or skip patterns?
• When and how to probe?
• How to end the interview?
Questionnaire Design – Pre-
testing of a Questionnaire:
• Determine the nature of the pretest for the
preliminary questionnaire.
• Analyze initial data to identify limitations of
the preliminary questionnaire.
• Refine the questionnaire as needed.
• Revisit some or all of the above steps, if
necessary.
Questionnaire Design –
Administering a
Questionnaire:
• Identify the best practice for administering
the type of questionnaire utilized.
• Train and audit field workers, if required.
• Ensure a process is in place to handle
completed questionnaires.
• Determine the deadline and follow-up
methods.
Tips for question writing
• Format for questions
– Look at the following layouts and decide which you
would prefer to use:
– Do you agree, disagree or have no opinion that this company has:
– A good vacation policy – agree/not sure/disagree
– Good management feedback – agree/not sure/disagree
– Good medical insurance – agree/not sure/disagree
– High wages – agree/not sure/disagree
Tips for question writing
• Use simple and clear language
– Poor: How often do you punish your toddler?
– Better: How often do you put your toddler into
timeout? Check only one.
– Once a day 1
– Several times a day 2
– Once a week 3
– Several times a week 4
– Once a month 5
– Several times a month 6
Tips for question writing
• 1 Do not use biased words
– e.g., do not word a question as "Surely you are in favour of capital punishment?"
• 2 Do not use slang
– Example: How many kids do you have?
– Example: Should parents know the whereabouts of their teens 24/7?
• 3 Do not use double barreled questions
(one thought per question)
– Example: Curtailing development and protecting the
environment should be a top priority for “Our” town.
• 4 Do not use vague words or phrases
Tips for question writing
• 5 Do not use abbreviations
• Example: Should KU allow admission to MBA
without CAT ?
• Example: Which political party is responsible for
expanding the size of the GDP?
• 6 Do not use jargon or technical terms Example:
India should formulate a stricter fertility policy.
• 7 Do not use double negatives
• Example: Should Fiji not oppose the UN court?
• 8. Use caution when asking personal questions
Tips for question writing
Poor: How much do you earn each year? Rs______________
Better: In which category does your annual income last year best fit?
___ Below Rs10,000
___ Rs10,001-Rs20,000
___ Rs20,001-Rs30,000
___ Rs30,001-Rs40,000
___ Rs40,001-Rs50,000
___ Rs50,001-Rs60,000
___ Rs60,001-Rs70,000
___ Over Rs70,000
• 10 Non-exhaustive listings
Do you have all of the options covered? If you are unsure, conduct a pretest using the
"Other (please specify) __________" option
– Examples Marital status
– What are the listings?
• 11 Avoid loaded or leading questions
– Leading questions such as "Do you agree with the majority of people that the health service is failing?" presuppose the answer and should be avoided.
Discussion
