
Evaluating HRD Programs

Chapter 9


HRD3e. Contributed by Wells Doty, Ed.D., Clemson Univ.

The HRD process model:

Assessment
- Assess needs
- Prioritize needs

Design
- Define objectives
- Develop lesson plans
- Develop/acquire materials
- Select trainer/leader
- Select methods and techniques
- Schedule the program/intervention

Implementation
- Deliver the HRD program or intervention

Evaluation
- Select evaluation criteria
- Determine evaluation design
- Conduct evaluation of the program or intervention
- Interpret results


Effectiveness

- The degree to which a training program (or other HRD program) achieves its intended purpose.
- Measures are relative to some starting point.
- Measures how well the desired goal is achieved.

HRD Evaluation
Textbook definition: The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities.

In Other Words
Are we training:
- the right people
- the right stuff
- the right way
- with the right materials
- at the right time?

Evaluation Needs

- Descriptive and judgmental information is needed.
- Objective and subjective data.
- Information gathered according to a plan and in a desired format.
- Gathered to provide decision-making information.

Purposes of Evaluation

- Determine whether the program is meeting its intended objectives.
- Identify strengths and weaknesses.
- Determine the cost-benefit ratio.
- Identify who benefited most or least.
- Determine future participants.
- Provide information for improving HRD programs.

Purposes of Evaluation-2

- Reinforce major points to be made.
- Gather marketing information.
- Determine whether the training program is appropriate.
- Establish a management database.

Evaluation Bottom Line

- Is HRD a revenue contributor or a revenue user?
- Is HRD credible to line and upper-level managers?
- Are the benefits of HRD readily evident to all?

How Often are HRD Evaluations Conducted?


- Not often enough!
- Frequently, only end-of-course participant reactions are collected.
- Transfer to the workplace is evaluated less frequently.

Why HRD Evaluations are Rare

- Reluctance to have HRD programs evaluated.
- Evaluation requires expertise and resources.
- Factors other than HRD can cause performance improvements, e.g.:
  - Economy
  - Equipment
  - Policies, etc.

Need for HRD Evaluation


- Shows the value of HRD.
- Provides metrics for HRD efficiency.
- Demonstrates a value-added approach for HRD.
- Demonstrates accountability for HRD activities.
- Everyone else has it; why not HRD?

Make or Buy Evaluation


- I bought it, therefore it is good.
- Since it's good, I don't need to posttest.
- Who says it's:
  - Appropriate?
  - Effective?
  - Timely?
  - Transferable to the workplace?

Evolution of Evaluation Efforts


1. Anecdotal approach (short interesting stories): Talk to other users.
2. Try before buy: Borrow and use samples.
3. Analytical approach: Match research data to training needs.
4. Holistic approach: Look at the overall HRD process, as well as individual training.

Models and Frameworks of Evaluation

Table 7-1 lists nine frameworks for evaluation. The most popular is that of D. Kirkpatrick:
1. Reaction
2. Learning
3. Job Behavior
4. Results

Kirkpatrick's Four Levels

- Reaction: Focus on trainees' reactions.
- Learning: Did they learn what they were supposed to?
- Job Behavior: Was it used on the job?
- Results: Did it improve the organization's effectiveness?

Issues Concerning Kirkpatrick's Framework

- Most organizations don't evaluate at all four levels.
- Focuses only on post-training.
- Doesn't treat inter-stage improvements.
- WHAT ARE YOUR THOUGHTS?

Other Frameworks/Models 1

- CIPP: Context, Input, Process, Product
- CIRO: Context, Input, Reaction, Outcome
- Brinkerhoff:
  - Goal setting
  - Program design
  - Program implementation
  - Immediate outcomes
  - Usage outcomes
  - Impacts and worth

Other Frameworks/Models 2

- Kraiger, Ford, & Salas:
  - Cognitive outcomes
  - Skill-based outcomes
  - Affective outcomes
- Phillips:
  - Reaction
  - Learning
  - Applied learning on the job
  - Business results
  - ROI

A Suggested Framework 1

- Reaction: Did trainees like the training? Did the training seem useful?
- Learning: How much did they learn?
- Behavior: What behavior change occurred?

A Suggested Framework 2

- Results: What were the tangible outcomes? What was the return on investment (ROI)? What was the contribution to the organization?

Data Collection for HRD Evaluation


Possible methods:
- Interviews
- Questionnaires
- Direct observation
- Written tests
- Simulation/performance tests
- Archival performance information

Interviews
Advantages:
- Flexible
- Opportunity for clarification
- Depth possible
- Personal contact

Limitations:
- High reactive effects
- High cost
- Face-to-face threat potential
- Labor intensive
- Trained observers needed

Questionnaires
Advantages:
- Low cost to administer
- Honesty increased
- Anonymity possible
- Respondent sets the pace
- Variety of options

Limitations:
- Possible inaccurate data
- Response conditions not controlled
- Respondents set varying paces
- Uncontrolled return rate

Direct Observation
Advantages:
- Non-threatening
- Excellent way to measure behavior change

Limitations:
- Possibly disruptive
- Reactive effects are possible
- May be unreliable
- Trained observers needed

Written Tests
Advantages:
- Low purchase cost
- Readily scored
- Quickly processed
- Easily administered
- Wide sampling possible

Limitations:
- May be threatening
- Possibly no relation to job performance
- Measures only cognitive learning
- Relies on norms
- Concern for racial/ethnic bias

Simulation/Performance Tests
Advantages:
- Reliable
- Objective
- Close relation to job performance
- Includes cognitive, psychomotor, and affective domains

Limitations:
- Time consuming
- Simulations often difficult to create
- High cost to develop and use

Archival Performance Data


Advantages:
- Reliable
- Objective
- Job-based
- Easy to review
- Minimal reactive effects

Limitations:
- Criteria for keeping/discarding records
- Information system discrepancies
- Indirect
- Not always usable
- Records prepared for other purposes

Choosing Data Collection Methods

- Reliability: Consistency of results, and freedom from collection-method bias and error.
- Validity: Does the device measure what we want to measure?
- Practicality: Does it make sense in terms of the resources used to get the data?

Type of Data Used/Needed


- Individual performance
- System-wide performance
- Economic

Individual Performance Data


- Individual knowledge
- Individual behaviors
- Examples:
  - Test scores
  - Performance quantity, quality, and timeliness
  - Attendance records
  - Attitudes

System-Wide Performance Data


- Productivity
- Scrap/rework rates
- Customer satisfaction levels
- On-time performance levels
- Quality rates and improvement rates

Economic Data

- Profits
- Product liability claims
- Avoidance of penalties
- Market share
- Competitive position
- Return on investment (ROI)
- Financial utility calculations

Use of Self-Report Data


- Most common method.
- Pre-training and post-training data.
- Problems:
  - Mono-method bias (desire to be consistent between tests)
  - Socially desirable responses
  - Response shift bias (trainees adjust their expectations to the training)

Research Design
Specifies in advance:
- the expected results of the study.
- the methods of data collection to be used.
- how the data will be analyzed.

Research Design Issues

- Pretest and posttest:
  - Shows the trainee what the training has accomplished.
  - Helps eliminate pretest knowledge bias.
- Control group:
  - Compares the performance of a group with training against the performance of a similar group without training.

Recommended Research Design


- Pretest and posttest with control group.
- Whenever possible:
  - Randomly assign individuals to the training group and the control group to minimize bias.
  - Use a time-series approach to data collection to verify that performance improvement is due to the training.
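To make the design concrete, here is a minimal sketch in Python, with purely hypothetical scores, of how a pretest/posttest control-group design isolates the training effect: the control group's gain estimates improvement from non-training factors, and subtracting it from the trained group's gain leaves the effect attributable to training.

```python
# Hypothetical pretest/posttest scores (illustrative numbers only).
trained_pre  = [62, 55, 70, 58, 66]
trained_post = [81, 74, 85, 77, 80]
control_pre  = [60, 57, 68, 59, 64]
control_post = [66, 61, 72, 63, 67]

def mean(xs):
    return sum(xs) / len(xs)

# Gain for each group = mean posttest score - mean pretest score.
trained_gain = mean(trained_post) - mean(trained_pre)
control_gain = mean(control_post) - mean(control_pre)

# The control group's gain reflects improvement from non-training
# factors (economy, equipment, practice effects), so subtracting it
# isolates the effect attributable to the training itself.
training_effect = trained_gain - control_gain
print(f"Trained gain: {trained_gain:.1f}")
print(f"Control gain: {control_gain:.1f}")
print(f"Estimated training effect: {training_effect:.1f} points")
```

With these illustrative numbers, the trained group gains 17.2 points and the control group 4.2, so roughly 13.0 points of improvement are attributable to the training rather than to outside factors.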

Ethical Issues Concerning Evaluation Research


- Confidentiality
- Informed consent
- Withholding training from control groups
- Use of deception
- Pressure to produce positive results

Assessing the Impact of HRD


- Money is the language of business.
- You MUST talk dollars, not HRD jargon.
- No one (except maybe you) cares about the effectiveness of training interventions as measured by an analysis of formal pretest/posttest control-group data.

HRD Program Assessment


- HRD programs and training are investments.
- Line managers often see HR and HRD as costs, i.e., revenue users, not revenue producers.
- You must prove your worth to the organization, or you'll have to find another organization.

Two Basic Methods for Assessing Financial Impact


- Evaluation of training costs
- Utility analysis

Evaluation of Training Costs

- Cost-benefit analysis: Compares the cost of training to the benefits gained, such as improved attitudes, reduction in accidents, reduction in employee sick days, etc.
- Cost-effectiveness analysis: Focuses on increases in quality, reduction in scrap/rework, productivity, etc.

Return on Investment

Return on investment = Results / Costs
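For illustration, with purely hypothetical figures: a program costing $50,000 that produces $120,000 in measured results yields ROI = 120,000 / 50,000 = 2.4, i.e., $2.40 returned per dollar invested.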

Types of Training Costs


- Direct costs
- Indirect costs
- Development costs
- Overhead costs
- Compensation for participants

Direct Costs

- Instructor:
  - Base pay
  - Fringe benefits
  - Travel and per diem
- Materials
- Classroom and audiovisual equipment
- Travel
- Food and refreshments

Indirect Costs

- Training management
- Clerical/administrative support
- Postal/shipping, telephone, computers, etc.
- Pre- and post-learning materials
- Other overhead costs

Development Costs

- Fee to purchase the program
- Costs to tailor the program to the organization
- Instructor training costs

Overhead Costs

- General organizational support
- Top management participation
- Utilities, facilities
- General and administrative costs, such as HRM

Compensation for Participants

- Participants' salary and benefits for time away from the job
- Travel, lodging, and per-diem costs

Measuring Benefits

- Change in quality per unit, measured in dollars
- Reduction in scrap/rework, measured in the dollar cost of labor and materials
- Reduction in preventable accidents, measured in dollars
- ROI = Benefits / Training costs
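A minimal sketch of this calculation in Python, using the cost categories from the preceding slides and purely hypothetical dollar amounts:

```python
# All figures are hypothetical, for illustration only.
costs = {
    "direct": 25_000,        # instructor, materials, equipment, travel, food
    "indirect": 8_000,       # admin support, shipping, pre/post materials
    "development": 12_000,   # program purchase, tailoring, instructor training
    "overhead": 5_000,       # general organizational support, facilities
    "compensation": 30_000,  # participants' salary/benefits while training
}

benefits = {
    "quality_improvement": 60_000,     # change in quality per unit, in dollars
    "scrap_rework_reduction": 45_000,  # saved labor and materials
    "accident_reduction": 15_000,      # avoided cost of preventable accidents
}

total_costs = sum(costs.values())        # 80,000
total_benefits = sum(benefits.values())  # 120,000

# ROI as defined on this slide: benefits divided by training costs.
roi = total_benefits / total_costs
print(f"Total costs:    ${total_costs:,}")
print(f"Total benefits: ${total_benefits:,}")
print(f"ROI = {roi:.2f}")  # 1.50: $1.50 of benefit per dollar spent
```

With these assumed figures, $120,000 of benefits against $80,000 of costs gives an ROI of 1.5.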

Utility Analysis

Uses a statistical approach to support claims of training effectiveness:

U = (N)(T)(dt)(SDy) - C

where:
- N = number of trainees
- T = length of time benefits are expected to last
- dt = true performance difference resulting from training (in standard deviation units)
- SDy = dollar value of untrained job performance (in standard deviation units)
- C = cost of training

Critical Information for Utility Analysis

- dt = the difference in units produced between trained and untrained employees, divided by the standard deviation in units produced by the trained group.
- SDy = the standard deviation, in dollars, of the overall productivity of the organization.
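A minimal sketch of the utility computation in Python, with purely hypothetical inputs (dt and SDy would be estimated as described above):

```python
# Utility analysis: U = (N)(T)(dt)(SDy) - C
# All inputs are hypothetical, for illustration only.
N   = 50       # number of trainees
T   = 2.0      # years the training benefit is expected to last
SDy = 10_000   # dollar value of one SD of job performance
C   = 150_000  # total cost of training all N trainees

# dt: difference in mean units produced (trained - untrained), divided
# by the standard deviation in units produced by the trained group.
# E.g., trained average 115 units, untrained 100, SD = 25 units:
dt = (115 - 100) / 25  # 0.6 standard deviation units

U = N * T * dt * SDy - C
print(f"Estimated utility gain: ${U:,.0f}")  # $450,000 with these inputs
```

Note how sensitive U is to the estimates: halving dt or T halves the gross benefit, which is one reason the later slide urges credible and conservative estimates.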

Ways to Improve HRD Assessment


- Walk the walk, talk the talk: MONEY.
- Involve HRD in strategic planning.
- Involve management in HRD planning and estimation efforts:
  - Gain mutual ownership.
- Use credible and conservative estimates.
- Share credit for successes and blame for failures.

HRD Evaluation Steps


1. Analyze needs.
2. Determine an explicit evaluation strategy.
3. Insist on specific and measurable training objectives.
4. Obtain participant reactions.
5. Develop criterion measures/instruments to measure results.
6. Plan and execute the evaluation strategy.

Summary

- Training results must be measured against costs.
- Training must contribute to the bottom line.
- HRD must justify itself repeatedly as a revenue enhancer, not a revenue waster.
