Chapter 9
3/15/2013
Design
Implementation
Evaluation
- Define objectives
- Develop lesson plans
- Develop/acquire materials
- Select trainer/leader
- Select methods and techniques
- Schedule the program/intervention
Effectiveness
The degree to which a training program (or other HRD program) achieves its intended purpose. Measures are relative to some starting point, and indicate how well the desired goal is achieved.
HRD3e Contributed by Wells Doty, Ed.D., Clemson Univ.
HRD Evaluation
Textbook definition: The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities.
In Other Words
Are we training:
- the right people
- the right stuff
- the right way
- with the right materials
- at the right time?
Evaluation Needs
Purposes of Evaluation
- Determine whether the program is meeting the intended objectives.
- Identify strengths and weaknesses.
- Determine the cost-benefit ratio.
- Identify who benefited most or least.
- Determine future participants.
- Provide information for improving HRD programs.
Purposes of Evaluation-2
- Reinforce major points to be made.
- Gather marketing information.
- Determine if the training program is appropriate.
- Establish a management database.
Is HRD a revenue contributor or a revenue user? Is HRD credible to line and upper-level managers? Are benefits of HRD readily evident to all?
Not often enough! Frequently, only end-of-course participant reactions are collected; transfer to the workplace is evaluated far less often.
Reluctance to have HRD programs evaluated. Evaluation requires expertise and resources. Factors other than HRD can also cause performance improvements.
- Shows the value of HRD.
- Provides metrics for HRD efficiency.
- Demonstrates a value-added approach for HRD.
- Demonstrates accountability for HRD activities.
- Everyone else has it; why not HRD?
I bought it, therefore it is good. Since it's good, I don't need to posttest. Who says it's good?
1. Anecdotal approach: Talk to other users.
2. Try before buy: Borrow and use samples.
3. Analytical approach: Match research data to training needs.
4. Holistic approach: Look at the overall HRD process, as well as individual training.
Table 7-1 lists nine frameworks for evaluation. The most popular is that of D. Kirkpatrick:
Reaction: Focus on trainees' reactions.
Learning: Did they learn what they were supposed to?
Job Behavior: Was it used on the job?
Results: Did it improve the organization's effectiveness?
Most organizations don't evaluate at all four levels. The framework focuses only on post-training evaluation and doesn't address inter-stage improvements. WHAT ARE YOUR THOUGHTS?
Other Frameworks/Models 1
CIPP: Context, Input, Process, Product
CIRO: Context, Input, Reaction, Outcome
Brinkerhoff:
- Goal setting
- Program design
- Program implementation
- Immediate outcomes
- Usage outcomes
- Impacts and worth
Other Frameworks/Models 2
Outcome types: cognitive outcomes, skill-based outcomes, affective outcomes.
Phillips:
- Reaction
- Learning
- Applied learning on the job
- Business results
- ROI
A Suggested Framework 1
Reaction: Did trainees like the training? Did the training seem useful?
Learning: How much did they learn?
Behavior
A Suggested Framework 2
Results: What were the tangible outcomes? What was the return on investment (ROI)? What was the contribution to the organization?
Interviews
Advantages:
- Flexible
- Opportunity for clarification
- Depth possible
- Personal contact
Limitations:
- High reactive effects
- High cost
- Face-to-face threat potential
- Labor intensive
- Trained interviewers needed
Questionnaires
Advantages:
- Low cost to administer
- Honesty increased
- Anonymity possible
- Respondent sets the pace
- Variety of options
Limitations:
- Possible inaccurate data
- Response conditions not controlled
- Respondents set varying paces
- Uncontrolled return rate
Direct Observation
Advantages:
- Non-threatening
- Excellent way to measure behavior change
Limitations:
- Possibly disruptive
- Reactive effects are possible
- May be unreliable
- Trained observers needed
Written Tests
Advantages:
- Low purchase cost
- Readily scored
- Quickly processed
- Easily administered
- Wide sampling possible
Limitations:
- May be threatening
- Possibly no relation to job performance
- Measures only cognitive learning
- Relies on norms
- Concern for racial/ethnic bias
Simulation/Performance Tests
Advantages:
- Reliable
- Objective
- Close relation to job performance
- Includes cognitive, psychomotor, and affective domains
Limitations:
- Time consuming
- Simulations often difficult to create
- High cost to develop and use
Reliability: Consistency of results, and freedom from collection-method bias and error.
Validity: Does the device measure what we want to measure?
Practicality: Does it make sense in terms of the resources used to get the data?
- Test scores
- Performance quantity, quality, and timeliness
- Attendance records
- Attitudes
- Productivity
- Scrap/rework rates
- Customer satisfaction levels
- On-time performance levels
- Quality rates and improvement rates
Economic Data
- Profits
- Product liability claims
- Avoidance of penalties
- Market share
- Competitive position
- Return on investment (ROI)
- Financial utility calculations
Mono-method bias
Research Design
Specifies in advance:
- the expected results of the study
- the methods of data collection to be used
- how the data will be analyzed
Control Group
- Shows the trainee what training has accomplished.
- Helps eliminate pretest knowledge bias.
- Compares the performance of a group with training against the performance of a similar group without training.
- Randomly assign individuals to the test group and the control group to minimize bias.
- Use a time-series approach to data collection to verify that performance improvement is due to the training.
- Confidentiality
- Informed consent
- Withholding training from control groups
- Use of deception
- Pressure to produce positive results
Money is the language of business. You MUST talk dollars, not HRD jargon. No one (except maybe you) cares about the effectiveness of training interventions as measured by an analysis of formal pretest/posttest control-group data.
HRD programs and training are investments. Line managers often see HR and HRD as costs, i.e., revenue users rather than revenue producers. You must prove your worth to the organization, or you'll have to find another organization.
Cost-benefit analysis: Compares the cost of training to the benefits gained, such as improved attitudes, reduction in accidents, reduction in employee sick days, etc.
Cost-effectiveness analysis: Focuses on increases in quality, reduction in scrap/rework, productivity, etc.
Return on Investment
- Direct costs
- Indirect costs
- Development costs
- Overhead costs
- Compensation for participants
Direct Costs
Instructor
Indirect Costs
- Training management
- Clerical/administrative
- Postal/shipping, telephone, computers, etc.
- Pre- and post-learning materials
- Other overhead costs
Development Costs
- Fee to purchase the program
- Costs to tailor the program to the organization
- Instructor training costs
Overhead Costs
- General organizational support
- Top management participation
- Utilities, facilities
- General and administrative costs, such as HRM
Compensation for Participants
- Participants' salary and benefits for time away from the job
- Travel, lodging, and per-diem costs
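Taken together, the direct, indirect, development, overhead, and compensation categories add up to the total cost of a training program. A minimal sketch of the roll-up, with all dollar figures hypothetical:

```python
def total_training_cost(direct, indirect, development, overhead, compensation):
    """Sum the five training-cost categories into one dollar figure."""
    return direct + indirect + development + overhead + compensation

# Hypothetical program costs, in dollars
cost = total_training_cost(
    direct=20_000,        # instructor, materials, delivery facilities
    indirect=8_000,       # training management, clerical, shipping
    development=15_000,   # program purchase and tailoring
    overhead=5_000,       # general organizational support
    compensation=30_000,  # participants' salary/benefits, travel, per diem
)
print(cost)  # 78000
```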
Measuring Benefits
- Change in quality per unit, measured in dollars
- Reduction in scrap/rework, measured in the dollar cost of labor and materials
- Reduction in preventable accidents, measured in dollars
ROI = Benefits / Training costs
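The ROI formula above is a simple ratio of measured benefits to training costs. A minimal sketch, using hypothetical benefit and cost figures:

```python
def roi(benefits, training_costs):
    """Return on investment: dollar benefits divided by training costs."""
    return benefits / training_costs

# Hypothetical: $150,000 in measured benefits for a $50,000 program
print(roi(150_000, 50_000))  # 3.0
```

An ROI above 1.0 means the program returned more than it cost; here every training dollar returned three dollars in measured benefits.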
Utility Analysis
N = number of trainees
T = length of time benefits are expected to last
dt = true performance difference resulting from training
SDy = dollar value of untrained job performance (in standard deviation units)
C = cost of training

U = (N)(T)(dt)(SDy) - C
dt = the difference in units produced between trained and untrained workers, divided by the standard deviation in units produced by trained workers. SDy = the standard deviation, in dollars, of the overall productivity of the organization.
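Once the five inputs are estimated, the utility formula is straightforward arithmetic. A minimal sketch with hypothetical values (100 trainees, a 2-year benefit horizon, a 0.5 SD performance gain, SDy of $10,000, and a $200,000 program cost):

```python
def utility(n, t, dt, sdy, c):
    """Utility analysis: U = (N)(T)(dt)(SDy) - C, the dollar gain from training."""
    return n * t * dt * sdy - c

# Hypothetical inputs
print(utility(n=100, t=2, dt=0.5, sdy=10_000, c=200_000))  # 800000.0
```

With these assumed figures the program yields $1,000,000 in performance value against a $200,000 cost, for a net utility of $800,000.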
Walk the walk, talk the talk: MONEY. Involve HRD in strategic planning. Involve management in HRD planning and estimation efforts.
- Analyze needs.
- Determine an explicit evaluation strategy.
- Insist on specific and measurable training objectives.
- Obtain participant reactions.
- Develop criterion measures/instruments to measure results.
- Plan and execute the evaluation strategy.
Summary
Training results must be measured against costs. Training must contribute to the bottom line. HRD must justify itself repeatedly as a revenue enhancer, not a revenue waster.