
Evaluation of

Training &
Development

Presented by:

Janhavi Rege – MMS 131
Ruchi Gupta – eMBA 12112
Megha Thakkar – eMBA 12058
Supriya Semwal – PGDM 12119
Training Cycle
What is a training needs analysis?

 A training need is a shortage of skills or abilities, which could be reduced or eliminated by means of training and development.

 Training needs analysis identifies training needs at the employee, departmental or organizational level in order to help the organization perform effectively.
How can a need be identified?
Indicators of a need:
• Complaints from staff, customers/clients
• Poor quality work
• Frequent errors
• Large staff turnover
• Deadlines not being met
• Conflict amongst staff
• New equipment or systems
Telling only vs. Teaching
• Telling only – unidirectional: the trainer delivers a one-way lecture ("blah blah blah… and more blah blah…") while trainees drift off.
• Teaching – bidirectional: the trainer asks questions ("Can somebody tell me what addiction is?"), works through examples ("Addiction is a brain disease") and probes with "Why?".
Conduct Training

• Training does not need to be limited to a four-walled room.

• It can be taken outside for demonstrative training, such as fire extinguisher training, where you would use a live fire, or wastewater operator training, where you could be in an actual wastewater treatment plant.
What Is Evaluation?

A program evaluation is the systematic collection of information about the activities, characteristics, and outcomes of programs in order to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming.
Types of Evaluation

Formative Evaluation
• What is it? Formative evaluation occurs while a training program is forming or occurring.
• When? At the time of an incident; at the end of the day/weekly; at the mid-point in the course.
• Why? To analyze strengths and weaknesses so the program can be improved; to provide feedback.

Summative Evaluation
• What is it? Summative evaluation takes place after the training program has occurred.
• When? At the end of the course.
• Why? To document achievement; to provide evidence of regular formative evaluation.
How to Conduct Formative Evaluation
• Review the training materials with one or two trainees.
• Hold group discussions with the trainees to gain feedback.
• Use the materials in a situation similar to that of an actual training program to see how the materials work.
• Assess the materials with managers and supervisors who oversee trainees participating in the training program.
• Observe trainee behavior.
• Give short tests to trainees.

How To Conduct Summative Evaluation
• Ask trainees for their opinions about the training program after it has been delivered.
• Test trainees to learn how well they grasped the information.
• Ask participants to demonstrate how they would use the information learned in training.
• Conduct surveys or interviews with each participant to gain a better understanding of what they learned.
• Measure changes in the production and quality of work accomplished after the training program.
Five Steps of Training Evaluation
• To learn from experience for future improvement
• To decide who should participate in future programs
• To establish a database that can assist management in making decisions
"TheL.
Donald Four Levels ofModel
Kirkpatrick Learning Evaluation.“

• The model was defined in 1959 by Donald L. Kirkpatrick.
• Kirkpatrick redefined the evaluation model with his 1998 book "Evaluating Training Programs".
• The idea behind the model is for an organization to have meaningful evaluation of learning in the organization.
• The degree of difficulty increases as you move through the levels.
Level 1: Reaction
• Kirkpatrick refers to Level 1 as a measure of customer satisfaction.
• Most of the forms that people fill out at the end of a class or workshop are instruments for measuring Level 1.
Guidelines that Kirkpatrick recommends to get maximum benefit from reaction evaluation

1. Determine what you want to find out

2. Design a form that will quantify reactions

3. Encourage written comments and suggestions

4. Get a 100 percent immediate response

5. Get honest responses

6. Develop acceptable standards

7. Measure reactions against standards and take the appropriate action

8. Communicate reactions as appropriate.


Level 2: Learning
• Kirkpatrick defines learning as the extent to which participants change attitudes, increase knowledge, and/or increase skill as a result of attending a program.

• So to measure learning we need to determine the following:
 What knowledge was learned
 What skills were developed or improved
 What attitudes were changed
Guidelines for evaluating learning
1. Use a control group if it is practical.
2. Evaluate knowledge, skills, and/or attitudes both before and after the program. Use a paper-and-pencil test to measure knowledge and attitudes, and use a performance test to measure skills (a simple pre/post comparison is sketched below).
3. Get a 100 percent response.
4. Use the results of the evaluation to take appropriate action.
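As a rough illustration of guideline 2, here is a minimal sketch in Python that turns pre-test and post-test scores for a trained group and a control group into an average learning gain. The group names and every score are invented for illustration; they are not drawn from any real programme.

# Minimal sketch (hypothetical data): average pre/post learning gain
# for a trained group versus a control group.

def average_gain(pre_scores, post_scores):
    """Mean improvement from pre-test to post-test."""
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Hypothetical knowledge-test scores out of 100
trained_pre = [55, 60, 48, 70, 62]
trained_post = [78, 82, 70, 88, 80]
control_pre = [57, 59, 50, 68, 61]
control_post = [60, 61, 52, 70, 63]

trained_gain = average_gain(trained_pre, trained_post)
control_gain = average_gain(control_pre, control_post)

print(f"Trained group average gain: {trained_gain:.1f} points")
print(f"Control group average gain: {control_gain:.1f} points")
print(f"Gain attributable to training (rough estimate): {trained_gain - control_gain:.1f} points")

Subtracting the control group's gain is what separates learning caused by the programme from learning (or test familiarity) that would have happened anyway.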
Level 3: Behaviour
• Level three can be defined as the extent to which a change in behaviour has occurred because someone attended a training program.

• In order for a change in behaviour to occur, four conditions are necessary:
 The person must have a desire to change
 The person must know what to do and how to do it
 The person must work in the right climate
 The person must be rewarded for changing
Guidelines for evaluating behaviour
1. Use a control group if that is practical.
2. Allow time for a change in behaviour to take place.
3. Evaluate both before and after the program if that is practical.
4. Survey and/or interview one or more of the following: trainees, their immediate supervisors, their subordinates and others who often observe their behaviour.
5. Get a 100 percent response.
6. Repeat the evaluation at appropriate times.
7. Consider cost versus benefits.

Level 4: Results
• This involves measuring the final results that occurred because a person attended a training session.

• These can include increased production, improved work quality, reduced turnover, etc.

• It is important to determine whether the conditions set forth above in Level 3 have been met.
Guidelines for evaluating results
1. Use a control group if it is practical.
2. Allow time for results to be achieved.
3. Measure both before and after the program if it is practical.
4. Repeat the measurement at appropriate times.
5. Consider cost versus benefit.
6. Be satisfied with evidence if proof is not possible.

Various evaluation tools can be selected depending on the purposes and
methods of evaluation.

•Questionnaires
•Surveys
•Tests
•Interviews
•Focus group discussions
•Observations
•Performance records
Questionnaire
• Keep responses anonymous
• Distribute questionnaire forms in advance
• Use pre-tests and post-tests
Questions & Scales

• 5-point and 7-point scales are the most used


options in questionnaires as they give
respondents the ability to discriminate
between choices.

• Note that coding the response options with


numbers can assist in data entry and
analysis.
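To illustrate the point about numeric coding, here is a minimal Python sketch. The scale labels, the question and the answers are all invented placeholders; a real questionnaire would use its own wording and data.

# Minimal sketch (hypothetical data): coding 5-point scale responses as
# numbers so they are easy to enter and analyse.

SCALE = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

# Invented answers to "The training objectives were met."
responses = ["Agree", "Strongly agree", "Neutral", "Agree", "Disagree"]

codes = [SCALE[r] for r in responses]
mean_rating = sum(codes) / len(codes)

print("Coded responses:", codes)               # [4, 5, 3, 4, 2]
print(f"Mean rating: {mean_rating:.2f} out of 5")

Once responses are coded this way, averages, comparisons between sessions and pre/post differences all become simple arithmetic.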
How to deliver a questionnaire
• Written Questionnaires

• Web-based Questionnaires

• Phone Questionnaires
Advantages of questionnaires
• Economy - Expense and time involved in training interviewers and sending them to interview are reduced by using questionnaires.

• Uniformity of questions - Each respondent receives the same set of questions phrased in exactly the same way. Questionnaires may, therefore, yield data more comparable than information obtained through an interview.

• Standardization - If the questions are highly structured and the conditions under which they are answered are controlled, then the questionnaire could become standardized.
Disadvantages
• Respondent’s motivation is difficult to
assess, affecting the validity of response.
• Unless a random sampling of returns is
obtained, those returned completed may
represent biased samples.
• An individual may hide his real attitude
and express socially acceptable opinions.
Limitations of Surveys
While there are many benefits of surveys, they also have some limitations. For instance, a survey only gathers information about the questions asked. In contrast, during an interview, the interviewer can explore important subjects in depth as they are uncovered. Generally speaking, surveys are effective only when those surveyed have at least a moderate degree of literacy. Although oral surveys are possible, they are usually impractical.
Interview Advantages
• Allows the interviewer to clarify questions.
• Allows the informants to respond in any
manner they see fit.
• Allows the interviewers to observe verbal
and non-verbal behavior of the
respondents.
• Means of obtaining personal information,
attitudes, perceptions, and beliefs.
Interview Disadvantages
• Unstructured interviews often yield data
too difficult to summarize or evaluate.
• Training interviewers, sending them to
meet and interview their informants, and
evaluating their effectiveness all add to
the cost of the study.
Observation

Reporting
Evaluation report outline:
• Program description
• Data input
• Data analysis
• Using figures to present data
• Recommendations
• Appendices
MEASURING TRAINING
EFFECTIVENESS & IMPACT
1. Prior to training
• The number of people that say they need it during the needs assessment process
• The number of people that sign up for it

2. At the end of the training
• The number of people that attend the session
• The number of people that paid to attend the session
• A measurable change in the knowledge or skill at the end of training
• Ability to solve a "mock" problem at the end of training
• Willingness to try to use the skill/knowledge at the end of the training

3. Delayed Impact
• Customer satisfaction at X weeks after the end of training
• Customer satisfaction at X weeks after the training when the customers know the actual costs of the training
• Retention of knowledge at X weeks after the end of training
• Ability to solve a mock problem at X weeks after the end of the training
• Willingness to try the skill/knowledge at X weeks after the end of the training
4. On-the-Job Behaviour Change
• Trained individuals who self-report that they changed their behaviour / used the skill or knowledge on the job after the training (within X months)
• Trained individuals whose managers report that they changed their behaviour / used the skill or knowledge on the job after the training (within X months)
• Trained individuals whose managers report that their job performance changed (as a result of their changed behaviour/skill), either through improved performance appraisal scores or specific notations about the training on the performance appraisal form (within X months)
• Trained individuals who show observable/measurable improvement in actual job performance (improved sales, quality, speed, etc.) as a result of their changed behaviour/skill (within X months)
• The performance of employees that are managed by (or are part of the same team with) individuals that went through the training
RESISTANCE TO TRAINING
EVALUATION

• There is nothing to evaluate.

• No one really cares about Evaluating Training.

• Evaluation is a threat to my job.


DIFFERENT WAYS TO OVERCOME THE RESISTANCE TO TRAINING & DEVELOPMENT EVALUATION

• Employees/trainees should be made aware that every training programme is designed with specific goals or objectives in mind, and that some type of knowledge, skill and attitude change is expected from the participants.

• While long-term results are important in making business decisions, the day-to-day purpose of evaluation should be to act as a feedback mechanism that guides efforts and rewards success.
• HDFC Life, one of India's leading private life insurance companies, offers a range of individual and group insurance solutions.

• It has 25 retail and 9 group products in its portfolio.

• There are 10 optional rider benefits catering to the savings, investment, protection and retirement needs of customers.

• HDFC Life has 600 branches in India, touching customers in over 900 cities and towns.
HDFC Vision & Values

Our Vision
'The most obvious choice for all'.

Our Values
Values that we observe while we work:
• Integrity
• Innovation
• Customer centric
• People Care "One for all and all for one"
• Team work
• Joy and Simplicity
Analyzing Training Needs
• Organizational
• Entire organization, single division or department
• At this level you prepare for future needs
• Job/Task
• Single job category
• Indicated by low productivity
• Individual
• Individual employee
• Indicated by poor review or employee assistance
request
Think About It
• Would you internally train, use prepackaged programs, or outsource for the following topics:

• Sexual Harassment (WHY?)


• Technical training on a manufacturing line (WHY?)
Step 2. Design
• Identify Target Audience
• Learning Styles: Visual, Auditory, Kinesthetic

• Identify and talk with Stakeholders

• Develop Training Objectives

• Develop course Content


• Consider Learning Curves

• Develop Evaluation Criteria


Think About It:
• Think of this class, Training and Organizational Development, as a prolonged training program. Can you think of a training objective for this class? Remember to be SPECIFIC!
Step 3. Development
• Decide on the Training Materials
• Leader guides, Manuals, Handouts

• Decide on Media Use


• Computer, Television, DVD player, etc.

• Instructional Methods
• Active (Facilitation, Case Studies, Simulations, Vestibule,
Socratic Seminar)
• Passive (Lecture, Presentation, Conference)
• Experiential (Demonstration, One-on-One, Performance)
THINK ABOUT IT
What type of training instructional method would you use in these circumstances: passive, active, or experiential?

• You want to have a training program that helps employees build stronger analytical and problem-solving skills.

• You want to have a training program that will help employees in the human resource department with the newly installed HRIS software.

• You want to have a training program for hundreds of employees on new organizational policies and procedures.
Program Delivery Mechanisms

• Classroom
• Self-Study
• Programmed instruction
• E-learning
Step 4. Implementation
• Facility
• Depends on type of training, number of participants, and budget
• Onsite or offsite
• Theater-style, classroom-style, banquet-style, chevron-style, conference-style, or U-shaped seating

• Trainers
• Skills, knowledge, and/or abilities with training material

• Schedule
• Feasibility of shutting down operations, employee availability, trainer availability,
multiple sessions

• Arrange for delivery


• Training materials, needed media products

• If offsite
• Ensure everyone has a ride
• All expenses have been paid
THINK ABOUT IT

What would you do (onsite vs. offsite training)?

• The organization for which you are a trainer is currently not doing well financially.

• The only onsite training facility is directly across from the work plant, and it is extremely noisy and distracting. Recently you accidentally overheard a conversation between the vice president of HR and budget management about how they feel you are a drain on HR's budget.

• The training material that you are to present is extremely important and needs to be clearly understood.

• What would you do: would you just deal with the noise of the plant, or would you argue for more money? If so, what points would you argue?
Final Step!!!!! Evaluation
• Goals/outcomes of training: transfer of training, productivity/efficiency, employee knowledge, safety/quality

• How do you evaluate these training outcomes?
Common Models of Training Evaluation
• Donald Kirkpatrick published four levels of training evaluation in the US Training and Development Journal:

• Reaction Evaluation Method
– Survey (provides trainer feedback, not organizational impact)

• Learning Evaluation Method
– Test (provides information on the effectiveness of the training, but no feedback)
– Experimental designs: pre-test/post-test, post-test/control group

• Behavior Evaluation Method
– Observations, interviews, or surveys 6 weeks to 6 months later
– Provides information on whether new skills were transferred to the job

• Results Evaluation Method
– Impact on the business
– Compares results against objective statements (turnover, sales, costs)
Training Evaluation Metrics
• Cost-Benefit Analysis
Typical Costs
- Trainer's salary and time
- Trainee's salary and time
- Materials for training
- Expenses for trainer and trainees
- Cost of facilities and equipment
- Lost productivity

Typical Benefits
- Increase in production
- Reduction in errors and accidents
- Reduction in turnover
- Less supervision necessary
- Ability to use new capabilities
- Attitude change

(A simple benefit-cost calculation is sketched below.)
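As promised above, here is a minimal benefit-cost sketch in Python. Every figure is a hypothetical placeholder; in practice each line would be estimated from the organization's own records for the period being evaluated.

# Minimal sketch (hypothetical figures): net benefit and benefit-cost
# ratio for a single training programme.

costs = {
    "trainer salary and time": 4000,
    "trainee salary and time": 9000,
    "training materials": 1500,
    "trainer and trainee expenses": 2000,
    "facilities and equipment": 2500,
    "lost productivity": 3000,
}

benefits = {
    "increase in production": 12000,
    "fewer errors and accidents": 6000,
    "reduced turnover": 5000,
    "less supervision needed": 2000,
}

total_cost = sum(costs.values())        # 22000
total_benefit = sum(benefits.values())  # 25000

print(f"Total cost:         {total_cost}")
print(f"Total benefit:      {total_benefit}")
print(f"Net benefit:        {total_benefit - total_cost}")
print(f"Benefit-cost ratio: {total_benefit / total_cost:.2f}")

On these assumptions the ratio is above 1, so the programme more than pays for itself; benefits such as attitude change and the ability to use new capabilities are harder to monetize and are usually reported qualitatively alongside the ratio.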
Benchmarking
• Gather training data from your organization (performance data, sales, efficiency, etc.)

• Gather training data on similar organizations (e.g., from ASTD, the American Productivity & Quality Center, or the Saratoga Institute)
THINK ABOUT IT
What type of Kirkpatrick’s evaluation methods
would you use?
(Reaction Eval. Method, Learning Eval. Method, Behavior Eval. Method, or
Results Eval. Method)

• You work for an organization whose upper management does not see the benefit in having training programs other than the initial orientation. You finally convinced them to give you resources to create an interview training program for managers.

• The HR department has become somewhat fearful that departmental hiring managers will violate some employment laws. Therefore, you are instructed to put together a training program for managers on basic human resources employment law. Be sure that they walk away knowing the laws, otherwise there might be some lawsuits!

• You have just completed a brand new training program aimed at decreasing unnecessary work steps for a production line. You want to pilot test the program, so you recruit a few employees from the line and a couple of their managers to go through the training.
Evaluation in HDFC

• Smiley Sheets

• Feedback forms

• Pre-training assessment

• Post-training assessment
The Three DOMAINS

AFFECTIVE DOMAIN
Emotional learning: FEELING
Concerned with attitudes, appreciations, interests, values and adjustments.

PSYCHOMOTOR DOMAIN
Physical learning: DOING
Emphasizes speed, accuracy, dexterity, and physical skills.

COGNITIVE DOMAIN
Rational learning: THINKING
Emphasizes knowledge, using the mind, and intellectual abilities.
Level 1. Knowledge

When was this picture taken?


Where was this picture taken?

Question cues: List, define, tell, label
Level 2. Comprehension

What is happening in this picture?


Why are these boys dressed like this?

Question cues: Describe, name, identify, discuss
Level 3. Application

How would you describe the photograph to others?

What caption would you write for this photograph (say, in a newspaper)?

Question cues: Modify, solve, change, explain


Level 4. Analysis

What do you know about their lives based on this photo?

Question cues: Analyze, separate, compare, contrast


Level 5. Synthesis

What might these boys say about their work in an interview setting?

What might they say about their future?

Question cues: Create, construct, plan, role-play


Level 6. Evaluation
What is the significance of this photo for the time period
depicted?

Compare this photo with one of three boys from today of the
same age. How are their lives similar? How are they different?

Question cues: Give opinion, criticize, discriminate, summarize
Challenges in HDFC
• Getting honest feedback
• Problems arising in pre-test and post-test training methods
• Ensuring tests are conducted in the right manner
• Trainees are resistant to evaluation
