Preface

Table of contents

1. Introduction
2.
3.
4.
4.1. VET reform
4.2. Evaluation
4.3. Context of reform
4.4. Goals
4.5. Planning of reforms
4.6. Management of reforms
4.7. Drivers
4.8. Impact
5.
6. Levels of evaluation
6.1. Establishing a baseline
6.2. Impact and timing of measurement
6.3. The individual
6.4. The organisation
6.5. The aggregate level of the economy and society
7.
8.
List of abbreviations
References
List of tables

Table 3: Broad review of VET reforms and evaluation methods in this study
Table 7: Increases in salary from the first employment in 1992 to employment in 1997 (FRF)
Table 8: The measures of success for two active youth labour market programmes
Table 10: Motives for abandoning apprenticeship (%)
Table 11: Effect of training on long-term unemployed
Table 15: Are you interested in employing more graduates from Fachhochschule in the future? (%)
Table 16: Five years after gaining a qualification how many young people in France are registered as unemployed or inactive? (%)
1. Introduction
1.1.
Evaluating the impact of reforms of vocational education and training: examples of practice
1.2.
(1) EPPI: evidence-based policy and practice, Institute of Education, University of London.
Country | Reform | Report reference
Austria | | Wimmer, 2000
Denmark | Reform 2000 |
Finland | |
France | Les bénéficiaires du contrat de qualification entrés à l'automne 1994: les enseignements de 4 vagues de panel | Klein, 2002
Germany | Personalqualifizierung in den neuen Bundesländern | Neubert and Steinborn, 1999
Greece | Evaluation of the labour-market policies and assessment of the influence of the European employment strategy in Greece during the period 1997 to 2001 | VFA, 2002
Ireland | Leaving certificate (applied) | Irish Ministry of Education, 2002
Ireland | Vocational training opportunities scheme (VTOS) |
Italy | |
Netherlands | |
Portugal | | Montalvão e Silva et al., 1996
Portugal | Despacho Conjunto do Ministério da Educação e para a Qualificação e o Emprego No 123/97 de 16.6.1997 | Ministério da Educação e para a Qualificação e o Emprego, 1997
Spain | Memoria Balance I Acuerdo Nacional de Formación Continua 1993-1996 | Forcem, 1999
Sweden | Reformeringen av gymnasieskolan |
Sweden | | Larsson, 2000
UK | Technical and vocational education initiative | Department of Employment, 1994
UK | Modern apprenticeships | Economic Research Services Ltd, 1998; Coleman and Williams, 1998
UK | |
4.1. VET reform
cohort. There is also evidence of some permeation of teaching styles and resources from IVT
provision to general provision when these
programmes are taught in the same institution.
Access to higher education also features in the
selected reforms. Returning to the changes in
upper secondary IVT programmes, a common
objective is that students following these new
programmes should have access to higher
education, so more time is needed to develop the
vocational provision while retaining a core of
general education. Several reforms include additional time for study in upper secondary education typically another year. This additional time
is needed for learning from workplace experience
and for developing and assessing projects or
assignments. In some reforms studied legal
changes provide a right for individuals graduating
from such programmes to have a place in a
higher education institution. One reform aims to make higher education provision more desirable and accessible to students who retain an interest in vocational programmes.
Some reforms aimed to create new learning
centres, others aimed to change participation and
pedagogy in existing centres. Sometimes the
reforms to content, pedagogy and VET organisation are sufficiently extensive to create a new
learning environment based in an existing institution. Thus it is sometimes difficult to isolate
reforms for creating new environments from
those simply changing content or pedagogy. In
this study new learning environments were
created by:
(a) the Austrian reforms where students are
educated in new Fachhochschulen where
expertise could be concentrated on vocational courses;
(b) introducing apprenticeship programmes in
Italy and the UK where employers and
training providers established new ways of
blending general learning with work-based
learning;
(c) the reforms in Portugal where new semi-independent regionally controlled schools for IVT
were established alongside existing provision;
(d) trade unions in the UK where they negotiated
Aims of change | Examples
Concentrate expertise, diversify VET provision, attract students, meet skills needs |
Innovate in programme design (focus on the learner), coordinate a range of existing provision |
Make the curriculum relevant and assessment authentic, recognise informal learning, motivate learners, bridge initial education and work situations |
Improvements in the content and teaching of specific programmes in an existing type of institution |
Develop teaching approaches to motivate and retain learners |
4.2. Evaluation
Table 3: Broad review of VET reforms and evaluation methods in this study

Type of VET reform | Countries | Intermediate impact | Evaluation methods | Evaluation style
Development of CVT | D, EL, E, NL | Increased participation of adults | Questionnaire survey of institutions and individuals, satisfaction survey, case study exemplification | Continuous monitoring and reporting programme
Reform of post-compulsory IVT | DK, NL, FIN, S, UK | Content development, pedagogic development, institutional development, regionalisation; increased participation, good progression for graduates | | Mix of evaluation styles
New qualification development | IRL, I, UK | Increased participation | Questionnaire survey of institutions and individuals, satisfaction survey | Monitoring and formative evaluation
Cohort review | F | Identify successes and problems, modify provision | Periodic survey | Independent monitoring
VET funding review | UK | Changes to funding distribution system | Literature review, interviews, focus groups | Single event evaluation
VET institutional change | A, P, FIN, S, UK | Capacity building, improve skills supply | Stakeholder consultation, case study exemplification, satisfaction survey | Formative evaluation
| | Lower long-term unemployment | Stakeholder consultation including use of controls | Post programme review
4.3. Context of reform
for the reform, or whether, in reporting on the evaluation, the authors attempted to locate the reform post hoc in a wider setting, is usually not clear.
However, most reforms are fundamental and it is
difficult to perceive them as developing independently of central government policy, for example
the Nordic reforms of upper secondary education.
Many reports describe the forces promoting
change in VET. These include:
(a) the political wish or mandate to reorganise
VET;
(b) economic necessity;
(c) relief of social issues;
(d) the need to find more equitable ways of
funding VET;
(e) the need to overcome bureaucratic barriers;
(f) feedback from enterprises about skills needs
and shortages.
Legal requirements for new structures to be
established and for new individual rights to
training undoubtedly carry forward VET reform
but it is difficult to establish whether legal requirements are a driving force. There are numerous
examples of reforms driven by law (prime examples come from Spain, Italy, the Netherlands,
Finland and Sweden). Research on the driving
forces for VET reform is worthy of further exploration on a European basis. The reports used in this study contain insufficient data on how different agencies drive VET reform; it may be more fruitful to gather information on drivers of reform independently of impact studies.
4.4. Goals
improve reform management or resource allocation. About one third of the studies were clearly designed to measure outcomes, with stated aims of impact on specific stakeholders.
One strong differentiator of reports is the
extent to which the goals of reforms and evaluations referenced theory-based reasoning for
taking the proposed course of action. One third
of reforms argued for a new form of VET by using
evidence from research, another third made no
such case and for the remaining third it is difficult
to determine where the arguments for new forms
of VET originated. A theory-based reform typically rests on theoretical propositions, such as 'knowledge is situated in specific contexts', and includes mechanisms for translating theoretical knowledge into outcomes, such as 'use a range of contexts for learning ideas'; the intended outcome could be a greater likelihood of employment or better performance at work. Griffith (2001) provides useful information on theoretical perspectives in evaluation studies.
It was not possible to make judgements on
how aims changed during evaluation. One study made it clear that the direction of the reform had changed significantly during its implementation and that the purpose of the evaluation had also changed. As stated earlier, only a few reports described the effects of interventions some time after the intervention had taken place.
4.5. Planning of reforms

4.6. Management of reforms

4.7. Drivers

4.8. Impact

5.1. Financing reforms
5.2. Evaluation of financing
6. Levels of evaluation
6.1. Establishing a baseline

(2) See, for example, the final assessment report for assistance funded under the Phare VET reform programme in the Czech Republic: OMAS Consortium Middle Region, 1999.
6.2. Impact and timing of measurement

6.3. The individual
The level of satisfaction with vocational training programmes was found to be quite high. Of all trained respondents, 90.6 % were very/quite satisfied with the infrastructure of training centres, 89.3 % with the quality of training, 89.8 % with the trainers' standards and 83.8 % with the usefulness of the training provided. Overall, satisfied respondents exceeded 90 % (91.8 %) and satisfaction remained at a high level among all demographic groups.
Source: Greek report of VET effectiveness.
perspective from participants and seek information about how individuals feel relative to
others who had not participated in the intervention. For example:
(Table: participants' views of the programme's effect on prestige, career chances and helping to decide on a career, set against a comparison sample.)
Both the differences in response rate and the similarities can be useful. Moving on from views about
the future, and possibly increasing the usefulness of
Table 7: Increases in salary from the first employment in 1992 to employment in 1997 (FRF)

Qualification | Salary 1992 | Salary 1997 | Difference
| 5 500 | 7 500 | 2 000
| 5 500 | 7 300 | 1 800
Bac général | 5 000 | 6 500 | 1 500
| 5 200 | 6 000 | 800
| 5 500 | 6 000 | 500
Table 8: The measures of success for two active youth labour market programmes

Measure | Non-participants | Programme 1 | Programme 2
| SEK 37 750 | SEK 52 110 | SEK 44 120
| SEK 89 300 | SEK 74 770 | SEK 66 700
| 37 % | 29 % | 24 %
| 42 % | 41 % | 39 %
| 11 % | 10 % | 5 %
| 12 % | 13 % | 9 %
(Table: features of the programme valued by participants, including career prospects, being employed, being paid to train and earning a wage.)
The information above shows the effect of characteristics of the programme on individuals.
Table 10: Motives for abandoning apprenticeship (%)

I didn't like studying | 49.8
| 12.9
| 9.1
| 5.4
Family reasons | 5.0
Financial reasons | 3.1
| 1.0
| 0.9
| 6.7
The effects on individuals of factors external to the provision can be considered when evaluating effectiveness.
Table 11: Effect of training on long-term unemployed

Self-reported effect | %
Sense of accomplishment | 43.9
Clarified my objectives | 20.0
Obtained qualification | 30.0
To the question 'how relevant is your employment to the subject of the vocational training programme you attended', almost 3 out of 10 trained, employed respondents replied very/quite relevant. Consequently, almost 1 out of 10 trained respondents were working at this point (almost 10 months after completing the vocational training programme) in a job relevant to the subject of the programme they had attended.
Source: Greek evaluation of VET.
6.4. The organisation
(%)

| Employers' views in regions of the reform | Employers' views in other regions
| 76.1 | 42.6
| 77.5 | 39.1
| 75.5 | 34.7
| 69.1 | 34.9
| 80.1 | 60.3
| 84.9 | 62.1
Cost effectiveness | 84.0 | 63.7
(4) A synopsis of research into how training impacts on enterprises is given in Tessaring, 1998, pp. 72-77.
| Better | Same | Worse
| 37.3 | 49.3 | 13.4
| 29.9 | 66.2 | 3.9
Relevance of qualifications | 27.0 | 59.0 | 14.0
| 21.3 | 72.5 | 6.2
| 19.3 | 66.7 | 14.0
Creativity in problem-solving | 15.6 | 67.5 | 16.9
| 13.0 | 70.1 | 16.9
Career chances | 7.8 | 77.9 | 14.3
Table 15: Are you interested in employing more graduates from Fachhochschule in the future? (%)

| 1998 | 2000
Yes | 60.6 | 65.5
Maybe yes | 27.2 | 21.4
Probably no | 12.1 | 7.1
No | | 6.0

Source: Austrian vocational higher education initiative
6.5. The aggregate level of the economy and society
Table 16: Five years after gaining a qualification, how many young people in France are registered as unemployed or inactive? (%)

Qualification | Registered unemployed | Inactive
Bac général | 12 | 18
| 1998 | 1999
| 42 | 29
| 39 | 41
Other | 19 | 30

| In job | In education | Unemployed
| 80 | 20 |
| 70.6 | 17.6 | 8.8
Portfolio of achievements | 76.9 | 23.1 |
Vocational qualification | 68.8 | 3.1 | 28.1
No certification | 64.6 | 6.2 | 16.9
List of abbreviations

CVT   continuing vocational training
IVT   initial vocational training
PQO   Personalqualifizierung in Ostdeutschland
QCA
QVE
VTOS  vocational training opportunities scheme
WRFE
Clarification of purpose
of these criteria
We are looking for criteria we can apply as we
read evaluations of programmes that come under
our remit. These will be fairly large-scale evaluations completed by reputable individuals and
organisations.
While our main concern is impact evaluation, we must also take account of process evaluation if we are to draw useful conclusions about evaluating impact and the design of reform programmes. Hence the breadth of territory covered by the criteria below.
The criteria are grouped by chronology of
project implementation. They are further grouped
by the following characteristics of evaluation
programmes:
(a) context: what external influences are likely to affect the programme?
(b) goals: how are the aims, objectives, milestones described?
(c) planning: what is discussed in anticipation of
what might happen?
(d) management: what is done to make the
programme function effectively?
(e) drivers: who is shaping the evaluation activity?
(f) impact: what is the effect of the evaluation?
skills needs, workforce development and partnership) made clear? Was consistency of reforms
with other policies evaluated?
Policy description
Was a description of national policy used to
underpin the reform and its evaluation? Was it
intended to measure impact? If there was no
policy description what was the purpose of the
evaluation?
Goals
Reform aims
Was the aim of the reform included? What was to
change (e.g. learning, institutions, frameworks,
and certification)? What was to be maintained?
Was this the principal focus of the evaluation
programme? Were there clear targets for the
reform?
Purposes of evaluation
Was the purpose(s) of the evaluation made clear? For example: informing decision-making (by government, employers, providers and trainees), improving reform management, improving resource allocation, improving the quality of programmes, providing a mechanism for stimulating debate, or bringing reform designers and managers to account.
Theoretical basis
Did the reform have an explicit link between
hoped-for outcomes and a theoretical basis for
expecting the interventions to lead to these
outcomes?
Planning
Baseline for analysis
Was the evaluation set up in such a way that a
baseline for measuring the effects of interventions
was established? Was this baseline established
and agreed amongst stakeholders before the
start of the programme?
Management

Management structure
Was the management infrastructure of the reform discussed? Who was consulted? Did the evaluation look at this?

During implementation
Context
Donor commitment
Is there a declaration of intent and continued
commitment to the reform from any donor
authority? Did the evaluation investigate it?
Risk management
Were the risks associated with the reform made
explicit? What was done to reduce risk? How
was risk management organised? Did the evaluation include programme risks?
Legal embedding
Is there evidence in the evaluation that legal
arrangements were considered as a preparatory
step towards implementing the reform?
Programme handbook
Was there a publication describing the reform?
Who was it aimed at? How did any donor institution contribute to this publication? Was the manual
evaluated for content and usefulness to users?
Preparatory studies
Were feasibility studies carried out or ex ante
evaluations planned? Did the evaluation look at
the quality of these studies?
Drivers
Reform actors
Was there evidence of pressure for change? Who
was opposed to it? How did they act? Was there
evidence of involvement from policy level actors?
Similarly was there evidence of bottom-up
demand for reform? Was a stakeholder analysis
part of the evaluation?
Familiarisation
Did the proposed reform become widely known?
Who discussed it? Did the evaluation consider dissemination throughout the duration of the programme? If so, what issues were discussed and why?
Goals
Stability
Did the goals of the reform change during the
implementation? If so did the evaluation identify
the influences that were driving the change?
Planning
Formative influence
Is there evidence of change in intervention
programmes as a result of information arising
from the evaluation process or intermediate
outcomes of the reform? Were evaluators ever present during discussions of the reform?
Complementary activities
Is there evidence of organisations at national,
regional and local levels working jointly towards
common targets in a complementary top-down
and bottom-up approach, and were they involved
in the evaluation?
Management
Transitional arrangements
Were there transitional arrangements in place between existing VET systems and those introduced as a result of the reform? How were these introduced, managed and dismantled? Did the evaluation consider this?
Impact
User impact
Was it planned to make interventions that could be measured by changes in opportunity and practice amongst end users (learners, communities, organisations and countries)? How was the evaluation organised to measure impact?
Piloting arrangements
Is there evidence that any pilot phase was useful
because it linked strongly to systemic change,
and was effectively disseminated? Were issues of
mainstreaming and sustainability addressed
during the expansion phase, and did evaluation
consider these?
Evaluating the impact of reforms of vocational education and training: examples of practice
Drivers
Stakeholder motivation
Were people and organisations incentivised to
participate? Did it work? How was this incentivisation differentiated and assessed?
Stakeholder groups
Were key groups participating in VET involved?
For example, those in pre employment activities,
those upgrading their skills used in their job,
those retraining in another area of work, those
requiring VET to improve employability.
Stakeholder activities
Was the involvement of individuals and key groups monitored? How was it used as an indicator of success or failure in the programme and the evaluation?
Impact
Programme timing
Was a timescale specified and did the reform
process meet its time-related goals?
Timely information
Was there a means of summarising evaluation information and regularly feeding it into the reform management activity? How was this done?
Programme budget
Were cost considerations (of reforms) included in
the evaluation specification? What were the
intended costs?
Monitoring tools
Were monitoring and evaluation tools in place to
identify, take account of and measure the impact
of policy changes during the course of the reform
and did evaluation analyse them?
Context
Programme context
Was the political, social, economic and cultural
influence considered important for the success of
the reform, and how did evaluation take context
into account?
EU priorities
Were the EU-wide common themes for VET (such
as improving access to VET and increasing
participation) addressed in the programme, and
the evaluation?
Goals
Duration of outcomes
Was there a plan to look at the long-term effects of the reform?
Programme flexibility
Was the programme adjusted to take account of
changing financial, social and political contexts?
Did the evaluation programme seek evidence for
this?
Drivers
Social partners
Was there a plan for full engagement of social
partners? Were they involved in the evaluation?
Programme spillover
Is there evidence that the programme and evaluation practices (introduced from the EU) resulted
in activity in a Member State and a programme of
national action?
Impact
Personal impact
Is there evidence that the intervention has had an
impact on individuals for job change, learning
progression, and reward?
Institutional impact
Is there evidence that the intervention has had an
impact on institutions in terms of behavioural
transfer of learned skills into results?
Societal impact
Was the impact of interventions on communities
measured? For example, families and social
organisations.
Systemic impact
Is there evidence of a systemic link between the
outcomes of the interventions included in the
reform and policy development, and how was it
measured?
References