
Practice: Evaluate implementation to document what you are doing

Key Action: Measure quality of implementation activities

TOOL: Measuring Dosage and Quality of Program Implementation

Purpose: Both the dosage and the quality of your program activities affect your ability to produce your intended outcomes. You may find this table useful for ensuring that your evaluation of implementation activities includes appropriate indicators of quality and reach, not just dosage, for magnet program outputs. It can also help you determine which quality and reach indicators will provide the most useful information.

Instructions:

1. Identify what data you will collect to answer your implementation questions.

2. Compare your indicators for dosage and quality of program outputs to those on the checklist. Check to make sure you have included indicators from both columns.

3. Reflect on which indicators are most useful to you and why.


Measuring Dosage and Quality of Program Implementation

The table pairs each magnet program output with examples of indicators for dosage (How much? How many? How often?) and examples of indicators for quality and reach (How well? How effective? How uniform?). The final column, "Which indicators are most useful, and why?", is left blank for you to record your notes.

Magnet program output: Magnet instruction received by students

Indicators for dosage:
• Number of new lessons, units, or courses created that integrate magnet-theme content
• Number of hours of magnet-related instruction students receive each week

Indicators for quality and reach:
• Extent of rigor, depth, and breadth of the theme-based curriculum
• Mastery of concepts and skills as evidenced by student work
• Common understanding of the magnet theme and of quality criteria for integrating it into coursework
• All students access and experience program services and magnet content

Magnet program output: Professional development for teachers

Indicators for dosage:
• Number of workshop sessions made available to teachers and administrators
• Frequency of professional development
• Number of staff participating
• Number of partnerships and outside resources brought in to support training

Indicators for quality and reach:
• Satisfaction of participants in training sessions
• Level of teacher understanding of workshop concepts
• Clarity of learning goals for training
• Level of ongoing support

Magnet program output: Recruitment of a diverse body of students

Indicators for dosage:
• Number of recruitment activities executed
• Number of parents involved in recruitment events (and other activities)

Indicators for quality and reach:
• Level of interest and engagement of parents at recruitment events
• Consistency of understanding of the magnet program and mission among parents and students
• Clarity of the application and enrollment process for participants
Source: Selected PowerPoint® slides from "Partners in data-driven decision making: Evaluators and districts working together to improve program practice," by David Kikoler of American Education Solutions, presented at the Magnet Schools of America national conference, Chattanooga, TN, April 2008.
