
Making MEL Powerful
December 2013
Laura Roper, Ph.D., Roper LYV Consulting (l.roper@rcn.com)

Obstacles to MEL

- Poor program planning: aspirations far exceed capacities; sloppy thinking (no theory of change or theory of action); MEL not resourced
- Overwhelmed and under-resourced: lack of time is the biggest foe of good MEL
- Reluctance to be held accountable or to hold others accountable
- MEL imposed from above or outside; MEL gets caught up in politics
- MEL design inappropriate to the program: too heavy; measures the wrong things; culturally inappropriate; data not used [Somerville housing example]
- No incentives to do MEL (not in the work plan, not part of performance reviews)

The Power of Good MEL: Closing the Loop (an excellent publication by Internews)

- An underlying theory of action: two-way dialogic communication; information needs across the aid life-cycle, plus a series of assumptions about differential information needs
- Research itself as a capacity-building effort: a participatory process that fosters agency and empowerment, in this case the development of a professional group of researchers familiar with quantitative and qualitative methods
- A diagnostic that established a baseline and shaped programming, designed to test hypotheses about different populations (using both stratified and purposive sampling)

The Power of Good MEL: Closing the Loop (cont.)

- Building off tested instruments (e.g., survey questions adapted into culturally appropriate focus group questions); focus group questions and responses then informed survey design
- Frequent surveys and repeated focus groups: longitudinal data to test the hypothesis about changing needs over time (see the sketch after this list)
- Mixed methods: FGDs, surveys, content analysis of text messages
- Generated actionable information (e.g., the need for and effectiveness of cholera-prevention messaging; the most trusted media outlets; information needs [invalidating the hypothesis that men and women have different needs]; etc.)
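One way to picture the longitudinal test above: a minimal sketch, assuming a hypothetical CSV of repeated survey waves with made-up column names (wave, gender, top_info_need), that checks wave by wave whether stated information needs differ by gender, the hypothesis the study ended up invalidating.

```python
# Minimal sketch (hypothetical data layout, not the study's actual files).
import pandas as pd
from scipy.stats import chi2_contingency

surveys = pd.read_csv("survey_waves.csv")  # columns: wave, gender, top_info_need

for wave, group in surveys.groupby("wave"):
    # Cross-tabulate gender against the most-cited information need
    table = pd.crosstab(group["gender"], group["top_info_need"])
    chi2, p, dof, _ = chi2_contingency(table)
    verdict = "differ" if p < 0.05 else "do not differ"
    print(f"{wave}: information needs {verdict} by gender (p={p:.3f})")
```

If every wave comes back "do not differ," the program has evidence to retire the gendered-messaging assumption, which is exactly the kind of actionable finding the slide describes.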

The Power of Good MEL: Closing the Loop (cont.)

- Well presented in a visually compelling way

[Chart: survey results on respondents' most important issue]

Closing the Loop: Learn a Little, Do a Little

[Diagram: a dialogic feedback cycle. A diagnostic/needs assessment (at inception, 3 months, 6 months, 1 year) shapes media program content; user satisfaction, knowledge, and actions feed back to adjust both radio programming and aid delivery.]

A Review of the Basics

The basic evaluation questions:
- How do we do what we do better?
- What difference does it make?
- Does it make more difference than other investments?

Self-Interested MEL

- Be proactive about what constitute fair terms for judging your work; don't let the donor dictate them
- Use MEL reporting to tell a compelling narrative about your work (why it is important; how you confront challenges; that you are strategically nimble; that you are always building your foundation of alliances and political capital)
- Use MEL to educate your colleagues and make them more effective stakeholders

Laying Out the Program Cycle

[Diagram: the program cycle, oriented toward the GOAL, with MEL continuous and iterative throughout]
- Problem identification and diagnosis (stakeholder analysis)
- Theory of change: actors, motivations, how power is deployed, and other assumptions; the causal if-then path to a solution (theory of action)
- Strategy: goals, objectives, strategic elements, activities
- Implementation (operational plans/log frame): activities completed (outputs), first-order outcomes, 2nd-, 3rd-, etc. order outcomes, impacts

Getting from a Broad Theory of Change to Organizational Action: A Three-Step Process (Addressing Gun Violence)

- Theory of change: all the major factors and actors that affect gun-related morbidity and mortality
- Theories of action (i.e., what your organization can address):
  - Media and video violence
  - Gun legislation
  - Gun violence as a public health problem
  - Drugs and gang violence

LOG FRAME: Passing Gun Legislation

For each objective (Obj 1, Obj 2, Obj 3), the log frame lays out:
- activities
- outputs
- outcomes
- impacts

(One way to represent this structure is sketched below.)
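A minimal sketch, purely illustrative, of the log frame as a data structure; the objective, activities, and indicators named here are hypothetical, not from the slides.

```python
# Hypothetical sketch: a log frame as nested data, one entry per objective,
# so monitoring data can be attached at each level (activities -> outputs
# -> outcomes -> impacts).
from dataclasses import dataclass, field

@dataclass
class Objective:
    name: str
    activities: list[str] = field(default_factory=list)  # what you do
    outputs: list[str] = field(default_factory=list)     # activities completed
    outcomes: list[str] = field(default_factory=list)    # first- and higher-order changes
    impacts: list[str] = field(default_factory=list)     # long-term effects

log_frame = {
    "goal": "Passing gun legislation",
    "objectives": [
        Objective(
            name="Obj 1 (hypothetical): build a legislative coalition",
            activities=["Convene state-level partners"],
            outputs=["Coalition of partner organizations formed"],
            outcomes=["Coalition advances a shared bill"],
            impacts=["Reduced gun-related morbidity and mortality"],
        ),
        # Obj 2 and Obj 3 would follow the same structure
    ],
}
```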

Assessment Levels of MEL and What They Can Tell You

[Diagram: the program-cycle graphic annotated with the three assessment levels]
- Monitoring, at the implementation level (activities completed (outputs); first-order outcomes; 2nd- and 3rd-order outcomes; impacts): Did you do what you said you would, and did you do it well?
- Evaluation: Did you get the results you expected? Why or why not? What's the problem: theory, strategy, or execution?
- Learning, reaching back to the theory of change (actors, motivations, how power is deployed, and other assumptions; the causal if-then path to a solution) and the strategy (goals, objectives, strategic elements, activities): What are we going to do differently in this instance and in similar or future cases?

Need Clarity on Program Goals/Objectives (End State) AND the Point of Departure (Baseline)

[Diagram: the program cycle annotated with measurement points]
- Baseline at the point of departure, e.g., a power analysis alongside problem identification and diagnosis
- Intermediate markers/benchmarks along implementation (activities completed (outputs); first-, 2nd-, and 3rd-order outcomes; impacts), as in the sketch below
- End state: what you are accountable for delivering (aspirational vs. realistic; process vs. specific)
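As a toy illustration of benchmarks against a baseline (the indicator and numbers here are invented, not from the slides): progress can be expressed as the share of the planned baseline-to-end-state change achieved at each monitoring point.

```python
# Toy sketch: fraction of the baseline-to-target distance covered so far.
def progress(baseline: float, target: float, current: float) -> float:
    if target == baseline:
        raise ValueError("target must differ from baseline")
    return (current - baseline) / (target - baseline)

# Hypothetical indicator: share of legislators publicly backing the bill.
# Baseline 10%, end-state target 55%, midterm measurement 28%.
print(f"{progress(0.10, 0.55, 0.28):.0%} of the way to the end state")  # -> 40%
```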

Finding the Breadth/Depth Balance

- Broad but light: routine monitoring, self-assessment, and reporting across projects/programs
- Formative evaluation: at mid-term or a critical juncture
- Summative evaluation: accountability and broader learning; more strategic and less expensive if monitoring and documentation have been strong
- Targeted evaluation: problem solving; best practice; cost-effectiveness; capacities built

Learning

- Create space (time and incentives)
- Create a framework/strategy for learning
- Create platforms
- Knowledge management tools

Challenge: demonstrating that the whole is more than the sum of its parts. Are there key questions you should ask across evaluations (e.g., local-to-global links, alliance strength)?

[Diagram: a pyramid of aggregation, from project/national campaign, to program/regional or global campaign, up to overall progress toward achieving Oxfam's mission. The Economic Justice/Livelihoods change goal (assessed through meta-evaluations) spans the GROW advocacy campaign, climate change adaptation, and fair trade/value chains, and sits alongside other external change goals.]

What Type of Evaluator Are You?

Regardless of what type of evaluator you are, there are some basic principles to follow (the AEA Guiding Principles for Evaluators):
A. Systematic Inquiry
B. Competence
C. Integrity/Honesty
D. Respect for People
E. Responsibilities for General and Public Welfare

Ethical Principles in Data Collection in Humanitarian Response (InterAction)

- Broad principles: respect, do no harm, non-discrimination
- Broad principles operationalized: risk-benefit analysis; informed consent; confidentiality; security; fairness; dignity (people as subjects, not objects, of evaluation)

Mindful Practice

- Know your strengths
- Know your biases
- Be clear what values you're bringing to the process
- Seek information from multiple sources; ideally use mixed methods
- Make an extra effort to identify and reach stakeholders who may not typically have a voice
- Listen to what is said and what is going unsaid
- Don't just document; seek to understand
- Be familiar with resources in your field; seek out expert advice when you need it
