It’s the learning, not the result, which counts most in evaluation
Randall Pearce
THINK: Insight & Advice
The 7th Australasian Better Boards Conference, 5 July 2013
Evaluation – What is it?
‘A systematic way of answering questions about projects, policies and programs’
Is it needed or worthwhile?
Is it having an effect?
At what cost?
How could it be improved?
Are there better alternatives?
Are there any unintended consequences?
NFP Evaluation – What it isn’t
Who evaluates?
Why do (or don’t) they evaluate?
Source: New Philanthropy Capital
What do they gain?
Source: New Philanthropy Capital
Who should evaluate?
Internal
Advantages: better overall and informal knowledge; less threatening/known to staff; less costly
Disadvantages: may be less objective; evaluation could be a part-time responsibility; may not be trained in evaluation

External
Advantages: more objective; able to dedicate time and attention; greater evaluation expertise
Disadvantages: needs to learn about the organisation and its culture; unfamiliar to staff; more costly
When to conduct evaluation?
Form | Stage of program | Purpose
Proactive Evaluation | Before a program starts | To synthesise information to inform program design
Clarificative Evaluation | During program development | To clarify the program design and how it operates
Interactive Evaluation | During program delivery | To improve program delivery; involves stakeholders in the evaluation
Monitoring Evaluation | Once the program is in a settled stage | To monitor program progress for accountability and improvement
Impact Evaluation | During or after program implementation | To assess what has been achieved, to learn and be accountable

Source: K Roberts (adapted from Owen and Rogers, 2006)
Dispelling myths
Theory of change?
Not needed, because the evaluator will reconstruct the logic of the actual program, not the theoretical model:
Foundational activities
Activities
Outputs
Immediate outcomes
Intermediate outcomes
Long-term outcomes
Organisational goals
Dispelling myths
Mountain of data?
Most data is just information… we are looking for insight into what it means
Historical data is more valuable than a mountain of current data
Your evaluator should identify the few ‘dashboard’ measures that you will need to evaluate
Once an evaluation has been conducted, you can use the dashboard forever
Dispelling myths
A wad of cash?
Think of what is at stake versus the internal budget allocation – any activity with a value in excess of $200K should be evaluated
Governments and foundations often allow for 10% to be spent on evaluation
There are many ways that NFPs can reduce the cost of evaluations
Using the results of evaluation
Share them…as widely as you can
Some evaluators will agree to write a summary which protects the egos of those involved
Action learning/research is a participative approach based on a four-part cycle: taking action, measuring the results of the action, interpreting the results, and planning what change needs to take place for the next cycle of action
The best projects conclude with a Summit workshop
Beyond program impact evaluation
Learning along the way
Documentation
Documents successes and failures
Summary of key documents in one place
Timeline/sequence of events
Isolates key measures for the future
Supports performance appraisal for staff and board
Helps orient staff, volunteers and contractors
Learning along the way
Full cost accounting
Full costs and expenses need to be calculated to arrive at the true financial picture
Need to include:
Budget allocation
Cash donations
In-kind services
Pro-bono services
Learning along the way
Full value assessment
Captures all non-financial outputs in addition to financial information
For example, while social media produces a host of measures, there are no financial equivalents as there are in traditional media (e.g. TARPs)
Need to identify data sources for year-on-year comparison in future
Learning along the way
Organisational behaviour and governance
Qualitative research reveals issues around organisational behaviour and governance which can affect outcomes
Project governance can be examined independently of personalities to pinpoint areas for change/improvement
Learning along the way
Relationship building
The evaluation process has been described as ‘cathartic’ by key players
Helps defuse tensions that build up during a campaign
Provides stakeholders a voice/builds goodwill for the future
Aids communication ‘across the political/media divide’
Over to you…
Questions
For more information, contact:
Randall Pearce
+61 2 9358 6664
randall.pearce@thinkinsightadvice.com.au
NOTE: For a copy of this presentation, please provide your business card at the end of the session or email