Role of impact evaluations in the growth of STIP programs at USAID
Heather Huntington, PhD
Cloudburst Group
Background
Democracy, Human Rights and Governance
Democracy Fellow (2012) – USAID’s DCHA/DRG Center
Land Tenure and Natural Resource Management
Task lead for evaluation portfolio (2014–present) – USAID’s E3/Land Office
Evaluations – USAID
Ethiopia: Pastoral rights certification programs (2)
Ethiopia: Farmland rights certification program
Ghana: Local governance and service delivery
Guinea: Community land and artisanal diamond rights
Liberia: Community land rights protection program
Zambia: Agroforestry and land certification pilot
Zambia: REDD+ pilot, global climate change
Role of IEs in growth of STIP
Learning from IEs is still at a nascent stage at USAID.
Variation across sectors, Offices and Missions
Current evaluation practices and results still do not provide compelling evidence of the impacts of many programs.
There is still much work to be done on improving IEs and on learning from research and evaluation.
Early stage of building a learning agenda
Justification for programming funds
Policy requirement (USAID Evaluation Policy)
Afterthought, separate box to check
Example – Mobile application technologies
Offices/Missions vary in how much they are willing to invest in IEs
Pressure/support from leadership
USAID internal technical capacity and interest
Dearth of evidence on potential impact of many interventions
Too early for results to drive programming
IEs required for pilots (USAID Evaluation Policy)
Need to improve R&E and data driven approaches
Many programs (as designed) are not amenable to rigorous evaluations
Not integrated into program design
‘Christmas tree’ programs
Low internal technical capacity
Recruitment is difficult, talent turnover
Coordination is critical – over-reliance on contractors
Not a norm
Lack of Institutions, processes for effective R&E and building a sustainable learning agenda
IEs are difficult, time-consuming and expensive
Internal and external (implementing partner) resistance
IEs can make programming more difficult and expensive
What has been accomplished? (8 years)
IEs have become a more important part of USAID’s portfolio of M&E
USAID policies – Evaluation (2011), Research, Development Data Library
Independent, third-party evaluations
Significant improvement in rigor, methods
Growing R&E and data driven culture
Greater leadership support
Missions and Offices applying best practices, innovative and standardized methods
Mobile data collection
Geospatial integration
Improved survey instruments
Continued…
Attempts to build internal capacity
Improved knowledge, awareness and training
Focused recruitment (AAAS, Democracy Fellows) and staff with research/evaluation background
Learning from Baseline data collection
Adapt interventions based on pre-treatment data
Sponsor research on baseline data
Improve developing country capacity for R&E
Emphasis on engaging local data collection partners
What further steps could be taken?
(1) USAID does not have internal capacity to assess the impact and effectiveness of programs.
Building USAID internal capacity is critical to developing (1) high quality evaluations and (2) a sustainable learning agenda regarding the impact of STIP programs.
Managing rigorous impact evaluations and promoting an associated learning agenda is a significant time commitment – it requires:
(a) very close coordination with technical and program staff, plus evaluation and implementation teams
(b) a deep understanding of programming and evaluation components
Need to take steps to recruit and retain staff with the necessary background and skills, and to prioritize this role
Continued…
(2) Evaluation designs must be embedded in program designs at early stage of program development
(3) Rebuild institutional learning capacity
Lack of organizational mechanisms to integrate findings and learn from research
Low capacity to absorb and disseminate the results of evaluations
(4) Need continued support and incentives from senior leadership
Continued skepticism of IE methods; they are seen as cumbersome, expensive and inflexible to programming
Resources
https://www.usaid.gov/evaluation
USAID Evaluation Policy (2011)
Examples of learning integration (education, food security, policy updates, program design)
USAID Evaluation 5 year report (2016)
National Academies Report (2008) – Improving Democracy Assistance: Building Knowledge Through Evaluations and Research (http://pdf.usaid.gov/pdf_docs/Pnadl231.pdf)
www.usaidlandtenure.net – E3/Land Office evaluation and impact site