Program Evaluation Webinar Series Part 1: “Top Roadblocks on the Path to Good Evaluation – And How to Avoid Them”




Presentation Transcript

Program Evaluation Webinar Series Part 1: “Top Roadblocks on the Path to Good Evaluation – And How to Avoid Them”
Presented by: Tom Chapel

Top Roadblocks on the Path to Good Evaluation – And How to Avoid Them
Thomas J. Chapel, MA, MBA
Chief Performance Officer (Acting), CDC/Office of the Director/OCOO
Presented November 20, 2008
Tchapel@cdc.gov | 404-498-6073

Objectives
Program evaluation and the typical “roadblocks” to doing good evaluation.
CDC’s Evaluation Framework as a way to surmount those roadblocks.

Key Points
In today’s session we will discuss: What is important about CDC’s framework? Why does it lead to better use of findings?
Steps: Engage stakeholders; Describe the program; Focus the evaluation design; Gather credible evidence; Justify conclusions; Ensure use and share lessons learned.
Standards: Utility; Feasibility; Propriety; Accuracy.

Why We Evaluate…
“…The gods condemned Sisyphus to endlessly roll a rock up a hill, whence it would return each time to its starting place. They thought, with some reason…

Why We Evaluate…
…there was no punishment more severe than eternally futile labor….”
The Myth of Sisyphus

The Problem
The stuff I do doesn’t make a difference! Why don’t things get better?!

Implementing Program Evaluation
How do I motivate? What gets in the way?

Today’s Focus
Top Roadblocks on the Road to Good Evaluation

Defining Evaluation
Evaluation is the systematic investigation of the merit, worth, or significance of any “object.”
– Michael Scriven

Use the Findings!
If the findings don’t get used… the program will not improve.

What is “Evaluation”?
Evaluation is not… a specific set of tools or techniques.
Evaluation is… an orientation to your program, the idea of continuous reflection.


What is a “Program”?
Not only: big training programs, community interventions.
But also: recommendations and guidelines, surveillance systems.
In other words, a program is anything with an intended outcome.

Roadblock #6
Not understanding where evaluation “fits in”…

The “Silo” Model

The Integrated or “CQI” Model
To achieve “continuous quality improvement,” planners, performance measurers, and evaluators must communicate with each other.

The Customer is the Key
Program evaluation must:
See planning, performance measurement, and evaluation as being integrated.
Start with the idea of having a customer or an intended user of findings.
Direct the evaluation with the customer in mind.

Roadblock #5
Making the “perfect” the enemy of the “good.”

Roadblock #5
What if you said, “To be cardiovascularly fit, you must run a marathon”?

Thanks, but…
That’s not me. I don’t have that expertise. I don’t have those skills. I don’t have the money to do that.

Do What You Can!
There’s always an evaluation worth doing. The biggest mistake is doing nothing because you can only do a little. Even a little bit is going to yield some benefit.

Roadblock #4
Evaluating only what you can “measure”… because those are the things we can measure with validity, reliability, and accuracy.

Upstream Questions
How many brochures? How many trainees? How many people showed up? Did we get a lot of product out there?

Downstream Questions
What have you done for me lately? How has it mattered? What have you done for public health?

Measuring the Right Thing…
“…Sometimes, what counts can’t be counted. And what can be counted doesn’t count….”
– Albert Einstein

Evaluation Starts By Saying…
What are the important things that need to be measured? Can I measure them with enough rigor to meet the needs of this situation this time? Sometimes the answer is “NO!”

You Get What You Measure…
“…In Poland in the 1970s, furniture factories were rewarded based on pounds of product shipped. As a result, today Poles have the world’s heaviest furniture…” (New York Times, 3/4/99)

Roadblock #3
Neglecting Intermediate Outcomes…
Nothing has advanced the evaluation cause in public health more than preaching this idea of intermediate outcomes.

Intermediate Outcomes Contribute to Downstream Success
How is it that my program will make a contribution to that downstream outcome? We call these “intermediate outcomes.”

What is the Program Logic?
What needs to happen to achieve the desired outcome? What is the “program logic”?
My action → Desired outcome

What are the markers that tell me I’m on the right road? Don’t just ask: Did it work? How many tomatoes did I get?

What are the markers that tell me I’m on the right road? Ask: Is it working? Are planting, watering, and weeding taking place? Have the blossoms “set”? Are there nematodes on the plants?

Research Model
Develop theory → Program activities → Measure outcome

Research Model
Develop theory → Program activities → Measure outcome
If I achieved the outcome – great! If I didn’t achieve the outcome – why?

Evaluation Unpacks the “Black Box”
My action → Desired outcome

The World’s Best Children’s Soccer Program

But We Never Won a Game

Focus on Intermediate Outcomes
Can we: pass the ball? spread out? spend more time on the opponent’s side of the field?

Forgetting Intermediate Outcomes
(cartoon: ScienceCartoonsPlus.com)

What’s In the Box?
My program: training, technical assistance, funding, partnerships → Desired outcome: less morbidity, lower mortality

What’s In the Box?
My program: training, technical assistance, funding, partnerships → Intermediate outcomes → Desired outcome: less morbidity, lower mortality

The Power of Evaluation
Establishing intermediate outcomes allows you to determine if you are making progress in the right direction.

Diabetes Intermediate Outcomes

Why Intermediate Outcomes?
I’m making progress in the right direction. I am contributing to the downstream outcome.

Identifying Intermediate Outcomes
What is the ultimate outcome I’m seeking? Who (besides me) needs to take action to achieve it? What action do they need to take?
These are the intermediate outcomes that populate the “black box” or the “program logic.”

Roadblock #2
Confusing attribution and contribution… “I can’t make the case that my program was responsible for that change.”

The Role of Public Health
Public health is not… a direct deliverer of services.
Public health is… a mobilizer and convener.
Based on: The Future of Public Health, Institute of Medicine, 1988.

“Networked” Interventions
[Diagram: Agencies A–D, each running multiple programs (A-1…A-n, B-1, C-1…C-n, D-1…D-n), whose outputs lead to short-term outcomes, long-term outcomes, and a shared system outcome.]

Attribution
[Diagram: the same network, with only Agency A’s path from outputs to the system outcome highlighted, crediting the system outcome to one agency’s programs.]

Contribution
[Diagram: the same network, with every agency’s path highlighted; each agency’s programs contribute to the shared system outcome.]

Identify Your Contributions by Asking “Why?”
Why fewer diabetes amputations?

Identify Your Contributions by Asking “Why?”
Why fewer diabetes amputations? Because physicians are doing more timely foot exams. Why?

Identify Your Contributions by Asking “Why?”
Why fewer diabetes amputations? Because physicians are doing more timely foot exams. Because the insurance reimbursement climate has changed. Why?

Identify Your Contributions by Asking “Why?”
Why fewer diabetes amputations? Because physicians are doing more timely foot exams. Because the insurance reimbursement climate has changed. Because the standards of practice have changed. Why?

A “Chain of Causation”
Because: We formed a coalition. We helped incent those standards. We helped change the reimbursement climate.

Contributions Count!
Attribution: We formed a coalition.
Contribution: We helped change the standards. We helped incent reimbursement.

Establish the “Chain of Causation”
Ask providers. Ask insurance reimbursers. Ask physicians. Was the coalition influential in making these changes?

Roadblock #1
Not asking: “Who (else) cares…?”

Ask the Right Questions
Who matters for this intervention besides me? Who else needs the information from this evaluation? The stakeholders!
(CDC framework Step 1: Engage stakeholders.)