Program Evaluation Webinar Series Part 1: “Top Roadblocks on the Path to Good Evaluation, and How to Avoid Them”
Presented by Thomas J. Chapel, MA, MBA
Uploaded 2016-07-05



Presentation Transcript

Slide1

Program Evaluation Webinar Series Part 1:

“Top Roadblocks on the Path to Good Evaluation, and How to Avoid Them”

Presented by: Tom Chapel

Slide2

Top Roadblocks on the Path to Good Evaluation, and How to Avoid Them

Thomas J. Chapel, MA, MBA
Chief Performance Officer (Acting)
CDC/Office of the Director/OCOO

Presented November 20, 2008
Tchapel@cdc.gov
404-498-6073

Slide3

Objectives

Program evaluation and typical “roadblocks” in doing good evaluation.
CDC’s Evaluation Framework as a way to surmount those roadblocks.

Slide4

Key Points

In today’s session we will discuss: What is important about CDC’s framework? Why does it lead to better use of findings?

STEPS:
Engage stakeholders
Describe the program
Focus the evaluation design
Gather credible evidence
Justify conclusions
Ensure use and share lessons learned

STANDARDS:
Utility
Feasibility
Propriety
Accuracy

Slide5

Why We Evaluate…

“...The gods condemned Sisyphus to endlessly roll a rock up a hill, whence it would return each time to its starting place. They thought, with some reason…

Slide6

Why We Evaluate…

…there was no punishment more severe than eternally futile labor....”

The Myth of Sisyphus

Slide7

The Problem

The stuff I do doesn't make a difference! Why don't things get better?!

Slide8

Implementing Program Evaluation

How do I motivate? What gets in the way?

Not this… This…

Slide9

Today’s Focus

Top Roadblocks on the Road to Good Evaluation

Slide10

Defining Evaluation

Evaluation is the systematic investigation of the merit, worth, or significance of any “object.”

Michael Scriven

Slide11

Use the Findings!

If the findings don’t get used, the program will not improve.

Slide12

What is “Evaluation”?

Evaluation is not… a specific set of tools or techniques.

Evaluation is… an orientation to your program; the idea of continuous reflection.

Slide13

Defining Evaluation

Evaluation is the systematic investigation of the merit, worth, or significance of any “object.”

Michael Scriven

Slide14

What is a “Program”?

Not only:
Big training programs
Community interventions

But also:
Recommendations and guidelines
Surveillance systems

In other words, a program is anything with an intended outcome.

Slide15

Roadblock #6

Not understanding where evaluation “fits in.”

Slide16

The “Silo” Model

Slide17

The Integrated or “CQI” Model

To achieve continuous quality improvement, planners, performance measurers, and evaluators must communicate with each other.

Slide18

The Customer is the Key

Program evaluation must:
See planning, performance measurement, and evaluation as integrated.
Start with the idea of a customer, an intended user of the findings.
Direct the evaluation with that customer in mind.

Slide19

Roadblock #5

Making the “perfect” the enemy of the “good.”

Slide20

Roadblock #5

What if you said, “To be cardiovascularly fit, you must run a marathon”?

Slide21

Thanks, but…

That's not me. I don't have that expertise. I don't have those skills. I don't have the money to do that.

Slide22

Do What You Can!

There’s always an evaluation worth doing. The biggest mistake is doing nothing because you can only do a little. Even a little bit is going to yield some benefit.

Slide23

Roadblock #4

Evaluating only what you can “measure”… because those are the things we can measure with validity, reliability, and accuracy.

Slide24

Upstream Questions

How many brochures? How many trainees? How many people showed up? Did we get a lot of product out there?

Slide25

Downstream Questions

What have you done for me lately? How has it mattered? What have you done for public health?

Slide26

Measuring the Right Thing…

“…Sometimes, what counts can’t be counted. And what can be counted doesn’t count….”

Albert Einstein

Slide27

Evaluation Starts By Saying…

What are the important things that need to be measured? Can I measure them with enough rigor to meet the needs of this situation, this time? Sometimes the answer is “No!”

Slide28

You Get What You Measure…

“…In Poland in the 1970s, furniture factories were rewarded based on pounds of product shipped. As a result, today Poles have the world’s heaviest furniture…”

(New York Times, 3/4/99)

Slide29

Roadblock #3

Neglecting intermediate outcomes. Nothing has advanced the evaluation cause in public health more than preaching this idea of intermediate outcomes.

Slide30

Intermediate Outcomes Contribute to Downstream Success

How is it that my program will make a contribution to that downstream outcome? We call these “intermediate outcomes.”

Slide31

What is the Program Logic?

What needs to happen to achieve the desired outcome? What is the “program logic”?

My action → Desired outcome

Slide32

What are the markers that tell me I’m on the right road?

Don’t just ask: Did it work? How many tomatoes did I get?

Slide33

What are the markers that tell me I’m on the right road?

Ask: Is it working? Are planting, watering, and weeding taking place? Have the blossoms “set”? Are there nematodes on the plants?

Slide34

Research Model

Develop Theory → Program Activities → Measure Outcome

Slide35

Research Model

Develop Theory → Program Activities → Measure Outcome

If I achieved the outcome, great! If I didn’t achieve the outcome, why?

Slide36

Evaluation Unpacks the “Black Box”

My action → Desired outcome

Slide37

The World’s Best Children’s Soccer Program

Slide38

But We Never Won a Game

Slide39

Focus on Intermediate Outcomes

Can we: pass the ball? spread out? spend more time on the opponent’s side of the field?

Slide40

Forgetting Intermediate Outcomes

ScienceCartoonsPlus.com

Slide41

What’s In the Box?

My program: training, technical assistance, funding, partnerships
Desired outcome: less morbidity, fewer mortalities

Slide42

What’s In the Box?

My program: training, technical assistance, funding, partnerships
Intermediate outcomes
Desired outcome: less morbidity, fewer mortalities

Slide43

The Power of Evaluation

Establishing intermediate outcomes allows you to determine whether you are making progress in the right direction.

Slide44

Diabetes Intermediate Outcomes

Slide45

Why Intermediate Outcomes?

I’m making progress in the right direction. I am contributing to the downstream outcome.

Slide46

Identifying Intermediate Outcomes

What is the ultimate outcome I’m seeking? Who (besides me) needs to take action to achieve it? What action do they need to take?

These are the intermediate outcomes that populate the “black box” or the “program logic.”

Slide47

Roadblock #2

Confusing attribution and contribution… “I can’t make the case that my program was responsible for that change.”

Slide48

The Role of Public Health

Public health is not… a direct deliverer of services.
Public health is… a mobilizer and convener.

Based on: The Future of Public Health, Institute of Medicine, 1988.

Slide49

“Networked” Interventions

[Diagram: Programs A-1…A-n (Agency A), B-1 (Agency B), C-1…C-n (Agency C), and D-1…D-n (Agency D) each produce outputs, which lead to short-term outcomes, long-term outcomes, and a system outcome.]

Slide50

Attribution

[Diagram: the same networked-intervention picture, with Agency A and its programs singled out as the sole cause of the outputs, short-term outcomes, long-term outcomes, and system outcome.]

Slide51

Contribution

[Diagram: the same networked-intervention picture, with the programs of Agencies A, B, C, and D jointly producing the outputs, short-term outcomes, long-term outcomes, and system outcome.]

Slide52

Identify Your Contributions by Asking “Why?”

Why fewer diabetes amputations?

Slide53

Identify Your Contributions by Asking “Why?”

Why fewer diabetes amputations? Because physicians are doing more timely foot exams.

Slide54

Identify Your Contributions by Asking “Why?”

Why fewer diabetes amputations? Because physicians are doing more timely foot exams. Why? Because the insurance reimbursement climate has changed.

Slide55

Identify Your Contributions by Asking “Why?”

Why fewer diabetes amputations? Because physicians are doing more timely foot exams. Why? Because the insurance reimbursement climate has changed. Why? Because the standards of practice have changed.

Slide56

A “Chain of Causation”

Because: We formed a coalition. We helped incent those standards. We helped change the reimbursement climate.

Slide57

Contributions Count!

Attribution: We formed a coalition.
Contribution: We helped change the standards. We helped incent reimbursement.

Slide58

Establish the “Chain of Causation”

Ask providers. Ask insurance reimbursers. Ask physicians. Was the coalition influential in making these changes?

Slide59

Roadblock #1

Not asking: “Who (else) cares?”

Slide60

Ask the Right Questions

Who matters for this intervention besides me? Who else needs the information from this evaluation? The stakeholders!

Engage stakeholders (Step 1).