Presentation Transcript

Slide1

Focus On…

“Data Collection Choices”

Presented by: Tom Chapel

Slide2

This Module…

Why and how of:

Developing indicators

Making good data collection choices

Using mixed methods effectively

Slide3

[Figure: CDC’s Evaluation Framework – the six steps (Engage stakeholders; Describe the program; Focus the evaluation design; Gather credible evidence; Justify conclusions; Ensure use and share lessons learned) arranged around the four standards (Utility, Feasibility, Propriety, Accuracy).]

Indicator development bridges evaluation focus and data collection.

Slide4

What is an indicator?

A specific, observable, and measurable characteristic that shows progress towards a specified activity or outcome.

Slide5

Why Indicators?

Indicators occupy the “gray” area between abstract concepts framed in evaluation questions and methods/sources of data collection.

Indicators “operationalize” – restate abstract concepts in a tangible way.

Tangible indicators help find/match appropriate data sources/methods.

Indicators may, but need not, be S-M-A-R-T objectives.

Slide6

Selecting Good Indicators

1. Construct Validity

The indicator measures an important dimension of the activity or outcome, e.g., “quality” or “timeliness”.

Slide7

Selecting Good Indicators

2. Measure the activity or outcome itself, NOT the “fruits” or “so what” of the activity or outcome.

For example: what constitutes a measure of good training?

“Successful training implementation” is an indicator for good training.

“Did participants learn something?” is a fruit of good training.

Slide8

Selecting Good Indicators

3. There must be at least one indicator for each activity or outcome of interest -- BUT you may need multiple indicators.

The use of multiple indicators is called “triangulation”.

Slide9

Good Indicators Can Vary in Level of Specificity

Concept: Timely jail screening

Indicator: Inmates are screened prior to release (Specificity: low), OR

Indicator: % of inmates screened prior to release (Specificity: medium), OR

Indicator: 80% of felony inmates screened within 24 hours of booking (Specificity: high)
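To make the jump from indicator to data concrete, here is a minimal sketch (Python; the records and field names are invented for illustration, not from the presentation) of how the high-specificity indicator could be computed from booking and screening timestamps.

```python
from datetime import datetime, timedelta

# Invented screening records; field names are hypothetical.
records = [
    {"felony": True,  "booked": datetime(2015, 3, 1, 8, 0),  "screened": datetime(2015, 3, 1, 20, 0)},
    {"felony": True,  "booked": datetime(2015, 3, 2, 9, 0),  "screened": datetime(2015, 3, 4, 9, 0)},
    {"felony": False, "booked": datetime(2015, 3, 2, 10, 0), "screened": None},
    {"felony": True,  "booked": datetime(2015, 3, 3, 7, 0),  "screened": None},
]

WINDOW = timedelta(hours=24)  # "within 24 hours of booking"
TARGET = 0.80                 # the 80% threshold in the indicator

felony = [r for r in records if r["felony"]]
timely = [r for r in felony
          if r["screened"] is not None and r["screened"] - r["booked"] <= WINDOW]

rate = len(timely) / len(felony)
print(f"{rate:.0%} of felony inmates screened within 24 hours of booking "
      f"(target {TARGET:.0%}: {'met' if rate >= TARGET else 'not met'})")
```

Slide10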

Provider Education: Our Evaluation Focus

Activities: Conduct trainings; MD peer education and rounds; Nurse Educator presentation to LHD

Outcomes: Provider KAB (knowledge, attitudes, behaviors) increase; provider policies; providers know the registry and their role in it

Activities: Providers attend trainings and rounds; providers receive and use Tool Kits; LHD nurses do private provider consults

Outcomes: Providers’ motivation to do immunization increases

Slide11

Provider Education: Possible Indicators

Activity: Providers attend trainings and rounds

Indicators:

Number of participants in trainings and rounds

Number of participants completing series of trainings

Percent participants by discipline

Percent participants by region

Slide12

Provider Education: Possible Indicators

Activity: Providers receive and use Tool Kits

Indicators:

Percent providers who report use of toolkit

Number of “call-to-action” cards received from toolkit

Slide13

Data Collection Choices

The Framework approach emphasizes use of findings:

Not “Collect Data”, BUT “Gather Credible Evidence”

Not “Analyze Data”, BUT “Justify Conclusions”

Slide14

Characterizing Data Collection Methods and Sources

Primary vs. secondary:

primary: collecting data for the first time for the purpose of this project

secondary: making use of pre-existing data

Obtrusive vs. unobtrusive:

to what extent does the respondent know that data are being collected

Quantitative vs. qualitative:

quantitative: deals with numbers

qualitative: deals with descriptions
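These three dimensions can also be captured in a small data structure. Below is a minimal sketch (Python); the classifications shown are illustrative defaults, since a particular study design can shift any of them.

```python
from dataclasses import dataclass

@dataclass
class Method:
    name: str
    primary: bool       # True: collected for this project; False: pre-existing
    obtrusive: bool     # True: respondents know data are being collected
    quantitative: bool  # True: yields numbers; False: yields descriptions

# Illustrative classifications, not definitive rulings.
methods = [
    Method("Survey",         primary=True,  obtrusive=True,  quantitative=True),
    Method("Focus group",    primary=True,  obtrusive=True,  quantitative=False),
    Method("Observation",    primary=True,  obtrusive=False, quantitative=False),
    Method("Secondary data", primary=False, obtrusive=False, quantitative=True),
]

for m in methods:
    print(f"{m.name:14} primary={m.primary} obtrusive={m.obtrusive} "
          f"quantitative={m.quantitative}")
```

Slide15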

Quantitative and Qualitative

Quantitative → Quantity

Numbers – data which can be measured: length, height, area, volume, weight, speed, time, temperature, humidity, sound levels, cost.

Qualitative → Quality

Descriptions – data which can be observed but not measured: colors, textures, smells, tastes, appearance, beauty, etc.

Slide16

Six (Most) Common Ways to Collect Data

People:

Surveys

Interviews

Focus groups

Observation

Document review

Secondary data

Slide17

[Figure: CDC’s Evaluation Framework – the six steps (Engage stakeholders; Describe the program; Focus the evaluation design; Gather credible evidence; Justify conclusions; Ensure use and share lessons learned) arranged around the four standards (Utility, Feasibility, Propriety, Accuracy).]

Standards inform good choices at Step 4.

Slide18

Choosing Methods—Cross-Walk to Evaluation Standards

Standards:

Utility

Feasibility

Propriety

Accuracy

Slide19

Choosing Methods—Cross-Walk to Evaluation Standards

Standards:

Utility - What is the purpose of the data collection?

Feasibility

Propriety

Accuracy

Slide20

Choosing Methods—Cross-Walk to Evaluation Standards

Standards:

Utility

Feasibility - How much time? How much cost/budget?

Propriety

Accuracy

Slide21

Choosing Methods—Cross-Walk to Evaluation Standards

Standards:

Utility

Feasibility

Propriety - Any ethical considerations?

Accuracy

Slide22

Choosing Methods—Cross-Walk to Evaluation Standards

Standards:

Utility

Feasibility

Propriety

Accuracy - How valid and reliable do data need to be? What do “valid” and “reliable” mean in the context of the study?
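One way to operationalize the cross-walk is as a checklist run against every candidate method. The sketch below (Python) pairs each standard with its screening questions from the slides; the checklist structure itself is an assumption, not part of the Framework.

```python
# Screening questions for each evaluation standard, as listed on the slides.
STANDARDS = {
    "Utility":     ["What is the purpose of the data collection?"],
    "Feasibility": ["How much time?", "How much cost/budget?"],
    "Propriety":   ["Any ethical considerations?"],
    "Accuracy":    ["How valid and reliable do data need to be?",
                    "What do 'valid' and 'reliable' mean in the context of the study?"],
}

def review(method: str) -> None:
    """Print the standards checklist for a candidate data collection method."""
    print(f"Reviewing: {method}")
    for standard, questions in STANDARDS.items():
        for q in questions:
            print(f"  [{standard}] {q}")

review("Survey of providers")
```

Slide23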

Trade-offs of Different Data Collection Methods

[Table: each method (Personal Interview, Focus Groups, Document Review, Survey: Phone, Survey: Mail, Secondary Data, Observation) rated on the factors Time, Cost, Sensitive Issues, Hawthorne Effect, and Ethics.]
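The trade-off matrix can be made executable as well. In the sketch below (Python), each method gets a rough 1-5 rating per factor and a weighted score; all ratings and weights are invented for illustration, since the source table’s cell values are not shown.

```python
# Invented 1-5 ratings (5 = favorable); NOT taken from the source table.
RATINGS = {
    "Personal interview": {"time": 2, "cost": 2, "sensitive": 4, "hawthorne": 2, "ethics": 3},
    "Focus group":        {"time": 3, "cost": 3, "sensitive": 1, "hawthorne": 2, "ethics": 3},
    "Mail survey":        {"time": 4, "cost": 4, "sensitive": 5, "hawthorne": 4, "ethics": 4},
    "Observation":        {"time": 2, "cost": 2, "sensitive": 3, "hawthorne": 1, "ethics": 2},
}

def rank(weights: dict) -> list:
    """Rank methods by the weighted sum of their factor ratings."""
    scores = {method: sum(weights.get(f, 0) * r for f, r in ratings.items())
              for method, ratings in RATINGS.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Example 1's situation: a sensitive issue where accuracy dominates.
for method, score in rank({"sensitive": 2.0, "ethics": 1.0, "cost": 0.5}):
    print(f"{score:5.1f}  {method}")
```

With these invented numbers the ranking mirrors Example 1 below: the anonymous survey comes out best and the focus group worst.

Slide24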

Example 1: Sexual Behavior of High School Males

Point-in-time estimate – sexual behavior of high school males

Indicator: What % of high school males have had a sexual encounter by the end of their junior year?

Criterion: Sensitive issue (consider accuracy)

Slide25

Example 1: Sexual Behavior of High School Males

Possible methods:

Surveys

Interviews

Focus groups

Observation

Which method is WORST? WHY?

Slide26

Example 1: Sexual Behavior of High School Males

Possible methods:

Surveys

Interviews

Focus groups

Observation

Which method is WORST? Focus groups.

WHY? Sensitive issue – the peer group is likely to distort responses.

Slide27

Example 1: Sexual Behavior of High School Males

Possible methods:

Surveys

Interviews

Focus groups

Observation

Which method is BEST? Surveys.

WHY? Anonymous (more accurate).

Slide28

Example 2: Intimate Partner Violence

Understanding context – intimate partner violence

Indicator: Understand context and identify patterns of intimate partner violence.

Criterion: Sensitive issue (consider accuracy)

Slide29

Example 2: Intimate Partner Violence

Possible methods:

Surveys

Interviews

Focus groups

Observation

Which method is WORST? WHY?

Slide30

Example 2: Intimate Partner Violence

Possible methods:

Surveys

Interviews

Focus groups

Observation

Which method is WORST? Surveys.

WHY? Unethical, and will not elicit the data we need (consider utility).

Slide31

Example 2: Intimate Partner Violence

Possible methods:

Surveys

Interviews

Focus groups

Observation

Which method is BEST? Interviews or focus groups.

WHY? Build rapport through shared experiences.

Slide32

Example 3: Reduce Lead Burden in Household

Aggressive housekeeping and nutrition behaviors to reduce lead burden.

Indicator: Assess adoption of housekeeping and nutrition behaviors.

Criteria: Sensitive issue; Hawthorne effect

Slide33

Example 3: Reduce Lead Burden in Household

Possible methods:

Surveys

Interviews

Focus groups

Observation

Which method is WORST? Surveys, interviews.

WHY? Inaccurate (desire to give socially acceptable responses).

Slide34

Example 3: Reduce Lead Burden in Household

Possible methods:

Surveys

Interviews

Focus groups

Observation

Which method is BEST? Observation (garbage, coupons).

WHY? Passive and unobtrusive.

Slide35

The Best Method Depends on the Specific Situation

All three examples involve a sensitive issue:

sexual behavior

intimate partner violence

good nutrition and housekeeping

Even though the criterion (sensitive issue) was the same, the best data collection method was different for each situation.

Slide36

Provider Education: Our Evaluation Focus

Activities: Conduct trainings; MD peer education and rounds; Nurse Educator presentation to LHD

Outcomes: Provider KAB increase; provider policies; providers know the registry and their role in it

Activities: Providers attend trainings and rounds; providers receive and use Tool Kits; LHD nurses do private provider consults

Outcomes: Providers’ motivation to do immunization increases

Slide37

Provider Education: Possible Indicators

Activity: Providers attend trainings and rounds

Indicators:

Number of participants in trainings and rounds

Number of participants completing series of trainings

Percent participants by discipline

Percent participants by region

Slide38

Provider Education: Possible Methods

Activity: Providers attend trainings and rounds

Indicators and Methods/Sources:

Number of participants in trainings and rounds: Training logs

Number of participants completing series of trainings: Registration info

Percent participants by discipline

Percent participants by region

Slide39

Provider Education: Possible Methods

Activity: Providers receive and use Tool Kits

Indicators and Methods/Sources:

Percent providers who report use of toolkit: Survey of providers

Number of “call-to-action” cards received from toolkit: Analysis/count of call-to-action cards
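Pulling the pieces together, a minimal sketch (Python; the survey responses and field names are invented) can record which method feeds each indicator and then compute the first one, the percent of providers who report using the toolkit.

```python
# Indicator -> method/source, as mapped on the slides.
INDICATOR_SOURCES = {
    "Percent providers who report use of toolkit": "Survey of providers",
    "Number of call-to-action cards received": "Analysis/count of call-to-action cards",
}

# Invented survey responses: provider id -> reported toolkit use.
survey_responses = {"p01": True, "p02": False, "p03": True, "p04": True}

pct_use = sum(survey_responses.values()) / len(survey_responses)
indicator = "Percent providers who report use of toolkit"
print(f"{indicator}: {pct_use:.0%} (source: {INDICATOR_SOURCES[indicator]})")
```

Slide40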

Mixed Methods: Definition

A combination of methods that has complementary strengths and non-overlapping weaknesses.

The purpose is to supplement or complement the validity and reliability of the information.

Slide41

Why Mixed Methods?

“The Cs and the Es”

Corroboration and Clarification: understanding more defensibly, validly, credibly – “triangulation”

Explanation and Exploration: understanding more clearly; understanding the “why” behind the “what”
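As a closing illustration, a mixed-methods analysis pairs a quantitative result with the qualitative themes that corroborate or explain it. The sketch below (Python; the survey figure and interview themes are invented) shows the Cs-and-Es idea in miniature: the number says what happened, the coded themes suggest why.

```python
# Invented quantitative finding, e.g., from a provider survey.
survey_finding = {"indicator": "providers reporting toolkit use", "value": 0.75}

# Invented qualitative data: themes coded from follow-up interviews.
interview_themes = [
    "toolkit checklists save time during visits",
    "call-to-action cards prompt patient questions",
    "toolkit too bulky to keep in exam rooms",
]

# Corroboration and explanation: report the "what" alongside the "why".
print(f"What: {survey_finding['value']:.0%} of {survey_finding['indicator']}")
print("Why (interview themes):")
for theme in interview_themes:
    print(f"  - {theme}")
```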