How Do You Know When - PowerPoint Presentation



Presentation Transcript

Slide 1

How Do You Know When

Your Programs Really Work?

Evaluation Essentials for Program Managers
Session 3: DATA ANALYSIS
Anita M. Baker, Ed.D., Evaluation Services

Hartford Foundation for Public Giving,

Nonprofit Support Program: BEC

Bruner Foundation

Slide 2

These materials are for the benefit of any 501c3 organization. They MAY be used in whole or in part provided that credit is given to the Bruner Foundation.

They may NOT be sold or redistributed in whole or in part for a profit.

Copyright © by the Bruner Foundation 2012

*Please see supplementary materials for a sample agenda, activities, and handouts.

Bruner Foundation, Rochester, New York

Slide 3


How to Use the Bruner Foundation Evaluation Essentials for Program Managers PowerPoint Slides

The Evaluation Essentials for Program Managers slides were developed as part of a Bruner Foundation special project by evaluation trainer Anita Baker (Evaluation Services) and jointly sponsored by the Hartford Foundation for Public Giving. They were tested initially with a single organization in Rochester, NY (Lifespan) as part of the Evaluation Support Project 2010. The materials were revised and re-tested with three nonprofit organizations as part of the Anchoring Evaluation project in 2011-12. The slides, intended for use in organizations that have already participated in comprehensive evaluation training, include key basic information about evaluation planning, data collection, and analysis in three separate presentations. Organization officials or evaluation professionals working with nonprofit organization managers are encouraged to review the slides, modify their order, and add or remove content according to training needs. (Please note that the final session includes general information about analysis planning as well as analysis of both quantitative and qualitative data, and presentation of findings. Specific strategies related to data collection, i.e., analysis of survey data or interview data, and information about development of tables and graphs are included in the supplementary PowerPoint presentation.)

Additional Materials

To supplement these slides there are sample agendas, supporting materials for activities, and other handouts. There are “placeholder” slides, showing just a picture of a target with an arrow in the bullseye, that signify places where activities can be undertaken. Be sure to move or eliminate these depending on the planned agenda. Other, more detailed versions of the Evaluation Essentials materials are also available in Participatory Evaluation Essentials: An Updated Guide for Nonprofit Organizations and Their Evaluation Partners and the accompanying 6-session slide presentation. These materials are also available on the Bruner Foundation and Evaluation Services websites free of charge.

Whether you are an organization leader or an evaluation professional working to assist nonprofit organization staff, we hope that the materials provided here will support your efforts. When you have finished using the Evaluation Essentials for Program Managers series, have trainees take our survey: https://www.surveymonkey.com/s/EvalAnchoringSurvey

Bruner Foundation

Rochester, New York

Slide 4

What is Evaluation Anyway?

Program Evaluation: Thoughtful, systematic collection and analysis of information about activities, characteristics, and outcomes of programs, for use by specific people, to reduce uncertainties and inform decisions.

Participatory Evaluation: Trained evaluation personnel and practice-based decision-makers coming together to learn about, design, conduct, and use results of program evaluation.

i Review

Slide 5

How are evaluation data collected?

Interviews
Surveys
Observations
Record Reviews

All have limitations and benefits, and all require preparation on the front end:
Instrument development and testing
Administration plan development
Analysis plan development

ii Review

Slide 6

Evaluation Data Collection Options

Quantitative Data

Surveys: Administering a structured series of questions with discrete choices
Record Review: Collecting and organizing data about a program or event and its participants from outside sources
External Record Review: Utilizing quantitative data that can be obtained from existing sources

Qualitative Data

Interviews: Conducting guided conversations with key people knowledgeable about a subject
Focus Groups: Facilitating a discussion about a particular issue/question among people who share common characteristics
Observations: Documenting visible manifestations of behavior or characteristics of settings

iii Review

Slide 7

Surveys:

Series of items with pre-determined response choices.
Can be completed by an administrator or by respondents.
Can be conducted paper/pencil, by phone, or over the internet (e-survey), or using alternative strategies.
Instruments are called: surveys, “evaluations,” questionnaires.

USE SURVEYS TO:
Study attitudes and perceptions
Collect self-reported assessment of changes in response to program
Collect program assessments
Collect some behavioral reports
Test knowledge
Determine changes over time

[Slide graphic: PRE → POST; GRAND CLAIMS]

iv Review

Slide 8

Interviews:

A one-sided conversation with questions mostly pre-determined but open-ended; the respondent answers in their own terms.
Can be conducted in person or by phone, one-on-one or in groups.
Instruments are called: protocols, schedules, or guides.

USE INTERVIEWS TO:
Study attitudes and perceptions
Collect self-reported assessment of changes in response to program
Collect program assessments
Document program implementation
Determine changes over time

v Review

Slide 9

Observations:

Observations are conducted to view and hear actual program activities, so that users of reports will know what events occur and how.
Can be focused on: programs overall, participants, pre-selected features.
Instruments are called: protocols, guides, checklists.

USE OBSERVATIONS TO:
Document program implementation
Witness levels of skill/ability, program practices, behaviors
Determine changes over time

vi Review

Slide 10

Record Reviews:

Accessing existing internal information, or information collected for other purposes.
Can be focused on: your own records, records of other organizations, adding questions to existing documents.
Instruments are called: protocols.

USE RECORD REVIEW TO:
Collect some behavioral reports
Conduct tests, collect test results
Verify self-reported data
Determine changes over time

vii Review

Slide 11

What happens after data are collected?

Data are analyzed, results are summarized.
Findings must be converted into a format that can be shared with others.
Action steps should be developed from findings: “Now that we know _____, we will do _____.”

viii Review

Slide 12

Important Data-Related Terms

Data can exist in a variety of forms:
Records: numbers or text on pieces of paper
Digital/computer: bits and bytes stored electronically
Memory: perceptions, observations, or facts stored in a person’s mind

Qualitative v. Quantitative
Primary v. Secondary Data
Variables (Items)
Unit of Analysis
Duplicated v. Unduplicated
Unit Record (Client-level) v. Aggregated

Slide 13

Plan your Analysis in Advance!

What procedures will be conducted with each set of data, and who will do them?
How will data be coded and recoded?
How will data be disaggregated (i.e., “broken out,” for example by participant characteristics or time)?
How will missing data be handled?
What analytical strategies or calculations will be performed (e.g., frequencies, cross-tabs)?
How will comparisons be made?
Whether/which statistical testing is needed?
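To make these planning decisions concrete, here is a minimal Python/pandas sketch (an added illustration, not part of the original slides); the dataset and the column names such as gender, hours, and q2_response are hypothetical.

```python
import numpy as np
import pandas as pd

# Hypothetical unit-record (client-level) data; all names are illustrative.
df = pd.DataFrame({
    "participant_id": [1, 2, 3, 4, 5],
    "gender": ["F", "M", "F", "F", "M"],
    "hours": [12, np.nan, 30, 8, 22],           # one missing value
    "q2_response": ["Yes", "No", "Yes", None, "Yes"],
})

# Missing-data rule decided in advance: label skipped survey items explicitly;
# leave missing hours as NaN so they drop out of means automatically.
df["q2_response"] = df["q2_response"].fillna("No answer")

# Recode: break continuous hours into categories for disaggregation.
df["intensity"] = pd.cut(df["hours"], bins=[0, 10, 20, np.inf],
                         labels=["low", "medium", "high"])

# Planned calculations: frequencies, and a cross-tab disaggregated
# by a participant characteristic.
print(df["q2_response"].value_counts())
print(pd.crosstab(df["gender"], df["intensity"]))
```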

Slide 14

Analysis Plan Specifics: You Must Decide . . .

What procedures will be conducted with each set of data, and who will do them.
How data will be grouped or partitioned.
What types of codes will be applied to the data.
How comparisons will be made:
  Data to other project data (within group)
  Data to expectations
  Data to data from other sources (across groups)

There is no single process!

Slide 15

Analyzing (Quantitative) Data: A Few Important Terms*

Case: individual record (e.g., 1 participant, 1 day, 1 activity)
Demographics: descriptive characteristics (e.g., gender)
Disaggregate: to separate or group information (e.g., to look at data for males separately from females); conducting crosstabs is one strategy for disaggregating data
Duplicated/Unduplicated: e.g., counting the number of individuals at events (duplicated), or counting the number of events for each individual
Partition (v.): another term that means disaggregate
Unit of Analysis: the major entity of the analysis, i.e., the what or the whom being studied (e.g., participants, groups, activities)
Unit Record (i.e., client level) v. Aggregate (i.e., group level)
Variable: something that changes (e.g., number of hours of attendance)

*Common usage
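As a brief added illustration of the duplicated/unduplicated distinction, and of shifting the unit of analysis, using made-up attendance records:

```python
import pandas as pd

# Hypothetical attendance log: one row per person per event attended.
log = pd.DataFrame({
    "person": ["Ana", "Ben", "Ana", "Cal", "Ben", "Ana"],
    "event":  ["E1",  "E1",  "E2",  "E2",  "E3",  "E3"],
})

duplicated_count = len(log)                    # 6 attendances (duplicated)
unduplicated_count = log["person"].nunique()   # 3 unique individuals

# Changing the unit of analysis: events per individual (unit-record level).
events_per_person = log.groupby("person")["event"].count()
print(duplicated_count, unduplicated_count)
print(events_per_person)
```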

Slide 16

Quantitative Data Analysis: Basic Steps

Organize and arrange data (number cases as needed).
Scan data visually.
Code data per the analysis plan.
Enter and verify data.
Determine basic descriptive statistics.
Recode data as needed (including missing data).
Develop created variables.
Re-calculate basic descriptive statistics.
Conduct other analyses per the plan.
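The following is a minimal sketch of several of these steps in Python/pandas (an added example; the 999 missing-data sentinel and the score columns are assumptions made for illustration):

```python
import numpy as np
import pandas as pd

# Hypothetical survey data entered with 999 as a missing-data sentinel.
df = pd.DataFrame({
    "case_id": [101, 102, 103, 104],
    "pre_score": [10, 14, 999, 12],
    "post_score": [15, 18, 16, 999],
})

print(df.describe())            # first-pass basic descriptive statistics

# Recode missing data per the analysis plan (999 -> NaN).
df = df.replace(999, np.nan)

# Develop a created variable, then re-calculate descriptives.
df["score_change"] = df["post_score"] - df["pre_score"]
print(df.describe())
```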

Slide 17

Quantitative Data Analysis Strategies

Important Things to Look at or Summarize:
Frequencies: how often a response or status occurs
Total and Valid Percentages: frequency/total × 100
Measures of Central Tendency: mean, median, (modes)
Distribution: minimum, maximum, groups (*iles)
Cross-Tabulations: relationship between two or more variables (also called contingency analyses; can include significance tests such as chi-square analyses)

Useful 2nd-Level Procedures:
Means testing (ANOVA, t-tests)
Correlations
Regression analyses
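A short added sketch of these second-level procedures using scipy.stats; all group data are made up for illustration:

```python
from scipy import stats

# Hypothetical attendance hours for three program groups.
group_a = [12, 15, 9, 20, 14]
group_b = [22, 25, 19, 30, 24]
group_c = [16, 18, 21, 15, 19]

t_stat, t_p = stats.ttest_ind(group_a, group_b)           # two-group means test
f_stat, f_p = stats.f_oneway(group_a, group_b, group_c)   # ANOVA for 3+ groups

# Correlation between two paired measures (e.g., hours and an outcome score).
hours = [12, 15, 9, 20, 14]
scores = [60, 68, 55, 80, 66]
r, r_p = stats.pearsonr(hours, scores)
print(t_p, f_p, r)
```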

Slide 18

Analyzing Quantitative Data: Important Things to Look at or Summarize

What to Do: Calculate frequencies; calculate total and/or valid percentages.

What That Means: Count how many there are of something. Count how often something (e.g., a response) occurs. Percentage = frequency/total × 100.

Example Questions You Could Answer:
How many participants were in each group?
What were the demographics of participants?
How many answered “Yes” to Question 2?
What proportion of participants met intensity targets?
What proportion of all those who answered Question 2 said “Yes”?
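An added sketch of frequencies and the total-vs.-valid percentage distinction in pandas, with hypothetical Question 2 responses (None means the item was skipped):

```python
import pandas as pd

# Hypothetical responses to Question 2 from eight participants.
q2 = pd.Series(["Yes", "No", "Yes", None, "Yes", None, "No", "Yes"])

freq = q2.value_counts()                                         # frequencies
total_pct = q2.value_counts(dropna=False, normalize=True) * 100  # of all cases
valid_pct = q2.value_counts(normalize=True) * 100                # of answers only

print(freq)
print(total_pct.round(1))   # "Yes" as a share of all 8 participants
print(valid_pct.round(1))   # "Yes" as a share of the 6 who answered
```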

Slide 19

Analyzing Quantitative Data: Important Things to Look at or Summarize

What to Do: Determine central tendencies.

What That Means: Calculate the average (mean), or identify the median (middle) or mode (most common value).
Avg. = Sum of Values / Total Number of Values (e.g., total # of hours / total # of people with hours)

Example Questions You Could Answer:
What is the average number of hours participants attend?
What is the most common number of days attended in a week? (mode)
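A minimal added example computing the three central-tendency measures in pandas, with made-up attendance hours:

```python
import pandas as pd

# Hypothetical hours of attendance, one value per participant.
hours = pd.Series([12, 8, 12, 20, 15, 12, 30])

print(hours.mean())      # average = sum of values / number of values
print(hours.median())    # middle value when sorted
print(hours.mode()[0])   # most common value (12)
```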

Slide 20

Analyzing Quantitative Data: Important Things to Look at or Summarize

What to Do: Determine distributions; run cross-tabulations (pivot tables are crosstabs).

What That Means: Determine the minimum value, the maximum, and/or how the data are grouped (e.g., high, medium, or low values, quartiles, percentiles, etc.). A cross-tabulation shows the relationship between 2 or more variables (also called contingency analyses; can include significance tests such as chi-square analyses).

Example Questions You Could Answer:
What was the least amount of attendance for the group? What was the most?
How many participants fall into low, medium, and high intensity groups?
Are there relationships between participant characteristics and outcome changes?
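An added sketch showing distribution endpoints, intensity grouping, and a cross-tabulation with a chi-square test (pandas and scipy); the data and the cut-points for low/medium/high are hypothetical:

```python
import pandas as pd
from scipy import stats

# Hypothetical data: attendance hours and whether the outcome improved.
df = pd.DataFrame({
    "hours": [4, 9, 15, 22, 35, 8, 28, 18, 40, 6],
    "improved": ["No", "No", "Yes", "Yes", "Yes",
                 "No", "Yes", "No", "Yes", "No"],
})

print(df["hours"].min(), df["hours"].max())     # distribution endpoints

# Group into low/medium/high intensity bands (illustrative cut-points).
df["intensity"] = pd.cut(df["hours"], bins=[0, 10, 25, float("inf")],
                         labels=["low", "medium", "high"])

# Cross-tabulation, with a chi-square test of the relationship.
table = pd.crosstab(df["intensity"], df["improved"])
chi2, p, dof, expected = stats.chi2_contingency(table)
print(table)
print(p)
```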

Slide 21

Coding and Data Entry

Create codebook(s) as needed (identify codes and affix them to instrument copies).
Create an electronic database when possible (use Excel, SPSS, SAS).
Identify/create unique identifiers for cases and affix or enter as needed.
Enter or extract data as needed (do not recode as data are entered).
Make (electronic or paper) copies of your data.
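A small added illustration of keeping a codebook separate from the entered data, so raw entries are never recoded on entry; the agreement scale and case IDs are hypothetical:

```python
import pandas as pd

# Hypothetical codebook mapping raw survey answers to numeric codes.
codebook = {"Strongly disagree": 1, "Disagree": 2,
            "Agree": 3, "Strongly agree": 4}

raw = pd.DataFrame({
    "case_id": ["A-001", "A-002", "A-003"],   # unique identifiers per case
    "item_1": ["Agree", "Strongly agree", "Disagree"],
})

# Apply codes in a new column, leaving the entered data untouched.
raw["item_1_code"] = raw["item_1"].map(codebook)
print(raw)
```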

Slide 22

Analysis of Qualitative Data

Analytical strategies are similar for qualitative and quantitative data.

Consider how you plan to use findings: who is the audience? What format works best?
Plan your analysis in advance.
How do the data fit within the overall evaluation plan and with other data?
How will findings fit in the overall report plan?
How will you code, display, and draw conclusions about the data?
How will you validate/verify and adjust your findings?

Be careful interpreting data!

Slide 23


Steps to Take When Analyzing Qualitative Data

Segment or partition data (i.e., divide it into meaningful analytical units).
Reduce data.
Code data.
Compare data.
Organize, summarize, and display data.
Draw conclusions; verify/validate results.
Revise summaries and displays accordingly.

The process is iterative.

Slide 24


Coding Qualitative Data

A priori or deductive codes: predetermined categories based on accepted theory or program knowledge.
Inductive codes: based on raw data (not predetermined).
Hierarchical codes: larger categories with subcategories in each.

You can combine inductive and deductive codes within a hierarchical coding scheme, as sketched below.
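A minimal added sketch of such a hierarchical scheme in plain Python; the category names, subcodes, and interview segments are invented for illustration:

```python
# Hierarchical coding scheme mixing a priori (deductive) parent categories
# with inductive subcodes added during review of the raw data.
codes = {
    "BARRIERS": {                      # a priori category from program theory
        "BARRIERS/transport": "trouble getting to the program site",
        "BARRIERS/schedule": "conflicts with work or school",
        "BARRIERS/childcare": "inductive subcode added from raw data",
    },
    "BENEFITS": {
        "BENEFITS/skills": "new skills reported",
        "BENEFITS/confidence": "inductive subcode added from raw data",
    },
}

# Coded interview segments: each segment may carry multiple codes.
segments = [
    ("I couldn't find a sitter, so I missed sessions",
     ["BARRIERS/childcare"]),
    ("The bus route changed, but I feel more capable now",
     ["BARRIERS/transport", "BENEFITS/confidence"]),
]
for text, applied in segments:
    print(applied, "<-", text)
```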

Slide 25

Coding Strategies and Reminders

Keep a master list of codes.
Distinguish a priori and inductive codes.
Re-apply codes to all segments.
Use multiple codes, but keep coding schemes as simple as possible.
Test out sample entries to identify potential problems before finalizing code selections.
Check for inter/intra-coder reliability (consistency); coding is not exact (expect differences).
Co-occurring codes (more than one applies).
Face-sheet codes (descriptors).
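One common way to check inter-coder consistency is Cohen's kappa; here is an added sketch using scikit-learn's cohen_kappa_score with two hypothetical raters coding the same ten segments:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical codes two raters assigned to the same ten segments.
rater_1 = ["BARRIER", "BENEFIT", "BARRIER", "OTHER", "BENEFIT",
           "BARRIER", "BENEFIT", "OTHER", "BARRIER", "BENEFIT"]
rater_2 = ["BARRIER", "BENEFIT", "OTHER", "OTHER", "BENEFIT",
           "BARRIER", "BARRIER", "OTHER", "BARRIER", "BENEFIT"]

kappa = cohen_kappa_score(rater_1, rater_2)
print(round(kappa, 2))   # 1.0 = perfect agreement; expect some differences
```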

Slide 26

Enumeration

A strategy for organizing, summarizing, and displaying qualitative data: quantify the frequency of codes* or types.
Use counts to define results (e.g., most responses were positive; all responses fell into 4 categories; the category most exemplified was __________).

*e.g., none, some, a lot, or as a percentage
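A short added example of enumeration using Python's collections.Counter; the applied codes are invented for illustration:

```python
from collections import Counter

# Hypothetical codes applied across all interview segments.
applied_codes = ["positive", "positive", "negative", "positive",
                 "neutral", "positive", "negative", "positive"]

counts = Counter(applied_codes)
total = sum(counts.values())
for code, n in counts.most_common():
    # e.g., supports statements like "most responses were positive"
    print(f"{code}: {n} ({n / total:.0%})")
```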
Slide 27

Slide 28

Negative Findings

Explain the results, what they mean, and why they occurred if possible.
Clarify how negative the findings are.
Don’t blame negative findings on bad evaluation.
Clarify the next course of action.
Clarify what did work and for whom.
Avoid a milquetoast approach.
Don’t be reluctant to report if possible.

Slide 29

Inaccurate Findings

Determine the cause.
Disseminate errata if necessary, or recall the report.
Communicate to stakeholders why the results will not be usable.

Slide 30

Inconclusive Findings

Present in an unbiased fashion.
Indicate that conclusions cannot be drawn.
Develop a plan to correct evaluation or program problems if necessary.

Slide 31

Positive Findings

Explain the results, what they mean, and why they occurred if possible.
Clarify how positive the findings are, who the program worked for, and how.
Don’t distrust positive results (but be careful to avoid biased designs).
Report positive results and celebrate accomplishments.
Clarify the next course of action.
Resist making assumptions about the next iteration.
Design careful follow-up.

Slide 32
Slide 33

Evaluation Reporting: Initial Steps

1. Clearly identify your audience. Staff? Funders? Board? Participants? Multiple audiences?

2. Determine what presentation strategies work best: PowerPoint, newsletter, fact sheet, oral presentation, visual displays, video, storytelling, press releases.

Report: full report, executive summary, or stakeholder-specific report?

Slide 34

Components of A Strong Program Evaluation Report

Introduction: Description of the subject program. Clear statement about the evaluation questions and the purpose of the evaluation.

Methods: Description of actual data collection methods.

Findings: Summary of key findings (including tables, graphs, vignettes, quotes, etc.).

Conclusions: Discussion or explanation of the meaning and importance of key findings. Suggested action steps. Next steps (for the program and the evaluation). Issues for further consideration (loose ends).

Slide 35

Think About Communication Strategies

Are there natural opportunities for sharing (preliminary) findings with stakeholders?
At a special convening
At regular or pre-planned meetings
During regular work interactions (e.g., clinical supervision, staff meetings, board meetings)
Via informal discussions

Slide 36

Additional Reporting Tips

Convert findings to shareable form(s). Think about internal and external reporting. Plan for multiple reports.
Before you start writing, be sure to develop an outline and pass it by some stakeholders.
If you’re commissioning an evaluation report, ask to see a report outline in advance.
Review the evaluation reports of others carefully for the important components and meaningfulness.

Slide 37

Before You Present Your Findings, Answer These Questions

Do your findings accurately reflect the data you collected?
How might your interpretation be inaccurate?
Are there any unintended consequences that might result from sharing these findings?
Are there any missing voices you overlooked?

Slide 38

Sharing Findings: the ‘-ate’ Steps

Deliberate – Spend time with people close to the evaluation work and confirm the findings. You must convince yourself (or yourselves) first.
Anticipate – Determine how you want to use the findings and what value might be derived from the findings for the program/process.
Investigate – Once you have findings, test them with key stakeholders. They will shed light on the perceived value of the findings.
Calibrate – Develop a result-sharing mechanism that can convey the message you want to convey to your chosen audience.
Illuminate – Remove any unnecessary details and highlight the key findings.
Substantiate – Take a step away from the work and come back to it later with fresh eyes. Ask yourself, “Do the findings still resonate?”
Annotate – Proofread the final draft. Misteaks can distract from results.
Communicate – Share the results!
