Evaluation Report Template - PowerPoint Presentation


Presentation Transcript

Slide1

Evaluation Report Template

Slide2

The steps involved (D-p4)

Step1: understanding the basics of evaluation (ch1)

Step2: defining the main purposes of the evaluation and the "big picture" questions that need answers (ch2)

Step3: Identifying the evaluative criteria (ch3)

Step4: Organizing the list of criteria and choosing sources of evidence (collecting data) (ch4)

Slide3

The steps involved (D-p4)

Step5: analyzing data

dealing with the causation issue (what causes what, and why), to avoid "subjectivity" (ch5+6)

Importance weighting: weighting the results (ch7)

Merit determination: how well your evaluand has done on the criteria (good? unacceptable?) (ch8)

Synthesis methodology: systematic methods for condensing evaluation findings (ch9)

Statistical analysis: Salkind (2007)

Slide4

The steps involved (D-p4)

Step6: result

Putting it all together: fitting the pieces into the KEC framework (ch10)

Step7: feedback

Meta-evaluation: how to figure out whether your evaluation is any good (ch11)

Slide5

The Key Evaluation Checklist (Davidson, 2005, p. 6-7)

I. Executive Summary

II. Preface

III. Methodology

1. Background & Context

2. Descriptions & Definitions

3. Consumers

4. Resources

5. Values

6. Process Evaluation

7. Outcome Evaluation

8 & 9. Comparative Cost-Effectiveness

10. Exportability

11. Overall Significance

12. Recommendations & Explanations

13. Responsibilities

14. Reporting & Follow-up

15. Meta-evaluation

Slide6

Step 1: Understand the basics of evaluation

Identify the evaluand

Background and context of evaluand

Why did this program or product come into existence in the first place?

Descriptions and definitions

Describe the evaluand in enough detail so that virtually anyone can understand what it is and what it does

How: collect background information, pay a firsthand visit, or review the literature

Slide7

Step1: Output report

Output: a one- or two-page overview of the evaluand and findings

What is your evaluand

Background and context of your evaluand

Description of your evaluand

Try to be as detailed as possible

Slide8

Step2: Defining the Purpose of the Evaluation (D-Ch2)

Who asked for this evaluation and why?

What are the main evaluation questions?

Who is the main audience?

Absolute merit or relative merit?

Slide9

Step2: Output report

Your step2 output report should answer the following questions:

Define the evaluation purpose

Do you need to demonstrate to someone (yourself) the overall quality of something?

Or do you need to find areas for improvement?

Or do you do both?

Once you have answered the questions above, figure out what your big-picture questions are:

Is your evaluation related to the absolute merit of your evaluand?

Or to the relative merit of your evaluand?

Slide10

Step3: Defining evaluative criteria

To build a criterion list, consider the following procedures:

A needs assessment

Logic model of linking the evaluand to the needs

An assessment of other relevant values, such as process, outcomes, and cost

A strategy to organize your criterion checklist

Make sure that you go into the evaluation with a well-thought-out plan so that you know what you need to know, where to get that information, and how you are going to put it together when you write up your report.

Slide11

Needs assessment

Understand the true needs of your evaluation end users (consumers or impactees)

Who are your end users?

They are the people or entities who buy or use a product or service, enroll in a training program, etc.

Upstream stakeholders (e.g., people at upper levels of the structure – managers, designers)

Immediate recipients (e.g., people who directly consume your product or service – consumers, trainees)

Downstream consumers (e.g., people who are indirectly affected by the evaluand)

Slide12

Understanding needs

Needs vs. Wants

Difference and why

A need is something without which unsatisfactory functioning occurs.

Different kind of needs

Context dependence

Conscious needs vs. unconscious needs

Needs we know and needs we do not know

Met needs vs. Unmet needs

Building a factory (increases jobs, but creates pollution)

Performance needs vs. instrumental needs

"Need to do" something for satisfactory functioning (actual problems) vs. proposed solutions

Access to email vs. a lightweight laptop

In most cases, the performance needs are considered, but not the instrumental needs

Slide13

Needs assessment method

Two phases:

Identifying and documenting performance needs

Investigating the underlying causes of performance needs

Example: Training program → Improved skills → Improved performance
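As a minimal sketch of what the logic-model step can look like in practice (an editor's illustration; the program, outcome, and need names are hypothetical, not from Davidson's text), the chain above can be encoded and checked against the documented performance needs:

```python
# Minimal sketch of a logic model: each link maps an element of the evaluand
# to the outcome or performance need it is meant to produce.
# All names are hypothetical.

logic_model = [
    ("Training program", "Improved skills"),       # activity -> immediate outcome
    ("Improved skills", "Improved performance"),   # immediate -> downstream outcome
]

# Performance needs documented in phase 1 of the needs assessment (hypothetical).
performance_needs = {"Improved performance"}

reached = {effect for _, effect in logic_model}
print("Needs the evaluand is designed to address:", performance_needs & reached)
```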

Slide14

Step3: Output report

Needs assessment

Identify consumers or impactees (e.g. Table3.2)

Identify different needs (e.g. Table3.3)

Logic model (e.g. Exhibit3.6 and Exhibit3.7)

An assessment of other relevant values, with consideration of process, outcomes, and cost (e.g. Table3.4)

Organizing your criteria

see Step4 output report

Slide15

Step4: Organizing criteria and identifying sources of evidence

When organizing your criteria, always keep the following in mind:

Process

How good are the evaluand's content and implementation?

Outcomes

How good are the impacts on immediate recipients and other impactees

Comparative Cost-Effectiveness

How costly is it? Excessive, quite high, acceptable or reasonable

Exportability

How can we extend this to other settings?

Slide16

The process evaluation checkpoint

Process evaluation

Content

What the evaluand consists of (i.e., basic components or design)

Implementation

How well or efficiently the evaluand was implemented or delivered to the consumers who needed it

Other features

Any other features that make the program good or bad, which are not covered by the first two and are not outcome- or cost-related criteria

Category                  | Subcategories and criteria | Source of evidence
Content evaluation        |                            |
Implementation evaluation |                            |
Others (if necessary)     |                            |

Slide17

The outcome evaluation checkpoint

What is an outcome?

Things that happen as a result of the program

Outcomes can affect anyone listed as consumers

How to do it

Based on the logic model in step3 (e.g. Exhibit3.6 and Exhibit3.7)

Organize them into subcategories

See Table4.3 (D-p60)

Category                                       | Subcategories and criteria | Source of evidence
Knowledge, skill and attitude gain             |                            |
Application of knowledge, skills and attitudes |                            |
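As an editor's illustration of how such a checkpoint might be organized in a small script, the sketch below groups invented outcome criteria under the two subcategories named above, each tied to a source of evidence; the criterion names and evidence sources are hypothetical.

```python
# Hypothetical outcome-evaluation checkpoint: criteria grouped into
# subcategories, each tied to a source of evidence (illustrative only).

outcome_checkpoint = {
    "Knowledge, skill and attitude gain": [
        {"criterion": "Trainees can apply the taught procedure", "evidence": "post-training test"},
        {"criterion": "Trainees report increased confidence",    "evidence": "survey"},
    ],
    "Application of knowledge, skills and attitudes": [
        {"criterion": "Procedure used correctly on the job",     "evidence": "workplace observation"},
    ],
}

for subcategory, criteria in outcome_checkpoint.items():
    print(subcategory)
    for c in criteria:
        print(f"  - {c['criterion']} (evidence: {c['evidence']})")
```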

Slide18

The comparative cost-effectiveness checkpoint

Any evaluation has to take cost into account

What are costs?

Money

Time

Effort

Space

Opportunity costs
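A minimal sketch of tallying these cost types (an editor's illustration; all figures, the hourly rate, and the outcome counts are invented) converts time into a money equivalent so that two evaluands can be compared on cost per unit of outcome:

```python
# Illustrative cost tally for comparative cost-effectiveness.
# All figures and conversion rates are invented for the example.

def total_cost(money, staff_hours, hourly_rate, opportunity_cost):
    """Rough money-equivalent of monetary, time, and opportunity costs."""
    return money + staff_hours * hourly_rate + opportunity_cost

program_a = total_cost(money=20_000, staff_hours=300, hourly_rate=40, opportunity_cost=5_000)
program_b = total_cost(money=35_000, staff_hours=150, hourly_rate=40, opportunity_cost=2_000)

outcomes_a, outcomes_b = 120, 150   # e.g., trainees reaching the performance standard

print(f"Program A: {program_a / outcomes_a:.0f} per successful trainee")
print(f"Program B: {program_b / outcomes_b:.0f} per successful trainee")
```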

Slide19

The exportability checkpoint

What elements of the evaluand (e.g., an innovative design or approach) might make it potentially valuable or a significant contribution or advance in another setting?

Slide20

Step4: Output report

Checkpoints for

Process (e.g. Table4.1, 4.2)

Outcomes (e.g., Table4.3)

Comparative Cost-Effectiveness (e.g., cost cube table)

Exportability

Short summary of potential areas for exportability

Slide21

Step5: Analyzing data

5.1 Inferring causation

5.2 Determining importance

5.3 Merit determination

5.4 Synthesis

Slide22

5.1 Certainty about causation (D-ch5)

Each decision-making context requires a different level of certainty

Quantitative or qualitative analysis

All-quantitative or all-qualitative

Sample choosing

Sample size

Mix of them

More in statistical analysis

Slide23

Inferring causation: 8 strategies

1. Ask observers

2. Check whether the content of the evaluand matches the outcome

3. Look for other telltale patterns that suggest one cause or another

4. Check whether the timing of outcomes makes sense

5. Check whether the "dose" is related logically to the "response".

6. Make comparisons with a "control" or "comparison" group (sketched below)

7. Control statistically for extraneous variables

8. Identify and check the underlying causal mechanism(s)
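As a hedged illustration of strategies 5 and 6 above, the sketch below compares mean outcomes of a program group with a comparison group and checks whether a larger "dose" goes with a larger "response"; all data are fabricated for the example, and no claim of significance is made.

```python
# Illustrative check of strategies 5 ("dose"/"response") and 6 (comparison group).
# All data are invented; a real analysis would also test significance
# (e.g., with methods from Salkind, 2007).

def mean(xs):
    return sum(xs) / len(xs)

# Strategy 6: compare the program group against a comparison group.
program_group    = [72, 75, 80, 78, 85]   # outcome scores
comparison_group = [70, 68, 74, 71, 69]
print(f"Program mean: {mean(program_group):.1f}, comparison mean: {mean(comparison_group):.1f}")

# Strategy 5: does a larger "dose" (hours of training) go with a larger "response"?
dose     = [5, 10, 15, 20, 25]
response = [70, 73, 77, 80, 84]

def correlation(x, y):
    """Pearson correlation using only the standard library."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

print(f"Dose-response correlation: {correlation(dose, response):.2f}")
```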

Slide24

5.2 Determining importance (D-ch7)

Importance determination is the process of assigning labels to dimensions or components to indicate their importance.

Different evaluations

Dimensional evaluation

Component evaluation

Holistic evaluation

Slide25

Determining importance: 6 strategies

1. Having stakeholders or consumers "vote" on importance (sketched below)

2. Drawing on the knowledge of selected stakeholders

3. Using evidence from the literature

4. Using specialist judgment

5. Using evidence from the needs and values assessments

6. Using program theory and evidence of causal linkages
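Strategy 1 can be made concrete with a small sketch (an editor's illustration; the criteria and vote counts are invented): stakeholder "votes" are normalized into importance weights and then mapped onto qualitative labels.

```python
# Illustrative importance weighting from stakeholder "votes" (strategy 1).
# Criteria names, vote counts, and label cut-offs are hypothetical.

votes = {"Content accuracy": 18, "Ease of use": 12, "Cost to consumers": 6}

total = sum(votes.values())
weights = {criterion: n / total for criterion, n in votes.items()}

def label(weight):
    """Map a numeric weight onto a qualitative importance label."""
    if weight >= 0.40:
        return "extremely important"
    if weight >= 0.25:
        return "very important"
    return "important"

for criterion, w in weights.items():
    print(f"{criterion}: weight {w:.2f} ({label(w)})")
```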

Slide26

5.3 Merit determination

It is the process of setting "standards" (definitions of what performance should constitute "satisfactory", "good", etc.) and applying those standards to descriptive data to draw explicitly evaluative conclusions about performance on a particular dimension or component.

Descriptive facts about performance

Quality or value determination guide

Evaluative conclusions

Slide27

Rubric

A rubric is a tool that provides an evaluative description of what performance or quality "looks like".

It has two levels:

A grading rubric is used to determine absolute quality or value (e.g., Table8.2; sketched below)

A ranking rubric is used to determine relative quality or value
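A minimal sketch of a grading rubric in code (an editor's illustration; the measure and cut-offs are invented and are not Davidson's Table 8.2): descriptive performance data are mapped onto absolute merit labels, while a ranking rubric simply orders evaluands on the same measure.

```python
# Illustrative grading rubric: maps a descriptive performance measure
# (here, the percentage of consumers whose need was met) onto merit labels.
# The cut-off values are hypothetical.

GRADING_RUBRIC = [
    (90, "excellent"),
    (75, "good"),
    (60, "satisfactory"),
    (0,  "unacceptable"),
]

def grade(percent_met):
    """Return the absolute merit label for a performance value."""
    for cutoff, merit in GRADING_RUBRIC:
        if percent_met >= cutoff:
            return merit
    return "unacceptable"

print(grade(82))  # -> "good"

# A ranking rubric, by contrast, orders evaluands to determine relative quality:
scores = {"Program A": 82, "Program B": 64}
print(sorted(scores, key=scores.get, reverse=True))  # -> ['Program A', 'Program B']
```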

Slide28

5.4 Synthesis methodology

Synthesis is the process of combining a set of ratings or performances on several components or dimensions into an overall rating.

Quantitative synthesis

Using numerical weights

Qualitative synthesis

Using qualitative labelsSlide29
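Before turning to the qualitative example on the next slide, here is a minimal sketch of the quantitative route described above (an editor's illustration; the dimensions, weights, and ratings are invented): each dimension rating is multiplied by its importance weight and the products are summed into one overall rating.

```python
# Illustrative quantitative synthesis: weighted sum of dimension ratings.
# Dimension names, weights, and ratings are hypothetical.

dimensions = {
    # dimension: (importance weight, rating on a 1-5 scale)
    "Process quality":    (0.2, 4),
    "Outcomes":           (0.5, 3),
    "Cost-effectiveness": (0.3, 5),
}

overall = sum(weight * rating for weight, rating in dimensions.values())
print(f"Overall rating: {overall:.1f} out of 5")   # 0.2*4 + 0.5*3 + 0.3*5 = 3.8
```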

Qualitative (nonnumerical) weighting example 2

Dimension by dimension

Layer by layer

[Diagram: Sub-dimensions 1–4 are rated first, then rolled up into ratings for Dimension 1 and Dimension 2, which are combined into an overall rating]
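The same rollup can be done qualitatively, with explicit rules instead of arithmetic. The sketch below is an editor's illustration only; the labels and the "worst label wins" rule are assumptions, not Davidson's method.

```python
# Illustrative qualitative synthesis: labels are rolled up layer by layer
# using an explicit rule instead of numerical weights.

ORDER = ["poor", "adequate", "good", "excellent"]   # worst to best

def roll_up(labels):
    """Combine labels with a simple, made-up rule: the combined rating
    equals the worst label in the group."""
    return min(labels, key=ORDER.index)

dimension_1 = roll_up(["good", "excellent"])   # from sub-dimensions 1 and 2
dimension_2 = roll_up(["adequate", "good"])    # from sub-dimensions 3 and 4
overall     = roll_up([dimension_1, dimension_2])

print(dimension_1, dimension_2, overall)       # good adequate adequate
```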

Slide30

Step 6: Result

Putting it all together: fitting the pieces into the KEC framework (ch10)

Now we are ready to write our evaluation report.

Slide31

Step 7: Feedback (optional)

Meta-evaluation: how to figure out whether your evaluation is any good (ch11)

Slide32

Related links

KEC

http://www.wmich.edu/evalctr/checklists/kec.htm

http://www.wmich.edu/evalctr/checklists/kec_feb07.pdf

Questionnaire examples

http://www.go2itech.org/HTML/TT06/toolkit/evaluation/forms.html

http://enhancinged.wgbh.org/formats/person/evaluate.html

http://www.dioceseofspokane.org/policies/HR/Appendix%20II/SampleForms.htm

http://njaes.rutgers.edu/evaluation/