Finding the Value in Evaluation


Presentation Transcript

Slide1

Finding the Value in Evaluation

What should a Noyce director expect from program evaluation?

July 2011

Susan Tucker, E&D Associates LLC
Davida Fischman, CSU-San Bernardino

Slide2

Who are we?

Davida Fischman: Research mathematician turned mathematics educator
17 years teaching pre-service (elementary and secondary) teachers; 10 years working with in-service teachers in small and large grants
Co-designer and Coordinator of the CSUSB MA in Teaching Math program

Susan Tucker
25 years as educational program evaluator
20 years teaching educational program evaluation and working in teacher education programs
Experience as K-12 teacher, principal, associate superintendent, university professor, PI and grant director, grant writer

2

Slide3

Who is in the room? Complete Mobile Survey #1

How "old" is your project?How many grant-funded projects have you managed?What experience do you have in project evaluation?What are your goals/expectations of this session?TEXT to: 96625Message: E&D1

3

Slide4

Agenda

From the PI perspective
What is program evaluation?
Negotiating a good evaluation plan
Tools for evaluation
Data collection
Using evaluation results
Tips for PIs
Resources

4

Slide5

From a PI... A Changed Perspective

First thoughts: 10-12%?? What for??
Then...
Using a new survey of Noyce Scholars and Mentors to modify next year's work
Using formative evaluation from an NSF MSP project to inform program decisions on an ongoing basis
The add-on evaluation of Noyce Scholars and Mentors became a tool we continue to use to learn about participants' needs and make adaptation decisions
Now using surveys for formative assessment in university classes as well
Today… a much better understanding of the value of evaluation, and ways it can improve the project.

5
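To make the survey-driven feedback loop described above concrete, here is a minimal sketch of tallying Likert-scale items from a scholar/mentor survey and flagging items that might prompt program adjustments. The file name, column names, and the 3.5 threshold are hypothetical; this is an illustration, not the project's actual instrument.

```python
# Hypothetical sketch: summarize Likert-scale survey items (1-5) from a
# scholar/mentor survey and flag items that may need attention.
# File name, column names, and the 3.5 threshold are illustrative only.
import csv
from statistics import mean

def summarize_survey(path, item_columns, flag_below=3.5):
    """Return {item: mean score} and a list of items scoring below the flag."""
    scores = {item: [] for item in item_columns}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for item in item_columns:
                value = row.get(item, "").strip()
                if value:  # skip blank responses
                    scores[item].append(int(value))
    means = {item: mean(vals) for item, vals in scores.items() if vals}
    flagged = [item for item, m in means.items() if m < flag_below]
    return means, flagged

# Example (hypothetical file and item names):
# means, flagged = summarize_survey(
#     "scholar_survey_2011.csv",
#     ["mentoring_support", "coursework_relevance", "placement_readiness"],
# )
# print(means, flagged)
```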

Slide6

Define: “Program Evaluation”

Think and jot down notes:
What do you get from your Noyce evaluation today?
What more do you want?
Changing views of evaluation: Prove vs. Improve

6

Slide7

Warm up/Introductions/Review

7

Who are you? What do you do?

What disciplines, connections, experiences do you bring into evaluation? How do they help you think about evaluation?

What previous backgrounds or experiences do you bring that might assist you in maximizing the value of evaluation?

How do you currently think about the role of evaluation in your Noyce project?

Strengths

Challenges/Frustrations

Slide8

Agree or Disagree about the characteristics of a good evaluator:

…is part facilitator, part researcher, part manager, and part program specialist
…is external to the program being evaluated
…designs an evaluation to determine if a program is well managed
…negotiates questions of relevance to multiple audiences (need to know vs. nice to know)
…is collaborative in terms of designing and implementing the evaluation plan
…helps a project reflect on the quality of its objectives
…helps a project look at more than just whether its goals have been met
…helps staff develop a logic model that describes how a program's components relate to each other and to the overall goals and objectives
…selects/aligns an evaluation model to complement the project's logic model
…develops a plan to determine if a program is meeting its goals & objectives
…is concerned about how useful the evaluation is to project stakeholders

8

Slide9

Program Evaluation Defined?

"Program evaluation is the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming" (Patton, 2002).

9

Slide10

Does your Evaluation Do this?

1. Prepare for the evaluation
2. Engage stakeholders
3. Identify purpose of the evaluation
4. Negotiate the right questions
5. Co-design the evaluation
6. Select and adapt instrumentation
7. Collect the data
8. Analyze the data
9. Disseminate and use the results

10

Slide11

What Can an Evaluation Tell Us?

11

What is working

How to improve

Support for evidence-based decision-making

Are we achieving the results that we were hoping for? Are results being produced as planned at all levels of our logic model?

Relevance: Do stakeholders care about what we are doing? Are we solving the problem?

Cost-Effectiveness: Are we getting value for money?

Slide12

Many models of evaluation…a few examples popular in education

Scientific-experimental models: objectives-based research orientation
Management models: Stufflebeam's CIPP model
Anthropological/Qualitative models: Stake's responsive model; looking at intended & unintended outcomes
Participant-oriented models: Fetterman's empowerment model

12

Slide13

Qualities of an evaluator

Formal education (preferably in evaluation)
Experience (with program improvement)
Evaluation philosophy that complements the management team and the grant's principles
Communication skills
Recommendations and resume
Understands the culture of the target population(s)

13

Slide14

Start with the Right Questions…

14

Include questions of relevance to stakeholders.
Explore what makes questions "relevant".
Determine what will be accepted as evidence in seeking answers to the questions.
Examine whose voices are heard in the choice of questions and evidence.

Slide15

Task 1: Are the following “good” evaluation questions?

15

What are the goals and activities of the teacher certification programs in which the Noyce grant is housed?

What is the "value added" of your Noyce program?

How do stakeholders perceive the Noyce Program and Noyce recipients?

What are the characteristics of the schools in which Noyce recipients teach?

What are the relationships between characteristics of the Noyce Program, types of Noyce recipients, and recipients' plans to go into/stay in teaching and leadership roles?

What is the impact of Noyce on teacher:
Recruitment?
Retention?
Teacher effectiveness?

Slide16

Design the Evaluation Plan

16

Negotiate the plan with multiple stakeholders
#1. Brainstorm the important questions to ask
#2. Probe the rationale/values behind each question
Build a design appropriate to both the evaluation questions and the cultural context
Align evaluation questions to the project logic model
#3. Worry last about how & what & when to measure
Seek culturally appropriate methods that combine qualitative and quantitative approaches.
Collect data at multiple points in time, extending the time frame of the evaluation as needed.

Slide17

Major Questions Asked: Contexts; Inputs (resources, expectations); Processes (individual, group); Products (anticipated, unanticipated)

Sources of Evidence: qualitative, quantitative, and mixed, for each question type

Quality Criteria/Standards: improvement oriented; holistic; 360 degrees represented; assumptions overt; culturally responsive; polemical (inductive & deductive); long & short term; local & research-based knowledge/wisdom

Slide18

Identify quality criteria…some examples

Persistence and success on a STEM trajectory, from teacher prep to teaching jobs
Retention
Changes in teacher pedagogy and content knowledge
Willingness to teach STEM classes
Obtaining advanced training in teaching in STEM-related areas

18

Slide19

Summative vs. Formative Evaluation

Formative Evaluation
Purpose: Program improvement
General question: Is this educational program being implemented as planned to achieve the set goals?
Specific questions: What are the strengths and weaknesses? What is working and not working? Why is it working or not working? How should it be improved?

Summative Evaluation
Purpose: Program accountability; judgment of overall worth and value
General question: Did this educational program contribute to the planned impact and justify the resources used?
Specific questions: What are the program results? Did the intended audience benefit from the program? Was the program cost-effective? Is it worth continuing this program?

19

Slide20

Question Types & Data Techniques

20

Descriptive
Data techniques: available data; case studies; statistical surveys
Example: How did the program unfold? To what extent was the project coherent from the start?

Normative
Data techniques: available data; case studies; statistical surveys; cost-benefit analysis; cost-effectiveness analysis
Example: How satisfactory was the placement rate after training?

Impact-focused
Data techniques: search for causal relations; statistical analysis; forecast analysis
Example: Did the "Noyce Package" support successful hiring and retention of Noyce participants?
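As a minimal illustration of the normative questions above (placement rate, cost-effectiveness), the following sketch runs the arithmetic on made-up cohort numbers; the figures and function names are hypothetical.

```python
# Hypothetical sketch: compute a placement rate and a simple
# cost-effectiveness ratio for a cohort. All numbers are illustrative.
def placement_rate(placed, completed):
    """Share of program completers who were placed in teaching positions."""
    return placed / completed if completed else 0.0

def cost_per_placement(total_program_cost, placed):
    """Program cost per teacher successfully placed."""
    return total_program_cost / placed if placed else float("inf")

# Example with made-up cohort numbers:
completed, placed = 24, 21
print(f"Placement rate: {placement_rate(placed, completed):.0%}")          # 88%
print(f"Cost per placement: ${cost_per_placement(480_000, placed):,.0f}")  # $22,857
```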

Slide21

Context Questions:
Why was the program planned?
How were participants selected?
How was the program/PD planned?
What is the setting?
What resources are available?

Process Questions (Formative):
How do major ongoing events match original expectations?
How effective are resources?
How are dissemination/institutionalization efforts emerging?
What policy issues are emerging?

Product Questions (Summative):
1. To what degree have project goals been met?
2. What is transferable or replicable as a result of the project?
3. What are the recommendations regarding prerequisites for success, selection criteria, training content, processes/sequencing, resources, follow-ups, R&E?

21

Slide22

Match with Evaluation Criteria


Evaluation criteria: coherence, relevance, efficiency, effectiveness, impact, sustainability, replicability

Logical structure of the project: goals, purposes, main objective, direct results, activities, means

22

Slide23

Questions @ Questions

23

What/whose perspectives are represented in the evaluation questions?

What other questions might have been posed?

Whose perspectives are accepted as credible evidence? Credible to whom?

How well does the time frame in this study match the needs and rhythms of this context?

Slide24

Mobile survey: #2

TEXT to: 96625
Message: E&D1

24

Slide25

Agree/Disagree @ evaluators:

…collects both qualitative and quantitative data
…has a firm grasp on educational research strategies
…collects data that are actionable: answers provide the information needed for decision-making
…is seen but not heard, except at the end of each year to write annual reports
…designs data-collection forms, procedures, and databases to capture and record the data collected
…is culturally competent and responsive to the unique needs of a project
…analyzes data in timely ways to help a project improve as it develops
…clearly distinguishes between descriptions and judgments when presenting findings
…makes recommendations to the program regarding ways to improve
…works with the project staff to disseminate the findings
…asks questions about sustainability and institutionalization early and often

25

Slide26

Collect the Data

26

Be holistic: collect qualitative & quantitative data

Be responsive to cultural contexts

Tap into internal & external evaluation

Triangulate vs. one-shot data

It usually takes 3-6 months of evaluation planning & prep before data collection can begin

Slide27

Analyze the Data

27

Consider context/inputs and resources as a necessary component of interpretation.

Disaggregate data to examine diversity within groups.

Examine outliers, especially successful ones.

A "cultural" interpreter may be needed to capture nuances of meaning.

Stakeholder review panels can assist in accurate interpretation.

Confirm the accuracy of analysis & interpretation before making judgments.
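A minimal sketch of the disaggregation and outlier checks suggested above, assuming participant outcomes live in a pandas DataFrame; the column names ("subgroup", "retention_years") and the two-standard-deviation cutoff are assumptions for illustration.

```python
# Hypothetical sketch: disaggregate an outcome measure by subgroup and
# flag outliers (here, unusually successful cases) for follow-up.
# Column names and the 2-SD cutoff are illustrative only.
import pandas as pd

def disaggregate_and_flag(df, group_col="subgroup", outcome_col="retention_years"):
    grouped = df.groupby(group_col)[outcome_col]

    # Summary statistics within each subgroup, not just overall averages
    summary = grouped.agg(["count", "mean", "std"])

    # Flag cases more than 2 standard deviations above their subgroup mean
    z = (df[outcome_col] - grouped.transform("mean")) / grouped.transform("std")
    outliers = df[z > 2]

    return summary, outliers

# Example:
# summary, successes = disaggregate_and_flag(participants_df)
# print(summary); print(successes)
```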

Slide28

Disseminate & Use the Results

28

Inform a wide range of stakeholders.

Cultural sensitivity and responsiveness increase both the truthfulness and utility of the results.

Involve/engage a variety of stakeholders.

Find and train information users.

The personal factor greatly accelerates evaluation use.

Make use consistent with the purpose of the evaluation.

Situate the interpretation of results.

Use results to make decisions about program improvement.

Slide29

Task:

29

What data collection procedures are you considering in designing next year's Noyce evaluation?

Existing data

New data collection plans

Slide30

Tools that help:

EX 1: Logic modeling
A planning tool
A flow diagram of your program, with defined goals, inputs, outputs, and outcomes connected through causal links
A visual representation of what and how a program produces its outcomes

30

Slide31

Program Logic Model

Resources

Activities or inputs

Products or outputs

Short-Term (immediate) Outcomes (knowledge, skills, and abilities, changes in the environment)

Mid-Term Outcomes (behavior change, application of new skills/tools, impacts on environment)

Long-Term Outcomes (results or change/improvement in issue or effectiveness)

Planning

Implementation and Management

31
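One lightweight way to work with a logic model of this shape is to hold each level in a small data structure so evaluation questions can be tagged to the level they address. The sketch below is illustrative; the example entries are hypothetical rather than drawn from an actual Noyce project plan.

```python
# Hypothetical sketch: represent a program logic model as linked stages so
# evaluation questions can be aligned to each level. Entries are examples,
# not an actual Noyce project plan.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    resources: list = field(default_factory=list)            # inputs
    activities: list = field(default_factory=list)
    outputs: list = field(default_factory=list)               # products
    short_term_outcomes: list = field(default_factory=list)
    mid_term_outcomes: list = field(default_factory=list)
    long_term_outcomes: list = field(default_factory=list)

noyce_model = LogicModel(
    resources=["Noyce scholarships", "mentor teachers", "IHE faculty time"],
    activities=["recruit STEM majors", "summer internships", "mentoring"],
    outputs=["scholars certified", "placements in high-need schools"],
    short_term_outcomes=["gains in content and pedagogical knowledge"],
    mid_term_outcomes=["retention in high-need schools"],
    long_term_outcomes=["improved student achievement in partner schools"],
)

# Each evaluation question can then be tagged with the level it addresses,
# e.g. "Are results being produced as planned at all levels of our logic model?"
```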

Slide32

Ex 2: Data Wise: 3 stages, 8 steps (Boudett et al., 2005)

Stage I: PREPARE

1. Organize for Collaborative Work
2. Build Assessment Literacy

Stage II: INQUIRE
3. Create a Data Overview
4. Dig into Data
5. Examine Instruction

Stage III: ACT
6. Develop an Action Plan
7. Plan to Assess Progress
8. Acting and Assessing

32

Slide33

EX 3: Some Evaluation Methods

Kirkpatrick's evaluation levels: 1 Reaction, 2 Learning, 3 Behavior, 4 Results

Methods, each matched to one or more of the levels:
Survey (all four levels)
Questionnaire/Interview (all four levels)
Focus Group (all four levels)
Knowledge Test/Check (one level)
Work Review (two levels)
Skills Observation (two levels)
Presentations / Teach-back (two levels)
Action Planning (three levels)
Action Learning (one level)
Key Business HR Metrics (one level)

33
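A simple lookup table can keep track of which data-collection methods speak to which Kirkpatrick level. In the sketch below, the level assignments for the partially mapped methods are assumptions for illustration, not a reading of the original slide's matrix.

```python
# Hypothetical sketch: map data-collection methods to the Kirkpatrick levels
# they can inform (1 Reaction, 2 Learning, 3 Behavior, 4 Results).
# Level assignments below are illustrative assumptions, not the slide's exact matrix.
KIRKPATRICK_LEVELS = {1: "Reaction", 2: "Learning", 3: "Behavior", 4: "Results"}

METHOD_LEVELS = {
    "Survey": {1, 2, 3, 4},
    "Questionnaire/Interview": {1, 2, 3, 4},
    "Focus Group": {1, 2, 3, 4},
    "Knowledge Test/Check": {2},        # assumed
    "Skills Observation": {2, 3},       # assumed
    "Action Planning": {2, 3, 4},       # assumed
}

def methods_for_level(level):
    """Return the methods that can provide evidence at a given level."""
    return [m for m, levels in METHOD_LEVELS.items() if level in levels]

# Example: which methods could show behavior change (level 3)?
# print(methods_for_level(3))
```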

Slide34

Impacts of Noyce Projects

Number of K-12 science and math teachers trained
Number of students directly impacted
Number of partner schools involved
Gains in teacher content knowledge
Gains in student achievement
Improved teaching strategies
Increased student achievement
Increased funding for science supplies & equipment for the region's schools
IHE faculty visiting schools on a regular basis
Creating innovative course processes/materials

34

Slide35

Design Challenges

Based on faulty logic: selected strategy or activities cannot make the intended changes
Failure to connect with the target population(s): do not reach them; do not resonate with them; not understood by them
Failure to be well implemented
Settings inappropriate: incompatibility between program and delivery setting
Unrealistic (untested) expectations

35

Slide36

Challenges: Assessment issues

Measuring problem-solving skills
Statistical significance and adequate sample size
Cost of some assessment methods: how much should projects spend?
Details: effective controls in matched comparisons
Longitudinal effects may be important but realized "down the road"
Tests are not always the best measure of student achievement
Hard to "analyze" qualitative impact data
Standard test culture is focused on "factual knowledge"

36

Slide37

Tips for Noyce Evaluators

To get the most out of program evaluation, you need to figure out what questions you want answered.
The goals of each component of the evaluation process need to be clear in your mind; often these can be negotiated with your evaluator, but they should certainly be laid out clearly and discussed.
PI-evaluator team: review the evaluation plan annually.
Use the information you get! Even if it seems that you have wonderful rapport with participants, they might look at things differently when a third party asks the questions, and you'll learn more.

37

Slide38

References

Boudett, K., City, E., and Murnane, R. J. (Eds.) (2005). Data Wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.
Gadja, R. Community of Practice Collaboration Rubric: www.asdk12.org/depts/cei/about/communities/CollaborationRubric.pdf
Kirkpatrick, D. L. and Kirkpatrick, J. D. (2006). Evaluating Training Programs (3rd ed.). San Francisco, CA: Berrett-Koehler Publishers.
Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.), p. 10. Thousand Oaks, CA: Sage Publications.
Patton, M. Q. (2003). Qualitative Evaluation Checklist: www.wmich.edu/evalctr/checklists
Scriven, M. (1967). The methodology of evaluation. In R. E. Stake (Ed.), Curriculum evaluation. American Educational Research Association Monograph Series on Evaluation, No. 1, pp. 39-83. Chicago: Rand McNally.
Wholey, J. S., Hatry, H. P., and Newcomer, K. E. (2010). Handbook of practical program evaluation (3rd ed.), pp. 5-60. San Francisco: Jossey-Bass.

38

Slide39

References

W. K. Kellogg Foundation Evaluation Toolkit: http://ww2.wkkf.org/default.aspx?tabid=75&CID=281&NID=61&LanguageID=0
W. K. Kellogg Foundation Evaluation Handbook (1998): http://www.publichealth.arizona.edu/chwtoolkit/PDFs/Logicmod/chapter1.pdf
The Centers for Disease Control and Prevention provides a set of evaluation resources in a variety of topical areas, available at http://www.cdc.gov/eval/resources.htm
Program Development and Evaluation (University of Wisconsin-Extension): http://www.uwex.edu/ces/pdande/evaluation/
Worthen, B. R., Sanders, J. R., and Fitzpatrick, J. L. (1997). Program evaluation: Alternative approaches and practical guidelines (2nd ed.), p. 7. New York: Longman Publishers.

39