Fairness, Accuracy, & Consistency in Assessment - PPT Presentation
Uploaded by yoshiko-marsland on 2019-11-06

Presentation Transcript

Fairness, Accuracy, & Consistency in Assessment
NCATE’s recommendations to reduce bias and ensure fairness to students

Fairness
“Assess what’s been taught”
Candidates should be aware of the knowledge, skills, and dispositions which are measured in the assessments.
- Was it taught? A curriculum map shows where students learn and practice what is assessed.
- Do students understand the expectations? Syllabi contain instructions and timing of assessments; rubrics/scoring guides are shared with students.

Fairness
According to these guidelines, are your program’s assessments FAIR?
- Is your curriculum map up to date?
- Does the curriculum map link to all professional standards and GSE standards and dispositions?
- Does the curriculum map indicate any gaps?
- Do syllabi indicate timing of assessments?
- Are rubrics/scoring guides shared with students when assigning the work?

Accuracy
“Assessments measure what they say they measure”
Assessments should be aligned with the standards and proficiencies they are designed to measure.
- Are assessments aligned with standards? Content match, complexity match, appropriate degree of difficulty
- Is there corroborating evidence?
- Is there field input on the assessment?

Accuracy
According to these guidelines, are your program’s assessments ACCURATE?
- Do your program’s assessment types (test, observation) match what is being assessed? Dispositions = observations; skills = performance assessment
- Which of your assessments “validate” (relate to) other assessments? Ex.: work sample to lesson plan assignment
- Have you had input from working professionals?

Consistency
“Assessments produce dependable, trustworthy results”
Assessment results should be reliably consistent regardless of time and rater.
- Are scoring tools sufficiently descriptive? Language differentiates between components and between performance levels; clear descriptors promote accurate scoring; students can tell from the scoring tool why they were rated at a certain level.
- Are raters trained? Raters agree on what “a 3 in lesson planning” looks like; raters understand the consequences of final scores; the program has a plan to support and address students with insufficient performance, which alleviates rater pressure.

Consistency
According to these guidelines, are your program’s assessments CONSISTENT?
- Does your program consistently use descriptive rubrics for major assignments and performance observations?
- Does your program have regular rater training?
- Has your program engaged in rubric calibration and moderation activities?
- Are raters aware of how their assessment’s scores affect the student?
- Are raters aware of how their assessment’s scores contribute to program review?
- What is the plan to support struggling students?

Avoiding Bias
“Removing contextual and cultural bias from assessments”
The assessment itself and the assessment context should be analyzed for factors that would affect performance.
- Are clear assessments administered in the proper environment? Location/equipment; clear instructions/questions
- Have assessments been reviewed for bias? Racial/ethnic/cultural stereotypes; Disability Resource Center review; assignments that favor one group over another

Avoiding Bias
According to these guidelines, are your program’s assessments BIAS-FREE?
- Have all key assessments been reviewed for clarity of expectations?
- Have all your assignments been reviewed for accessibility?
- Have all your assignments been scrutinized for cultural bias and stereotypes?
- Has your program analyzed student outcomes by subgroup to determine whether consistent scoring bias exists?
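The subgroup analysis suggested above can start as simply as comparing average rubric scores across groups. A minimal sketch (the group labels and scores are invented for illustration; a real review would also weigh sample sizes and apply a significance test before concluding bias exists):

```python
from collections import defaultdict
from statistics import mean

def mean_score_by_group(records):
    """Average rubric score per subgroup from (group, score) pairs."""
    buckets = defaultdict(list)
    for group, score in records:
        buckets[group].append(score)
    return {group: mean(scores) for group, scores in buckets.items()}

# Hypothetical (subgroup, score) records on a 1-4 rubric scale
records = [("A", 3), ("A", 4), ("A", 3), ("B", 2), ("B", 3), ("B", 2)]
print(mean_score_by_group(records))
# A large, persistent gap between subgroup averages is a signal
# worth investigating, not proof of bias on its own.
```

A fuller analysis would repeat this per assessment and per rater, since a gap confined to one rater points to training needs rather than the instrument itself.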

GSE Rubric Guidelines
Developed by the Assessment Committee (2010)
- 4 levels of competency: Unsatisfactory, Emerging, Proficient, Exemplary
- Levels appear in ascending order from left to right
- If numbers are used: 1-4 from left to right

Sample rubric row - “States values, beliefs and assumptions about priorities for resource allocation”:
NEEDS IMPROVEMENT (1): Statement of values, beliefs and assumptions is absent.
EMERGING (2): Statement of values, beliefs, and assumptions is vague, too general, or contrived.
PROFICIENT (3): Statement of values, beliefs, and assumptions is clearly defined and specific.
EXEMPLARY (4): Statement of values, beliefs, and assumptions is clear, specific, convincing, and includes personal experience that promotes clarity.

Rubric Moderation
A process of strengthening consistency: develops inter-rater reliability through shared examination and discussion of student work. Involves all/many raters.
Process:
1. Recruit raters for a two-hour session.
2. Provide 4 samples of work.
3. Provide the rubric and have raters “score” each sample.
4. Discuss as a group why the raters chose the scores.
5. Debrief about what was learned and what remains unanswered.

Rubric Calibration
A process of strengthening consistency: develops inter-rater reliability by setting expectations of what the scores mean regarding student work. Involves all/many raters.
Process:
1. Recruit raters for a two-hour session.
2. Provide 4 samples of work that have been pre-scored (anchor papers), at least one each at a low, medium, and high level of performance.
3. Discuss rubric areas and expectations for each level and component before scoring begins.
4. Provide the rubric and have raters “score” each sample with that discussion in mind.
5. Have raters compare the scores they assigned to the “anchor” papers.
6. Debrief about what was learned and what remains unanswered.
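Both moderation and calibration aim at inter-rater reliability, which can be quantified once the session’s scores are collected. A minimal sketch using pairwise exact agreement (the rater scores are hypothetical, and this simple statistic is one common choice, not one prescribed by the presentation; Cohen’s kappa is a stricter alternative that corrects for chance agreement):

```python
from itertools import combinations

def exact_agreement(scores_by_rater):
    """Fraction of pairwise rater comparisons with identical scores.

    scores_by_rater: one list per rater, one score per work sample.
    """
    n_samples = len(scores_by_rater[0])
    agree = total = 0
    for a, b in combinations(scores_by_rater, 2):
        for i in range(n_samples):
            total += 1
            agree += (a[i] == b[i])
    return agree / total

# Hypothetical 1-4 rubric scores from three raters on four samples
raters = [
    [1, 2, 3, 4],
    [1, 2, 3, 3],
    [2, 2, 3, 4],
]
print(exact_agreement(raters))  # 8 of 12 pairwise comparisons agree ≈ 0.667
```

Running the same computation before and after a calibration session gives the program a concrete measure of whether the discussion actually tightened scoring.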

Next Steps
- How is your program doing in providing fair, accurate, consistent, bias-free assessments to students?
- What work needs to be done in your program to ensure quality assessments are used?
- What do you need to be able to accomplish this?