Slide 1
Panel 4: Testing Integrity Practices and Procedures for Online and Computer-based Assessments
Panelists
Wayne Camara: College Board
John Fremer: Caveon Test Security
Wes Bruce: Indiana Department of Education
Tony Alpert: SMARTER Balanced Assessment Consortium

Slide 2
Testing Integrity Practices and Procedures for Online and Computer-based Assessments
Wayne J. Camara, College Board

Slide 3
CBT vs. Paper
Online testing offers numerous advantages over P&P testing, including features that can improve test security. As with all assessments, the intended purpose and potential consequences suggest the types of threats to test integrity we need to focus on.
Threats to all assessments:
item exposure,
candidate authenticity,
data transmission & storage,
proctor and personnel integrity,
system integrity (prevent interruptions and irregularities)

Slide 4
Assessment Purposes and Threats to Testing Integrity
Cheating increases with age of student, bandwidth & distance (Rowe, 2004).
Summative assessments – different threats emerge for different intended uses of scores:
School and district accountability
Student rewards (endorsed diploma, entry into college credit bearing course)
Teacher and educator accountability (financial incentives or penalties, disciplinary-based actions)
Student barriers (graduation, retention, mandatory developmental programs, college remediation courses)

Slide 5
Testing Integrity: Unique Risks with CBT
For each risk area, the first entry is the threat under student high stakes; the second, under teacher/educator accountability.

Extended testing window
Student high stakes: Students disclose items/tasks to other students or post them on social networks for those testing later in the window.
Teacher/educator accountability: Educators provide instruction on specific tasks to aid students testing later in the window.

Performance tasks
Student high stakes: Easier to recall; more difficult to create comparable tasks.
Teacher/educator accountability: Provide procedural solutions; instruction targeted to tasks.

Reuse of items – exposure rate
Student high stakes: Greater chance of intentional or unintentional disclosure.
Teacher/educator accountability: If reused over years (pre-testing, equating), greater risk of teaching to the task/item.

Testing environment – CBT lab
Student high stakes: Easier to hide prohibited materials behind the screen; use of handheld devices to cheat.
Teacher/educator accountability: Teacher monitors and changes student responses. Small-group settings present greater challenges.

Assisting students during testing
Student high stakes: Same risk as P&P unless items are spiraled; privacy carrels needed.
Teacher/educator accountability: Teacher views student progress and responses – may offer hints to an individual student or group.

Technology
Student high stakes: Machine allows access to external web resources?
Teacher/educator accountability: NA

Slide 6
Processes/policies that could mitigate risks to the integrity of CBT test results
Processes and policies must be tailored to the types of risks or threats to test integrity that are anticipated based on the intended use, stakes, and consequences for schools, students, and educators.
Reduce the risk of item exposure – extended testing windows with the same form present the biggest security threat when tests are used for high stakes:
More robust item banks and spiraling
Use of multistage adaptive models
Linear forms require more forms for the same testing window, or single use
Reuse of items operationally, for equating, or for pretesting:
Reused scenarios, simulations, or extended performance tasks can more easily be captured and hence lose validity when exposed for any length of time.
Limit disclosure and reuse over several years.
Limit reuse of performance tasks (extended multi-year window without release, or develop hundreds of tasks to pool from).
Limit retesting – different forms/item pools.

Slide 7
Recommendations: Processes/policies that could mitigate risks to the integrity of CBT test results
Administration and Scoring
Reduce the opportunity for cheating – send the message that cheating is not tolerated.
Classroom teachers should not administer tests to students in their own classes – there is simply too much temptation.
Proctors should have 'no stake' in the outcome, or risk collusion.
The environment should preclude copying responses from students seated adjacent (spiraling, different forms, or some physical obstruction); document seating and proctors (a spiraling sketch follows this list).
Mandatory training of proctors and administrators handling test materials; verify understanding of appropriate test procedures and the consequences of unauthorized procedures.
Students read and sign a statement like an honor code or integrity policy.
Prohibit all handheld electronic devices (smartphones, calculators).
Employ a variety of item formats & constructed-response tasks to reduce the ease of cheating.
Impose conditions on retest opportunities – beware of students unplugging equipment to restart or retest.

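To make the spiraling-and-seating bullet concrete, here is a minimal sketch of one way to spiral forms across a lab seating grid so that no left/right or front/back neighbors share a form. The function, grid layout, and form labels are illustrative assumptions, not any program's actual procedure.

```python
# Minimal sketch: offset each row by one so that both horizontal and
# vertical neighbors receive different forms (true for any F >= 2 forms).
def spiral_forms(rows, cols, forms):
    """Assign a form label to every seat in a rows x cols lab."""
    f = len(forms)
    if f < 2:
        raise ValueError("Spiraling needs at least two forms.")
    return [[forms[(r + c) % f] for c in range(cols)] for r in range(rows)]

if __name__ == "__main__":
    for row in spiral_forms(4, 6, ["A", "B", "C"]):
        print(" ".join(row))  # A B C A B C / B C A B C A / ...
```
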
Slide 8
Recommendations: Processes/policies that could mitigate risks to the integrity of CBT test results
Technology
Prepare for the unexpected – it will occur.
Ensure students cannot access web resources (outside the system).
Items and data are encrypted and stored on secure servers (not desktops).
Paper forms use different item banks, with a chain of custody established.
Audit social networks, school preparation, blogs.
Ensure high system reliability – outages, interruptions, and irregularities may require candidates to stop and start, retest, or complete paper forms.
Guard against 'sniffers' that decipher and read items/responses, and attempts to have test administrators disclose passwords (McClure et al., 2001).
Disable network capabilities and printers. Conduct formal web crawling before/after (see the sketch after this list).
Use intrusion detection software to catch attacks before they occur.
Back up the grade book or roster in case of attack or changes.

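As a loose illustration of the web-crawling bullet, the sketch below scans already-collected page text (from a crawl or social-network audit, which happens elsewhere) for distinctive word sequences lifted from secure items. All names and the shingle length are assumptions for illustration.

```python
# Minimal sketch of a web-monitoring check: flag secure items whose
# distinctive n-word phrases appear verbatim in scraped page text.
import re

def _shingles(text, n=8):
    """Overlapping n-word sequences, lowercased and normalized."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def leaked_items(items, page_text, n=8):
    """items: item id -> secure item text. Returns ids found in the page."""
    page = " ".join(re.findall(r"[a-z0-9']+", page_text.lower()))
    return [item_id for item_id, text in items.items()
            if any(phrase in page for phrase in _shingles(text, n))]
```
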
Statistical
Checks on aberrance rates, retest or score volatility statistics (individual, site) – does the data conform to test response models?
Check for irregular latencies and response patterns (a latency sketch follows this list).
High/low aberrance scores, cheating indices, thresholds (Impara et al., 2005)
Distance assessments – when online performance exceeds traditional tests, have some traditional assessments (Rowe, 2004).

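To make the latency check concrete, here is a minimal sketch that flags examinees whose typical per-item response time falls far below the group norm – a pattern consistent with item pre-knowledge. Operational programs pair screens like this with model-based person-fit (aberrance) statistics; the function name, data shape, and z-score cutoff are illustrative assumptions.

```python
# Minimal sketch of an irregular-latency screen. Real programs use
# IRT-based person-fit statistics; this is only a z-score comparison
# of each examinee's median per-item time against the group.
from statistics import mean, median, stdev

def flag_fast_responders(latencies, z_cut=-3.0):
    """latencies: examinee id -> list of per-item response times (seconds)."""
    medians = {sid: median(times) for sid, times in latencies.items()}
    mu, sigma = mean(medians.values()), stdev(medians.values())
    if sigma == 0:
        return []
    return [sid for sid, m in medians.items() if (m - mu) / sigma <= z_cut]
```
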
Slide 9
National Council on Measurement in Education (NCME) Draft Guidelines on Testing & Data Integrity
Data integrity is a shared ethical and professional requirement. Develop and implement a comprehensive data integrity policy, explain why it is important, and tailor it to the use of the test.
Training for all levels with examples of unacceptable behaviors (nondisclosure, confidentiality, participation forms)
Proactive prevention – eliminate opportunities.
Comprehensive data collection and maintenance.
Comprehensive policies for reporting cheating, security breaches, suspicious activities (dB & investigations).
Biometrics, data forensics, statistical patterns, etc.

Slide 10
Thank you
Wayne Camara, wcamara@collegeboard.org

Slide 11
NCES Sponsored Symposium on Testing Integrity
Dr. John Fremer
February 28, 2012
President, Caveon Consulting Services

Slide 12
State Assessments in Transition
The Perfect Test Security Storm

Slide 13
State assessments face an impending Perfect Test Security Storm:
mandated assessments tied to federal funding
teacher evaluations tied to test scores
more students/teachers admit to cheating on tests
cheating techniques becoming more sophisticated
CBT test windows increasing test item exposure
use of State tests as a graduation requirement

Slide 14
CBT will reduce some test security risks
Other test security risks will remain
Some risks will actually increase

Slide 15
CBT will reduce some test security risks:
lost or stolen test books
unauthorized access to tests
tampering with answer sheets
copying during testing

Slide 16
Other test security risks will remain:
assisting during an exam
stealing/memorizing test questions
pre-knowledge of exam content
collusion among test takers
technology-assisted cheating

Slide 17
Some risks will actually increase:
exposure of items for extended periods
accessing secure data during transmission
pre-knowledge later in testing windows
reduced funds allocated to test security due to increased development costs
stealing items for an underground market

Slide 18
21st Century Solutions
Advances in the detection of security anomalies and investigative data forensics, enabled through CBT, provide sophisticated means to heighten security.
Available detection technologies and techniques should be incorporated as routine, standard practice.

Slide 19
21st Century Solutions (cont.)
Economies of scale and experience will make these security safeguards:
affordable
cost effective
easy to understand

Slide 20
Advanced Security Analysis and Detection Techniques for CBT
Unusual Gains Analysis
Similarity of Responses (sketched below)
Response Pattern Aberrance Analysis
Response Time Analyses
Web Monitoring
Occurrence of Perfect Scores
Answer Changing Analyses

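As one illustration of these techniques, the sketch below implements a bare-bones similarity-of-responses screen: for every pair of examinees at a site, count how often they chose the same wrong answer. Published similarity indices model the probability of such agreement occurring by chance; this version only counts raw matches, and the flagging threshold is an arbitrary assumption.

```python
# Minimal sketch of a similarity-of-responses screen: shared wrong
# answers are far more diagnostic of copying/collusion than shared
# right answers. Thresholds here are illustrative, not operational.
from itertools import combinations

def similar_pairs(responses, key, min_shared_wrong=10):
    """responses: examinee id -> answer list; key: correct-answer list."""
    flagged = []
    for (a, ra), (b, rb) in combinations(responses.items(), 2):
        shared = sum(1 for x, y, k in zip(ra, rb, key)
                     if x == y and x != k)  # identical *wrong* answer
        if shared >= min_shared_wrong:
            flagged.append((a, b, shared))
    return sorted(flagged, key=lambda t: -t[2])
```
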
Slide 21
Ten Recommendations Moving Forward
1. Acknowledge the seriousness of security issues
2. Expect cheating and plan to be proactive
3. Use multiple detection methods and forensic statistics
4. Minimize testing windows
5. Strengthen the chain of custody

Slide 22
Ten Recommendations Moving Forward (cont.)
6. Increase the emphasis on security training
7. Allocate adequate resources for test security
8. Pilot techniques for detection of cheating
9. Continue to learn from others
10. Monitor new advances in anomaly detection and prevention (e.g., the “Epidemiological Model”)

Slide 23
State Assessments in Transition
The Perfect Test Security Storm

Slide 24
NCES Sponsored Symposium on Testing Integrity
Dr. John Fremer
February 28, 2012
President, Caveon Consulting Services

Slide 25
Transitioning Testing Integrity from Paper to Computer
Wes Bruce, Indiana Department of Education

Slide 26
Be thoughtful about the transition
Usually the move from paper to CBT is phased – by grade, content or school.
So be thoughtful in how you will transition the measures of test integrity.
You want specific strategies for online: some are the same, some complementary, some unique.
But the field must feel that there is a singular system in place – combined reporting (KISS).

Slide 27
CBT Security is Different
Leverage the differences of CBT: infinitely more data is available on every student. Your challenge is to determine how much of that you can turn into useful “information” – what you will systematically use and what will stay in your “back pocket” (a possible log record is sketched after this list):
Time spent per item
Time spent per “session”
The “system” time of each response
The actual order in which students answer test items
The “real” pattern of item response changes

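As a rough illustration of where those metrics would come from, here is a minimal sketch of an event-level record a CBT delivery system could log for every response action. This is a hypothetical structure for illustration, not any vendor's actual schema.

```python
# Hypothetical per-action log record; every metric listed above (time per
# item, session timing, answer order, change patterns) derives from a
# stream of records like this.
from dataclasses import dataclass

@dataclass(frozen=True)
class ResponseEvent:
    student_id: str
    session_id: str
    item_id: str
    answer: str             # response as selected/entered
    correct: bool           # scored against the key at event time
    timestamp: float        # seconds since session start (system clock)
    seconds_on_item: float  # visible time on the item before this action
    visit_order: int        # 1st, 2nd, ... visit to this item
```
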
Slide 28
CBT Security is not Unique
Many of the metrics that we use with paper are equally valid for CBT:
Score change metrics (school and student)
Part-to-whole comparisons
Analysis of items correct vs. item difficulty (school, class and student)
Perfect Score Reports

Slide 29
Illustration – Erasure Analysis
Paper (generic)
Scanners detect when (if) multiple responses have been selected for a single item.
If one is “darker” it is seen as the final “answer”; the lighter response is flagged as an “erasure.”
In “erasure analysis,” logic and statistics are applied to these multiple “marks”: if the lighter one is “wrong” and the darker is “right,” the item is flagged as W to R.
If a student, class or school exceeds a threshold value (4 sd) they are flagged/flogged.
Anybody take statistics in college?

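A minimal sketch of that paper-style flagging logic, assuming wrong-to-right (W-R) counts have already been tallied per student, class, or school; the 4-standard-deviation cutoff mirrors the threshold named above, and the names are illustrative.

```python
# Minimal sketch of the W-R erasure screen: flag any unit whose
# wrong-to-right count sits more than `sd_cut` standard deviations
# above the overall mean.
from statistics import mean, stdev

def flag_wr_outliers(wr_counts, sd_cut=4.0):
    """wr_counts: unit id (student/class/school) -> W-R count."""
    mu, sigma = mean(wr_counts.values()), stdev(wr_counts.values())
    if sigma == 0:
        return []
    return [u for u, n in wr_counts.items() if (n - mu) / sigma > sd_cut]
```
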
Slide 30
Illustration – Erasure Analysis 2
Concerns
We do not “know” what the actual pattern of student responses was: W-R or R-W-R?
We do not know when the “change” was made or how long the student took to make that change.
CBT can provide more information for analysis:
Potential for fewer false positives – you can identify “true” W-R.
Can factor in other dimensions (e.g., filter on “when”).
But it is still statistical & subject to the same limitations.

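The sketch below shows how an ordered, timestamped event log (such as the hypothetical ResponseEvent records sketched earlier) supports both points: the true change sequence is recoverable, and a time filter can restrict attention to when a change was made.

```python
# Minimal sketch: recover genuine wrong-to-right (W-R) changes from a
# timestamped event log, optionally keeping only changes made at or after
# a cutoff (e.g., a session's final minutes). Assumes the hypothetical
# fields sketched earlier: student_id, item_id, correct, timestamp.

def true_wr_changes(events, after_ts=0.0):
    """Return (student_id, item_id) for each observed W-R change."""
    changes = []
    last = {}  # (student_id, item_id) -> most recent event for that item
    for ev in sorted(events, key=lambda e: e.timestamp):
        key = (ev.student_id, ev.item_id)
        prev = last.get(key)
        if (prev is not None and not prev.correct and ev.correct
                and ev.timestamp >= after_ts):
            changes.append(key)
        last[key] = ev
    return changes
```
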
Slide 31
Illustration – Erasure Analysis 3
We provide a single combined “Erasure Analysis” for schools (even though there are no “erasures” on CBT):
“Identical” fields for paper and CBT
Same “flagging” criteria for both
Same expectations for investigation and reporting
Trying to make these exceptions easy to understand and communicate – think about the context and the cognitive load.

Slide 32
CBT Security is not Omnipotent
Be careful – it is still inferential.
It may provide “stronger” or additional evidence, but it “proves” nothing.
You may know “what,” but you still do not know who or how. Investigations still matter.
The press loves a scandal, and CBT can help you create an even bigger one.

Slide 33
Wes Bruce, wbruce@doe.in.gov

Slide 34
Secure Testing on Computers
Testing Integrity Symposium
Tony Alpert – Smarter Balanced Assessment Consortium (SBAC)

Slide 35
State Supports as Prerequisite
Model rigorous implementation by making sure the system works as described.
Establish a culture of security within the Department and across the state.
Establish policies that address:
the larger network of adults involved in CBT vs. paper
the additional complexities of logistics
the additional complexities of new item types

Slide 36
State Supports as Prerequisite (cont.)
Delineate minimum training requirements based on roles and responsibilities.
Provide practice versions of the applications early enough.
Establish help-desk supports consistent with longer testing windows.
Conduct user acceptance testing in the schools.

Slide 37
Local Supports as Partner
Be aware of which adults can be in the secure testing environment.
Use the opportunities for sample tests/applications.
Provide clear expectations for which individuals must attend trainings.
Provide a clear path for identification and resolution of problems.

Slide 38
Local Supports as Partner (cont.)
Be aware that CBT can be overwhelming for new teachers and substitutes.
Don't expose Secure Student Identifiers.
Provide a clear path for identification and resolution of problems.