Item Analysis: Improving Multiple Choice Tests

Presentation Transcript

Item Analysis: Improving Multiple Choice Tests
http://www.schreyerinstitute.psu.edu/
Crystal Ramsay
September 27, 2011
Schreyer Institute for Teaching Excellence

This workshop is designed to help you do three things:
- To interpret statistical indices provided by the university's Scanning Operations
- To differentiate between well-performing items and poor-performing items
- To make decisions about poor-performing items

We give tests for 4 primary reasons:
- To find out if students learned what we intended
- To separate those who learned from those who didn't
- To increase learning and motivation
- To gather information for adapting or improving instruction

Multiple choice items are comprised of 4 basic components: the stem, the options, the key, and the distracters. For example:

Stem: The rounded filling of an internal angle between two surfaces of a plastic molding is known as the
Options: a. rib   b. fillet   c. chamfer   d. gusset plate
Key: b. fillet
Distracters: rib, chamfer, gusset plate

An item analysis focuses on 4 major pieces of information provided in the test score report:
- Test score reliability
- Item difficulty
- Item discrimination
- Distracter information

Test score reliability is an index of the likelihood that scores would remain consistent over time if the same test were administered repeatedly to the same learners. Reliability coefficients range from .00 to 1.00. Ideal score reliabilities are >.80; higher reliabilities mean less measurement error. Now look at the test score reliability from your exam.
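If you are curious how such a coefficient is computed, below is a minimal sketch in Python of KR-20, one common reliability estimate for right/wrong items. It is offered only as an illustration; the university's Scanning Operations may use a different formula, and the score matrix here is made up.

```python
# Illustrative only: KR-20 reliability for dichotomously scored (0/1) items.
# Rows are students, columns are items; the data below are hypothetical.

def kr20(responses):
    """Kuder-Richardson 20 reliability estimate for 0/1 item scores."""
    n_students = len(responses)
    n_items = len(responses[0])
    # Proportion answering each item correctly (p) and incorrectly (q)
    p = [sum(row[i] for row in responses) / n_students for i in range(n_items)]
    q = [1 - pi for pi in p]
    # Variance of students' total scores
    totals = [sum(row) for row in responses]
    mean_total = sum(totals) / n_students
    var_total = sum((t - mean_total) ** 2 for t in totals) / n_students
    return (n_items / (n_items - 1)) * (1 - sum(pi * qi for pi, qi in zip(p, q)) / var_total)

scores = [  # 5 students x 4 items
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]
print(round(kr20(scores), 2))  # about 0.41 for this small made-up sample
```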

Item Difficulty is the percentage of students who answered an item correctly. It is represented in the Response Table as KEY-% and ranges from 0% to 100%.

RESPONSE TABLE - FORM A
ITEM NO.   OMIT%   A%    B%    C%    D%    E%    KEY-%   ITEM EFFECT
    1        0      0    18    82     0     0    C 82       0.22
    2        0     79     0     0    21     0    A 79       0.23
    3        0      4     7    89     0     0    C 89      -0.12
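A minimal sketch of how the KEY-% figure can be reproduced by hand: item difficulty is simply the share of students who chose the keyed answer. The function name and response list below are illustrative, not part of the scoring report.

```python
# Illustrative only: item difficulty = percentage of students answering correctly.

def item_difficulty(answers, key):
    """Percentage of students who chose the keyed answer."""
    return 100 * sum(1 for a in answers if a == key) / len(answers)

# Item 1 from the sample response table: 82 students chose C, 18 chose B
answers_item1 = ["C"] * 82 + ["B"] * 18
print(item_difficulty(answers_item1, "C"))  # 82.0
```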

Easier items have higher item difficulty values:

RESPONSE TABLE - FORM A
ITEM NO.   OMIT%   A%    B%    C%    D%    E%    KEY-%   ITEM EFFECT
    4        0      0     4    96     0     0    C 96       0.18
    5        0    100     0     0     0     0    A 100      0.00
    6        0      0     0     5+    0    95    E 95      -0.11

More difficult items have lower item difficulty values:

RESPONSE TABLE - FORM A
ITEM NO.   OMIT%   A%    B%    C%    D%    E%    KEY-%   ITEM EFFECT
    8        0      0    43     0    57     0    D 57       0.46
    9        0      7     4     0    75    14    D 75      -0.19
   10        0      5    12    27    31    25    D 31       0.10

What an 'ideal' item difficulty statistic is depends on 2 factors:
- The number of alternatives for each item
- Your reason for asking the question

Sometimes we include very easy or very difficult items on purpose. Did I deliberately pose difficult items to challenge my students’ thinking? Did I deliberately pose easy items to test basic information or to boost students’ confidence?

Now look at the item difficulties from your exam. Which items were easier for your students? Which items were more difficult?

Item Discrimination is the degree to which students with high overall exam scores also got a particular item correct. It is represented as the Item Effect because it tells how well an item 'performed'. It ranges from -1.00 to 1.00 and should be >.2.

RESPONSE TABLE - FORM A
ITEM NO.   OMIT%   A%    B%    C%    D%    E%    KEY-%   ITEM EFFECT
    1        0      0    18    82     0     0    C 82       0.22
    2        0     79     0     0    21     0    A 79       0.23
    3        0      4     7    89     0     0    C 89      -0.12
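The scoring service computes the Item Effect for you; the sketch below shows one common way such a statistic is calculated, the point-biserial correlation between getting an item right and the total exam score. This is an assumption about the underlying method, not Scanning Operations' documented formula, and the data are invented.

```python
# Illustrative only: point-biserial correlation between a 0/1 item score
# and the total exam score (one common item-discrimination statistic).

def point_biserial(item_scores, total_scores):
    """Correlation between right/wrong on one item and total test score."""
    n = len(item_scores)
    mean_x = sum(item_scores) / n
    mean_y = sum(total_scores) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(item_scores, total_scores)) / n
    sd_x = (sum((x - mean_x) ** 2 for x in item_scores) / n) ** 0.5
    sd_y = (sum((y - mean_y) ** 2 for y in total_scores) / n) ** 0.5
    return cov / (sd_x * sd_y)

item = [1, 1, 1, 0, 0, 1, 0, 1]            # right/wrong on one item
totals = [38, 35, 30, 22, 25, 33, 20, 36]  # total exam scores
print(round(point_biserial(item, totals), 2))  # ~0.92: high scorers got it right
```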

A well-performing item:

RESPONSE TABLE - FORM A
ITEM NO.   OMIT%   A%    B%    C%    D%    E%    KEY-%   ITEM EFFECT
    8        0      0    43     0    57     0    D 57       0.46

A poor-performing item:

RESPONSE TABLE - FORM A
ITEM NO.   OMIT%   A%    B%    C%    D%    E%    KEY-%   ITEM EFFECT
    6        0      0     0     5+    0    95    E 95      -0.11

What an 'ideal' item discrimination statistic is depends on 3 factors:
- Item difficulty
- Test heterogeneity
- Item characteristics

Item difficulty: Very easy or very difficult items will have poor ability to discriminate among students. Yet very easy or very difficult items may still be necessary to sample the content taught.

Test heterogeneity: A test that assesses many different topics will have a lower correlation with any one content-focused item. Yet a heterogeneous item pool may still be necessary to sample the content taught.

Item quality: A poorly written item will have little ability to discriminate among students. And there is no substitute for a well-written item or for testing what you teach!

Now look at the item effects from your exam. Which items on your exam performed ‘well’? Did any items perform ‘poorly’?

Distracter information can be analyzed to determine which distracters were effective and which ones were not.

RESPONSE TABLE - FORM A
ITEM NO.   OMIT%   A%    B%    C%    D%    E%    KEY-%   ITEM EFFECT
    1        0      0    18    82     0     0    C 82       0.22
    2        0     79     0     0    21     0    A 79       0.23
    3        0      4     7    89     0     0    C 89      -0.12

Now look at the distracter information for items from your exam. What can you conclude about them?
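A minimal sketch of a distracter tally, assuming you have each student's chosen letter for an item: it reports the percentage choosing every option, which makes unused distracters obvious. The data mirror item 3 in the sample table and are illustrative.

```python
# Illustrative only: percentage of the class choosing each option.
# A distracter chosen by no one (0%) is doing no work on the item.
from collections import Counter

def response_percentages(answers, options="ABCDE"):
    counts = Counter(answers)
    n = len(answers)
    return {opt: 100 * counts.get(opt, 0) / n for opt in options}

answers_item3 = ["C"] * 89 + ["A"] * 4 + ["B"] * 7  # item 3 from the sample table
print(response_percentages(answers_item3))          # D and E drew 0%: weak distracters
```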

Whether to retain, revise, or eliminate items depends on item difficulty, item discrimination, distracter information, and your instruction. Ultimately, it's a judgment call that you have to make.

What if I have a relatively short test or I give a test in a small class? I might not use the testing service for scoring. Is there a way I can understand how my items worked? Yes.

Item 1 (options A B* C D)
  Top 1/3:    10
  Bottom 1/3: 1  4  3  2

Item 2 (options A* B C D)
  Top 1/3:    8  2
  Bottom 1/3: 7  3

Item 3 (options A B C* D)
  Top 1/3:    5  1  4
  Bottom 1/3: 2  4  4

Item 4 (options A* B C D)
  Top 1/3:    10
  Bottom 1/3: 9  1

From: Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). San Francisco: Jossey-Bass. (The asterisk marks the keyed answer.)

1. Which item is the easiest?
2. Which item shows negative (very bad) discrimination?
3. Which item discriminates best between high and low scores?
4. In Item 2, which distracter is most effective?
5. In Item 3, which distracter must be changed?
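If you keep even a small spreadsheet of answers, the same top-third/bottom-third comparison can be scripted. The sketch below computes a simple discrimination index (proportion correct in the top third minus proportion correct in the bottom third); the data and function names are illustrative, not Suskie's.

```python
# Illustrative only: hand-style item analysis using top and bottom thirds.

def hand_item_analysis(students, item_index, key):
    """students: list of (total_score, list_of_answers) tuples.
    Returns the discrimination index: p(top third) - p(bottom third)."""
    ranked = sorted(students, key=lambda s: s[0], reverse=True)
    third = len(ranked) // 3
    top, bottom = ranked[:third], ranked[-third:]
    p_top = sum(1 for _, ans in top if ans[item_index] == key) / third
    p_bottom = sum(1 for _, ans in bottom if ans[item_index] == key) / third
    return p_top - p_bottom  # positive: stronger students answered it correctly more often

# 9 students, one-item answer lists kept short for the example
students = [(95, ["B"]), (90, ["B"]), (88, ["B"]),
            (75, ["B"]), (70, ["C"]), (66, ["B"]),
            (52, ["A"]), (50, ["C"]), (45, ["D"])]
print(hand_item_analysis(students, 0, "B"))  # 1.0: all of the top third right, none of the bottom third
```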

Even after you consider reliability, difficulty, discrimination, and distracters, there are still a few other things to think about:
- Multiple course sections
- Student feedback
- Other item types

Resources

For an excellent resource on item analysis:
http://www.utexas.edu/academic/ctl/assessment/iar/students/report/itemanalysis.php

For a more extensive list of item-writing tips:
http://testing.byu.edu/info/handbooks/Multiple-Choice%20Item%20Writing%20Guidelines%20-%20Haladyna%20and%20Downing.pdf
http://homes.chass.utoronto.ca/~murdockj/teaching/MCQ_basic_tips.pdf

For a discussion about writing higher-level multiple choice items:
http://www.ascilite.org.au/conferences/perth04/procs/pdf/woodford.pdf