
Slide1

Survey Questionnaire Innovations for NAEP

Gina Broxterman, NCES

NCSA

2017

June 30, 2017

Slide2

Outline

As part of the 2017 operational administration, several innovations have been implemented. The three presentations in this symposium will focus on the innovations introduced at the small-scale pretesting, piloting, and reporting stages:

Survey Questionnaire Pre-testing Methods for the National Assessment of Educational Progress (Debby Almonte)

Development of Noncognitive Modules for the National Assessment of Educational Progress (Jonas Bertling)

Using NAEP Survey Data to Inform Planning, Research, and Policymaking (Renée A. Savoie)

2

Slide3

What is the development process for NAEP Survey Questionnaires?

3

Item Development & Pre-testing:

Existing item pool review

Item development for coglabs

Pre-testing of new and revised items in coglabs

Analysis of pre-testing data and decisions for pilot questionnaires

Pilot:

Preparation of pilot assessment

Pilot administration

Analysis of pilot data and decisions for operational

Operational:

Preparation of operational assessment

Operational administration

Reporting

Slide4

Questions?

Slide5

Survey Questionnaire Pre-testing Methods for the National Assessment of Educational Progress

6/27/2017

Debby E. Almonte, Ph.D.

NAEP Survey Questionnaires

Educational Testing Service

NCSA 2017 - Austin, TX

June 30, 2017

5

Slide6

Key Talking Points

The National Assessment of Educational Progress (NAEP) administers short survey questionnaires to students after students complete the NAEP assessment.

All items have gone through small-scale pre-testing in cognitive interviews prior to large-scale piloting and operational administrations.

Over the years, several pre-testing methods have been used for NAEP survey questionnaires. The appropriateness and the purpose of the pre-testing study determine which methods we use.

The pre-testing method may differ, but the end goal is always the same: make data-driven decisions about which items are ultimately administered in pilot and operational administrations.

6

Slide7

How are NAEP survey questions developed?

7

Multiple rounds of reviews including: edit and fairness, translatability, human subjects internal review board, expert panels, Office of Management and Budget (OMB), and Governing Board (GB).

Slide8

Small-scale Pre-testing Methods for NAEP Survey Questionnaires

Focus Groups

Advantageous when the goal is to (1) “delve deep” on a specific topic or issue and the interaction among interviewees is key to yielding the best information; or (2) you need additional information to develop appropriate questions for a study or survey questionnaire.

Cognitive Interviews

Advantageous when the goal is to collect additional information about the individual experience or understanding of a topic or question.

8

Slide9

Focus Groups

9

Slide10

Focus Group Approach

Group discussion with trained interviewers/facilitators drawing on methods from cognitive science

Allows us to collect rich information on a specific topic in order to guide development prior to large-scale administration

Diverse samples that mirror, as much as possible, samples of the actual NAEP assessments

Specific guiding questions are used to facilitate discussion

Qualitative data analysis and creation of “themes” that help summarize the data collected and identify issues related to specific themes

10

Slide11

General Guiding Questions (examples)

11

Mathematics:

When someone uses the word “mathematics” what do you think they are talking about?

Do you think solving mathematics problems “in your head” is the same as solving mathematics problems with a calculator?

Follow-up questions: How are they the same? How are they different?

Reading:

When someone says the word “reading” what do you think they are talking about?

Do you think reading printed text (e.g., a hard copy book) is the same as reading on a computer or other digital technologies?

Follow-up questions: How are they the same? How are they different?

Slide12

Item-specific Guiding Questions (examples)

Probing specific terms:

When someone uses the word “computer” what do you think they are talking about?

When someone uses the phrase “digital technology” what do you think they are talking about?

Probing specific behaviors/actions:

When you need to use a calculator, do you only use a traditional calculator or do you also use your cell phone, tablet, or laptop?

How often do you use digital devices such as e-readers or tablets for reading?

12

Slide13

Aggregation of Data

Resembles story-telling where specific themes are identified, described, and further exemplified by specific quotes

Example

Theme D: Ways technology changes children’s learning experience.

For these students, technology is not just a “thing” that is to be learned. Technology has become a way to learn in and of itself. As one 8th grade student said, “I like technology itself. It is fascinating. You can find wider explanations than the teacher can tell you something... I use technology to get better explanations.”

13

Slide14

Examples of developed items

14

Slide15

Example of post-focus-group revisions

15

Focus group version

Operational version

Slide16

Cognitive Interviews

16

Slide17

Cognitive Interview Approach

Retrospective coglabs with trained interviewers = 1:1 interview drawing on methods from cognitive science

Allows us to identify any problems with items and address/resolve them prior to large-scale administration

Diverse student samples, approx. 15 students per grade and subject

Generic and item-specific probes

Qualitative data analysis and creation of flags that help quantify item performance

17

Slide18

Generic Probes

18

Confusion:

Can you tell me in your own words what the question is asking?

Were there any words or parts of this question that were confusing?

Follow-up questions: What did you find confusing? What could we do to make the questions less confusing?

Flag 1: Student reported confusion

Difficulty:

Would you say it was very easy, easy, difficult, or very difficult to answer this question?

Follow-up question: How could we make it easier to answer this question?

Flag 2: Student reported difficulty

Slide19

Item-specific probes (examples)

Probing specific terms:

What do you think the term “tablet computer” means?

What do you think the term “outside of school” means?

Probing student to compare items:

Which of the two versions you have seen do you find easier to answer? Why?

19

Slide20

Aggregation of Data: Flags (Examples)

Student-reported confusion

Student-reported difficulty

Overall comprehension (based on qualitative analysis of verbatim responses to generic and item-specific probes, plus general observations)

20

This is not real data; it is shown for illustration purposes only.
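As a hedged illustration (not taken from the slides), per-item flag summaries like these could be tallied from coded interview records with a few lines of pandas; the item IDs and flag values below are hypothetical.

```python
import pandas as pd

# Hypothetical coglab records: one row per student-item pair, with binary
# flags assigned during qualitative coding of the interview.
flags = pd.DataFrame({
    "item_id":    ["Q1", "Q1", "Q1", "Q2", "Q2", "Q2"],
    "confusion":  [1, 0, 0, 1, 1, 0],   # Flag 1: student reported confusion
    "difficulty": [0, 0, 1, 1, 0, 0],   # Flag 2: student reported difficulty
})

# Share of interviewed students flagged on each dimension, per item.
summary = flags.groupby("item_id")[["confusion", "difficulty"]].mean()
print(summary)
```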

Slide21

Example of post-coglab revisions

21

Coglab version

Operational version

Slide22

Summary

All items have gone through small-scale pre-testing in cognitive interviews prior to large-scale piloting and operational administrations.

Over the years, NAEP has used focus groups and cognitive interviews to help us make data-driven decisions about items that are administered in the field.

There is no “right way” to conduct pre-testing studies. The appropriateness and the purpose of the pre-testing study determine which methods we use.

We continue to refine our pre-testing methods:

2018 Social Studies - re-evaluated generic probes based on 2017 coglab data and further improved generic probes

2019 Science - introduced guidelines to examine the quality of cognitive interviews

Future studies - plan to introduce probes related to sensitivity/intrusiveness of item content

22

Slide23

Questions? Comments?

Thank you!

23

Debby E. Almonte, Ph.D.

NAEP Survey Questionnaires

Educational Testing Service

dalmonte@ets.org

Slide24

Development of Noncognitive Modules for the National Assessment of Educational Progress

Jonas Bertling, Ph.D.

NAEP Deputy Director for Survey Questionnaires, Cross-Subject Data Analysis & Research Innovations

Educational Testing Service

NCSA 2017 – Austin, TX

June 30, 2017

24

Slide25

Key Talking Points

Large-scale assessments have started to broaden their focus to also measure skills, strategies, attitudes, and behaviors that are distinct from content knowledge and academic skills.

For the 2017 digitally based NAEP administration, measurement of “noncognitive factors” was expanded to include factors such as perseverance and enjoyment of difficult problems.

One key challenge: How to implement robust measurement approaches while keeping student burden low?

Results from small-scale pre-testing, large-scale piloting, and operational data analysis indicate that relevant constructs can be captured in the questionnaires at both grades 4 and 8.

Index reporting is based on IRT-based indices and interactive charts the user can explore.

25

Slide26

NAEP measures what students in the US, the states, and central urban districts know and can do.

26

Slide27

NAEP Survey Questionnaires place achievement results into context.

27

Data reported at group level only.

Individual respondents can never be identified based on NAEP’s unique design.

Digital administration on tablets. The 2017 administration included approximately 1 million students.

Slide28

2017 Digitally Based Contextual Questionnaires

28

Survey Questions Designed to Address “Noncognitive” Student Factors (Student):

Domain-General (“Core”) Questions: Perseverance, Academic Self-Discipline, Enjoyment of Difficult Problems

Subject-Specific Questions: Confidence in Subject-Area Knowledge and Skills, Achievement Goals, Interest & Enjoyment

Survey Questions Designed to Address “Opportunity-to-Learn” Factors (Student, Teacher, School):

Components of SES, Technology Use, School Climate, Out-of-school Activities, Resources for Learning & Instruction, Organization of Instruction, Teacher Preparation

Note. “Noncognitive” = broad term for factors outside of those commonly measured by cognitive aptitude tests, e.g., skills, strategies, attitudes, and behaviors that are distinct from content knowledge and academic skills (e.g., Farrington et al., 2012). The term is not intended to be interpreted literally. All aspects of human thought and behavior are related to cognition.

Slide29

What are survey questionnaire indices?

NAEP survey questionnaire indices are variables created for reporting that are based on student responses to multiple survey questions around a single topic.

6

Slide30

Why are indices useful?

Let’s assume we want to know how often students use a laptop or desktop computer during their classes at school.

Do we need an index for that?

No. This is an example of a topic that can be measured well with a single question.

7

Slide31

Why are indices useful? (Cont’d)

Let’s assume we want to know how perseverant students are.

Do we need an index for that?

Yes. This is an example of a topic that cannot be measured well with a single question.

8
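To make the contrast concrete, here is a minimal, hypothetical sketch of how several perseverance-type items might be combined into a single index. The data and scoring below are illustrative only; NAEP's operational indices are built with IRT scaling rather than this simple standardized mean.

```python
import numpy as np

# Hypothetical responses from 5 students to 4 perseverance items,
# each on a 5-point scale (1 = "not at all like me" ... 5 = "very much like me").
responses = np.array([
    [4, 5, 4, 3],
    [2, 1, 2, 2],
    [5, 5, 4, 5],
    [3, 3, 2, 4],
    [1, 2, 1, 1],
])

# Standardize each item, then average across items to form a simple index.
# (NAEP's operational indices are IRT-scaled; this stand-in only illustrates
# why several related items are combined rather than reported one by one.)
z = (responses - responses.mean(axis=0)) / responses.std(axis=0)
index = z.mean(axis=1)
print(index.round(2))
```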

Slide32

NAEP Survey Questionnaire Indices Roadmap

Index reporting for Core Contextual Modules

Index reporting for U.S. History, Geography, Civics, and TEL

Index reporting for Reading, Mathematics, and Science

Index reporting for Reading, Writing, and Mathematics

(Trend) Index reporting U.S. History, Geography, Civics, and TEL

(Trend) Index reporting for Reading, Mathematics, and Science

(Trend) Index reporting for Core Contextual Modules

(Trend) Index reporting for Core Contextual Modules

Subject-specific indices

Core indices

Admin. year: 2014, 2015, 2016, 2017, 2018, 2019

9

Slide33

Development Process for Questionnaire Indices

33

Slide34

How are NAEP survey questions (and indices) developed?

34

Multiple rounds of reviews including: edit and fairness, translatability, human subjects internal review board, expert panels, Office of Management and Budget (OMB), and Governing Board (GB).

Timeline: 2013-14, 2014-15, 2016, 2017

Slide35

Survey method principles applied

All questions went through multiple pre-testing rounds (coglabs and pilot)

Avoid agree-disagree format

Use questions with 5 or 6 response options

Matrix format at all grades

Avoid negatively framed questions

Advanced multivariate analyses: EFA, MG-CFA, IRT scaling

“Spiraling” designs at pilot stage to test alternative questions and run methodological comparisons

Timing data used to select final questions

35
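The slides do not show how these analyses are run; as one hedged illustration, an exploratory factor analysis (EFA) step might look like the sketch below on simulated, hypothetical item responses, using scikit-learn's FactorAnalysis as a stand-in for whatever software is actually used.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical pilot data: 200 students x 8 survey items, where items 0-3 are
# intended to tap one construct and items 4-7 another.
rng = np.random.default_rng(0)
f1 = rng.normal(size=(200, 1))
f2 = rng.normal(size=(200, 1))
noise = rng.normal(scale=0.5, size=(200, 8))
X = np.hstack([f1.repeat(4, axis=1), f2.repeat(4, axis=1)]) + noise

# Exploratory factor analysis: do the items group onto the intended factors?
fa = FactorAnalysis(n_components=2).fit(X)
print(np.round(fa.components_.T, 2))  # loadings: rows = items, cols = factors
```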

Slide36

Choice of indices for reporting is based on statistical and content considerations

How many items are needed to adequately represent the construct?

A minimum of 4 items per index is needed based on statistical standards.

Findings across several NAEP pilots and operational data collections show that indices with as few as 4 items work well empirically.

36

Analyses at Pilot Stage: Item Frequency Analysis, Factor Analysis, Reliability Analysis, Timing Data Analysis, Correlations with other constructs

Additional analyses based on operational data before reporting: Item Response Theory Scaling, Measurement Invariance Analysis
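The slides name reliability analysis without specifying a statistic; Cronbach's alpha is one common choice, and the short sketch below (with made-up responses) shows the kind of check this step involves.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_students, n_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical responses from 6 students to a 4-item index (5-point scale).
responses = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [1, 2, 1, 2],
    [4, 4, 5, 4],
])
print(round(cronbach_alpha(responses), 2))
```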

Slide37

Index reporting is used to report on broader topics that cannot be adequately represented by a single question.

37

http://www.nationsreportcard.gov/tel_2014/#results/context

Example: 2014 TEL

Slide38

Summary

Large-scale assessments have started to broaden their focus to also measure skills, strategies, attitudes, and behaviors that are distinct from content knowledge and academic skills.

For the 2017 digitally based NAEP administration, measurement of “noncognitive factors” was expanded to include factors such as perseverance and enjoyment of difficult problems.

All indices have gone through small-scale pre-testing in cognitive interviews and large-scale piloting prior to operational administration.

Results indicate that relevant constructs can be captured in the questionnaires at both grades 4 and 8.

Index reporting is based on IRT-based indices and interactive charts the user can explore.

38

Slide39

Questions? Comments?

Thank you!

39

Jonas Bertling, Ph.D.

NAEP Deputy Director

Survey Questionnaires, Cross-subject Data Analysis & Research Innovations

Educational Testing Service

jbertling@ets.org

Slide40

Using NAEP Survey Data to Inform Planning, Research, and Policymaking

Renée A. Savoie

Connecticut NAEP State Coordinator

NCSA 2017 – Austin, TX

40

Slide41

Survey Data Use in Connecticut

Mathematics Course-taking Information

NAEP 2009 Grade 12 Student Questionnaires

Measuring Students’ Computer Access

NAEP 2015 Teacher and Student Questionnaires

Science-Related Career Aspirations

NAEP 2015 Science Student Questionnaires

41

Slide42

Course-taking Data Provided Context for Achievement Gaps

42

Slide43

Six Years Later, A Refined Focus

Connecticut’s Next Generation Accountability System includes an indicator based on enrollment in challenging coursework.

43

Subgroup data highlighted in every school’s Profile and Performance Report (PPR)

Slide44

Computer Access: Pervasive or Limited in Classrooms?

To what extent are computers available in school for Grade 4 students and their teachers?

44

                   Available to both   Available only to teacher   Not available
National Public    90%                 9%                          1%
Northeast          89%                 10%                         #
Connecticut        93%                 6%                          1%

# Rounds to zero.

From NAEP 2015 Teacher Questionnaires

Slide45

Access Varies by Location

To what extent is there differential access to computers in school for Connecticut Grade 4 students and their teachers by school location?

45

         Percentage of Population   Available to both   Available only to teacher   Not available
City     28%                        89%                 9%                          2%
Suburb   58%                        94%                 6%                          #
Town     4%                         ‡                   ‡                           ‡
Rural    10%                        97%                 3%                          #

# Rounds to zero.
‡ Reporting standards not met.

From NAEP 2015 Teacher Questionnaires

Slide46

Students May Have Home Access

What percentage of Grade 4 students report having a computer at home?

46

                   Yes    No
National Public    83%    17%
Northeast          87%    13%
Connecticut        90%    10%

To what extent does computer access at home vary for CT Grade 4 students based on school location?

         Yes    No
City     83%    17%
Suburb   92%    8%
Rural    96%    4%

From NAEP 2015 Student Questionnaires

Slide47

Eighth Graders Report Greater Access at Home

To what extent are computers available in English or language arts classes for Grade 8 students and their teachers?

47

                   Available to both   Available only to teacher   Not available
National Public    89%                 10%                         #
Northeast          88%                 12%                         1%
Connecticut        93%                 7%                          #

To what extent does computer access at home vary for CT Grade 8 students based on school location?

         Yes    No
City     89%    11%
Suburb   95%    5%
Rural    94%    6%

From NAEP 2015 Teacher and Student Questionnaires

Slide48

Computer Access Conclusions and Additional Questions

While the majority of Connecticut students have access to computers in school and/or at home, access is not yet universal.

Connecticut students have access to computers at home and in school at a rate at least as high as their peers nationally.

How are districts using technology systematically to enhance teaching and learning?

To what extent do educators feel prepared to incorporate technology into their instruction?

48

Slide49

CT Condition of Education Featured NAEP Science

Ultimately, the new standards coupled with appropriate professional learning supports will improve teaching and learning and prepare students for the jobs of the future.

NAEP 2015 data show that 41% of Connecticut’s Grade 8 students would like a job that involves science. The NGSS were carefully designed to emphasize scientific inquiry and ensure that students are thinking like a scientist. This active approach to learning has the potential to be more engaging and inspire more students to consider advanced study and eventual careers in STEM fields.

49

From NAEP 2015 Student Questionnaires

Slide50

Recent Developments Place Spotlight on Survey Data

Educators granted early access to school and teacher questionnaire responses (NAEPQ Feedback to Teachers and Schools)

Greater focus on questionnaire data in NAEP releases: Arts 2016, Technology and Engineering Literacy 2014

Potential increased interest and relevance based on new approaches

50

Slide51

