Lynn Silipigni Connaway, Ph.D.
Senior Research Scientist, OCLC Research
Vice-chair, ACRL Value of Academic Libraries Committee
Thursday, April 24, 2014
Academic Library Association of Ohio’s Assessment Special Interest Group Spring Workshop

“The library has a website?”
User-Centered Library Assessment
The Road Travelled

Value of Academic Libraries Report
Freely available: http://acrl.org/value
Themes from Summits

Value of Academic Libraries Initiative
Keep Up-to-Date:
- Value of Academic Libraries Blog
- Valueography
Outreach & Collaboration:
- Presentations (e.g., CNI, LAC, & Northumbria)
- ACRL Liaisons Assembly
- Assessment Management Systems
Under Discussion:
- Librarian Competencies
- Research agenda
- Library Poster
ACRL Plan for Excellence
Value of Academic Libraries
Goal: Academic libraries demonstrate alignment with and impact on institutional outcomes.
Objectives:
- Leverage existing research to articulate and promote the value of academic and research libraries.
- Undertake and support new research that builds on the research agenda in The Value of Academic Libraries: A Comprehensive Review and Report.
- Influence national conversations and activities focused on the value of higher education.
- Develop and deliver responsive professional development programs that build the skills and capacity for leadership and local data-informed and evidence-based advocacy.
Cycle of Assessment [focused on] Library Value:
Planning (June-July 2013) → Acting (August-December 2013) → Reflecting (January-February 2014) → Sharing (March-May 2014)
Recommendations
- Define outcomes.
- Create or adopt systems for assessment management.
- Determine what libraries enable students, faculty, student affairs professionals, administrators, and staff to do.
- Develop systems to collect data on individual library user behavior, while maintaining privacy.
- Record and increase library impact on student enrollment.
- Link libraries to improved student retention and graduation rates.
- Review course content, readings, reserves, and assignments.
- Document and augment library advancement of student experiences, attitudes, and perceptions of quality.
- Track and increase library contributions to faculty research productivity.
- Contribute to investigations of library impact on faculty grant proposals and funding, a means of generating institutional income.
- Demonstrate and improve library support of faculty teaching.
- Create library assessment plans.
- Promote and participate in professional development.
- Mobilize library administrators.
- Leverage library professional associations.
Recommendations
- Increase the profession’s understanding of library value in relation to various dimensions of student learning and success.
- Articulate and promote the importance of assessment competencies necessary for documenting and communicating library impact on student learning and success.
- Create professional development opportunities for librarians to learn how to initiate and design assessment that demonstrates the library’s contributions to advancing institutional mission and strategic goals.
Recommendations cont.
- Expand partnerships for assessment activities with higher education constituent groups and related stakeholders.
- Integrate the use of existing ACRL resources with library value initiatives.
Assessment in Action Goals

Team Approach:
- Librarian Leader
- Faculty Member
- Institutional Researcher/Assessment Officer
AiA 2013 Institutional Teams

Library Factors Examined
- Instruction: games, single/multiple session, course-embedded, tutorials
- Reference
- Physical space
- Discovery: institutional web, resource guides
- Collections
- Personnel
Variety of Tools/Methods
- Survey
- Interviews
- Focus group(s)
- Observation
- Pre/post test
- Rubric
- Student portfolio
- Research paper/project
- Other class assignment
- Test scores
- GPA
- Degree completion rate
- Retention rate
Some Initial Questions
- What is your definition of assessment?
- What comes to mind when you hear the term “assessment”?
- What benefits do you see for assessment?
- What are your concerns?
Assessment Defined
The process of defining, selecting, designing, collecting, analyzing, interpreting, and using information to increase service/program effectiveness.
Why Assessment?
Answers questions:
- What do users/stakeholders want & need?
- How can services/programs better meet needs?
- Is what we do working? Could we do better?
- What are problem areas?
Traditional stats don’t tell the whole story.
Importance of Assessment
“Librarians are increasingly called upon to document and articulate the value of academic and research libraries and their contribution to institutional mission and goals.” (ACRL Value of Academic Libraries, 2010, p. 6)
Formal vs. Informal Assessment
Formal assessment:
- Data-driven
- Evidence-based
- Accepted methods
- Recognized as rigorous
Informal assessment:
- Anecdotes & casual observation
- Used to be the norm
- No longer acceptable
Outcomes Assessment Basics
Outcomes: “The ways in which library users are changed as a result of their contact with the library’s resources and programs” (ALA, 1998).
“Libraries cannot demonstrate institutional value to maximum effect until they define outcomes of institutional relevance and then measure the degree to which they attain them” (Kaufman & Watstein, 2008, p. 227).
Steps in Assessment Process
- Why? Identify purpose
- Who? Identify team
- How? Choose model/approach/method
- Commit
- Training/planning
Outputs & Inputs
Outputs:
- Quantify the work done
- Don’t relate factors to overall effectiveness
Inputs:
- Raw materials
- Measured against standards
- Insufficient for overall assessment
Principles for Applying Outcomes Assessment
- Center on users
- Assess changes in service/resources use
- Relate to inputs; identify “best practices”
- Use a variety of methods to corroborate conclusions
- Choose a small number of outcomes; need not address every aspect of service
- Adopt a continuous process
Examples of Outcomes
- User matches information need to information resources
- User can organize an effective search strategy
- User effectively searches online catalog & retrieves relevant resources
- User can find appropriate resources
What We Know About Assessment
- Ongoing process to understand & improve service
- Librarians are busy with day-to-day work & assessment can become another burden
- Can build on what has already been done or is known
“One size fits none!” (Lynn’s Mom)
Survey Research
“…to look at or to see over or beyond…allows one to generalize from a smaller group to a larger group” (Connaway & Powell, 2010, p. 107)
Survey Research: Advantages
- Explores many aspects of service
- Demographic information
- Online surveys (e.g., Survey Monkey) provide statistical analysis
- Controlled sampling
- High response rates possible
- Data reflect characteristics & opinions of respondents
- Cost effective
- Can be self-administered
- Survey large numbers
(Hernon & Altman, 1998)
Survey Research: Disadvantages
- Produces a snapshot of situation
- May be time consuming to analyze & interpret results
- Produces self-reported data
- Data lack depth of interviewing
- High return rate can be difficult
(Hernon & Altman, 1998)
Design Issues
- Paper or online (e.g., Survey Monkey)
- Consider order of questions: demographic questions first
- Instructions: be specific; introduce sections
- Keep it simple
- Pre-test!
Survey Research: Interpreting Results
- Objectively analyze all data
- Interpret results with appropriate level of precision
- Express proper degree of caution about conclusions
- Use data as input in outcome measures
- Consider longitudinal study; compare results over time
- Qualitative data requires special attention
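The call above to “interpret results with appropriate level of precision” can be made concrete with a margin of error: for a sample proportion p from n respondents, a 95% confidence interval under the normal approximation is p ± 1.96·√(p(1−p)/n). A minimal Python sketch; the respondent counts are hypothetical, not drawn from any study cited here:

```python
import math

def proportion_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Confidence interval for a survey proportion (normal approximation).

    z = 1.96 gives a 95% interval.
    """
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return (max(0.0, p - margin), min(1.0, p + margin))

# Hypothetical example: 180 of 300 respondents report satisfaction.
low, high = proportion_ci(180, 300)
print(f"Satisfied: 60%, 95% CI: {low:.1%} to {high:.1%}")
```

Reporting 60% with its roughly ±5.5-point interval, rather than the bare percentage, is one way to express the “proper degree of caution” the slide asks for.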
Example: Seeking Synchronicity CIT: VRS Potential User Online Survey Questions
a. Think about one experience in which you felt you did (or did not) achieve a positive result after seeking library reference services in any format.
b. Describe each interaction.
c. Identify the factors that made these interactions positive or negative.
(Connaway & Radford, 2011)
Interviews
A conversation involving two or more people, guided by a predetermined purpose (Lederman, 1996)
Types of Interviews
- Structured
- Semi-structured
Formats:
- Individual: face-to-face, telephone, Skype
- Focus group interviews
Types of Questions
- OPEN: “What is it like when you visit the library?”
- DIRECTIVE: “What happened when you asked for help at the reference desk?”
- REFLECTIVE: “It sounds like you had trouble with the mobile app?”
- CLOSED: “Have I covered everything you wanted to say?”
Interviews: Advantages
- Face-to-face interaction
- In-depth information
- Understand experiences & meanings
- Highlight individual’s voice
- Preliminary information to “triangulate”
- Control sampling; include underrepresented groups
- Greater range of topics
Interviews: Disadvantages
- Time factors: varies by number & depth; staff intensive
- Cost factors: the higher the number, the higher the cost
- Additional factors: self-reported data; errors in note taking possible
Example: Digital Visitors & Residents Participant Questions
1. Describe the things you enjoy doing with technology and the web each week.
2. Think of the ways you have used technology and the web for your studies. Describe a typical week.
3. Think about the next stage of your education. Tell me what you think this will be like.
(White & Connaway, 2011-2012)
Focus Group Interviews
“…interview of a group of 8 to 12 people representing some target group and centered on a single topic.” (Zweizig, Johnson, Robbins, & Besant, 1996)
Conducting Focus Group Interviews
- Obtain permission to use the information in reports and/or publications, and to record
- Enlist a note-taker or, if recording, check equipment and bring back-up
- Begin by creating a safe climate
WorldCat.org Study Recruitment
Recruitment was difficult:
- Little data on the user base
- Participants across 3 continents
- Hard-to-reach populations: historians, antiquarian booksellers
Non-probabilistic methods:
- Convenience sampling
- Snowball sampling
(Connaway & Wakeling, 2012)
Example: WorldCat.org Focus Group Interview Questions
Tell us about your experiences with WorldCat.org.
A broad introductory question to reveal the extent to which users have engaged with WorldCat.org, and the information-seeking contexts within which they use the system.
(Connaway & Wakeling, 2012, p. 7)
Structured Observations
Systematic description focusing on designated aspects of behavior to test causal hypotheses (Connaway & Powell, 2010)
Structured Observations: A Guide
- Develop observational categories
- Define appropriate, measurable acts
- Establish time length of observation
- Anticipate patterns of phenomena
- Decide on frame of reference
(Connaway & Powell, 2010)
Ethnographic Research
Rich description (Connaway & Powell, 2010)
Ethnographic Research
- Incredibly detailed data
- Time consuming
- Establishing rapport
- Selecting research participants
- Transcribing observations & conversations
- Keeping diaries
(Connaway & Powell, 2010, p. 175; Khoo, Rozaklis, & Hall, 2012)
Analysis
“…summary of observations or data in such a manner that they provide answers to the hypothesis or research questions” (Connaway & Powell, 2010)
Analysis
- Collection of data affects analysis of data
- Ongoing process; feeds back into research design
- Theory, model, or hypothesis must grow from data analysis
Data Analysis: Digital Visitors & Residents
Codebook (NVivo 10)
I. Place
   A. Internet
      1. Search engine
         a. Google
         b. Yahoo
      2. Social media
         a. Facebook
         b. Twitter
         c. YouTube
         d. Flickr/image sharing
         e. Blogging
   B. Library
      1. Academic
      2. Public
      3. School (K-12)
   C. Home
   D. School, classroom, computer lab
   E. Other
(White & Connaway, 2011-2012)
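Outside NVivo, a hierarchical codebook like the one above is simply a tree. An illustrative Python sketch, using a fragment of the “Place” branch from the slide, shows how the hierarchy can be flattened into full code paths for tallying coded segments (the dict representation and helper function are this sketch’s own, not part of NVivo):

```python
# Fragment of the Visitors & Residents "Place" codebook as a nested structure:
# dicts are branch nodes, lists hold leaf codes (empty list = the node itself).
codebook = {
    "Place": {
        "Internet": {
            "Search engine": ["Google", "Yahoo"],
            "Social media": ["Facebook", "Twitter", "YouTube",
                             "Flickr/image sharing", "Blogging"],
        },
        "Library": ["Academic", "Public", "School (K-12)"],
        "Home": [],
        "School, classroom, computer lab": [],
        "Other": [],
    }
}

def leaf_codes(node):
    """Flatten the hierarchy into 'parent > child' code paths."""
    if isinstance(node, list):
        return node if node else [""]
    paths = []
    for name, child in node.items():
        for tail in leaf_codes(child):
            paths.append(f"{name} > {tail}" if tail else name)
    return paths

print(leaf_codes(codebook))
```

Flattened paths like "Place > Internet > Search engine > Google" are a convenient key for counting how often each code is applied across interview transcripts.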
Getting the Right Fit!
- What do we know?
- Where do we go from here?
- Use tools & research design to customize the project to fit your assessment needs
References
ALA/ACRL. 1998. Task Force on Academic Library Outcomes Assessment report. Available: http://www.ala.org/Content/NavigationMenu/ACRL/Publications/White_Papers_and_Reports/Task_Force_on_Academic_Library_Outcomes_Assessment_Report.htm
Brown, Karen, & Malenfant, Kara J. 2012. Connect, collaborate, and communicate: A report from the Value of Academic Libraries Summits. Chicago, IL: Association of College & Research Libraries. http://www.acrl.ala.org/value
Connaway, Lynn S., Johnson, Debra W., & Searing, Susan. 1997. Online catalogs from the users’ perspective: The use of focus group interviews. College and Research Libraries, 58(5), 403-420.
Connaway, Lynn S., & Radford, Marie L. 2011. Seeking synchronicity: Revelations and recommendations for virtual reference. Dublin, OH: OCLC Research. Retrieved from http://www.oclc.org/reports/synchronicity/full.pdf
Connaway, Lynn S., & Powell, Ronald R. 2010. Basic research methods for librarians (5th ed.). Westport, CT: Libraries Unlimited.
Connaway, Lynn S., & Wakeling, Simon. 2012. To use or not to use WorldCat.org: An international perspective from different user groups. OCLC internal report.
Dervin, Brenda, Connaway, Lynn S., & Prabha, Chandra. 2003-2006. Sense-making the information confluence: The whys and hows of college and university user satisficing of information needs. Funded by the Institute of Museum and Library Services (IMLS). http://www.oclc.org/research/activities/past/orprojects/imls/default.htm
Flanagan, John C. 1954. The critical incident technique. Washington, DC: American Psychological Association.
Geertz, Clifford. 1973. The interpretation of cultures: Selected essays. New York: Basic Books.
Hernon, Peter, & Altman, Ellen. 1998. Assessing service quality: Satisfying the expectations of library customers. Chicago, IL: American Library Association.
Kaufman, Paula, & Watstein, Sarah Barbara. 2008. Library value (return on investment, ROI) and the challenge of placing a value on public services. Reference Services Review, 36(3), 226-231.
Khoo, Michael, Rozaklis, Lily, & Hall, Catherine. 2012. A survey of the use of ethnographic methods in the study of libraries and library users. Library and Information Science Research, 34(2), 82-91.
Lederman, Linda C. 1996. Asking questions and listening to answers: A guide to using individual, focus group, and debriefing interviews. Dubuque, IA: Kendall/Hunt.
Oakleaf, Megan J. 2010. The value of academic libraries: A comprehensive research review and report. Chicago, IL: Association of College and Research Libraries, American Library Association.
QSR International. 2011. NVivo 9: Getting started. Retrieved from http://download.qsrinternational.com/Document/NVivo9/NVivo9-Getting-Started-Guide.pdf
White, David S., & Connaway, Lynn S. 2011-2012. Visitors and residents: What motivates engagement with the digital information environment. Funded by JISC, OCLC, and Oxford University. Retrieved from http://www.oclc.org/research/activities/vandr/
Zweizig, Douglas, Johnson, Debra W., Robbins, Jane, & Besant, Michele. 1996. The tell it! manual. Chicago, IL: ALA.
Lynn Silipigni Connaway, Ph.D.
Senior Research Scientist, OCLC Research
Vice-chair, ACRL Value of Academic Libraries Committee
@LynnConnaway
connawal@oclc.org