Slide 1
Here Comes the Sunburst:
Measuring and Visualizing Scholarly Impact
John Barnett
Scholarly Communications Librarian
Jennifer Chan
Assistant Scholarly Communications Librarian
Office of Scholarly Communication and Publishing
University Library System
University of Pittsburgh

Slide 2
Here Comes the Sunburst: Measuring and Visualizing Scholarly Impact

Slide 3
University of Pittsburgh
Pittsburgh campus + regional campuses in Bradford, Greensburg, Johnstown, and Titusville
16 undergraduate, graduate, and professional schools
456+ degree programs
2012: conferred 8,949 degrees

Slide 4
University of Pittsburgh
Top 10 among American higher-ed institutions in federal funding (NSF)
Top 5 in annual research support (NIH)
5,369 faculty; 4,470 full-time faculty
Research conducted in more than 300 centers, institutes, laboratories, and clinics

Slide 5
University Library System
ARL member
22nd largest academic library system in North America
25 libraries; 6.6 million volumes
279,000 current serials

Slide 6
Sum of the Parts

Slide 7
Why Pitt?
Strategic goal: innovation in scholarly communication
Providing services that scholars understand, need, and value
Putting ourselves in faculty "spaces"
Re-envisioning our librarian liaison program
Deepening our understanding of scholarly communications issues

Slide 8
Why PlumX?
Making research "more assessable and accessible"
Gathering information in one place
Making it intelligible and useful
Measuring and visualizing research impact
Correlating metrics from traditional and new forms of scholarly communication
Allowing researchers, labs, departments, and institutions to track real-time scholarly impact
Promoting research, comparing with peers, connecting with new research

Slide 9
Altmetrics Project Timeline

Slide 10
Pilot project aims
Develop a tool for measuring and visualizing research impact
Gather information in one place
Make it intelligible and useful
Capture impact in social media and other scholarly communication channels
Count traditional measures as well
See where to disseminate works to increase impact

Slide 11
Traditional vs. new
Traditional measures are also counted
Findings are complementary to conventional methods of measuring research impact (e.g., h-index)
Not intended to replace them

Slide 12
New measures
More comprehensive: altmetrics = ALL METRICS
Citations
Usage
Captures
Mentions
Social media
Covers impact of online behavior, because scholars increasingly work online
Measures impact immediately, because citation counts take years to appear in the literature

Slide 13
Pilot Process

Slide 14
Pilot project participants
32 researchers, various disciplines
9 schools
18 departments
1 complete research group
Others joined as they learned about the project

Slide 15
Pilot Project Participants

Slide 16
Technologies
Internal:
IR built on the EPrints platform
SharePoint
Microsoft Office suite
PMID/DOI data import tool
External:
PlumX
DOIs
PMIDs

Slide 17
Data collection for pilot project
Created records in D-Scholarship@Pitt, our institutional repository
Focused on articles, books, book chapters, proceedings
Scholarly output with standard identifiers: DOI, ISBN, PubMed ID, official URL, etc.
Scholarship produced since 2000

Slide 18
Other library work
Developed guidelines to standardize record creation
Data entry from faculty CVs into the IR (2 to 3 student workers, with QA by librarians)
Librarian liaisons and other staff trained in record creation
SharePoint site used to track completed work
Coordination with pilot faculty
Gathered feedback and administered an online survey

Slide 19
SharePoint
Altmetrics meeting minutes
Faculty CVs
Excel spreadsheets
Word docs

Slide 20
External Data Sources

Slide 21
Metadata sources
Faculty CVs . . . but verify the metadata!
Books: PittCat, WorldCat, Books in Print, publisher sites, online retailers
Journals: Serials Solutions list, journal websites, JournalSeek, UlrichsWeb, DOAJ, PubMed
Conference presentations: websites, PittCat, indexes, WorldCat

Slide 22
PMID import tool
Custom-built by our systems administrator for the EPrints platform
Using PMIDs from PubMed, the tool imports records that prepopulate metadata fields:
item type, title, abstract, creators, publication title, ISSN, volume/issue, page ranges, date and date type, DOI, MeSH headings, grant information, keywords, etc.

Slide 23
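[Editorial sketch] A hedged illustration of what such an import step could involve; this is not the actual Pitt tool, and the function names and field mapping are illustrative. A PMID can be resolved to full metadata through NCBI's public E-utilities efetch endpoint and then mapped onto EPrints-style fields:

```python
# Hypothetical sketch (not the actual Pitt import tool): resolve a PMID to
# PubMed XML via the public NCBI E-utilities efetch endpoint, then map a
# few fields onto repository-style metadata.
import urllib.request
import xml.etree.ElementTree as ET

EFETCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi"

def fetch_pubmed_xml(pmid: str) -> str:
    """Retrieve one PubMed record as XML (live network call)."""
    url = f"{EFETCH}?db=pubmed&id={pmid}&retmode=xml"
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

def map_record(xml_text: str) -> dict:
    """Map a PubmedArticle XML record onto a few EPrints-style fields."""
    root = ET.fromstring(xml_text)
    art = root.find(".//Article")
    return {
        "title": art.findtext("ArticleTitle"),
        "abstract": art.findtext("Abstract/AbstractText"),
        "publication": art.findtext("Journal/Title"),
        "issn": art.findtext("Journal/ISSN"),
        "doi": root.findtext(".//ArticleId[@IdType='doi']"),
    }

# Abridged sample in the PubMed XML shape, so the mapping can be
# demonstrated without a network call.
SAMPLE = """<PubmedArticleSet><PubmedArticle><MedlineCitation><Article>
<Journal><ISSN>1234-5678</ISSN><Title>Example Journal</Title></Journal>
<ArticleTitle>A Sample Article</ArticleTitle>
<Abstract><AbstractText>Short abstract.</AbstractText></Abstract>
</Article></MedlineCitation><PubmedData><ArticleIdList>
<ArticleId IdType="doi">10.1000/example</ArticleId>
</ArticleIdList></PubmedData></PubmedArticle></PubmedArticleSet>"""

record = map_record(SAMPLE)
```

fetch_pubmed_xml performs a live network call; the demonstration above runs entirely on the inline sample.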
Data Ingestion

Slide 24
Full-text sources
DOAJ
ERIC
PLOS
SSRN*
Other repositories*
Federal government websites*
Conference websites*
* Use with caution

Slide 25
Plum Analytics processing activities

Slide 26
Key features
Faculty profiles
Online 'artifacts': article, book, book chapter, video, etc.
Impact graph
Sunburst

Slide 27
Faculty profile

Slide 28

Online 'artifact' display

Slide 29

Impact graph

Slide 30

Sunburst

Slide 31
Feedback
Solicited via email and online survey
Generally positive
Data corrections: errors in profiles, links to wrong data; quickly corrected by Plum staff
Requests for results from additional online sources (Google Scholar, SlideShare, Reddit, etc.)
PlumX collects data from these sources but did not gather that information in advance for the pilot profiles

Slide 32
The survey says
Surveyed pilot project faculty in spring 2013
About one third responded: 13 of 32 participants

Slide 33
Accurate and useful data

Slide 34

The bar graph

Slide 35

The sunburst

Slide 36

Traditional & new measures

Slide 37

Usefulness of altmetrics

Slide 38

Learning something new

Slide 39
Comments
Affiliations/bio inaccurate or missing information
"Mentions": by whom, and when?
Publications misclassified (e.g., books vs. conference proceedings)
Data not collected from Google Scholar and SlideShare

Slide 40
Comments
Filter out unwanted information
Data are wrong, and therefore not useful
Overabundance of information in the sunburst
"I only care what a select group of scholars thinks of my work"
"I did not find this useful for my discipline"

Slide 41
Observations
Lacked information about faculty practices
Are the results useful to all faculty, all disciplines?
May appeal more to faculty who are early in their careers or whose work is more contemporary
Will the data be used against faculty or programs?
Labor-intensive strategy
When it comes down to it . . . does anyone care?

Slide 42
Embeddable widgets (in development)
For researchers, to add to:
their own web pages
department directories
IR researcher profile pages
For individual artifacts, to build article-level metrics for embedding in:
IR document abstract pages
article abstract pages for journals we publish

Slide 43
Roll-out challenges
Who creates profiles? Who edits?
What information should be included in profiles? Who can view them?
Separate data gathering from D-Scholarship deposits?
Who promotes the service? Who trains?
Timing . . .

Slide 44
Future plans
Data checking
Additional data gathering
Record merging/deduping
Ability to edit user profiles and artifact records locally
Open API to allow integration with other online systems
More exhaustive scholarly practices survey for all faculty
Rollout to all Pitt researchers, using an automatic feed from the Pitt IR to PlumX

Slide 45
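[Editorial sketch] One plausible mechanism for an automatic IR feed like the one planned above; this is an assumption, not the confirmed Pitt/Plum integration. EPrints repositories expose an OAI-PMH harvesting interface by default (conventionally at /cgi/oai2), so a downstream service could pull new records incrementally. The endpoint path and function names here are illustrative:

```python
# Hypothetical sketch of harvesting IR records over OAI-PMH (the standard
# protocol EPrints exposes); endpoint path and names are assumptions.
import urllib.parse
import xml.etree.ElementTree as ET

NS = {
    "oai": "http://www.openarchives.org/OAI/2.0/",
    "dc": "http://purl.org/dc/elements/1.1/",
}

def list_records_url(base_url: str, from_date: str) -> str:
    """Build a ListRecords request for Dublin Core records since from_date."""
    params = {"verb": "ListRecords", "metadataPrefix": "oai_dc", "from": from_date}
    return base_url + "?" + urllib.parse.urlencode(params)

def parse_records(xml_text: str) -> list:
    """Extract (identifier, title) pairs from an OAI-PMH ListRecords response."""
    root = ET.fromstring(xml_text)
    out = []
    for rec in root.iter("{http://www.openarchives.org/OAI/2.0/}record"):
        ident = rec.findtext(".//oai:identifier", namespaces=NS)
        title = rec.findtext(".//dc:title", namespaces=NS)
        out.append((ident, title))
    return out

# Abridged sample response, so parsing can be shown without a network call.
OAI_SAMPLE = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
<ListRecords><record><header>
<identifier>oai:d-scholarship.pitt.edu:12345</identifier>
</header><metadata>
<oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
 xmlns:dc="http://purl.org/dc/elements/1.1/">
<dc:title>A Sample Deposit</dc:title>
</oai_dc:dc></metadata></record></ListRecords></OAI-PMH>"""

records = parse_records(OAI_SAMPLE)
```

A harvester would request list_records_url(...) on a schedule, follow OAI-PMH resumption tokens for large result sets, and hand each record's identifiers (DOI, PMID) to the metrics service.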
Discussion
How would you "sell" PlumX to additional faculty?