
Accounting for Impact Evaluation - PowerPoint Presentation



Presentation Transcript

Slide1

Accounting for Impact
Evaluation practices and the Effects of Indicator Use

Dr. Sarah de Rijcke
Centre for Science and Technology Studies (CWTS)
Copenhagen, 10 March 2016
Slide2

Formative effects of evaluation

Research projects
Teaching
Interventions & debate
Slide3

KNOWSCIENCE project
Open Data project
Slide4

Slide5

Increasing importance of metrics

Strong tensions with central goals of European and national research policies to foster excellent and collaborative, socially responsible, and societally relevant science
Slide6

Slide7

Four main problems

The funding system
The career structure
The publication system
The evaluation system

Slide credit: Paul Wouters (CWTS)
Slide8

Discrepancy between evaluation criteria and the social, cultural and economic functions of science

The 'Evaluation Gap'

Wouters, P.F. A key challenge: the evaluation gap. Blog post, August 28, 2014. citationculture.wordpress.com
Slide9

Literature review on effects of indicators (De Rijcke et al. 2015, Research Evaluation)

Three possible consequences:
Goal displacement
Task reduction
Changes in relations between government and institutions

Performance-based funding (PBF) in national systems does trickle down to the institutional and individual level
Slide10

Slide11

Thinking with Indicators in the Life Sciences

Sarah de Rijcke
Ruth Müller (TU Munich)
Alex Rushforth (CWTS)
Paul Wouters (CWTS)
Slide12

One indicator: the Journal Impact Factor

Three steps in knowledge production:
Planning research
Collaboration and authorship practices
Assessing work-in-progress manuscripts

Rushforth & De Rijcke (2015)
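
For readers unfamiliar with the indicator (this background is not on the original slide): the two-year Journal Impact Factor of a journal in year $Y$ is commonly defined as

\[
\mathrm{JIF}_Y \;=\; \frac{C_{Y-1} + C_{Y-2}}{N_{Y-1} + N_{Y-2}},
\]

where $C_{Y-k}$ is the number of citations received in year $Y$ by items the journal published in year $Y-k$, and $N_{Y-k}$ is the number of citable items the journal published in year $Y-k$. The "seven-ish" mentioned in the fieldnote on Slide 19 refers to values of this ratio.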

Slide13

Theme: Planning research
Slide14

Planning Research
Selecting research questions

I: What would you say is an 'ideal' postdoc project?

PHD_2M: "One that gives me a good paper in a year! [laughter] Well, you can never entirely cancel out all risk. But what I mean with risk is how predictable it is what will come out of the research in a certain frame of time."
Slide15

Planning Research
Structuring research on the experimental level

'You already need to plan in the very beginning what papers you will be able to publish, which experiments do I need for that. It sounds way more calculating than you think when you naively start your research career. But you just have to focus on what's good for the papers.' (PDoc_6f, 1319)
Slide16

Theme: Collaboration and authorship practices

Andy Lamb. Co-authorship network map of physicians publishing on hepatitis C (detail).
https://www.flickr.com/photos/speedoflife/8274993170/
Slide17

Collaboration and authorship practices
Determining the safest bet

Respondent: I just had a discussion with [PhD student X] on a project that's never going to be high impact. But then we have the choice; either publish it in a lower journal, or forget about it. And then, of course, we're also practical and say, "Okay, we have to publish it."

Interviewer: Okay, yes. So you can decide whether to do more experiments on the basis of whether you think it stands a chance in a higher impact journal.

Respondent: Of course, but then if we stick to [same PhD] as an example, she also has projects that are running really well. And so then, my problem, or something that I have to decide, is: are we actually going to invest in that project that we don't think is very high impact, or are we going to try to publish it as it is, in a lower journal, so that she has all the time to work on the projects that are going well, and that do have an interesting set of results?

(PI Interview, Surgical Oncology, Institute B)
Slide18

Theme: Assessing work-in-progress manuscripts
Slide19

Assessing work-in-progress manuscripts
Grading for novelty and quality

PI goes to computer. "Any alternatives? Any journals?"

PhD: Hmm, maybe [Journal C]. They are similar in impact, right?

Post-doc: Yeah, seven-ish. It's difficult because some papers are descriptive and some have mechanism. So for this paper it could actually go one step higher than Journal C, because you're going a bit beyond description. They also have priority reports in [Journal B].

PI: [Journal D] also has very fast publishing periods from date of submission, if they like it of course.

(Fieldnote, 22 July 2014)
Slide20

Conclusions

Respondents' 'folk theories' (Rip 2006) of indicators have important epistemic implications:
Affecting the types of work researchers consider viable and interesting
Indicator considerations eclipsed other judgments about work-in-progress
Dominant way of attributing worth

What kinds of 'excellent' science does this result in?
Not incentivized to think about 'responsible research' or relevance to society
Slide21

We need new assessment models to bridge the evaluation gap
Slide22

A collaboration between Diana Hicks (Georgia Tech), Paul Wouters (CWTS), Ismael Rafols (SPRU/Ingenio), Sarah de Rijcke and Ludo Waltman (CWTS)
Slide23

Slide24

Slide25

https://vimeo.com/133683418
Slide26

www.leidenmanifesto.org
