Slide 1
Collaborative student projects
Katherine Button
Department of Psychology
Grassroots Training for Reproducible Science
Slide 2: The wider reproducibility problem
Low statistical power
Poor control for bias
Questionable research practices
Slide 3: Low statistical power
49 meta-analyses, 730 studies, median power = 21%
Low power increases the risk of:
Type II errors (false negatives)
Type I errors (false positives)
Biased effect size estimates
Slide 4: Undisclosed flexibility
Slide 5: A competitive system…
Van Dijk et al. (2014). Current Biology, 24, R516–R517
Slide 6: …rewarding positive results
Munafò et al. (2009). Molecular Psychiatry, 14, 119–120
p < 0.001
Slide 7: The problem magnified… student projects
Multiple projects, poor resources: time, money, access to participants
Assessment criteria often focus on individual contributions, creativity and novelty
The potential result: multiple studies which are small, underpowered, poorly designed, testing novel hypotheses rather than replication, and analysed with undisclosed flexibility
Slide 8: Student projects, power and false positives
Suppose:
In 90% of student projects the null hypothesis is true
The significance level is set at 5%
The average power of studies is 20%
Then, in 100 UG studies, 10 true associations exist and we will detect 2 of them (20% power)
Of the remaining 90 non-associations, we will falsely declare 4–5 (5%) significant
Now suppose "significant" findings get submitted for publication: roughly 2 in 3 will be false positives
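The arithmetic above can be sketched in a few lines of Python (the function name and the `prior_true` framing are illustrative, not from the slides):

```python
def false_positive_share(n_studies=100, prior_true=0.10, power=0.20, alpha=0.05):
    """Share of 'significant' findings that are false positives."""
    true_effects = n_studies * prior_true       # 10 real associations
    true_positives = true_effects * power       # 2 detected at 20% power
    nulls = n_studies * (1 - prior_true)        # 90 null associations
    false_positives = nulls * alpha             # 4.5 falsely declared significant
    return false_positives / (false_positives + true_positives)

print(round(false_positive_share(), 2))  # 0.69: roughly 2 in 3 are false positives
```

Raising power or lowering alpha shrinks this share, which is the motivation for the consortium design described later.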
Slide 9: Sending the wrong message…
The current system rewards (false) positive results and novelty over rigorous methods and replication
Lack of scientific rigour increases (false) positive results:
Low statistical power
Lack of control for bias
Questionable research practices + lack of transparency
We need to align training and incentives for career progression with scientific rigour, not results
Slide 10: Solutions
Slide 11: Solutions
Clinical trials: protocol and analysis plan pre-registration; specifying the primary outcome variable; CONSORT reporting guidelines
GWAS consortia: pooling resources to maximise sample size; large-scale collaboration; variable harmonisation
Systematic reviews: transparent and clear reporting of data and methods
Slide 12: Solutions: A manifesto for reproducibility
Button et al. (2013). Nature Reviews Neuroscience, 14, 365–376
Munafò et al. (2017). Nature Human Behaviour, 1, 0021
Slide 13: Pre-registration
Registers the study (with a commitment to make the results public) and closely pre-specifies the study design, primary outcome and analysis plan
Reduces publication bias and makes research discoverable
Reduces outcome switching, p-hacking and HARKing, as decisions remain data-independent
Distinguishes confirmatory from exploratory analyses
Front-loads thinking into design: excellent training!
Slide 14: But how to square the circle…
Larger samples, pre-registration, and preparing data for publication all take extra time and resources… but student projects are time-limited, poorly resourced, and the assessment criteria focus on individual contributions and novelty.
Slide 15: A potential solution
Instilling scientific rigor at the grassroots: feasibility of a novel multi-centre methodology for undergraduate psychology projects
Working collaboratively across institutions to improve the quality of research training in open science and the quality of the research outputs (current and future)
Collaboration to increase power, generalisability and transparency; pre-registration, open data, open resources; design to allow individual assessment
Button et al. (2016). The Psychologist, 29, 158–167
Slide 16: GW4 Consortium 2016/17: Project structure
Slide 17: GW4 Consortium 2017/18: Project structure
Slide 18: GW4 Consortium 2018/19: Project structure
Slide 19: Squaring the circle to align training and assessment with rigour
Slide 20: BPS Accreditation
Slide 21: 2016/17 Project: Replication and extension
Slide 22: Solutions
2016/17: sample size based on the treatment effect from Lawrence et al. (2015), n = 106
Each student contributed their own moderator hypothesis (secondary), n = 412
…We reached n = 238: underpowered for moderators but sufficient for replication; data collection is ongoing
Button et al. (2013). Nature Reviews Neuroscience, 14, 365–376
Slide 23: Solutions
2017/18: design simplified to a single session; sample size based on published effect sizes adjusted for publication bias: 149 participants, 90% power, alpha = 0.005
Each student contributed their own moderator hypothesis or secondary outcome
…We finished recruitment in February
Button et al. (2013). Nature Reviews Neuroscience, 14, 365–376
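For context, a sample size targeting a given power and alpha can be sketched with the standard normal-approximation formula for a two-sided, two-sample comparison. This is a minimal sketch, not the consortium's actual calculation, and the effect size d = 0.6 below is an assumed, illustrative value (the slides do not give the bias-adjusted effect size):

```python
from statistics import NormalDist

def n_per_group(d, alpha=0.005, power=0.90):
    """Approximate per-group n for a two-sided, two-sample comparison,
    via the normal approximation: n = 2 * ((z_{1-a/2} + z_{1-b}) / d)**2."""
    z = NormalDist().inv_cdf
    return 2 * ((z(1 - alpha / 2) + z(power)) / d) ** 2

# d = 0.6 is a hypothetical effect size, not the study's actual value
print(round(n_per_group(0.6)))  # roughly 93 per group under these assumptions
```

Note how the stricter alpha (0.005 rather than 0.05) and the 90% power target both push the required n upward, which is exactly the resource problem pooling across sites is meant to solve.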
Slide 24: Solutions
Collectively written protocol and a priori statistical analysis plan published on the Open Science Framework before data collection started: https://osf.io/7h6w3/
Each student contributed their own moderator hypothesis and analysis plan, qualifying for authorship
Students can add the DOI to their CV
Button et al. (2013). Nature Reviews Neuroscience, 14, 365–376
Slide 25: Solutions
Students follow the statistical plan as set out in the protocol, or explain why they have used different methods (i.e., frequentist).
This provides an audit trail and discourages QRPs (both conscious and unconscious!)
Button et al (2013). Nature Reviews Neuroscience, 14, 365 – 376
Slide 26: What did the students think?
Motivations:
"Working with students from different universities"
"I liked the idea of there being pre-registration of the method to foster transparency, and thought it would be interesting to meet other students who had different knowledge"
"The large amount of support available, working with other universities"
Reservations:
"I thought it might be difficult to incorporate everyone's opinions into the project and make sure everything was communicated clearly between all the researchers."
"I was slightly concerned I would not have much choice in what I focused on in terms of hypotheses, but this was not the case as I was given the chance to test my own moderator hypothesis."
Slide 27: Advantages?
"…additional sources of participants and data, which helped prevent the study from being underpowered. Additionally, a group project shared the responsibilities of research administrative tasks, so the research started more promptly and easily than it would have if the project was a solo one."
"Working with others to get a bigger data set, meaning the study had more power. Meeting others and sharing ideas."
"Recruited a larger number of people than if the dissertation was done alone. Interesting to have insights from a variety of people, and so if the supervisor was not specialised in the field then the other universities could provide support."
"Lots of support from peers. Topics seem to be more interesting (I personally think) than the average dissertation topic, probably due to more people to help collect data so projects can be more ambitious."
Slide 28: Disadvantages?
"Having slightly less say in the design"
"Communication between universities can be a little difficult sometimes, as you have fewer face-to-face meetings"
"People relied on me, bit of pressure. Lots to remember/attend."
"It was sometimes confusing when researchers had different ideas, e.g. data analysis. Also, I found it difficult to cover all parts of the consortium project due to the limited word count of my dissertation."
"Some aspects of joining a complex study after it's been agreed, instead of building it up from the ground, have made it difficult to comprehend at times. Additionally, differing deadlines for work have put different levels of data into all projects."
"Differences in deadlines, and so sometimes there wasn't the urgency that would be more helpful for some universities."
"Waiting for things to go through different universities, like the ethics and pre-registration."
Slide 29: What would you say to students considering a consortium project?
“Do it, it'll be an invaluable experience”
“It is really exciting to know you are part of a bigger research project where you get to work with current researchers in the field”
“To be aware of both the advantages and disadvantages of working with large, multi-site and multi-student projects.”
“The opportunity to work in a research team across multiple universities is invaluable, especially for experimental studies in which time and access to samples can be limited.”
“It's definitely worthwhile, you feel like a supported member of a team”
Slide 30: Concluding remarks: Research culture
There are solutions to the problems of reproducibility, but rigorous research takes more time and resources
Systemic change is happening among funders, publishers and researchers
We need to prepare our students for open science!
Slide 31: Concluding remarks: Grassroots change
Consortium studies offer a way to tackle these issues
Students participate and train in rigorous methods from the start
Instilling best practice at the grassroots to change scientific culture from the ground up!
Slide32Acknowledgements
Marcus
Munafò University of BristolChris Chambers Cardiff UniversityNatalia Lawrence University of ExeterRachel Adams Cardiff University Loukia Tzavella Cardiff UniversityKatie Hobbs University of Bath GW4 UG Psychology Consortium Matt Garner University of SouthamptonGabby Haeems University of SouthamptonBath-Soton UG Psychology ConsortiumDorothy Bishop, University of Oxford
K.S.Button@bath.ac.uk
@
ButtonKate
Slide 33: Solutions
Publish final datasets on university data repositories or the OSF
Make study materials available on the OSF
Button et al. (2013). Nature Reviews Neuroscience, 14, 365–376