The document discusses the reproducibility crisis in psychological science, noting several cases from 2011 that called research practices into question. The Open Science Collaboration attempted to replicate 100 psychology studies and found that only 36% yielded statistically significant results. The document recommends open science practices such as replication, transparent statistics, and data sharing, and argues for teaching and rewarding rigour over quantity or novelty: the field must change its incentives so that getting it right matters more than getting it published. Overall, the document analyzes the issues compromising reproducibility and proposes open science solutions to improve research integrity in psychology.
9. “…the field of psychology currently uses methodological and statistical strategies that are too weak, too malleable, and offer too many opportunities for researchers to befuddle themselves and their peers”
10. Each case raised unique questions about how science is conducted in psychology, from how research is planned right through to how data are analysed and published.
11. All cases pertain to growing concern over the number of false positives in the literature.
22. Recommendations
1. Replicate, replicate, replicate…
2. Know your statistics
3. Open your science
4. Incorporate open science practices in teaching
5. Reward open science practices
28. We have a professional responsibility to ensure the findings we are reporting are robust and replicable.
34. What does a p-value tell us?
a) the probability that the results are due to chance
b) the probability that the results are not due to chance
c) the probability of observing results as extreme (or more) as obtained if there is no effect in reality
d) the probability that the results would be replicated if the experiment was conducted a second time
36. True or False? The p-value tells us something about the size of an effect
37. True or False? The p-value tells us something about the importance of an effect
38. True or False? The p-value tells us something about the probability of our hypothesis
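Option (c) is the correct definition, and it can be checked directly by simulation. The sketch below is illustrative only (all names and numbers are hypothetical, assuming a one-sample test of a mean against zero with known unit variance): it estimates the probability of a result at least as extreme as an observed mean when no effect exists, and compares it with the analytic two-sided p-value.

```python
import math
import random

random.seed(1)

n = 30           # sample size (hypothetical)
obs_mean = 0.4   # a hypothetical observed sample mean

# Analytic two-sided p-value for the mean of n draws from N(0, 1):
# the standard error is 1/sqrt(n), so the z-score is obs_mean * sqrt(n).
z = obs_mean * math.sqrt(n)
p_analytic = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

# Simulate the null hypothesis: many experiments in which no effect exists.
trials = 20000
extreme = 0
for _ in range(trials):
    null_mean = sum(random.gauss(0, 1) for _ in range(n)) / n
    if abs(null_mean) >= obs_mean:   # result as extreme (or more) as obtained
        extreme += 1
p_sim = extreme / trials

print(f"analytic p = {p_analytic:.3f}, simulated p = {p_sim:.3f}")
```

The two numbers agree, and that agreement is all a p-value is: a statement about how often data this extreme arise when there is no effect. It says nothing about the size or importance of an effect, nor about the probability of the hypothesis.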
65. “I appreciate your results were unexpected, but in order to tell a nicer story, you should re-write your introduction as if you expected these results”
83. 5. Reward Open Science Practices
Good for Science:
- Truth seeking
- Rigour
- Quality
- Reproducibility
Good for Individuals/Institutions:
- Publishable
- Quantity
- Novelty
- Impact
84. “…the solution requires making the incentives for getting it right competitive with the incentives for getting it published” (Nosek et al., 2012)
93. Universe A & B:
• Investigating embodiment of political extremism
• Participants (N = 1,979!) from the political left, right, and center
• Moderates perceived shades of grey more accurately than left or right (p<.01).
95. Universe A & B: Moderates perceived shades of grey more accurately than left or right (p<.05).
96. Universe A
• Moderates perceived shades of grey more accurately than left or right (p<.01).
99. Universe B
• Moderates perceived shades of grey more accurately than left or right (p<.01).
• Surprised by the effect, the researcher tries to replicate the result before publishing
– Uses an even larger sample size than Study 1
• The replication fails to reproduce the effect
– No publication
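Universe B's experience is exactly what to expect when a "significant" finding is a false positive. The sketch below is a hedged illustration (hypothetical numbers, assuming a one-sample test of a mean against zero where the true effect is exactly zero): it simulates many small studies, keeps the ones that reach p < .05, and checks how many survive a larger pre-publication replication.

```python
import math
import random

random.seed(7)

def p_value(n):
    """Two-sided p for the mean of n draws from N(0, 1): no true effect."""
    m = sum(random.gauss(0, 1) for _ in range(n)) / n
    z = abs(m) * math.sqrt(n)
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

# Universe A publishes every small study that reaches p < .05.
# Universe B re-runs each such study with a larger sample before publishing.
originals = [p_value(30) for _ in range(2000)]
false_pos = [p for p in originals if p < 0.05]
replicated = sum(p_value(120) < 0.05 for _ in false_pos)

print(f"{len(false_pos)} 'significant' findings despite no real effect")
print(f"{replicated} survived the larger replication")
```

Because the true effect is zero, only around 5% of the original "discoveries" replicate, no matter how large the replication sample: Universe B's failed replication is the expected outcome, and its decision not to publish keeps a false positive out of the literature.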
102. There is something wrong with hiring decisions if “getting it published” is rewarded more than “getting it right”
103. (Utopian) Ideas for Hiring Committees: Look for evidence of open science practice
104. (Utopian) Ideas for Hiring Committees: Have open science practice as a “desired” (or “essential”!) item on the job specification
105. (Utopian) Ideas for Hiring Committees: Judge publication quality rather than quantity
107. Recommendations
1. Replicate, replicate, replicate…
2. Know your statistics
3. Open your science
4. Incorporate open science practices in teaching
5. Reward open science practices