“Assuming a realistic range of prior probabilities for null hypotheses, false report probability is likely to exceed 50% for the whole literature.”

– Szucs & Ioannidis (2017)
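The arithmetic behind this claim can be sketched directly. A minimal R function (priors and power values below are illustrative assumptions, not estimates from any particular study) computes the false report probability from the prior probability that a tested hypothesis is true, the significance threshold, and statistical power:

```r
# False report probability: the share of "significant" findings that are
# actually false positives, given the prior probability that a tested
# hypothesis is true. Priors and power below are illustrative assumptions.
frp <- function(prior, alpha = 0.05, power = 0.8) {
  true_pos  <- power * prior        # true effects correctly detected
  false_pos <- alpha * (1 - prior)  # null effects significant by chance
  false_pos / (true_pos + false_pos)
}

frp(prior = 0.5)               # ~0.06: even odds, well-powered
frp(prior = 0.1)               # ~0.36: long-shot hypotheses
frp(prior = 0.1, power = 0.2)  # ~0.69: long shots plus low power
```

With low prior odds and low power, more than half of significant results can be false, which is the scenario the quotation describes.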
| Response | Percent |
|---|---|
| No reply | 41% |
| Refused/unable to share data | 18% |
| No data despite promise | 4% |
| Data shared after reminder | 16% |
| Data shared after 1st request | 22% |
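Entering the table above as a data frame makes the headline number easy to total (labels and percentages copied from the table; the column sums to 101 because of rounding):

```r
# Data-sharing outcomes as percentages of requests (from the table above).
sharing <- data.frame(
  response = c("No reply", "Refused/unable to share data",
               "No data despite promise", "Data shared after reminder",
               "Data shared after 1st request"),
  percent  = c(41, 18, 4, 16, 22)
)
shared <- sum(sharing$percent[grepl("^Data shared", sharing$response)])
shared  # 38: fewer than 2 in 5 requests ultimately yielded data
```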
Bem, D.J. (2011). Experimental evidence for anomalous retroactive influences on cognition and affect. Journal of Personality and Social Psychology, 100(3), 407-425.
“This article reports 9 experiments, involving more than 1,000 participants, that test for retroactive influence by ‘time-reversing’ well-established psychological effects so that the individual’s responses are obtained before the putatively causal stimulus events occur.”
“We argue that in order to convince a skeptical audience of a controversial claim, one needs to conduct strictly confirmatory studies and analyze the results with statistical tests that are conservative rather than liberal…”
“We conclude that Bem’s p values do not indicate evidence in favor of precognition; instead, they indicate that experimental psychologists need to change the way they conduct their experiments and analyze their data.”
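One way to see why "conservative rather than liberal" tests matter: the Sellke–Bayarri–Berger lower bound, BF ≥ −e · p · log(p) (valid for p < 1/e), caps how much evidence a p-value can ever carry against the null. A quick base-R check (this bound is a general result, not Wagenmakers et al.'s specific analysis):

```r
# Lower bound on the Bayes factor for H0 given a p-value (Sellke et al.):
# even a just-significant result is at most modest evidence against the null.
min_bf <- function(p) -exp(1) * p * log(p)  # valid for p < 1/e

min_bf(0.05)  # ~0.41: odds against H0 at most about 2.5 : 1
min_bf(0.01)  # ~0.13: at most about 8 : 1
```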
“…psychologists tend to treat other people’s theories like toothbrushes; no self-respecting individual wants to use anyone else’s.”
“The toothbrush culture undermines the building of a genuinely cumulative science, encouraging more parallel play and solo game playing, rather than building on each other’s directly relevant best work.”
“Reviewers and editors want novel, interesting results. Why would I waste my time doing careful direct replications?”
“Reviewing papers is hard, unpaid work. If I have to check someone’s stats, too, I’ll quit.”
– Any number of researchers I’ve talked with
But much of the (lab-based) data collected comes from Western, Educated, Industrialized, Rich, Democratic (WEIRD) populations
“We conducted replications of 100…studies published in three psychology journals using high-powered designs and original materials when available….The mean effect size (r) of the replication effects …was half the magnitude of the mean effect size of the original effects…”
“Ninety-seven percent of original studies had significant results (P < .05). Thirty-six percent of replications had significant results.”
“39% of effects were subjectively rated to have replicated the original result…”
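A small simulation (parameters assumed for illustration, not fitted to the OSC data) shows how an underpowered but real effect produces exactly this pattern: many significant originals whose direct replications miss p < .05.

```r
# Simulate original/replication pairs of two-group experiments with a true
# effect of d = 0.4 and n = 30 per group (assumed values; power is ~.33).
set.seed(1)
d <- 0.4; n <- 30; pairs <- 5000
p_once <- function() t.test(rnorm(n, mean = d), rnorm(n))$p.value
orig <- replicate(pairs, p_once())
repl <- replicate(pairs, p_once())

# Of the originals that were significant, what fraction of replications are?
mean(repl[orig < .05] < .05)  # roughly one-third at these settings
```

The conditional replication rate is just the study's power, so low-powered literatures are expected to show low replication rates even when every effect is real.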
This talk was produced on 2021-10-19 in RStudio using R Markdown. The code and materials used to generate the slides may be found at https://github.com/gilmore-lab/2021-10-19-bs-in-science/. Information about the R Session that produced the code is as follows:
```
## R version 4.1.0 (2021-05-18)
## Platform: x86_64-apple-darwin17.0 (64-bit)
## Running under: macOS Big Sur 11.6
##
## Matrix products: default
## LAPACK: /Library/Frameworks/R.framework/Versions/4.1/Resources/lib/libRlapack.dylib
##
## locale:
## [1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
##
## attached base packages:
## [1] stats     graphics  grDevices utils     datasets
## [6] methods   base
##
## other attached packages:
## [1] DiagrammeR_1.0.6.1
##
## loaded via a namespace (and not attached):
##  [1] Rcpp_1.0.7         knitr_1.33
##  [3] servr_0.23         magrittr_2.0.1
##  [5] R6_2.5.0           jpeg_0.1-9
##  [7] rlang_0.4.11       highr_0.9
##  [9] stringr_1.4.0      visNetwork_2.1.0
## [11] tools_4.1.0        websocket_1.4.1
## [13] xfun_0.24          png_0.1-7
## [15] jquerylib_0.1.4    htmltools_0.5.1.1
## [17] yaml_2.2.1         digest_0.6.27
## [19] processx_3.5.2     RColorBrewer_1.1-2
## [21] later_1.2.0        htmlwidgets_1.5.3
## [23] sass_0.4.0         promises_1.2.0.1
## [25] ps_1.6.0           mime_0.11
## [27] glue_1.4.2         evaluate_0.14
## [29] rmarkdown_2.9      stringi_1.7.3
## [31] compiler_4.1.0     bslib_0.2.5.1
## [33] jsonlite_1.7.2     pagedown_0.15
## [35] httpuv_1.6.1
```
Baker, M. (2016). 1,500 scientists lift the lid on reproducibility. Nature News, 533(7604), 452. https://doi.org/10.1038/533452a
Camerer, C. F., Dreber, A., Holzmeister, F., Ho, T.-H., Huber, J., Johannesson, M., … Wu, H. (2018). Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nature Human Behaviour, 1. https://doi.org/10.1038/s41562-018-0399-z
Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. https://doi.org/10.1126/science.aac4716
Goodman, S. N., Fanelli, D., & Ioannidis, J. P. A. (2016). What does research reproducibility mean? Science Translational Medicine, 8(341), 341ps12–341ps12. https://doi.org/10.1126/scitranslmed.aaf5027
LaCour, M. J., & Green, D. P. (2014). When contact changes minds: An experiment on transmission of support for gay equality. Science, 346(6215), 1366–1369. https://doi.org/10.1126/science.1256151
Mischel, W. (2011). Becoming a cumulative science. APS Observer, 22(1). Retrieved from https://www.psychologicalscience.org/observer/becoming-a-cumulative-science
Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Sert, N. P. du, … Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1, 0021. https://doi.org/10.1038/s41562-016-0021
Nuijten, M. B., Hartgerink, C. H. J., Assen, M. A. L. M. van, Epskamp, S., & Wicherts, J. M. (2015). The prevalence of statistical reporting errors in psychology (1985–2013). Behavior Research Methods, 1–22. https://doi.org/10.3758/s13428-015-0664-2
Szucs, D., & Ioannidis, J. P. A. (2017). Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature. PLoS Biology, 15(3), e2000797. https://doi.org/10.1371/journal.pbio.2000797
Vanpaemel, W., Vermorgen, M., Deriemaecker, L., & Storms, G. (2015). Are we wasting a good crisis? The availability of psychological research data after the storm. Collabra, 1(1). https://doi.org/10.1525/collabra.13
Wagenmakers, E.-J., Wetzels, R., Borsboom, D., & Maas, H. L. J. van der. (2011). Why psychologists must change the way they analyze their data: The case of psi: Comment on Bem (2011). Journal of Personality and Social Psychology, 100(3), 426–432. https://doi.org/10.1037/a0022790
Wicherts, J. M., Borsboom, D., Kats, J., & Molenaar, D. (2006). The poor availability of psychological research data for reanalysis. American Psychologist, 61(7), 726–728. https://doi.org/10.1037/0003-066X.61.7.726