Supporting materials

Reproducible Social Research
Download slides
Download transcript

Recommended reading

  • Baker, M. (2016). 1,500 scientists lift the lid on reproducibility. Nature, 533(7604), 452-454. doi:10.1038/533452a
  • Burlig, F. (2018). Improving transparency in observational social science research: A preanalysis plan approach. Economics Letters, 168, 56-60.
  • Christensen, G., Freese, J., & Miguel, E. (2019). Transparent and reproducible social science research: How to do open science. Oakland, California: University of California Press.
  • Claerbout, J. (1994). Seventeen years of super computing and other problems in seismology. Paper presented at the National Research Council meeting on High Performance Computing in Seismology.
  • Connelly, R., & Gayle, V. (2019). An investigation of social class inequalities in general cognitive ability in two British birth cohorts. The British Journal of Sociology, 70(1), 90-108.
  • Connelly, R., Gayle, V., & Lambert, P. S. (2016). Modelling key variables in social science research: Introduction to the special section. Methodological Innovations, 9. doi:10.1177/2059799116637782
  • Foster, E. D., & Deardorff, A. (2017). Open science framework (OSF). Journal of the Medical Library Association: JMLA, 105(2), 203.
  • Franco, A., Malhotra, N., & Simonovits, G. (2014). Publication bias in the social sciences: Unlocking the file drawer. Science, 345(6203), 1502-1505.
  • Gayle, V. J., & Lambert, P. S. (2017). The workflow: A practical guide to producing accurate, efficient, transparent and reproducible social survey data analysis. Southampton: National Centre for Research Methods.
  • Janz, N. (2016). Bringing the gold standard into the classroom: replication in university teaching. International Studies Perspectives, 17(4), 392-407.
  • Kerr, N. L. (1998). HARKing: Hypothesizing after the results are known. Personality and Social Psychology Review, 2(3), 196-217.
  • King, G. (2003). The future of replication. International Studies Perspectives, 4(1), 100-105.
  • Knuth, D. E. (1984). Literate programming. The Computer Journal, 27(2), 97-111.
  • Knuth, D. E. (1992). Literate programming. CSLI Lecture Notes. Stanford, CA: Center for the Study of Language and Information (CSLI).
  • Long, J. S. (2009). The workflow of data analysis using Stata. College Station, TX: Stata Press.
  • Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S., Breckler, S., . . . Christensen, G. (2016). Transparency and openness promotion (TOP) guidelines. Retrieved from https://osf.io/vj54c/
  • Nosek, B. A., & Lakens, D. (2014). Registered Reports: A method to increase the credibility of published results. Social Psychology, 45(3), 137-141.
  • Rubin, M. (2017). When does HARKing hurt? Identifying when different types of undisclosed post hoc hypothesizing harm scientific progress. Review of General Psychology, 21(4), 308-320.
  • Silberzahn, R., Uhlmann, E. L., Martin, D. P., Anselmi, P., Aust, F., Awtrey, E., . . . Bonnier, E. (2018). Many analysts, one data set: Making transparent how variations in analytic choices affect results. Advances in Methods and Practices in Psychological Science, 1(3), 337-356.
  • Wilkinson, M. D., Dumontier, M., Aalbersberg, I. J., Appleton, G., Axton, M., Baak, A., . . . Bourne, P. E. (2016). The FAIR Guiding Principles for scientific data management and stewardship. Scientific Data, 3, 160018.