Misinformation and disinformation in statistical methodology for social sciences: causes, consequences and remedies

  • Giulio Giacomo Cantone
  • Venera Tomaselli

This chapter is an introductory survey of misinformative and fraudulent statistical inference, in light of recent attempts to reform the social sciences. It focuses on the concept of replicability, that is, the likelihood that a scientific result will be reached by two independent sources. Replication studies are often ignored, and most scientific interest goes to papers presenting theoretical novelties. As a result, replicability turns out to be uncorrelated with bibliometric performance, which often reflects only the popularity of a theory, not its validity. These topics are illustrated through two case studies of very popular theories. Statistical errors and bad practices are discussed, in particular the consequences of omitting inconclusive results from a paper, or 'p-hacking'. Among the remedies, the practice of preregistration is presented, along with attempts to reform peer review through it. Finally, multiversal theory and methods are discussed as tools for measuring the sensitivity of a scientific theory to misinformation and disinformation.
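The inflation of false positives that the abstract attributes to p-hacking can be made concrete with a small simulation. The sketch below is illustrative only and not taken from the chapter: the sample size, the number of alternative specifications, and the significance threshold are all assumed values, and the alternative specifications are modeled as independent tests, which gives an upper-bound flavor of the effect.

```python
import random

random.seed(1)

def sample_mean(n):
    # Draw n standard-normal observations: the true effect is zero
    # by construction, so any "finding" is a false positive.
    return sum(random.gauss(0.0, 1.0) for _ in range(n)) / n

def z_significant(n, threshold=1.96):
    # Two-sided z-test of "mean != 0" at the nominal 5% level
    # (known unit variance, so z = mean * sqrt(n)).
    return abs(sample_mean(n)) * n ** 0.5 > threshold

trials = 2000

# Honest analyst: one pre-specified test per study.
honest = sum(z_significant(30) for _ in range(trials)) / trials

# 'p-hacking' analyst: tries 10 specifications per study and
# reports a finding if any one of them crosses the threshold.
def hacked_study(specs=10):
    return any(z_significant(30) for _ in range(specs))

hacked_rate = sum(hacked_study() for _ in range(trials)) / trials

print(f"honest false-positive rate:   {honest:.3f}")      # near the nominal 0.05
print(f"p-hacked false-positive rate: {hacked_rate:.3f}") # several times higher
```

With 10 independent specifications the expected hacked rate is about 1 - 0.95^10, roughly 0.40, which is why preregistration (committing to one specification in advance) and multiverse analysis (reporting all specifications) are presented as remedies.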

  • Keywords:
  • replication crisis
  • research evaluation
  • p-hacking
  • preregistration
  • multiverse analysis

Giulio Giacomo Cantone

University of Catania, Italy - ORCID: 0000-0001-7149-5213

Venera Tomaselli

University of Catania, Italy - ORCID: 0000-0002-2287-7343

PDF
  • Year of publication: 2023
  • Pages: 53-58

XML
  • Year of publication: 2023

Chapter information

Chapter title

Misinformation and disinformation in statistical methodology for social sciences: causes, consequences and remedies

Authors

Giulio Giacomo Cantone, Venera Tomaselli

Language

English

DOI

10.36253/979-12-215-0106-3.10

Peer-reviewed work

Year of publication

2023

Copyright

© 2023 Author(s)

License

CC BY 4.0

Metadata license

CC0 1.0

Bibliographic information

Book title

ASA 2022 Data-Driven Decision Making

Book subtitle

Book of short papers

Editors

Enrico di Bella, Luigi Fabbris, Corrado Lagazio

Peer-reviewed work

Year of publication

2023

Copyright

© 2023 Author(s)

License

CC BY 4.0

Metadata license

CC0 1.0

Publisher

Firenze University Press, Genova University Press

DOI

10.36253/979-12-215-0106-3

eISBN (pdf)

979-12-215-0106-3

eISBN (xml)

979-12-215-0107-0

Series

Proceedings e report

Series ISSN

2704-601X

Series e-ISSN

2704-5846

