Replication Session at the First Plenary Conference of the Institute for New Economic Thinking Young Scholars Initiative

From ReplicationWiki

The Institute for New Economic Thinking Young Scholars Initiative held its First Plenary Conference in Budapest, Hungary, October 19-22, 2016. On October 22, a session on replication took place from 10:00-12:00. The setup was as follows:

  • Andrew C. Chang (Board of Governors of the Federal Reserve System), Philipp Li (Office of the Comptroller of the Currency)
"Is Economics Research Replicable? Sixty Published Papers from Thirteen Journals Say 'Usually Not'"
Abstract
We attempt to replicate 67 macroeconomic articles published in 13 well-regarded economics journals using author-provided replication files that include both data and code. Some journals in our sample require data and code replication files, and other journals do not require such files. Aside from 6 papers that use confidential data, we obtain data and code replication files for 29 of 35 papers (83%) that are required to provide such files as a condition of publication, compared to 11 of 26 papers (42%) that are not required to provide data and code replication files. Defining replication success as our ability to use the author-provided data and code files to produce the key qualitative conclusions of the original paper, we successfully replicate 22 of 67 papers (33%) without contacting the authors. Excluding the 6 papers that use confidential data and the 2 papers that use software we do not possess, we replicate 29 of 59 papers (49%) with assistance from the authors. Because we are able to replicate less than half of the papers in our sample even with help from the authors, we assert that economics research is usually not replicable. We conclude with recommendations on improving replication of economics research.
"Transparency and Reproducibility in Economics - A Literature Review"
Abstract
There is growing interest in research transparency and reproducibility in economics and other scientific fields. We survey existing work on these topics within economics, and discuss the evidence suggesting that publication bias, inability to replicate, and specification searching remain widespread problems in the discipline. We next discuss recent progress in this area, including through improved research design, study registration and pre-analysis plans, disclosure standards, and open sharing of data and materials, and draw on experiences in both economics and other social sciences. We discuss areas where consensus is emerging on new practices, as well as approaches that remain controversial, and speculate about the most effective ways to make economics research more accurate, credible and reproducible in the future.


"Lessons learned from the 3ie replication experience"
Abstract
There is a growing acceptance that replication research is important for science and social science knowledge creation. Nonetheless, the culture around replication research is fraught with tensions. In 2012, the International Initiative for Impact Evaluation (3ie) launched a replication programme with the objective of improving the quality of impact evaluation evidence for development policy making and with the perhaps naïve belief that some of the tensions inherent with replication research could be reduced by having a third party play a role. In this paper, we begin by setting out the three main challenges for replication research: objectives and definitions, incentives for both original authors and replication researchers, and replication ethics. We explore how academic journals may be able to address these challenges and then discuss why they may not do so. We then describe the policies and processes that 3ie designed in order to address these challenges and share our lessons learned. We conclude that there has been progress, but that certain human limitations will continue to challenge the culture around replication research.


"Teaching Replication in Quantitative Empirical Economics"
Abstract
In empirical economics, a twofold lack of incentives leads to chronic problems with replicability: for authors of empirical studies, providing replicable material is not rewarded in the same way as publishing new, irreplicable studies is. Neither is authoring replication studies.

We offer a strategy to set incentives for replicability and replication. By integrating replication studies into the education of young scholars, we raise awareness of the importance of replicability among the next generation of researchers and ensure that a large number of scientists have incentives to write replication studies: credit points and the prospect of publication, at least as working papers,[1] already during their time as students. By raising the number of researchers involved in replication and by providing an infrastructure for sharing their information, we help, on the one hand, to lower the amount of work researchers need to put into making their studies replicable. On the other hand, we facilitate the dissemination of insights derived from replication studies. As a side effect, this imposes a significant threat of detection for irreplicable research, following the example of recently introduced wiki projects for the revelation of plagiarism.[2][3][4] In contrast to previous efforts such as the report on the American Economic Review Data Availability Compliance Project,[5] with our project we build the basis for the first replicable review paper on replicability, as we give an account of which studies were tested and which results were found in each case. After exploring several dozen studies published in highly ranked journals, we have not yet found a single case in which replicability was fully ensured.
We identified two main problems: First, not all published results can be obtained from the replication material provided. Second, information about how the used data were obtained from the raw data is hardly ever sufficient.
For our investigation, we gave seminars at several faculties. We set up a wiki project[6] to document the results of our replications as well as those found in the literature. In our database, we provide information about more than 2,500 empirical studies, especially with regard to the availability of material for their replication. We invite discussion to develop standards for how to make research replicable and how to write replication studies. To this end, we provide information about existing projects that facilitate the sharing of material for empirical econometric research.

References

  1. Institut für Statistik und Ökonometrie, Wirtschaftswissenschaftliche Fakultät, Georg-August-Universität Göttingen, Replication project Replication Working Papers
  2. GuttenPlag Wiki
  3. Vroniplag Wiki
  4. Freyplag Wiki
  5. "Report on the American Economic Review Data Availability Compliance Project". Appendix to American Economic Review Editors Report. 2011
  6. ReplicationWiki


Comments and questions are welcome on this page's discussion page before the conference and until one week after it. You can also send us your related work, and we will discuss a number of papers here online.

This session continues the initiative we started with our workshop on Transparency and Replication in San Francisco in January, for which we already compiled some online materials.
