The Case for Making Economics Research Easier to Reproduce

October 28, 2015 Nemes Random

Some things don’t lend themselves to replication: the pattern of a snowflake, fingerprints, Wilt Chamberlain’s 100-point performance against the New York Knicks. To that list we can now add “research results from leading economics journals.”

In a new paper, Andrew Chang, an economist at the Federal Reserve, and Phillip Li, an economist with the Office of the Comptroller of the Currency, describe their attempt to replicate 67 papers from 13 well-regarded economics journals. They chose papers that used US data and sought to establish an empirical result involving gross domestic product, or GDP, thanks to its status as a widely used indicator of macroeconomic conditions.

The pair was systematic and unfailingly polite: if no data or code were available in journal archives or on the authors’ personal sites, they contacted the corresponding author, waited a week for a reply, and continued contacting the other listed authors until they received a response. They gave authors at least one month to send data or code (“code” refers to the programming instructions that tell a computer how to run statistical models). Six of the papers had to be dropped because the relevant data sets were proprietary, and two because the authors did not have the necessary software.

Their results? Just under half of the remaining 59 papers, 29, could be qualitatively replicated (that is to say, their basic findings held up, even if the replication did not arrive at the exact same quantitative result). For the other half, whose results could not be replicated, the most common reason was missing public data or code.
