Some causes of irreproducible research can be addressed at publication time through adherence to rigor and reproducibility criteria, typically implemented as checklists at journals. Menke et al. developed an automated tool, SciScore, that evaluates research publications for adherence to key rigor criteria, including NIH guidelines and RRIDs, at an unprecedented scale. The authors show that, despite steady improvement, fewer than half of the scoring criteria, such as blinding or power analysis, are routinely reported.
In addition, the authors used the average SciScore of a journal's articles in a given year to define a new journal quality metric, the Rigor and Transparency Index (RTI). The RTI showed no correlation with the Journal Impact Factor.
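The RTI computation described above reduces to averaging per-article scores within each journal-year and then comparing the resulting index against the Journal Impact Factor. A minimal sketch, using entirely hypothetical score and impact-factor values (the real tool derives scores automatically from the methods sections of full texts):

```python
from statistics import mean

# Hypothetical per-article SciScore values for four journals in one year.
# (Illustrative numbers only, not real SciScore output.)
articles = {
    "Journal A": [4.1, 3.8, 5.0, 4.4],
    "Journal B": [2.9, 3.2, 3.0],
    "Journal C": [3.6, 4.0, 3.9, 4.2, 3.7],
    "Journal D": [2.5, 2.8, 3.4],
}

# RTI for a journal-year: the mean SciScore of that journal's articles.
rti = {journal: mean(scores) for journal, scores in articles.items()}

# Hypothetical Journal Impact Factors for the same journals.
jif = {"Journal A": 3.5, "Journal B": 12.8, "Journal C": 2.1, "Journal D": 28.0}

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

journals = sorted(rti)
r = pearson([rti[j] for j in journals], [jif[j] for j in journals])
```

A correlation coefficient near zero on real data would mirror the authors' finding that the RTI and the Journal Impact Factor measure unrelated properties of a journal.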
The authors conclude that the RTI can potentially serve as a proxy for methodological quality.