Digital Science has today launched a report focusing on falsifiability and reproducibility in scientific research. The report addresses three areas: appropriate documentation and sharing of research data, clear reporting of analyses and processes, and the sharing of code.
Making Science Better: Reproducibility, Falsifiability and the Scientific Method looks at the current state of reproducibility in 2019, as well as the importance of falsifiability in the research process.
The analysis comes from the Digital Science portfolio company, Ripeta, which aims to make better science easier by identifying and highlighting the important parts of research that should be transparently presented in a manuscript and other materials.
The tool detects and evaluates the key evidence for reproducibility in science through software and analytics, improving evidence-based science and the fiscal efficiency of research investments. It leverages machine-learning and natural language processing algorithms to extract key reproducibility elements from research articles.
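The report does not describe Ripeta's models in detail, so the following is only a rough, rule-based illustration of the general idea: scanning article text for sentences that look like data-availability, code-availability, or software-version statements. The function name, patterns, and sample text are invented for the example and stand in for the trained NLP models a production tool would use.

```python
import re

# Illustrative sketch only; not Ripeta's actual implementation.
# Regular expressions stand in for trained NLP models that would
# identify reproducibility elements in a research article.
INDICATORS = {
    "data_availability": re.compile(
        r"\b(data (are|is) available|data availability statement|deposited in)\b",
        re.IGNORECASE,
    ),
    "code_availability": re.compile(
        r"\b(code is available|source code|github\.com|analysis scripts?)\b",
        re.IGNORECASE,
    ),
    "software_version": re.compile(
        r"\b(version \d+(\.\d+)*|v\d+\.\d+)\b",
        re.IGNORECASE,
    ),
}

def extract_reproducibility_elements(text: str) -> dict:
    """Return the sentences that match each reproducibility indicator."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return {
        name: [s for s in sentences if pattern.search(s)]
        for name, pattern in INDICATORS.items()
    }

if __name__ == "__main__":
    sample = (
        "All analyses were run in R version 3.6.1. "
        "The data are available in the Dryad repository. "
        "Analysis scripts can be found at github.com/example/project."
    )
    for element, matches in extract_reproducibility_elements(sample).items():
        print(element, "->", matches)
```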
Key report findings include:
- All research stakeholders have a responsibility to make their work both reproducible and falsifiable: reproducible, so that anyone can follow the stated method and reach the same conclusions, and falsifiable, so that the method used can appropriately test the hypothesis.
- While not all research materials need to be accessible due to confidentiality and/or anonymity, achieving adequate transparency is essential to reproducibility.
- The research paper should be a route to test and recreate the research that has been carried out. This is the basis of the scientific method.
- Falsifiability is an integral part of the research process. It adds credibility to research and allows further work to build on solid foundations.
- Establishing a well-structured framework for assessing reproducibility, alongside appropriate reporting, reduces the barriers to reusing scientific work, supporting scientific outcomes, and assessing scientific quality.
- Good data documentation, which covers research design, data collection, data cleaning, and analysis, leads to ‘good’ science. Well-documented research enables further advancement through transparency.
- Clear data analysis reporting is not merely related to the practice of good science; it is critical to it.
- By supplying code, documenting which version of software was used, and storing code for future reference, science can be made more accurate, more reproducible, and more useful to scientists within and across domains and geographies (see the sketch after this list).
- The scientific community needs faster and more scalable means to assess and improve reproducibility. An important part of that is fundamentally changing how we think about reproducibility. The difficulty is that while we all have a sense of what reproducibility is in our own fields, reproducibility as a concept does not easily translate between fields.
- We need to build structures into our research processes that automate the checking of the process itself and alert us to problems when they arise. This new machinery of checks and balances needs to take both falsifiability and reproducibility into account.
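One concrete step toward the version documentation and automated checks described above is recording the software environment alongside the analysis code. The sketch below is not drawn from the report; the file name, function name, and package list are placeholder assumptions. It writes the interpreter, platform, and package versions to a small JSON file that can be stored with the code for future reference.

```python
import importlib.metadata
import json
import pathlib
import platform
import sys

# Hypothetical sketch: capture the software environment next to the analysis
# code so the computational setup can be recreated and checked later.

def snapshot_environment(packages, out_path="environment.json"):
    """Record the interpreter, platform, and versions of the named packages."""
    snapshot = {
        "python": sys.version,
        "platform": platform.platform(),
        "packages": {},
    }
    for name in packages:
        try:
            snapshot["packages"][name] = importlib.metadata.version(name)
        except importlib.metadata.PackageNotFoundError:
            snapshot["packages"][name] = "not installed"
    pathlib.Path(out_path).write_text(json.dumps(snapshot, indent=2))
    return snapshot

if __name__ == "__main__":
    # The package names are placeholders for whatever an analysis actually uses.
    print(snapshot_environment(["numpy", "pandas"]))
```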
Leslie McIntosh, CEO of Ripeta, said: “The pursuit of knowledge is important, and should be undertaken thoroughly, accurately, and transparently. Accessible, reproducible research is an important and often challenging aspect of that pursuit.
“Technology has made conducting science faster and more sophisticated. We need ways to quickly and accurately capture and report all the methods without asking more from the scientists. Ripeta addresses one part of the problem.”
Ripeta has been nominated as a finalist for the ALPSP Awards for Innovation in Publishing, with the winner announced tomorrow evening at the annual conference dinner.
Contributors to the report include:
Main authors: Leslie D. McIntosh, Cynthia Hudson Vitale, Anthony Juehne, Leah Haynes, Sasha Mothershead and Josh Sumner
To read the full report, see here.