This is version 1 of this preprint and has not been peer reviewed.

Code review in practice: A checklist for computational reproducibility and collaborative research in ecology and evolution
Abstract
Ensuring that research, along with its data and code, is credible and remains accessible is crucial for advancing scientific knowledge—especially in ecology and evolutionary biology, where the accelerating climate crisis and biodiversity loss demand urgent, transparent science. Yet code is rarely shared alongside scientific publications, and when it is, unclear implementation and insufficient documentation often make it difficult to use. Code review—whether as self-assessment or peer review—can improve two key aspects of code quality: reusability, i.e., ensuring that the code is technically functional and well documented, and validity, i.e., ensuring that the code faithfully implements the intended analyses. While assessing validity requires domain-specific methodological expertise, code review for reusability can be conducted by anyone with a basic understanding of programming practices. Here, we introduce a checklist-based, customisable approach to code review that focuses on reusability. Informed by best practices in software development and recommendations from commentary pieces and blog posts, the checklist organises specific review prompts around seven key attributes of high-quality reusable scientific code: Reporting, Running, Reliability, Reproducibility, Robustness, Readability, and Release. By defining and structuring these principles of code review and turning them into a practical tool, our template guides reviewers through a systematic evaluation that can also be tailored to specific needs, and gives researchers a clear path to proactively improve their own code. Ultimately, this approach to code review aims to reinforce reproducible coding practices and to strengthen both the credibility and the collaborative potential of research.
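To make the structure of such a checklist concrete, the sketch below shows one possible way a customisable review template built around the seven attributes could be represented and tallied programmatically. It is a minimal illustration in Python; the specific prompts and function names are assumptions for demonstration, not the published checklist items.

# Minimal sketch of a customisable code-review checklist organised around
# the seven attributes named in the abstract. The prompts are illustrative
# assumptions, not the authors' actual checklist wording.
checklist = {
    "Reporting":       ["Is the code cited and linked from the manuscript?"],
    "Running":         ["Does the code run start to finish without manual edits?"],
    "Reliability":     ["Are errors and warnings handled or documented?"],
    "Reproducibility": ["Are random seeds and package versions recorded?"],
    "Robustness":      ["Does the code avoid hard-coded paths and magic numbers?"],
    "Readability":     ["Are variables and functions named descriptively?"],
    "Release":         ["Is the code archived with a licence and a DOI?"],
}

def summarise(responses):
    """Tally how many prompts were marked as satisfied per attribute.

    `responses` maps each attribute to a list of booleans, one per prompt.
    """
    return {attr: f"{sum(answers)}/{len(answers)}" for attr, answers in responses.items()}

# Example: a reviewer records one answer per prompt, then summarises the result.
responses = {attr: [False] * len(prompts) for attr, prompts in checklist.items()}
responses["Readability"] = [True]
print(summarise(responses))

Because the checklist is plain data, attributes or prompts can be added, removed, or reworded to suit a given project, which is the kind of customisation the abstract describes.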
DOI
https://doi.org/10.32942/X26S6P
Subjects
Life Sciences, Social and Behavioral Sciences
Keywords
research software, code quality, computational reproducibility, open science, collaborative research
Dates
Published: 2025-04-24 23:38
Last Updated: 2025-04-24 23:38
License
CC BY-NC 4.0 (Attribution-NonCommercial 4.0 International)
Additional Metadata
Conflict of interest statement:
The authors declare no competing interests, financial or otherwise.
Data and Code Availability Statement:
This manuscript did not generate or use any data or code.
Language:
English