Additional reads in May 2020

SuperPlots: Communicating reproducibility and variability in cell biology

Against pandemic research exceptionalism

Pseudoscience and COVID-19 — we’ve had enough already

Spin in Scientific Publications: A Frequent Detrimental Research Practice

Coronavirus in context: tracks positive and negative citations for COVID-19 literature

Pandemic researchers — recruit your own best critics

The importance of being second – PLOS-wide edition

Reproducibility in science: important or incremental?

Octopus: The primary research record

Created in partnership with the UK Reproducibility Network, the Octopus platform is free to use and publishes all kinds of scientific work, whether it is a hypothesis, a method, data, an analysis or a peer review.

The principle behind Octopus is to break the ‘unit of publication’ down from being a ‘paper’ to its constituent parts.

The Octopus beta version is now available and can be accessed at
On the site, a feedback link in the footer of each page lets you report bugs or share thoughts; if time allows, a full feedback form to provide more detailed information is available HERE.

Please be aware that anything uploaded to the site will be publicly visible (please be careful not to accidentally upload anything copyrighted or inappropriate) and that all content will be deleted after this round of testing.

Using Microsoft OneNote as an ELN

Electronic research data documentation can be achieved either with comprehensive software applications or with “do-it-yourself” solutions. Whereas the former are often quite cost-intensive, the latter are usually labor-intensive to set up properly. An interesting intermediate solution is provided by Microsoft OneNote:
Guerrero and colleagues compared different electronic lab notebook (ELN) applications and found that Microsoft OneNote is very competitive with dedicated solutions in its capabilities (LINK). A survey also revealed that users preferred OneNote over other solutions. Last year, the same group described the adaptation of OneNote to the lab environment (LINK) and touched on the following aspects:

  • Structure and labeling: Here, the research unit needs to agree on how generated research data are organized and on a convention for naming and classifying different experiments
  • Data acquisition: OneNote allows saving of raw data within the program and hyperlinking of larger files
  • Data presentation: Tools are described for data presentation and connecting with other Microsoft applications, e.g. integration of Microsoft Excel
  • Sharing: OneNote provides comprehensive sharing features, which make collaboration easy
  • Storing, securing and legalizing: With specific settings and use of Microsoft SharePoint, it is even possible to be compliant with FDA 21 CFR Part 11, and files can be backed up in a OneNote file format; only the implementation of legally binding time stamps does not currently seem possible.
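To make the first point above concrete: a naming convention becomes unambiguous once the unit agrees on a fixed pattern for page titles. The sketch below is purely illustrative (the pattern, field order and function name are our assumptions, not part of the article), assuming the unit encodes date, project, method and a running experiment number:

```python
from datetime import date

def notebook_page_title(project, method, exp_no, day=None):
    """Build a standardized OneNote page title following a hypothetical
    lab convention: YYYY-MM-DD_<project>_<method>_Exp<NNN>."""
    day = day or date.today()  # default to today's date if none given
    return f"{day.isoformat()}_{project}_{method}_Exp{exp_no:03d}"

# Example: Western blot no. 42 in project "ProjX", run on 12 May 2020
print(notebook_page_title("ProjX", "WB", 42, date(2020, 5, 12)))
# 2020-05-12_ProjX_WB_Exp042
```

Sorting pages by such titles automatically orders them chronologically, and the fixed fields make experiments searchable by project or method.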

Overall, this article provides many practical tips on establishing OneNote as an alternative to a conventional lab notebook.
An interesting resource for additional reading is the blog of Dr Martin Engel. In his posts, he describes the transition from paper-based lab notebooks to OneNote (LINK) and the use of OneNote (LINK) in more detail.

PLOS launches research data survey

PLOS has launched a major survey targeting researchers in North America and Europe. The goal is to deepen the understanding of researchers’ priorities with regard to sharing and reusing research data, and to understand how well existing tools and services meet researchers’ needs.
Please consider taking part in the survey and/or forwarding it to anyone you know who may be able to participate.
The survey is open until May 26, 2020. PLOS will share the results publicly in the near future.
The survey can be accessed HERE.
More information about the survey, including why PLOS is running this new survey now, is on the PLOS blog.

Die Reproduzierbarkeitskrise: Bedrohung oder Chance für die Wissenschaft? (“The reproducibility crisis: threat or opportunity for science?”; in German)

In this Editorial, published in Biologie in unserer Zeit, Martin C. Michel and Ralf Dahm discuss threats and opportunities related to the current reproducibility crisis in biomedical sciences.
The authors highlight several top-down approaches currently in place to increase data quality and reproducibility: the BMBF, the EU and the NIH have launched research programs on the topic of reproducibility; various specialist journals (e.g. Nature or Molecular Pharmacology) have adapted their guidelines for authors; and the DFG has published new guidelines for Good Scientific Practice and declared them binding for all DFG-funded scientists.
In addition, there is also an increasing number of bottom-up initiatives, such as the European Quality in Preclinical Data (EQIPD) project or the Global Preclinical Data Forum. Such initiatives, as well as professional organizations like the PAASP Network, offer solutions, advice and training to promote preclinical data quality.


Systematic review of guidelines for internal validity in the design, conduct and analysis of preclinical biomedical experiments involving laboratory animals

Several initiatives have set out to increase the transparency and internal validity of preclinical studies. While many of the points raised in these guidelines are identical or similar, they differ in detail and rigour. Most focus on reporting; only a few cover the planning and conduct of studies.
The aim of this systematic review was to identify existing guidelines for the design, conduct, analysis and reporting of preclinical animal research. Based on a systematic search in PubMed, Embase and Web of Science, 58 unique recommendations were extracted. Amongst the most frequently recommended items were sample size calculations, adequate statistical methods, concealed and randomised allocation of animals to treatment, blinded outcome assessment and recording of animal flow through the experiment.
The authors highlight that, although these recommendations are valuable, there is a striking lack of experimental evidence on their importance and relative effect on experiments and effect sizes.
This work is part of the European Quality In Preclinical Data (EQIPD) consortium.