The Materials and Methods section is a crucial component of any formal lab report or scientific publication. This part of the report explains the experimental procedures used and should provide sufficient detail for other scientists to reproduce the studies presented. Given the importance of reproducible results in science, it is easy to see why the Materials and Methods section has become a vital part of every scientific article.
However, given the complexity of most current research methods and experiments, it can be very challenging to decide which parameters, settings, and experimental factors are required to reproduce the published results and therefore need to be reported. As the striking real-life example below shows, sometimes only first-hand experience reveals that even slight differences in a protocol can have dramatic and potentially misleading consequences:
When two independent laboratories, located in Boston and Berkeley, decided to collaborate on establishing FACS (fluorescence-activated cell sorting) profiles of primary breast cells, it came as quite a surprise that a single protocol gave consistently different outcomes depending on which lab performed the experiment. Despite identical methods and careful validation of all antibodies used, one lab could not reproduce the CD10 and CD44 expression profiles obtained by the other: tissues processed in Boston always produced the Boston profile, and tissues processed in Berkeley produced the Berkeley profile.
Thanks to advances in flow cytometry (FC) technology and the new FC instruments available today, the number of parameters measured is constantly increasing, and large datasets are generated for every single experiment. A simple explanation would have been that this complexity created challenges in both annotation and data sharing. However, although the two labs used different instruments, calibration beads and cell line standards ruled out instrumentation as the source of the problem.
Only when scientists travelled from one laboratory to the other and performed experiments side by side, observing every manual step in the protocol, could the mystery be solved:
It finally emerged that the difference in outcome stemmed from the two labs using different mechanical agitation speeds to disrupt the tissue. In Boston, tissue was stirred vigorously in a flask with a stir bar, whereas in Berkeley tissues were incubated relatively gently on a rotating platform, but for a longer time. As a consequence, the slow digestion strategy yielded five times more organoids than the fast digestion approach, which had a dramatic effect on CD44 antigen presentation.
Because the stirring speed was considered an insignificant technical detail not worth reporting, it took over two years and considerable additional resources to understand this reproducibility issue. In the authors' words: 'It is educational that CD44 staining used by countless laboratories in FACS analysis can so easily be altered by an apparently minor difference in methodology' (Hines et al., Cell Reports 2014).
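One way to guard against such omissions is to capture protocol parameters in a structured, machine-readable record rather than free text. The following is a minimal Python sketch of how the two dissociation protocols might be recorded; the field names are illustrative choices, and the speeds and durations are hypothetical placeholders, since the original report does not state the actual values.

from dataclasses import dataclass

@dataclass
class TissueDissociationProtocol:
    # Machine-readable record of a tissue-dissociation step.
    # All numeric values assigned below are hypothetical placeholders;
    # the actual speeds and durations are not given in the anecdote.
    lab: str
    agitation_method: str      # stir bar vs. rotating platform
    agitation_speed_rpm: int   # the "insignificant" detail that mattered
    duration_h: float

boston = TissueDissociationProtocol(
    lab="Boston", agitation_method="magnetic stir bar",
    agitation_speed_rpm=300, duration_h=8,
)
berkeley = TissueDissociationProtocol(
    lab="Berkeley", agitation_method="rotating platform",
    agitation_speed_rpm=60, duration_h=16,
)

# Comparing the two records immediately exposes the divergent parameters:
for field in ("agitation_method", "agitation_speed_rpm", "duration_h"):
    print(field, "->", getattr(boston, field), "vs.", getattr(berkeley, field))

Had both labs recorded their methods at this level of detail, the divergent agitation settings would have been visible from the outset instead of surfacing only after side-by-side experiments.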
Researchers who encounter discrepancies between their conclusions and published work, or whose work cannot initially be replicated, are encouraged to resolve the differences by working with the other laboratories involved to determine the sources of limited or lacking reproducibility.