The apparent contradiction in the title is wordplay on the different meanings attributed to the word \emph{reproducible} across scientific fields. What we mean is that unreproducible \emph{findings} can be built upon reproducible \emph{methods}. Without denying the importance of facilitating the reproduction of \emph{methods}, we deem it important to reassert that reproduction of \emph{findings} is a fundamental step of scientific inquiry. We argue that the commendable quest for easy, deterministic reproducibility of methods and numerical results should not make us forget the even more important need to ensure the reproducibility of empirical conclusions and findings by properly accounting for essential sources of variation. We provide experiments that exemplify the limitations of current methods for evaluating models in deep learning, showing that even when the results can be reproduced, a slightly different experiment would not support the findings. This work is an attempt to promote more rigorous and diversified methodologies. It is not an attempt to impose a new methodology, nor a critique of the nature of exploratory research. We hope to help clarify the distinction between exploratory and empirical research in deep learning, and we believe more energy should be devoted to proper empirical research in our community.
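The point about sources of variation can be made concrete with a toy experiment. The following is a minimal sketch, not the paper's actual experiments: the dataset, model, and number of seeds are arbitrary illustrative choices. It contrasts a single (perfectly reproducible) accuracy number with the distribution obtained when the seed governing the data split and weight initialization is varied.

```python
# Minimal sketch: a single run is reproducible, but the conclusion it
# supports may not survive a change of random seed.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic classification task (illustrative stand-in for a real benchmark).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

def run_trial(seed):
    # The seed varies both the train/test split and the weight
    # initialization, two sources of variation a single run hides.
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.25, random_state=seed)
    model = MLPClassifier(hidden_layer_sizes=(32,), max_iter=300,
                          random_state=seed)
    model.fit(X_tr, y_tr)
    return model.score(X_te, y_te)

scores = np.array([run_trial(seed) for seed in range(20)])
# Report the distribution, not one deterministic number.
print(f"accuracy: {scores.mean():.3f} +/- {scores.std():.3f} "
      f"(min {scores.min():.3f}, max {scores.max():.3f})")
```

If the spread across seeds is comparable to the gap between two competing models, a single-seed comparison cannot support a claim that one model is better, regardless of how exactly that single run can be reproduced.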
Author Information
Xavier Bouthillier (Mila - Université de Montréal)
César Laurent (Mila - Université de Montréal)
Pascal Vincent (Mila - Université de Montréal)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: Unreproducible Research is Reproducible
  Fri. Jun 14th 01:30 -- 04:00 AM, Room Pacific Ballroom #14
More from the Same Authors
- 2022: Did We Forget about the Canonical Source of Variance in Machine Learning Pipelines?
  Xavier Bouthillier
- 2020 Poster: Stochastic Hamiltonian Gradient Methods for Smooth Games
  Nicolas Loizou · Hugo Berard · Alexia Jolicoeur-Martineau · Pascal Vincent · Simon Lacoste-Julien · Ioannis Mitliagkas
- 2019: Poster discussion
  Roman Novak · Maxime Gabella · Frederic Dreyer · Siavash Golkar · Anh Tong · Irina Higgins · Mirco Milletari · Joe Antognini · Sebastian Goldt · Adín Ramírez Rivera · Roberto Bondesan · Ryo Karakida · Remi Tachet des Combes · Michael Mahoney · Nicholas Walker · Stanislav Fort · Samuel Smith · Rohan Ghosh · Aristide Baratin · Diego Granziol · Stephen Roberts · Dmitry Vetrov · Andrew Wilson · César Laurent · Valentin Thomas · Simon Lacoste-Julien · Dar Gilboa · Daniel Soudry · Anupam Gupta · Anirudh Goyal · Yoshua Bengio · Erich Elsen · Soham De · Stanislaw Jastrzebski · Charles H Martin · Samira Shabanian · Aaron Courville · Shotaro Akaho · Lenka Zdeborova · Ethan Dyer · Maurice Weiler · Pim de Haan · Taco Cohen · Max Welling · Ping Luo · Zhanglin Peng · Nasim Rahaman · Loic Matthey · Danilo J. Rezende · Jaesik Choi · Kyle Cranmer · Lechao Xiao · Jaehoon Lee · Yasaman Bahri · Jeffrey Pennington · Greg Yang · Jiri Hron · Jascha Sohl-Dickstein · Guy Gur-Ari
- 2018 Poster: Convergent Tree Backup and Retrace with Function Approximation
  Ahmed Touati · Pierre-Luc Bacon · Doina Precup · Pascal Vincent
- 2018 Oral: Convergent Tree Backup and Retrace with Function Approximation
  Ahmed Touati · Pierre-Luc Bacon · Doina Precup · Pascal Vincent