ICLR 2018 Reproducibility Challenge
Background
One of the challenges in machine learning research is to ensure that published results are reliable and reproducible. In support of this, the goal of this challenge is to investigate reproducibility of empirical results submitted to the 2018 International Conference on Learning Representations.
We chose ICLR for this challenge because the timing is right for course-based participants (see below), and because papers submitted to the conference are automatically made publicly available on OpenReview.
The Challenge is inspired by discussions at the ICML 2017 Workshop on Reproducibility in Machine Learning.
Task Description
You should select a paper from the 2018 ICLR submissions and aim to replicate the experiments described in it. The goal is to assess whether the experiments are reproducible, and to determine whether the conclusions of the paper are supported by your findings. Your results can be either positive (i.e., confirming reproducibility) or negative (i.e., explaining what you were unable to reproduce, and potentially why).
Essentially, think of your role as an inspector verifying the validity of the experimental results and conclusions of the paper. In some instances, your role will also extend to helping the authors improve the quality of their work and paper.
You do not need to reproduce all of the experiments in your selected paper. For example, the authors may evaluate a new method that requires more GPUs than you have access to, while also reporting results for a baseline method (e.g., simple logistic regression); in that case you could elect to reproduce only the baseline results. It is sometimes the case that baseline methods are not properly implemented, or that the hyper-parameter search is not done with the same degree of attention.
If available, the authors' code can and should be used; authors of ICLR submissions are encouraged to release their code to facilitate this challenge. Alternatively, the methods can be re-implemented from the description in the paper. This sets a higher bar for reproducibility, but may help detect anomalies in the code or shed light on aspects of the implementation that affect the results.
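As a concrete illustration of the baseline case above, the sketch below shows one way to run a simple logistic regression baseline with the sources of randomness fixed, so that the run can be repeated exactly. The dataset, split, and hyper-parameters are placeholders, not taken from any particular submission.

```python
# Minimal sketch of a reproducible baseline run (illustrative only).
# The dataset, split, and hyper-parameters are placeholders; substitute
# those described in the paper you are reproducing.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

SEED = 0  # fix the random seed so the split and the solver are repeatable
np.random.seed(SEED)

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=SEED, stratify=y)

# Baseline: plain logistic regression; C=1.0 is a placeholder, use the
# value (or the search procedure) reported in the paper.
clf = LogisticRegression(C=1.0, max_iter=1000, random_state=SEED)
clf.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

Recording the exact seeds, library versions, and hyper-parameter search procedure alongside such a script makes it much easier for others to check your numbers.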
Proposed outcomes
Participants should produce a Reproducibility report describing the target questions, experimental methodology, implementation details, analysis and discussion of findings, and conclusions on the reproducibility of the paper. This report should be posted as a contributed review on OpenReview.
The result of the reproducibility study should NOT be a simple Pass / Fail outcome. The goal should be to identify which parts of the contribution can be reproduced, and at what cost in terms of resources (computation, time, people, development effort, communication with the authors).
Participants should expect to engage in dialogue with ICLR authors through the OpenReview site. In cases where participants have made significant contributions to the final paper, ICLR should allow adding these participants as co-authors (at the request of the original authors only).
Important dates
- Announcement of the challenge: October 6, 2017
- Registration of participants (see link below): October 28 - December 15, 2017 (flexible)
- Final submission of reproducibility report (on OpenReview): December 15, 2017 (suggested, to be considered for decisions)
Target participants
Instructors teaching a graduate-level machine learning course in Fall 2017 are encouraged to use this challenge as their final course project. The project can be completed individually or in small groups. Participation by other researchers or research trainees with adequate machine learning experience is also encouraged.
Participating institutions (contact Joelle Pineau to be added to the list):
Available resources
- Instructors can apply for Google Cloud credits for their students. Each student will be given a small number of credits to start (approx. $50).
- By default, Google Cloud accounts don't come with a GPU quota, but instructions on how to request GPUs, including links on how to check and increase quotas, can be found at this link; a minimal sketch of checking your current quota appears after this list.
- If necessary, instructors can request additional computing credits (up to $1,000 per student) by contacting: CloudEDUGrants@google.com.
- Students can also request a $300 credit.
- If you represent a company that can offer cloud computing credits, please contact reproducibility.challenge@gmail.com.
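As noted above, GPU quotas start at zero by default. The snippet below is a minimal sketch of checking whether a region already has a nonzero GPU quota; it assumes the gcloud command-line tool is installed and authenticated, and the region name is only an example.

```python
# Minimal sketch for inspecting the GPU quota of a Google Cloud region.
# Assumes the gcloud CLI is installed and authenticated; "us-central1"
# is an example region, not a recommendation.
import json
import subprocess

region = "us-central1"
out = subprocess.run(
    ["gcloud", "compute", "regions", "describe", region, "--format=json"],
    check=True, capture_output=True, text=True,
).stdout

for quota in json.loads(out).get("quotas", []):
    if "GPU" in quota["metric"]:
        print(quota["metric"], "usage:", quota["usage"], "limit:", quota["limit"])
```

If the relevant limit is 0, follow the quota-increase instructions linked above before launching GPU instances.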
Suggested Readings
How to register
Fill out the online registration form, which asks for:
- Team participants (names; emails)
- ICLR paper being reproduced
- Reproducibility plan (as detailed as possible)
For questions:
Email: reproducibility.challenge@gmail.com
Organizers
- Joelle Pineau, challenge coordinator
- Genevieve Fried, logistics and registration
- Rosemary Nan Ke, references and technical support
- Hugo Larochelle, corporate sponsorship