The goal of this event is to stimulate the exchange of problems in causal discovery. We created a Repository, which hosts datasets, models, software, and papers. Deposit your own task in the repository and have it solved by others, or solve one of the proposed tasks.
Conditions of participation: Anybody who complies with the rules of the challenge is welcome to participate. There are two modes of participation:
As a data donor by making an entry in the Repository.
As a problem solver by submitting results on at least one of the proposed tasks.
The challenge results will be presented at a NIPS 2008 conference workshop on December 12, 2008. To present at the workshop, abstracts must be submitted before October 24, 2008 to email@example.com. Participants are not required to attend the workshop, and the workshop is open to non-challenge participants.
The proceedings of the competition will be published by the Journal of Machine Learning Research (JMLR).
Anonymity: Participants who do not submit a paper to the workshop can elect to remain anonymous. Their results will be published, but their names will remain confidential.
Tasks: A number of datasets on which tasks have been defined are available; see the Task page. More tasks will be added from time to time as new data are donated. Donated data show up immediately in the Repository, but they become part of the challenge only after being reviewed by the organizers, who then add them to the Task page. To be informed of task updates, request to be added to our mailing list by sending email to firstname.lastname@example.org.
Milestone and final results: Results must be submitted between the start and the termination of the challenge. The challenge starts on September 15, 2008 and is scheduled to terminate on November 19, 2008 (extended from November 12).
Each participating team is allowed to submit one set of results per task for the final evaluation. If more than one set of results is submitted, only the last one will be taken into account. The results of the final evaluation will be publicly released at the workshop. Optionally, each participating team may also submit one set of results before October 15, 2008 to be part of the milestone evaluation, whose results will be released publicly but anonymously.
Submission method: The results on each task must be sent to the designated contact person; see the Task page. In case of problems, send email to email@example.com.
Evaluation and rewards: To compete for the prizes, participants must submit a 6-page paper describing their donated dataset (if they entered as a donor) or their task-solving method(s) and result(s) (if they entered as a solver), before November 21, 2008, to firstname.lastname@example.org (a sample paper and a LaTeX style file are provided). Challenge participants must append their fact sheet to their paper; see the template provided in LaTeX (sample paper appendix), MS Word, or Acrobat formats. Each participant is allowed to submit several papers, provided they address or propose distinct problems. The contributions of the participants will be evaluated by the organizers on the basis of their challenge performance results, the post-challenge tests (see reproducibility), AND the paper, using the following criteria: performance in challenge and general usefulness, novelty and originality, sanity, insight, reproducibility, and clarity of presentation. Data donors may provide solutions to their own problems; however, such contributions will not count towards winning the prizes. Close collaborators of data donors who have access to information that may give them an unfair advantage should disclose this fact to the organizers. The best papers will be selected for presentation at the workshop, and several prizes will be awarded.
Reproducibility: Participation is not conditioned on delivering your code or publishing your methods. However, we will ask the top-ranking participants to cooperate voluntarily in reproducing their results. This will include filling out a fact sheet about their methods (templates are available in LaTeX, MS Word, or Acrobat formats) and possibly participating in post-challenge tests and sending us their code, including the source code. The outcome of our attempt to reproduce your results will be published and will add credibility to your results.
Schedule:
September 15, 2008: challenge start.
October 15, 2008: deadline for (optional) submission of milestone challenge results.
October 20, 2008: public anonymous release of milestone result analysis.
October 24, 2008: workshop abstracts due.
November 19, 2008 (extended from November 12): challenge ends (last day to submit challenge results).
November 20, 2008 (extended from November 17): challenge results released to participants.
November 21, 2008: JMLR proceedings paper submission deadline.
December 1, 2008: paper notification of acceptance.
December 12, 2008: challenge results publicly released; workshop.