Security research is often criticized for the poor reproducibility of its results. Unfortunately, authors seldom release the software they develop and the datasets they use to perform their experiments. This makes it difficult to compare different solutions and forces other researchers to undergo the tedious and error-prone task of re-implementing previous approaches, and then to compare solutions on different datasets, which may not be a fair comparison.
To help improve this situation, ACSAC encourages authors of accepted papers to submit software and data artifacts and make them publicly available to the entire community. These artifacts are not part of the paper evaluation. Their submission is strictly optional and occurs only after a paper has been accepted, to prevent any influence on the decision process. Authors who decide to participate in this program will interact with a special committee dedicated to verifying the submitted artifacts (e.g., to test that source code compiles and runs correctly, or that datasets' contents match their description). Authors can decide what they want to submit (software, data, or both), and the verification procedure will take place in parallel with the preparation of the camera-ready version of the paper. The authors of the submitted artifacts need to commit to keeping them available online on a publicly accessible website for a minimum period of three months between October and December.
We believe that this is an important initiative that can help the entire community increase its reputation, and make research in the security field proceed faster by taking advantage of systems previously built by other researchers. Therefore, we plan to reward authors who participate in this program with a special mention during the conference and on the ACSAC webpage, a stamp of reproducibility on their papers, and (if enough authors participate in the program) by reserving a Distinguished Paper Award for this group.