
Paper Artifacts

Artifacts registration deadline: August 24 - 11:59pm EDT
Artifacts submission deadline: September 1 - 11:59pm EDT
Artifacts evaluation reviewer questions period: August 31 - September 20
Artifacts evaluation decision: September 23
Final papers with AE badge labels due: refer to camera-ready deadline

Artifact Evaluations Chair: Roberto Perdisci, University of Georgia

Security research is often criticized for the poor reproducibility of its results. Unfortunately, authors seldom release the software they develop and the datasets they use for their experiments. This makes it difficult to compare different solutions and forces other researchers to undertake the tedious and error-prone task of re-implementing previous approaches and evaluating them on different datasets, which may not yield a fair comparison.

To help improve this situation, ACSAC encourages authors of accepted papers to submit software and data artifacts and make them publicly available to the entire community. Releasing software and data artifacts is an important step towards facilitating the reproducibility of research results, and ultimately contributes to the real-world deployment of novel security solutions. These artifacts are not part of the paper evaluation. Their submission is strictly optional and occurs only after a paper has been accepted, to prevent any influence on the decision process. Authors who decide to participate in this program will interact with a special committee dedicated to verifying the submitted artifacts (e.g., to test that source code compiles and runs correctly, or that a dataset's content matches its description). Authors can decide what they want to submit (software, data, or both), and the verification procedure will take place in parallel with the preparation of the camera-ready version of the paper. The authors of the submitted artifacts must commit to keeping them available online on a publicly accessible website for a minimum period of three months, between October and December 2020.

We believe that this is an important initiative that can help improve the reputation of the entire community and accelerate security research by allowing researchers to build on systems previously developed by others. Therefore, we plan to reward authors who participate in this program with a special mention during the conference and on the ACSAC webpage, an ACM Artifacts Evaluated badge on their papers, and (if enough authors participate in the program) by reserving a Distinguished Paper Award for this group.

Submission Guidelines

Artifacts registration

By the registration deadline, submit the title of your accepted paper along with a short, abstract-like description of the artifact that will be submitted. The artifact submission abstract (not to be confused with the paper abstract) should describe whether the submitted artifact will consist of software and/or data, and in what parts of the accepted paper the software/data to be submitted was used (e.g., which evaluation subsections or experiments). If software is submitted, the abstract should also state whether source code or binaries will be submitted, and what the main dependencies and requirements for executing the code are (e.g., OS version, special hardware, specific compilers). If data is submitted, high-level information about the size, format, and schema should be provided.

Artifacts submission process

By the submission deadline, submit a PDF document containing instructions on how to download the artifact and where to find the documentation. Submissions must be made through the artifact submission system. For instance, the PDF document may simply contain a one-sentence description and a URL pointing to a GitHub repository where the source code, data, and documentation for the artifact can be found (this is just an example). For your artifact submission to be considered, you will need to check the "ready for review" checkbox.


To maximize the probability of your submission receiving a positive evaluation, we highly recommend taking the following steps:

  • Create a single repository (e.g., on GitHub) or a stable webpage that contains both the artifact download instructions and the documentation for compiling/running any source code or for interpreting any artifact data.
  • The reviewers will refer back to your paper. However, evaluating the artifacts will very likely require more detailed information than what is contained in the paper itself. The artifact documentation should therefore reference the paper sections where the artifact was used (e.g., a specific subsection of the Evaluation section) and describe the results the reviewer should expect to obtain by running/interpreting the artifact.
  • If you are submitting code to be compiled or binaries to be executed, we highly recommend that you create a container (using Docker) or a Virtual Machine image (using VirtualBox) in which the software compiles and/or executes correctly; see the example sketch after this list. This makes it much less likely that the reviewer will encounter OS-version or software-package dependency problems that may be difficult to resolve in a timely fashion.
  • Similarly, if you are submitting data along with a set of data analysis scripts, the container or VM should include the artifact data, the analysis scripts, and the software dependencies necessary for the scripts to produce the expected analysis results.
  • Even though you can submit a container or VM, please make sure that the artifact documentation includes all necessary instructions to satisfy software/library dependencies. Ideally, the reviewer should be able to compile/execute your code even outside of your container or VM by following the instructions in your artifact documentation.
  • If your artifact consists of a live service instance on the Web, make sure to document how the Web service should be used, and ensure that the service remains reachable for the entire evaluation period.
  • If your artifact depends on software that must run on a custom system, such as a high-performance computing cluster, cloud infrastructure, special hardware (e.g., SGX, GPU, TPU), or if it requires significant computing resources of other types, you should make arrangements to provide the reviewers with remote access to such resources; this will allow the reviewers to execute your software in the required special environment.
  • If exercising the artifacts requires a significant amount of time to produce the expected results (e.g., software compilation/execution takes several hours), this should be explicitly explained in the artifact documentation and in the artifact submission abstract. The documentation should provide an execution time estimate, along with a description of the computing environment in which the estimate was obtained.
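To illustrate the container-based packaging recommended above, here is a minimal Dockerfile sketch. All file names, paths, and the entry-point script below are hypothetical placeholders; replace them with your artifact's actual components and dependencies:

  # Minimal example Dockerfile (hypothetical names and paths; adapt to your artifact).
  FROM ubuntu:20.04

  # Install system dependencies (adjust the package list to your artifact's needs).
  RUN apt-get update && apt-get install -y --no-install-recommends \
      build-essential python3 python3-pip \
      && rm -rf /var/lib/apt/lists/*

  # Copy source code, analysis scripts, and (reasonably small) artifact data into the image.
  COPY . /artifact
  WORKDIR /artifact

  # Install pinned Python dependencies, assuming the artifact ships a requirements.txt.
  RUN pip3 install -r requirements.txt

  # Default command: reproduce the paper's results (hypothetical script name).
  CMD ["python3", "run_experiments.py"]

With such a file at the repository root, a reviewer can recreate your environment by running "docker build -t artifact ." followed by "docker run artifact", without installing any dependencies on the host.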


Artifacts evaluation

The evaluation process will follow the ACM review guidelines for assigning an Artifacts Evaluated badge at the Functional or (when possible) Reusable level, as outlined in the following ACM document:
https://www.acm.org/publications/policies/artifact-review-badging

Specifically, to be granted a badge, artifacts should be:

  • Documented: At minimum, an inventory of artifacts is included, and sufficient description provided to enable the artifacts to be exercised.
  • Consistent: The artifacts are relevant to the associated paper, and contribute in some inherent way to the generation of its main results.
  • Complete: To the extent possible, all components relevant to the paper in question are included. (Proprietary artifacts need not be included. If they are required to exercise the package then this should be documented, along with instructions on how to obtain them. Proxies for proprietary data should be included so as to demonstrate the analysis.)
  • Exercisable: Included scripts and/or software used to generate the results in the associated paper can be successfully executed, and included data can be accessed and appropriately manipulated.

To facilitate the evaluation process, reviewers may contact the artifact corresponding authors with questions and requests for clarification. The corresponding authors must be available to assist in the evaluation during the entire reviewer questions period (see the deadlines at the top of this page). Consequently, artifact submissions must not be anonymous; they should instead include the names and contact information of the corresponding authors of the related accepted paper.

Final papers preparation

In addition to following the main ACSAC final paper preparation guidelines, authors whose artifacts are successfully evaluated and assigned an ACM Artifacts Evaluated badge will need to include the corresponding ACM badge label in their papers before final submission. Additional instructions will be provided after the artifact evaluation decisions have been communicated to the authors.

Contacts

If you have any questions, please contact the Artifacts Evaluations Chair at perdisci@cs.uga.edu.