CALL FOR ARTIFACT

Repeatability Evaluation Guidelines

Repeatability is critical in cyber-physical systems research. Therefore, this year we strongly encourage authors of accepted regular papers to submit a repeatability package, which the program committee will evaluate. Authors of accepted repeatability packages will receive one or more badges that will be included on the first page of the published version, and these papers will also be highlighted on the conference website. The submission date will be approximately two weeks after the paper acceptance notifications are sent out.

The Repeatability Evaluation Package (REP) consists of several components:

  • A copy (in PDF format) of the accepted paper with an appendix that explains the following:
    • Which elements of the paper are included in the REP (e.g., figures, tables).
    • Instructions for installing the software.
    • Instructions for running the software. Ideally, provide a short, easy-to-run script for each computational component in the paper.
    • The system requirements for running the REP (e.g., OS, compilers, environments). The document should also include a description of the host platform used to prepare and test the Docker image or virtual machine.
    • Expected resource requirements: what hardware did you use for your experiments, and how long is your code expected to run?
  • The software. Please prepare either a:
    • Docker image, or
    • Virtual machine. You may use VirtualBox to save a VM image as an OVA file.
    • If your experiments use open-source simulators, please include them in the virtual machine or Docker image.
    • If the previous options are not viable (for example, if your software depends on licensed software such as MATLAB that cannot be included in a VM or Docker image), please contact the RE PC chairs to make other arrangements.
  • Any data used in your experiments.
    • Provide the data as a tar.gz file with instructions for mounting it into the Docker image or virtual machine and passing it as input to the code.
    • If the data is large, please share it using an appropriate cloud storage solution. You must not share any proprietary data that cannot be made open source.
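As an illustration of the data bullet above, the commands below sketch one way to package a data directory as a tar.gz and mount it into a Docker container. The directory name data/, the image name myrep:latest, and the run script /rep/run_all.sh are hypothetical placeholders, not a required layout:

```shell
# Package the experiment data into a single archive
# ("data/" is a placeholder for your data directory)
mkdir -p data
tar -czf data.tar.gz data/

# Reviewers would then extract the archive and mount it read-only
# into the container; "myrep:latest" and "/rep/run_all.sh" are
# hypothetical names for your image and top-level run script:
#   tar -xzf data.tar.gz
#   docker run --rm -v "$(pwd)/data:/data:ro" myrep:latest /rep/run_all.sh
```

Whatever layout you choose, the appendix should state exactly which host path is mounted where inside the image so reviewers can reproduce the invocation.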

The REP should be made available via a link. When possible, we encourage authors to make their artifact public by uploading it to an irrevocable repository such as Zenodo (this is required for the Available badge; see below). For non-public artifacts, a link to GitHub or Google Drive is acceptable; please do not use your personal website. In any case, the link should remain accessible throughout the review process. If changes to the artifact are required based on issues that arise early in the review process, a new link can be provided (e.g., to a second version of the Zenodo record).

To be accepted, each REP needs to achieve satisfactory performance along the following three dimensions:

  • Coverage. How many of the computational components in the paper can be reproduced?
  • Documentation quality. Is the REP clearly documented, including system requirements, resource requirements, installation instructions and execution instructions?
  • Ease of reuse. Is the REP generally easy to reuse, i.e., is the code well documented, are the scripts clear and easy to run?

Papers with accepted REPs will receive the Reviewed and Reproducible IEEE Xplore badges. If the REP was also posted to Zenodo (or a similar repository), the paper will additionally receive the Available badge.

Submission Page

Please go to EasyChair to submit your Repeatability Evaluation Package (REP).

Important Dates

  • Artifact Due: Feb 2, 2024 
  • Author Response Period: Feb 10-14, 2024  
  • RE Decisions: Mar 6, 2024  
  • Camera-ready Version Due: Mar 9, 2024