Submission
Competition contributions consist of four parts:
- (i) a system description that describes the competition contribution, including an outline of the concepts, technology, libraries, and tool infrastructure used, installation instructions, and a specification of the version and parameters to be used in the competition,
- (ii) an executable tool (= competition candidate, to be added to the Tester repository),
- (iii) a tool-info module for BenchExec (to be added to the BenchExec repository), the name of which should be mentioned in the system description, and
- (iv) a benchmark definition (to be added to the Test-Comp repository), the name of which should be mentioned in the system description.
In order to participate, the following actions are required by the deadline:
- Submit your system description via the Test-Comp 2020 submission site on EasyChair. System descriptions have a maximum of 4 pages in LNCS style.
- Register your tester via the Test-Comp registration form.
- Required for pre-runs: Upload your tester archive via merge request to the Tester repository.
Requirements for (i) System Description
The competition contribution paper should be structured as follows (the structure is recommended, not mandatory, but the information listed below must be provided):
Title, Authors, and Abstract
The format is defined in the usual LNCS style. It is a good idea to mention the name of the tool and/or technique (or a combination thereof) in the title. Please mark the jury member in the paper (for example, with an asterisk after the name and a footnote).
1. Test-Generation Approach
A short overview of the theory that the tool is based on, a description of the abstract domains and algorithms that are used, and references to the concept papers that describe the technical details.
2. Software Architecture
- Libraries and external tools that the testing tool uses (e.g., parser frontend, SAT solver)
- Software structure and architecture (e.g., components that are used in the competition)
- Implementation technology (e.g., programming language)
3. Discussion of Strengths and Weaknesses of the Approach
Evaluation of the results for the benchmark categories: where was the tester successful, where was it not, and why?
4. Tool Setup and Configuration
- Download instructions (a public web page from which the tool can be downloaded), including a reference to a precise version of the tool (do not refer to "the latest version" or similar, because that is neither stable nor replicable)
- Installation instructions
- Participation statement (a clear statement which categories the tester participates in; consult the rules about opt-out statements)
- Configuration definition (there is one global set of parameters for all categories of the benchmark set; a full definition of the parameter set must be provided); check the rules page under Parameters for more details
5. Software Project and Contributors
- Contact info (web page of the project, people involved in the project)
- Information about the software project, licensing, development model, institution that hosts the software project, acknowledgement of contributors
References
Bibliographic references to more in-depth papers about the approaches used in the tool.
Requirements for (ii) Executable Tool, (iii) Tool-Info Module, and (iv) Benchmark Definition
See rules under "Competition Environment and Requirements" and "Qualification".
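As an illustration of what part (iii) typically involves, the sketch below shows a minimal tool-info module written against BenchExec's BaseTool API. It is not an official template: the module and tool names ("mytester", "MyTester") and the fixed behavior are placeholders, and the existing modules under benchexec/tools/ in the BenchExec repository, together with the BenchExec documentation, are the authoritative reference.

    # Minimal illustrative sketch of a BenchExec tool-info module for a
    # hypothetical tester "mytester" (all names and options are placeholders).
    import benchexec.result as result
    import benchexec.tools.template
    import benchexec.util as util


    class Tool(benchexec.tools.template.BaseTool):
        """Tool-info module for the hypothetical tester MyTester."""

        def executable(self):
            # Locate the tester's main executable in the unpacked tool archive.
            return util.find_executable("mytester")

        def name(self):
            # Display name used in the result tables.
            return "MyTester"

        def version(self, executable):
            # Report the exact tool version that is used in the competition.
            return self._version_from_tool(executable)

        def cmdline(self, executable, options, tasks, propertyfile=None, rlimits={}):
            # One global parameter set for all categories; 'options' comes from
            # the benchmark definition, 'tasks' are the input programs.
            return [executable] + options + tasks

        def determine_result(self, returncode, returnsignal, output, isTimeout):
            # A tester typically only signals whether test generation finished.
            if returncode == 0 and returnsignal == 0:
                return result.RESULT_DONE
            return result.RESULT_ERROR

Under these assumptions, the file would be added as benchexec/tools/mytester.py, and the module name (here, mytester) is what the benchmark definition refers to and what the system description should state.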