Submission
Since Test-Comp 2026, a tool submission consists of the following parts only:
- an archive with the tool, published at Zenodo
The archive has to meet the following requirements:
- The archive is publicly available and contains a LICENSE that allows reproduction and evaluation of the tool by anybody and does not place any restriction on the tool output (log files, witnesses).
- The archive contains a README file that describes the contents.
- The archive contains a script smoketest.sh that runs the tool on some simple example(s) that are also contained in the archive (a minimal sketch is shown after this list).
- The tool is archived in a ZIP file (.zip) that contains exactly one top-level directory (no tarbomb), with the files LICENSE, README, smoketest.sh, and all other files and folders of the tool.
- The tool's output on stdout/stderr does not exceed 2 MB.
- The tool has an option to report its version (ideally --version).
- The archive does not contain large amounts of unnecessary data, such as repository data (.svn, .git), source files, auxiliary folders like __MACOSX, and test files.
- The tool should not require any special software on the competition machines; all necessary libraries and external tools should be contained in the archive. Standard packages that are available as Ubuntu packages can be requested via an entry in the FM-Tools repository.
- The tool should be executable from any path, and should not expect the working directory to match the tool directory.
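For illustration only, assuming an archive layout like mytool/LICENSE, mytool/README, mytool/smoketest.sh, mytool/bin/mytool, and mytool/examples/simple.c (all names are placeholders), a minimal smoketest.sh could look as follows:

    #!/bin/sh
    # Minimal smoke test: run the tool on an example bundled in the archive.
    # "bin/mytool" and "examples/simple.c" are placeholder paths.
    set -eu
    cd "$(dirname "$0")"            # do not rely on the caller's working directory
    ./bin/mytool --version          # the tool must be able to report its version
    ./bin/mytool examples/simple.c  # run on a simple bundled example
    echo "Smoke test finished successfully."

Such an archive can be created without a tarbomb by zipping the single top-level directory, e.g., zip -r mytool.zip mytool.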
- an entry in the FM-Tools repository, named <short_tool_name>.yml
The structure of the file is described in the repository. In particular, the file has to contain (a hypothetical sketch is shown after this list):
- a version entry containing the DOI of the archive mentioned above,
- a participation declaration,
- a jury member who is (a) a contributing designer/developer of the submitted tool (witnessed by occurrence of the person’s name on the tool's project web page, a tool paper, or in the revision logs) or (b) authorized by the competition organizer (after the designers/developers of the tool were contacted about the participation),
- the label meta_tool if the tool is a meta-verifier,
- the label ai if the tool uses an LLM or another sophisticated kind of artificial intelligence,
- a description that briefly explains the principles of the tool and the components it uses,
- a tool-info module for BenchExec (to be added to the BenchExec repository; the tool-info module enables BenchExec to abstract from the concrete command-line interface by assembling the command line, parsing the output to determine the result/status, and reporting the tool version), and
- a merge request to category-structure.yml stating in which categories the tool wants to participate.
Additionally, new jury members have to announce their e-mail address to the organizer.
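As a purely hypothetical sketch (the authoritative field names and structure are defined in the FM-Tools repository), an entry covering the items above might look like this:

    # <short_tool_name>.yml -- hypothetical sketch; consult the
    # FM-Tools repository for the actual schema and field names.
    name: MyTool                      # placeholder tool name
    description: >
      Briefly explains the principles of the tool
      and the components it uses.
    versions:
      - version: "1.0"
        doi: 10.5281/zenodo.0000000   # placeholder DOI of the Zenodo archive
    labels:
      - meta_tool                     # only if the tool is a meta-verifier
      - ai                            # only if the tool uses an LLM or similar AI
    jury_member:
      name: Jane Doe                  # placeholder; see the requirements above
    competition_participations:
      - competition: Test-Comp 2026   # participation declaration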
Note that, in contrast to previous years, no paper submission or registration via EasyChair is needed for participation. There will be a separate, optional submission deadline for short papers (called system descriptions) after the competition results are publicly available.
Requirements for (i) System Description
The competition contribution paper should be structured as follows (the structure is recommended, not mandatory, but the information mentioned below must be provided):
- Title, Authors, and Abstract
The format is defined in the usual LNCS style. It is a good idea to mention the name of the tool and/or the technique (or a combination thereof) in the title. Please mark the jury member in the paper (e.g., with an asterisk after the name and a footnote).
- 1. Test-Generation Approach
A short overview of the theory that the tool is based on, a description of the abstract domains and algorithms that are used, and references to the concept papers that describe the technical details.
- 2. Software Architecture
- Libraries and external tools that the testing tool uses (e.g., parser frontend, SAT solver)
- Software structure and architecture (e.g., components that are used in the competition)
- Implementation technology (e.g., programming language)
- 3. Discussion of Strengths and Weaknesses of the Approach
An evaluation of the results for the benchmark categories: where was the tester successful, where not, and why?
- 4. Tool Setup and Configuration
- Download instructions (a public web page from which the tool can be downloaded), including a reference to a precise version of the tool (do not refer to "the latest version" or similar, because that is neither stable nor replicable)
- Installation instructions
- Participation statement (a clear statement which categories the tester participates in; consult the rules about opt-out statements)
- Configuration definition (there is one global set of parameters for all categories of the benchmark set; a full definition of the parameter set must be provided); check the rules page under Parameters for more details
- 5. Software Project and Contributors
- Contact info (web page of the project, people involved in the project)
- Information about the software project, licensing, development model, institution that hosts the software project, acknowledgement of contributors
- 6. Data-Availability Statement
Publish your tool archive (e.g., on Zenodo) and reference it here (via its DOI) to ensure reproducibility; also provide the URLs of the project web site and of the repository.
References
Bibliographic references to more in-depth papers about the approaches used in the tool.
Requirements for (ii) Executable Tool, (iii) Tool-Info Module, and (iv) Benchmark Definition
See rules under "Competition Environment and Requirements" and "Qualification".
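As an illustration of part (iii), a minimal tool-info module for BenchExec might look as follows; the class skeleton follows BenchExec's BaseTool2 template, while the tool name "mytool" and the "DONE" output pattern are placeholders that must be adapted to the actual tool:

    # mytool.py -- minimal sketch of a BenchExec tool-info module.
    import benchexec.result as result
    import benchexec.tools.template

    class Tool(benchexec.tools.template.BaseTool2):
        def name(self):
            return "MyTool"  # placeholder display name

        def executable(self, tool_locator):
            # Locate the executable inside the unpacked tool archive.
            return tool_locator.find_executable("mytool")

        def version(self, executable):
            # Uses the tool's version option (--version by default).
            return self._version_from_tool(executable)

        def cmdline(self, executable, options, task, rlimits):
            # Assemble the command line for one benchmark task.
            return [executable, *options, task.single_input_file]

        def determine_result(self, run):
            # Map the tool output to a BenchExec result status.
            for line in run.output:
                if "DONE" in line:  # placeholder success pattern
                    return result.RESULT_DONE
            return result.RESULT_ERROR

The real module is added to the BenchExec repository, as required above.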