- Lionel Briand (University of Luxembourg) on Search-based Testing for Cyber-Physical Systems
- Juan P. Galeotti, Program Co-Chair (Universidad de Buenos Aires, Argentina)
- Justyna Petke, Program Co-Chair (University College London, UK)
- Annibale Panichella, Competition Chair (Delft University of Technology, Netherlands)
- Paper Submission:
January 27, 2017 (extended from January 20, 2017)
- Author Notification:
February 27, 2017 (extended from February 17, 2017)
- Dates of Workshop: May 22-23, 2017
All papers must conform to the IEEE Formatting Guidelines at http://icse2017.gatech.edu/?q=submission-guidelines.
All submissions must be anonymised and in PDF format, and must be submitted electronically through EasyChair.
Accepted papers will be published as an ICSE 2017 Workshop Proceedings in the ACM and IEEE Digital Libraries. The official publication date of the workshop proceedings is the date the proceedings are made available in the ACM Digital Library. This date may be up to two weeks prior to the first day of ICSE 2017. The official publication date affects the deadline for any patent filings related to published work.
Call for Papers
Researchers and practitioners are invited to submit:
- Full papers (maximum of 7 pages) presenting original research, either empirical or theoretical, in SBST, or practical experience of using SBST techniques and/or SBST tools.
- Short papers (maximum of 4 pages) describing novel techniques, ideas, or positions that have yet to be fully developed; or discussing the importance of a recently published SBST result by another author in setting a direction for the SBST community, and/or the potential applicability (or not) of that result in an industrial context.
- Position papers (maximum of 2 pages) that analyze trends in SBST and raise issues of importance. Position papers are intended to seed discussion and debate at the workshop, and thus will be reviewed with respect to relevance and their ability to spark discussions.
- Tool Competition entries (maximum of 4 pages). We invite researchers, students, and tool developers to design innovative approaches to software test generation. More information will be available soon on the workshop website.
In all cases, papers should address a problem in the software testing/verification/validation domain or combine elements of those domains with other concerns in the software engineering lifecycle. Examples of problems in the software testing/verification/validation domain include (but are not limited to) generating test data, prioritizing test cases, constructing test oracles, minimizing test suites, verifying software models, testing service-oriented architectures, constructing test suites for interaction testing, and validating real-time properties.
The solution should apply a metaheuristic search strategy such as (but not limited to) random search, local search (e.g. hill climbing, simulated annealing, and tabu search), evolutionary algorithms (e.g. genetic algorithms, evolution strategies, and genetic programming), ant colony optimization, and particle swarm optimization.
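To illustrate the kind of search strategy envisaged above, the following is a minimal hill-climbing sketch for test-data generation. The target branch, the fitness function, and all names here are hypothetical examples, not part of the call itself: the search minimizes a branch-distance fitness (zero when a hypothetical branch condition `x == 42` is satisfied) by repeatedly moving to the fitter of the two neighbouring inputs.

```python
import random

def branch_distance(x):
    # Hypothetical target branch: we want the condition x == 42 to hold.
    # Branch distance is 0 when the branch is taken, else |x - 42|.
    return abs(x - 42)

def hill_climb(start, max_steps=10_000):
    """Local search: move to the best neighbour until the branch is
    covered or a local optimum is reached."""
    current = start
    for _ in range(max_steps):
        if branch_distance(current) == 0:
            return current  # test input that covers the target branch
        # Neighbourhood: one step left or one step right.
        best = min((current - 1, current + 1), key=branch_distance)
        if branch_distance(best) >= branch_distance(current):
            return current  # local optimum, no fitter neighbour
        current = best
    return current

print(hill_climb(random.randint(-1000, 1000)))  # prints 42
```

Because this toy fitness landscape is unimodal, hill climbing always reaches the covering input; real SBST fitness functions are typically multimodal, which is why restarts, simulated annealing, or evolutionary algorithms are used in practice.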
Help spread the word! Download the Call for Submissions in PDF format!