IEEE International Conference on Software Analysis, Evolution and Reengineering

The 30th edition of the IEEE International Conference on Software Analysis, Evolution and Reengineering (SANER’23) encourages researchers to (1) reproduce results from previous papers and (2) publish studies with important and relevant negative or null results (results that fail to show an effect but nonetheless reveal research paths that did not pay off).

We also encourage the publication of negative results or of the reproducibility aspects of previously published work (in the spirit of journal-first submissions). Such previously published work includes accepted submissions to the 2023 SANER main track.

  1. Reproducibility studies. Inspired by the ISSTA’23 reproducibility studies, papers in this category must go beyond simply re-implementing an algorithm and/or re-running the artifacts provided by the original paper. Such submissions should at least apply the approach to new data sets (open-source or proprietary). In particular, reproducibility studies are encouraged to target techniques that were previously evaluated only on proprietary systems or only on open-source systems. A reproducibility study should clearly report both the results that the authors were able to reproduce and the aspects of the work that proved irreproducible. We encourage reproducibility studies to follow the ACM guidelines on reproducibility (different team, different experimental setup): “The measurement can be obtained with stated precision by a different team, a different measuring system, in a different location on multiple trials. For computational experiments, this means that an independent group can obtain the same result using artifacts, which they develop completely independently.”
  2. Negative results papers. We seek papers that report negative results from any type of empirical software engineering research (qualitative, quantitative, case study, experiment, etc.). For example, did your controlled experiment on the value of dual monitors in pair programming fail to show an improvement over single monitors? Negative results are still valuable when they are not obvious or when they disprove widely accepted wisdom. As Walter Tichy writes, “Negative results, if trustworthy, are extremely important for narrowing down the search space. They eliminate useless hypotheses and thus reorient and speed up the search for better approaches.”

Evaluation Criteria

Both Reproducibility Studies and Negative Results submissions will be evaluated according to the following standards:

  • Depth and breadth of the empirical studies
  • Clarity of writing
  • Appropriateness of conclusions
  • Amount of useful, actionable insights
  • Availability of artifacts
  • Underlying methodological rigor. For example, a negative result that stems primarily from misaligned expectations or from a lack of statistical power (e.g., small samples) does not make a good submission. A negative result should arise from a genuine absence of effect, not from a lack of methodological rigor.

Most importantly, we expect reproducibility studies to clearly identify the artifacts the study is built upon and to provide links to all of those artifacts in the submission (the only exception is for papers that reproduce results on proprietary datasets that cannot be publicly released).


Submission Instructions

Submissions must be original, in the sense that the findings and writing have not been previously published and are not under consideration elsewhere. However, since submissions are reproducibility studies or negative results papers, some overlap with previous work is expected; please make any such overlap clear in the paper.

Publication format should follow the SANER guidelines. Choose “RENE:Replication” or “RENE:NegativeResult” as the submission type.

Length: There are two formats. Appendices to conference submissions or to previous work by the authors can be described in up to 4 pages. New reproducibility studies and new descriptions of negative results can be up to 10 pages.


Important note: the RENE track of SANER 2023 DOES NOT FOLLOW a full double-blind review process.


Important Dates

  • Abstract submission deadline: November 11, 2022 AoE
  • Paper submission deadline: November 20, 2022 AoE (Extended)
  • Notifications: December 17, 2022 AoE
  • Camera Ready: January 13, 2023 AoE
  • Submission Page: https://easychair.org/conferences/?conf=saner2023