Empirical software engineering, the field dedicated to rigorously studying software development, has made significant progress over the last several decades, yet it continues to evolve. Justus Bogner from Vrije Universiteit Amsterdam and Roberto Verdecchia from the University of Florence, along with their colleagues, recognize that challenges remain in areas such as research reproducibility and translating findings into practical industrial applications. To address these issues and foster improvement within the field, they introduce a new regular column in ACM SIGSOFT SEN dedicated to openly discussing the often-unspoken aspects of empirical software engineering research. The column aims to stimulate conversation and reflection on crucial topics, drawing on expert insights from interviews, surveys, and focus groups, ultimately seeking to enhance the quality, communication, and impact of software engineering research.
Empirical Software Engineering, Rigor, Openness and Discussion
This article advocates for increased rigor, openness, and community discussion within Empirical Software Engineering (ESE). The authors highlight the need to move beyond simply conducting empirical research to conducting it well and effectively sharing knowledge, emphasizing careful experimental design, robust statistical analysis, and wider adoption of established standards. Open Science practices, including open data, open code, and pre-registration of studies, are championed as crucial for enhancing reproducibility, transparency, and building trust in ESE findings. The authors also stress the importance of open dialogue within the ESE community, particularly regarding methodological challenges and the interpretation of results.
They contend that impact extends beyond publications, stressing the value of sharing practical insights and fostering a culture of continuous improvement. While acknowledging the growth of ESE, they argue the field needs greater maturity, pointing to work on common pitfalls and on the importance of validity. They also call for more open discussion, citing work on how ESE is discussed on social media, and they point readers to key resources in the field, including texts on experimentation and contemporary empirical methods in software engineering.
Advancing Empirical Software Engineering Through Critical Discussion
This work introduces a new regular column dedicated to advancing Empirical Software Engineering (ESE) research through critical discussion of its methodologies and practices. The initiative takes a multi-faceted approach to improvement, grounded in expert interviews, focus groups, surveys, and position pieces, aiming to foster reflection and enhance the quality of ESE. In the decades since the field's early experiments, contributions from Barbara Kitchenham, Carolyn Seaman, Natalia Juristo, and Claes Wohlin expanded the methodological toolkit for ESE, moving research from small-scale lab experiments to larger industrial case studies.
Contemporary ESE draws on a diverse range of techniques, including quantitative and qualitative approaches, data mining, and systematic secondary studies, and the field is supported by dedicated journals, conferences, and modern textbooks. Despite this progress, the authors identify areas for improvement, including suboptimal reproducibility, difficulties in accumulating evidence through replications, and inconsistent education of students in modern ESE practices. The column aims to address these deficiencies through critical analysis and community engagement, working toward a more rigorous and impactful field of empirical software engineering.
Advancing Empirical Software Engineering Research Practices
This work introduces a new regular column within ACM SIGSOFT SEN dedicated to meta-aspects of Empirical Software Engineering (ESE) research. Recognizing the maturity of the ESE field, the authors highlight the need for continued evolution and improvement in research practices. The column aims to foster discussion of often-implicit topics crucial to advancing the field, ranging from the creation of effective replication packages and the appropriate use of statistical methods to guidelines for conducting and transcribing interviews and interdisciplinary collaboration. Because these facets of ESE research are rarely documented explicitly, newcomers find it difficult to grasp best practices.
The authors also identify persistent challenges in ESE, including issues with reproducibility, difficulties in accumulating evidence through replication, and a tendency for peer review to favor novelty over rigorous investigation. To help close this documentation gap, particularly for early-career researchers, the team initiated a project to document the common structure and effective writing strategies for ESE papers, aiming to share tacit knowledge and support newcomers.
This effort culminated in a published document and the establishment of the SEN-ESE column, intended as a forum for discussing often-implicit aspects of ESE research. The column will draw upon expert interviews, focus groups, surveys, and position pieces to encourage reflection and improvement in how ESE is conducted, communicated, and taught. The authors acknowledge that many challenges remain within ESE, and that the column represents an initial step towards addressing them, fostering open discussion and knowledge sharing within the community.
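To make one of the column's planned topics concrete, the following is a minimal, purely illustrative Python sketch (not taken from the column itself) of the kind of analysis script a replication package might ship: it pairs a non-parametric test with an effect size instead of reporting a p-value alone. The `control` and `treatment` measurements and the `cliffs_delta` helper are hypothetical.

```python
# Illustrative sketch: compare two groups with a non-parametric test
# and report an effect size, a pattern commonly recommended in ESE
# methodological guidance when normality cannot be assumed.
from scipy.stats import mannwhitneyu

def cliffs_delta(a, b):
    """Cliff's delta: share of pairs where a > b minus share where a < b."""
    greater = sum(1 for x in a for y in b if x > y)
    lesser = sum(1 for x in a for y in b if x < y)
    return (greater - lesser) / (len(a) * len(b))

# Hypothetical measurements, e.g. task completion times in seconds per group.
control = [38.1, 42.5, 40.2, 45.9, 39.4, 44.0]
treatment = [31.7, 35.2, 33.9, 36.5, 30.8, 34.4]

stat, p_value = mannwhitneyu(treatment, control, alternative="two-sided")
delta = cliffs_delta(treatment, control)

print(f"Mann-Whitney U = {stat:.1f}, p = {p_value:.4f}, Cliff's delta = {delta:.2f}")
```

Shipping such a script alongside the raw data is one small way a replication package can make an analysis transparent and re-runnable by other researchers.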
👉 More information
🗞 ACM SIGSOFT SEN Empirical Software Engineering: Introducing Our New Regular Column
🧠 ArXiv: https://arxiv.org/abs/2510.02007
