ABSTRACT
Context: Software Engineering (SE) research with a scientific foundation aims to influence SE practice to enable and sustain the efficient delivery of high-quality software. Goal: To improve the impact of SE research, one objective is to help practitioners choose empirically vetted interventions. Method: Literature from evidence-based medicine, economic evaluations in SE, and software economics is reviewed. Results: In empirical SE research, the emphasis has been on substantiating claims about the benefits of proposed interventions. However, to support informed decisions by practitioners about technology adoption, we must present a business case for these interventions, comprising not only evidence of effectiveness but also evidence of cost-effectiveness. Conclusions: This paper highlights the need to investigate and report the resources required to adopt an intervention. It also provides guidelines and examples to better support practitioners in technology-adoption decisions.
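The cost-effectiveness evidence the abstract calls for is often summarised, in economic evaluation, by an incremental cost-effectiveness ratio (ICER): the extra cost per extra unit of effect when moving from one intervention to another. A minimal sketch of that arithmetic, with entirely hypothetical numbers and intervention names chosen for illustration:

```python
# Hedged sketch: comparing two hypothetical SE interventions by their
# incremental cost-effectiveness ratio (ICER), a standard measure in
# economic evaluation. All figures below are illustrative, not data
# from the paper or any study it cites.

def icer(cost_new: float, effect_new: float,
         cost_old: float, effect_old: float) -> float:
    """Incremental cost per additional unit of effect."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical example: a new practice costs 120 person-hours and
# prevents 10 defects; the current practice costs 100 person-hours
# and prevents 6 defects.
ratio = icer(cost_new=120.0, effect_new=10.0,
             cost_old=100.0, effect_old=6.0)
print(ratio)  # 5.0 person-hours per additional defect prevented
```

A practitioner would then judge whether that incremental price per unit of benefit is acceptable in their context, which is exactly the decision the business-case framing is meant to support.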