DOI: 10.1145/2745802.2745823
research-article

Support mechanisms to conduct empirical studies in software engineering: a systematic mapping study

Published: 27 April 2015

ABSTRACT

Context: Empirical studies are gaining recognition in the Software Engineering (SE) research community, improving the quality of research and accelerating the adoption of new technologies in the software market. However, empirical studies in this area are still limited. In order to foster empirical research in SE, it is essential to understand the resources available to aid these studies. Goal: Identify the support mechanisms (methodologies, tools, guidelines, processes, etc.) used to conduct empirical studies in the Empirical Software Engineering (ESE) community. Method: We performed a systematic mapping study covering all full papers published at EASE, ESEM and ESEJ since their first editions, selecting 891 studies published between 1996 and 2013. Results: A total of 375 support mechanisms were identified. We provide the full list of mechanisms and the strategies that use them. Even so, a high number of studies, 433 (48%), do not cite any mechanism to support their empirical strategies. Experiment is the strategy with the most resources supporting its activities, and guideline was the most frequently used type of mechanism. Moreover, we observed that most mechanisms used as references for empirical studies are not specific to the SE area, and that some mechanisms were used only in specific activities of empirical research, such as statistical and qualitative data analysis. Experiments and case studies are the most frequently applied strategies. Conclusions: The use of empirical methods in SE has increased over the years. Even so, many studies did not apply these methods and do not cite any resource to guide their research. The list of support mechanisms, together with where and how they were applied, is therefore a major asset to the SE community: it can encourage empirical studies by aiding the choice of which strategies and mechanisms to use in a research project and by pointing out examples of their use, especially for novice researchers.
We also identified new perspectives and gaps that can foster further research to improve empirical studies in this area.


Published in:
EASE '15: Proceedings of the 19th International Conference on Evaluation and Assessment in Software Engineering
April 2015, 305 pages
ISBN: 9781450333504
DOI: 10.1145/2745802

          Copyright © 2015 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

EASE '15 paper acceptance rate: 20 of 65 submissions, 31%. Overall acceptance rate: 71 of 232 submissions, 31%.
