
A large-scale study of call graph-based impact prediction using mutation testing

Published in: Software Quality Journal

Abstract

In software engineering, impact analysis involves predicting the software elements (e.g., modules, classes, methods) potentially impacted by a change in the source code. Impact analysis is required to optimize the testing effort. In this paper, we propose an evaluation technique to predict impact propagation. Based on 10 open-source Java projects and 5 classical mutation operators, we create 17,000 mutants and study how the errors they introduce propagate. This evaluation technique enables us to analyze impact prediction based on four types of call graphs. Our results show that graph sophistication increases the completeness of impact prediction. However, and surprisingly to us, the most basic call graph gives the best trade-off between precision and recall for impact prediction.
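The core idea described in the abstract can be sketched in a few lines: given a call graph, the predicted impact set of a mutated method is the set of methods that can transitively reach it (its transitive callers), and prediction quality is measured by precision and recall against the actually impacted set. The following is a minimal illustrative sketch, not the paper's implementation; all names and the toy graph are invented for the example.

```python
from collections import deque

def predicted_impact(call_graph, mutated):
    """Transitive callers of `mutated` in a caller -> callees graph."""
    # Invert the graph: for each callee, record its direct callers.
    callers = {}
    for caller, callees in call_graph.items():
        for callee in callees:
            callers.setdefault(callee, set()).add(caller)
    # Breadth-first search upward from the mutated method.
    impacted, queue = set(), deque([mutated])
    while queue:
        method = queue.popleft()
        for c in callers.get(method, ()):
            if c not in impacted:
                impacted.add(c)
                queue.append(c)
    return impacted

def precision_recall(predicted, actual):
    # Precision: fraction of predicted methods actually impacted.
    # Recall: fraction of actually impacted methods that were predicted.
    tp = len(predicted & actual)
    precision = tp / len(predicted) if predicted else 1.0
    recall = tp / len(actual) if actual else 1.0
    return precision, recall

# Toy call graph: main calls a and b; both a and b call util.
cg = {"main": {"a", "b"}, "a": {"util"}, "b": {"util"}}
pred = predicted_impact(cg, "util")            # -> {"a", "b", "main"}
p, r = precision_recall(pred, {"a", "main"})   # -> (2/3, 1.0)
```

A more sophisticated call graph (e.g., one resolving dynamic dispatch) changes the edge set and thus shifts this precision/recall trade-off, which is what the paper evaluates across four graph types.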

(Figs. 1–3 appear in the full article.)


Notes

  1. Also called the “estimated impact set” (EIS) in Arnold and Bohner (1993).

  2. Bohner named this set the “discovered impact set” (DIS), but this naming is not appropriate in our context and may be confusing.

  3. https://github.com/v-m/PropagationAnalysis. The version used for extracting graphs and running the experiments of this paper is version tag g1.

  4. http://javalanche.org.

  5. http://pitest.org.

  6. http://cloc.sourceforge.net/.

  7. https://github.com/v-m/PropagationAnalysis-dataset.

References

  • Acharya, M., & Robinson, B. (2012). Practical change impact analysis based on static program slicing for industrial software systems. In Proceedings of the 20th international symposium on the foundations of software engineering, FSE’12, ACM, (pp. 13:1–13:2) New York, NY, USA. doi:10.1145/2393596.2393610.

  • Antoniol, G., Canfora, G., Casazza, G., & de Lucia, A. (2000). Identifying the starting impact set of a maintenance request: A case study. In Proceedings of the conference on software maintenance and reengineering, CSMR’00 (p. 227). Washington, DC, USA: IEEE Computer Society.

  • Arnold, R. S., & Bohner, S. A. (1993). Impact analysis—Towards a framework for comparison. In Proceedings of the conference on software maintenance, ICSM’93 (pp. 292–301). Washington, DC, USA: IEEE Computer Society.

  • Binkley, D., Gold, N., Harman, M., Islam, S., Krinke, J., & Yoo, S. (2014). ORBS: Language-independent program slicing. In Proceedings of the 22nd ACM SIGSOFT international symposium on foundations of software engineering, FSE 2014, ACM, (pp. 109–120). New York, NY, USA. doi:10.1145/2635868.2635893.

  • Binkley, D., Gold, N., Harman, M., Islam, S., Krinke, J., & Yoo, S. (2015). ORBS and the limits of static slicing. In 2015 IEEE 15th international working conference on source code analysis and manipulation (SCAM), (pp. 1–10). doi:10.1109/SCAM.2015.7335396.

  • Bohner, S. (2002). Software change impacts—An evolving perspective. In Proceedings of the international conference on software maintenance, ICSM’02 (pp. 263–272). doi:10.1109/ICSM.2002.1167777.

  • Bohner, S. A., & Arnold, R. S. (1996). Software change impact analysis. Los Alamitos, CA: IEEE Computer Society Press.

  • Cai, H., Jiang, S., Santelices, R., Zhang, Y. J., & Zhang, Y. (2014). SENSA: Sensitivity analysis for quantitative change-impact prediction. In Proceedings of the 14th international working conference on source code analysis and manipulation, SCAM’14, IEEE Computer Society, (pp. 165–174). Washington, DC, USA. doi:10.1109/SCAM.2014.25.

  • Challet, D., & Lombardoni, A. (2004). Bug propagation and debugging in asymmetric software structures. Physical Review E, 70(4), 046109. doi:10.1103/PhysRevE.70.046109.

  • Dean, J., Grove, D., & Chambers, C. (1995). Optimization of object-oriented programs using static class hierarchy analysis. In Proceedings of the 9th European conference on object-oriented programming, ECOOP’95 (pp. 77–101). London, UK: Springer-Verlag.

  • Do, H., & Rothermel, G. (2005) A controlled experiment assessing test case prioritization techniques via mutation faults. In Proceedings of the 21st international conference on software maintenance, ICSM’05, IEEE Computer Society, (pp. 411–420). Washington, DC, USA. doi:10.1109/ICSM.2005.9.

  • Gethers, M., Dit, B., Kagdi, H., & Poshyvanyk, D. (2012). Integrated impact analysis for managing software changes. In Proceedings of the 34th international conference on software engineering, ICSE’12 (pp. 430–440). Piscataway, NJ, USA: IEEE Press.

  • Grove, D., DeFouw, G., Dean, J., & Chambers, C. (1997). Call graph construction in object-oriented languages. In Proceedings of the conference on object-oriented programming, systems, languages, and applications, (pp. 108–124).

  • Hattori, L., Guerrero, D., Figueiredo, J., Brunet, J., & Damásio, J. (2008). On the precision and accuracy of impact analysis techniques. In Proceedings of the seventh IEEE/ACIS international conference on computer and information science (Icis 2008), ICIS’08. IEEE Computer Society, (pp. 513–518). Washington, DC, USA. doi:10.1109/ICIS.2008.104.

  • Jia, Y., & Harman, M. (2011). An analysis and survey of the development of mutation testing. IEEE Transactions on Software Engineering, 37(5), 649–678. doi:10.1109/TSE.2010.62.

  • King, K. N., & Offutt, A. J. (1991). A Fortran language system for mutation-based software testing. Software: Practice and Experience, 21(7), 685–718. doi:10.1002/spe.4380210704.

  • Law, J., & Rothermel, G. (2003). Whole program path-based dynamic impact analysis. In Proceedings of the 25th international conference on software engineering, ICSE’03 (pp. 308–318). Washington, DC, USA: IEEE Computer Society.

  • Lehnert, S. (2011) A taxonomy for software change impact analysis. In Proceedings of the 12th international workshop on principles of software evolution and the 7th annual ERCIM workshop on software evolution, IWPSE-EVOL’11, ACM, (pp. 41–50). New York, NY, USA. doi:10.1145/2024445.2024454.

  • Li, B., Sun, X., Leung, H., & Zhang, S. (2013). A survey of code-based change impact analysis techniques. Software Testing, Verification and Reliability, 23(8), 613–646. doi:10.1002/stvr.1475.

  • Loyall, J. P., & Mathisen, S. A. (1993). Using dependence analysis to support the software maintenance process. In Proceedings of the conference on software maintenance, ICSM’93 (pp. 282–291). Washington, DC, USA: IEEE Computer Society.

  • Michael, C. C., & Jones, R. C. (1997). On the uniformity of error propagation in software. In Proceedings of the 12th annual conference on computer assurance, COMPASS’97, (pp. 68–76). doi:10.1109/CMPASS.1997.613237

  • Moriconi, M., & Winkler, T. C. (1990). Approximate reasoning about the semantic effects of program changes. IEEE Transactions on Software Engineering, 16(9), 980–992. doi:10.1109/32.58785.

  • Offutt, A. J., Lee, A., Rothermel, G., Untch, R. H., & Zapf, C. (1996). An experimental determination of sufficient mutant operators. ACM Transactions on Software Engineering and Methodology, 5(2), 99–118. doi:10.1145/227607.227610.

  • Pawlak, R., Monperrus, M., Petitprez, N., Noguera, C., & Seinturier, L. (2015). Spoon: A library for implementing analyses and transformations of Java source code. Software: Practice and Experience. doi:10.1002/spe.2346.

  • Ramanathan, M. K., Grama, A., & Jagannathan, S. (2006). Sieve: A tool for automatically detecting variations across program versions. In Proceedings of the 21st IEEE/ACM international conference on automated software engineering, ASE’06, IEEE Computer Society. (pp. 241–252) Washington, DC, USA. doi:10.1109/ASE.2006.61.

  • Ren, X., Shah, F., Tip, F., Ryder, B. G., & Chesley, O. (2004). Chianti: A tool for change impact analysis of Java programs. In Proceedings of the 19th annual ACM SIGPLAN conference on object-oriented programming, systems, languages, and applications, OOPSLA’04. ACM, (pp. 432–448). New York, NY, USA. doi:10.1145/1028976.1029012.

  • Robillard, M. P., & Murphy, G. C. (2002). Concern graphs: Finding and describing concerns using structural program dependencies. In Proceedings of the 24th international conference on software engineering, ICSE’02, ACM, (pp. 406–416). New York, NY, USA. doi:10.1145/581339.581390.

  • Seo, H., Sadowski, C., Elbaum, S., Aftandilian, E., & Bowdidge, R. (2014). Programmers’ build errors: A case study (at Google). In Proceedings of the 36th international conference on software engineering, ICSE’14, ACM, (pp. 724–734). New York, NY, USA. doi:10.1145/2568225.2568255.

  • Shu, G., Sun, B., Henderson, T., & Podgurski, A. (2013). JavaPDG: A new platform for program dependence analysis. In Proceedings of the 6th international conference on software testing, verification and validation, ICST’13 (pp. 408–415). doi:10.1109/ICST.2013.57.

  • Shu, G., Sun, B., Podgurski, A., & Cao, F. (2013). MFL: Method-level fault localization with causal inference. In Proceeding of the sixth international conference on software testing, verification and validation, ICST’13 (pp. 124–133). doi:10.1109/ICST.2013.31.

  • Strug, J., & Strug, B. (2012). Machine learning approach in mutation testing. In B. Nielsen, C. Weise (Eds.), Testing software and systems. Lecture notes in computer science (Vol. 7641, pp. 200–214). Berlin and Heidelberg: Springer.

  • Walker, R. J., Holmes, R., Hedgeland, I., Kapur, P., & Smith, A. (2006). A lightweight approach to technical risk estimation via probabilistic impact analysis. In Proceedings of the international workshop on mining software repositories, MSR’06, ACM (pp. 98–104) New York, NY, USA. doi:10.1145/1137983.1138008.

  • Zimmermann, T., & Nagappan, N. (2008). Predicting defects using network analysis on dependency graphs. In Proceedings of the 30th international conference on software engineering, ICSE’08, ACM (pp. 531–540). New York, NY, USA. doi:10.1145/1368088.1368161.


Author information

Corresponding author

Correspondence to Vincenzo Musco.


Cite this article

Musco, V., Monperrus, M. & Preux, P. A large-scale study of call graph-based impact prediction using mutation testing. Software Qual J 25, 921–950 (2017). https://doi.org/10.1007/s11219-016-9332-8
