Improving Fault Detection in Modified Code — A Study from the Telecommunication Industry

  • Regular Paper
  • Published in: Journal of Computer Science and Technology

Abstract

Many software systems are developed in a number of consecutive releases. In each release, not only is new code added, but existing code is often modified as well. In this study we show that modified code can be an important source of faults. Faults are widely recognized as one of the major cost drivers in software projects. Therefore, we look for methods that improve fault detection in modified code. We propose and evaluate a number of prediction models that increase the efficiency of fault detection. To build and evaluate our models we use data collected from two large telecommunication systems produced by Ericsson. We evaluate the performance of our models by applying them both to a different release of the system than the one they were built on and to a different system. The performance of our models is compared with that of the theoretically best model, with a simple model based on class size, and with analyzing the code in a random order (i.e., not using any model). We find that using our models provides a significant improvement both over not using any model at all and over using the simple size-based model. The gain offered by our models corresponds to 38–57% of the theoretical maximum gain.
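
To make the notion of "gain relative to the theoretical maximum" concrete, the sketch below shows one way such a figure could be computed. It is not the paper's exact procedure: the per-class data, the model scores, and the trapezoidal approximation are illustrative assumptions. The idea is to inspect classes in the order the model ranks them, track the cumulative share of faults found against the cumulative share of code analysed, and relate the model's curve to a random inspection order (no model) and to the theoretically best order (classes sorted by true fault density).

```python
# Minimal sketch (assumed, not the paper's procedure) of a gain metric for a
# fault-prediction model: how much of the theoretical maximum gain the model
# captures when classes are inspected in the order it ranks them.

def area_under_curve(loc, faults, order):
    """Area under the 'fraction of faults found vs. fraction of code analysed'
    curve when classes are inspected in the given order (trapezoidal rule)."""
    total_loc, total_faults = sum(loc), sum(faults)
    area = 0.0
    found = 0
    for i in order:
        prev_fraction = found / total_faults
        found += faults[i]
        # width of the slice is this class's share of the code base
        area += (loc[i] / total_loc) * (prev_fraction + found / total_faults) / 2
    return area

# Hypothetical per-class data: size in lines of code, faults actually found,
# and the model's predicted fault-proneness (all made up for illustration).
loc = [120, 300, 80, 450, 200, 60]
faults = [4, 1, 3, 0, 2, 5]
model_score = [0.8, 0.2, 0.5, 0.6, 0.4, 0.9]

classes = range(len(loc))
model_order = sorted(classes, key=lambda i: model_score[i], reverse=True)
best_order = sorted(classes, key=lambda i: faults[i] / loc[i], reverse=True)

model_area = area_under_curve(loc, faults, model_order)
best_area = area_under_curve(loc, faults, best_order)
random_area = 0.5  # expected curve for a random inspection order is the diagonal

gain = (model_area - random_area) / (best_area - random_area)
print(f"Share of the theoretical maximum gain: {gain:.0%}")
```

With this kind of normalisation, 0% means the model does no better than inspecting classes at random, while 100% would match the theoretically best ordering; the 38–57% reported in the abstract lies between these extremes.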

Author information

Corresponding author

Correspondence to Piotr Tomaszewski.

Additional information

This work was partly funded by The Knowledge Foundation in Sweden under a research grant for the project “Blekinge — Engineering Software Qualities (BESQ)” (http://www.bth.se/besq).


About this article

Cite this article

Tomaszewski, P., Lundberg, L. & Grahn, H. Improving Fault Detection in Modified Code — A Study from the Telecommunication Industry. J Comput Sci Technol 22, 397–409 (2007). https://doi.org/10.1007/s11390-007-9053-3

