Estimation Under Uncertainty

Chapter in: Software Project Effort Estimation

Abstract

Uncertainty and inaccuracy are inherent properties of estimation, in particular predictive estimation. Measuring and managing uncertainty and inaccuracy lie at the heart of good estimation. Flyvbjerg (2006) distinguished three categories of reasons for inaccuracies in project forecasts: technical, psychological, and political-economic.


Notes

  1.

    Project performed at Landmark Graphics, a vendor of commercial software for the oil and gas exploration and production market.

  2.

    Beta-PERT is a variant of the Beta distribution; it takes its name from using the same assumption about the mean as PERT networks. It requires the same three parameters as the triangular distribution: minimum, most likely, and maximum. It is considered to better reflect human predictions and should therefore be preferred over the triangular distribution in the context of expert-based effort estimation (see the sketch after these notes).

  3.

    We discuss the difference between bias and precision in Sect. 4.3.2.

  4.

    Pred.m measures the percentage of estimates that fall within a given deviation m of the actual value. The deviation m reflects estimation error and is commonly instantiated with the Magnitude of Relative Error (MRE) we present in (4.3); in this case, Pred.m measures the percentage of estimates whose MRE does not exceed m (see the sketch after these notes).

  5.

    We discuss top-down and bottom-up estimation strategies in Sects. 5.1 and 5.2, respectively.
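
A minimal Python sketch of the Beta-PERT construction from note 2, assuming the common PERT weighting lambda = 4; the function name and the example effort values are illustrative assumptions, not taken from the book:

    import random

    def beta_pert_sample(a, m, b, lam=4.0):
        # Draw one sample from a Beta-PERT distribution given the minimum (a),
        # most likely (m), and maximum (b) -- the same three parameters as the
        # triangular distribution. With lam = 4 the mean is (a + 4m + b) / 6,
        # the classic PERT mean.
        alpha = 1 + lam * (m - a) / (b - a)  # first Beta shape parameter
        beta = 1 + lam * (b - m) / (b - a)   # second Beta shape parameter
        # Rescale a standard Beta(alpha, beta) draw onto the [a, b] interval.
        return a + (b - a) * random.betavariate(alpha, beta)

    # Hypothetical expert judgment: 80 / 100 / 160 person-hours.
    samples = [beta_pert_sample(80, 100, 160) for _ in range(10_000)]
    print(sum(samples) / len(samples))  # close to (80 + 4*100 + 160)/6 = 106.7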
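
Similarly, a small sketch of the MRE and Pred.m measures from note 4, with made-up actual and estimated effort values for illustration:

    def mre(actual, estimate):
        # Magnitude of Relative Error, as in (4.3): |actual - estimate| / actual.
        return abs(actual - estimate) / actual

    def pred(m, actuals, estimates):
        # Percentage of estimates whose MRE is at most m.
        errors = [mre(a, e) for a, e in zip(actuals, estimates)]
        return 100.0 * sum(1 for err in errors if err <= m) / len(errors)

    # Illustrative data: five projects' actual vs. estimated effort (person-hours).
    actuals = [100, 250, 400, 120, 300]
    estimates = [110, 200, 390, 180, 310]
    print(pred(0.25, actuals, estimates))  # 80.0: 4 of 5 estimates within 25%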

References

  • Boehm BW (1981) Software engineering economics. Prentice-Hall, Englewood Cliffs, NJ

  • Conte SD, Dunsmore HE, Shen VY (1986) Software engineering metrics and models. Benjamin-Cummings Publishing Company, Menlo Park, CA

  • DeMarco T (1982) Controlling software projects: management, measurement, and estimates. Prentice-Hall, Englewood Cliffs, NJ

  • Efron B, Tibshirani RJ (1994) An introduction to the bootstrap. Chapman and Hall, New York

  • Flyvbjerg B (2006) From Nobel Prize to project management: getting risks right. Proj Manage J 37(3):5–15

  • ISO (2009) ISO/IEC 20926 – IFPUG functional size measurement method 2009, 2nd edn. International Organization for Standardization, Geneva

  • Jørgensen M (2004c) Realism in assessment of effort estimation uncertainty: it matters how you ask. IEEE Trans Softw Eng 30(4):209–217

  • Jørgensen M (2005a) Practical guidelines for expert-judgment-based software effort estimation. IEEE Softw 22(3):57–63

  • Jørgensen M (2005b) Evidence-based guidelines for assessment of software development cost uncertainty. IEEE Trans Softw Eng 31(11):942–954

  • Jørgensen M, Sjøberg DIK (2003) An effort prediction interval approach based on the empirical distribution of previous estimation accuracy. Inform Softw Technol 45(3):123–136

  • Jørgensen M, Teigen KH, Moløkken-Østvold K (2004) Better sure than safe? Over-confidence in judgement based software development effort prediction intervals. J Syst Softw 70(1–2):79–93

  • Kahneman D (2011) Thinking, fast and slow. Farrar, Straus and Giroux, New York

  • Kitchenham B, Linkman S (1997) Estimates, uncertainty, and risk. IEEE Softw 14(3):69–74

  • Little T (2006) Schedule estimation and uncertainty surrounding the cone of uncertainty. IEEE Softw 23(3):48–54

  • McConnell S (2006) Software estimation: demystifying the black art. Microsoft Press, Redmond, WA

  • Mittas N, Angelis L (2009) Bootstrap prediction intervals for a semi-parametric software cost estimation model. In: Proceedings of the 35th Euromicro conference on software engineering and advanced applications, 27–29 August, Patras, Greece, pp 293–299

  • Miyazaki Y, Takanou A, Nozaki H, Nakagawa N, Okada K (1991) Method to estimate parameter values in software prediction models. Inform Softw Technol 33(3):239–243

  • Pawlak Z (1982) Rough sets. Int J Inform Comput Sci 11(5):341–356

  • Pawlak Z (1991) Rough sets: theoretical aspects of reasoning about data. Kluwer, Dordrecht

  • Pawlak Z, Skowron A (2007) Rudiments of rough sets. Inform Sci 177(1):3–27

  • Port D, Korte M (2008) Comparative studies of the model evaluation criterions MMRE and PRED in software cost estimation research. In: Proceedings of the 2nd international symposium on empirical software engineering and measurement, Kaiserslautern, Germany, pp 51–60

  • Putnam LH, Myers W (1992) Measures for excellence: reliable software on time, within budget. Prentice-Hall, Englewood Cliffs, NJ

  • Sauer C, Gemino A, Reich BH (2007) The impact of size and volatility on IT project performance. Commun ACM 50(11):79–84

  • Stamelos I, Angelis L (2001) Managing uncertainty in project portfolio cost estimation. Inform Softw Technol 43(13):759–768

  • Stutzke RD (2005) Estimating software-intensive systems: projects, products, and processes. Addison-Wesley Professional, Reading, MA

  • Wang ETG, Ju P-H, Jiang JJ, Klein G (2008) The effects of change control and management review on software flexibility and project performance. Inform Manage 45(7):438–443

  • Zadeh LA (1965) Fuzzy sets. Inform Control 8(3):338–353

Further Reading

  • S. French (1995), “Uncertainty and imprecision: Modeling and analysis,” Journal of the Operational Research Society, vol. 46, pp. 70–79.

The author provides a general discussion of uncertainty and imprecision in the context of statistics and decision making. He identifies potential sources of uncertainty and their impact on analysis, and discusses possible ways of modeling uncertainty and methods for analyzing it.

  • B. Kitchenham and S. Linkman (1997), “Estimates, uncertainty, and risk,” IEEE Software, vol. 14, no. 3, pp. 69–74.

    In this article, the authors discuss in detail the potential sources of uncertainty in the context of software project estimation.

  • M. Jørgensen and S. Grimstad (2008), “Avoiding Irrelevant and Misleading Information When Estimating Development Effort,” IEEE Software, vol. 25, no. 3, pp. 78–83.

In this article, the authors discuss the sources of uncertainty in the context of expert-based effort estimation. In particular, they discuss how estimates are affected when experts are given irrelevant and misleading information.

  • S. Grimstad and M. Jørgensen (2006), “A framework for the analysis of software cost estimation accuracy”, Proceedings of the International Symposium on Empirical Software Engineering, Rio de Janeiro, Brazil, 2006, pp. 58–65.

Based upon an analysis of effort estimation studies, the authors propose a framework for analyzing effort estimation accuracy. Within the framework, they propose a list of typical sources of estimation error.

  • T. Little (2006), “Schedule Estimation and Uncertainty Surrounding the Cone of Uncertainty.” IEEE Software, vol. 23, no. 3, pp. 48–54.

The author discusses aspects of estimation uncertainty (here, schedule estimation) using example industrial data.

  • Z. Xu and T.M. Khoshgoftaar (2004), “Identification of Fuzzy Models of Software Cost Estimation,” Fuzzy Sets and Systems, vol. 145, no. 1, pp. 141–163.

The authors describe a fuzzy effort modeling technique that deals with linguistic data and uses historical project data to automatically generate fuzzy membership functions and fuzzy rules.

  • J. Li and G. Ruhe (2006), “A Comparative Study of Attribute Weighting Heuristics for Effort Estimation by Analogy,” Proceedings of the International Symposium on Empirical Software Engineering, Rio de Janeiro, Brazil, pp. 66–74.

In an empirical study, the authors compare several techniques for quantifying the impact of effort drivers on development effort. The investigated techniques are used in the context of an analogy-based estimation method called AQUA+. One of the techniques they employ for modeling the uncertain impact of effort drivers is based on rough set theory.

  • S. McConnell (2006), Software Estimation: Demystifying the Black Art, Microsoft Press.

    In Chap. 4 of his book, McConnell considers several ways of addressing estimation uncertainty. He provides several means for narrowing the cone of uncertainty and for considering the remaining uncertainty in project estimates, targets, and commitments.

  • A. Stellman and J. Greene (2005), Applied Software Project Management, O’Reilly Media.

In Chap. 9 of their book, the authors discuss project change. They investigate the most common reasons for failed project changes and provide guidelines on how to successfully steer projects through change. Additionally, in Sect. 6.4, the authors discuss a change control process for dealing with changes in requirements. They highlight the need to implement only those changes that are worth pursuing and to prevent unnecessary or overly costly changes. In practice, many changes that initially look like good ideas are ultimately thrown out once the true cost-benefit ratio of the change becomes known.

  • E.T. Wang, P. Ju, J.J. Jiang, and G. Klein (2008), “The effects of change control and management review on software flexibility and project performance,” Information & Management, vol. 45, no. 7, pp. 438–443.

This article discusses the link between project performance, software flexibility, and management interventions. In particular, the authors investigate software flexibility as a mediator between project performance and two commonly recommended management control mechanisms: management review and change control. In a survey of over 200 project managers from the Project Management Institute, the authors confirm that the level of control activities during software development is a significant facilitator of software flexibility, which, in turn, improves project performance.

Copyright information

© 2014 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Trendowicz, A., Jeffery, R. (2014). Estimation Under Uncertainty. In: Software Project Effort Estimation. Springer, Cham. https://doi.org/10.1007/978-3-319-03629-8_4

  • DOI: https://doi.org/10.1007/978-3-319-03629-8_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-03628-1

  • Online ISBN: 978-3-319-03629-8
