Portability of executable service-oriented processes: metrics and validation

  • Original Research Paper
  • Published in: Service Oriented Computing and Applications

Abstract

A key promise of process languages based on open standards, such as the Web Services Business Process Execution Language, is the avoidance of vendor lock-in through the portability of processes among runtime environments. Although various runtimes claim to support this language today, each implements a different subset, which hampers portability and locks in users. It is our intention to improve this situation by enabling the measurement of the portability of executable service-oriented processes. This helps developers to assess their implementations and to decide whether it is feasible to invest the effort of porting a process to another runtime. In this paper, we define several software quality metrics that quantify the degree of portability of an executable, service-oriented process from different viewpoints. When integrated into a development environment, such metrics can help to improve the portability of the outcome. We validate the metrics theoretically with respect to measurement theory and construct validity using two validation frameworks. The validation is complemented with an empirical evaluation of the metrics using a large set of processes drawn from several process libraries.
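
The concrete metric definitions appear in the full text. As a rough, hypothetical illustration of the kind of measurement the abstract describes, the following Python sketch computes a naive portability ratio over a single BPEL process definition; the constructs flagged as non-portable and the ratio itself are assumptions made for this example, not the metrics defined in the paper.

    # Hypothetical sketch only: a naive "degree of portability" ratio for one
    # BPEL process definition. The constructs flagged as non-portable are
    # assumed for illustration and are not the classification used in the paper.
    import xml.etree.ElementTree as ET

    BPEL_NS = "{http://docs.oasis-open.org/wsbpel/2.0/process/executable}"

    # Assumed example set of constructs that some runtimes may not support.
    ASSUMED_NON_PORTABLE = {BPEL_NS + "validate", BPEL_NS + "compensateScope"}

    def portability_degree(process_file: str) -> float:
        """Return 1 - (non-portable elements / all elements) for a process file."""
        elements = list(ET.parse(process_file).getroot().iter())
        if not elements:
            return 1.0
        non_portable = sum(1 for e in elements if e.tag in ASSUMED_NON_PORTABLE)
        return 1.0 - non_portable / len(elements)

    if __name__ == "__main__":
        # 'process.bpel' is a placeholder path to a BPEL 2.0 process definition.
        print(round(portability_degree("process.bpel"), 2))

Values reported this way would be rounded to two decimal places, in line with note 3 below.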

Notes

  1. Betsy is a conformance testing tool for BPEL. For more information, see the project page: https://github.com/uniba-dsg/betsy.

  2. See the project page http://uniba-dsg.github.io/prope/ for more information and a description of how to use the static checker. A plugin implementation for the Sonarqube quality management platform is also available at http://uniba-dsg.github.io/sonar-prope-plugin/.

  3. For the remainder of the paper, we round all metric values to two decimal places.

  4. We deviate slightly from the specification by also treating event processing constructs, such as onEvent and onAlarm, as normal activities. In the specification, these are not listed as separate activities, but as part of the pick activity. Since they are almost identical to other activities, namely receive and wait, we consider a separate treatment worthwhile (a small counting sketch follows these notes).

  5. The network recently (July 2014) changed its name from Ohloh to Open Hub. For more information, see the network homepage located at https://www.openhub.net/.

  6. All statistical computations in this section were performed using the R software [56].

  7. Since we drop the assumption of normality in the following, we omit a presentation of the results of this test.

  8. The homepage of the Mono project is available at http://www.mono-project.com.

  9. ISO/IEC CD 25023 [29] is intended to revise the preceding ISO/IEC standards that define quality metrics, ISO/IEC TR 9126-2:2003 [26] and ISO/IEC TR 9126-3:2003 [27]. At the time of writing, [29] is still under development.
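
To make the counting convention of note 4 concrete, the following Python sketch tallies activities in a BPEL process definition and counts onEvent and onAlarm like any other activity. The set of activity names is a partial, assumed selection, not the complete list from the specification.

    # Minimal sketch of the counting convention from note 4: onEvent and
    # onAlarm are tallied as activities in their own right. The activity names
    # below are a partial, assumed selection for illustration.
    import xml.etree.ElementTree as ET
    from collections import Counter

    BPEL_NS = "{http://docs.oasis-open.org/wsbpel/2.0/process/executable}"
    COUNTED_AS_ACTIVITIES = {
        "receive", "reply", "invoke", "assign", "wait", "empty", "pick",
        "onEvent", "onAlarm",  # treated like normal activities, cf. note 4
    }

    def count_activities(process_file: str) -> Counter:
        """Tally activity occurrences in a single BPEL process definition."""
        root = ET.parse(process_file).getroot()
        local_names = (e.tag.replace(BPEL_NS, "") for e in root.iter())
        return Counter(n for n in local_names if n in COUNTED_AS_ACTIVITIES)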

References

  1. Athanasopoulos G, Tsalgatidou A, Pantazoglou M (2006) Interoperability among heterogeneous services. In: International conference on services computing, Chicago, USA

  2. Basci D, Misra S (2009) Measuring and evaluating a design complexity metric for XML schema documents. J Inf Sci Eng 25(5):1405–1425

  3. Bianculli D, Binder W, Drago ML (2010) Automated performance assessment for service-oriented middleware: a case study on BPEL engines. In: International conference on the World Wide Web, pp 141–150, Raleigh, NC, USA

  4. Boehm B, Brown J, Lipow M (1976) Quantitative evaluation of software quality. In: Proceedings of the 2nd international conference on software engineering, San Francisco, USA

  5. Boehm BW, Abts C, Brown AW, Chulani S, Clark BK, Horowitz E, Madachy R, Reifer DJ, Steece B (2000) Software cost estimation with Cocomo II. Prentice Hall, Englewood Cliffs ISBN-13: 978-0130266927

  6. Briand L, Morasca S, Basili V (1996) Property-based software engineering measurement. IEEE Trans Software Eng 22(1):68–86

  7. Cardoso J (2007) Business process quality metrics: log-based complexity of workflow patterns. In: On the move to meaningful internet systems 2007: CoopIS, DOA, ODBASE, GADA, and IS. Springer, pp 427–434

  8. Cardoso J, Binz T, Breitenbücher U, Kopp O, Leymann F (2013) Cloud computing automation: integration USDL and TOSCA. In: 25th international conference on advanced information systems engineering, pp 1–16, Valencia, Spain

  9. Cesari L, Lapadula A, Pugliese R, Tiezzi F (2010) A tool for rapid development of WS-BPEL applications. In: Proceedings of the 2010 ACM symposium on applied computing (SAC), Sierre, Switzerland

  10. Emam KE, Benlarbi S, Goel N, Rai S (2001) The confounding effect of class size on the validity of object-oriented metrics. IEEE Trans Software Eng 27(7):630–650

  11. Geiger M, Harrer S, Lenhard J, Casar M, Vorndran A, Wirtz G (2015) BPMN conformance in open source engines. In: 9th IEEE Symposium on Service Oriented System Engineering, San Francisco Bay, CA, USA

  12. Geiger M, Harrer S, Lenhard J, Wirtz G (2016) On the Evolution of BPMN 2.0 Support and Implementation. In: 10th IEEE symposium on service oriented system engineering, Oxford, UK

  13. Geiger M, Wirtz G (2013) BPMN 2.0 serialization—standard compliance issues and evaluation of modeling tools. In: 5th international workshop on enterprise modelling and information systems architectures, St. Gallen, Switzerland

  14. Geiger M, Wirtz G (2013) Detecting interoperability and correctness issues in BPMN 2.0 process models. In: 5th Central European workshop on services and their composition, Rostock, Germany

  15. Gilb T (1988) Principles of software engineering management. Addison Wesley, Reading. ISBN-13: 978-0201192469

  16. Glinz M (2008) A risk-based, value-oriented approach to quality requirements. IEEE Softw 25(2):34–41

  17. González LS, Rubio FG, González FR, Velthuis MP (2010) Measurement in business processes: a systematic review. Bus Process Manag J 16(1):114–134

  18. Hallwyl T, Henglein F, Hildebrandt T (2010) A standard-driven implementation of WS-BPEL 2.0. In: Proceedings of the 2010 ACM symposium on applied computing (SAC), Sierre, Switzerland

  19. Harrer S, Lenhard J (2012) Betsy—a BPEL engine test system. Bamberger Beiträge zur Wirtschaftsinformatik und Angewandten Informatik, no. 90, University of Bamberg. Technical report

  20. Harrer S, Lenhard J, Wirtz G (2012) BPEL conformance in open source engines. In: IEEE international conference on service-oriented computing and applications, IEEE, Taipei, Taiwan

  21. Harrer S, Lenhard J, Wirtz G (2013) Open source versus proprietary software in service-orientation: the case of BPEL engines. In: 11th international conference on service oriented computing (ICSOC), pp 99–113, Berlin, Germany

  22. Hinz S, Schmidt K, Stahl C (2005) Transforming BPEL to Petri nets. In: 3rd international conference on business process management, Nancy, France

  23. Hofmeister H, Wirtz G (2008) Supporting service-oriented design with metrics. In: Proceedings of the 12th international IEEE enterprise distributed object computing conference, Munich, Germany

  24. Højsgaard E, Hallwyl T (2012) Core BPEL: syntactic simplification of WS-BPEL 2.0. In: Proceedings of the 27th annual ACM symposium on applied computing, pp 1984–1991. ACM, Trento, Italy

  25. IEEE (1998) IEEE Std 1061-1998 (R2009), IEEE standard for a software quality metrics methodology. Revision of IEEE Std 1061-1992

  26. ISO/IEC (2003) Software engineering—product quality—Part 2: External metrics. 9126-2:2003

  27. ISO/IEC (2003) Software engineering—product quality—Part 3: Internal metrics. 9126-3:2003

  28. ISO/IEC (2011) Systems and software engineering—system and software quality requirements and evaluation (SQuaRE)—system and software quality models. 25010:2011

  29. ISO/IEC (2013) Systems and software engineering—systems and software quality requirements and evaluation (SQuaRE)—measurement of system and software product quality. 25023

  30. Kaner C, Bond W (2004) Software engineering metrics: what do they measure and how do we know? In: 10th international software metrics symposium, Chicago, USA

  31. Kang H, Yang X, Yuan S (2007) Modeling and verification of web services composition based on CPN. In: IFIP international conference on network and parallel computing workshops, Dalian, China

  32. Khalaf R, Keller A, Leymann F (2006) Business processes for web services: principles and applications. IBM Syst J 45(2):425–446

  33. Kolb S, Wirtz G (2014) Towards application portability in platform as a service. In: 8th international symposium on service-oriented system engineering, Oxford, UK

  34. Kopp O, Martin D, Wutke D, Leymann F (2009) The difference between graph-based and block-structured business process modelling languages. Enterp Model Inf Syst 4(1):3–13

  35. Lapadula A, Pugliese R, Tiezzi F (2008) A formal account of WS-BPEL. In: Proceedings of the 10th international conference on coordination models and languages, Oslo, Norway

  36. Lübke D (2007) Unit testing BPEL compositions. In: Test and analysis of service-oriented systems. Springer, Berlin, pp 149–171. ISBN: 978-3540729112

  37. Lenhard J, Geiger M, Wirtz G (2015) On the measurement of design-time adaptability for process-based systems. In: 9th international IEEE symposium on service-oriented system engineering (SOSE), San Francisco Bay, USA

  38. Lenhard J, Harrer S, Wirtz G (2013) Measuring the installability of service orchestrations using the SQuaRE method. In: IEEE international conference on service-oriented computing and applications, IEEE, Kauai, Hawaii, USA

  39. Lenhard J, Schönberger A, Wirtz G (2011) Edit distance-based pattern support assessment of orchestration languages. In: 19th international conference on cooperative information systems, Hersonissos, Greece

  40. Lenhard J, Wirtz G (2013) Measuring the portability of service-oriented processes. In: 17th IEEE international enterprise distributed object computing conference (EDOC2013), Vancouver, Canada

  41. Letouzey JL, Ilkiewicz M (2012) Managing technical debt with the SQALE method. IEEE Softw 29(6):44–51

  42. Leymann F (2010) BPEL vs. BPMN 2.0: should you care? In: 2nd international workshop on BPMN, Potsdam, Germany

  43. Lohmann N, Verbeek E, Dijkman RM (2009) Petri net transformations for business processes—a survey. In: Transactions on Petri nets and other models of concurrency, vol 2, pp 46–63

  44. Mann HB, Whitney DR (1947) On a test of whether one of two random variables is stochastically larger than the other. Ann Math Stat 18(1):50–60

  45. Meneely A, Smith B, Williams L (2012) Validating software metrics: a spectrum of philosophies. ACM Trans Softw Eng Methodol 21(4):1–28

  46. Muketha G, Ghani A, Selamat M, Atan R (2010) Complexity metrics for executable business processes. Inf Technol J 9(7):1317–1326

  47. OASIS (2007) Web services business process execution language, version 2.0

  48. OASIS (2013) Topology and orchestration specification for cloud applications, version 1.0

  49. OMG (2011) Business process model and notation (BPMN), version 2.0

  50. Ortega M, Pérez M, Rojas T (2003) Construction of a systemic quality model for evaluating a software product. Softw Qual J 11(3):219–242

  51. Ouyang C, Dumas M, van der Aalst WMP, ter Hofstede AHM, Mendling J (2009) From business process models to process-oriented software systems. ACM Trans Softw Eng Methodol 19(2)

  52. Overhage S, Birkmeier D, Schlauderer S (2012) Quality marks, metrics and measurement procedures for business process models: the 3QM-framework. Bus Inf Syst Eng 4(5):229–246

  53. Peltz C (2003) Web services orchestration and choreography. IEEE Comput 36(10):46–52

  54. Perepletchikov M, Ryan C, Frampton K, Tari Z (2007) Coupling metrics for predicting maintainability in service-oriented designs. In: IEEE Australian software engineering conference

  55. Petcu D, Macariu G, Panica S, Crăciun C (2013) Portable cloud applications—from theory to practice. Future Gener Comput Syst 29(6):1417–1430

  56. R Core Team (2013) R: a language and environment for statistical computing. R Foundation for Statistical Computing, Vienna. http://www.R-project.org

  57. Shapiro S, Wilk MB (1965) An analysis of variance test for normality (complete samples). Biometrika 52(3–4):591–611

  58. Simon B, Goldschmidt B, Kondorosi K (2010) A human readable platform independent domain specific language for BPEL. In: 2nd international conference on networked digital technologies, Prague, Czech Republic

  59. SOA Manifesto Working Group (2009) SOA Manifesto. In: SOA symposium, Rotterdam, The Netherlands

  60. Sun K, Li Y (2013) Effort estimation in cloud migration process. In: 7th IEEE International symposium on service-oriented engineering, San Francisco Bay, USA

  61. Tan W, Fan Y, Zhou M (2009) A Petri net-based method for compatibility analysis and composition of web services in business process execution language. IEEE Trans Autom Sci Eng 6(1):94–106

  62. van der Aalst WMP, ter Hofstede AHM (2005) YAWL: yet another workflow language. Inf Syst 30(4):245–275

  63. van der Aalst WMP, ter Hofstede AHM, Kiepuszewski B, Barros AP (2003) Workflow patterns. Distrib Parallel Databases 14(1):5–51

  64. Vanderfeesten I, Cardoso J, Mendling J, Reijers H, van der Aalst W (2007) Quality metrics for business process models. Future Strategies, Lighthouse Point

  65. Wang Y, Taher Y, van den Heuvel WJ (2012) Towards smart service networks: an interdisciplinary service assessment metrics. In: 4th international workshop on service oriented enterprise architecture for enterprise engineering, Beijing, China

  66. Weyuker E (1988) Evaluating software complexity measures. IEEE Trans Software Eng 14(9):1357–1365

  67. WfMC (2012) Process definition interface—XML process definition language, version 2.2

  68. White B (2012) Pro WF 4.5. Apress. ISBN-13: 978-1-4302-4383-0

  69. Wilcoxon F (1945) Individual comparisons by ranking methods. Biom Bull 1(6):80–83

  70. Wohlin C, Runeson P, Höst M, Ohlsson MC, Regnell B, Wesslén A (2012) Experimentation in software engineering. Springer, Berlin

  71. zur Muehlen M, Recker J (2013) How much language is enough? Theoretical and practical use of the business process modeling notation. Seminal contributions to information systems engineering. Springer, Berlin. ISBN: 978-3642369261

Author information

Corresponding author

Correspondence to Jörg Lenhard.

Cite this article

Lenhard, J., Wirtz, G. Portability of executable service-oriented processes: metrics and validation. SOCA 10, 391–411 (2016). https://doi.org/10.1007/s11761-016-0195-4
