Abstract
This paper presents a technique to measure and visualize execution-path coverage of test cases in the context of model-based software systems testing. Our technique provides visual feedback on the tests, their coverage, and their diversity. We provide two types of visualizations for path coverage, based on so-called state-based graphs and path-based graphs. Our approach is implemented as an extension of the Modbat tool for model-based testing and is experimentally evaluated on a collection of examples, including the ZooKeeper distributed coordination service. Our experimental results show that the state-based visualization is effective at relating tests to the model structure, while the path-based visualization highlights distinct paths well, in particular linearly independent paths. Furthermore, our graph abstractions retain the characteristics of distinct execution paths while removing some of the complexity of the graph.
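To illustrate the idea behind the state-based visualization, the following is a minimal sketch (not Modbat's actual implementation; all names and the toy model are hypothetical) of how recorded test runs over a state machine can be aggregated into transition-coverage counts and rendered as a Graphviz DOT graph, where edge labels indicate how often each transition was exercised:

```python
from collections import Counter

def edge_coverage(paths):
    """Aggregate recorded execution paths into transition (edge) counts.

    Each path is the sequence of model states visited by one test run.
    """
    counts = Counter()
    for path in paths:
        for src, dst in zip(path, path[1:]):
            counts[(src, dst)] += 1
    return counts

def to_dot(counts):
    """Render edge counts as a Graphviz DOT digraph; each edge label
    shows how many test runs exercised that transition."""
    lines = ["digraph coverage {"]
    for (src, dst), n in sorted(counts.items()):
        lines.append(f'  "{src}" -> "{dst}" [label="{n}"];')
    lines.append("}")
    return "\n".join(lines)

# Two hypothetical test runs over a small connection model
paths = [
    ["init", "connected", "closed"],
    ["init", "connected", "connected", "closed"],
]
print(to_dot(edge_coverage(paths)))
```

Feeding the resulting DOT text to Graphviz yields a state-based view in which frequently covered transitions can be spotted at a glance; a path-based view would instead keep each distinct state sequence as its own node chain rather than merging shared transitions.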
© 2019 Springer Nature Switzerland AG
Cite this paper
Wang, R., Artho, C., Kristensen, L.M., Stolz, V. (2019). Visualization and Abstractions for Execution Paths in Model-Based Software Testing. In: Ahrendt, W., Tapia Tarifa, S. (eds) Integrated Formal Methods. IFM 2019. Lecture Notes in Computer Science(), vol 11918. Springer, Cham. https://doi.org/10.1007/978-3-030-34968-4_26
Print ISBN: 978-3-030-34967-7
Online ISBN: 978-3-030-34968-4