ConfigCrusher: towards white-box performance analysis for configurable systems

Abstract

Stakeholders of configurable systems are often interested in knowing how configuration options influence the performance of a system, for example, to facilitate debugging and optimizing these systems. Several black-box approaches can be used to obtain this information, but they either sample a large number of configurations to make accurate predictions or miss important performance-influencing interactions when sampling few configurations. Furthermore, black-box approaches cannot pinpoint the parts of a system that are responsible for performance differences among configurations. This article proposes ConfigCrusher, a white-box performance analysis that inspects the implementation of a system to guide the performance analysis, exploiting several insights about configurable systems in the process. ConfigCrusher employs a static data-flow analysis to identify how configuration options may influence control-flow statements and instruments the code regions corresponding to these statements to dynamically analyze the influence of configuration options on the regions’ performance. Our evaluation on 10 configurable systems shows the feasibility of our white-box approach: it builds performance-influence models more efficiently than current state-of-the-art approaches while being similarly or more accurate. Overall, we showcase the benefits of white-box performance analyses and their potential to outperform black-box approaches while providing additional information for analyzing configurable systems.
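For illustration only (the class, method, and option names below are hypothetical and do not reflect ConfigCrusher's actual instrumentation), a minimal Java sketch of the core idea: a control-flow statement whose condition depends on a configuration option delimits a code region, and timing that region during execution shows how much running time the option accounts for in a given configuration.

    // Hypothetical sketch: an option-dependent control-flow statement delimits a
    // region whose execution time is recorded each time it runs.
    import java.util.HashMap;
    import java.util.Map;

    public class RegionTimingSketch {

        // Accumulated execution time per region (region name -> nanoseconds).
        static final Map<String, Long> REGION_TIMES = new HashMap<>();

        static void run(boolean compress, byte[] data) {
            if (compress) {                      // control flow influenced by option COMPRESS
                long start = System.nanoTime();  // enter instrumented region "compress"
                deflate(data);
                REGION_TIMES.merge("compress", System.nanoTime() - start, Long::sum);
            }
            // Code outside the region does not depend on COMPRESS and needs no timing.
        }

        static void deflate(byte[] data) {
            // Placeholder workload standing in for option-specific behavior.
            for (int i = 0; i < data.length; i++) { data[i] ^= 0x5A; }
        }

        public static void main(String[] args) {
            run(true, new byte[1 << 20]);
            run(false, new byte[1 << 20]);
            System.out.println(REGION_TIMES);
        }
    }

Under these assumptions, region times collected from a small set of measured configurations could be aggregated into a performance-influence model of the form time(c) ≈ β0 + βCOMPRESS · COMPRESS, where βCOMPRESS is the measured cost of the region guarded by the option.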

Notes

  1. Though SPLat was designed for unit testing software product lines, the algorithm can be used to reduce the number of configurations to sample.
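A minimal sketch of this reduction idea (with hypothetical names; it is not SPLat's actual implementation): if every option read is routed through a tracker, configurations that differ only in options never read during an execution cannot change that execution's behavior, so they do not need to be measured separately.

    // Hypothetical sketch of dynamic option-read tracking: options that are never
    // read during a run cannot influence it, so varying only those options would
    // produce redundant executions.
    import java.util.HashSet;
    import java.util.Map;
    import java.util.Set;

    public class OptionReadTracker {

        private final Set<String> readOptions = new HashSet<>();
        private final Map<String, Boolean> configuration;

        public OptionReadTracker(Map<String, Boolean> configuration) {
            this.configuration = configuration;
        }

        // All option accesses go through this method so reads can be recorded.
        public boolean get(String option) {
            readOptions.add(option);
            return configuration.getOrDefault(option, Boolean.FALSE);
        }

        // Only the options observed here need to be varied in further runs.
        public Set<String> readSoFar() {
            return new HashSet<>(readOptions);
        }
    }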

References

  • Al-Hajjaji, M., Krieter, S., Thüm, T., Lochau, M., Saake, G.: Incling: efficient product-line testing using incremental pairwise sampling. In: Proceedings of the International Conference Generative Programming and Component Engineering (GPCE), pp. 144–155. ACM, New York (2016)

  • Aldrich, J., Garlan, D., Kaestner, C., Le Goues, C., Mohseni-Kabir, A., Ruchkin, I., Samuel, S., Schmerl, B., Timperley, C.S., Veloso, M., Voysey, I., Biswas, J., Guha, A., Holtz, J., Camara, J., Jamshidi, P.: Model-based adaptation for robotics software. IEEE Softw. 36(2), 83–90 (2019)

  • Andreasen, E.S., Møller, A., Nielsen, B.B.: Systematic approaches for increasing soundness and precision of static analyzers. In: Proceedings of the International Workshop State of the Art in Program Analysis (SOAP), pp. 31–36. ACM, New York (2017). https://doi.org/10.1145/3088515.3088521

  • Angerer, F., Grimmer, A., Prähofer, H., Grünbacher, P.: Configuration-aware change impact analysis. In: Proceedings of the International Conference Automated Software Engineering (ASE), pp. 385–395. IEEE Computer Society, Washington, DC (2015)

  • Apel, S., Batory, D., Kästner, C., Saake, G.: Feature-Oriented Software Product Lines: Concepts and Implementation. Springer, Berlin (2013)

  • Arzt, S., Rasthofer, S., Fritz, C., Bodden, E., Bartel, A., Klein, J., Le Traon, Y., Octeau, D., McDaniel, P.: FlowDroid: precise context, flow, field, object-sensitive and lifecycle-aware taint analysis for Android apps. In: Proceedings of the Conference Programming Language Design and Implementation (PLDI), pp. 259–269. ACM, New York (2014)

  • Austin, T.H., Flanagan, C.: Efficient purely-dynamic information flow analysis. In: Proceedings of the Workshop Programming Languages and Analysis for Security (PLAS), pp. 113–124. ACM, New York (2009)

  • Austin, T.H., Flanagan, C.: Multiple facets for dynamic information flow. In: Proceedings of the Symposium Principles of Programming Languages (POPL), pp. 165–178. ACM, New York (2012)

  • Avdiienko, V., Kuznetsov, K., Gorla, A., Zeller, A., Arzt, S., Rasthofer, S., Bodden, E.: Mining apps for abnormal usage of sensitive data. In: Proceedings of the International Conference Software Engineering (ICSE), pp. 426–436. IEEE Press, Piscataway (2015)

  • Barros, P., Just, R., Millstein, S., Vines, P., Dietl, W., d’Amorim, M., Ernst, M.D.: Static analysis of implicit control flow: resolving Java reflection and Android intents (t). In: Proceedings of the International Conference Automated Software Engineering (ASE), pp. 669–679. IEEE Computer Society, Washington, DC (2015)

  • Becker, S., Koziolek, H., Reussner, R.: The palladio component model for model-driven performance prediction. J. Syst. Softw. 82(1), 3–22 (2009)

  • Bell, J., Kaiser, G.: Phosphor: illuminating dynamic data flow in commodity JVMs. SIGPLAN Not. 49(10), 83–101 (2014)

  • Bodden, E.: Self-adaptive static analysis. In: Proceedings of the International Conference Software Engineering (ICSE): New Ideas and Emerging Results, pp. 45–48. ACM, New York (2018)

  • Bruneton, E., Lenglet, R., Coupaye, T.: ASM: a code manipulation tool to implement adaptable systems. Adapt. Ext. Compon. Syst. 30(19), 1 (2002)

  • Cashman, M., Cohen, M.B., Ranjan, P., Cottingham, R.W.: Navigating the maze: the impact of configurability in bioinformatics software. In: Proceedings of the International Conference Automated Software Engineering (ASE), pp. 757–767. ACM, New York (2018)

  • Castro, P.D.O., Akel, C., Petit, E., Popov, M., Jalby, W.: Cere: LLVM-based codelet extractor and replayer for piecewise benchmarking and optimization. ACM Trans. Archit. Code Optim. (TACO) 12(1), 6:1–6:24 (2015)

  • Christakis, M., Bird, C.: What developers want and need from program analysis: an empirical study. In: Proceedings of the International Conference Automated Software Engineering (ASE), pp. 332–343. ACM, New York (2016)

  • Cito, J., Leitner, P., Bosshard, C., Knecht, M., Mazlami, G., Gall, H.C.: Performancehat: augmenting source code with runtime performance traces in the IDE. In: Proceedings of the International Conference Software Engineering, Companion Proceedings, pp. 41–44. ACM, New York (2018)

  • Do, L.N.Q., Ali, K., Livshits, B., Bodden, E., Smith, J., Murphy-Hill, E.: Just-in-time static analysis. In: Proceedings of the International Symposium Software Testing and Analysis (ISSTA), pp. 307–317. ACM, New York (2017)

  • Dong, Z., Andrzejak, A., Lo, D., Costa, D.: Orplocator: identifying read points of configuration options via static analysis. In: Proceedings of the International Symposium Software Reliability Engineering (ISSRE), pp. 185–195 (2016)

  • Enck, W., Gilbert, P., Chun, B.G., Cox, L.P., Jung, J., McDaniel, P., Sheth, A.N.: Taintdroid: an information-flow tracking system for realtime privacy monitoring on smartphones. In: Proceedings of the Conference Operating Systems Design and Implementation (OSDI), pp. 393–407. USENIX Association, Berkeley (2010)

  • Esfahani, N., Elkhodary, A., Malek, S.: A learning-based framework for engineering feature-oriented self-adaptive software systems. IEEE Trans. Softw. Eng. 39(11), 1467–1493 (2013)

  • Garbervetsky, D., Zoppi, E., Livshits, B.: Toward full elasticity in distributed static analysis: the case of callgraph analysis. In: Proceedings of the European Software Engineering Conference Foundations of Software Engineering (ESEC/FSE), pp. 442–453. ACM, New York (2017). https://doi.org/10.1145/3106237.3106261

  • Georges, A., Buytaert, D., Eeckhout, L.: Statistically rigorous Java performance evaluation. SIGPLAN Not. 42(10), 57–76 (2007)

  • Gregg, B.: The flame graph. Commun. ACM 59(6), 48–57 (2016)

  • Gui, J., Li, D., Wan, M., Halfond, W.G.J.: Lightweight measurement and estimation of mobile ad energy consumption. In: Proceedings of the International Workshop Green and Sustainable Software (GREENS), pp. 1–7. ACM, New York (2016)

  • Guo, J., Czarnecki, K., Apel, S., Siegmund, N., Wąsowski, A.: Variability-aware performance prediction: a statistical learning approach. In: Proceedings of the International Conference Automated Software Engineering (ASE), pp. 301–311. IEEE Computer Society/ACM, London/New York (2013)

  • Gupta, A., Zimmermann, T., Bird, C., Nagappan, N., Bhat, T., Emran, S.: Mining energy traces to aid in software development: an empirical case study. In: Proceedings of the International Symposium Empirical Software Engineering and Measurement (ESEM), pp. 40:1–40:8. ACM, New York (2014)

  • Halin, A., Nuttinck, A., Acher, M., Devroey, X., Perrouin, G., Baudry, B.: Test them all, is it worth it? Assessing configuration sampling on the Jhipster web development stack. Empir. Softw. Eng. 24, 674–717 (2018)

  • Han, X., Yu, T.: An empirical study on performance bugs for highly configurable software systems. In: Proceedings of the International Symposium Empirical Software Engineering and Measurement (ESEM), pp. 23:1–23:10. ACM, New York (2016)

  • Han, X., Yu, T., Lo, D.: Perflearner: learning from bug reports to understand and generate performance test frames. In: Proceedings of the International Conference Automated Software Engineering (ASE), pp. 17–28. ACM, New York (2018)

  • Hao, S., Li, D., Halfond, W.G.J., Govindan, R.: Estimating mobile application energy consumption using program analysis. In: Proceedings of the International Conference Software Engineering (ICSE), pp. 92–101. IEEE Press, Piscataway (2013)

  • Harchol-Balter, M.: Performance Modeling and Design of Computer Systems: Queueing Theory in Action, 1st edn. Cambridge University Press, New York (2013)

  • Hervieu, A., Baudry, B., Gotlieb, A.: Pacogen: automatic generation of pairwise test configurations from feature models. In: International Symposium Software Reliability Engineering, pp. 120–129 (2011)

  • Hervieu, A., Marijan, D., Gotlieb, A., Baudry, B.: Optimal minimisation of pairwise-covering test configurations using constraint programming. Inf. Softw. Technol. 71, 129–146 (2016)

  • Hoffmann, H., Sidiroglou, S., Carbin, M., Misailovic, S., Agarwal, A., Rinard, M.: Dynamic knobs for responsive power-aware computing. In: Proceedings of the International Conference Architectural Support for Programming Languages and Operating Systems (ASPLOS), pp. 199–212. ACM, New York (2011)

  • Hubaux, A., Xiong, Y., Czarnecki, K.: A user survey of configuration challenges in Linux and ECOS. In: Proceedings of the Workshop Variability Modeling of Software-Intensive Systems (VAMOS), pp. 149–155. ACM, New York (2012)

  • Hutter, F., Hoos, H.H., Leyton-Brown, K.: Sequential model-based optimization for general algorithm configuration. In: Proceedings of the International Conference Learning and Intelligent Optimization, pp. 507–523. Springer, Berlin (2011)

  • Jabbarvand, R., Sadeghi, A., Bagheri, H., Malek, S.: Energy-aware test-suite minimization for Android apps. In: Proceedings of the International Symposium Software Testing and Analysis (ISSTA), pp. 425–436. ACM, New York (2016)

  • Jamshidi, P., Casale, G.: An uncertainty-aware approach to optimal configuration of stream processing systems. In: International Symposium Modeling, Analysis and Simulation of Computer and Telecommunication Systems (MASCOTS), pp. 39–48 (2016)

  • Jamshidi, P., Siegmund, N., Velez, M., Kästner, C., Patel, A., Agarwal, Y.: Transfer learning for performance modeling of configurable systems: an exploratory analysis. In: Proceedings of the International Conference Automated Software Engineering (ASE). ACM, New York (2017a)

  • Jamshidi, P., Velez, M., Kästner, C., Siegmund, N., Kawthekar, P.: Transfer learning for improving model predictions in highly configurable software. In: Proceedings of the International Symposium Software Engineering for Adaptive and Self-managing Systems (SEAMS), pp. 31–41. IEEE Computer Society, Los Alamitos (2017b)

  • Jamshidi, P., Velez, M., Kästner, C., Siegmund, N.: Learning to sample: exploiting similarities across environments to learn performance models for configurable systems. In: Proceedings of the International Symposium Foundations of Software Engineering (FSE), pp. 71–82. ACM, New York (2018)

  • Jin, D., Qu, X., Cohen, M.B., Robinson, B.: Configurations everywhere: implications for testing and debugging in practice. In: Companion Proceedings of the International Conference Software Engineering, pp. 215–224. ACM, New York (2014)

  • Kim, C.H.P., Batory, D.S., Khurshid, S.: Reducing combinatorics in testing product lines. In: Proceedings of the International Conference Aspect-Oriented Software Development (AOSD), pp. 57–68. ACM, New York (2011)

  • Kim, C.H.P., Marinov, D., Khurshid, S., Batory, D., Souto, S., Barros, P., d’Amorim, M.: SPLat: lightweight dynamic analysis for reducing combinatorics in testing configurable systems. In: Proceedings of the European Software Engineering Conference Foundations of Software Engineering (ESEC/FSE), pp. 257–267. ACM, New York (2013)

  • Kolesnikov, S., Siegmund, N., Kästner, C., Grebhahn, A., Apel, S.: Tradeoffs in modeling performance of highly configurable software systems. Software and System Modeling (SoSyM) (2018)

  • Konietschke, F., Hothorn, L.A., Brunner, E.: Rank-based multiple test procedures and simultaneous confidence intervals. Electron. J. Stat. 6, 738–759 (2012)

  • Kuhn, D.R., Kacker, R.N., Lei, Y.: Introduction to Combinatorial Testing, vol. 1. Chapman & Hall, London (2013)

  • Lerch, J., Späth, J., Bodden, E., Mezini, M.: Access-path abstraction: scaling field-sensitive data-flow analysis with unbounded access paths (t). In: Proceedings of the International Conference Automated Software Engineering (ASE), pp. 619–629. IEEE Computer Society, Washington, DC (2015)

  • Lillack, M., Kästner, C., Bodden, E.: Tracking load-time configuration options. IEEE Trans. Softw. Eng. 44(12), 1269–1291 (2018)

  • Medeiros, F., Kästner, C., Ribeiro, M., Gheyi, R., Apel, S.: A comparison of 10 sampling algorithms for configurable systems. In: Proceedings of the International Conference Software Engineering (ICSE), pp. 643–654. ACM, New York (2016)

  • Meinicke, J., Wong, C.P., Kästner, C., Thüm, T., Saake, G.: On essential configuration complexity: measuring interactions in highly-configurable systems. In: Proceedings of the International Conference Automated Software Engineering (ASE), pp. 483–494. ACM, New York (2016)

  • Montgomery, D.C.: Design and Analysis of Experiments. Wiley, London (2006)

  • Mostafa, S., Wang, X., Xie, T.: Perfranker: prioritization of performance regression tests for collection-intensive software. In: Proceedings of the International Symposium Software Testing and Analysis (ISSTA), pp. 23–34. ACM, New York (2017)

  • Nguyen, T., Koc, T., Cheng, J., Foster, J.S., Porter, A.A.: iGen: dynamic interaction inference for configurable software. In: Proceedings of the International Symposium Foundations of Software Engineering (FSE). IEEE Computer Society, Los Alamitos (2016)

  • Nie, C., Leung, H.: A survey of combinatorial testing. ACM Comput. Surv. (CSUR) 43(2), 11:1–11:29 (2011)

  • Oh, J., Batory, D., Myers, M., Siegmund, N.: Finding near-optimal configurations in product lines by random sampling. In: Proceedings of the European Software Engineering Conference Foundations of Software Engineering (ESEC/FSE), pp. 61–71. ACM, New York (2017)

  • Olaechea, R., Rayside, D., Guo, J., Czarnecki, K.: Comparison of exact and approximate multi-objective optimization for software product lines. In: Proceedings of the International Software Product Line Conference (SPLC), pp. 92–101. ACM, New York (2014)

  • Pauck, F., Bodden, E., Wehrheim, H.: Do Android taint analysis tools keep their promises? In: Proceedings of the International Symposium Foundations of Software Engineering (FSE), pp. 331–341. ACM, New York (2018). https://doi.org/10.1145/3236024.3236029

  • Qiu, L., Wang, Y., Rubin, J.: Analyzing the analyzers: FlowDroid/IccTA, AmanDroid, and DroidSafe. In: Proceedings of the International Symposium Software Testing and Analysis (ISSTA), pp. 176–186. ACM, New York (2018)

  • Rabkin, A., Katz, R.: Static extraction of program configuration options. In: Proceedings of the International Conference Software Engineering (ICSE), pp. 131–140. ACM, New York (2011)

  • Reisner, E., Song, C., Ma, K.K., Foster, J.S., Porter, A.: Using symbolic evaluation to understand behavior in configurable software systems. In: Proceedings of the International Conference Software Engineering (ICSE), pp. 445–454. ACM, New York (2010)

  • Sarkar, A., Guo, J., Siegmund, N., Apel, S., Czarnecki, K.: Cost-efficient sampling for performance prediction of configurable systems. In: Proceedings of the International Conference Automated Software Engineering (ASE), pp. 342–352. IEEE Computer Society, Washington, DC (2015)

  • Saumont, P.-Y.: Lazy computations in Java with a lazy type (2017). https://www.sitepoint.com/lazy-computations-in-java-with-a-lazy-type/. Accessed 24 Jan 2018

  • Serazzi, G., Casale, G., Bertoli, M.: Java modelling tools: an open source suite for queueing network modelling and workload analysis. In: 3rd International Conference on the Quantitative Evaluation of Systems (QEST'06), pp. 119–120 (2006)

  • Siegmund, N., Kolesnikov, S.S., Kästner, C., Apel, S., Batory, D., Rosenmüller, M., Saake, G.: Predicting performance via automated feature-interaction detection. In: Proceedings of the International Conference Software Engineering (ICSE), pp. 167–177. IEEE Press, Piscataway (2012a)

  • Siegmund, N., Rosenmüller, M., Kuhlemann, M., Kästner, C., Apel, S., Saake, G.: SPL conqueror: toward optimization of non-functional properties in software product lines. Softw. Qual. J. 20(3–4), 487–517 (2012b)

  • Siegmund, N., von Rhein, A., Apel, S.: Family-based performance measurement. In: Proceedings of the International Conference Generative Programming and Component Engineering (GPCE), pp. 95–104. ACM, New York (2013)

  • Siegmund, N., Grebhahn, A., Apel, S., Kästner, C.: Performance-influence models for highly configurable systems. In: Proceedings of the European Software Engineering Conference Foundations of Software Engineering (ESEC/FSE), pp. 284–294. ACM, New York (2015)

  • Souto, S., d’Amorim, M.: Time-space efficient regression testing for configurable systems. J. Syst. Softw. 137, 733–746 (2018)

  • Souto, S., d’Amorim, M., Gheyi, R.: Balancing soundness and efficiency for practical testing of configurable systems. In: Proceedings of the International Conference Software Engineering (ICSE), pp. 632–642. IEEE Press, Piscataway (2017)

  • Späth, J., Ali, K., Bodden, E.: Ideal: efficient and precise alias-aware dataflow analysis. Proc. ACM Program. Lang. 1(OOPSLA), 99:1–99:27 (2017). https://doi.org/10.1145/3133923

  • Thüm, T., Apel, S., Kästner, C., Schaefer, I., Saake, G.: A classification and survey of analysis strategies for software product lines. ACM Comput. Surv. (CSUR) 47(1), 6:1–6:45 (2014)

  • Vallée-Rai, R., Co, P., Gagnon, E., Hendren, L., Lam, P., Sundaresan, V.: Soot—a Java bytecode optimization framework. In: Proceedings of the Conference Centre for Advanced Studies on Collaborative Research (CASCON), p. 13. IBM Press, London (1999)

  • Valov, P., Petkovich, J.C., Guo, J., Fischmeister, S., Czarnecki, K.: Transferring performance prediction models across different hardware platforms. In: Proceedings of the International Conference on Performance Engineering (ICPE), pp. 39–50. ACM, New York (2017)

  • Velez, M., Jamshidi, P., Sattler, F., Siegmund, N., Apel, S., Kästner, C.: ConfigCrusher: towards white-box performance analysis for configurable systems. Supplementary material, https://bit.ly/3diKZmK (2019)

  • Wang, B., Passos, L., Xiong, Y., Czarnecki, K., Zhao, H., Zhang, W.: Smartfixer: fixing software configurations based on dynamic priorities. In: Proceedings of the International Software Product Line Conference (SPLC), pp 82–90. ACM, New York (2013). https://doi.org/10.1145/2491627.2491640

  • Wang, Y., Zhang, H., Rountev, A.: On the unsoundness of static analysis for Android GUIs. In: Proceedings of the International Workshop State of the Art in Program Analysis (SOAP), pp. 18–23. ACM, New York (2016)

  • Wang, S., Li, C., Hoffmann, H., Lu, S., Sentosa, W., Kistijantoro, A.I.: Understanding and auto-adjusting performance-sensitive configurations. In: Proceedings of the International Conference Architectural Support for Programming Languages and Operating Systems (ASPLOS), pp. 154–168. ACM, New York (2018)

  • Weisenburger, P., Luthra, M., Koldehofe, B., Salvaneschi, G.: Quality-aware runtime adaptation in complex event processing. In: Proceedings of the International Symposium Software Engineering for Adaptive and Self-managing Systems (SEAMS), pp. 140–151. IEEE Computer Society, Los Alamitos (2017)

  • Xu, T., Zhang, J., Huang, P., Zheng, J., Sheng, T., Yuan, D., Zhou, Y., Pasupathy, S.: Do not blame users for misconfigurations. In: Proceedings of the Symposium Operating Systems Principles, pp. 244–259. ACM, New York (2013)

  • Xu, T., Jin, L., Fan, X., Zhou, Y., Pasupathy, S., Talwadker, R.: Hey, you have given me too many knobs!: Understanding and dealing with over-designed configuration in system software. In: Proceedings of the European Software Engineering Conference Foundations of Software Engineering (ESEC/FSE), pp. 307–319. ACM, New York (2015)

  • Yang, J., Hance, T., Austin, T.H., Solar-Lezama, A., Flanagan, C., Chong, S.: Precise, dynamic information flow for database-backed applications. In: Proceedings of the Conference Programming Language Design and Implementation (PLDI), pp. 631–647. ACM, New York (2016)

  • Yu, T., Pradel, M.: Pinpointing and repairing performance bottlenecks in concurrent programs. Empir. Softw. Eng. 23(5), 3034–3071 (2018)

  • Zhang, Q., Su, Z.: Context-sensitive data-dependence analysis via linear conjunctive language reachability. In: Proceedings of the Symposium Principles of Programming Languages (POPL), pp. 344–358. ACM, New York (2017). https://doi.org/10.1145/3009837.3009848

  • Zhu, Y., Liu, J., Guo, M., Bao, Y., Ma, W., Liu, Z., Song, K., Yang, Y.: Bestconfig: tapping the performance potential of systems via automatic configuration tuning. In: Proceedings of the Symposium Cloud Computing (SoCC), pp. 338–350. ACM, New York (2017)

Acknowledgements

This work has been supported in part by the NSF (Awards 1318808, 1552944, 1717022), AFRL, DARPA (FA8750-16-2-0042), and the German Research Foundation (AP 206/7-2, AP 206/11-1, SI 2171/2, SI 2171/3-1). We thank Chu-Pan Wong and Jens Meinicke for their comments during the development of this work. We thank the FOSD 2017 and 2018 meeting participants for their feedback on the central idea of this work. We thank Steven Arzt for his help with FlowDroid.

Author information

Corresponding author

Correspondence to Miguel Velez.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Velez, M., Jamshidi, P., Sattler, F. et al. ConfigCrusher: towards white-box performance analysis for configurable systems. Autom Softw Eng 27, 265–300 (2020). https://doi.org/10.1007/s10515-020-00273-8
