Model-driven generative framework for automated OMG DDS performance testing in the cloud

Abstract
The Object Management Group's (OMG) Data Distribution Service (DDS) provides many configurable policies that determine the end-to-end quality of service (QoS) of applications. Because diverse combinations of QoS configurations influence application QoS in different ways, it is challenging to predict a system's performance in terms of latencies, throughput, and resource usage. Design-time formal methods have been applied to this problem with mixed success, but insufficient prediction accuracy, limited tool support, and the difficulty of the formalisms have prevented their wider adoption. A promising alternative is to emulate system behavior and gather data on the QoS parameters of interest through experimentation. To realize this approach, we have developed a model-based automatic performance testing framework with generative capabilities that reduces the manual effort of generating the large number of relevant QoS configurations to be deployed and tested on a cloud platform. This paper describes our initial efforts in developing and using this technology.
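To illustrate why the abstract calls the configuration space hard to explore manually, the sketch below enumerates combinations of a few standard DDS QoS policy kinds (RELIABILITY, DURABILITY, HISTORY, and OWNERSHIP, as defined in the OMG DDS specification). This is not the paper's actual framework, only a minimal illustration of how the number of candidate configurations grows multiplicatively with each policy considered; the subset of policies and the enumeration strategy are assumptions made here for illustration.

```python
from itertools import product

# Kinds for a few standard DDS QoS policies (per the OMG DDS spec).
# A real DDS profile has many more policies and numeric parameters;
# this subset is only illustrative.
qos_policies = {
    "RELIABILITY": ["BEST_EFFORT", "RELIABLE"],
    "DURABILITY": ["VOLATILE", "TRANSIENT_LOCAL", "TRANSIENT", "PERSISTENT"],
    "HISTORY": ["KEEP_LAST", "KEEP_ALL"],
    "OWNERSHIP": ["SHARED", "EXCLUSIVE"],
}

def generate_configurations(policies):
    """Yield every combination of policy kinds as a dict."""
    names = list(policies)
    for values in product(*(policies[n] for n in names)):
        yield dict(zip(names, values))

configs = list(generate_configurations(qos_policies))
print(len(configs))  # 2 * 4 * 2 * 2 = 32 combinations from just four policies
```

Even four policies with a handful of discrete kinds yield 32 configurations; adding numeric parameters such as deadline periods or history depths multiplies the space further, which motivates generating and pruning configurations from a model rather than writing them by hand.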