Model Driven Benchmark Generation for Web Services

Liming Zhu, Ian Gorton, Yan Liu
Empirical Software Engineering, National ICT Australia
School of Computer Science and Engineering, University of New South Wales
{Liming.Zhu, Ian.Gorton, Jenny.Liu}@nicta.com.au

Ngoc Bao Bui
Faculty of Information Technology, University of Technology, Sydney, Australia
NgocBao.Bui@student.uts.edu.au

ABSTRACT
Web services solutions are being increasingly adopted in enterprise systems. However, ensuring the quality of service of Web services applications remains a costly and complicated performance engineering task. New challenges include limited control over the consumers of a service, unforeseeable operational scenarios and vastly different XML payloads. These challenges make existing manual performance analysis and benchmarking methods difficult to use effectively. This paper describes an approach for generating customized benchmark suites for Web services applications from a software architecture description, following a Model Driven Architecture (MDA) approach. We have provided a performance-tailored version of the UML 2.0 Testing Profile so that architects can model a flexible and reusable load testing architecture, including test data, in a standards-compatible way. We extended our MDABench [27] tool to provide a Web service performance testing "cartridge" associated with the tailored testing profile. A load testing suite and automatic performance measurement infrastructure are generated using the new cartridge. Best practices in Web service testing are embodied in the cartridge and inherited by the generated code. This greatly reduces the effort needed for Web service performance benchmarking while remaining fully MDA-compatible. We illustrate the approach with a case study on the Apache Axis platform.
Categories and Subject Descriptors
D.2.10 [Software Engineering]: Design; D.2.11 [Software Engineering]: Software Architectures; D.2.2 [Software Engineering]: Design Tools and Techniques

General Terms
Design, Theory

Keywords
MDA; Model-Driven Development; Performance; Testing; Code Generation; Web Service; Service-Oriented Architecture

1. Introduction
Web services technologies have proven useful in the construction of enterprise-scale systems. However, many challenges remain, especially ensuring that Web services solutions can meet specified performance requirements [2]. Various performance analysis models with prediction capabilities exist to evaluate architecture designs during the early phases of the application development cycle [4] [8], and applying them to Web services has shown promising results [16]. Utilizing these models requires that two distinct activities be carried out by the application architect. The first is the development of specific analytical models based on the application design. The second is obtaining parameter values for a performance model through measurement or simulation. Both activities require significant additional effort and specific expertise in performance engineering methods. Hence, they are key inhibitors that have prevented performance engineering techniques from achieving widespread adoption in practice [4]. With the growing interest in Model Driven Architecture (MDA) [19] technologies, attempts have been made to integrate performance analysis with MDA and UML, aiming to reduce the performance modeling effort required. Recent work has attempted to transform UML design models into method-specific performance analysis models [21]. Parameter values in these models also depend greatly on the underlying Web service framework used to implement the application. One method to obtain and tune these parameters is to run a benchmark application on the framework.
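To make the parameter-fitting step concrete, the following is a minimal sketch of how a benchmark-measured mean service time can feed a performance model. It uses a simple M/M/1 queueing formula as an illustrative stand-in for the method-specific analysis models cited above; the class name and the numbers are hypothetical, not taken from the paper.

```java
// Hedged sketch: parameterizing a simple M/M/1 queueing model with a
// service time measured by a benchmark run. Illustrative only.
public class QueueingEstimator {

    /** Predicted mean response time R = S / (1 - U), where U = lambda * S. */
    public static double responseTime(double serviceTimeSec, double arrivalRatePerSec) {
        double utilization = arrivalRatePerSec * serviceTimeSec;
        if (utilization >= 1.0) {
            throw new IllegalArgumentException("server saturated (U >= 1)");
        }
        return serviceTimeSec / (1.0 - utilization);
    }

    public static void main(String[] args) {
        // Suppose the benchmark measured S = 20 ms per call, and the
        // expected production load is 25 requests/sec (U = 0.5).
        double r = responseTime(0.020, 25.0);
        System.out.printf("Predicted mean response time: %.1f ms%n", r * 1000); // 40.0 ms
    }
}
```

The point is that the measurement side (obtaining S on a given framework) is exactly what a generated benchmark suite automates, while the analytical side stays cheap once the parameters are in hand.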
This approach has proven useful with component-based technologies [11, 12] [17] and Web services [16]. Running benchmark applications can also help in predicting and diagnosing performance problems, including identifying bottlenecks, preliminary profiling and exploring core application characteristics. An effective benchmark suite includes a core benchmark application, a load testing suite and performance monitoring utilities.

There are existing industry benchmark standards and suites applicable to Web services (e.g. TPC-W v2 [9]), but these are not broadly suitable for performance modeling and prediction for a number of reasons. First, they are mainly designed for vendors to showcase and improve their products, rather than reflecting a specific application's performance characteristics. The application logic in these benchmarks is

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
IW-SOSE'06, May 27–28, 2006, Shanghai, China.
Copyright 2006 ACM 1-59593-085-X/06/0005...$5.00.
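The load testing component of such a benchmark suite can be sketched as a small concurrent driver that invokes the service under test and records per-call latencies. The sketch below is an assumption about what generated load-driver code might look like, not actual MDABench output; the `LoadDriver` name and the no-op stub standing in for an Axis SOAP invocation are both hypothetical.

```java
import java.util.*;
import java.util.concurrent.*;

// Minimal load-test driver sketch: N client threads each invoke the
// service a fixed number of times and record nanosecond latencies.
// Names are illustrative, not generated MDABench code.
public class LoadDriver {

    public static List<Long> run(int clients, int iterations, Callable<Void> serviceCall)
            throws InterruptedException {
        List<Long> latencies = Collections.synchronizedList(new ArrayList<>());
        ExecutorService pool = Executors.newFixedThreadPool(clients);
        for (int c = 0; c < clients; c++) {
            pool.submit(() -> {
                for (int i = 0; i < iterations; i++) {
                    long start = System.nanoTime();
                    try {
                        serviceCall.call(); // in practice: a SOAP call via the Axis client stub
                    } catch (Exception e) {
                        // a real driver would count failures separately
                    }
                    latencies.add(System.nanoTime() - start);
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        return latencies;
    }

    public static void main(String[] args) throws Exception {
        // No-op stub in place of a remote Web service invocation.
        List<Long> samples = run(4, 50, () -> null);
        long totalNs = samples.stream().mapToLong(Long::longValue).sum();
        System.out.printf("%d calls, mean latency %.3f us%n",
                samples.size(), totalNs / 1000.0 / samples.size());
    }
}
```

A generated suite would layer onto this skeleton the pieces the paper enumerates: realistic test data, configurable workload mixes, and the monitoring utilities that collect the measurements.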