Title: Declarative Performance Testing Automation: Automating Performance Testing for the DevOps Era
Year of Publication: 2021
Academic Department: Software Institute, Faculty of Informatics
Recent industry trends show increasing adoption of Development and Operations (DevOps) practices, driven by a focus on cross-functional teams and the ability to release high-quality software at a fast pace. Alongside DevOps adoption, performance testing continues to evolve to meet the growing demands of the modern enterprise and its need for automation. As DevOps adoption spreads and self-service environment provisioning becomes commonplace in Information Technology (IT) departments, more developers will be executing performance tests to ensure that the quality of released services satisfies users’ expectations while constraining the resources needed to do so.
Modeling and automating the execution of performance tests are time-consuming and difficult activities, requiring expert knowledge, complex infrastructure, and a rigorous process to guarantee the quality of the collected performance data and the obtained results. Currently available performance testing approaches are not well integrated with DevOps practices and tools, and often focus only on specific needs of performance test modeling and automation.
A recent survey on DevOps by the Standard Performance Evaluation Corporation (SPEC) Research Group (RG) reported the need for a new paradigm, such as the one proposed by Declarative Performance Engineering (DPE), for performance activities to be successfully integrated with DevOps practices and tools. Previous studies reported successful applications of DPE in DevOps contexts, because DPE models performance testing domain knowledge as a first-class citizen and offers different levels of abstraction to the different stakeholders relying on it.
In this dissertation, we introduce a "Declarative Approach for Performance Tests Execution Automation" enabling the continuous and automated execution of performance tests alongside the Continuous Software Development Lifecycle (CSDL), an integral part of DevOps practices. We contribute: an automation-oriented catalog of performance test types and goals, with a description of how they fit into different moments of the CSDL; a declarative Domain Specific Language (DSL) enabling the specification of performance tests and their automated orchestration processes alongside the CSDL; and a framework, relying on the contributed DSL, for end-to-end automated performance testing of RESTful Web services and Business Process Model and Notation 2.0 (BPMN 2.0) Workflow Management Systems (WfMSs).

We evaluate the proposed DSL by conducting an expert review targeting its overall expressiveness, its suitability for the target users, its perceived usability and effort, and the reusability of specified tests. We also perform a summative evaluation of the DSL’s usability in terms of learnability and reusability of test specifications. The surveys confirm that the proposed approach is valid for the aims it was built for, and it is rated on average good for all the evaluated usability dimensions. We evaluate the implemented framework through iterative reviews of its different versions and a comparative evaluation of its features against state-of-the-art solutions. The iterative reviews led to many improvements thanks to the constructive feedback received, while the comparative evaluation showed that no similar solutions are available in the literature. We assess the overall contributed solution by executing a large number of case studies and by collaborating with other researchers in extending both the DSL and the framework.
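To illustrate the declarative style described above, the following Python sketch shows what a performance test specification might look like when the test is described by its type, workload, and goals rather than by imperative load-driver scripts. All field names (`test_type`, `virtual_users`, `goals`, and so on) are hypothetical assumptions for illustration, not the actual DSL contributed in the dissertation.

```python
# Hypothetical sketch of a declarative performance test specification.
# An execution framework would translate such a spec into concrete
# load-driver configuration and orchestrate the run; here we only
# model the spec itself plus a minimal validity check.
from dataclasses import dataclass, field


@dataclass
class PerformanceTest:
    """A declarative performance test: what to run and verify, not how."""
    name: str
    test_type: str            # assumed vocabulary: "load", "stress", "spike", "soak"
    target: str               # endpoint or workflow under test
    virtual_users: int        # number of concurrent simulated clients
    duration_s: int           # steady-state duration in seconds
    goals: dict = field(default_factory=dict)  # pass/fail criteria

    def validate(self) -> list:
        """Return a list of specification errors (empty if the spec is valid)."""
        errors = []
        if self.test_type not in {"load", "stress", "spike", "soak"}:
            errors.append(f"unknown test_type: {self.test_type}")
        if self.virtual_users <= 0:
            errors.append("virtual_users must be positive")
        if self.duration_s <= 0:
            errors.append("duration_s must be positive")
        return errors


# Example: a load test with declarative latency and error-rate goals.
spec = PerformanceTest(
    name="checkout-load",
    test_type="load",
    target="https://example.org/api/checkout",
    virtual_users=50,
    duration_s=300,
    goals={"p95_latency_ms": 250, "error_rate_max": 0.01},
)
print(spec.validate())  # → []
```

The design choice mirrors the dissertation's premise: the specification captures domain knowledge (test type, workload, goals) as first-class data, so different stakeholders can read or reuse it at their own level of abstraction while the orchestration machinery stays hidden.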