Tutorials

1. Model-based performance self-adaptation (MPS)

- Emilio Incerto and Mirco Tribastone (IMT School for Advanced Studies Lucca, Italy)

- Monday, April 8, Morning Session

Abstract: This tutorial presents techniques for self-adaptation of software systems that use performance models to achieve desired quality-of-service objectives. The main hindrances of state-of-the-art methods are the need to assume a steady-state regime in order to apply analytical solutions, and the state-space explosion that occurs when modeling software systems with stochastic processes such as Markov chains. These issues make online use difficult, because the system under consideration may be in a transient regime and the typically high cost of the analysis prevents fast tracking of performance dynamics. We will introduce fluid models based on nonlinear ordinary differential equations (ODEs) as a key enabling technique for effectively approximating large-scale stochastic processes. This representation makes it possible to employ online optimization techniques based on model-predictive control (MPC) to find an assignment of the model's tunable parameters that steers the system toward the desired objective. We will also show how, dually, the same techniques can be used for the online estimation of software service demands. The tutorial focuses on software performance models based on queuing networks, with applications to runtime auto-scaling in cloud scenarios, presenting recent research by the authors in a unified context.
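
The following is a minimal, illustrative sketch of the fluid-ODE-plus-MPC idea described in the abstract, not the authors' actual tooling: a two-station closed queuing network (a multi-server queue plus a "think" station) is approximated by ODEs, and one MPC step picks the number of servers over a short prediction horizon. All rates, the horizon, and the cost function are assumptions chosen for readability.

```python
# Sketch only: fluid (ODE) approximation of a closed queuing network and a
# single model-predictive-control step choosing a tunable parameter (the
# number of servers). Model, rates, and cost are illustrative assumptions.

def fluid_derivatives(x_queue, x_think, servers, mu=1.0, delta=0.2):
    """Fluid ODEs for a two-station closed network.
    x_queue: jobs at the multi-server station; x_think: jobs in think state.
    min(x_queue, servers) is the effective number of busy servers."""
    completions = mu * min(x_queue, servers)   # service completions per time unit
    arrivals = delta * x_think                 # jobs returning from the think station
    return arrivals - completions, completions - arrivals

def simulate(x_queue, x_think, servers, horizon=5.0, dt=0.01):
    """Integrate the ODEs with explicit Euler; return final state and average queue length."""
    steps = int(horizon / dt)
    total_queue = 0.0
    for _ in range(steps):
        dq, dthink = fluid_derivatives(x_queue, x_think, servers)
        x_queue += dq * dt
        x_think += dthink * dt
        total_queue += x_queue * dt
    return x_queue, x_think, total_queue / horizon

def mpc_choose_servers(x_queue, x_think, target_queue=10.0, max_servers=20,
                       server_cost=0.5):
    """One MPC step: pick the server count minimizing a cost that trades off
    deviation from a target queue length against the cost of running servers."""
    best_servers, best_cost = 1, float("inf")
    for s in range(1, max_servers + 1):
        _, _, avg_queue = simulate(x_queue, x_think, s)
        cost = (avg_queue - target_queue) ** 2 + server_cost * s
        if cost < best_cost:
            best_servers, best_cost = s, cost
    return best_servers

if __name__ == "__main__":
    # 100 jobs in total, all initially thinking; decide how many servers to run.
    print("suggested servers:", mpc_choose_servers(x_queue=0.0, x_think=100.0))
```

In a runtime auto-scaling loop, such a step would be repeated periodically from the currently observed state, which is what distinguishes the transient, MPC-based view from a one-shot steady-state analysis.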

 

2. Performance Benchmarking of Infrastructure-as-a-Service (IaaS) Clouds with Cloud WorkBench (PB-IaaS)

- Joel Scheuner and Philipp Leitner (Chalmers | University of Gothenburg, Sweden)

- Monday, April 8, Afternoon Session

Abstract: The continuing growth of the cloud computing market has led to an unprecedented diversity of cloud services with different performance characteristics. To support service selection, researchers and practitioners conduct cloud performance benchmarking by measuring and objectively comparing the performance of different providers and configurations (e.g., instance types in different data center regions). In this tutorial, we demonstrate how to write performance tests for IaaS clouds using the web-based benchmarking tool Cloud WorkBench (CWB). We will motivate and introduce benchmarking of IaaS clouds in general, demonstrate the execution of a simple benchmark in a public cloud environment, summarize the CWB tool architecture, and interactively develop and deploy a more advanced benchmark together with the participants.
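
CWB defines and schedules benchmarks through its own web interface and provisioning tooling, so the snippet below is not CWB code. It is only a stand-alone, hypothetical sketch of the kind of measurement such an IaaS benchmark collects: a fixed workload run repeatedly so that the mean and variability of the results can be compared across instance types or regions.

```python
# Illustrative sketch (not CWB's actual benchmark format): repeatedly time a
# fixed CPU-bound workload and report summary statistics, mimicking the
# repeated trials used to capture cloud performance variability.

import statistics
import time

def cpu_workload(n=200_000):
    """A fixed CPU-bound task: sum of integer square roots (illustrative only)."""
    return sum(int(i ** 0.5) for i in range(n))

def run_benchmark(iterations=10):
    """Run the workload several times and collect per-iteration wall-clock times."""
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        cpu_workload()
        samples.append(time.perf_counter() - start)
    return samples

if __name__ == "__main__":
    times = run_benchmark()
    print(f"mean: {statistics.mean(times):.4f}s  "
          f"stdev: {statistics.stdev(times):.4f}s  "
          f"min: {min(times):.4f}s")
```

Running the same script on two different instance types and comparing the reported statistics is, in miniature, the comparison workflow that the tutorial automates with CWB across providers, regions, and configurations.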