Guiding Cloud Users for Cost and Performance through Testing and Recommendation

Public Infrastructure as a Service (IaaS) cloud computing is growing rapidly, with businesses, institutions, and individuals moving their workloads to clouds such as Amazon Web Services, Microsoft Azure, and Google Compute Engine. The main uses today include development and testing environments, high-performance computing and batch processing, and public websites and web-based applications. Cloud users benefit from a low cost of ownership, a pay-as-you-go pricing model in which they pay only for the resources they procure, and the ability to scale resource usage up and down during execution, i.e., elasticity. Although IaaS clouds provide great customizability and elasticity, it is ultimately the cloud user's responsibility to choose proper virtual machine (VM) instance configurations and auto-scaling policies to ensure their requirements are met. In making these decisions, users are challenged by unpredictable performance, complex elasticity metrics, lack of control over physical resources, and differences among cloud service providers.

This project addresses the need to support IaaS cloud users in meeting their cost-performance requirements as they port their applications to the cloud in a cost-effective way. In particular, the research is embodied in an envisioned testing and recommendation system that will determine proper VM instance configurations and auto-scaling policies. The focus is on testing toward meeting performance and cost requirements for applications to be ported to the IaaS cloud. Thus, the research investigates how each component of a testing framework must be customized for cloud applications, including the input, test cases, test case generator, test case executor, and test oracle. By taking a test-based approach to helping cloud users, the research provides solutions that use only user-accessible information to satisfy user requirements, addresses the limits of static analysis techniques that rely on performance predictability, and can be applied to a wide range of application types and cloud services.
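To make the division of responsibilities among the framework components concrete, the sketch below shows one plausible shape for a generator, executor, and oracle. All names and the measurement stub are illustrative assumptions, not the project's actual design; a real executor would deploy the application to cloud VMs and measure it there.

```python
from dataclasses import dataclass

# Hypothetical sketch of the framework's components; names are
# illustrative, not taken from the actual system.

@dataclass
class TestCase:
    instance_type: str   # candidate VM instance configuration
    workload_level: int  # input workload intensity to replay

def generate_test_cases(instance_types, workload_levels):
    """Test case generator: enumerate (configuration, workload) pairs."""
    return [TestCase(t, w) for t in instance_types for w in workload_levels]

def execute_test_case(case, run_workload):
    """Test case executor: run the workload on the given configuration.

    run_workload stands in for deploying and measuring on the cloud;
    it returns (latency in ms, cost in USD) for one test run.
    """
    latency_ms, cost_usd = run_workload(case.instance_type, case.workload_level)
    return {"case": case, "latency_ms": latency_ms, "cost_usd": cost_usd}

def oracle(result, max_latency_ms, max_cost_usd):
    """Test oracle: did the run meet the user's cost-performance requirements?"""
    return (result["latency_ms"] <= max_latency_ms
            and result["cost_usd"] <= max_cost_usd)
```

Under this sketch, the user-supplied requirements live entirely in the oracle, so the same generator and executor can serve different cost-performance targets.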

This project will contribute to the state of the art by tackling several challenges in bringing automated support to cloud users, including: (1) design and development of a testing framework customized to applications in the cloud, (2) various instantiations of the framework to test different configurations of a cloud application, possibly provided by the cloud user, (3) algorithms for recommending instance configurations that meet performance and cost requirements, (4) techniques to minimize the test case search space, and thus the number of test cases generated and executed, and (5) implementation and evaluation of the testing and recommendation system, including metrics.
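A minimal sketch of the recommendation step in (3), under the assumption that test results are already available: keep the tested configurations that satisfy the performance requirement and recommend the cheapest. This is only an illustration of the idea, not the project's actual algorithm, and the field names are invented for the example.

```python
# Illustrative recommendation sketch: each result records a tested
# configuration with its measured latency and hourly cost (names assumed).

def recommend(results, max_latency_ms):
    """Return the cheapest tested configuration meeting the latency bound."""
    feasible = [r for r in results if r["latency_ms"] <= max_latency_ms]
    if not feasible:
        return None  # no tested configuration meets the requirement
    return min(feasible, key=lambda r: r["hourly_cost_usd"])["config"]
```

The search-space minimization in (4) would matter here because each entry in `results` costs real money to measure; fewer executed test cases means a cheaper path to the same recommendation.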

As the number of cloud users continues to grow, the tedious, complex, and costly task of porting applications to meet cost-performance requirements will be significantly eased by automated support. Such a testing and recommendation system enables non-experts to port their applications in a cost-effective way.

This project is funded by National Science Foundation Grant NSF 1618310.
The PI, Lori Pollock, is collaborating with Mary Lou Soffa at the University of Virginia and Wei Wang at the University of Texas at San Antonio, who are also funded by NSF through a collaborative grant.