Testing Cloud Applications under Cloud-Uncertainty Performance Effects

Wei Wang, Ningjing Tian, Sunzhou Huang, Sen He, Abhijeet Srivastava, Mary Lou Soffa, Lori Pollock

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

22 Scopus citations

Abstract

The paradigm shift of deploying applications to the cloud has introduced both opportunities and challenges. Although clouds use elasticity to scale resource usage at runtime to help meet an application's performance requirements, developers are still challenged by unpredictable performance, little control over the execution environment, and differences among cloud service providers, all while being charged for their cloud usage. Application performance stability is particularly affected by multi-tenancy, in which the hardware is shared among varying applications and virtual machines. Developers porting their applications need to meet performance requirements, but testing on the cloud under the effects of performance uncertainty is difficult and expensive due to high cloud usage costs. This paper presents a first approach to testing how an application's performance with typical inputs will be affected by performance uncertainty, without incurring the undue costs of brute-force testing in the cloud. We specify cloud uncertainty testing criteria, design a test-based strategy to characterize the black-box cloud's performance distributions using these testing criteria, and support the execution of tests that characterize the resource usage and cloud baseline performance of the application to be deployed. Importantly, we developed a smart test oracle that estimates the application's performance at given confidence levels using these characterization test results and determines whether the application will meet its performance requirements. We evaluated our testing approach on both the Chameleon cloud and Amazon Web Services; the results indicate that this testing strategy shows promise as a cost-effective approach to testing for the performance effects of cloud uncertainty when porting an application to the cloud.
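
The abstract describes an oracle that estimates performance at a given confidence level from characterization measurements, and the keywords name bootstrapping. As a rough illustration of that general technique only (not the authors' actual oracle; the measurements, requirement threshold, and function name below are hypothetical), a percentile bootstrap over measured response times might look like:

```python
import random
import statistics

def bootstrap_ci(samples, stat=statistics.mean, n_resamples=2000,
                 confidence=0.95, rng=random.Random(0)):
    """Percentile-bootstrap confidence interval for a statistic of `samples`."""
    # Resample with replacement, compute the statistic each time, and sort.
    estimates = sorted(
        stat(rng.choices(samples, k=len(samples)))
        for _ in range(n_resamples)
    )
    lo = int(((1 - confidence) / 2) * n_resamples)
    hi = int((1 - (1 - confidence) / 2) * n_resamples) - 1
    return estimates[lo], estimates[hi]

# Hypothetical response-time measurements (seconds) from repeated cloud runs.
measurements = [1.02, 0.97, 1.35, 1.10, 0.99, 1.48, 1.05, 1.21, 0.95, 1.12]

low, high = bootstrap_ci(measurements, confidence=0.95)
requirement = 1.5  # hypothetical performance requirement (seconds)
print(f"95% CI for mean response time: [{low:.3f}, {high:.3f}]")
print("meets requirement" if high <= requirement else "may violate requirement")
```

An oracle in this style would declare the requirement met only when the upper end of the interval stays below the threshold, which is how a confidence level turns noisy multi-tenant measurements into a pass/fail verdict.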

Original language: English (US)
Title of host publication: Proceedings - 2018 IEEE 11th International Conference on Software Testing, Verification and Validation, ICST 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 81-92
Number of pages: 12
ISBN (Electronic): 9781538650127
DOIs
State: Published - May 25 2018
Externally published: Yes
Event: 11th IEEE International Conference on Software Testing, Verification and Validation, ICST 2018 - Vasteras, Sweden
Duration: Apr 9 2018 - Apr 13 2018

Publication series

Name: Proceedings - 2018 IEEE 11th International Conference on Software Testing, Verification and Validation, ICST 2018

Conference

Conference: 11th IEEE International Conference on Software Testing, Verification and Validation, ICST 2018
Country/Territory: Sweden
City: Vasteras
Period: 4/9/18 - 4/13/18

Keywords

  • Bootstrapping
  • Cloud Applications
  • Cloud Performance Uncertainty
  • Software Testing on Cloud

ASJC Scopus subject areas

  • Software
  • Safety, Risk, Reliability and Quality
