TY - GEN
T1 - Testing Cloud Applications under Cloud-Uncertainty Performance Effects
AU - Wang, Wei
AU - Tian, Ningjing
AU - Huang, Sunzhou
AU - He, Sen
AU - Srivastava, Abhijeet
AU - Soffa, Mary Lou
AU - Pollock, Lori
N1 - Funding Information:
Acknowledgments This work was supported by the National Science Foundation under grants CCF-1617390 and CCF-1618310. The views and conclusions contained herein are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of the NSF. The authors would like to thank the anonymous reviewers for their insightful comments. We would also like to thank Jinay Jani and Xin Nie for their valuable input.
Publisher Copyright:
© 2018 IEEE.
PY - 2018/5/25
Y1 - 2018/5/25
N2 - The paradigm shift of deploying applications to the cloud has introduced both opportunities and challenges. Although clouds use elasticity to scale resource usage at runtime to help meet an application's performance requirements, developers are still challenged by unpredictable performance, little control over the execution environment, and differences among cloud service providers, all while being charged for their cloud usage. Application performance stability is particularly affected by multi-tenancy, in which hardware is shared among varying applications and virtual machines. Developers porting their applications need to meet performance requirements, but testing on the cloud under the effects of performance uncertainty is difficult and expensive due to high cloud usage costs. This paper presents a first approach to testing how an application's performance with typical inputs will be affected by performance uncertainty, without incurring the undue costs of brute-force testing in the cloud. We specify cloud uncertainty testing criteria, design a test-based strategy to characterize the black-box cloud's performance distributions using these criteria, and support the execution of tests to characterize the resource usage and cloud baseline performance of the application to be deployed. Importantly, we developed a smart test oracle that estimates the application's performance at given confidence levels using these characterization test results and determines whether the application will meet its performance requirements. We evaluated our testing approach on both the Chameleon cloud and Amazon Web Services; the results indicate that this testing strategy shows promise as a cost-effective approach to testing for the performance effects of cloud uncertainty when porting an application to the cloud.
AB - The paradigm shift of deploying applications to the cloud has introduced both opportunities and challenges. Although clouds use elasticity to scale resource usage at runtime to help meet an application's performance requirements, developers are still challenged by unpredictable performance, little control over the execution environment, and differences among cloud service providers, all while being charged for their cloud usage. Application performance stability is particularly affected by multi-tenancy, in which hardware is shared among varying applications and virtual machines. Developers porting their applications need to meet performance requirements, but testing on the cloud under the effects of performance uncertainty is difficult and expensive due to high cloud usage costs. This paper presents a first approach to testing how an application's performance with typical inputs will be affected by performance uncertainty, without incurring the undue costs of brute-force testing in the cloud. We specify cloud uncertainty testing criteria, design a test-based strategy to characterize the black-box cloud's performance distributions using these criteria, and support the execution of tests to characterize the resource usage and cloud baseline performance of the application to be deployed. Importantly, we developed a smart test oracle that estimates the application's performance at given confidence levels using these characterization test results and determines whether the application will meet its performance requirements. We evaluated our testing approach on both the Chameleon cloud and Amazon Web Services; the results indicate that this testing strategy shows promise as a cost-effective approach to testing for the performance effects of cloud uncertainty when porting an application to the cloud.
KW - Bootstrapping
KW - Cloud Applications
KW - Cloud Performance Uncertainty
KW - Software Testing on Cloud
UR - http://www.scopus.com/inward/record.url?scp=85048429659&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85048429659&partnerID=8YFLogxK
U2 - 10.1109/ICST.2018.00018
DO - 10.1109/ICST.2018.00018
M3 - Conference contribution
AN - SCOPUS:85048429659
T3 - Proceedings - 2018 IEEE 11th International Conference on Software Testing, Verification and Validation, ICST 2018
SP - 81
EP - 92
BT - Proceedings - 2018 IEEE 11th International Conference on Software Testing, Verification and Validation, ICST 2018
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 11th IEEE International Conference on Software Testing, Verification and Validation, ICST 2018
Y2 - 9 April 2018 through 13 April 2018
ER -