Agile is one of the most widely used software development methodologies. It relies on user stories, small semi-structured specifications that capture requirements from a user's point of view. Despite their popularity, little research has been done on automating the quality analysis of a user story before it is assigned to a sprint. In this study, we chose two criteria from the INVEST checklist, Testable and Valuable, and applied supervised machine learning classifiers to classify user stories against them automatically. Since the industrial data collected for this research was imbalanced, we also applied data balancing techniques such as SMOTE, random undersampling (RUS), random oversampling (ROS), and back translation (BT) to check whether they improved any classification metrics. Although we did not observe significant improvements in accuracy or precision after applying the balancing techniques, we saw a significant improvement in recall across all classifiers. Our research provides promising insights into how this work could be used in the software industry to automate the analysis of user stories and improve the quality of the software produced.
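To make the pipeline concrete, the following is a minimal sketch, not the study's actual implementation: it pairs a TF-IDF text representation with a supervised classifier and applies one of the balancing techniques mentioned above, random oversampling (ROS), before training. The toy user stories and their Testable labels are hypothetical stand-ins for the industrial data, which is not available here.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score

# Hypothetical toy data standing in for labeled user stories
# (1 = meets the "Testable" criterion, 0 = does not).
stories = [
    "As a user, I want to log in so that I can see my dashboard",
    "As an admin, I want to export reports so that I can audit usage",
    "Improve the system",
    "As a tester, I want acceptance criteria so that I can verify the feature",
    "Make it better",
    "As a user, I want to reset my password so that I can regain access",
]
labels = np.array([1, 1, 0, 1, 0, 1])

def random_oversample(X, y, seed=0):
    """Random oversampling (ROS): duplicate minority-class samples
    at random until every class has as many samples as the majority."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    idx = []
    for c in classes:
        c_idx = np.flatnonzero(y == c)
        extra = rng.choice(c_idx, size=target - len(c_idx), replace=True)
        idx.extend(c_idx)
        idx.extend(extra)
    idx = np.array(idx)
    return [X[i] for i in idx], y[idx]

# Balance the training data, then fit a simple classifier on TF-IDF features.
X_bal, y_bal = random_oversample(stories, labels)
vec = TfidfVectorizer()
clf = LogisticRegression().fit(vec.fit_transform(X_bal), y_bal)

# Recall is the metric the balancing techniques improved in this study.
preds = clf.predict(vec.transform(stories))
print("recall:", recall_score(labels, preds))
```

In practice SMOTE, RUS, and BT can be substituted at the `random_oversample` step (the `imbalanced-learn` library provides SMOTE, RUS, and ROS off the shelf); ROS is shown here only because it is the simplest to implement from scratch.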