Building Your Test Automation Checklist


Your testing team should decide which tests to automate as early as possible. This information can be used to develop more accurate testing deadlines and better organize your test suite, and, even more importantly, it can help determine which test automation solution(s) your team adopts. Certain types of tests are ripe for automation, such as performance tests, while others, such as single-use “one-off” tests, are not. There is of course a gray area in between where pros and cons need to be weighed, such as tests that cover dynamically generated content or a semi-frequently changing user interface.

Before meeting with your team to wrangle over these key decisions, skim the Test Automation Checklist below:


RIPE FOR AUTOMATION

  • Older, repetitive tests that have crystal clear pass/fails for verification and/or validation. A good example would be a test case that tries various username and password combinations for a web page login (the first sketch after this list shows what this can look like).
  • Data-driven tests and/or any test that requires lengthy, in-depth preparation. Things like configuring and cleaning up large volumes of test data, or submitting lengthy forms with various combinations of input data. Let your test automation solution take the brunt of the repetitive configuration management workload.
  • Sanity and smoke tests, especially any typical paths that your users take through the system under test. For example, an e-commerce website should make doubly sure that users have no issues perusing its top three best-selling product sections, as this is both a critical and common path for buyers (see the second sketch below).
  • Time-consuming tests that involve lengthy and repetitive user interactions. Not only is it expensive to tie up a human tester with long, monotonous tests like these, but over time the manual tester becomes glassy-eyed and bored from mindlessly clicking, and in turn the overall quality of testing drops.
  • Performance tests, as determining speed, reliability, scalability, and resource usage is exactly what automated tests are designed to tackle. The process of establishing system benchmarks demands the consistency and precision afforded by automated testing (a rough timing sketch closes out the examples below).
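
To make the first two items concrete, here is a minimal, hypothetical sketch of a data-driven login test written with pytest and requests. The login URL, form field names, expected status codes, and credential table are placeholders for illustration only, not references to any specific product or API.

```python
# A minimal pytest sketch of a data-driven login check.
# The URL, field names, and expected status codes are hypothetical
# placeholders -- adapt them to your own system under test.
import pytest
import requests

LOGIN_URL = "https://example.com/login"  # hypothetical endpoint

# Each tuple: (username, password, should_succeed)
CREDENTIALS = [
    ("valid_user", "correct_pass", True),
    ("valid_user", "wrong_pass", False),
    ("", "correct_pass", False),
    ("valid_user", "", False),
]

@pytest.mark.parametrize("username,password,should_succeed", CREDENTIALS)
def test_login_combinations(username, password, should_succeed):
    """Submit a username/password pair and assert a crystal clear pass/fail."""
    response = requests.post(
        LOGIN_URL,
        data={"username": username, "password": password},
        timeout=10,
    )
    if should_succeed:
        assert response.status_code == 200
    else:
        assert response.status_code in (401, 403)
```

Adding another credential combination is a one-line change to the table, which is exactly the kind of repetitive expansion that is cheap for automation and tedious for a manual tester.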
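A smoke test along a critical user path can be scripted with a browser automation library. The sketch below uses Selenium WebDriver; the storefront URL, link text, and CSS selector are assumptions standing in for your own application's best-seller flow.

```python
# A hypothetical smoke test of a critical e-commerce path using Selenium.
# The URL, link text, and ".product-card" selector are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

def run_best_sellers_smoke_test():
    """Open the storefront, navigate to the best-sellers section,
    and confirm the product listing renders."""
    driver = webdriver.Chrome()  # assumes a Chrome driver is available
    try:
        driver.get("https://example-store.com")  # hypothetical storefront
        driver.find_element(By.LINK_TEXT, "Best Sellers").click()
        products = driver.find_elements(By.CSS_SELECTOR, ".product-card")
        assert len(products) >= 3, "expected the top sellers to be listed"
    finally:
        driver.quit()

if __name__ == "__main__":
    run_best_sellers_smoke_test()
    print("Best-sellers smoke test passed")
```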

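Finally, a very rough response-time benchmark can be built from nothing more than requests and the standard library. The target URL, sample count, and 500 ms threshold below are arbitrary assumptions; a dedicated load-testing tool would add concurrency and richer reporting.

```python
# A rough response-time benchmark sketch using requests and the stdlib.
# The URL, sample size, and threshold are assumptions for illustration.
import statistics
import time
import requests

TARGET_URL = "https://example.com/api/health"  # hypothetical endpoint
SAMPLES = 50

def measure_response_times(url, samples):
    """Issue repeated GET requests and record each round-trip time in seconds."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        requests.get(url, timeout=10)
        timings.append(time.perf_counter() - start)
    return timings

if __name__ == "__main__":
    times = measure_response_times(TARGET_URL, SAMPLES)
    print(f"median: {statistics.median(times) * 1000:.1f} ms")
    print(f"95th percentile: {statistics.quantiles(times, n=20)[-1] * 1000:.1f} ms")
    assert statistics.median(times) < 0.5, "median response time exceeded 500 ms"
```

Running the same measurement loop on every build gives you the consistent, repeatable benchmarks that manual timing simply can't provide.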

NOT IDEAL FOR AUTOMATION

  • Tests that require instant feedback beyond the depth and scale of the reports your test automation solution provides.
  • Test cases that cover brand-new, rarely-used, and/or currently unstable system features.
  • Outlier and corner test cases.
  • Exploratory tests, though any repetitive configuration management they involve, such as data preparation or teardown, can still be automated.
  • Tests that will change often, as the savings reaped from automating are more than negated by overly frequent maintenance.
  • Tests with subjective validation that rely on a “look and feel” check rather than a black-and-white pass/fail.
  • Tests that require verification and/or validation of something physical (shipping label, 3D printed model, etc.) that can’t be digitally verified.

The bottom line: automated testing works seamlessly with a wide variety of tests, regression testing among them. Other tests, such as single-use “one-off” tests, simply don't yield enough benefit to justify automating. We hope this Test Automation Checklist helps break down which ones are worth automating and which ones aren't.

Are you interested in learning more about implementing test automation in your warehouse system? Check out our success stories, our blogs, or learn more about the Cycle platform.

This post was written by:
James Prior
Technical Pre-Sales Consultant
James has been working in software pre-sales and implementation since 2000, and has more recently settled into focusing on technical pre-sales.
