
Saturday, October 6, 2018

Automating Tests for a Complicated Feature

Two weeks ago, I introduced a hypothetical software feature called the Superball Sorter, which would sort out different colors and sizes of Superballs among four children.  I discussed how to create a test plan for the feature, and then last week I followed up with a post about how to organize a test plan using a simple spreadsheet.  This week I'll be describing how to automate tests for the feature.

In case you don't have time to view the post where I introduced the feature, here's how it works:

  • Superballs can be sorted among four children: Amy, Bob, Carol, and Doug
  • The balls come in two sizes: large and small
  • The balls come in six colors: red, orange, yellow, green, blue, and purple
  • The children can be assigned one or more rules for sorting: for example, Amy could have a rule that says she only accepts large balls, or Bob could have a rule that says he only accepts red or orange balls
  • Distribution of the balls begins with Amy and proceeds through the other children in alphabetical order, continuing as if one were dealing a deck of cards
  • Each time a new ball is sorted, distribution continues with the next child in the list
  • The rules used must result in all the balls being sortable; if they do not, an error will be returned
  • Your friendly developer has created a ball distribution engine that will create balls of various sizes and colors for you to use in testing
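
The rules above can be sketched in code. Here's a minimal Python model (the method names later in this post read like Java, but Python keeps the sketch short). Note that two details are my assumptions, not part of the spec: a child with multiple rules accepts a ball that matches at least one of them (which fits the "red or orange" example), and after a child takes a ball, dealing resumes with the next child in the list.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass(frozen=True)
class Ball:
    size: str   # "large" or "small"
    color: str  # "red", "orange", "yellow", "green", "blue", or "purple"

@dataclass(frozen=True)
class Rule:
    size: Optional[str] = None   # either field may be null, but not both
    color: Optional[str] = None

    def matches(self, ball: Ball) -> bool:
        return ((self.size is None or ball.size == self.size) and
                (self.color is None or ball.color == self.color))

@dataclass
class Child:
    name: str
    rules: List[Rule] = field(default_factory=list)
    balls: List[Ball] = field(default_factory=list)

    def accepts(self, ball: Ball) -> bool:
        # no rules means the child takes anything; otherwise the ball
        # must match at least one rule (assumed semantics)
        return not self.rules or any(r.matches(ball) for r in self.rules)

def distribute(balls: List[Ball], children: List[Child]) -> None:
    """Deal each ball to the next child, in order, who will accept it."""
    turn = 0
    for ball in balls:
        for offset in range(len(children)):
            child = children[(turn + offset) % len(children)]
            if child.accepts(ball):
                child.balls.append(ball)
                turn = (turn + offset + 1) % len(children)
                break
        else:
            raise ValueError(f"no child can accept {ball}; rules are invalid")
```

With no rules at all, `distribute` reduces to plain card-dealing, which is what the first test class below relies on.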


When automating tests, we want to keep our tests as simple as possible.  This means sharing code whenever we can.  In examining the manual test plan, we can see that there are three types of tests here:
  • A test where none of the children have any rules
  • Tests where all the children have rules, but the rules result in some balls not being sortable
  • Tests where one or more children have rules, and all of the balls are sortable

We can create a separate test class for each of these types, which will make it easy for us to share code inside each class.  We'll also have methods that will be shared among the classes:

child.deleteBalls(): clears all of a child's distributed balls in preparation for the next test

child.deleteRules(): clears all of a child's existing rules in preparation for the next test

distributeBalls(numberOfBalls): randomly generates the specified number of balls in various sizes and colors and distributes them one at a time to the children, according to the rules

verifyEvenDistribution(numberOfBalls): for scenarios where none of the children have rules; takes the number of balls distributed and verifies that each child has one-fourth of them

child.addRule(Size size, Color color): sets a rule for a child; each child can have more than one rule, and either the size or the color (but not both) can be null

child.verifyRulesRespected(): for the specified child, iterates through each ball and each rule and verifies that every ball respects every rule

child.addFourthRuleAndVerifyError(Size size, Color color): used in the InvalidRules test class, where it is always the fourth child's rules that trigger the error, because it's only once the fourth child's rules are added that the Sorter can tell some balls won't be sortable. This method asserts that an error is returned.
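
As a rough sketch of the two verification helpers, here each ball and each rule is represented as a (size, color) tuple, where None means "no constraint"; these tuples are my stand-in, not the real engine's types. The "at least one rule must match" semantics is also an assumption, chosen because it fits the "only red or orange balls" example above.

```python
def verify_even_distribution(children_balls, number_of_balls):
    """Assert each child holds exactly one-fourth of the distributed balls."""
    expected = number_of_balls // 4
    for name, balls in children_balls.items():
        assert len(balls) == expected, (
            f"{name} has {len(balls)} balls, expected {expected}")

def verify_rules_respected(balls, rules):
    """Assert every ball a child holds matches at least one of the rules."""
    for size, color in balls:
        assert any((rule_size is None or size == rule_size) and
                   (rule_color is None or color == rule_color)
                   for rule_size, rule_color in rules), (
            f"ball ({size}, {color}) violates rules {rules}")
```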

Each test class should have setup and cleanup steps to avoid repetitive code.  However, I would call child.deleteBalls() and child.deleteRules() for each child in both the setup and the cleanup steps, just in case an error in a test causes the cleanup step to be skipped.

For the DistributionWithNoRules test class, each test would consist of:

distributeBalls(numberOfBalls);
verifyEvenDistribution(numberOfBalls);

All that would vary in each test would be the number of balls passed in.
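
In Python's unittest, that pattern could look like the sketch below. The stand-in engine at the top is mine, included only so the test shape runs; the real distributeBalls and verifyEvenDistribution belong to the Sorter itself.

```python
import unittest

# Minimal stand-in engine so the test class below is runnable.
children = {name: [] for name in ["Amy", "Bob", "Carol", "Doug"]}

def delete_balls():
    for balls in children.values():
        balls.clear()

def distribute_balls(number_of_balls):
    names = list(children)
    for i in range(number_of_balls):   # no rules: simple round-robin dealing
        children[names[i % len(names)]].append(i)

def verify_even_distribution(number_of_balls):
    for name, balls in children.items():
        assert len(balls) == number_of_balls // 4, name

class DistributionWithNoRulesTest(unittest.TestCase):
    # clear state both before and after each test, so a test that errors
    # out can't leave balls behind for the next one
    def setUp(self):
        delete_balls()

    def tearDown(self):
        delete_balls()

    def test_forty_balls(self):
        distribute_balls(40)
        verify_even_distribution(40)

    def test_four_hundred_balls(self):
        distribute_balls(400)
        verify_even_distribution(400)
```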

For the DistributionWithRules test class, each test would consist of:

child.addRule(Size size, Color color); --repeated as many times as needed for each child
distributeBalls(numberOfBalls);
child.verifyRulesRespected(); --repeated for each child that has one or more rules

Finally, for the InvalidRules test class, each test would consist of:

child.addRule(Size size, Color color); --repeated as many times as needed for the first three children
child.addFourthRuleAndVerifyError(Size size, Color color); --verifying that the fourth rule triggers an error
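
The key assertion in this class is that the error actually comes back. Here's a sketch; UnsortableError and distribute_with_rules are my inventions standing in for whatever error and entry point the real Sorter has.

```python
class UnsortableError(Exception):
    """Stand-in for the error the real Sorter returns for invalid rules."""

def distribute_with_rules(rules_by_child, balls):
    # Minimal stand-in: fail as soon as some ball matches no child's rules.
    # A child with no rules accepts anything; otherwise at least one of the
    # child's (size, color) rules must match (assumed semantics).
    for size, color in balls:
        if not any(
            not rules or any((rs in (None, size)) and (rc in (None, color))
                             for rs, rc in rules)
            for rules in rules_by_child.values()
        ):
            raise UnsortableError(f"no child accepts ({size}, {color})")
```

In a real test class, this check would be wrapped in unittest's assertRaises (or pytest.raises) rather than a bare try/except.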

The nice thing about organizing the tests this way is that it makes them easy to vary.  For example, you could run the DistributionWithNoRules test class with 40, 400, and 800 balls; or you could set up a random number generator that produces any multiple of four for each test.
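
That random variant can be tiny; the helper name and range here are mine:

```python
import random

def random_ball_count(low=4, high=800):
    """Return a random multiple of four between low and high, inclusive."""
    return 4 * random.randint(low // 4, high // 4)
```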

You could also set up your DistributionWithRules and InvalidRules test classes to take in rule settings from a separate table, varying the table occasionally for greater test coverage.
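
One way to sketch that table-driven setup is below. The table layout (a child's name mapped to a list of (size, color) pairs, with None meaning unconstrained) is an assumption, not the real format, and the Child class is a tiny stand-in.

```python
class Child:
    """Tiny stand-in with just enough surface for rule loading."""
    def __init__(self, name):
        self.name = name
        self.rules = []

    def add_rule(self, size, color):
        self.rules.append((size, color))

# Hypothetical rule table: one dict per test run; children left out
# of a dict get no rules for that run.
RULE_SETS = [
    {"Amy": [("large", None)], "Bob": [(None, "red"), (None, "orange")]},
    {"Carol": [("small", "blue")]},
]

def apply_rule_set(children, rule_set):
    """Clear each child's old rules, then load this run's rules."""
    for child in children:
        child.rules.clear()
        for size, color in rule_set.get(child.name, []):
            child.add_rule(size, color)
```

Swapping or extending RULE_SETS then varies the coverage without touching the test code itself.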

Astute readers may have noticed that there are a few holes in my test plan:

  • How would I assert even distribution in a scenario where no children have rules and the number of balls is not evenly divisible by four?
  • How can I show that a child who doesn't have a rule still gets the correct number of balls when one or more other children have rules?  For example, if Amy has a rule that she only gets red balls, and Bob, Carol, and Doug have no rules, how can I prove that Bob, Carol, and Doug get an even distribution of balls?
  • Will the Superball Sorter work with more or fewer than four children? How would I adjust my test plan for this?
I'll leave it to you to think about handling these things, and perhaps I will tackle them in future posts.

I've had a lot of fun over the last few weeks creating the idea of the Superball Sorter and thinking of ways to test it!  I've started to actually write code for this, and someday when it's finished I will share it with you.
