Thursday 20 September 2012

Which Comes First - Performance Testing Or Test Automation?


At the very outset of this post, let me say that I am not a test automation specialist at all. I have done test automation for a couple of products, but performance testing is much more my forte. I have, however, been on projects where both these exercises were performed on a particular release. My aim here is to lay out a road map for how to go about doing both on the same project.

First up, some views on test automation. My opinion is that it is not testing at all - it's development. The tests that get automated are the functional tests; test automation simply performs the same checks through an automated process. In that sense, test automation does not tell us anything about the system that functional testing does not. Sure, it tells us the same things much faster and with greater efficiency, which frees up precious time for other activities - but nothing more. Unlike performance testing, which is highly recommended and can cost management dearly if skipped, test automation is not critical to the health of the system. In my experience, it has been somewhat rare for projects to do both performance testing and test automation - usually it's one or the other (performance testing tends to win out) - but the discussion here assumes both are performed.

In the first release, there is no regression library of automated tests to run, so test automation needs to proceed hand in hand with functional testing. Test cases that are candidates for automation need to be identified very early on. As functional testing progresses, regression testing of already-tested functionality should be done through the test automation scripts. This helps performance testing because the system is likely to be in a stable state for scripting, given that functional testing has reduced the functional errors to a certain degree. Furthermore, the scripts for some automated test cases can be exported to the performance testing tool (this is possible at least in HP's suite) - not every automated test case will be tested for performance, so one has to choose. By the time the second release comes around, there is a library of automated regression test cases that should be run as often as needed and expanded to cover the new release's functionality. Performance testing stands to benefit in the same way as before.
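To make the idea concrete, here is a minimal sketch of what such an early automation candidate might look like - a login check for a hypothetical app at http://myapp.example, written with Python's pytest and requests libraries. The endpoint, credentials and mark name are all assumptions for illustration, not taken from any real project:

```python
# A minimal sketch, assuming a hypothetical app at http://myapp.example
# with a /login endpoint; names and credentials are illustrative only.
import pytest
import requests

BASE_URL = "http://myapp.example"  # hypothetical system under test

@pytest.mark.regression  # tagged early as an automation candidate
def test_login_succeeds_with_valid_credentials():
    resp = requests.post(BASE_URL + "/login",
                         data={"user": "alice", "password": "secret"})
    assert resp.status_code == 200

@pytest.mark.regression
def test_login_fails_with_a_bad_password():
    resp = requests.post(BASE_URL + "/login",
                         data={"user": "alice", "password": "wrong"})
    assert resp.status_code == 401
```

Running `pytest -m regression` after each functional-testing cycle would then exercise exactly the tagged suite (with the `regression` mark registered in pytest.ini), and that suite is what grows release by release into the regression library.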

One pitfall to avoid when following this road map is the temptation to reduce the scope of performance testing (or to drop it entirely) because the test automation execution reports say that all is well with the system. This can happen when pressure to release the product on schedule increases. It must be remembered that no matter how well functional testing went (in terms of removing defects), and no matter how quickly regression testing was completed thanks to test automation, performance testing is about the non-functional attributes of the system. Its aims are entirely different. Functional testing and test automation results tell us what works in the system for a single user - they say nothing about how well it would work under load and/or when system resources are stressed.
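The difference is easy to see in code. Below is a minimal sketch, again against the hypothetical http://myapp.example app, showing how the very same request that an automated test fires once can be fired by many concurrent simulated users to observe response times under load. The endpoint, user count and request count are illustrative assumptions; a real performance test would of course use a proper tool such as LoadRunner or JMeter:

```python
# A minimal sketch against the same hypothetical http://myapp.example app:
# one timed request (the automation view) versus many concurrent simulated
# users (the performance view). All names and numbers are illustrative.
import time
from concurrent.futures import ThreadPoolExecutor

import requests

BASE_URL = "http://myapp.example"  # hypothetical system under test

def timed_request():
    start = time.time()
    resp = requests.get(BASE_URL + "/search", params={"q": "widgets"})
    return time.time() - start, resp.status_code

# Single user: roughly what a functional or automated test observes.
elapsed, status = timed_request()
print("1 user: %.2fs (HTTP %d)" % (elapsed, status))

# 50 concurrent simulated users issuing 200 requests in total.
with ThreadPoolExecutor(max_workers=50) as pool:
    results = list(pool.map(lambda _: timed_request(), range(200)))

times = sorted(t for t, _ in results)
print("under load: median %.2fs, 90th percentile %.2fs"
      % (times[len(times) // 2], times[int(len(times) * 0.9)]))
```

The single-user timing is all that an automation run implicitly reports; the percentile figures under concurrency are what performance testing is after, and no amount of green automation results will reveal them.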

A lot of other issues, such as the choice of test cases to automate and which of those to reuse for performance testing, are out of scope for this discussion and will be left for another day. In general, test automation and performance testing offer value in different ways - it is up to the test manager and the performance test manager to figure out how to get the most out of each, as well as how one can benefit from the other.