
How Do We Automate Testing?


Test automation will demand time and attention, but when done the right way, it is worth the investment. Don't overdo automation; instead, Karishma Kolli suggested focusing on the needs and requirements. Having clean and easy-to-read code is very important to keep your test suite maintainable.

Karishma Kolli, a software development engineer, spoke about mythbusters on test automation at the Fall Online Testing Conference 2017. InfoQ is covering this conference with Q&As, summaries, and articles.

InfoQ interviewed Kolli about testing tools and skills, test automation, and noteworthy developments in automated testing.

InfoQ: What is your advice for selecting suitable testing tools, and which criteria should we use?

Karishma Kolli: The test tool selection process mainly depends on your business/testing needs. For example, performance is a huge deal for some applications, but it’s not much of a priority for others. Same with load, UI, etc.

It depends on your targeted audience and business impact.

The availability of skill sets, budget, and time can also play a secondary role in determining your tools. In the software world we always have timelines and deadlines, so with a tight timeline we may choose something simpler, or even a commercial tool.

InfoQ: Different types of testing require different tools. How can we manage a variety of tools and testing skill sets?

Kolli: It’s very simple. When we understand and acknowledge the need for different testing tools, managing the tools won’t be a problem. We manage various development tools and skill sets, and managing a variety of testing tools and skill sets is no different.

In the article Thoughts on Test Automation in Agile, Rajneesh Namta explained why we should carefully decide what to automate:

Do not automate for the sake of automation. Due consideration should be given to concerns like maintainability and execution time before adding new tests. Each test the team adds to the automated test suite becomes part of the production code base and therefore must be maintained just like the rest of the code base - for the entire life of the application. Adding tests that are overly complex or difficult to maintain end up slowing down the feedback cycle to the team and should be avoided.

InfoQ: How much time should teams spend on test automation? How can you know if you’re putting in enough time, and not too much?

Kolli: Initially, test automation will need more time, but once your framework is ready, you won't have to invest a lot of time in it. Typically, if a team has one or two manual QAs, then one SDET should be capable of handling the automation needs. Of course, this varies based on how much testing is handled with automation and the type of application.

If you are automating everything on UI, then you are spending more time than needed on automation.

Dave Farley suggested in Automated Acceptance Testing Supports Continuous Delivery that we shouldn’t use UI Record-and-playback Systems:

By their nature, UI Record-and-playback systems start from assuming that your UI is the focus of the test, rather than the behaviour of your system that the user wants. This is actually a technically-focused kind of test, rather than a behaviourally-focused kind of test. As a result, these tests are always more fragile, more prone to break in the face of relatively small changes to the system-under-test. So I avoid them - too much work in the long run.

InfoQ: What’s your view on using UI Record-and-playback Systems in testing?

Kolli: I personally am not a fan of record and playback. It's the same as "one size fits all", and in reality it's not a good fit for most of us. Record-and-playback won't address your tailored needs; these are the flakiest tests, and as a result you won't be able to establish confidence in your automated test suite.
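The fragility Kolli and Farley describe can be illustrated with a toy sketch (hypothetical names; no real recording tool is implied). A recorder typically captures an element by its position in the page structure, while a hand-written test targets a stable, intent-revealing locator such as an id — so a harmless layout change breaks the recorded version but not the other:

```python
# Toy model of a page as a flat list of elements (a stand-in for a DOM).
page_v1 = [{"id": "save-button", "tag": "button"}]

# After a redesign, a cookie banner is inserted ahead of the button:
page_v2 = [{"id": "cookie-banner", "tag": "div"},
           {"id": "save-button", "tag": "button"}]

def find_by_index(page, i):
    """What a recorder tends to capture: a structural position."""
    return page[i]

def find_by_id(page, element_id):
    """What a hand-written test would use: a stable identifier."""
    return next(e for e in page if e["id"] == element_id)

# The recorded locator worked on v1 but silently targets the wrong
# element on v2 — the test breaks even though the feature still works:
assert find_by_index(page_v1, 0)["id"] == "save-button"
assert find_by_index(page_v2, 0)["id"] != "save-button"

# The id-based locator survives the layout change unchanged:
assert find_by_id(page_v2, "save-button")["tag"] == "button"
print("locator comparison done")
```

The same reasoning is why testing guides for Selenium and similar drivers recommend stable locators (ids, accessibility attributes) over positional XPaths.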

InfoQ: What are your main learnings on test automation?

Kolli: Over the years I realized that it is easy to get carried away and overdo automation.

It is important to focus on needs and requirements.

For example, if our requirement is to test client-critical areas, then we don't have to invest our time in features that only 2% of users use.

If our need is to support 10,000 users at a time, then we don’t have to test for 11,000.

Maintaining the automation suite may not sound like much effort, but as your code base grows it becomes more and more complex to update and maintain, so having clean and easy-to-read code is very important.
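One common way to keep a growing suite clean and readable is the Page Object pattern. The sketch below is a minimal, self-contained illustration (the driver is a stub with made-up method names; a real suite would wrap Selenium or a similar driver): all knowledge of a screen's locators lives in one class, so a UI change means updating that class rather than every test.

```python
class FakeDriver:
    """Stand-in for a browser driver; records the actions a test performs."""
    def __init__(self):
        self.actions = []

    def type_into(self, locator, text):
        self.actions.append(("type", locator, text))

    def click(self, locator):
        self.actions.append(("click", locator))


class LoginPage:
    """Page object: locators and interactions for the login screen
    are centralized here, keeping the tests themselves readable."""
    USERNAME = "#username"
    PASSWORD = "#password"
    SUBMIT = "#login-button"

    def __init__(self, driver):
        self.driver = driver

    def log_in(self, user, password):
        self.driver.type_into(self.USERNAME, user)
        self.driver.type_into(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)


def test_login_flow():
    driver = FakeDriver()
    # The test reads as intent ("log in"), not as a list of selectors.
    LoginPage(driver).log_in("alice", "s3cret")
    assert ("click", "#login-button") in driver.actions


test_login_flow()
print("login flow test passed")
```

If the login button's selector changes, only `LoginPage.SUBMIT` needs an update — the tests stay untouched, which is exactly the maintainability Kolli is arguing for.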

We often tend to ignore small details such as reports and configuration, but if we take proper care of them, they turn into valuable assets.

InfoQ: Which noteworthy developments do you see in automated testing? What do you expect will happen in the near future?

Kolli: Every day, more and more companies are acknowledging the importance of automated testing. Test automation has evolved from being just record and playback on your local computer to running an entire custom-made test suite in the cloud.

With the development of artificial intelligence, test automation will become much more reliable and effective. Familiar examples are Siri and Google; they learn and evolve, recognize differences in voices, etc. AI can be used the same way in software development to identify code patterns, common error areas, etc.

With artificial intelligence, test automation can become an integral part of development.
