Five reasons test automation projects fail


Test automation, defined here as software written for the express purpose of helping test other software more efficiently, has an alarmingly high failure rate. Often this is due to unrealistic expectations, a lack of understanding of automation, or too little focus on value.

Test automation is an investment, and like any other investment, it’s important to understand the pros and cons to make sure you are making the best investment possible for your project and your company. This is not a one-size-fits-all solution, and the answer will differ depending on your goals, the skill set of your staff, your deliverables, and buy-in from all key stakeholders. With so much to consider, it’s important to know going into your automation project why these projects often fail and what to look out for.

 


1. Forgetting automation software is still software

Far too often, people approach automation engagements thinking they don’t need to follow the same process as ‘real’ software development; this is patently false. Test automation software has many of the same needs as the product it is being developed for. Sufficient planning must go into your automation solution. What tech stack are you going to use? What platforms are you going to support? Who is going to write the scripts? Who is going to maintain them long term? Are the subject matter experts on the project willing to write automation scripts as well? You must answer all of these questions, and potentially more, to ensure a successful automation effort.

If you have a group of testers who are struggling to keep up with the current workload, it may be difficult to introduce a Cucumber/Gherkin-based solution where they are expected to create and maintain a host of feature files. Conversely, if you have a highly technical team, introducing a record-and-playback tool might hold the team back from delivering the value they otherwise could. Further, there needs to be an architect who understands the entire stack: creating test cases, triggering test execution, setting up test environments, running the tests against real and simulated hardware, and communicating the results back. This person’s job is to make sure all of the pieces work together in harmony and set the team up for success.

There are ‘codeless’ solutions on the market, with varying degrees of success. Unfortunately, there is no silver-bullet magic recorder that can build out your test suite with the click of a button and never require maintenance again. Until that day comes, you or your team will likely be writing code somewhere. This code needs to be treated just like any other code: it needs peer reviews, a versioning solution (and likely a way for multiple people to work in parallel), and tests of its own. Skipping some of these steps might get you up and running faster, but they are crucial for long-term maintainability. Maintenance is just as important, if not more so, than the scripts themselves. Finally, testing your automation code gives you the confidence that you are covering what you think you are covering, and tells you when changes in your application break your scripts.
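For instance, even a small shared helper deserves its own unit test. The sketch below uses a hypothetical `build_login_payload` helper (not from any particular framework) to show automation code being tested like any other code:

```python
# Hypothetical helper shared by many login scripts: it normalizes the
# credentials a script sends. Because dozens of scripts depend on it,
# it gets its own unit test like any other production code would.
def build_login_payload(username: str, password: str) -> dict:
    """Trim whitespace and reject empty credentials before a script uses them."""
    username, password = username.strip(), password.strip()
    if not username or not password:
        raise ValueError("username and password must be non-empty")
    return {"username": username, "password": password}

# A unit test for the helper itself -- the "test your test code" step.
def test_build_login_payload():
    assert build_login_payload(" alice ", "s3cret") == {
        "username": "alice", "password": "s3cret"}
    try:
        build_login_payload("", "s3cret")
        assert False, "expected ValueError for empty username"
    except ValueError:
        pass
```

Run under pytest or plain Python, a failure here flags a broken building block before the dozens of scripts built on top of it start failing mysteriously.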

2. Automating ‘just because’

One of the biggest mistakes I see is that once people get excited about automation, they want to automate everything. Often the first question I get after people are sold on the idea is “Great, now when can we stop manual testing?” or “How long will it take to automate everything 100%?” This is not how you should think about test automation. Instead, imagine a different scenario: we have to ensure that we maintain the utmost quality in our application, and we have a multitude of tools to help us toward that goal. Some of these happen to be manual tools; others are more automation-focused.

Some of the problems we encounter will be better suited to a manual approach, e.g., look and feel, user experience, etc. Others will be better suited to an automated approach, e.g., logging in 100 times, setting up 50 user accounts, booking with five different credit cards. Use the right tool for the job, with a focus on maximizing value. Constantly reevaluate what you are testing and why you are testing it that way. Don’t waste time maintaining old tests that have lost their value. Don’t automate something that takes only five minutes to check and is done only once a quarter.
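To make the contrast concrete, here is a minimal sketch of the “set up 50 user accounts” case, using a hypothetical in-memory `FakeAccountService` stand-in; a real suite would call your actual API:

```python
# Hypothetical in-memory stand-in for a real account-creation API.
class FakeAccountService:
    def __init__(self):
        self.accounts = {}

    def create(self, username: str) -> bool:
        """Create an account; reject duplicates, as a real service would."""
        if username in self.accounts:
            return False
        self.accounts[username] = {"active": True}
        return True

def setup_accounts(service, count: int) -> int:
    """Create `count` uniquely named accounts and return how many succeeded."""
    return sum(service.create(f"user{i:03d}") for i in range(count))

service = FakeAccountService()
created = setup_accounts(service, 50)
print(created)  # a chore nobody should repeat by hand -> 50
```

Fifty repetitive, identical creations is exactly the kind of work a script does better than a person; judging whether the resulting screens *feel* right is not.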
 

3. No organizational support

If you haven’t learned it from personal experience or from this article yet: automation is not easy, but that doesn’t mean it isn’t useful or powerful. However, it is neither of those things without proper support from the organization as a whole. Key stakeholders need a say in automation prioritization to ensure high-value items are tackled first. Subject matter experts can help point out risks or upcoming roadblocks in automated tests. More manual-focused QAs need to understand what is being automated to avoid duplication of effort.

The organization as a whole must be bought in as well. Management needs to understand automation is an investment, and to set expectations/projections accordingly.

Without this understanding, teams get frustrated with perceived slow progress, deadlines get missed, and ideas get oversold. What would otherwise be recognized as an increase in quality, efficiency, and speed can be misconstrued as over budget and delayed by a less-informed party. Finally, a willingness to invest in tools, training, best practices, and the right people is critical to a successful ongoing automation effort.

4. Expecting automation to fix everything

Automation can be a wonderful tool for allowing a team to release faster, find defects earlier, and streamline the process. Automation will not improve your car’s gas mileage, or help you finally pull the trigger on that week-long trip to the all-inclusive resort you’ve been eyeing. In all seriousness, automation is a tool like anything else. If your team has a well-established process and runs like a well-oiled machine, automation will likely be a powerful addition to your arsenal. If, instead, the project is struggling to keep up with quality concerns and you are releasing twice as fast as you have capacity for, automation might not solve everything.

Can it help? Sure. At the same time, everyone needs a shared understanding that other issues are likely causing problems as well. Automation might be one part of a multi-part solution that also includes better training for the team and fewer features per release. Think of it as you would your car. If it’s in great shape already, putting in a more efficient engine may get you better gas mileage and a quicker 0 to 60. On the other hand, if the bumper is falling off and the brake pads are long overdue for replacement, a new engine will just get you to the highway where the car breaks down that much faster.

5. Overlooking maintenance

One of the most important things that is constantly overlooked is test maintenance. Maintenance can often be 30% or more of the entire budget (in longer projects it can exceed the original development cost)! You’ve spent all this time, effort, and energy to get organizational buy-in, build your tests, and show them to the team. Now all you want to do is sit back and watch the bugs roll in! A few days in, however, one thing changes and breaks your test. After fixing that, there’s a refactor. Then you notice a timing issue that didn’t exist before. Oh, and there’s that new feature that just got added! On and on it goes. As much as we don’t want to think about it, maintenance needs to be part of every stage of our planning.

After investing so much, it’s a shame when value isn’t realized because scripts are left unattended while the team chases the next shiny cost-cutting, speed-improving, wow-inducing feature. Fight for the time and resources to keep your scripts up to date, and keep maintainability in mind. Use best practices when creating your scripts and, when it makes sense, spend the extra time to make them adaptable. This doesn’t mean scripts need to be maintained indefinitely! Part of maintenance is evaluating whether a test still delivers more value than it costs to maintain. When that ceases to be the case, stop supporting it. It can be hard to move on from something you put blood, sweat, and tears into, but sometimes it’s the only way to move the product forward and gain more coverage.
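One common way to keep scripts adaptable is the page object pattern: scripts talk to one class per screen, so a UI change means editing one selector in one place rather than fifty scripts. The sketch below uses a hypothetical `FakeDriver` stub in place of a real browser driver such as Selenium’s WebDriver:

```python
# Hypothetical stub standing in for a real browser driver; it maps
# CSS-style selectors to element text so the sketch is runnable.
class FakeDriver:
    def __init__(self, elements):
        self.elements = elements  # selector -> element text

    def find(self, selector: str) -> str:
        return self.elements[selector]

class LoginPage:
    # The only place selectors live: when the UI changes, edit here once.
    USERNAME_FIELD = "#username"
    SUBMIT_BUTTON = "#submit"

    def __init__(self, driver):
        self.driver = driver

    def submit_label(self) -> str:
        return self.driver.find(self.SUBMIT_BUTTON)

driver = FakeDriver({"#username": "Username", "#submit": "Log in"})
page = LoginPage(driver)
print(page.submit_label())  # -> Log in
```

Every script that logs in goes through `LoginPage`, so a renamed button breaks one constant instead of every test that touches the login screen.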

Building test automation software is not something that should be taken lightly, but given the demand for speed to market and other organizational goals, many organizations dive in headfirst without fully understanding what automation really looks like. By understanding the common failures of automation efforts, we can avoid repeating the same mistakes ourselves.

Ford Arnett is the lead automation engineer at Bottle Rocket, a digital experience consultancy that provides business strategy, product, design and technology services.
