On ending the regression automation fixation

Note: in my observation, scripted test execution and the type of regression test scripts I’m referring to are slowly going away, but a lot of organizations I work with still use them. Not every organization is full of testers working in a context-driven and exploratory way while applying CI/CD and releasing multiple times per day. If you’re working in one, that’s fine. This blog post probably is not for you. But please keep in mind that there are still many organizations that apply a more traditional, script-based approach to testing.

In the last couple of months, I’ve been talking regularly about some of the mistakes I’ve made (repeatedly!) during my career so far. My talk at the Romanian Testing Conference, for example, kicked off with me confessing that in retrospect, a lot of the work I’ve done until all too recently has been, well, inefficient at best, and plain worthless in other cases. Only slowly am I learning what automation is really about, and how to apply it in a more useful and effective manner than the ‘just throw more tools at it’ approach I’ve been advocating for far too long.

Today, I’d like to show you another example of things that, in hindsight, I should have been doing better for longer.

One of my stock answers to the question ‘Where should we start when we’re starting with automation?’ used to be ‘automate your existing regression tests first’. This makes sense, right? Regression tests are often performed at the end of a delivery cycle, to check that existing functionality has not been negatively impacted by the new features added to the product. These tests are often tedious – new stuff is exciting to test, while existing features are so last Tuesday – and they often take a long time to perform, while time is exactly what’s in short supply at the end of a delivery cycle. So, automating away those regression tests is a good thing. Right?

Well, maybe. But maybe not so much.

To be honest, I don’t think ‘start with automating your regression tests’ is a very good answer anymore, if it ever was (again, hindsight is 20/20…). It can be a decent answer in some situations, but I can think of a lot of situations where it isn’t. Why not? Well, for two reasons.

Regression scripts are too long
The typical regression test scripts I’ve seen are looong. As in, dozens of steps with various checkpoints along the way. That’s all well and good if a human is performing them, but when they are turned into an automated script verbatim, things tend to fall apart easily.

For example, humans are very good at finding a workaround if the application under test behaves slightly differently than is described in the script. So, say you have a 50-step regression script (which is not uncommon), and at step 10 the application does something similar to what is expected, but not precisely the same. In this case, a tester can easily make a note, find a possible way around and move on to collect information regarding the remaining steps.

Automation, on the other hand, simply says ‘f*ck you’ and exits with a failure or exception, leaving you with no feedback at all about the behaviour to be verified in steps 11 through 50.

So, to make automation more efficient by reducing the risk of early failure, the regression scripts need to be rewritten and shortened, most of the time by breaking them up into smaller, independently executable sections. This takes time and eats away at the intended increase in speed expected from the introduction of automation. And on top of that, it may also frustrate people unfamiliar with testing and automation, because instead of 100 scripts, you now have to automate 300. Or 400. And that sounds like more work!
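To make that more concrete, here’s a minimal sketch of what such a breakup might look like, in Java with JUnit 5. The WebShop class is a made-up, in-memory stand-in for whatever driver you’d actually use against your application; the point is that each check arranges its own starting state, so a failure in one section no longer blocks feedback from the others.

```java
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;

import static org.junit.jupiter.api.Assertions.*;

import java.util.ArrayList;
import java.util.List;

// Hypothetical example: a long 'order a product' regression script,
// broken up into small checks that can fail independently.
class OrderRegressionTest {

    // Made-up, in-memory stand-in for the application under test.
    static class WebShop {
        private final List<String> cart = new ArrayList<>();

        List<String> searchFor(String term) {
            return term.equals("laptop") ? List.of("Laptop Pro 15") : List.of();
        }

        void addToCart(String product) {
            cart.add(product);
        }

        int cartItemCount() {
            return cart.size();
        }

        boolean checkout(String card) {
            return !cart.isEmpty() && card.equals("VALID_CARD");
        }
    }

    private WebShop shop;

    @BeforeEach
    void startFromKnownState() {
        // Every check gets its own starting point, instead of relying
        // on the steps that came before it in a long script.
        shop = new WebShop();
    }

    @Test
    void searchingForAnExistingProductShowsResults() {
        assertFalse(shop.searchFor("laptop").isEmpty());
    }

    @Test
    void addingAProductToTheCartUpdatesTheCartCount() {
        shop.addToCart("laptop");
        assertEquals(1, shop.cartItemCount());
    }

    @Test
    void checkoutWithAFilledCartAndAValidCardSucceeds() {
        shop.addToCart("laptop");
        assertTrue(shop.checkout("VALID_CARD"));
    }
}
```

If the search check fails at ‘step 10’, the cart and checkout checks still run and still report, which is exactly the feedback a verbatim translation of the long script would have swallowed.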

Regression scripts are written from an end user perspective
The other problem with translating regression scripts verbatim is that these scripts are often written from an end user perspective, operating on the user interface of the application under test. Again, that’s all well and good when you’re a human, but for automation it might not be the most effective way to gain information about the quality of your application under test. User interface-driven automation is notoriously hard to write and maintain, hard to stabilize, slow to execute and relatively prone to false positives.

Here too, in order to translate your existing regression scripts into effective and efficient automation, you’ll need to take a thorough look at what exactly is verified through those scripts, find out where the associated behaviour or logic is implemented, find or develop a way to communicate with your application under test on that layer (possibly the user interface, more likely an API, a single class or method or maybe even a database table or two) and take it from there.
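To give an idea of what ‘moving down a layer’ can look like: say one of your regression scripts verifies through the UI that a customer’s details are displayed correctly. The underlying behaviour might be verified directly at the API level instead, for example with REST Assured. The endpoint and field names below are made up for illustration:

```java
import org.junit.jupiter.api.Test;

import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;

// Hypothetical example: checking customer data on the API that serves it,
// instead of driving a browser to the page that displays it.
class CustomerDetailsApiTest {

    @Test
    void existingCustomerIsReturnedWithTheCorrectName() {
        given()
            .baseUri("https://api.example.com") // made-up endpoint
        .when()
            .get("/customers/12345")
        .then()
            .statusCode(200)
            .body("firstName", equalTo("John"))
            .body("lastName", equalTo("Smith"));
    }
}
```

Same verification, but typically faster, more stable and much cheaper to maintain than its UI-driven counterpart.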

Sure, this is a valuable exercise that will likely result in more efficient and stable automation, but it’s a step that’s easy to overlook when you’re given a batch of regression scripts with the sole requirement to ‘automate them all’. And, again, it sounds like more work, which not everybody may like to hear.

So, what to do instead?

My advice: forget about automating your regression tests.

There. I’ve said it.

Instead, ask yourself the following three questions with regard to your testing efforts:

  1. What’s consuming my testing time?
  2. What part of my testing efforts are repetitive?
  3. What part of my testing efforts can be repeated or enhanced by a script?

The answer(s) to these questions may (vaguely) resemble what you do during your regression testing, but they might also uncover other, much more valuable ways to apply automation to your testing. If so, would it still make sense to aim for ‘automating the regression testing’? I think not.

So, start writing your automation with the above questions in mind, and keep repeating to yourself and those around you that automation is there to make your and their lives easier, to enable you and them to do your work more effectively. It’s not there just to be applied everywhere, and definitely not to blindly automate an existing regression test suite.

On choosing both/and, not either/or

Choices. We all make dozens of them each day. Peanut butter or cheese (cheese for me, most of the time). Jeans or slacks (jeans, definitely). Coffee or tea (decent coffee with a glass of water on the side, please). And when you’re working on or learning about automation, there’s a multitude of choices you can (and sometimes have to) make as well. In my opinion, though, a lot of these choices, as I see people discussing and making them, are flawed. Some of them are even false dichotomies. Let’s take a look at the choices people think they need to make, and at the other options that are available. Options that might lead to better results, and to being better at your job.

Do I need to learn Java or .NET? Selenium or UFT?
Creating automation often involves writing code. So, the ability to write code is definitely a valuable one. However, getting hung up on a specific programming language might limit your options as you’re trying to get ahead.

I still see many people asking what programming language they need to learn when they’re starting out or advancing in their career. If you ask me, the answer is ‘it doesn’t really matter’. With the abundance of tools, languages, libraries and frameworks available to software development teams nowadays, chances are high that your next gig will require a different language than your current one.

As an example, I recently started a new project. So far, in most of my projects I’ve written automation in either Java or .NET. Not in this one, though. In the couple of weeks I’ve been here, I’ve created automation using PHP, Go and JavaScript. And you know what? It wasn’t that hard. Why? Because I’ve made a habit of learning how to program and of studying the principles of object-oriented programming, instead of learning the ins and outs of a specific programming language. Those specifics can be found everywhere on Google and StackOverflow.

The same goes for automation tools. I started writing UI-level automation using TestPartner. Then QuickTest Pro (now UFT). I’ve used Selenium in a few projects. I’ve dabbled with Cypress. Now, I’m using Codecept. It doesn’t matter. The principles behind these tools are much the same: you identify objects on a screen, then you interact with them. You need to take care of waiting strategies. If you become proficient in these strategies, which tool you’re using doesn’t matter that much anymore. I’ve stopped chasing the ‘tool du jour’, because there will always be a new one to learn. The principles have been the same for decades, though. What do you think would be a better strategy to improve yourself?
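To illustrate how transferable those principles are: here’s the ‘identify an object, wait for the right state, then interact’ pattern in Selenium with Java. The URL and locator are made up, and the same idea exists in Cypress, Codecept and most other UI automation tools, just with a different syntax:

```java
import java.time.Duration;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

// The transferable principle: identify an element, wait until it is in
// the right state, then interact with it. Only the syntax differs per tool.
public class ExplicitWaitExample {

    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://example.com"); // made-up page

            // Wait up to 10 seconds for the element to become clickable,
            // instead of sleeping for a fixed amount of time.
            WebElement submitButton = new WebDriverWait(driver, Duration.ofSeconds(10))
                .until(ExpectedConditions.elementToBeClickable(By.id("submit")));

            submitButton.click();
        } finally {
            driver.quit();
        }
    }
}
```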

Identify and learn to apply common principles and patterns, don’t get hung up on a single tool or language. Choose both/and, not either/or.

Do I stay a manual tester or become an automation engineer?
Another choice I often see people struggling with is the one between staying a ‘manual tester’ (a term I prefer not to use, for all the reasons Michael Bolton gives in this blog post of his) and becoming an automation engineer. If you ask me, this is a perfect example of a flawed choice in the testing field. It’s not a matter of either/or. It’s a matter of both/and.

Automation supports software testing, it does not replace it. If you want to become more proficient in automation, you need to become more proficient in testing, too. I’ve only fairly recently realized this myself, by the way. For years, all I did was automation, automation, automation, without thinking whether my efforts actually supported the testing that was being done. I’ve learned since that if you don’t know what testing looks like (hint: it’s much more than clicking buttons and following scripts), then you’ll have a pretty hard time effectively supporting those activities with automation.

Don’t abandon one type of role for the other one, especially when there’s so much overlap between them. Choose both/and, not either/or.

Do I learn to write tests against the user interface, or can I better focus on APIs?
So, I’ve been writing a lot about the benefits of writing tests at the API level, not only on this blog, but also in numerous talks and training courses. When I do so, I am often quite critical of the way too many people apply user interface-driven automation. And there IS a lot of room for improvement there, definitely. That does not mean I’m saying you should abandon this type of automation altogether, just that you should be very careful when deciding where to apply it.

Like in the previous examples, it is not a matter of either/or. For example, consider something as simple and ubiquitous as a login screen (or any other type of form in an application). When deciding on the approach for writing tests for it, it’s not a simple choice between tests at the UI level or tests at the API level; rather, it depends on what you’re testing. Writing a test that checks whether an end user sees the login form and all its associated elements in their browser? Whether the user can interact with the form? Whether the data entered by the user is sent to the associated API correctly? Or whether the form looks the way it’s supposed to? Those are tests that should be carried out at the UI level. Checking whether the data provided by the user is processed correctly? Whether incorrectly formatted data is handled in the appropriate manner? Whether the right level of access is granted to the user upon entering a specific combination of username and password? Those tests might target a level below the UI, as in the sketch below. Many thanks, by the way, to Richard Bradshaw for mentioning this example somewhere on Slack. I owe you one more beer.
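As a sketch of the ‘below the UI’ half of that split: checking that credentials are processed correctly and that invalid input is rejected could target the login API directly, for example with REST Assured. The endpoint, payloads and response field below are entirely made up for illustration:

```java
import org.junit.jupiter.api.Test;

import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.notNullValue;

// Hypothetical example: testing login behaviour below the UI. Whether
// credentials are processed correctly is a concern of the API, not the form.
class LoginApiTest {

    private static final String LOGIN_ENDPOINT = "https://api.example.com/login"; // made up

    @Test
    void validCredentialsYieldAnAccessToken() {
        given()
            .contentType("application/json")
            .body("{\"username\": \"alice\", \"password\": \"correct-password\"}")
        .when()
            .post(LOGIN_ENDPOINT)
        .then()
            .statusCode(200)
            .body("accessToken", notNullValue());
    }

    @Test
    void invalidCredentialsAreRejected() {
        given()
            .contentType("application/json")
            .body("{\"username\": \"alice\", \"password\": \"wrong-password\"}")
        .when()
            .post(LOGIN_ENDPOINT)
        .then()
            .statusCode(401);
    }
}
```

Whether the form sends that data correctly, and whether it looks right in the browser, would still be covered by a (much smaller) set of UI-level tests.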

Being able to make the right decision on the level and scope at which to write a test requires knowing what the benefits, the drawbacks and the possibilities of the alternatives are. It also requires the ability to recognize and apply principles and patterns to make the best possible decision.

Again, identify and learn to apply common principles and patterns, don’t get hung up on a single tool or language. Choose both/and, not either/or.

The point I’ve been trying to make with the examples above is that, like with so many things in life, being the best possible automation engineer isn’t a matter of choosing A over B. Of being able to do X or Y. What, in my opinion, will make you much better in your role is being able to do, or at least understand, both A and B, X and Y. Then, extract their commonalities (these will often take the form of the previously mentioned principles and patterns) and learn how to apply them. Study them. Learn more about them. Fail at applying them, and then learn from that.

I’m convinced that this is a much better approach to sustainable career development than running after the latest tool or hype and becoming a self-proclaimed expert at it, only to have to make a radical shift every couple of years (or even months, sometimes).

Don’t become a one trick pony. Choose both/and, not either/or.

On the 2018 Romanian Testing Conference

So, last week I had the pleasure of attending the 2018 edition of the Romanian Testing Conference. It was my second visit to Cluj: after having delivered a workshop at the conference last year I was invited to do another workshop for this year’s edition. I told Andrei, one of the organizers, that I would gladly accept if he:

  1. could schedule my workshop for the Thursday (Wednesday and Thursday were pre-conference workshop days, the conference itself was on the Friday), and
  2. would make it at least as good an event as last year, and if possible, even better.

Challenge mutually accepted!

During the time when the CFP was open, I sneakily submitted a proposal for a talk as well, and was quite surprised to see it accepted too. Yay! More work!

I left for Romania on Wednesday and arrived at the Hotel Grand Italia around 7 PM. Again, this was both the place where the speakers had their rooms and the venue for the conference itself. I cannot stress enough how awesome it is to be able to pop in and out of your room before, during and after workshops and talks without having to go to another place. Need a rest? Go to your room. Forgot something? You’ll have retrieved it in minutes. Want to check if there’s anyone up for a chat and/or a drink? Just ride the elevator down.

Again, the organization went to great lengths to make us speakers and workshop hosts as comfortable as they could: there was always someone around if you had questions, and they picked you up from and brought you back to the airport in dedicated RTC cars (even at stupid o’clock). It is a wonderfully organized event.

Thursday – workshop day
Like last year, I hosted a workshop around API testing and automation. Whereas last year I only used REST Assured, this time I decided to give the participants a broader overview of tools by including some exercises with SoapUI, as well as a demo of Parasoft SOAtest, a commercially licensed API testing tool. Compared to last year, I also threw in more background on the ‘why?’ and the ‘what?’ of API testing.

Me delivering my workshop

I had 30 participants (like all other workshops and the conference itself, it was sold out) and after a bumpy start, including a couple of power outages, we were off. As with all workshops, it took me a little time to gauge the level of experience in the room and to adjust my pace accordingly, but I think I got it right fairly quickly. With several rounds of instruction > exercise > feedback, time flew by! There were plenty of breaks, and before I knew it, the working part of the day was over.

We were invited to a wonderful speakers’ dinner in the hotel restaurant, which provided plenty of time and opportunity to catch up with the speakers I’d met before, as well as to meet those I hadn’t yet had the privilege of meeting. After a day of teaching and all the impressions from dinner, I decided to be sensible and make it an early night. Mission failed: once in my room, it still took me hours to fall asleep. My brain just couldn’t shut off…

Friday – conference day
Friday morning came quickly, and that meant conference day! The programme was strong with this one… so many good talks on the schedule. Yet, as with many conferences, I spent most of the day in the hallway track, preparing for my talk (I wasn’t on until 4 PM) and chatting with speakers and attendees. For me, that’s often at least as valuable as the talks themselves.

Still, I saw three talks: first Angie Jones’ keynote (I met her a month earlier in Utrecht but had never seen her speak before), then Viktor Slavchev’s talk and finally Maria Kedemo’s keynote. All three were very good talks and I learned a lot from them, both in terms of the message they conveyed as well as their presentation style.

This day flew by too, and before I knew it, it was time for my own talk. Now, I’m a decent workshop host (or so I’d like to think…) but I am not an experienced speaker, so doing a talk takes a lot out of me, both in terms of the time it takes to prepare and the energy I spend during the talk itself. Still, I was pretty pleased with how it went, and the feedback afterwards reflected that. Maybe I just need to do this more often…

Me during my talk

After the closing talk, which I skipped in favor of going outside, enjoying the beautiful weather and winding down, the conference was already over. To round it all off, we went out for a bite with some of the speakers before attending the conference closing party. The organization had one final surprise in store for me there, when they gave me an award for the best workshop of the conference. Seeing the list of amazing workshops they had on offer this year, I certainly did not expect that!

Since my flight back home left at an ungodly hour the next morning, I decided not to make it too long an evening (not everybody followed my example, judging by my Twitter timeline the next morning…). The travels home were uneventful (which I consider a good thing) and suddenly, it was all over again.

My thoughts on this wonderful conference, the organization and the volunteers can be summarized by this tweet, I think:

So, did the organization deliver? Well, I did get to do my workshop on the Thursday, and I had an amazing time again, so yes, I’d say mission accomplished.

Who knows, maybe we’ll be seeing each other there next year?