On the 2018 Romanian Testing Conference

So, last week I had the pleasure of attending the 2018 edition of the Romanian Testing Conference. It was my second visit to Cluj: after having delivered a workshop at the conference last year I was invited to do another workshop for this year’s edition. I told Andrei, one of the organizers, that I would gladly accept if he:

  1. could schedule my workshop for the Thursday (Wednesday and Thursday were pre-conference workshop days, the conference itself was on the Friday), and
  2. would make it at least as good an event as last year, and if possible, better.

Challenge mutually accepted!

During the time when the CFP was open, I sneakily submitted a proposal for a talk as well, and was quite surprised to see it accepted too. Yay! More work!

I left for Romania on Wednesday and arrived around 7 PM at the Grand Italia hotel, which again was both the place where the speakers had their rooms and the venue for the conference itself. I cannot stress enough how awesome it is to be able to pop in and out of your room before, during and after workshops and talks without having to go to another place. Need a rest? Go to your room. Forgot something? You’ll have retrieved it in minutes. Want to check if there’s anyone up for a chat and/or a drink? Just ride the elevator down.

Again, the organization went to great lengths to make us speakers and workshop hosts as comfortable as they could. There was always someone around if you had questions, and dedicated RTC cars picked you up from and brought you back to the airport (even at stupid o’clock). It is a wonderfully organized event.

Thursday – workshop day
Like last year, I hosted a workshop on API testing and automation. Whereas last year I used only REST Assured, this time I decided to give the participants a broader overview of tools by including some exercises with SoapUI, as well as a demo of Parasoft SOAtest, a commercially licensed API testing tool. Also, compared to last year, I threw in more background on the ‘why?’ and the ‘what?’ of API testing.

Me delivering my workshop

I had 30 participants (like all the other workshops and the conference itself, it was sold out) and after a bumpy start, including a couple of power outages, we were off. As with all workshops, it took me a little time to gauge the level of experience in the room and adjust my pace accordingly, but I think I got it right pretty quickly. With several rounds of instructions > exercise > feedback, time flew by! Breaks were plentiful and before I knew it, the working part of the day was over.

We were invited to a wonderful speakers’ dinner in the hotel restaurant, which provided plenty of time and opportunity to catch up with the other speakers I’d met before, as well as to meet those I hadn’t had the privilege of meeting yet. After a day of teaching and all the impressions from dinner, I decided to be sensible and make it an early night. Mission failed, because once in my room it still took me hours to fall asleep. My brain just couldn’t shut off…

Friday – conference day
Friday morning came quickly, and that meant it was conference day! The programme was strong with this one… so many good talks on the schedule. Yet, like at many conferences, I spent most of the day in the hallway track, preparing for my talk (I wasn’t on until 4 PM) and chatting with speakers and attendees. For me, that’s often at least as valuable as the talks themselves.

Still, I saw three talks: first Angie Jones’ keynote (I had met her a month earlier in Utrecht but had never seen her speak before), then Viktor Slavchev’s talk and finally Maria Kedemo’s keynote. All three were very good talks and I learned a lot from them, both in terms of the message they conveyed and their presentation style.

This day flew by too and before I knew it, it was time for my own talk. Now, I’m a decent workshop host (or so I’d like to think…) but I am not an experienced speaker, so doing a talk takes a lot out of me, both in terms of the time it takes to prepare and the energy I spend during the talk itself. Still, I was pretty pleased with how I did, and the feedback afterwards reflected that. Maybe I just need to do this more often…

Me during my talk

After the closing talk, which I skipped in favor of going outside, enjoying the beautiful weather and winding down, the conference was already over. To round it all off, we went out for a bite with some of the speakers before attending the conference closing party. The organization had one final surprise in store for me there: they gave me an award for the best workshop of the conference. Given the list of amazing workshops they had on offer this year, I certainly did not expect that!

Since my flight back home left at an ungodly hour the next morning, I decided not to make it too long an evening (not everybody followed my example, judging from my Twitter timeline the next morning…). The journey home was uneventful (which I consider a good thing) and suddenly, it was all over again.

My thoughts on this wonderful conference, the organization and the volunteers can be summarized by this tweet, I think:

So, did the organization deliver? Well, I did get to do my workshop on the Thursday, and I had an amazing time again, so yes, I’d say mission accomplished.

Who knows, maybe we’ll see each other there next year?

On creating reasonable expectations in test automation – A TestBash Netherlands workshop

Last week, I had the incredible pleasure of co-facilitating (with Ard Kramer) a workshop on creating reasonable expectations in test automation at the second edition of TestBash Netherlands. In this post, I’d like to tell you a little more about how we got there, what the workshop was all about and, of course, how it all went down on the big day.

The build-up
The adventure started when Huib Schoots, who was responsible for organizing the conference, contacted me asking if I’d be interested in hosting a pre-conference workshop. We discussed several subjects back and forth and, in the end, decided on a workshop that would help people create better test automation strategies. I have a lot of experience with hands-on test automation workshops where people work through exercises on their own machines, but much less with devising and running workshops where facilitating group discussions plays a big role, so we thought it was a good idea to bring someone on board with much more experience with that type of session. Ard was high on both of our lists, and luckily, he was up for it as well.

Our combined experience in test automation, testing and facilitating workshops turned out to be a great match. During a number of preparation sessions (for those of you thinking of hosting similar workshops: it sure takes a lot of time to prepare!) we came up with a series of exercises, carried out either individually or in smaller groups, that would ultimately result in people leaving with three or four actionable items they could take back to their jobs on Monday, addressing tangible problems and real issues they faced in their test automation efforts.

The workshop
As mentioned, the workshop consisted of a number of exercises that would help people identify and address gaps and opportunities in their test automation strategy. It would take way too long to describe the entire workshop, but here’s the gist of it…

We started out by having the attendees come up, individually, with strengths, weaknesses, opportunities and threats (indeed, a SWOT analysis) in their current test automation efforts. To help them on their way, we presented them with six aspects of a test automation strategy, along with some example questions they could ask to help identify strengths or pain points. These six categories (or aspects that we think make up a solid test automation strategy) are:

  • Technical
  • Knowledge and experience
  • Means and resources
  • Process and methodology
  • Organization
  • Business value

As you can see, in our opinion there’s a lot more to creating and implementing a successful test automation strategy than throwing tools at problems!

We then had the attendees discuss the challenges that resulted from the SWOT analysis in five different rounds, organized in groups based on the aforementioned categories. Each participant had the opportunity to address a subject in four different categories. For the fifth round, they had to play the role of facilitator for a subject they felt knowledgeable about or comfortable with. These discussion rounds made up the larger part of the day.

Cards and forms for the discussion rounds

Finally, we had the attendees pick their most interesting improvement point and create a 99-second pitch for it, which they presented to their group of 5-6 people. Each group would then pick the best or otherwise most interesting pitch, which would be presented to the entire group, on stage. The intention behind this (and basically, behind the setup of the entire workshop) was to have people discuss with and learn from as many other attendees as possible.

The big day
For the day, we had a total of 27(ish, I’m not 100% sure tbh) attendees, which made for perfect group sizes. It’s always exciting to see how a new workshop or training course turns out in practice: you can think of a lot of things that might happen (and we sure did think of a lot of scenarios), but in the end, you never know what’s going to happen on game day!

As the day unfolded, Ard and I were very happy to see that the group went about our exercises with enthusiasm. Of course, there are always things that could have gone better, but all in all, discussions were going strong throughout the day and we didn’t have to correct course much.

Participants hard at work during the workshop

It does help that the general audience of TestBash conferences is made up of people who are willing to open up to, discuss with and learn from their peers. In this respect, these conferences are truly high quality, and we as facilitators learned just as much as the participants.

The part of the day I am probably most proud of is that at the end, we had some great 99-second pitches presented on stage, and at least two of the people presenting their pitches to the workshop participants repeated their talk on conference day, in front of an audience of 150-200. We sort of hoped that this would happen, but you never know how it will turn out. It was truly rewarding to see this unfold in the way we intended. The only downside is that I wasn’t there in person, as I wasn’t able to make it to the conference day, but I can assure you I lived it vicariously through my Twitter feed!

The aftermath
The workshop day flew by, and at the end, we asked the participants to do a ‘dot vote’ and give us some honest feedback on what they liked, what they were indifferent about and what we could have done better. As you can see in the picture below, I think we did a decent job overall…

Feedback on our workshop

For me personally, preparing and delivering this workshop has been a great learning experience as well. As I said, the exercises in my training courses are mostly completed individually, on laptops. I’ve learned a lot about this type of workshop from Ard in the process, something that I’m sure will be of great value to me in the future (thanks again, Ard!).

I’m already looking forward to facilitating this workshop many more times in the future!

On handling processing time in your integration tests with Awaitility, and on using the Gmail API

For someone who writes and talks about REST Assured a lot, it has taken me a long time to find an opportunity to use it in an actual client project. Thankfully, my current project finally gives me the chance to use it on a scale broader than the examples and exercises from my workshop. Being able to do so has made me realize that some of the concepts and features I teach in the workshop deserve a more prominent spot, while others can be taught later on, or more briefly.

But that’s not what I wanted to talk about today. Now that I actually use REST Assured for integration testing (as in: testing the integration between a system and its adjacent components), there’s something I have to deal with that I haven’t had to tackle before: processing time. That is, the time it takes for a message triggered through the REST API of system A to reach and be processed by system B, before I can verify the result by consuming the REST API of system B.

Unlike writing tests for a single API, which is how I have been using and demonstrating REST Assured until now, I need a mechanism that helps me wait exactly long enough for the processing to be finished. Similar to user interface-driven automation, I could theoretically solve this by using calls to Thread.sleep(), but that’s ugly and vile and… just no.

Instead, I needed a mechanism that allowed me to poll an API until its response indicated a certain state change had occurred before performing the checks I needed to perform. In this case, and this is the example I’ll use in the remainder of this blog post, I invoked the REST API of system A to trigger a password reset for a given user account, and wanted to check if that resulted in a ‘password reset’ email message arriving in system B, system B being a specific Gmail inbox here.

Triggering the password reset
Triggering the password reset is done by means of a simple API call, which I perform (as expected) using REST Assured:

// the request specification contains the base URI and authentication details;
// PasswordForgottenRequestBody is a POJO that is serialized to the JSON request body
given().
    spec(identityApiRequestSpecification).
and().
    body(new PasswordForgottenRequestBody()).
when().
    post("/passwordreset").
then().
    assertThat().
    statusCode(204);
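A quick note on that body() call: REST Assured serializes the object you pass in to JSON, provided a library like Jackson or Gson is on the classpath. As an illustration, here’s a minimal, hypothetical sketch of what such a request body class could look like (the email field is an assumption for illustration purposes, not the actual API contract):

public class PasswordForgottenRequestBody {

    // hypothetical field; serialized to {"email": "..."} by Jackson/Gson
    private String email = "testuser@example.com";

    public String getEmail() {
        return email;
    }
}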

Waiting until the password reset email arrives
As stated above, I could just include a fixed waiting period of, say, 10 seconds before checking Gmail to see whether the email message arrived as expected. But again, Thread.sleep() is evil and dirty and… should be avoided at all times. No, I wanted a better approach, preferably one that didn’t result in unreadable code. I use my code in demos and I’d like to spend as little time as possible explaining my tests to others, so I want to keep it as readable as possible. Looking for a suitable library (why reinvent the wheel…) I was pointed to a solution created by Johan Haleby (not coincidentally also the creator of REST Assured), called Awaitility. From the website:

Awaitility is a DSL that allows you to express expectations of an asynchronous system in a concise and easy to read manner.

I’m not going to write about all of the features provided by Awaitility here (the usage guide does that way better than I ever could), but to demonstrate its expressive power, here’s how I used it in my test:

// requires: import static org.awaitility.Awaitility.await;
//           import java.util.concurrent.TimeUnit;
await().
    atMost(10, TimeUnit.SECONDS).
with().
    pollInterval(1, TimeUnit.SECONDS).
    until(() -> this.getNumberOfEmails() == 1);

This does exactly what it says on the tin: it executes a method called getNumberOfEmails() once per second, for at most 10 seconds, until the result returned by that method equals 1 (in which case my test execution continues) or until the 10 second timeout has been exceeded, in which case a ConditionTimeoutException is thrown. All with a single, readable statement. That’s how powerful it is.

In this example, getNumberOfEmails() is a method that retrieves the contents of a specific Gmail mailbox and returns the number of messages in it. Before the test starts, I empty the mailbox completely to make sure that no old messages remain there and cause false positives in my test (a sketch of that cleanup step follows below). Here’s how the method looks:

private int getNumberOfEmails() {

    return given().
        spec(gmailApiRequestSpec).
    when().
        get("/messages").
    then().
        extract().
        path("resultSizeEstimate");
}

This method retrieves the number of emails in a Gmail inbox (the required OAuth2 authentication details, base URL and base path are specified in the gmailApiRequestSpec RequestSpecification) by performing a GET call to /messages, then extracting and returning the value of the resultSizeEstimate field from the JSON response. If you want to know more about the Gmail API, its documentation can be found here, by the way.
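As for the cleanup step mentioned above: emptying the mailbox can be done through the same API. Here’s a minimal sketch, reusing the same gmailApiRequestSpec and assuming it sets a JSON content type. It uses the Gmail API batchDelete endpoint, which permanently deletes the messages (use the per-message /trash endpoint instead if you want them to remain recoverable):

private void emptyMailbox() {

    // retrieve the IDs of all messages currently in the mailbox
    // (requires import java.util.List)
    List<String> messageIds =
        given().
            spec(gmailApiRequestSpec).
        when().
            get("/messages").
        then().
            extract().
            path("messages.id");

    // the 'messages' field is absent when the mailbox is already empty
    if (messageIds == null || messageIds.isEmpty()) {
        return;
    }

    // permanently delete all listed messages in a single call;
    // the response body is empty on success
    // (requires import java.util.Collections)
    given().
        spec(gmailApiRequestSpec).
    and().
        body(Collections.singletonMap("ids", messageIds)).
    when().
        post("/messages/batchDelete");
}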

Checking the content of the password reset message
So, now that we know that an email message has arrived in the Gmail inbox, all that’s left for us to do is check whether it is a password reset message and not some other type of email message that might have arrived during the execution of our test. All we need to do is retrieve the contents of the mailbox once more, extract the message ID of the one email message in it, use that to retrieve the details of that message and check whatever we want to check (in this case, whether it landed in the inbox and whether the subject line has the correct value):

String messageID =
    given().
        spec(gmailApiRequestSpec).
    when().
        get("/messages").
    then().
        assertThat().
        body("resultSizeEstimate", equalTo(1)).
    and().
        extract().
        path("messages.id[0]");

// Retrieve email and check its contents
given().
    spec(gmailApiRequestSpec).
and().
    pathParam("messageID", messageID).
when().
    get("/messages/{messageID}").
then().
    assertThat().
    body("labelIds", hasItem("INBOX")).
and().
    body("payload.headers.find{it.name=='Subject'}.value", equalTo("Password reset"));

Gmail authentication
It took me a little while to figure out how to consume the Gmail API. In the end, this proved to be quite simple, but I spent a couple of hours fiddling with OAuth2 authorization codes, access and refresh tokens and the way Google has implemented all of this. Describing how that works is beyond the scope of this blog post, but you can find instructions here. Once you’ve obtained a refresh token, store it, because that’s the token you can use to request a new access token through the API, without having to deal with the pesky authentication user interface. For those of you more experienced with OAuth2 this might sound obvious, but it took me a while to figure out. Still, it’s far better than writing automation against the Gmail user interface (seriously, DON’T DO THAT).
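To give an idea of that last step, here’s a minimal sketch of exchanging a stored refresh token for a fresh access token with REST Assured, and of building a Gmail request specification with it. The endpoint and parameter names follow Google’s OAuth2 documentation for the refresh_token grant type; the client ID, client secret and refresh token values are placeholders:

// exchange the stored refresh token for a new access token
String accessToken =
    given().
        formParam("client_id", "YOUR_CLIENT_ID").          // placeholder
        formParam("client_secret", "YOUR_CLIENT_SECRET").  // placeholder
        formParam("refresh_token", "YOUR_REFRESH_TOKEN").  // placeholder
        formParam("grant_type", "refresh_token").
    when().
        post("https://oauth2.googleapis.com/token").
    then().
        assertThat().
        statusCode(200).
    and().
        extract().
        path("access_token");

// use the token to build the Gmail request specification
// (requires import io.restassured.builder.RequestSpecBuilder
//  and io.restassured.specification.RequestSpecification)
RequestSpecification gmailApiRequestSpec = new RequestSpecBuilder().
    setBaseUri("https://www.googleapis.com").
    setBasePath("/gmail/v1/users/me").
    addHeader("Authorization", "Bearer " + accessToken).
    build();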

So, to wrap things up, there are two lessons here.

One, if you’re looking for a library that helps you deal with processing times in integration tests in a flexible and readable manner, and you’re writing your tests in Java, I highly recommend taking a look at Awaitility. I’ve only recently discovered it but I’m sure this one won’t leave my tool belt anytime soon.

Two, if you want to include checking email in your integration or possibly even your end-to-end tests, skip the user interface and go the API route instead. Alternatively, you could try an approach like the one Angie Jones presents in a recent blog post, leveraging the JavaMail API. Did I say don’t use the Gmail (or Outlook, or Yahoo, or whatever) user interface?