Step-by-step integration testing: a case study

For the last 9 months or so, I have been working as a tester on a project where we develop and deliver the supply chain suite connected to a brand new, highly automated warehouse for a big Dutch retailer. As with so many modern software development projects, we have to deal with a lot of different applications and the information that is exchanged between them. For example, the process of ordering a single article on the website and processing it all the way to the moment it is on your doorstep involves 10+ applications and several times that number of XML messages.

Testing whether all these applications communicate correctly with one another is not simply a matter of placing an order and seeing what happens. It requires structured testing and a bottom-up approach, starting with the smallest level of integration and moving up until the complete set of applications involved is exercised. In this post, I will try to sketch how we have done this using three levels of integration testing.

First, here’s a highly simplified overview of what the application landscape looks like. On one side we have the supply chain suite and all other applications containing and managing necessary information. On the other side there’s the warehouse itself. I have been mostly involved in testing the former, but now that we’re into the final stages of the project, I am also involved (to an extent) in integration testing between both sides.

The application landscape

Level 1: message-level integration testing
The first level of integration testing that we perform is on the message level. On this level, we check whether message type XYZ can be sent successfully from application A to application B. Virtually all application integration is done using the Microsoft BizTalk platform. To create and perform these tests, we use BizUnit, a test tool specifically designed for testing BizTalk integrations. Every test follows the same procedure (a simplified code sketch follows the list below):

  1. Prepare the environment by cleaning relevant input and output queues and file locations
  2. Place a message of the type under test on the relevant BizTalk receive location (a queue or RESTful web service)
  3. Validate whether the message has been processed successfully by BizTalk and placed on the correct send location
  4. Check whether the message has been archived properly for auditing purposes
  5. Rinse and repeat for other message flows
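
Our real tests are defined as BizUnit test cases rather than hand-written code, but the flow is easy to sketch in any language. Below is a rough illustration of the same five steps in Python, for a file-based receive and send location; the folder names, the polling helper and the sample message are invented for the example and are not part of our actual setup.

    import shutil
    import time
    from pathlib import Path

    # Hypothetical folders standing in for BizTalk receive, send and archive locations.
    RECEIVE_LOCATION = Path("C:/test/receive")
    SEND_LOCATION = Path("C:/test/send")
    ARCHIVE_LOCATION = Path("C:/test/archive")

    def clean(*folders: Path) -> None:
        """Step 1: remove leftover messages from previous test runs."""
        for folder in folders:
            folder.mkdir(parents=True, exist_ok=True)
            for message in folder.glob("*.xml"):
                message.unlink()

    def wait_for_file(folder: Path, pattern: str, timeout: int = 30) -> Path:
        """Poll a folder until a matching message appears or the timeout expires."""
        deadline = time.time() + timeout
        while time.time() < deadline:
            matches = list(folder.glob(pattern))
            if matches:
                return matches[0]
            time.sleep(1)
        raise AssertionError(f"no message matching {pattern} appeared in {folder}")

    def test_customer_order_message_is_routed() -> None:
        # Step 1: prepare the environment.
        clean(RECEIVE_LOCATION, SEND_LOCATION, ARCHIVE_LOCATION)

        # Step 2: place a message of the type under test on the receive location.
        shutil.copy("testdata/customer_order.xml", RECEIVE_LOCATION)

        # Step 3: validate that the message was processed and routed to the send location.
        routed = wait_for_file(SEND_LOCATION, "customer_order*.xml")

        # Step 4: check that the message was archived for auditing purposes.
        archived = wait_for_file(ARCHIVE_LOCATION, "customer_order*.xml")

        assert routed.exists() and archived.exists()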

Note that on this test level, no checks are performed on the contents of messages. The only checks that are performed concern message processing and routing. BizTalk does not inspect or act on message contents; it only processes and routes messages based on XML header information. Therefore, it does not make sense to perform message content validations here.
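
To make that concrete, here is a tiny, invented example of the kind of information a routing check does look at. The envelope structure below is hypothetical (our real messages look different); the point is simply that only header fields matter at this level, while the body is treated as opaque.

    import xml.etree.ElementTree as ET

    # Hypothetical message envelope: routing is decided on header fields only,
    # the body is never inspected at this test level.
    MESSAGE = """
    <Envelope>
      <Header>
        <MessageType>CustomerOrder</MessageType>
        <Sender>WebShop</Sender>
        <Receiver>SupplyChainSuite</Receiver>
      </Header>
      <Body><!-- payload is treated as opaque --></Body>
    </Envelope>
    """

    def routing_key(message: str) -> tuple:
        """Return only the header fields a routing check cares about."""
        header = ET.fromstring(message).find("Header")
        return (
            header.findtext("MessageType"),
            header.findtext("Sender"),
            header.findtext("Receiver"),
        )

    assert routing_key(MESSAGE) == ("CustomerOrder", "WebShop", "SupplyChainSuite")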

Scope of the BizUnit message-level tests

Level 2: business process-level integration testing
The second level of integration testing focuses on the successful completion of different business processes, such as customer orders, customer returns and purchasing. These business processes involve the exchange of information between multiple applications. As the warehouse management system is being developed in parallel, its interface is simulated using a custom-built simulation tool. On a side note: this is a form of service virtualization.
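
The simulation tool itself deserves a post of its own, but the idea behind it is simple enough to sketch. The class below is a made-up, minimal stand-in for the warehouse management interface: it accepts pick requests and immediately reports the order as shipped, so that business process tests never have to wait for a real warehouse. A real simulator would of course listen on the actual integration channels and cover timing and error scenarios as well.

    from dataclasses import dataclass, field

    @dataclass
    class WarehouseSimulator:
        """Minimal, invented stand-in for the warehouse management interface."""

        shipped_orders: list = field(default_factory=list)

        def handle_pick_request(self, order_id: str, lines: list) -> dict:
            # Pretend every order line can be picked in full and ships immediately.
            self.shipped_orders.append(order_id)
            return {"order_id": order_id, "status": "SHIPPED", "picked_lines": lines}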

Tests on this level involve triggering the relevant business process and tracking the process instance as related messages pass through the applications involved; a simplified code sketch follows the list below. For example, a test on the customer order process starts by creating a new order and then verifying, amongst other things, whether the order:

  • can be successfully picked and shipped by the warehouse simulator,
  • is created, updated and closed correctly by the supply chain suite,
  • is administered correctly in the order manager,
  • triggers the correct stock movements, and
  • successfully triggers the invoicing process.
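
Expressed as code, such a test boils down to triggering the process once and then asserting on the state of every application involved. The sketch below keeps everything in memory (the order manager, stock and invoice stand-ins are invented, and the warehouse is faked inline, much like the simulator sketch above) purely to show the shape of the checks; our real tests talk to the actual applications, with only the warehouse simulated.

    def test_customer_order_process() -> None:
        # In-memory stand-ins for the applications involved in the process.
        order_manager: dict = {}
        stock = {"ART-001": 10}
        invoices: list = []
        shipped_orders: list = []

        # Trigger the business process: create a new customer order.
        order = {"order_id": "ORD-1", "article": "ART-001", "quantity": 2}
        order_manager[order["order_id"]] = {"status": "CREATED", **order}

        # The (simulated) warehouse picks and ships the order ...
        shipped_orders.append(order["order_id"])
        stock[order["article"]] -= order["quantity"]
        order_manager[order["order_id"]]["status"] = "CLOSED"

        # ... which in turn triggers the invoicing process.
        invoices.append({"order_id": order["order_id"], "amount": 19.95})

        # Verify that every application involved ended up in the expected state.
        assert order["order_id"] in shipped_orders                        # picked and shipped
        assert order_manager[order["order_id"]]["status"] == "CLOSED"     # order closed correctly
        assert stock["ART-001"] == 8                                      # correct stock movement
        assert any(i["order_id"] == order["order_id"] for i in invoices)  # invoicing triggered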

Scope of the business process tests

A number of tests on this level have been automated using FitNesse. For me, this was the first project where I had to use FitNesse, and while it does the job, I haven’t exactly fallen in love with it yet. Maybe I just don’t know enough about how to use it properly?

Level 3: integration with the warehouse
The third and final level of integration testing was performed on the interface between the supply chain suite (plus its connecting applications) and the actual warehouse itself. As both sides were developed in parallel, it took quite a bit of time before we were finally able to test this interface properly. And even though our warehouse simulation had been designed and implemented very carefully, and it certainly did a lot to speed up the development process, the first integration tests showed that there is no substitute for the real thing. After lots of bug fixing and retesting, we were able to successfully complete this final level of integration testing.

Scope of the warehouse integration tests

For this final level of integration testing, we were not able to use automated tests because of the time required by the warehouse to physically pick and ship the created orders. It would not have made sense to build automated tests that have to wait for an hour or more before the warehouse returns a response indicating that the order has been shipped. The test cases executed mostly follow the same steps as those in level 2, as they too focus on executing business processes.

I hope this post has given you some ideas on how to break down the task of integration testing for a large and reasonably complex application landscape. Thinking in different integration levels has certainly helped me to determine which steps and which checks to include in a given test scenario.

"