
Tuesday, June 4, 2013

Automated tests in four J-steps

Hi there! Thinking about automated functional tests is really important in software development, for many reasons we won't discuss today. However, it is very common for QA teams (at least in my experience) to use complex automation tools to accomplish simple tests. These tools can make test creation and evolution very hard to follow.

Our purpose in this article is to build a simple example with the four J's: automate an integration test with Jersey as a REST client and JUnit as our main test framework, publish the test documentation with Javadoc, and finally set up Jenkins to show the JUnit test reports and Javadoc documentation in its dashboard. As you will see, this configuration is very easy to create and maintain, so you should think twice before adopting a more powerful test strategy. A simple setup like this can cover most automated test needs.

Example setup

To show the power of the J's, we have implemented a simple webservice test and deployed it in a Jenkins environment. The example is meant to be very practical, which is why we have broken it down into four J-steps.

J-step #1: Jersey

Jersey is a RESTful library, the reference implementation of the JAX-RS specification. The webservice request client is implemented in the code snippet below:



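A minimal sketch of such a client, assuming the Jersey 1.x client API (the class and method names here are illustrative, not necessarily the original post's):

```java
import javax.ws.rs.core.MediaType;

import com.sun.jersey.api.client.Client;
import com.sun.jersey.api.client.WebResource;

// Hypothetical helper class: issues a GET request and lets Jersey
// unmarshal the XML response body into the given class via JAXB.
public class RestClient {

    public static <T> T requestWebService(String url, Class<T> responseClass) {
        Client client = Client.create();
        WebResource resource = client.resource(url);
        // Accept XML; Jersey converts the response into responseClass
        return resource.accept(MediaType.APPLICATION_XML_TYPE).get(responseClass);
    }
}
```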
The purpose of the code is to make a request to a URL and save the response in an instance of a parameterized class, using Java's generics mechanism. We have also specified that the webservice returns an XML object, which is unmarshalled behind the scenes into our Java object.

J-step #2: Javadoc

We expect Maven to generate the test javadoc (supposing we are required to write test documentation using Javadoc). To make that happen, let's set up the maven-javadoc-plugin in pom.xml:



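A pom.xml fragment along these lines registers the plugin and maps the two custom tags (the plugin version is illustrative; the head values match the output shown later in the post):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <version>2.9</version>
  <configuration>
    <!-- Custom tags used in the test javadoc -->
    <tags>
      <tag>
        <name>input</name>
        <placement>a</placement> <!-- "a" = allowed in all doc comments -->
        <head>Test input:</head>
      </tag>
      <tag>
        <name>assert</name>
        <placement>a</placement>
        <head>Test assertion:</head>
      </tag>
    </tags>
  </configuration>
</plugin>
```

The plugin's test-javadoc goal is what documents test sources; by default it writes to target/site/testapidocs.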
The Javadoc plugin for Maven allows us to customize tags, so we can write all automated test documentation in javadoc. I think this is a great advantage, since QA teams are always looking for tools that provide a structured way of documenting tests, and javadoc in automated tests can reach that goal. In the configuration we have created two custom javadoc tags: @input and @assert. Wherever these tags appear in the code, they are rendered under their respective head values in the generated javadoc.

J-step #3: JUnit

Now that the javadoc plugin is configured, let's create our first (and documented, for sure!) test class:



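A documented test class in this style could look like the sketch below (the webservice URL, operand values, and the statically available requestWebService() helper are assumptions for illustration):

```java
import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class SummationTest {

    /**
     * Checks the sum operation exposed by the webservice.
     *
     * @input operands 2 and 3, passed as query parameters
     * @assert the unmarshalled response holds 5 as the sum result
     */
    @Test
    public void shouldSumTwoNumbers() {
        // requestWebService() unmarshals the XML response into a Summation
        Summation summation = requestWebService(
                "http://localhost:8080/calc/sum?a=2&b=3", Summation.class);
        assertEquals(5, summation.getResult());
    }
}
```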
This test is really, really simple. A request is made using the requestWebService() method and the response is stored in a Summation instance. The Summation class can be implemented using the classes in the JAXB (another "J") javax.xml.bind package. Its getResult() method holds the result of the sum operation provided by the webservice, which we finally compare against the expected value using JUnit's assertEquals.
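A Summation sketch using JAXB annotations might look like this (the XML element names are assumptions about the webservice's response format):

```java
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlRootElement;

// JAXB binding for a response such as:
//   <summation><result>5</result></summation>
@XmlRootElement
public class Summation {

    private int result;

    @XmlElement
    public int getResult() {
        return result;
    }

    public void setResult(int result) {
        this.result = result;
    }
}
```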

J-step #4: Jenkins

Let's deploy the structure in our continuous integration environment; the last part is setting up Jenkins. Here, the Javadoc and JUnit report views in the dashboard can be enabled by filling in the form below:



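For a typical Maven job, the post-build actions would point at the default output directories (the exact paths are assumptions; adjust them to your build):

```
Publish JUnit test result report
    Test report XMLs:  target/surefire-reports/*.xml

Publish Javadoc
    Javadoc directory: target/site/testapidocs
```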
We just have to input the Maven-generated test report and javadoc directories. With this approach, we can easily view the JUnit reports and test documentation in the Jenkins dashboard. The test javadoc written earlier should look like this:



As expected, the @assert and @input tags were converted to "Test assertion" and "Test input", respectively.

Remarks

Integration testing is an essential part of every software development cycle. However, it is sometimes neglected because of its inherent complexity. This approach gives you a simple way to write automated webservice tests without a "silver bullet" tool, which can be very hard to maintain. There are many interesting and powerful test tools out there, but sometimes time and/or budget constraints require us to keep things really simple.

Another advantage is that by deploying the test cases, test reports, and javadoc to Jenkins, we concentrate all these artifacts in the same environment, which contributes to test maintainability.

That's all for now. See you in the next post.
