
Creating Sonar Reports from Hudson

Introduction

In order to guarantee the quality of software development projects, it is important to be able to verify that a continuous integration build meets a minimum set of quality control criteria. The open source project Hudson provides the popular continuous integration server we will use throughout our example. Similarly, Sonar is a leading open source tool providing a centralized platform for storing and managing these quality control indicators. By integrating Sonar with Hudson, we are able to extract and verify quality control metrics stored by Sonar in an automated and recurrent manner from Hudson. By verifying these quality metrics we can qualify a given build as valid from a quality perspective, and quickly flag builds where violations occur. At the same time, it is very useful to generate summaries of key quality metrics in an automated manner, informing interested parties with a daily email.

Installing Hudson

As a first step, you will need to download and install Hudson from http://hudson-ci.org/.

Installing the Groovy Postbuild Plugin

In order to be able to extend Hudson with custom Groovy-based scripts, we will use the Groovy Postbuild Plugin. To install this plugin, you will have to click on Manage Hudson followed by Manage Plugins, as shown below:

You will then have to select the Available tab at the top, and search for Groovy Postbuild Plugin under the section Other Post-Build Actions.

Sonar Reporting the Groovy Way

Once the Groovy Postbuild Plugin has been successfully installed and Hudson restarted, you can go ahead and download the SonarReports package and extract it to ${HUDSON_HOME}, the home directory of the Hudson server (e.g. the folder .hudson under the user’s home directory on Windows systems). The zip file contains SonarReports.groovy under scripts/groovy, so after expansion the script ends up in ${HUDSON_HOME}/scripts/groovy/SonarReports.groovy.

Hudson Job Configuration

To facilitate reuse of our Hudson configuration for Sonar, we will first create a Sonar Metrics job to be used as a template. We can then create a new job for each project we wish to create Sonar reports for by simply copying this job template.

In the Sonar Metrics job, we first create the necessary parameters that will be used as thresholds and validated by our Groovy script. To this end, we select the checkbox This build is parameterized under the job’s configuration. We then configure the parameters listed below, for which the corresponding screenshots are provided; the sketch after the list shows how a Groovy postbuild script can read these parameters:

  • projectName: project name that will appear in emails sent from Hudson.
  • sonarProjectId: internal project ID used by Sonar.
  • sonarUrl: URL for the Sonar server.
  • emailRecipients: email addresses for recipients of Sonar metrics summary.
  • rulesComplianceThreshold: minimum percentage of rule compliance for validating a build. A value of false means this metric will not be enforced.
  • blockerThreshold: maximum number of blocker violations for validating a build. A value of false means this metric will not be enforced.
  • criticalThreshold: maximum number of critical violations for validating a build. A value of false means this metric will not be enforced.
  • majorThreshold: maximum number of major violations for validating a build. A value of false means this metric will not be enforced.
  • codeCoverageThreshold: minimum percentage of code coverage for unit tests for validating a build. A value of false means this metric will not be enforced.
  • testSuccessThreshold: minimum percentage of successful unit tests for validating a build. A value of false means this metric will not be enforced.
  • violationsThreshold: maximum number of violations of all type for validating a build. A value of false means this metric will not be enforced.
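
The downloadable SonarReports.groovy implements the actual logic, but as a rough idea of how a Groovy postbuild script can pick up these parameters, here is a minimal sketch. It assumes the standard manager object that the Groovy Postbuild plugin exposes to its scripts:

// Minimal sketch (not the contents of SonarReports.groovy): read the job
// parameters from the build, using the "manager" helper provided by the
// Groovy Postbuild plugin.
def params = manager.build.buildVariables

def projectName    = params['projectName']
def sonarUrl       = params['sonarUrl']
def sonarProjectId = params['sonarProjectId']

// A threshold value of "false" means the corresponding check is disabled
def coverageThreshold = params['codeCoverageThreshold']
def coverageEnforced  = (coverageThreshold != 'false')

manager.listener.logger.println(
    "Validating ${projectName} (${sonarProjectId}) against Sonar at ${sonarUrl}")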

Finally, we enable the Groovy Postbuild plugin by selecting the corresponding checkbox under the Post-build Actions section of the job configuration page. In the text box, we include the following Groovy code to call into our script:

// Locate the SonarReports.groovy script under ${HUDSON_HOME}/scripts/groovy
sonarReportsScript = "${System.getProperty('HUDSON_HOME')}/scripts/groovy/SonarReports.groovy"
// Evaluate the script with the binding of this postbuild step, so it can see
// the job parameters and the Groovy Postbuild "manager" object
shell = new GroovyShell(getBinding())
println "Executing script for Sonar report generation from ${sonarReportsScript}"
shell.evaluate(new File(sonarReportsScript))

Your Hudson configuration page should look like this:

Generating Sonar Reports

In order to automatically generate Sonar reports, we can configure our Hudson job to build periodically (e.g. daily) by selecting this option under Build Triggers. The job will then execute with the specified frequency, using the default quality thresholds we configured in the job’s parameters.

It is also possible to run the job manually to generate reports on demand at any time. In this case, Hudson will ask for the value of the threshold parameters that will be passed in to our Groovy script. These values will override the default values specified in the job’s configuration. Here is an example:

Verifying Quality Control Metrics

When the Hudson job runs, our Groovy script will verify that any thresholds defined in the job’s configuration are met by the project metrics extracted from Sonar. If the thresholds are met, the build will succeed and a summary of the quality control metrics will appear in the Hudson build. In addition, a summary email will be sent to the recipient list emailRecipients, providing interested parties with information regarding the key analyzed metrics.

On the other hand, if the thresholds are not met, the build will be marked as failed and the metric violations will be described in the Hudson build. Similarly, an email will be sent out informing recipients of the quality control violations.
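
To illustrate how such a verification can be implemented, here is a minimal sketch. It is not the shipped SonarReports.groovy, and the /api/resources URL and the <msr>/<key>/<val> response layout are assumptions based on the Sonar 2.x web services, so adjust them to your Sonar version:

// Minimal sketch: fetch two metrics from Sonar's web service API and validate
// them against the job thresholds (URL and response layout are assumptions).
def params = manager.build.buildVariables
def apiUrl = "${params['sonarUrl']}/api/resources?resource=${params['sonarProjectId']}" +
             "&metrics=coverage,violations"

// Parse the XML response into a map of metric key -> numeric value
def metrics = [:]
new XmlSlurper().parse(apiUrl.toString()).resource.msr.each { msr ->
    metrics[msr.key.text()] = msr.val.text().toBigDecimal()
}

def failures = []
def coverageLimit = params['codeCoverageThreshold']
if (coverageLimit != 'false' && metrics['coverage'] != null
        && metrics['coverage'] < coverageLimit.toBigDecimal()) {
    failures << "code coverage ${metrics['coverage']}% is below ${coverageLimit}%"
}
def violationsLimit = params['violationsThreshold']
if (violationsLimit != 'false' && metrics['violations'] != null
        && metrics['violations'] > violationsLimit.toBigDecimal()) {
    failures << "${metrics['violations']} violations exceed the limit of ${violationsLimit}"
}

if (failures) {
    failures.each { manager.listener.logger.println("Quality check failed: ${it}") }
    manager.buildFailure()   // mark the build as failed (Groovy Postbuild helper)
}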

Conclusion

This article demonstrates how Hudson can be extended with the use of dynamic programming languages like Groovy. In our example, we have created a Hudson job that verifies quality control metrics generated by Sonar and automatically sends quality reports by email. This type of functionality is useful in continuous integration environments, in order to extend the default features provided by Hudson or Sonar to meet custom needs.

Intellectual Property (IPR) Management and Monitoring Tools

It seems that every day projects have more and more dependencies on libraries (internal or external) and, of course, many of these depend on other libraries, resulting in a large dependency tree for any given project. How do you know whether any of those libraries contains code that is licensed in a way that is incompatible with your company’s policies (e.g. no GPL)?

BT (the former British Telecom) apparently didn’t and ended up having to publish all the code used in one of the routers it distributes due to a GPL violation.

To give you an idea of the scale of this problem, a quick search of my local Maven repository reveals that it contains 1760 JAR files. Admittedly, not all of these belong to a single project; they are probably spread over 20 different projects. Even so, trying to manage such a check manually is pretty infeasible.

Tools like Maven are a great help for managing the dependency tree of your project, but they don’t help much with checking the license that each dependency uses. The pom.xml file permits the use of a <license> element, but it is optional, many libraries either don’t use Maven or don’t specify the license, and you have to check compliance manually in any case.
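
As a quick illustration of how little the POM metadata gives you on its own, the following Groovy sketch scans a local Maven repository (the path is an assumption) and prints whatever license information the POMs declare. Anything reported as "no license declared" still has to be checked by hand:

// Illustrative only: scan the local Maven repository and print the <licenses>
// information that each POM declares, if any.
def repo = new File(System.getProperty('user.home'), '.m2/repository')
if (!repo.directory) {
    println "No local repository found at ${repo}"
    return
}

repo.eachFileRecurse { file ->
    if (!file.name.endsWith('.pom')) return
    try {
        def pom = new XmlSlurper().parse(file)
        def licenses = pom.licenses.license.collect { it.name.text() }
        println "${file.name}: ${licenses ? licenses.join(', ') : 'no license declared'}"
    } catch (Exception e) {
        println "${file.name}: could not parse (${e.message})"
    }
}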

This is where IPR monitoring tools come in. Such tools allow the definition of licensing policies at an organizational level and provide mechanisms to monitor compliance with these policies in software projects, raising alerts on detected violations.

We recently had to take a look at such tools for one of our clients. After studying the market, we found that there are currently no open-source solutions covering this problem domain, but several commercial tools address the problem of continuous IPR monitoring.

For reference purposes, here is a list of the providers that we discovered:

IPR Management Tool                        Site
Palamida Compliance Edition                http://www.palamida.com
Black Duck Protex                          http://www.blackducksoftware.com/protex
Protecode                                  http://www.protecode.com
HiSoftware AccVerify                       http://www.hisoftware.com
OpenLogic Library or Enterprise Edition    http://www.openlogic.com

All of these commercial products offer common features:

  • Automated binary and source code analysis with multi-language support (Java, C/C++, C#,
    Visual Basic, Perl, Python, PHP). The analysis is performed against an external proprietary
    database that contains the code of most open-source products.
  • Workflows to control the IPR of software projects throughout their whole lifecycle, based
    on the defined licensing policies.
  • License approval/disapproval mechanisms, as well as bills of materials for software
    releases summarizing components, licenses, approval status and license/policy violations.
  • Different levels of code fragment recognition to detect reuse of code.
  • User interfaces offering policy management, reporting and dashboard features.
  • Support for integrating code scans into Continuous Integration platforms via command line
    execution.

We think that these products are going to become increasingly important as the total number of libraries used in projects shows no sign of decreasing and there will always be a need to protect intellectual property.

Performance Tests with Jmeter, Maven and Hudson

Continuing with the series of blog posts regarding testing, automation and continuous integration, this time I will talk about how to integrate performance tests, in this case using Jmeter, with Maven and the Hudson continuous integration system. Jmeter is one of the main tools we use in our projects to create relevant performance tests, and automation and integration in our CI systems is essential. We also use SoapUI and Grinder depending on the platform, but we will cover those in future posts.

To integrate Jmeter and Maven, you must use the Maven Jmeter plugin, the first version of which is officially hosted here. There are several posts discussing the use of this plugin that I have used as reference, particularly this one from James Lorenzen.

An updated version of this original plugin was released on Google Code by Ronald Alleva. It adds support for the latest version of Jmeter (2.3.2) and contains some additional enhancements such as parameterisation support. I decided to use this version to get this extra functionality.

To make this post as simple and easy to follow as possible, my idea was to test Jmeter test automation and integration against a web application running in an embedded Jetty container.

My objectives were:

  • Be able to run my Jmeter tests in a specific maven phase.
  • Create the relevant Jmeter test results html reports and publish them during the maven site phase.
  • Integrate Jmeter execution and reporting with Hudson, via the Hudson Jmeter plugin available here.

This post is a combination of James and Ron’s posts, adding solutions to problems I discovered while creating and executing my tests and showing the integration with Hudson.

Ronald describes pretty well here the different steps required in order to make the plugin work. I will not include these steps but rather comment on them.

  • Download the support jars and the maven Jmeter plugin and install them.
  • Add the dependency to your maven project. In our case, we want to execute our Jmeter tests when we start the embedded jetty container. We usually start/stop Jetty as part of the pre-integration-test and post-integration-test maven phases, so we just need to add the Jmeter plugin execution to the integration-test phase (I have also included the jetty part below):

   
<profile>
  <id>env-dev-jetty</id>
  <activation>
    <activeByDefault>true</activeByDefault>
  </activation>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.jmeter</groupId>
        <artifactId>maven-jmeter-plugin</artifactId>
        <version>1.0</version>
        <executions>
          <execution>
            <id>jmeter-tests</id>
            <phase>integration-test</phase>
            <goals>
              <goal>jmeter</goal>
            </goals>
          </execution>
        </executions>
        <configuration>
          <reportDir>${project.build.directory}/jmeter-reports</reportDir>
          <jmeterUserProperties>
            <!-- user property names must match those referenced by the parameterised tests -->
            <host>localhost</host>
            <port>9080</port>
          </jmeterUserProperties>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.mortbay.jetty</groupId>
        <artifactId>maven-jetty-plugin</artifactId>
        <version>6.1.12rc1</version>
        <configuration>
          <webApp>${basedir}/target/${project.build.finalName}/${project.parent.name}-webapp-${project.version}.war</webApp>
          <contextPath>${project.parent.name}-webapp</contextPath>
          <connectors>
            <connector implementation="org.mortbay.jetty.nio.SelectChannelConnector">
              <port>9080</port>
              <maxIdleTime>60000</maxIdleTime>
            </connector>
          </connectors>
          <systemProperties>
            <systemProperty>
              <name>pmtool.application.environment</name>
              <value>dev</value>
            </systemProperty>
          </systemProperties>
          <stopPort>9966</stopPort>
          <stopKey>foo</stopKey>
          <daemon>true</daemon>
          <scanIntervalSeconds>0</scanIntervalSeconds>
        </configuration>
        <executions>
          <execution>
            <id>start-jetty</id>
            <phase>pre-integration-test</phase>
            <goals>
              <goal>run-war</goal>
            </goals>
          </execution>
          <execution>
            <id>stop-jetty</id>
            <phase>post-integration-test</phase>
            <goals>
              <goal>stop</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>

In a real scenario where, as part of your project cycle, the application is deployed to a development or pre-production environment, we could use the verify phase to execute the performance tests.

  • Also, in case you want to generate html reports for publication on the project site generated by Maven, execute the XSLT transformation from xml to html, using the stylesheet provided by Jmeter and the xml-maven-plugin:

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>xml-maven-plugin</artifactId>
  <version>1.0-beta-2</version>
  <executions>
    <execution>
      <phase>pre-site</phase>
      <goals>
        <goal>transform</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <transformationSets>
      <transformationSet>
        <dir>${project.build.directory}/jmeter-reports</dir>
        <stylesheet>src/test/resources/jmeter-results-detail-report_21.xsl</stylesheet>
        <outputDir>${project.build.directory}/site/jmeter-results</outputDir>
        <fileMappers>
          <fileMapper implementation="org.codehaus.plexus.components.io.filemappers.FileExtensionMapper">
            <targetExtension>html</targetExtension>
          </fileMapper>
        </fileMappers>
      </transformationSet>
    </transformationSets>
  </configuration>
</plugin>

  • Create some Jmeter tests. I created 2 Jmeter tests, PerfTest1 and PerfTest2, which are parameterised so that different hostnames and ports can be specified for the tests. The maven profile “env-dev-jetty” (which is used by default) defines the Jmeter parameters for Jetty in the “jmeterUserProperties” section. We have a separate profile to run Jmeter against the application deployed in a weblogic container.
  • Copy the jmeter.properties file to the “/src/test/jmeter” folder in your project and modify the properties to fit your needs. For example, I had to modify some “jmeter.save.saveservice” parameters to enable additional output, and also some “log_level” settings to see a bit more detail in the jmeter.log file.
  • Execute mvn to run the tests.
mvn integration-test

Up to this point everything was OK, apart from the fact that, when running in maven, the Jmeter tests were hanging just after finishing correctly. After some investigation, I found that the reason was the exit logic implemented in the maven Jmeter plugin. The plugin exits when the number of threads drops back to the number counted just before the Jmeter start call. This works in most cases, but when you are also running an embedded Jetty server, the threads spawned to service the requests triggered by the Jmeter tests are counted as well, causing the wait. The plugin eventually exits once all these connection threads have timed out and been closed.

The solution was to change the exit logic, monitoring the jmeter.log file as described here, instead of monitoring the number of threads. This should work in most cases.
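
The following Groovy sketch illustrates the log-watching idea (it is not the plugin’s actual code). Both the jmeter.log location and the end-of-run marker text are assumptions, so check what your Jmeter version writes at the end of a non-GUI run:

// Sketch of the log-watching exit logic: poll jmeter.log until the end-of-run
// marker appears, instead of waiting for the thread count to drop.
def jmeterLog = new File('target/jmeter-reports/jmeter.log')   // assumed location
def endMarker = 'Test has ended'                               // assumed marker text
def timeoutMs = 10 * 60 * 1000
def start     = System.currentTimeMillis()

while (System.currentTimeMillis() - start < timeoutMs) {
    if (jmeterLog.exists() && jmeterLog.text.contains(endMarker)) {
        println 'Jmeter run finished, carrying on with the build'
        break
    }
    Thread.sleep(2000)   // poll the log every couple of seconds
}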

  • Integrate into Hudson, the continuous integration system that we use. For that, I installed the Jmeter Hudson plugin and configured it as shown below:
Hudson Jmeter Plugin Configuration

  • In order to keep the Jmeter detailed performance test results, I configured Hudson to archive the relevant reports as build artifacts.
  • The Hudson and Maven Jmeter plugins use different naming conventions for the report files. The maven Jmeter plugin generates the report filename with a date, which is not compatible with the static file name required by the Hudson Jmeter plugin. Solution: I modified the maven plugin source code to generate a results file without the date in it (e.g. PerfTest1-result.xml instead of PerfTest1-090413.xml). There is no disadvantage, as Hudson stores the results for each build as an artifact, so a history is kept.
  • The Hudson plugin doesn’t really support collecting information about multiple Jmeter tests at the moment. The way to go would be to group different performance tests into a single Jmeter file and to create separate Hudson jobs for them.
  • After all this setup has been done, you can execute some builds and check that the Jmeter execution graphs work as they should. I created an example, containing errors and different response times, to make it pretty. The graphs show the trend for only one of the Jmeter tests.
Hudson Jmeter Test Execution Trend

  • Note the section “Last Successful Artifacts”, which contains the html reports created during the maven site creation and archived for future reference. An example of this detailed report is:
Jmeter Detailed report as Hudson Artifact