The Server Labs Blog

Continuous Integration

Creating Sonar Reports from Hudson

Introduction

In order to guarantee the quality of software development projects, it is important to be able to verify that a continuous integration build meets a minimum set of quality control criteria. The open source project Hudson provides the popular continuous integration server we will use throughout our example. Similarly, Sonar is a leading open source tool providing a centralized platform for storing and managing these quality control indicators. By integrating Sonar with Hudson, we are able to extract and verify the quality control metrics stored by Sonar in an automated and recurrent manner from Hudson. By verifying these quality metrics we can qualify a given build as valid from a quality perspective, and quickly flag builds where violations occur. At the same time, it is very useful to generate summaries of key quality metrics in an automated manner, informing interested parties with a daily email.

Installing Hudson

As a first step, you will need to download and install Hudson from http://hudson-ci.org/.

Installing the Groovy Postbuild Plugin

In order to be able to extend Hudson with custom Groovy-based scripts, we will use the Groovy Postbuild Plugin. To install this plugin, you will have to click on Manage Hudson followed by Manage Plugins, as shown below:

You will then have to select the Available tab at the top, and search for Groovy Postbuild Plugin under the section Other Post-Build Actions.

Sonar Reporting the Groovy Way

Once the Groovy Postbuild Plugin has been successfully installed and Hudson restarted, you can go ahead and download the SonarReports package and extract it to ${HUDSON_HOME}, the home directory of the Hudson server (e.g. the folder .hudson under the user’s home directory on Windows systems). This zip file contains the file SonarReports.groovy under scripts/groovy, which will be created under ${HUDSON_HOME} after expansion.

Hudson Job Configuration

To facilitate reuse of our Hudson configuration for Sonar, we will first create a Sonar Metrics job to be used as a template. We can then create a new job for each project we wish to create Sonar reports for by simply copying this job template.

In the Sonar Metrics job, we first create the necessary parameters that will be used as thresholds and validated by our Groovy script. To this end, we select the checkbox This build is parameterized under the job’s configuration. We then configure the parameters as shown below, where we have provided the corresponding screenshots:

  • projectName: project name that will appear in emails sent from Hudson.
  • sonarProjectId: internal project ID used by Sonar.
  • sonarUrl: URL for the Sonar server.
  • emailRecipients: email addresses for recipients of Sonar metrics summary.
  • rulesComplianceThreshold: minimum percentage of rule compliance for validating a build. A value of false means this metric will not be enforced.
  • blockerThreshold: maximum number of blocker violations for validating a build. A value of false means this metric will not be enforced.
  • criticalThreshold: maximum number of critical violations for validating a build. A value of false means this metric will not be enforced.
  • majorThreshold: maximum number of major violations for validating a build. A value of false means this metric will not be enforced.
  • codeCoverageThreshold: minimum percentage of code coverage for unit tests for validating a build. A value of false means this metric will not be enforced.
  • testSuccessThreshold: minimum percentage of successful unit tests for validating a build. A value of false means this metric will not be enforced.
  • violationsThreshold: maximum number of violations of all type for validating a build. A value of false means this metric will not be enforced.
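To make the enforcement semantics concrete, here is a minimal sketch of the kind of check the script performs, written in Python for illustration only (the real SonarReports script is Groovy, and the metric key names used here are assumptions, not Sonar's actual identifiers):

```python
def check_thresholds(metrics, thresholds):
    """Validate quality metrics against configured thresholds.

    A threshold value of "false" means the metric is not enforced.
    Returns a list of human-readable violation messages (empty = build OK).
    """
    violations = []

    def enforced(name):
        value = thresholds.get(name, "false")
        return None if value == "false" else float(value)

    # Metrics where the build must stay at or ABOVE the threshold
    for name, metric in [("rulesComplianceThreshold", "rules_compliance"),
                         ("codeCoverageThreshold", "coverage"),
                         ("testSuccessThreshold", "test_success")]:
        limit = enforced(name)
        if limit is not None and metrics[metric] < limit:
            violations.append(f"{metric} {metrics[metric]} < minimum {limit}")

    # Metrics where the build must stay at or BELOW the threshold
    for name, metric in [("blockerThreshold", "blocker_violations"),
                         ("criticalThreshold", "critical_violations"),
                         ("majorThreshold", "major_violations"),
                         ("violationsThreshold", "violations")]:
        limit = enforced(name)
        if limit is not None and metrics[metric] > limit:
            violations.append(f"{metric} {metrics[metric]} > maximum {limit}")

    return violations
```

A value of “false” disables a check, matching the parameter descriptions above; any non-empty result would mark the build as failed.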

Finally, we enable the Groovy Postbuild plugin by selecting the corresponding checkbox under the Post-build Actions section of the job configuration page. In the text box, we include the following Groovy code to call into our script:

sonarReportsScript = "${System.getProperty('HUDSON_HOME')}/scripts/groovy/SonarReports.groovy"
shell = new GroovyShell(getBinding())
println "Executing script for Sonar report generation from ${sonarReportsScript}"
shell.evaluate(new File(sonarReportsScript))

Your Hudson configuration page should look like this:

Generating Sonar Reports

In order to automatically generate Sonar reports, we can configure our Hudson job to build periodically (e.g. daily) by selecting this option under Build Triggers. The job will then execute with the specified frequency, using the default quality thresholds we configured in the job’s parameters.

It is also possible to run the job manually to generate reports on demand at any time. In this case, Hudson will ask for the value of the threshold parameters that will be passed in to our Groovy script. These values will override the default values specified in the job’s configuration. Here is an example:

Verifying Quality Control Metrics

When the Hudson job runs, our Groovy script will verify that any thresholds defined in the job’s configuration are met by the project metrics extracted from Sonar. If the thresholds are met, the build will succeed and a summary of the quality control metrics will appear in the Hudson build. In addition, a summary email will be sent to the recipient list emailRecipients, providing interested parties with information regarding the key analyzed metrics.

On the other hand, if the thresholds are not met, the build will be marked as failed and the metric violation described in the Hudson build. Similarly, an email will be sent out informing recipients of the quality control violation.

Conclusion

This article demonstrates how Hudson can be extended with the use of dynamic programming languages like Groovy. In our example, we have created a Hudson job that verifies quality control metrics generated by Sonar and automatically sends quality reports by email. This type of functionality is useful in continuous integration environments, in order to extend the default features provided by Hudson or Sonar to meet custom needs.

Intellectual Property (IPR) Management and Monitoring Tools

It seems that every day projects have more and more dependencies on libraries (internal or external) and, of course, many of these depend on other libraries, resulting in a large dependency tree for any given project. How do you know if any of those libraries contain some code which is licensed in a way that is incompatible with your company’s policies e.g. no GPL?

BT (the former British Telecom) apparently didn’t and ended up having to publish all the code used in one of the routers it distributes due to a GPL violation.

To give you an idea of the scale of this problem, a quick search of my local Maven repository reveals that it contains 1760 JAR files. Admittedly these do not all belong to one single project; they are perhaps spread out over 20 different projects. Either way, it is pretty infeasible to manage license checking for that many libraries manually.

Tools like Maven are a great help for managing the dependency trees in your project, but they don’t help much with checking the licenses that each dependency uses. The pom.xml file permits the use of a <license> element, but it is optional; many libraries either don’t use Maven or don’t specify the license, and you have to check compliance manually in any case.
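As a rough illustration of how little the POM metadata alone can tell you, the following Python sketch walks a local Maven repository and lists the POMs that declare no <license> element at all (the function name is made up for the example):

```python
import pathlib
import xml.etree.ElementTree as ET

# Maven 4.0.0 POMs usually carry this XML namespace
POM_NS = "{http://maven.apache.org/POM/4.0.0}"

def find_unlicensed_poms(repo_dir):
    """Walk a local Maven repository and report POMs with no <license> entry.

    This only catches missing metadata; a declared license still has to be
    checked against your company's policy by other means.
    """
    unlicensed = []
    for pom in pathlib.Path(repo_dir).rglob("*.pom"):
        root = ET.parse(pom).getroot()
        # <licenses><license>...</license></licenses>, possibly namespaced
        licenses = (root.findall(f"{POM_NS}licenses/{POM_NS}license")
                    or root.findall("licenses/license"))
        if not licenses:
            unlicensed.append(pom.name)
    return sorted(unlicensed)
```

Anything this flags still needs a manual license check, and anything it doesn’t flag still needs its declared license validated against policy, which is exactly the gap the tools below fill.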

This is where IPR monitoring tools come in. Such tools allow the definition of licensing policies at an organizational level and provide mechanisms to monitor compliance with these policies in software projects, raising alerts on detected violations.

We recently had to take a look at such tools for one of our clients. After studying the market, we discovered that there are currently no open-source solutions covering this problem domain, but several commercial tools address the problem of continuous IPR monitoring.

For reference purposes, here is a list of the providers that we discovered:

IPR Management Tool                        Site
Palamida Compliance Edition                http://www.palamida.com
Black Duck Protex                          http://www.blackducksoftware.com/protex
Protecode                                  http://www.protecode.com
HiSoftware AccVerify                       http://www.hisoftware.com
OpenLogic Library or Enterprise Edition    http://www.openlogic.com

All of these commercial products offer common features:

  • Automated binary and source code analysis with multi-language support (Java, C/C++, C#,
    Visual Basic, Perl, Python, PHP). The analysis is performed against an external proprietary
    database that contains the code of most open-source products.
  • Workflows to control the IPR of software projects through the whole lifecycle, based on
    defined licensing policies.
  • License approval/disapproval mechanisms, as well as bills of materials for software
    releases summarizing components, licenses, approval status and license/policy violations.
  • Different levels of code fragment recognition to detect reuse of code.
  • User interfaces offering policy management, reporting and dashboard features.
  • Support for integrating code scans into Continuous Integration platforms via command-line
    execution.

We think that these products are going to become increasingly important: the total number of libraries used in projects shows no sign of decreasing, and there will always be a need to protect intellectual property.

The Server Labs open sources their Maven utPLSQL plugin

Following on from my post the other day, I’m very happy to announce that we have released the source code for our Maven utPLSQL plugin under an Apache 2.0 license.

The code (and downloads for the latest version of the plugin) is published on the Google Code website and is available from the following URL:

http://code.google.com/p/maven-utplsql-plugin/

We hope that the publication of this source code will enable more people to test their PL/SQL code regularly using continuous integration, thereby improving the quality of their code.

If anyone is interested in contributing an enhancement or bug fix, they are more than welcome.

Continuous Integration with Oracle PL/SQL, utPLSQL and Hudson

PL/SQL is not exactly the coolest technology around these days, but many businesses rely on it for their day-to-day operation. The fact that it is relatively old does not mean that you cannot apply Extreme Programming (XP) practices to PL/SQL development. A few years ago, Steven Feuerstein developed utPLSQL, a unit testing framework for PL/SQL, which covers the testing part of XP. In this post, I will show you how to apply Continuous Integration (another XP practice) to PL/SQL development using Hudson.

This article will explain how to construct a Maven project that enables you to do the following by executing just a single Maven command:

  • Deploy the project database schema to an Oracle database
  • Install the project PL/SQL packages to an Oracle database
  • Insert data specified in SQL files into the tables in the database
  • Install the utPLSQL test packages to an Oracle database
  • Run the utPLSQL unit tests

UPDATE: Since this article was written, we have open-sourced the Maven utPLSQL plugin.

Additionally, we will configure a Hudson project that performs all of the above when you submit a change to the source code repository (Subversion in this case) and generates a visual report showing the test results. If any test fails, Hudson can be configured to send an email to the project developers so that they can immediately investigate and fix the problem, in line with common continuous integration practices.

The Oracle database

This post assumes that you have an Oracle database available. I used an Oracle 10g XE instance running on localhost to develop this post. XE is free to download and is sufficient for development purposes.

I created a user called “testing” with password “testing” in the database and also installed the utPLSQL schema, which is a necessary step to follow this example.

NOTE: If you install the utPLSQL schema into an Oracle 10g XE database, make sure that you have granted access on UTL_FILE to public as explained here (remember to connect as sysdba when doing this otherwise it won’t work).

A PL/SQL Project in Maven

Maven is a build tool which has a large number of plugins available and is very well integrated with Hudson, so it is an ideal choice for constructing our PL/SQL project. If you have an existing project, it is not hard to move it to a Maven-compatible structure.

In this post, we will work with a PL/SQL test project. You can download the source code for this project here. Once you’ve downloaded the code, copy it to a folder on your machine and unzip it. The example follows the Maven standard directory layout, which I have customized to suit PL/SQL development:

  • /src/main/plsql/ – PL/SQL source code (packages, functions)
  • /src/test/plsql/ – PL/SQL unit tests (written in utPLSQL)
  • /src/main/sql/ – SQL scripts for creating the schema
  • /src/main/resources/data/ – SQL scripts for inserting schema data

The source code included in the project is a modified version of two examples that come with the utPLSQL package and is not really too important here; what matters is that it is code for which some utPLSQL tests exist.

Adding artifacts to the local Maven repository

NOTE: If you have not already installed Maven, follow these instructions.

Maven generally downloads the libraries that it needs from the internet. However, there are two libraries used in this post which are not available in any public Maven repository, so we will have to install them by hand. One is the Oracle JDBC driver, which is not publicly available due to Oracle’s licensing restrictions, and the second is the plugin that The Server Labs have developed for running utPLSQL tests, which we have not (yet at least) made publicly available. Below are the instructions that you must follow to install these two artifacts before continuing:

The Maven utPLSQL plugin developed by The Server Labs

  • Download the plugin here and the required pom.xml file here
  • Execute the following in a command console to install the plugin. Note that in this case I downloaded the plugin to /tmp/maven-utplsql-plugin-10-snapshot1.jar and the pom.xml file to /tmp/pom1.xml. Change this to suit your environment.
    mvn install:install-file -Dfile=/tmp/maven-utplsql-plugin-10-snapshot1.jar -DpomFile=/tmp/pom1.xml
    

Oracle JDBC driver

  • In the machine on which you have installed Oracle, go to the product/10.2.0/server/jdbc/lib directory of the Oracle Home. On my Ubuntu machine with Oracle 10g XE installed, this is /usr/lib/oracle/xe/app/oracle/product/10.2.0/server/jdbc/lib.
  • Copy the ojdbc14.jar file to somewhere convenient on your machine e.g. /tmp
  • Execute the following in a command console to install the JDBC driver in the local Maven repository. Note that you should change /tmp/ojdbc14.jar to the path to which you stored the JAR in the step above.
    mvn install:install-file -Dfile=/tmp/ojdbc14.jar -DgroupId=com.oracle -DartifactId=ojdbc14 -Dversion=9.0.2.0.0 -Dpackaging=jar -DgeneratePom=true
    

Configuring the PL/SQL project

In Maven, build information is contained in the pom.xml file which is in the root of the example project. If you open this file and take a look at it, you will see that there are two maven plugins configured which do most of the work:

Maven SQL Plugin

This plugin installs the database schema, PL/SQL packages, data and PL/SQL test packages in the specified Oracle database when you run the maven test command. It relies on being able to find the PL/SQL packages, SQL files etc. in the places specified in the directory structure above. The database connection details are specified in the main <configuration> element – change these to suit your environment. Although the configuration initially looks quite complicated, it really is just the same thing repeated various times and is easy to understand once you get used to it.

      
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>sql-maven-plugin</artifactId>
    <version>1.3</version>

    <dependencies>
        <dependency>
            <groupId>com.oracle</groupId>
            <artifactId>ojdbc14</artifactId>
            <version>9.0.2.0.0</version>
        </dependency>
    </dependencies>

    <configuration>
        <driver>oracle.jdbc.driver.OracleDriver</driver>
        <url>jdbc:oracle:thin:@localhost:1521:xe</url>
        <username>testing</username>
        <password>testing</password>
        <delimiter>/</delimiter>
        <delimiterType>row</delimiterType>
        <autocommit>true</autocommit>
    </configuration>

    <executions>
        <execution>
            <id>create-schema</id>
            <phase>process-test-resources</phase>
            <goals>
                <goal>execute</goal>
            </goals>
            <configuration>
                <orderFile>ascending</orderFile>
                <delimiter>;</delimiter>
                <delimiterType>normal</delimiterType>
                <onError>continue</onError>
                <autocommit>false</autocommit>
                <fileset>
                    <basedir>src/main/sql</basedir>
                    <includes>
                        <include>**/*.sql</include>
                    </includes>
                </fileset>
            </configuration>
        </execution>

        <execution>
            <id>create-plsql-packages</id>
            <phase>process-test-resources</phase>
            <goals>
                <goal>execute</goal>
            </goals>
            <configuration>
                <orderFile>ascending</orderFile>
                <fileset>
                    <basedir>src/main/plsql</basedir>
                    <includes>
                        <include>**/*.pks</include>
                        <include>**/*.pkb</include>
                        <include>**/*.sf</include>
                    </includes>
                </fileset>
            </configuration>
        </execution>

        <execution>
            <id>insert-data</id>
            <phase>process-test-resources</phase>
            <goals>
                <goal>execute</goal>
            </goals>
            <configuration>
                <orderFile>ascending</orderFile>
                <delimiter>;</delimiter>
                <delimiterType>normal</delimiterType>
                <onError>continue</onError>
                <autocommit>false</autocommit>
                <fileset>
                    <basedir>src/main/resources/data</basedir>
                    <includes>
                        <include>**/*.sql</include>
                    </includes>
                </fileset>
            </configuration>
        </execution>

        <execution>
            <id>create-plsql-test-packages</id>
            <phase>process-test-resources</phase>
            <goals>
                <goal>execute</goal>
            </goals>
            <configuration>
                <orderFile>ascending</orderFile>
                <fileset>
                    <basedir>src/test/plsql</basedir>
                    <includes>
                        <include>**/*.pks</include>
                        <include>**/*.pkb</include>
                        <include>**/*.sf</include>
                        <include>**/*.sp</include>
                    </includes>
                </fileset>
            </configuration>
        </execution>
    </executions>
</plugin>

The Maven utPLSQL plugin

This is a plugin written by The Server Labs for executing utPLSQL tests from Maven. It is configured in a similar way to the Maven SQL plugin shown above. You must specify the dependency that the plugin has on the Oracle JDBC driver and the details to connect to the Oracle testing user.

The plugin allows you to specify either <packageName>xxx</packageName> or <testSuiteName>yyy</testSuiteName>. You must specify one or the other but not both. Which you specify depends on whether you want to run a utPLSQL test package or test suite. The example below specifies that we should run the ‘All’ utPLSQL test suite.

When this plugin runs, it runs the utPLSQL test suite or package and retrieves the results from the Oracle database (it looks up the results that utPLSQL stores in tables). The report that it generates is compatible with the unit test reports created by the Maven Surefire plugin.
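To give an idea of what “compatible with the unit test reports created by the Maven Surefire plugin” means in practice: a Surefire report is an XML file with a <testsuite> root and one <testcase> element per test. This Python sketch (illustrative only; the actual plugin is written in Java and records more attributes, such as timings) produces the basic shape:

```python
import xml.etree.ElementTree as ET

def surefire_report(suite_name, results):
    """Build a Surefire-style XML report from (test_name, failure_message) pairs.

    failure_message is None for a passing test. Returns the XML as a string.
    """
    failures = sum(1 for _, msg in results if msg is not None)
    suite = ET.Element("testsuite", name=suite_name,
                       tests=str(len(results)), failures=str(failures),
                       errors="0", skipped="0")
    for test_name, msg in results:
        case = ET.SubElement(suite, "testcase", classname=suite_name, name=test_name)
        if msg is not None:
            # Failed tests get a nested <failure> element with the message
            failure = ET.SubElement(case, "failure", message=msg)
            failure.text = msg
    return ET.tostring(suite, encoding="unicode")
```

Because Hudson understands this format natively, the utPLSQL results show up in the same test-result views as ordinary JUnit tests.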

      
<plugin>
    <groupId>com.theserverlabs.maven.utplsql</groupId>
    <artifactId>maven-utplsql-plugin</artifactId>
    <version>1.0-SNAPSHOT</version>

    <dependencies>
        <dependency>
            <groupId>com.oracle</groupId>
            <artifactId>ojdbc14</artifactId>
            <version>9.0.2.0.0</version>
        </dependency>
    </dependencies>

    <configuration>
        <driver>oracle.jdbc.driver.OracleDriver</driver>
        <url>jdbc:oracle:thin:@localhost:1521:xe</url>
        <username>testing</username>
        <password>testing</password>
        <!-- specify either <packageName> or <testSuiteName>, but not both -->
        <testSuiteName>All</testSuiteName>
    </configuration>
    <executions>
        <execution>
            <id>run-plsql-test-packages</id>
            <phase>process-test-resources</phase>
            <goals>
                <goal>execute</goal>
            </goals>
        </execution>
    </executions>
</plugin>

Building the PL/SQL project

Open a command console and navigate to the folder which contains the downloaded project (the folder with pom.xml in it). Execute the following:

mvn test

You should see some output like this:

[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Building PL SQL test project
[INFO]    task-segment: [test]
[INFO] ------------------------------------------------------------------------
[INFO] [resources:resources]
[INFO] Using default encoding to copy filtered resources.
[INFO] [compiler:compile]
[INFO] No sources to compile
[INFO] [resources:testResources]
[INFO] Using default encoding to copy filtered resources.
[INFO] [sql:execute {execution: create-schema}]
[INFO] Executing file: /media/WINDOWS/work/utplsql/plsqltest-svn/src/main/sql/mybooks/schema.sql
[INFO] 3 of 3 SQL statements executed successfully
[INFO] [sql:execute {execution: create-plsql-packages}]
[INFO] Executing file: /media/WINDOWS/work/utplsql/plsqltest-svn/src/main/plsql/betwnstr.sf
[INFO] Executing file: /media/WINDOWS/work/utplsql/plsqltest-svn/src/main/plsql/mybooks/mybooks.pkb
[INFO] Executing file: /media/WINDOWS/work/utplsql/plsqltest-svn/src/main/plsql/mybooks/mybooks.pks
[INFO] 3 of 3 SQL statements executed successfully
[INFO] [sql:execute {execution: insert-data}]
[INFO] Executing file: /media/WINDOWS/work/utplsql/plsqltest-svn/src/main/resources/data/mybooks.sql
[INFO] 6 of 6 SQL statements executed successfully
[INFO] [sql:execute {execution: create-plsql-test-packages}]
[INFO] Executing file: /media/WINDOWS/work/utplsql/plsqltest-svn/src/test/plsql/mybooks/ut_mybooks.pkb
[INFO] Executing file: /media/WINDOWS/work/utplsql/plsqltest-svn/src/test/plsql/mybooks/ut_mybooks.pks
[INFO] Executing file: /media/WINDOWS/work/utplsql/plsqltest-svn/src/test/plsql/suite.sp
[INFO] Executing file: /media/WINDOWS/work/utplsql/plsqltest-svn/src/test/plsql/ut_betwnstr.pkb
[INFO] Executing file: /media/WINDOWS/work/utplsql/plsqltest-svn/src/test/plsql/ut_betwnstr.pks
[INFO] 5 of 5 SQL statements executed successfully
[INFO] [utplsql:execute {execution: run-plsql-test-packages}]
[INFO] using JDBC driver : oracle.jdbc.driver.OracleDriver
[INFO] Running UTPLSQL test suite All
[INFO] Creating test report for run 134
[INFO] Creating test report for run 133
[INFO] [compiler:testCompile]
[INFO] No sources to compile
[INFO] [surefire:test]
[INFO] No tests to run.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESSFUL
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3 seconds
[INFO] Finished at: Wed May 13 17:59:08 CEST 2009
[INFO] Final Memory: 16M/295M
[INFO] ------------------------------------------------------------------------

If you connect to your database as the “testing” user, you should see that new tables and packages have been created. Below is an example of how my schema looks in SQL Developer:

Artifacts in SQL Developer


Adding Continuous Integration with Hudson

Although it is great that we can install the schema and run PL/SQL unit tests with just one Maven command, we can add even more value by building this project in Hudson – a continuous integration system. We will configure Hudson to detect changes to the source code in the Subversion repository, re-installing the database artifacts and running the unit tests each time a change is found. This helps ensure that the source code in the repository always compiles and passes all the unit tests, and highlights bugs and integration problems earlier in the development process, when they are cheaper to fix.

Adding the source code to Subversion

This post assumes that you already have a Subversion repository available. If you don’t, you can read how to get and install Subversion here. In the commands below, replace http://subversion.company.com/svn/repo/ with the URL of your Subversion repository. Our repositories are organised following the recommended Subversion repository layout, so the source code for the main branch is always in the folder /trunk.

To import the source code into the repository, execute the following commands in a command console on a machine with the Subversion client installed, adapting them for your platform and environment. Here I assume that you downloaded and unzipped the source code for the PL/SQL test project to /tmp/plsql-test-project.

svn mkdir http://subversion.company.com/svn/repo/trunk/plsql-test-project/
cd /tmp/plsql-test-project/
svn import . http://subversion.company.com/svn/repo/trunk/plsql-test-project/

Once you’ve done the import, check that you can see the source code in a web browser by going to http://subversion.company.com/svn/repo/trunk/plsql-test-project/.

Installing Hudson

Hudson is very easy to install. Go to the Hudson homepage and click on the link to download the latest version of hudson.war. Then, in a console, go to the directory to which you downloaded hudson.war and run the following command:

java -jar hudson.war --httpPort=9080

This starts up Hudson so that it is available on port 9080 instead of the standard 8080, which tends to collide with other applications (for example Oracle XE if you have it installed).

In a web browser, go to http://localhost:9080/ to view the hudson homepage. Follow the steps below to create and configure the Hudson job to build the PL/SQL project:

  • Specify the Maven home in Manage Hudson > Configure System
  • Click on the “New Job” link in the top-left corner
  • Enter a job name and select “build maven2 project”
  • Click OK to create the job and you should be taken to the job configuration page.
  • In the Source Code Management section, click on Subversion
  • Enter the Subversion Repository URL to which you uploaded the project code i.e. http://subversion.company.com/svn/repo/trunk/plsql-test-project/
  • Enter the username/password to access Subversion if Hudson prompts you for them
  • In the Build Triggers section, choose “Poll SCM” and enter */10 * * * * (poll every 10 minutes) in the scheduling box
  • In the ‘Goals and options’ part of the build section, enter “test” (without the quotes)
  • Click on Save

To test the build, click on the “Build Now” link in Hudson. Once the build has finished, you should see a new build in the “Build History” list of builds for the job. It should have a yellow circle next to it, indicating that the build completed ok but that there was a unit test failure. Click on the link for the build and you should see a screen like that below:

Hudson Job Test Failure screen


Note that at the centre of the screen, it says “Test Result (1 failure)”. Click on the failure (BETWNSTR.UT_BETWNSTR) to see the error message – EQ “zero start” Expected “abc” and got “ab”. Or, instead, click on the “Test Result” link to see the entire test report. This should show that 16 tests were executed, with 1 failure. These are all the tests that are included in the “All” utPLSQL package.

If we configure an email server in Hudson (out of the scope of this post), we can receive email notifications if the build fails, either because of a build problem or because there is a unit test failure.

Fixing the test failure

To get rid of the test failure, we will modify the project source code. Firstly, we must check it out from Subversion, using the commands below (which you should customize to your environment):

mkdir /tmp/plsql-checkout
cd /tmp/plsql-checkout
svn co http://subversion.company.com/svn/repo/trunk/plsql-test-project/ .

Edit line 46 of the file src/test/plsql/ut_betwnstr.pkb and change the string “abc” to “ab”. Then check the file into Subversion using the following command:

svn ci src/test/plsql/ut_betwnstr.pkb -m "fixed bug"

At this point you can either wait up to 10 minutes for Hudson to detect your change in Subversion or, if you are impatient (like me!), you can click on the “build now” link in the Hudson job to force Hudson to pick up the change in Subversion and rebuild the project. When it executes the tests, there should now be no failures.

Conclusion

This is a long post with a lot of steps in it but most of them are setup-related. Once you have this set up, you’ll see that it is really easy to do unit testing and continuous integration with PL/SQL. This should help reduce integration time and increase the quality of the code deployed. It’s also worth mentioning that once you start using Hudson, you can take advantage of the vast number of Hudson plugins available. For example, you could update your bug and issue tracking system automatically when you check in a fix for a bug via one of the Hudson plugins.

Performance Tests with Jmeter, Maven and Hudson

Continuing with the series of blog posts regarding testing, automation and continuous integration, this time I will talk about how to integrate performance tests, in this case using Jmeter, with Maven and the Hudson continuous integration system. Jmeter is one of the main tools we use in our projects to create relevant performance tests, and automation and integration into our CI systems are essential. We also use SoapUI and Grinder depending on the platform, but we will cover those in future posts.

To integrate Jmeter and Maven, you must use the Maven Jmeter plugin, the first version of which is officially hosted here. There are several posts discussing the use of this plugin that I have used as reference, particularly this one from James Lorenzen.

An updated version of this original plugin was released on Google Code by Ronald Alleva. It adds support for the latest version of Jmeter (2.3.2) and contains some additional enhancements such as parameterisation support. I decided to use this version to get this extra functionality.

To make this post as simple and easy to follow as possible, my idea was to test Jmeter test automation and integration against a web application running in an embedded Jetty container.

My objectives were:

  • Be able to run my Jmeter tests in a specific maven phase.
  • Create the relevant Jmeter test results html reports and publish them during the maven site phase.
  • Integrate Jmeter execution and reporting with Hudson, via the Hudson Jmeter plugin available here.

This post is a combination of James and Ron’s posts, adding solutions to problems I discovered while creating and executing my tests and showing the integration with Hudson.

Ronald gives a good description here of the different steps required to make the plugin work. Rather than repeat those steps, I will just comment on them.

  • Download the support jars and the maven Jmeter plugin and install them.
  • Add the dependency to your maven project. In our case, we want to execute our Jmeter tests when we start the embedded Jetty container. We usually start/stop Jetty as part of the pre-integration-test and post-integration-test maven phases. We just need to add the Jmeter plugin execution to the integration-test phase (I have also included the Jetty part below):

   
<profile>
    <id>env-dev-jetty</id>
    <activation>
        <activeByDefault>true</activeByDefault>
    </activation>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.jmeter</groupId>
                <artifactId>maven-jmeter-plugin</artifactId>
                <version>1.0</version>
                <executions>
                    <execution>
                        <id>jmeter-tests</id>
                        <phase>integration-test</phase>
                        <goals>
                            <goal>jmeter</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <reportDir>${project.build.directory}/jmeter-reports</reportDir>
                    <jmeterUserProperties>
                        <hostname>localhost</hostname>
                        <port>9080</port>
                    </jmeterUserProperties>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.mortbay.jetty</groupId>
                <artifactId>maven-jetty-plugin</artifactId>
                <version>6.1.12rc1</version>
                <configuration>
                    <webApp>${basedir}/target/${project.build.finalName}/${project.parent.name}-webapp-${project.version}.war</webApp>
                    <contextPath>${project.parent.name}-webapp</contextPath>
                    <connectors>
                        <connector implementation="org.mortbay.jetty.nio.SelectChannelConnector">
                            <port>9080</port>
                            <maxIdleTime>60000</maxIdleTime>
                        </connector>
                    </connectors>
                    <systemProperties>
                        <systemProperty>
                            <name>pmtool.application.environment</name>
                            <value>dev</value>
                        </systemProperty>
                    </systemProperties>
                    <stopPort>9966</stopPort>
                    <stopKey>foo</stopKey>
                    <daemon>true</daemon>
                    <scanIntervalSeconds>0</scanIntervalSeconds>
                </configuration>
                <executions>
                    <execution>
                        <id>start-jetty</id>
                        <phase>pre-integration-test</phase>
                        <goals>
                            <goal>run-war</goal>
                        </goals>
                    </execution>
                    <execution>
                        <id>stop-jetty</id>
                        <phase>post-integration-test</phase>
                        <goals>
                            <goal>stop</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>


In a real scenario where, as part of your project cycle, the application is deployed to a development or pre-production environment, we could use the verify phase to execute the performance tests.

  • Also, in case you want to generate html reports for publication on the project site generated by Maven, execute the XSLT transformation from xml to html provided by Jmeter using the xml-maven-plugin:

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>xml-maven-plugin</artifactId>
    <version>1.0-beta-2</version>
    <executions>
        <execution>
            <phase>pre-site</phase>
            <goals>
                <goal>transform</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <transformationSets>
            <transformationSet>
                <dir>${project.build.directory}/jmeter-reports</dir>
                <stylesheet>src/test/resources/jmeter-results-detail-report_21.xsl</stylesheet>
                <outputDir>${project.build.directory}/site/jmeter-results</outputDir>
                <fileMappers>
                    <fileMapper implementation="org.codehaus.plexus.components.io.filemappers.FileExtensionMapper">
                        <targetExtension>html</targetExtension>
                    </fileMapper>
                </fileMappers>
            </transformationSet>
        </transformationSets>
    </configuration>
</plugin>
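The XSL stylesheet simply aggregates the raw JTL results into an html table. To get a feel for the information involved, this Python sketch (not part of the build; it assumes the JMeter 2.x JTL format with <httpSample> elements) reduces a results document to a per-label summary:

```python
import xml.etree.ElementTree as ET

def jtl_summary_html(jtl_xml):
    """Summarize a JMeter JTL results document as a small HTML table.

    Reads each <httpSample>: t = elapsed ms, s = success flag, lb = label.
    """
    root = ET.fromstring(jtl_xml)
    rows = {}
    for sample in root.iter("httpSample"):
        label = sample.get("lb", "?")
        stats = rows.setdefault(label, {"count": 0, "errors": 0, "total_ms": 0})
        stats["count"] += 1
        stats["total_ms"] += int(sample.get("t", "0"))
        if sample.get("s") != "true":
            stats["errors"] += 1
    html = ["<table><tr><th>Label</th><th>Samples</th><th>Errors</th><th>Avg ms</th></tr>"]
    for label, s in sorted(rows.items()):
        avg = s["total_ms"] // s["count"]
        html.append(f"<tr><td>{label}</td><td>{s['count']}</td>"
                    f"<td>{s['errors']}</td><td>{avg}</td></tr>")
    html.append("</table>")
    return "\n".join(html)
```

The real stylesheet produces a much richer report, but the input data is the same: one sample element per request, with elapsed time, success flag and label.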

  • Create some Jmeter tests. I created 2 Jmeter tests, PerfTest1 and PerfTest2, which are parameterized so that different hostnames and ports can be specified for the tests. The maven profile “env-dev-jetty” (which is the default used) defines the Jmeter parameters for Jetty in the “jmeterUserProperties” section. We have a separate profile to run Jmeter against the application deployed in a weblogic container.
  • Copy the jmeter.properties file to the “/src/test/jmeter” folder in your project and modify the properties to fit your needs. For example, I had to modify some “jmeter.save.saveservice” parameters to enable additional output and also some “log_level” settings to see a bit more detail in the jmeter.log file.
  • Execute mvn to run the tests.
mvn integration-test

Up to this point everything was OK, apart from the fact that when running in Maven, the Jmeter tests were hanging just after finishing correctly. After some investigation, I found that the cause was the exit logic implemented in the maven Jmeter plugin. The plugin exits when the number of threads drops to the number recorded just before the Jmeter start call. This works in most cases, but when you are also running an embedded Jetty server, the threads spawned to service the requests triggered by the Jmeter tests are counted as well, causing the wait. The plugin eventually exits when all these connection threads are closed after timing out.

The solution was to change the exit logic, monitoring the jmeter.log file as described here, instead of monitoring the number of threads. This should work in most cases.
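The log-monitoring exit logic can be sketched as follows (a simplified Python illustration; the actual plugin is Java, and the marker string is an assumption rather than the exact text Jmeter writes at the end of a run):

```python
import time

def wait_for_test_end(read_log, marker="Test has ended", timeout=60.0, poll=0.5):
    """Poll a log source until the end-of-run marker appears.

    read_log is a callable returning the log contents seen so far; it stands
    in for re-reading jmeter.log. Returns True if the marker showed up in time.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if marker in read_log():
            return True
        time.sleep(poll)  # avoid busy-waiting between checks
    return False
```

Unlike thread counting, this approach is unaffected by whatever other threads (such as Jetty connection handlers) happen to be alive in the JVM.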

  • Integrate into Hudson, the continuous integration system that we use. For that, I installed the Jmeter Hudson plugin and configured it as shown below:
Hudson Jmeter Plugin Configuration


  • In order to keep the Jmeter detailed performance test results, I configured Hudson to archive the relevant reports as build artifacts.
  • The Hudson and Maven Jmeter plugins have different naming conventions for the report files. The maven Jmeter plugin generates the report filename with a date, which is not compatible with the static file name required by the Hudson Jmeter plugin. Solution: I modified the maven plugin source code to generate a results file without the date in it (e.g. from PerfTest1-090413.xml to PerfTest1-result.xml). There is no disadvantage, as Hudson will store the results for each build as an artifact, keeping a history.
  • The Hudson plugin doesn’t really support collecting information about multiple Jmeter Tests at this moment. The way to go would be to group different performance tests into a single Jmeter file and create separate Hudson jobs for them.
  • After all this setup has been done, you can execute some builds and check that the Jmeter execution graphs work as they should. I created an example, containing errors and different response times, to make it pretty. The graphs show the trend for only one of the Jmeter tests.
Hudson Jmeter Test Execution Trend


  • Note the section “Last Successful Artifacts” that contains the html reports created during the maven site creation and archived for future reference. An example of this detail report is:
Jmeter Detailed report as Hudson Artifact
