Java 7 Code Coverage with Gradle and Jacoco

Thanks

Steven Dicks’ post on Jacoco and Gradle is a great start to integrating the two; this post is a small iteration on top of that work.

Java 7 Code Coverage

The state of code coverage took a serious turn for the worse when Java 7 came out. The bytecode changes in Java 7 effectively made Emma and Cobertura defunct; they will not work with Java 7 constructs. Fortunately there is a new player in town called JaCoCo (for Java Code Coverage). JaCoCo is the successor to Emma, built on the knowledge the Eclipse and Emma teams have gained over the years about how best to do code coverage, and it works with Java 7 out of the box.

The advantage of using established tools is that they are generally well supported across your toolchain. JaCoCo is fairly new, so support in Gradle isn’t so smooth yet. Fortunately Steven’s post got me started down the right path. The one thing that I wanted to improve right away was to use transitive dependency declarations instead of keeping local jar files in my source repository. JaCoCo is now available in the Maven repos, so we can do that. One thing to note is that the default artifacts published to the Maven repo are Eclipse plugins, so we need to reference the “runtime” classifier in our dependency declaration.

The Gradle Script

configurations {
    codeCoverage
    codeCoverageAnt
}
dependencies {
    codeCoverage 'org.jacoco:org.jacoco.agent:0.5.10.201208310627:runtime@jar'
    codeCoverageAnt 'org.jacoco:org.jacoco.ant:0.5.10.201208310627'
}
test {
    systemProperties = System.properties
    jvmArgs "-javaagent:${configurations.codeCoverage.asPath}=destfile=${project.buildDir.path}/coverage-results/jacoco.exec,sessionid=HSServ,append=false",
            '-Djacoco=true',
            '-Xms128m',
            '-Xmx512m',
            '-XX:MaxPermSize=128m'
}
task generateCoverageReport << {
    ant {
        taskdef(name:'jacocoreport', classname: 'org.jacoco.ant.ReportTask', classpath: configurations.codeCoverageAnt.asPath)
 
        mkdir dir: "build/reports/coverage"
 
        jacocoreport {
            executiondata {
                fileset(dir: "build/coverage-results") {
                    include name: 'jacoco.exec'
                }
            }
            structure(name: project.name) {
                classfiles {
                    fileset(dir: "build/classes/main") {
                        exclude name: 'org/ifxforum/**/*'
                        exclude name: 'org/gca/euronet/generated**/*'
                    }
                }
                sourcefiles(encoding: 'CP1252') {
                    fileset dir: "src/main/java"
                }
            }
 
            xml  destfile: "build/reports/coverage/jacoco.xml"
            html destdir:  "build/reports/coverage"
        }
    }
}

A Few Details

The magic is in the jvmArgs of the test block. JaCoCo runs as a Java agent, using the runtime instrumentation feature added in Java 6 to inspect the running code. Extra arguments can be passed to JaCoCo there, including things like excludes to exclude specific classes from coverage. The available parameters are the same as those of the Maven JaCoCo plugin.
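For instance, here is a minimal sketch of what excluding generated packages from instrumentation might look like (the package patterns are illustrative; agent options are comma-separated and individual exclude patterns are colon-separated):

test {
    // Same agent string as above, with an illustrative excludes option added
    jvmArgs "-javaagent:${configurations.codeCoverage.asPath}=destfile=${project.buildDir.path}/coverage-results/jacoco.exec,excludes=org.ifxforum.*:org.gca.euronet.*,append=false"
}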

The generateCoverageReport task converts the jacoco.exec binary into html files for human consumption. If you’re just integrating with a CI tool, like Jenkins, then you probably don’t need this, but it’s handy for local use and to dig into the details of what’s covered.
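Assuming the build file above, generating a local report is just a matter of running the tests and then the report task:

gradle test generateCoverageReport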

Loose Ends

One problem that I ran into was referencing project paths like project.buildDir from within an Ant task. Hopefully someone will come along and let me know how that’s done.
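In principle, since the ant { } block is just Groovy, ordinary GString interpolation ought to work for Ant attribute values as well; here is a sketch of what I would expect to work (untested in this setup):

task generateCoverageReport << {
    ant {
        // Interpolating the Gradle project path into the Ant attribute value
        mkdir dir: "${project.buildDir.path}/reports/coverage"
    }
}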

Check Multiple Mercurial Repositories for Incoming Changes

Currently I have a whole bunch of Mercurial repositories in a directory. All of these are cloned from a central repository that the team pushes their changes to. I generally like to keep my local repositories up-to-date so that I can review changes. Manually running hg incoming -R some_directory on 20 different projects is a lot of work, so I automated it with a simple shell script.

This script will run incoming (or outgoing) on all of the local repositories and print the results to the console. Then I can manually sync the ones that have changed if I want.

I called this file hgcheckall.sh and run it like: ./hgcheckall.sh incoming

#!/bin/bash
 
# Find all the directories that are mercurial repos
dirs=(`find . -name ".hg"`)
# Remove the /.hg from the path and that's the base repo dir
merc_dirs=( "${dirs[@]//\/.hg/}" )
 
case $1 in
    incoming)
    for indir in ${merc_dirs[@]}; do
        echo "Checking: ${indir}"
        hg -R "$indir" incoming
    done
    ;;
    outgoing)
    for outdir in ${merc_dirs[@]}; do
        echo "Checking: ${outdir}"
        hg -R "$outdir" outgoing
    done
    ;;
    *)
    echo "Usage: hgcheckall.sh [incoming|outgoing]"
    ;;
esac

I guess the next major improvement would be to capture the output and then automatically sync the ones that have changed, but I haven’t gotten around to that yet.
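For what it’s worth, here is a rough, untested sketch of that idea: hg incoming exits with status 0 when there are incoming changes, so the loop could pull just the repositories that report something.

#!/bin/bash
 
# Find the repositories the same way as hgcheckall.sh
dirs=(`find . -name ".hg"`)
merc_dirs=( "${dirs[@]//\/.hg/}" )
 
for indir in ${merc_dirs[@]}; do
    # incoming -q is quiet; an exit status of 0 means there are new changesets
    if hg -R "$indir" incoming -q > /dev/null 2>&1; then
        echo "Pulling: ${indir}"
        hg -R "$indir" pull
    fi
done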

DRY your CruiseControl.NET Configuration

Don’t Repeat Yourself (DRY) is one of the principles of good software development. The idea is that there should ideally be one and only one “source of knowledge” for a particular fact or calculation in a system. Basically it comes down to not copying and pasting code around and avoiding duplication wherever possible. The advantages of this are many.

Advantages of DRY

  • There will be less code to maintain
  • If a bug is found, it should only have to be fixed in one place
  • If an algorithm or process is changed, it only needs to be changed in one place
  • More of the code should become reusable because as you do this you will parameterize methods to make them flexible for more cases

If it’s good for code, isn’t it good for other things like configuration? Why yes, it is.

Using CruiseControl.NET Configuration Builder

The Configuration Preprocessor allows you to define string properties and full blocks of XML to use for substitution and replacement. To start using the Configuration Preprocessor, you add the XML namespace xmlns:cb="urn:ccnet.config.builder" to your document to tell the config parser that you plan to use it.

From there you can define a simple property like:

<cb:define client="xxx"/>

Or you can make it a full block of XML:

<cb:define name="svn-block">
    <sourcecontrol type="svn">
        <trunkUrl>http://svn.example.com/svn/$(client)/$(project)/trunk</trunkUrl>
        <workingDirectory>D:\Builds-Net\projects\$(client)\$(project)\trunk</workingDirectory>
        <executable>svn.exe</executable>
        <autoGetSource>true</autoGetSource>
    </sourcecontrol>
</cb:define>

Defining Reusable Blocks

Using these ideas I wanted to come up with a templated approach that would allow me to share configuration among multiple projects. That way, if I added new statistics or changed the layout of my build server, I would only have to change it in a single place, keeping things DRY. It also encourages more consistency across multiple projects, making things easier to understand.

So, I started defining some reusable blocks in the main ccnet.config file which you can see below. The exact details will depend on your configuration of course.

Full Example of ccnet.config

<!--
How to add a new project:
Step 1. Create a config file named "<config>-project.xml"
Step 2. Add the project reference below
-->
 
<cruisecontrol xmlns:cb="urn:ccnet.config.builder">
 
    <!-- cb defines to compose reusable blocks of configuration -->
    <!-- use <cb:define client="xxx"/> and <cb:define project="yyy"/> to use these -->
 
    <cb:define name="svn-block">
        <sourcecontrol type="svn">
            <trunkUrl>http://svn.example.com/svn/$(client)/$(project)/trunk</trunkUrl>
            <workingDirectory>D:\Builds-Net\projects\$(client)\$(project)\trunk</workingDirectory>
            <executable>svn.exe</executable>
            <autoGetSource>true</autoGetSource>
        </sourcecontrol>
    </cb:define>
 
    <cb:define name="msbuild-20-block">
        <msbuild>
            <executable>C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\MSBuild.exe</executable>
            <workingDirectory>D:\Builds-Net\projects\$(client)\$(project)\trunk</workingDirectory>
            <projectFile>build.proj</projectFile>
            <buildArgs>$(build-args) </buildArgs>
            <targets>$(build-targets)</targets>
            <timeout>600</timeout>
            <logger>D:\Program Files\CruiseControl.NET\server\Rodemeyer.MsBuildToCCNet.dll</logger>
        </msbuild>
    </cb:define>
 
    <cb:define name="msbuild-35-block">
        <msbuild>
            <executable>C:\WINDOWS\Microsoft.NET\Framework\v3.5\MSBuild.exe</executable>
            <workingDirectory>D:\Builds-Net\projects\$(client)\$(project)\trunk</workingDirectory>
            <projectFile>build.proj</projectFile>
            <buildArgs>$(build-args) </buildArgs>
            <targets>$(build-targets)</targets>
            <timeout>600</timeout>
            <logger>D:\Program Files\CruiseControl.NET\server\Rodemeyer.MsBuildToCCNet.dll</logger>
        </msbuild>
    </cb:define>
 
    <cb:define name="merge-block">
        <!-- Merge the output of tests, code coverage and fxcop -->
        <merge>
            <files>
                <file>D:\Builds-Net\projects\$(client)\$(project)\trunk\*.Test.xml</file>
                <file>D:\Builds-Net\projects\$(client)\$(project)\trunk\*.CoverageMerge.xml</file>
                <file>D:\Builds-Net\projects\$(client)\$(project)\trunk\*.CoverageSummary.xml</file>
                <file>D:\Builds-Net\projects\$(client)\$(project)\trunk\*.FxCop.xml</file>
            </files>
        </merge>
    </cb:define>
 
    <cb:define name="loggers-block">
        <xmllogger>
            <logDir>D:\Builds-Net\projects\$(client)\$(project)\logs</logDir>
        </xmllogger>
        <rss/>
        <modificationHistory  onlyLogWhenChangesFound="true" />
    </cb:define>
 
    <cb:define name="stats-block">
        <statistics>
            <statisticList>
                <firstMatch name="Svn Revision" xpath="//modifications/modification/changeNumber" />
                <firstMatch name="Coverage" xpath="//coverageReport/project/@coverage" generateGraph="true"/>
                <firstMatch name="Warnings" xpath="//msbuild/@warning_count" generateGraph="true"/>
                <firstMatch name="Errors" xpath="//msbuild/@error_count" generateGraph="true"/>
 
                <!-- NDepend -->
                <!--
                <firstMatch name="ILInstructions" xpath="//ApplicationMetrics/@NILInstruction" />
                <firstMatch name="Avgerage Complexity" xpath="//ApplicationMetrics/MethodCC/@Avg" />
                <firstMatch name="Max Complexity" xpath="//ApplicationMetrics/MethodCC/@MaxVal" />
                <firstMatch name="LinesOfCode" xpath="//ApplicationMetrics/@NbLinesOfCode" generateGraph="true"/>
                <firstMatch name="LinesOfComment" xpath="//ApplicationMetrics/@NbLinesOfComment" generateGraph="true"/>
                -->
            </statisticList>
        </statistics>
    </cb:define>
 
    <cb:include href="config-client-project.xml"/>
    <cb:include href="config-client2-project-trunk.xml"/>
</cruisecontrol>

At the end of the file you can see the cb:include references. Those are one-line includes to include the configuration of each project. This makes things easier to manage, I think, because you only have to look at the individual project configuration.

Using Reusable Blocks in Individual Configuration Files

From there I need to make use of those defined blocks in an individual project file. The first thing I needed to do was to set the parameters that I had defined as simple string replacements in the reusable blocks. Normally you would do that with cb:define as I showed above. But the trick is that you can only have one property with a given name defined, so if you include multiple project configurations that doesn’t work. What does work is using cb:scope definitions, which allow a value to be defined only within a specific scope.

<cb:scope 
         client="ExampleClient" 
         project="SpecialProject"
         build-args="/p:Configuration=Debug"
         build-targets="Clean;Test">
 ...
</cb:scope>

From there you just need to start including the blocks that you defined in the main ccnet.config within the scope block.

Full Example of Project Configuration

 
<!-- CruiseControl.NET configuration -->
<project name="ExampleClient SpecialProject" xmlns:cb="urn:ccnet.config.builder">
 
    <cb:scope 
         client="ExampleClient" 
         project="SpecialProject"
         build-args="/p:Configuration=Debug"
         build-targets="Clean;Test">
 
        <cb:svn-block/>
 
        <tasks>
            <cb:msbuild-35-block/>
        </tasks>
 
        <publishers>
 
            <cb:merge-block/>
 
            <!-- Enable collection of project statistics -->
            <cb:stats-block/>
 
            <cb:loggers-block/>
 
            <email mailhost="smtp.example.com" from="ccnet@example.com" includeDetails="true">
                <users>
                    <user name="Developer One" group="buildmaster" address="dev1@example.com" />
                    <user name="Developer Two" group="developers" address="dev2@example.com" />
                </users>
                <groups>
                    <group name="developers" notification="change" />
                    <group name="buildmaster" notification="change" />
                </groups>
            </email>
        </publishers>
    </cb:scope>
</project>

As you can see, the only one I didn’t template out was the email block because that depends on the developers working on each project.

Have fun bringing simplicity and consistency to your CruiseControl.NET configuration!

For the full details see the CruiseControl.NET Configuration Preprocessor documentation.

Windows Subversion Maintenance Scripts

Previously I wrote about the Subversion Maintenance Scripts that I use for doing things like backups and upgrades. They were bash shell scripts that automated dealing with multiple Subversion repositories. The secret was that they were run using Cygwin on Windows. Recently we got a new, more powerful server to run our virtual machines on, and I decided to rewrite the scripts as native Windows batch files so that I wouldn’t have to install Cygwin again. Hopefully someone else will find them useful too.

Windows Batch Script for Creating a Default Repository Structure

I work for a custom software development consulting firm so we have a lot of different clients. We don’t want to mix different clients’ code so we create a repository per client. We often get multiple projects per client though (yeah, they like us) so we go ahead and create a project under the client. We also like to use the trunk, tags and branches directories by default.

With all of these rules: Automate!

create_repo.bat

@echo off
 
REM Check command line args to make sure a customer and project name were passed
IF (%1)==() GOTO CMDLINE
IF (%2)==() GOTO CMDLINE
 
REM If svnadmin is not in your path, uncomment the following
REM PATH=%PATH%;"D:\Program Files\Subversion\bin"
 
REM Set up variables with the directories
SET CUSTOMER=%1
SET PROJECT=%2
SET CUR_DIR=%~dp0
SET CUR_DIR=%CUR_DIR:\=/%
SET SVN_DIR=%CUR_DIR%/repos/%CUSTOMER%
SET PROJ_DIR=file:///%SVN_DIR%/%PROJECT%
 
echo Creating directory in %SVN_DIR%
 
svnadmin create "%SVN_DIR%"
svn -m "Create default structure." mkdir %PROJ_DIR% %PROJ_DIR%/trunk %PROJ_DIR%/tags %PROJ_DIR%/branches
GOTO END
 
:CMDLINE
echo Usage: %0 ^<customer^> ^<project^>
 
:END
echo Done

Some of the interesting pieces:

  • %~dp0 gives you the directory that the current script lives in.
  • %CUR_DIR:\=/% replaces the backslashes (\) with forward slashes (/) so that the SVN file:// URL will work correctly.

Windows Batch Script to Dump Your Subversion Repositories

The next thing you might want to do is to dump all of your repositories so that you can back them up or upgrade SVN or move to a new server. With just one repository it’s easy to do by hand. With 20+ repositories it’s much nicer to run a script and then go do something else.

dump_all.bat

@echo off
 
SET DUMP_DIR=dumps
SET REPOS_DIR=repos
 
IF NOT EXIST %DUMP_DIR% (
echo Creating directory %DUMP_DIR%
md %DUMP_DIR%
)
 
echo Dumping...
 
FOR /f %%i IN ('dir /B /A:D %REPOS_DIR%\*') DO (
    echo Dumping %REPOS_DIR%\%%i to %DUMP_DIR%\%%i.dump
    svnadmin dump %REPOS_DIR%\%%i > %DUMP_DIR%\%%i.dump
)

Some of the interesting pieces:
FOR /f %%i IN ('dir /B /A:D %REPOS_DIR%\*') DO is the part that finds each directory (in this case an SVN repository). Then for each repository you run the svnadmin dump command and send the dump file to a different directory.

Windows Batch Script to Load Your Subversion Repositories

After you’ve dumped all of your repositories, you will need to restore them because you’ve upgraded SVN or moved to a new server. That is just the reverse of the dump script.

restore_all.bat

@echo off
 
SET DUMP_DIR=dumps
SET REPOS_DIR=repos
 
echo Restoring....
 
REM %%~ni becomes the dump name without .dump
 
FOR /f %%i IN ('dir /B %DUMP_DIR%\*.dump') DO (
    echo Restoring %DUMP_DIR%\%%i to %REPOS_DIR%\%%~ni
    svnadmin create %REPOS_DIR%\%%~ni
    svnadmin load %REPOS_DIR%\%%~ni < %DUMP_DIR%\%%i
)

Some of the interesting pieces:

  • FOR in this case is the same as in the dump script, except it’s looking for all the dump files.
  • %%i holds the name of the dump file (e.g. repo.dump), and %%~ni strips off the extension to give you just the name of the file (e.g. repo).

Now you can manage your repositories with ease if you’re stuck on a Windows machine.

Capistrano and Ferret DRB

This is a bit of a followup to my previous post on Capistrano with Git and Passenger. I decided to use Ferret via the acts_as_ferret (AAF) plugin. Ferret is a full-text search engine inspired by Apache’s Lucene, but written in Ruby.

Basically Ferret and Lucene keep a full-text index outside of the database that allows them to quickly perform full-text searches and find the identifiers of matching rows in your database. Then you can go get those objects out of the database. It’s pretty slick.
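For flavor, here is a minimal sketch of what using AAF looks like (the model and field names are made up for illustration):

class Article < ActiveRecord::Base
  # Keep a Ferret full-text index of these columns
  acts_as_ferret :fields => [:title, :body]
end
 
# Searches the Ferret index, then fetches the matching rows from the database
articles = Article.find_with_ferret('capistrano')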

Ferret uses DRb as a means of supporting multiple-concurrent clients and for scaling across multiple machines. You really don’t need to know much about DRb to use AAF, but you do need to run the ferret DRb server in your production environment. Which gets us to…

Automating The Starting and Stopping of ferret_server

A few lines of code in your Capistrano deploy.rb and you are off and running.

before "deploy:start" do 
  run "#{current_path}/script/ferret_server -e production start"
end 
 
after "deploy:stop" do 
  run "#{current_path}/script/ferret_server -e production stop"
end
 
after 'deploy:restart' do
  run "cd #{current_path} && ./script/ferret_server -e production stop"
  run "cd #{current_path} && ./script/ferret_server -e production start"
end

Except it doesn’t work. I ended up with some errors like:

could not execute command
no such file to load -- /usr/bin/../config/environment

It also ends up that it’s not Capistrano’s fault.

Acts As Ferret server_manager.rb

In the file vendor/plugins/acts_as_ferret/lib/server_manager.rb there is a line that sets up where to look for its environment information. For some reason this is the default:

  # require(File.join(File.dirname(__FILE__), '../../../../config/environment'))
  require(File.join(File.dirname(ENV['_']), '../config/environment'))

If you notice, there is a line commented out. It just so happens that uncommenting that line and commenting out the other fixed the issue for me. It turns out that ENV['_'] points to the path of the executable that launched the process, which in this case is /usr/bin/env, and that doesn’t work. I’m not sure why that’s the default behavior.

Anyway, it’s easily fixed:

  require(File.join(File.dirname(__FILE__), '../../../../config/environment'))
  # require(File.join(File.dirname(ENV['_']), '../config/environment'))

With that fix in place, the Capistrano deployment will restart the Ferret DRb server when you deploy your application.

Update
According to John in the comments below you can fix the AAF problem without changing the code as well. Just add default_run_options[:shell] = false to your Capistrano script and that will take care of it.
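In context, that would look something like this near the top of deploy.rb (a sketch based on John’s comment, not something I’ve re-tested):

# Run commands directly instead of wrapping them in a shell, so that
# ENV['_'] points at ferret_server itself rather than /usr/bin/env
default_run_options[:shell] = false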

Database Migrations for .NET

One of the more difficult things to manage in software projects is often changing a database schema over time. On the projects that I work on, we don’t usually have DBAs who manage the schema so it is left up to the developers to figure out. The other thing you have to manage is applying changes to the database in such a way that you don’t disrupt the work of other developers on your team. We need the change to go in at the same time as the code so that Continuous Integration can work.

Migrations

While I don’t know if they were invented there, migrations seem to have been popularized by Ruby on Rails. Rails is a database-centric framework that infers the properties of your domain objects from the schema of your database. For that reason it makes sense that they came up with a very good way of managing schema changes over time. MigratorDotNet brings the same idea to .NET. These are some example migrations to give you an idea of the basics of creating a schema.

001_AddAddressTable.cs:

using Migrator.Framework;
using System.Data;
[Migration(1)]
public class AddAddressTable : Migration
{
    override public void Up()
    {
         Database.AddTable("Address",
             new Column("id", DbType.Int32, ColumnProperty.PrimaryKey),
             new Column("street", DbType.String, 50),
             new Column("city", DbType.String, 50),
             new Column("state", DbType.StringFixedLength, 2),
             new Column("postal_code", DbType.String, 10)
    }
    override public void Down()
    {
        Database.RemoveTable("Address");
    }
}

002_AddAddressColumns.cs:

using Migrator.Framework;
using System.Data;
[Migration(2)]
public class AddAddressColumns : Migration
{
    public override void Up()
    {
        Database.AddColumn("Address", new Column("street2", DbType.String, 50));
        Database.AddColumn("Address", new Column("street3", DbType.String, 50));
    }
    public override void Down()
    {
        Database.RemoveColumn("Address", "street2");
        Database.RemoveColumn("Address", "street3");
    }
}

003_AddPersonTable.cs:

using Migrator.Framework;
using System.Data;
[Migration(3)]
public class AddPersonTable : Migration
{
    public override void Up()
    {
        Database.AddTable("Person", 
            new Column("id", DbType.Int32, ColumnProperty.PrimaryKey),
            new Column("first_name", DbType.String, 50),
            new Column("last_name", DbType.String, 50),
            new Column("address_id", DbType.Int32, ColumnProperty.Unsigned)
        );
        Database.AddForeignKey("FK_PERSON_ADDRESS", "Person", "address_id", "Address", "id");
    }
    public override void Down()
    {
        Database.RemoveTable("Person");
    }
}

Run Your Migrations

The best way to run your migrations will be to integrate it into your build automation tool of choice. If you are not using one, now is the time.

MigratorDotNet supports MSBuild and NAnt.

MSBuild:

<Target name="Migrate" DependsOnTargets="Build">
    <CreateProperty Value="-1"  Condition="'$(SchemaVersion)'==''">
        <Output TaskParameter="Value" PropertyName="SchemaVersion"/>
    </CreateProperty>
    <Migrate Provider="SqlServer" 
            Connectionstring="Database=MyDB;Data Source=localhost;User Id=;Password=;" 
            Migrations="bin/MyProject.dll" 
            To="$(SchemaVersion)"/>
</Target>

NAnt:

<target name="migrate" description="Migrate the database" depends="build">
  <property name="version" value="-1" overwrite="false" />
  <migrate
    provider="MySql|PostgreSQL|SqlServer"
    connectionstring="Database=MyDB;Data Source=localhost;User Id=;Password=;"
    migrations="bin/MyProject.dll"
    to="${version}" />
</target>

So You Want to Migrate?

Some more documentation and examples are available on the MigratorDotNet site. Some of the changes shown here are still in an experimental branch that is in the process of being merged.


MigratorDotNet is a continuation of code started by Marc-André Cournoyer and Nick Hemsley.

Start a New Branch on your Remote Git Repository

Git is a distributed version control system so it allows you to create branches locally and commit against them. It also supports a more centralized repository model. When using a centralized repository you can push changes to it so that others can pull them more easily. I have a tendency to work on multiple computers. Because of this, I like to use a centralized repository to track the branches as I work on them. That way no matter what machine I’m on, I can still get at my branches.

The Workflow

My workflow is generally something like this:

  1. Create a remote branch
  2. Create a local branch that tracks it
  3. Work, Test, Commit (repeat) – this is all local
  4. Push (pushes commits to the remote repository)

Git commands can be a bit esoteric at times, and I can’t always seem to remember how to create a remote git branch and then start working on new code. There also seem to be multiple ways of doing it. I’m documenting the way that seems to work for me so that I can remember it. Maybe it will help someone else too.

Creating a Remote Branch

  1. Create the remote branch

    git push origin origin:refs/heads/new_feature_name

  2. Make sure everything is up-to-date

    git fetch origin

  3. Then you can see that the branch is created.

    git branch -r

This should show ‘origin/new_feature_name’.

  4. Start tracking the new branch
    git checkout --track -b new_feature_name origin/new_feature_name

This means that when you do pulls, it will get the latest from that branch as well.

  5. Make sure everything is up-to-date
    git pull

Cleaning up Mistakes

If you make a mistake, you can always delete the remote branch:

git push origin :refs/heads/new_feature_name

(Ok Git’ers – that has to be the least intuitive command ever.)

Use the Branch from Another Location

When you get to another computer or clone the git repository to a new machine, you just need to start tracking the new branch again.

git branch -r

to show all the remote branches

git checkout --track -b new_branch origin/new_feature_name

to start tracking the new branch

Automate it A Bit

Luckily, that’s a pretty easy thing to automate with a small shell script:

#!/bin/sh
# git-create-branch <branch_name>
 
if [ $# -ne 1 ]; then
    echo 1>&2 Usage: $0 branch_name
    exit 127
fi
 
branch_name=$1
git push origin origin:refs/heads/${branch_name}
git fetch origin
git checkout --track -b ${branch_name} origin/${branch_name}
git pull

For further help, you might want to check out the git-branch and git-push documentation.

CruiseControl With a Specific Version of Grails

Continuous Integration is a good practice in software development. It helps catch problems early to prevent them from becoming bigger problems later. It helps to reinforce other practices like frequent checkins and unit testing as well. I’m using CruiseControl (CC) for Continuous Integration at the moment.

One of the things about Grails is that it is really run through a series of scripts and classes that set up the environment. The Ant scripts really just delegate the work to those grails scripts. To run properly, the GRAILS_HOME environment variable needs to be set so that Grails can find the proper classes, etc. This is not a problem if you are running a single Grails application in Continuous Integration. The issue arises when you want to run multiple applications against different versions of Grails. A project I’m working on uncovered a bug in the 1.0.2 release of Grails. The code worked fine on 1.0.1, so I wanted to run against that specific version.

It turns out this is not too hard with a few small changes to your Ant build.xml file.

First, declare some properties that hold the paths to the Grails directory and the grails executable (the .bat version if your CC server is on Windows).

<property name="cc-grails.home" value="C:\grails-1.0.1" />
<property name="cc-grails" value="${cc-grails.home}\bin\grails.bat" />

Next, declare a custom target for the CC server to execute, referencing the ‘cc-grails’ property declared above. The key is that you must override GRAILS_HOME when you execute the grails script.

<target name="cc-test" description="--> Run a Grails applications unit tests">
    <exec executable="${cc-grails}" failonerror="true">
        <env key="GRAILS_HOME" value="${cc-grails.home}"/>
	<arg value="test-app"/>
    </exec>                               
</target>

Now the Continuous Integration of your Grails app runs against a specific version of Grails.

The Full build.xml

<project name="project" default="test">
 
    <condition property="grails" value="grails.bat">
        <os family="windows"/>
    </condition>
    <property name="grails" value="grails" />
    <property name="cc-grails.home" value="C:\grails-1.0.1" />
    <property name="cc-grails" value="${cc-grails.home}\bin\grails.bat" />
 
	<!-- ================================= 
          target: clean              
         ================================= -->
    <target name="clean" description="--> Cleans a Grails application">
		<exec executable="${grails}" failonerror="true">
			<arg value="clean"/>
		</exec>                               
    </target>
 
	<!-- ================================= 
          target: war              
         ================================= -->
    <target name="war" description="--> Creates a WAR of a Grails application">
		<exec executable="${grails}" failonerror="true">
			<arg value="war"/>
		</exec>                               
    </target>
 
	<!-- ================================= 
          target: test              
         ================================= -->
    <target name="test" description="--> Run a Grails applications unit tests">
		<exec executable="${grails}" failonerror="true">
			<arg value="test-app"/>
		</exec>                               
    </target>
 
    <!-- ================================= 
      target: cc-test              
     ================================= -->
    <target name="cc-test" description="--> Run a Grails applications unit tests in Continuous Integration mode">
		<exec executable="${cc-grails}" failonerror="true">
            <env key="GRAILS_HOME" value="${cc-grails.home}"/>
			<arg value="test-app"/>
		</exec>                               
    </target>
 
	<!-- ================================= 
          target: deploy              
         ================================= -->
    <target name="deploy" depends="war" description="--> The deploy target (initially empty)">
        <!-- TODO -->
    </target>
</project>

Test Automation Seminar

A long, long time ago in a land far away, I worked with my friend Frank Cohen to help him build the first version of a Web Service and Web Application test tool that was called TestMaker. Since then, Frank has made all kinds of improvements turning it into a really nice, graphical, scriptable testing tool. Frank has written books on Fast SOA as well as Java Testing and Design.

Now Frank is putting on a Test Automation Seminar covering a broad array of topics. He’ll be talking about his own test-automation tool TestMaker, naturally, but he’ll also be talking about others including Selenium. The seminar is going to cover a lot of currently hot technologies and techniques. In addition to general web-application testing, they are going to get into Ajax/Web 2.0 testing, REST and SOAP.

If you ever wanted to know more about functional test automation or performance and scalability testing, this might be a good, hands-on seminar to get you jump started. Check it out.

From Frank:

PushToTest is hosting a 2-day seminar on open-source test automation. We will be covering load and scalability tests of Web applications, Ajax, Web 2.0, Service Oriented Architecture (SOA), and functional testing of Windows and Java desktop applications. We will teach you TestMaker, soapUI, Glassbox, Selenium, and several other open-source tools.

Automated Subversion Tagging With MSBuild

I’ve written previously about using MSBuild with NUnit as well as a bit of a manifesto on Relentless Build Automation. I believe that automating the build and deployment process is a necessary step to ensure the reliable delivery of quality software.

Release Management

One of the things that we as software developers have to do is regularly make releases, either to QA or to a customer. When we make releases we need to be able to continue to develop the software to add new features and fix bugs. Tagging is a safety process that we use in a version control system to allow us to easily get to the code that we used to build a specific release. When you Tag, you can always rebuild that release. So, if weeks or months down the road you need to fix a critical bug, but don’t want to release new features, you can get back to the Tag, create a Branch, fix the bug and release the new build to your users.

How Do We Ensure We Can Recreate Releases

How can we ensure that we will be able to recreate a release that we make to either QA or a customer? Use automation to tag your builds when you create them, of course.

I’ve contributed a new SvnCopy task to the MSBuild Community Tasks project, which was just accepted and committed. It is currently in the Subversion repository for the project but should be available shortly in an official build. This will allow you to easily automate the process of tagging or branching your builds when you release. Subversion uses the copy metaphor for both branching and tagging operations, which is different from some other version control systems.

Example:

<Target Name="GetRemoteRevisionNumber">
    <SvnInfo RepositoryPath="$(SvnRemoteRoot)">
        <Output TaskParameter="LastChangedRevision" 
             PropertyName="RemoteSvnRevisionNumber"  />
    </SvnInfo>
    <Message Text="Revision: $(RemoteSvnRevisionNumber)"/>
</Target>
 
<Target Name="Tag" DependsOnTargets="GetRemoteRevisionNumber">
    <Error Condition="$(SvnUserName)==''" 
        Text="You must set your Subversion Username."/>
    <Error Condition="$(SvnPassword)==''" Text="You must set your Subversion Password."/>
    <SvnCopy RepositoryPath="$(SvnRemoteRoot)/trunk"
        DestinationPath="$(SvnRemoteRoot)/tags/REV-$(RemoteSvnRevisionNumber)"
        Message="Auto-tagging Revision: $(RemoteSvnRevisionNumber)"
        Username="$(SvnUserName)" password="$(SvnPassword)"/>
    <Message Text="Tagged: REV-$(RemoteSvnRevisionNumber)"/>
</Target>

You can then integrate the process of creating a tag every time you generate a build by tying targets together with dependencies. In the example below, GenerateTestBuild calls GenerateCabFiles and Tag to automatically build the installer and tag Subversion with the current revision number.

<Target Name="GenerateCabFiles" DependsOnTargets="Build">
    <Exec WorkingDirectory="." 
        Command="devenv &quot;$(SolutionFileName)&quot; /build $(Configuration) /project  $(ConfigCabFileProject)"/>
    <Exec WorkingDirectory="." 
        Command="devenv &quot;$(SolutionFileName)&quot; /build $(Configuration) /project  $(UiCabFileProject)"/>
</Target>
 
<Target Name="DeployCabFiles" DependsOnTargets="GenerateCabFiles">
    <MakeDir Directories="$(DestRoot)\$(BuildFolderPrefix)$(SvnRevisionNumber)"/>
    <MakeDir Directories="$(DestRoot)\$(BuildFolderPrefix)$(SvnRevisionNumber)\FormXmls"/>
    <Copy SourceFiles="@(CabFiles)" 
        DestinationFolder="$(DestRoot)\Build$(SvnRevisionNumber)"/>
    <Copy SourceFiles="$(AutoUpdaterConfigFile)" 
        DestinationFolder="$(DestRoot)\Build$(SvnRevisionNumber)"/>
    <Copy SourceFiles="$(AutoUpdaterConfigFile)" DestinationFolder="$(DestRoot);"/>
    <Copy SourceFiles="@(FormXmlFiles)" 
        DestinationFolder="$(DestRoot)\$(BuildFolderPrefix)$(SvnRevisionNumber)\FormXmls"/>
</Target>
<Target Name="GenerateTestBuild" DependsOnTargets="DeployCabFiles;Tag"/>

Hopefully this will help you get started on some more automation.

Update:
MSBuild Community Tasks version 1.2 has been released containing this code. You can get it here.

Resources

MSBuild Community Tasks
Subversion