Wednesday, February 27, 2013

Publishing from SBT to WebDAV (CloudBees)

This week I was trying to set up Jenkins in the cloud using CloudBees to learn about building SBT projects in a CI environment. I had just forked and adjusted the Jacoco4Sbt plugin (see my previous post) and wanted to build and host this project somewhere in the cloud. CloudBees provides an excellent service for this, and with Dev@CloudBees it is also free.

Setting up Jenkins on CloudBees is pretty easy, as is building the project using SBT.
The problem, however, was how to deploy the artifacts to the CloudBees Maven repository, which is available via WebDAV. Apparently the SBT publish task does not create the required directory structure on a WebDAV target location, so the publish task failed.

While Googling around I could not find a better solution than to create those directories by hand, either by mounting the WebDAV location locally or by using Curl. At first I used Curl to create the directories, after which the artifact could successfully be published, but I was not happy with this solution. What if I change the organization, name or version of the project? Then I would have to create those directories manually again. The whole goal of an automated build/deploy system is that everything works automatically.

So ... I created a new SBT plugin which I called: WebDav4Sbt.

The goal of this plugin is to automatically create the directories needed on a WebDAV location so the artifacts can be published. The plugin uses Sardine, a Java WebDAV client, to issue the WebDAV MKCOL command that creates the WebDAV collections (a.k.a. directories).
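The core idea can be sketched as follows. This is a minimal illustration, not the plugin's actual code: the `DavClient` trait below is a stand-in for the two Sardine calls used (`exists` and `createDirectory`, the latter issuing MKCOL), and all names here are hypothetical.

```scala
// Stand-in for the small slice of the Sardine API this sketch needs.
trait DavClient {
  def exists(url: String): Boolean
  def createDirectory(url: String): Unit // issues a WebDAV MKCOL request
}

object MkcolSketch {

  // Every intermediate collection between the repository root and the artifact,
  // e.g. com -> com/example -> com/example/mylib -> com/example/mylib/1.0
  def parentCollections(base: String, artifactPath: String): List[String] = {
    val segments = artifactPath.split('/').filter(_.nonEmpty).dropRight(1).toList
    segments.scanLeft(base.stripSuffix("/"))((acc, seg) => s"$acc/$seg").drop(1)
  }

  // Create each missing collection top-down, so a later PUT of the artifact succeeds.
  def mkcol(dav: DavClient, base: String, artifactPath: String): Unit =
    parentCollections(base, artifactPath).foreach { url =>
      if (!dav.exists(url)) dav.createDirectory(url)
    }
}
```

The real plugin derives the path from the project's organization, name and version, which is exactly why the directories no longer need to be created by hand when any of those change.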

The plugin adds 2 new tasks for SBT:
* mkcol  -  A task that creates the publish directories
* publish  -  The adapted default publish task, which automatically calls mkcol so the required directories are always available.

See the WebDAV4Sbt project page for info on how to add this plugin to your project and use it.

I am working on a bigger article on how to set up a CI environment in the cloud, in which I will share how I set up the CI to host, build and publish Scala projects using BitBucket and CloudBees. I hope I can publish it somewhere next week.

Friday, February 22, 2013

Integration testing, code coverage and SBT

Now that I'm doing more and more Scala, I'm also using SBT more. With Maven there are plenty of plugins available for all kinds of code checking. For SBT I want the same, but I'm still discovering how to get it all.

Integration testing

Integration testing is not enabled by default in SBT, but it's easy to add. Create a new Scala file in your <project-folder>/project folder with:
import sbt._
import Keys._

object IntegrationTesting extends Build {

  lazy val root =
    Project("root", file("."))
      .configs( IntegrationTest )
      .settings( Defaults.itSettings : _* )
}

By default SBT will run all tests in parallel. If your tests use or create the same (mock) services, you might want to disable parallel execution for integration tests.

To disable parallel execution, add this to your build.sbt:

    parallelExecution in IntegrationTest := false

To run your integration tests:

    sbt it:test

Code coverage

JaCoCo seems to be becoming the standard Java code coverage tool now that Emma and Cobertura are no longer actively developed and not everyone wants to buy Atlassian's Clover.
There is a perfectly good plugin for SBT called jacoco4sbt. Just running jacoco:cover will run all your unit tests and create a JaCoCo coverage report. The report will be located in <project-folder>/target/scala-<version>/jacoco/html.
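For completeness, adding the plugin looks roughly like this. The version number is a placeholder and the exact coordinates may differ, so check the jacoco4sbt site for the current ones:

```scala
// project/plugins.sbt -- version is a placeholder; see the jacoco4sbt site
addSbtPlugin("de.johoop" % "jacoco4sbt" % "1.2.4")

// build.sbt -- enable the plugin's settings
import de.johoop.jacoco4sbt._
import JacocoPlugin._

seq(jacoco.settings : _*)
```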

However, now that I have just set up my project to do both unit and integration testing, I also want code coverage for the integration tests. The jacoco4sbt plugin does not support this, however.
I started out by extending the 'IntegrationTesting' class I had created for adding integration test support (see above). It's very well possible to add more functionality this way, and I got code coverage working for integration tests like that. But all this coding in a project config did not feel right, so I forked the jacoco4sbt plugin and added the integration testing support right in the plugin. I will send a pull request to Joachim, the author of the jacoco4sbt plugin. Hopefully he will include my changes in his plugin.
Update: on March 3rd 2013 Joachim merged my changes into the official Jacoco4Sbt plugin. See the official plugin site for the latest info on how to use the plugin for integration testing.
With the adjusted plugin, it's now possible to run code coverage for integration tests and to merge the results with the unit tests to get a single report with all coverage details.

New plugin functionality

To run code coverage on your unit tests, you can still run the plugin using jacoco:<key> as before.
To run code coverage on your integration tests, use it-jacoco:<key>.

To run all tests and save the execution data:

    sbt [it-]jacoco:check

To create a report:

    sbt [it-]jacoco:report

To run all tests and create a report:

    sbt [it-]jacoco:cover

    Note: You can still run [it-]jacoco:test, but running [it-]jacoco:report after this will give an
    error because the execution data was not saved. Basically, don't use 'test' but only the
    goals 'check' or 'cover', depending on whether you want to create a report directly or not.

To run both unit tests and integration tests and create a combined report:

    sbt it-jacoco:cover

The report is generated in <project>/target/scala-<version>/it-jacoco/html.

To not merge the unit test and integration test data
Disable the merging of reports in your build.sbt:

    itJacoco.mergeReports in itJacoco.Config := false

    With mergeReports set to false, it-jacoco will only include the integration test data in the
    report. To include the unit test coverage again, use it-jacoco:merge before creating the
    report. This will always merge, even when mergeReports is set to false.

To disable parallel execution for code coverage

Add this line to your build.sbt

    parallelExecution in itJacoco.Config := false


The adjusted plugin fully supports integration testing in SBT. By default the unit test and integration test reports are merged together, but this can be disabled by setting mergeReports to false.
I will send a pull request to Joachim to get these changes included in the jacoco4sbt plugin.

Meanwhile, my forked project is available here. The project page contains details on how to build the project. See the original jacoco4sbt site for how to add the plugin to your project.
Comments are welcome.

Friday, February 1, 2013

Reading mail using Akka and Apache Camel

Yesterday I was playing a bit with Akka and the Akka-Camel extension. This extension very nicely provides a way to combine Camel routing with actors. Just for fun, I was trying to read my Gmail inbox using the Camel Mail component.

Setting up the mail endpoint for Gmail is pretty easy. This uri lets you connect to Gmail and poll for new messages every 10 seconds:

    imaps://<username[@domain]>&password=<password>&delete=false&unseen=true&consumer.delay=10000

With the Akka-Camel extension it is very easy to use this mail endpoint and process the messages asynchronously using an actor:
import{Actor, ActorSystem, Props}
import akka.camel.{CamelMessage, Consumer}
import org.slf4j.LoggerFactory

object Boot extends App {

  val system = ActorSystem("gmail-client")
  val gmail  = system.actorOf(Props[GmailConsumer])

  val logActor = system.actorOf(Props[LogActor])
  logActor ! "App started"
}

class GmailConsumer extends Consumer {
  def endpointUri = """imaps://<username[@domain]>&password=<password>&delete=false&unseen=true&consumer.delay=10000"""

  val logActor = context.actorOf(Props[LogActor])
  def receive = {
    case msg: CamelMessage => logActor ! msg
  }
}

class LogActor extends Actor {
  val logger = LoggerFactory.getLogger(this.getClass)

  def receive = {
    case msg =>"Received: " + msg)
  }
}
This little application creates an actor system and two actors.
The GmailConsumer extends the Consumer trait from Akka-Camel. You just need to provide an endpointUri definition, and the route from the endpoint to the actor is set up automatically. The Akka-Camel extension wraps the message in an immutable CamelMessage. The real MailMessage is in the CamelMessage body.

A problem reading the message body
The sample above simply logs the received CamelMessage. A problem arose when trying to read the real mail body. Underneath, the Camel Mail component just uses JavaMail, and apparently the mail body is read lazily. When trying to read the mail body from inside the actor, a FolderClosedException was raised. The Mail component connects and disconnects for each poll, so by the time the actor asynchronously receives the message, the Mail component has already disconnected from the mail Store again and the body can no longer be read.
Note: the mail headers don't have this problem. They are all available in the CamelMessage already. The mail headers contain many keys among which are 'Subject', 'From', 'To', 'Cc', 'Date', 'Message-ID', etc.
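Accessing the headers is then just a map lookup. A minimal sketch, assuming the CamelMessage headers arrive as a Map of header names to values; a plain Map with made-up example values stands in for the real message here:

```scala
// Helpers for reading common mail headers; the Map stands in for
// the headers carried by the received CamelMessage.
object MailHeaders {
  def subjectOf(headers: Map[String, Any]): String =
    headers.get("Subject").map(_.toString).getOrElse("(no subject)")

  def senderOf(headers: Map[String, Any]): String =
    headers.get("From").map(_.toString).getOrElse("(unknown sender)")
}
```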

A solution
The solution is to preprocess the MailMessage and extract the message body before dispatching it to the actor. One way of doing that is to create a route in code from the mail endpoint through a processor to the target actor. The processor takes the headers and message bodies from the MailMessage and puts them in a custom POJO.
The actor will now receive a CamelMessage containing the POJO which contains the bodies and headers.

Below is the same sample application as above, but now a custom route is added using camel.context.addRoutes(..), and the EmailConsumer is now a normal actor that no longer extends the Consumer trait.
The EmailRouteBuilder creates a simple route from(mailUri).process(mailProcessor).to(target).
It is the mailProcessor that does the work by getting the mail bodies and put those in a POJO (EmailMessage).
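The processor idea can be sketched as follows. This is an illustration only, not the project's actual code: the `Message`/`Exchange`/`Processor` traits below are minimal stand-ins for Camel's `org.apache.camel` interfaces, and `EmailMessage` is a hypothetical POJO.

```scala
// Minimal stand-ins for the Camel interfaces used in this sketch
// (the real ones are org.apache.camel.{Processor, Exchange, Message}).
trait Message {
  def getHeaders: Map[String, Any]
  def getBody: Any
  def setBody(b: Any): Unit
}
trait Exchange { def getIn: Message }
trait Processor { def process(exchange: Exchange): Unit }

// Hypothetical POJO carrying everything the actor needs, detached from JavaMail.
case class EmailMessage(headers: Map[String, Any], body: String)

// Reads the mail body eagerly, while the mail Store is still connected,
// and replaces the JavaMail-backed body with the plain POJO.
class EmailMessageProcessor extends Processor {
  def process(exchange: Exchange): Unit = {
    val in = exchange.getIn
    val mail = EmailMessage(in.getHeaders, in.getBody.toString)
    in.setBody(mail)
  }
}
```

Because the processor runs inside the Camel route, it executes during the poll, before the Mail component disconnects, which is what avoids the FolderClosedException.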

import{Actor, ActorRef, ActorSystem, Props}
import akka.camel.{CamelExtension, CamelMessage}
import org.apache.camel.builder.RouteBuilder
import org.slf4j.LoggerFactory

object Boot extends App {

  val gmail = """imaps://<username[@domain]>&password=<password>&delete=false&unseen=true&consumer.delay=10000"""

  val system = ActorSystem("gmail-client")
  val camel = CamelExtension(system)
  val gmailActor  = system.actorOf(Props[EmailConsumer])
  camel.context.addRoutes(EmailRoute(gmail, gmailActor))

  val logActor = system.actorOf(Props[LogActor])
  logActor ! "App started"
}

class EmailConsumer extends Actor {
  val logActor = context.actorOf(Props[LogActor])
  def receive = {
    case msg: CamelMessage => {
      // not possible to use msg.bodyAs[] because this actor no longer extends Consumer
      // and thus has no implicit Camel context.
      val mail: EmailMessage = msg.body.asInstanceOf[EmailMessage]

      logActor ! mail
    }
  }
}

class EmailRouteBuilder(mailUri: String, target: ActorRef) extends RouteBuilder {
  require(mailUri.startsWith("imaps://") || mailUri.startsWith("imap://") || mailUri.startsWith("pop3s://") || mailUri.startsWith("pop3://"))
  val mailProcessor = new EmailMessageProcessor

  import akka.camel._ // to import the implicit toActorRouteDefinition method
  def configure() {
    from(mailUri).process(mailProcessor).to(target)
  }
}

object EmailRoute {
  def apply(mailUri: String, target: ActorRef) = new EmailRouteBuilder(mailUri, target)
}

class LogActor extends Actor {
  val logger = LoggerFactory.getLogger(this.getClass)

  def receive = {
    case msg =>"Received: " + msg)
  }
}

Now the EmailConsumer receives a CamelMessage containing an EmailMessage, which holds both the mail headers and the mail bodies (a message received from Gmail contains both a TEXT/PLAIN body and a TEXT/HTML body).

For the full sources, including the mailProcessor, see my Akka-Camel-Gmail project on BitBucket.
See the Akka documentation for more info about the Akka-Camel extension. It is very well documented.
See the Camel Mail component documentation for how to create mail endpoint uris and for all the parameters and options that are supported. You can also send mail via this component, so that would be a very easy way for an actor to send email.