Friday, April 19, 2013

WebDav4Sbt version 1.2 released

Today version 1.2 of the WebDav4Sbt plugin was released.
This version fixes an issue with publishing Ivy artifacts to WebDav. Thanks to Flanker_9 for raising the issue.

To publish an artifact Ivy-style instead of Maven-style, add this to your _build.sbt_:

    publishMavenStyle := false

For more info, see the WebDav repo.
I got permission to add my plugin to the plugins list. Although I am very busy at the moment, I'm going to try to do this soon.

Saturday, March 23, 2013

New Webdav4Sbt release adds crossPaths support

Version 1.1 of the Webdav4Sbt plugin was just released.
The new version adds support for publishing Java artifacts which do not need the Scala version in the artifact name. To remove the Scala version from the Java artifact name, add to your build.sbt:

    crossPaths := false

Thanks to jplikesbikes for this contribution.

See the Webdav4Sbt plugin site for more info.

Monday, March 4, 2013

Integration test support merged in Jacoco4Sbt plugin

Yesterday Joachim merged the Integration Test Support changes I made into the jacoco4sbt plugin and he released version 2.0.0 of the plugin.

To use this plugin version, add to your project/plugins.sbt:

    addSbtPlugin("de.johoop" % "jacoco4sbt" % "2.0.0")

And to the build.sbt:

    import de.johoop.jacoco4sbt._

    import JacocoPlugin._

    seq(jacoco.settings : _*)     // for unit test coverage
    seq(itJacoco.settings : _*)  // for integration test coverage

See my previous post about how to enable integration testing with sbt.

Wednesday, February 27, 2013

Publishing from SBT to WebDAV (CloudBees)

This week I was trying to set up Jenkins in the cloud using CloudBees, to learn about building SBT projects in a CI environment. I had just forked and adjusted the Jacoco4Sbt plugin (see my previous post) and wanted to build and host this project somewhere in the cloud. CloudBees provides an excellent service for this, and when using Dev@CloudBees it is also free.

The whole setup of Jenkins on CloudBees is pretty easy, as is building the project using SBT.
The problem, however, was how to deploy the artifacts to the CloudBees Maven repository, which is available via WebDAV. Apparently, the SBT publish task does not create the required directory structure on a WebDAV target location, so the publish task failed.

While Googling around I could not find a better solution than to create those directories by hand, either by mounting the WebDAV share locally or by using Curl. At first I used Curl to create the directories, after which the artifact could successfully be published, but I was not happy with this solution. What if I change the organization, name or version of the project? Then I would have to create those directories manually again. The whole goal of an automated build/deploy system is that everything works automatically.

So ... I created a new SBT plugin which I called: WebDav4Sbt.

The goal of this plugin is to automatically create those directories needed on a WebDAV location so the artifacts can be published. The plugin uses Sardine, a Java WebDAV client, to implement the WebDAV MKCOL command to create the WebDAV collections (a.k.a. directories).
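To illustrate the mechanism, here is a small sketch of the directory logic involved. This is not the plugin's actual code: the object name and helper are mine, and the Sardine calls shown in the comment are from Sardine's public API.

```scala
// Sketch only: computes which WebDAV collections must exist before an
// artifact at <repoRoot>/<org-as-path>/<name>/<version>/ can be published.
object MkColSketch {

  def collectionsToCreate(repoRoot: String, organization: String,
                          name: String, version: String): List[String] = {
    val segments = organization.split('.').toList ::: List(name, version)
    // all non-empty prefixes of the path, shortest first
    segments.inits.toList.reverse.tail
      .map(parts => repoRoot.stripSuffix("/") + "/" + parts.mkString("/") + "/")
  }

  // With Sardine, each missing collection would then be created via MKCOL:
  //   val sardine = SardineFactory.begin(user, password)
  //   for (url <- urls if !sardine.exists(url)) sardine.createDirectory(url)
}
```

For organization "com.example", name "mylib" and version "1.0" this yields the four nested collections from /com/ down to /com/example/mylib/1.0/.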

The plugin adds 2 new tasks to SBT:
* mkcol  -  a task which creates the publish directories
* publish  -  the adapted default publish task, which automatically calls mkcol so the required directories are always available.

See the WebDAV4Sbt project page for info on how to add this plugin to your project and use it.

I am working on a bigger article on how to set up a CI environment in the cloud, in which I will share how I set up the CI to host, build and publish Scala projects using BitBucket and CloudBees. I hope I can publish it sometime next week.

Friday, February 22, 2013

Integration testing, code coverage and SBT

Now that I'm doing more and more Scala, I'm also using SBT more. With Maven there are plenty of plugins available for all kinds of code checking. For SBT I want the same stuff, but I'm still discovering how to get all this.

Integration testing

Integration testing is not enabled by default in SBT, but it's easy to add. Create a new Scala class file in your <project-folder>/project folder with:

    import sbt._
    import Keys._

    object IntegrationTesting extends Build {

      lazy val root =
        Project("root", file("."))
          .configs( IntegrationTest )
          .settings( Defaults.itSettings : _* )
    }

By default sbt will run all tests in parallel. If your tests are using or creating the same (mock) services, you might want to disable parallel execution for integration tests.

To disable parallel execution add to your build.sbt

    parallelExecution in IntegrationTest := false

To run your integration tests:

    sbt it:test
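With this setup, integration tests live in src/it/scala. A minimal example spec, assuming ScalaTest is also added to the it configuration (e.g. with % "it,test"); the class name and assertion are just placeholders:

```scala
// src/it/scala/ExampleIntegrationSpec.scala
import org.scalatest.FlatSpec
import org.scalatest.matchers.ShouldMatchers

class ExampleIntegrationSpec extends FlatSpec with ShouldMatchers {

  "The deployed service" should "be reachable" in {
    // call your real service here; a trivial assertion as placeholder
    (1 + 1) should be (2)
  }
}
```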

Code coverage

JaCoCo seems to be becoming the standard Java code coverage tool, now that Emma and Cobertura are no longer actively developed and not everyone wants to buy Atlassian's Clover.
There is a perfectly good plugin for SBT called jacoco4sbt. Just running jacoco:cover will run all your unit tests and create a JaCoCo coverage report. The report will be located in <project-folder>/target/scala-<version>/jacoco/html.

However, now that I have just set up my project to do both unit and integration testing, I also want code coverage for the integration tests. The jacoco4sbt plugin does not support this, however.
I started out by extending the 'IntegrationTesting' class I had created for adding integration test support (see above). It's very well possible to add more functionality this way, and I got code coverage working for integration tests like this. But all this coding in a project config did not feel right, so I forked the jacoco4sbt plugin and added the integration testing support right in the plugin. I will send a pull request to Joachim of the jacoco4sbt plugin. Hopefully he will include my changes in his plugin.
Update: on March 3rd 2013 Joachim merged my changes into the official Jacoco4Sbt plugin. See the official plugin site for the latest info on how to use the plugin for integration testing.
With the adjusted plugin, it's now possible to run code coverage for integration tests and to merge the results with the unit tests to get a single report with all coverage details.

New plugin functionality

To run code coverage on your unit tests, run the plugin using jacoco:<key> as before.
To run code coverage on your integration tests, use it-jacoco:<key>.

To run all tests and save the execution data:

    jacoco:check

To create a report:

    jacoco:report

To run all tests and create a report:

    jacoco:cover

    Note: You can still run [it-]jacoco:test, but running [it-]jacoco:report after this will give an
    error because the execution data was not saved. Basically, don't use 'test' but only the
    goals 'check' or 'cover', depending on whether you want to create a report directly or not.

To run both unit tests and integration tests and create a combined report:

    it-jacoco:cover

The report is generated in <project>/target/scala-<version>/it-jacoco/html.

To not merge the unit test and integration test data
Disable the merging of reports in your build.sbt:

    itJacoco.mergeReports in itJacoco.Config := false

    With mergeReports set to false, it-jacoco will only include the integration-test data in the
    report. To include the unit test coverage again use it-jacoco:merge before creating the
    report. This will always merge even when mergeReports is set to false.

To disable parallel execution for code coverage

Add this line to your build.sbt

    parallelExecution in itJacoco.Config := false


The adjusted plugin fully supports integration testing in sbt. By default the unit test and integration test reports are merged together, but this can be disabled by setting mergeReports to false.
I will send a pull request to Joachim to get these changes included in the jacoco4sbt plugin.

Meanwhile, my forked project is available; the project page contains details on how to build it. See the original jacoco4sbt site on how to add the plugin to your project.
Comments are welcome.

Friday, February 1, 2013

Reading mail using Akka and Apache Camel

Yesterday I was playing a bit with Akka and the Akka-Camel extension. This extension very nicely provides a way to combine Camel routing with actors. Just for fun, I was trying to read my Gmail inbox using the Camel Mail component.

Setting up the mail endpoint for Gmail is pretty easy. This uri lets you connect to Gmail and poll for new messages every 10 seconds:

    imaps://<username[@domain]>&password=<password>&delete=false&unseen=true&consumer.delay=10000

With the Akka-Camel extension it is very easy to use this mail endpoint and process the messages asynchronously using an actor:
    import akka.actor.{ Actor, ActorSystem, Props }
    import akka.camel.{ CamelMessage, Consumer }
    import org.slf4j.LoggerFactory

    object Boot extends App {

      val system = ActorSystem("gmail-client")
      val gmail  = system.actorOf(Props[GmailConsumer])

      val logActor = system.actorOf(Props[LogActor])
      logActor ! "App started"
    }

    class GmailConsumer extends Consumer {
      def endpointUri = """imaps://<username[@domain]>&password=<password>&delete=false&unseen=true&consumer.delay=10000"""

      val logActor = context.actorOf(Props[LogActor])
      def receive = {
        case msg: CamelMessage => logActor ! msg
      }
    }

    class LogActor extends Actor {
      val logger = LoggerFactory.getLogger(this.getClass)

      def receive = {
        case msg =>"Received: " + msg)
      }
    }

This little application creates an actor system and two actors.
The GmailConsumer extends the Consumer trait from Akka-Camel. You just need to provide an endpointUri definition, and the route from the endpoint to the actor is automatically set up. The Akka-Camel extension wraps the message in an immutable CamelMessage. The real MailMessage is in the CamelMessage body.

A problem reading the message body
The sample above simply logs the received CamelMessage. A problem arose when trying to read the real mail body. Underneath, the Camel Mail component just uses JavaMail. Apparently, the mail body is read lazily. When trying to read the mail body from inside the actor, a FolderClosedException was raised. The Mail component connects and disconnects for each poll. So by the time the actor asynchronously receives the message, the Mail component has already disconnected from the mail Store again and the body cannot be read anymore.
Note: the mail headers don't have this problem. They are all available in the CamelMessage already. The mail headers contain many keys among which are 'Subject', 'From', 'To', 'Cc', 'Date', 'Message-ID', etc.
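So reading headers inside the actor works fine. A minimal sketch (the actor name is mine; headers is the Map[String, Any] on akka-camel's CamelMessage):

```scala
import akka.actor.Actor
import akka.camel.CamelMessage

class HeaderLogger extends Actor {
  def receive = {
    case msg: CamelMessage =>
      // the headers are already materialized in the CamelMessage, so this is
      // safe even after the Mail component has disconnected again
      val subject = msg.headers.getOrElse("Subject", "<no subject>")
      println("Subject: " + subject)
  }
}
```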

A solution
The solution is to preprocess the MailMessage and extract the message body before dispatching it to the actor. One way of doing that is by creating a route in code from the mail endpoint, through a processor, to the target actor. The processor gets the headers and message bodies from the MailMessage and puts them in a custom POJO.
The actor will now receive a CamelMessage containing the POJO which contains the bodies and headers.

Below is the same sample application as above, but now a custom route is added using camel.context.addRoutes(..), and the EmailConsumer is a normal actor which no longer extends the Consumer trait.
The EmailRouteBuilder creates a simple route from(mailUri).process(mailProcessor).to(target).
It is the mailProcessor that does the work by getting the mail bodies and put those in a POJO (EmailMessage).

    import akka.actor.{ Actor, ActorRef, ActorSystem, Props }
    import akka.camel.{ CamelExtension, CamelMessage }
    import org.apache.camel.builder.RouteBuilder
    import org.slf4j.LoggerFactory

    object Boot extends App {

      val gmail = """imaps://<username[@domain]>&password=<password>&delete=false&unseen=true&consumer.delay=10000"""

      val system = ActorSystem("gmail-client")
      val camel = CamelExtension(system)
      val gmailActor = system.actorOf(Props[EmailConsumer])
      camel.context.addRoutes(EmailRoute(gmail, gmailActor))

      val logActor = system.actorOf(Props[LogActor])
      logActor ! "App started"
    }

    class EmailConsumer extends Actor {
      val logActor = context.actorOf(Props[LogActor])
      def receive = {
        case msg: CamelMessage =>
          // not possible to use msg.bodyAs[] because this actor no longer extends
          // Consumer and thus has no implicit Camel context
          val mail: EmailMessage = msg.body.asInstanceOf[EmailMessage]
          logActor ! mail
      }
    }

    // EmailMessage and EmailMessageProcessor are defined elsewhere in the project
    class EmailRouteBuilder(mailUri: String, target: ActorRef) extends RouteBuilder {
      require(mailUri.startsWith("imaps://") || mailUri.startsWith("imap://") || mailUri.startsWith("pop3s://") || mailUri.startsWith("pop3://"))
      val mailProcessor = new EmailMessageProcessor

      import akka.camel._ // to import the implicit toActorRouteDefinition method
      def configure() {
        from(mailUri).process(mailProcessor).to(target)
      }
    }

    object EmailRoute {
      def apply(mailUri: String, target: ActorRef) = new EmailRouteBuilder(mailUri, target)
    }

    class LogActor extends Actor {
      val logger = LoggerFactory.getLogger(this.getClass)

      def receive = {
        case msg =>"Received: " + msg)
      }
    }

Now the EmailConsumer receives a CamelMessage containing an EmailMessage, which contains both the mail headers and the mail bodies (a message received from GMail contains both a TEXT/PLAIN body and a TEXT/HTML body).
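The mailProcessor itself is not shown above; for the real implementation see the BitBucket project. A rough sketch of the idea (the field choices and header selection here are illustrative):

```scala
import org.apache.camel.{ Exchange, Processor }

// illustrative POJO; the real EmailMessage is in the Akka-Camel-Gmail project
case class EmailMessage(headers: Map[String, String], bodies: Map[String, String])

class EmailMessageProcessor extends Processor {
  def process(exchange: Exchange) {
    val in = exchange.getIn
    // read the body *now*, while the Mail component is still connected,
    // to avoid the FolderClosedException described above
    val plainBody = in.getBody(classOf[String])
    val headers = Map(
      "Subject" -> in.getHeader("Subject", classOf[String]),
      "From"    -> in.getHeader("From", classOf[String]))
    in.setBody(EmailMessage(headers, Map("TEXT/PLAIN" -> plainBody)))
  }
}
```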

For the full sources, including the mailProcessor, see my Akka-Camel-Gmail project on BitBucket.
See the Akka documentation for more info about the Akka-Camel extension. It is very well documented.
See the Camel Mail Component page for how to create mail endpoint URIs and all the parameters and options that are supported. You can also send mail via this component, so that would be a very easy way for an actor to send email.

Monday, January 28, 2013

Playing with Spray

Today I have been playing a bit with Spray. I have an idea to build something, but for that I also need to serve some pages, so I would like Spray to act as some kind of website. Starting with the spray template project, I added Twirl for templating, support for serving static files, and Twitter Bootstrap for a nice layout.
This is now a nice starting point for a web application based on Spray.

For those interested, I shared my new 'Spray - Twirl - Twitter Bootstrap' template project on BitBucket.

Have fun.

Friday, January 25, 2013

Selenium testing with ScalaTest

The latest milestone of ScalaTest, 2.0.M5b, now has support for creating Selenium tests.
Creating a Selenium test is now as easy as this:
    import org.scalatest.{ BeforeAndAfterAll, FlatSpec }
    import org.scalatest.matchers.ShouldMatchers
    import org.scalatest.selenium.Chrome

    class MyServiceBrowserTest extends FlatSpec with ShouldMatchers with BeforeAndAfterAll with Chrome {

      val homepage = "http://localhost:8080"

      behavior of "MyService application"

      it should "Say Hello" in {
        go to (homepage)
        pageTitle should be("Welcome to MyService")
      }

      override protected def afterAll() {
        quit() // close the browser after all tests are done
      }
    }

Just by adding with Chrome you create a Selenium test which runs in a Chrome browser. Other supported drivers are Firefox, Safari, HtmlUnit and InternetExplorer.

ScalaTest comes with a very natural DSL for creating Selenium tests. You can do anything Selenium supports: click on buttons, submit forms, go back, capture screenshots, execute JavaScript, etc. See the ScalaTest Selenium page for more info about this DSL.
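A few of those DSL calls in context (the page URL and field name are hypothetical; HtmlUnit is used here so no browser install is needed):

```scala
import org.scalatest.FlatSpec
import org.scalatest.matchers.ShouldMatchers
import org.scalatest.selenium.HtmlUnit

class DslSampleTest extends FlatSpec with ShouldMatchers with HtmlUnit {

  it should "demonstrate some DSL calls" in {
    go to "http://localhost:8080/form"  // hypothetical page
    textField("name").value = "World"   // fill a text field by name or id
    submit()                            // submit the enclosing form
    goBack()                            // navigate the browser history
    capture to "form-page"              // save a screenshot
  }
}
```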

There are just 2 things you need to do to start working with this:
- add the required dependencies to your build.sbt
- install the appropriate browser driver (not necessary if you use the HtmlUnit driver)

Adding dependencies
In your build file, add the dependencies for ScalaTest and Selenium. For example, in your build.sbt add:

    libraryDependencies ++= Seq(
      "org.scalatest"           % "scalatest_2.10" % "2.0.M5b" % "test",
      "org.seleniumhq.selenium" % "selenium-java"  % "2.28.0"  % "test"
    )

Install Chrome driver
For Chrome, download the chromedriver.
On a Mac (or Linux, I guess) just unzip the driver and put it in /usr/bin.

Done. Now create your test, add with Chrome, and run it. Just remember to quit the browser after all tests so it gets closed again.

For other browsers, see the Selenium downloads page for the browser drivers.