Clojure web development – state of the art – part 2

This is part 2 of my “Clojure web development” series. You can discuss the first part in this Reddit thread. After reading the comments, I must explain two assumptions I had while writing this series:

  • Keep things easy to understand for people from outside Clojure land, especially Java devs. That’s why I use REST/JSON in favor of transit, and component as a “dependency injection” implementation which can easily be explained as a Spring framework equivalent. The same goes for Om, which is a bit verbose but in my opinion easier to understand for a start, and which has wider adoption than the other React wrappers.
  • Keep things easy to bootstrap on a developer machine. This is a hands-on walkthrough, and all the individual steps have been committed to GitHub. That’s why I use MongoDB, which may not be the best choice for scaling your application to millions of users, but is perfect for bootstrapping – no schema, no tables, just insert data and start working. I highly recommend Honza Kral’s polyglot persistence talk, in which he encourages starting simple and optimizing for developer happiness at the start of a project.

In the previous post we bootstrapped a basic web application serving REST data with a (for now static) ClojureScript frontend, all fully reloadable thanks to reloaded repl and figwheel. You can find the final working version in this branch.

Today we’re going to display a contact list stored in MongoDB. I assume you have MongoDB installed; if not – it’s trivial with Docker.
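For example, a one-liner like this does the trick (an assumption on my part – it uses the official mongo image from Docker Hub and the default port):

docker run -d -p 27017:27017 --name mongo mongo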

Serving contact list from database

OK, let’s start. In the backend we need to add some dependencies to project.clj:

:dependencies [...
               [org.danielsz/system "0.1.9"]
               [com.novemberain/monger "2.0.0"]]

monger is an idiomatic Clojure wrapper for the Mongo Java driver, and system is a nice collection of components for various datastores, including Mongo (a bit like Spring Data for Spring).

In order to interact with a data store, I like to use the concept of an abstract repository. This hides the implementation details from the invoker and allows switching to another store in the future. So let’s create an abstract interface (in Clojure – a protocol) in components/repo.clj:

(ns modern-clj-web.component.repo)

(defprotocol ContactRepository
  (find-all [this]))

We need the this parameter so the Clojure runtime can dispatch to the correct implementation of this repository. The Mongo implementation with Monger is really simple:

(ns modern-clj-web.component.repo
  (:require [monger.collection :as mc]
            [monger.json]))
...
(defrecord ContactRepoComponent [mongo]
  ContactRepository
  (find-all [this]
    (mc/find-maps (:db mongo) "contacts")))

(defn new-contact-repo-component []
  (->ContactRepoComponent {}))

Things to notice here:

  • mc/find-maps just returns all records from the collection as Clojure maps
  • ContactRepoComponent gets injected with the mongo component created by the system library, which adds the Mongo client under the :db key
  • Since this component is stateless, we don’t need to implement the component/Lifecycle interface, but it can still be wired into the system like a typical lifecycle-aware component
  • Requiring monger.json adds JSON serialization support for Mongo types (e.g. ObjectId)
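This protocol is also what makes the store swappable. As an illustration (my own sketch, not part of the project code), an in-memory implementation for tests could live next to the Mongo one – the endpoint code wouldn’t change, because it depends only on the ContactRepository protocol:

(defrecord InMemoryContactRepo [contacts]
  ContactRepository
  (find-all [this]
    @contacts))

(defn new-in-memory-repo []
  ;; seeded with one fake contact so find-all has something to return
  (->InMemoryContactRepo (atom [{:firstname "Test" :lastname "User"}])))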

OK, it’s now time to use our new component in endpoint/example.clj:

(:require ...
          [modern-clj-web.component.repo :as r])

(defn example-endpoint [{repo :contact-repo}]
  (routes
   ...
   (GET "/contacts" [] (response (r/find-all repo)))))

The {repo :contact-repo} notation (destructuring) automatically binds the :contact-repo key from the system map to the repo value.
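If you haven’t seen map destructuring before, here is a minimal REPL sketch (standalone, not project code):

(let [{repo :contact-repo} {:contact-repo "the-repo-instance"}]
  repo)
;; => "the-repo-instance"

So we need to assign our component to that key in system.clj: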

(:require ...
          [modern-clj-web.component.repo :refer [new-contact-repo-component]]
          [system.components.mongo :refer [new-mongo-db]])

(-> (component/system-map
     :app          (handler-component (:app config))
     :http         (jetty-server (:http config))
     :example      (endpoint-component example-endpoint)
     :mongo        (new-mongo-db (:mongo-uri config))
     :contact-repo (new-contact-repo-component))
    (component/system-using
     {:http         [:app]
      :app          [:example]
      :example      [:contact-repo]
      :contact-repo [:mongo]}))

In short – we use system’s new-mongo-db to create the Mongo component and make it a dependency of the repository, which itself is a dependency of the example endpoint.

And finally we need to configure the :mongo-uri property in config.clj:

(def environ
  {:http {:port (some-> env :port Integer.)}
   :mongo-uri "mongodb://localhost:27017/contacts"})

To check that it works, restart the REPL, type (go) again and make a GET request to http://localhost:3000/contacts:

curl http://localhost:3000/contacts
[]

OK, so we got an empty list, since we don’t have any data in the Mongo database yet. Let’s add some with the mongo console:

mongo localhost:27017/contacts
MongoDB shell version: 2.4.9
connecting to: localhost:27017/contacts
> db.contacts.insert({firstname: "Al", lastname: "Pacino"});
> db.contacts.insert({firstname: "Johnny", lastname: "Depp"});

And finally our endpoint should return these two records:

curl http://localhost:3000/contacts
[{"lastname":"Pacino","firstname":"Al","_id":"56158345fd2dabeddfb18799"},{"lastname":"Depp","firstname":"Johnny","_id":"56158355fd2dabeddfb1879a"}]

Sweet! Again – in case of any problems, check this commit.

Getting contacts from ClojureScript

In this step we’ll fetch the contacts with an AJAX call from our ClojureScript frontend. As usual, we need a few dependencies in project.clj for a start:

:dependencies ...
              [org.clojure/clojurescript "1.7.48"]
              [org.clojure/core.async "0.1.346.0-17112a-alpha"]
              [cljs-http "0.1.37"]

ClojureScript should already be available thanks to figwheel, but it’s always better to require a specific version explicitly. cljs-http is an HTTP client for ClojureScript, and core.async provides facilities for asynchronous communication in the CSP model, which is especially useful in ClojureScript. Let’s see how it works in practice.

To make an AJAX call we need to call methods from cljs-http.client, so let’s add this in core.cljs:

(ns ^:figwheel-always modern-clj-web.core
  (:require [cljs-http.client :as http]))

(println (http/get "/contacts"))

You should see #object[cljs.core.async.impl.channels.ManyToManyChannel]. What madness is that???

This is where core.async enters the stage. The most common ways to deal with asynchronous network calls in JavaScript are callbacks and promises. The core.async way is to use channels. It makes your code look more like a sequence of synchronous calls, and it’s easier to reason about. So the http/get function returns a channel on which the result is published when the response arrives. In order to receive that message, we need to read from this channel using the <! function. Since this is a blocking operation, we also need to wrap the call in the go macro, just like in the Go language. So the correct way of getting contacts looks like this:

(:require ...
          [cljs.core.async :refer [<! >! chan]])

(go
  (let [response (<! (http/get "/contacts"))]
    (println (:body response))))

Adding Om component

Dealing with frontend code without introducing any structure can quickly become a real nightmare. In late 2015 we basically have two major JS frameworks on the field: Angular and React. ClojureScript paradigms (functional programming, immutable data structures) fit really nicely into the React philosophy. In short, a React application is composed of components taking data as input and rendering HTML as output. The output is not the real DOM, but a so-called virtual DOM, which helps calculate the diff from the current view to the updated one.

Among the many React wrappers in ClojureScript, I like using Om with om-tools to reduce some verbosity. Let’s introduce it into our project.clj:

:dependencies ...
              [org.omcljs/om "0.9.0"]
              [prismatic/om-tools "0.3.12"]

To render a “hello world” component we need to add some code in core.cljs:

(:require ...
          [om.core :as om]
          [om-tools.core :refer-macros [defcomponent]]
          [om-tools.dom :as dom :include-macros true]))

(def app-state (atom {:message "hello from om"}))

(defcomponent app [data owner]
  (render [_]
    (dom/div (:message data))))

(om/root app app-state {:target (.getElementById js/document "main")})

What’s going on here? The main concept of Om is keeping the whole application state in one global atom, which is the Clojure way of managing state. So we pass this app-state map (wrapped in an atom) as a parameter to om/root, which mounts the component into the real DOM (the <div id="main"/> from index.html). The app component just displays the :message value, so you should see “hello from om” rendered. If you have figwheel running, you can change the message value and it should be updated instantly.
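You can also poke the state directly. Since the whole UI is a function of this single atom, any change to it triggers a re-render. A quick thing to try in the browser-connected REPL (assuming the app-state atom defined above):

(swap! app-state assoc :message "hello from the REPL")
;; the <div id="main"> contents update immediately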

And finally let’s render our contacts with Om:

(defn get-contacts []
  (go
    (let [response (<! (http/get "/contacts"))]
      (:body response))))

(defcomponent contact-comp [contact _]
  (render [_]
    (dom/li (str (:firstname contact) " " (:lastname contact)))))

(defcomponent app [data _]
  (will-mount [_]
    (go
      (let [contacts (<! (get-contacts))]
        (om/update! data :contacts contacts))))
  (render [_]
    (dom/div
     (dom/h2 (:message data))
     (dom/ul (om/build-all contact-comp (:contacts data))))))

So contact-comp just renders a single contact. We use om/build-all to render all the contacts visible in the :contacts field of the global state. And the trickiest part – we use the will-mount lifecycle method to fetch contacts from the server when the app component is about to be mounted into the DOM.

Again, this commit contains a working version in case of any problems.

And if you liked Om, I highly recommend the official tutorials and the Zero to Om series.

You May Also Like

New HTTP Logger Grails plugin

I've written a new Grails plugin – httplogger. It logs:

  • request information (url, headers, cookies, method, body),
  • grails dispatch information (controller, action, parameters),
  • response information (elapsed time and body).

It is mostly useful for logging your REST traffic. Full HTML pages can be huge and logging them generally wastes your space. I suggest mapping all of your REST controllers under the same path in UrlMappings, e.g. /rest/, and configuring this plugin with that path.
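For example, a hypothetical UrlMappings entry could look like this (the controller names are made up; see the plugin docs for the exact configuration property):

// grails-app/conf/UrlMappings.groovy – group REST controllers under /rest/
class UrlMappings {
    static mappings = {
        "/rest/book/$id?"(controller: "book", parseRequest: true)
        "/rest/author/$id?"(controller: "author", parseRequest: true)
    }
}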

Here is some simple output just to give you a taste of it.

17:16:00,331 INFO  filters.LogRawRequestInfoFilter  - ...
17:16:00,340 INFO  filters.LogRawRequestInfoFilter  - ...
17:16:00,342 INFO  filters.LogGrailsUrlsInfoFilter  - ...
17:16:00,731 INFO  filters.LogOutputResponseFilter  - >> #1 returned 200, took 405 ms.
17:16:00,745 INFO  filters.LogOutputResponseFilter  - >> #1 responded with '{count:0}'
17:18:55,799 INFO  filters.LogRawRequestInfoFilter  - ...
17:18:55,799 INFO  filters.LogRawRequestInfoFilter  - ...
17:18:55,800 INFO  filters.LogRawRequestInfoFilter  - ...
17:18:55,801 INFO  filters.LogOutputResponseFilter  - >> #2 returned 404, took 3 ms.
17:18:55,802 INFO  filters.LogOutputResponseFilter  - >> #2 responded with ''

Official plugin information can be found on the Grails plugins website: http://grails.org/plugins/httplogger, or you can browse the code on GitHub: TouK/grails-httplogger.

Integration testing custom validation constraints in Jersey 2

I recently joined a team trying to turn a monolithic legacy system into a set of RESTful services in Java. They decided to use the latest 2.x version of Jersey as a REST container, which was not my first choice, since I’m not a big fan of JSR-* specs. But now I must admit that JAX-RS 2.x is doing things right: it requires almost zero boilerplate code, supports auto-discovery of features and prefers convention over configuration like other modern frameworks. Since the spec is still young, it’s hard to find good tutorials and kick-off projects with some working code. I created the jersey2-starter project on GitHub, which can be used as a starting point for your own production-ready RESTful service. In this post I’d like to cover how to implement and integration-test your own validation constraints on REST resources.

Custom constraints

One of the issues that bothers me when coding REST in Java is littering your class model with annotations. Suppose you want to build a simple Todo list REST service; using Jackson, validation and Spring Data, you can easily end up with this as your entity class:

@Document
public class Todo {
    private Long id;
    @NotNull
    private String description;
    @NotNull
    private Boolean completed;
    @NotNull
    private DateTime dueDate;

    @JsonCreator
    public Todo(@JsonProperty("description") String description, @JsonProperty("dueDate") DateTime dueDate) {
        this.description = description;
        this.dueDate = dueDate;
        this.completed = false;
    }
    // getters and setters
}

Your domain model is now effectively blurred by messy annotations almost everywhere. Let’s see what we can do with the validation constraints (the @NotNulls). Some may say that you could introduce a DTO layer with its own validation rules, but for me that conflicts with pure REST API design, which states that you operate on resources which should map to your domain classes. On the other hand - what does it mean that a Todo object is valid? When you create a Todo you should provide a description and a due date, but what about when you’re updating? You should be able to change any of the description, the due date (postponing) and the completion flag (marking as done) - but you should provide at least one of these for a valid modification. So my idea is to introduce custom validation constraints, different ones for creation and modification:

@Target({TYPE, PARAMETER})
@Retention(RUNTIME)
@Constraint(validatedBy = ValidForCreation.Validator.class)
public @interface ValidForCreation {
    //...
    class Validator implements ConstraintValidator<ValidForCreation, Todo> {
    //...
        @Override
        public boolean isValid(Todo todo, ConstraintValidatorContext constraintValidatorContext) {
            return todo != null
                && todo.getId() == null
                && todo.getDescription() != null
                && todo.getDueDate() != null;
        }
    }
}

@Target({TYPE, PARAMETER})
@Retention(RUNTIME)
@Constraint(validatedBy = ValidForModification.Validator.class)
public @interface ValidForModification {
    //...
    class Validator implements ConstraintValidator<ValidForModification, Todo> {
    //...
        @Override
        public boolean isValid(Todo todo, ConstraintValidatorContext constraintValidatorContext) {
            return todo != null
                && todo.getId() == null
                && (todo.getDescription() != null || todo.getDueDate() != null || todo.isCompleted() != null);
        }
    }
}

And now you can move validation annotations to the definition of a REST endpoint:

@POST
@Consumes(APPLICATION_JSON)
public Response create(@ValidForCreation Todo todo) {...}

@PUT
@Consumes(APPLICATION_JSON)
public Response update(@ValidForModification Todo todo) {...}

And now you can remove those @NotNulls from your model.
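The entity then shrinks back to plain data plus the Jackson mapping – the same class as above, just without the constraint annotations:

@Document
public class Todo {
    private Long id;
    private String description;
    private Boolean completed;
    private DateTime dueDate;

    @JsonCreator
    public Todo(@JsonProperty("description") String description, @JsonProperty("dueDate") DateTime dueDate) {
        this.description = description;
        this.dueDate = dueDate;
        this.completed = false;
    }
    // getters and setters
}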

Integration testing

There are in general two approaches to integration testing:

  • the test runs on a separate JVM from the app, which is deployed to some other integration environment
  • the test deploys the application programmatically in the setup block.

Both of these have their pros and cons, but for small enough services I personally prefer the second approach. It’s much easier to set up and you have only one JVM running, which makes debugging really easy. You can use a generic framework like Arquillian for starting your application in a container environment, but I prefer simple solutions and just use embedded Jetty. To make the test setup 100% production-equivalent, I create a full Jetty WebAppContext and have to resolve all runtime dependencies for Jersey auto-discovery to work. This can be simply achieved with the Maven resolver from ShrinkWrap - an Arquillian subproject:

    WebAppContext webAppContext = new WebAppContext();
    webAppContext.setResourceBase("src/main/webapp");
    webAppContext.setContextPath("/");
    File[] mavenLibs = Maven.resolver().loadPomFromFile("pom.xml")
                .importCompileAndRuntimeDependencies()
                .resolve().withTransitivity().asFile();
    for (File file: mavenLibs) {
        webAppContext.getMetaData().addWebInfJar(new FileResource(file.toURI()));
    }
    webAppContext.getMetaData().addContainerResource(new FileResource(new File("./target/classes").toURI()));

    webAppContext.setConfigurations(new Configuration[] {
        new AnnotationConfiguration(),
        new WebXmlConfiguration(),
        new WebInfConfiguration()
    });
    server.setHandler(webAppContext);

(this Stack Overflow thread inspired me a lot here)

Now it’s time for the last part of the post: parametrizing our integration tests. Since we want to test validation constraints, there are many edge cases to check (and they push your code coverage close to 100%). Writing one test per case would be a bad idea. Among the many solutions for JUnit, I’m most convinced by JUnit Params from the Pragmatists team. It’s really simple and has a nice concept of a jQuery-like helper for creating providers. Here is my test code (I’m also using the builder pattern here to create various kinds of Todos):

@Test
@Parameters(method = "provideInvalidTodosForCreation")
public void shouldRejectInvalidTodoWhenCreate(Todo todo) {
    Response response = createTarget().request().post(Entity.json(todo));

    assertThat(response.getStatus()).isEqualTo(BAD_REQUEST.getStatusCode());
}

private static Object[] provideInvalidTodosForCreation() {
    return $(
        new TodoBuilder().withDescription("test").build(),
        new TodoBuilder().withDueDate(DateTime.now()).build(),
        new TodoBuilder().withId(123L).build(),
        new TodoBuilder().build()
    );
}
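An analogous parametrized test for the modification endpoint could look like this (my sketch following the same pattern; it assumes the same createTarget() helper and TodoBuilder as above):

@Test
@Parameters(method = "provideInvalidTodosForModification")
public void shouldRejectInvalidTodoWhenUpdate(Todo todo) {
    Response response = createTarget().request().put(Entity.json(todo));

    assertThat(response.getStatus()).isEqualTo(BAD_REQUEST.getStatusCode());
}

private static Object[] provideInvalidTodosForModification() {
    return $(
        // id must not be set by the client
        new TodoBuilder().withId(123L).withDescription("test").build(),
        // at least one modifiable field has to be provided
        new TodoBuilder().build()
    );
}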

OK, enough of reading, feel free to clone the project and start writing your REST services!
