Scheduling tasks using Message Queue

Introduction

How do you schedule a task for later execution? A common approach is to create a table in the database and configure a job that checks whether any task is due and then executes it.

But there is an easier way if you already have a message broker in your application: you can publish/send a message and tell the broker to deliver it with a specified delay.

Scheduling messages using ActiveMQ

ActiveMQ is an open source message broker written in Java. It is an implementation of JMS (Java Message Service).

You can start the broker with scheduling support by adding the schedulerSupport attribute to the broker configuration:

<beans ...>
  ...
  <broker xmlns="http://activemq.apache.org/schema/core"
          brokerName="localhost"
          dataDirectory="${activemq.data}"
          schedulerSupport="true">
    ...
  </broker>
  ...
</beans>

Now, if you want to delay delivery of a message by a few seconds, you can add a property during message creation, e.g.:

message.setLongProperty(ScheduledMessage.AMQ_SCHEDULED_DELAY, 8000)

The delay unit is milliseconds.
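For completeness, here is a minimal producer sketch in plain Java/JMS showing where that property is set. The broker URL, queue name and class name are illustrative assumptions, not values from the original example:

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;

import org.apache.activemq.ActiveMQConnectionFactory;
import org.apache.activemq.ScheduledMessage;

public class DelayedMessageProducer {

    public static void main(String[] args) throws Exception {
        // Assumed broker URL and queue name - adjust to your setup
        ConnectionFactory factory = new ActiveMQConnectionFactory("tcp://localhost:61616");
        Connection connection = factory.createConnection();
        connection.start();
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Queue queue = session.createQueue("scheduled.example");
            MessageProducer producer = session.createProducer(queue);

            TextMessage message = session.createTextMessage("Hello, future!");
            // Ask the broker to hold the message back for 8 seconds
            message.setLongProperty(ScheduledMessage.AMQ_SCHEDULED_DELAY, 8000L);

            System.out.println("Send time: " + new java.util.Date());
            producer.send(message);
        } finally {
            connection.close();
        }
    }
}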

Of course, the queue must be persistent.

When you listen on the same queue, you will see that the message is indeed received with an 8-second delay.
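The listening side is plain JMS as well. A minimal consumer sketch (same assumed broker URL and queue name as in the producer sketch above) that would print timestamps like the ones below:

import javax.jms.Connection;
import javax.jms.MessageConsumer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;

import org.apache.activemq.ActiveMQConnectionFactory;

public class DelayedMessageConsumer {

    public static void main(String[] args) throws Exception {
        Connection connection = new ActiveMQConnectionFactory("tcp://localhost:61616").createConnection();
        connection.start();
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Queue queue = session.createQueue("scheduled.example");
            MessageConsumer consumer = session.createConsumer(queue);

            // Blocks until the broker releases the delayed message
            TextMessage message = (TextMessage) consumer.receive();
            System.out.println("Message received at " + new java.util.Date() + ": " + message.getText());
        } finally {
            connection.close();
        }
    }
}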

...
Send time: Tue Dec 01 18:51:23 CET 2015
...
Message received at Tue Dec 01 18:51:31 CET 2015
...

Scheduling messages using RabbitMQ

Scheduling messages is not an ActiveMQ-only feature. It is also available in RabbitMQ.

RabbitMQ is a message broker written in Erlang. It uses the AMQP protocol.

First you have to install the rabbitmq_delayed_message_exchange plugin. It can be done via the command:

rabbitmq-plugins enable --offline rabbitmq_delayed_message_exchange

You have to define an exchange in RabbitMQ that uses this plugin's features. The queue for delayed messages should be bound to this exchange, with the routing key set to the queue name.

channel.exchangeDeclare(exchange, 'x-delayed-message', true, false, ['x-delayed-type': 'direct']);
channel.queueDeclare(queue, true, false, false, null);
channel.queueBind(queue, exchange, queue);

Of course, the queue must be durable.

To test it, just publish a new message with the x-delay header:

channel.basicPublish(exchange,
        queue,
        new AMQP.BasicProperties.Builder().headers('x-delay': 8000).build(),
        "Message: $currentUuid".bytes)

The message will be delayed by 8 seconds:

...
Send time: Tue Dec 01 19:04:18 CET 2015
...
Message received at Tue Dec 01 19:04:26 CET 2015
...

Conclusion

Why create a similar mechanism for handling scheduled tasks on your own, when you can use your message broker and delayed messages to schedule future tasks?

Sources are available here.

Kotlin’s extensions for each class

Extensions in Kotlin are a very powerful mechanism. They allow you to add methods to any existing class. Each instance has (as in Java) the equals, toString and hashCode methods, but Kotlin offers much more.

Example classes


Let's define some simple classes describing a person: a normal class and a data class.

class PersonJaxb {
var firstName: String? = null
var lastName: String? = null
var age: Int? = null
}

data class Person(val firstName: String, val lastName: String, val age: Int)

Normal class extensions


All instances have the extension methods described below.

apply method


I often work with JAXB classes similar to PersonJaxb, which have no all-args constructor, so all fields must be set via setters. Kotlin helps deal with this via the apply method. The target instance is provided as a delegate to the closure, so we can set all field values inside it, and apply returns this. The signature is T.apply(f: T.() -> Unit): T.

@Test
fun applyTest() {
//when
val person = PersonJaxb().apply {
firstName = "John"
lastName = "Smith"
age = 20
}

//then
assertEquals(20, person.age)
assertEquals("John", person.firstName)
assertEquals("Smith", person.lastName)
}

let method


Another extension is the let method, which is similar to the map operation on collections. It has the signature T.let(f: (T) -> R): R. The receiver (this) is passed as a parameter to the given closure/function.

@Test
fun letTest() {
//when
val fullName = Person("John", "Smith", 20).let {
"${it.firstName} ${it.lastName}"
}

//then
assertEquals("John Smith", fullName)
}

run method


The run method looks like a merge of apply and let: this is accessed via a delegate as in apply, but it also returns a value as in let. It has the signature T.run(f: T.() -> R): R.

@Test
fun runTest() {
//when
val fullName = Person("John", "Smith", 20).run {
"$firstName $lastName"
}

//then
assertEquals("John Smith", fullName)
}

to method


Each instance also has the to infix function defined, which is used to create a Pair. Pairs are helpful for creating map entries. It has the signature A.to(that: B): Pair<A, B>.

@Test
fun toTest() {
//when
val pair = Person("John", "Smith", 20) to 5

//then
assertEquals(Person("John", "Smith", 20), pair.first)
assertEquals(5, pair.second)
}

Data class methods


Data class instances also have some other helpful methods (which are not extensions, but are generated for us).

componentX methods


The data class Person has three fields, and a component method is generated for each of them: component1 for firstName, component2 for lastName and component3 for age.

@Test
fun componentsTest() {
//when
val p = Person("John", "Smith", 20)

//then
assertEquals("John", p.component1())
assertEquals("Smith", p.component2())
assertEquals(20, p.component3())
}

Why is it helpful? componentX methods are used in destructuring declarations (similar to the extraction mechanism of Scala case classes), e.g.:

@Test
fun extractingTest() {
//when
val (first, last, age) = Person("John", "Smith", 20)

//then
assertEquals(20, age)
assertEquals("John", first)
assertEquals("Smith", last)
}

copy method


The copy method allows creating a new instance based on the current one, optionally overriding selected fields.

@Test
fun copyTest() {
//when
val person = Person("John", "Smith", 20).copy(lastName = "Kowalski", firstName = "Jan")

//then
assertEquals(Person("Jan", "Kowalski", 20), person)
}

Summary


Kotlin's extensions, available on every instance, are very simple and help to solve many problems. Code written with these extensions is much more readable and concise than the equivalent Java.

Sources are available here.

How to recover after Hibernate’s OptimisticLockException

I've read many articles about optimistic locking and OptimisticLockException itself. The problem is that each of them ends at getting the first exception, with no word on recovery. What to do next? Retry? If so, how? Or drop it? Is there any chance to continue? How? What's more, the documentation says that if you get a Hibernate exception, you're done - it's not recoverable:

An exception thrown by Hibernate means you have to rollback your database transaction and close the Session immediately (this is discussed in more detail later in the chapter). If your Session is bound to the application, you have to stop the application. Rolling back the database transaction does not put your business objects back into the state they were at the start of the transaction. This means that the database state and the business objects will be out of sync. Usually this is not a problem, because exceptions are not recoverable and you will have to start over after rollback anyway.

Here is my attempt at this: repeatable and recoverable.

Business case

Let's say we have a distributed application with two web servers connected to the same database. The applications use optimistic locking to avoid collisions. Customers buy lottery coupons, which are numbered from 1 to 100. Alice, on web server 1, draws two coupons: 11 and 12. At the same moment Bob reserves two coupons on web server 2. It draws 11 and 13 for Bob and tries to write them back to the database, but it fails, since Alice's commit came first. I want the web application server to draw coupons for Bob again and then try to save again until it succeeds.

Solution

For every request, Hibernate associates a separate Session that is flushed at the end of request processing. If you hit an OptimisticLockException, this request Session is polluted and will be rolled back. To avoid this we will create a separate Hibernate Session especially for drawing coupons. If the separate session fails, we drop it and try again in a new one. If it succeeds, we merge it into the main request Session. The request Session cannot be touched during draws. Take a look at the following picture:

In this picture, the yellow and green short-term sessions have failed with OptimisticLockException. The red session was successful and its objects are merged into the main session on the left.

Reservation entity

The key requirement here is to keep the domain you want to lock on as small as possible and not coupled directly to anything else. The best approach is to create a Reservation entity with a few fields, let's say couponId and customerId. For each Coupon create one Reservation row and use the reserved boolean field as the reservation status. For coupon and customer use weak identifiers (long) instead of real entities. This way no object tree will be loaded and Reservation stays decoupled.

import lombok.Getter;
import lombok.Setter;
import lombok.ToString;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Version;

import static org.apache.commons.lang3.Validate.isTrue;
import static org.springframework.util.Assert.isNull;

@Getter
@Setter
@ToString
@Entity
public class Reservation {

    @Id
    private long id;

    // Version column used by Hibernate for optimistic lock checks
    @Version
    private long version;

    @Column(name = "coupon_id")
    private long couponId;

    @Column(name = "customer_id")
    private Long customerId;

    @Column(name = "is_reserved")
    private boolean reserved;

    public void reserve(long reservingCustomerId) {
        isTrue(!reserved);
        isNull(customerId);
        reserved = true;
        customerId = reservingCustomerId;
    }
}

The reserve method has some business assertions to keep it legit. You can also spot Lombok annotations that provide getters, setters and a toString method on the fly.

Reservation Service

ReservationService implements my solution. I've commented some steps for easier reading. Please read the source code:

import lombok.extern.slf4j.Slf4j;
import org.hibernate.*;
import org.hibernate.ejb.HibernateEntityManager;
import org.springframework.orm.hibernate4.HibernateOptimisticLockingFailureException;

import javax.persistence.OptimisticLockException;
import javax.persistence.PersistenceContext;
import java.util.List;

@Slf4j
public class ReservationService {

    @PersistenceContext
    private HibernateEntityManager hibernateEntityManager;

    @SuppressWarnings("unchecked")
    private Iterable<Reservation> reserveOptimistic(long customerId, final int count) throws NoFreeReservationsException {
        log.info("Trying to reserve {} reservations for customer {}", count, customerId);

        //This is the request session that needs to stay clean
        Session currentSession = hibernateEntityManager.getSession();
        Iterable<Reservation> reserved = null;

        do {
            //This is our temporary session to work on
            Session newSession = hibernateEntityManager.getSession().getSessionFactory().openSession();
            newSession.setFlushMode(FlushMode.COMMIT);
            Transaction transaction = newSession.beginTransaction();

            List<Reservation> availableReservations = null;

            try {
                Query query = newSession.createQuery("from Reservation r where r.reserved = false")
                        .setLockMode("r", LockMode.OPTIMISTIC)
                        .setMaxResults(count);

                availableReservations = query.list();

                //There are no available reservations to reserve
                if (availableReservations.isEmpty()) {
                    throw new NoFreeReservationsException();
                }

                for (Reservation available : availableReservations) {
                    available.reserve(customerId);
                    newSession.save(available);
                }

                //Commit can throw an optimistic lock exception if it fails
                transaction.commit();

                //Commit succeeded - this reference is used outside the try-catch-finally block
                reserved = availableReservations;

            } catch (OptimisticLockException | StaleObjectStateException | HibernateOptimisticLockingFailureException e) {
                log.info("Optimistic lock exception occurred for customer {} and count {}: {} {}", customerId, count, e.getClass(), e.getMessage());

                transaction.rollback();

                for (Reservation available : availableReservations) {
                    newSession.evict(available);
                }
            } finally {
                newSession.close();
            }
            //Repeat until we reserve something
        } while (reserved == null);

        log.info("Successfully reserved {} reservations for customer {}", count, customerId);

        //Merge reserved entities into the request session
        for (Reservation reservation : reserved) {
            currentSession.merge(reservation);
        }

        return reserved;
    }
}

This code says it all. It tries to reserve some Reservations in a do-while loop until it succeeds. The main request Session is not polluted, which achieves our goal.

I hope this example helps you in similar cases. It has worked as expected for a few months on our customer's production site and I can recommend this solution.

Clojure web development – state of the art – part 2

This is part 2 of my “Clojure web development” series. You can discuss the first part in this Reddit thread. After reading the comments I must explain two assumptions I had when writing this series:

  • Keep things easy to understand for people from outside Clojure land, especially Java devs. That’s why I use REST/JSON in favor of transit, and component as a “dependency injection” implementation, which can easily be explained as a Spring framework equivalent. The same goes for Om, which is a bit verbose, but in my opinion it’s easier to understand at the start and has wider adoption than the other React wrappers.

  • Keep things easy to bootstrap on a developer machine. This is a hands-on walkthrough and all the individual steps have been committed to GitHub. That’s why I use MongoDB, which may not be the best choice for scaling your application to millions of users, but is perfect for bootstrapping - no schema, no tables, just insert data and start working. I highly recommend Honza Kral’s polyglot persistence talk, in which he encourages starting simple and optimizing for developer happiness at the start of a project.

In the previous post we bootstrapped a basic web application serving REST data with a (static for now) ClojureScript frontend, all fully reloadable thanks to reloaded repl and figwheel. You can find the final working version in this branch.

Today we’re going to display a contact list stored in MongoDB. I assume you have MongoDB installed; if not, it’s trivial with Docker.

Serving contact list from database

OK, let’s start. In the backend we need to add some dependencies to project.clj:

:dependencies
   ...
   [org.danielsz/system "0.1.9"]
   [com.novemberain/monger "2.0.0"]]

monger is an idiomatic Clojure wrapper for the Mongo Java driver, and system is a nice collection of components for various datastores, including Mongo (a bit like Spring Data in the Spring world).

In order to interact with a data store I like to use the concept of an abstract repository. It hides the implementation details from the caller and allows switching to another store in the future. So let’s create an abstract interface (in Clojure - a protocol) in component/repo.clj:

(ns modern-clj-web.component.repo)

(defprotocol ContactRepository
  (find-all [this]))

We need the this parameter to allow the Clojure runtime to dispatch to the correct implementation of this repository. The Mongo implementation with monger is really simple:

(ns modern-clj-web.component.repo
   (:require [monger.collection :as mc]
             [monger.json]))
...

(defrecord ContactRepoComponent [mongo]
  ContactRepository
  (find-all [this]
    (mc/find-maps (:db mongo) "contacts")))

(defn new-contact-repo-component []
  (->ContactRepoComponent {}))

Things to notice here:

  • mc/find-maps just returns all records from the collection as Clojure maps
  • ContactRepoComponent gets injected with the mongo component created by the system library, which adds the Mongo client under the :db key
  • Since this component is stateless, we don’t need to implement the component/Lifecycle interface, but it can still be wired into the system like a typical lifecycle-aware component
  • Requiring monger.json adds JSON serialization support for Mongo types (e.g. ObjectId)

Ok, it’s now time to use our new component in endpoint/example.clj:

(:require
  ...
  [modern-clj-web.component.repo :as r])

(defn example-endpoint [{repo :contact-repo}]
 (routes
...
   (GET "/contacts" [] (response (r/find-all repo)))

The {repo :contact-repo} notation (destructuring) automatically binds the :contact-repo key from the system map to the repo value. So we need to assign our component to that key in system.clj:

(:require
  ...
  [modern-clj-web.component.repo :refer [new-contact-repo-component]]
  [system.components.mongo :refer [new-mongo-db]])

 (-> (component/system-map
          :app  (handler-component (:app config))
          :http (jetty-server (:http config))
          :example (endpoint-component example-endpoint)
          :mongo (new-mongo-db (:mongo-uri config))
          :contact-repo (new-contact-repo-component))
         (component/system-using
          {:http [:app]
           :app  [:example]
           :example [:contact-repo]
           :contact-repo [:mongo]}))))

In short - we use system’s new-mongo-db to create the Mongo component and make it a dependency of the repository, which itself is a dependency of the example endpoint.

And finally we need to configure the :mongo-uri config property in config.clj:

(def environ
  {:http {:port (some-> env :port Integer.)}
   :mongo-uri "mongodb://localhost:27017/contacts"})

To check that it works, restart the REPL, type (go) again and make a GET request to http://localhost:3000/contacts.

curl http://localhost:3000/contacts
[]

OK, so we got an empty list, since we don’t have any data in the Mongo database yet. Let’s add some with the mongo console:

mongo localhost:27017/contacts
MongoDB shell version: 2.4.9
connecting to: localhost:27017/contacts
>  db.contacts.insert({firstname: "Al", lastname: "Pacino"});
>  db.contacts.insert({firstname: "Johnny", lastname: "Depp"});

And finally our endpoint should return these two records:

curl http://localhost:3000/contacts
[{"lastname":"Pacino","firstname":"Al","_id":"56158345fd2dabeddfb18799"},{"lastname":"Depp","firstname":"Johnny","_id":"56158355fd2dabeddfb1879a"}]

Sweet! Again - in case of any problems, check this commit.

Getting contacts from ClojureScript

In this step we’ll fetch the contacts with an AJAX call from our ClojureScript frontend. As usual, we need a few dependencies in project.clj for a start:

:dependencies 
     ...
       [org.clojure/clojurescript "1.7.48"]
       [org.clojure/core.async "0.1.346.0-17112a-alpha"]
       [cljs-http "0.1.37"]

ClojureScript should already be available via figwheel, but it’s always better to require a specific version explicitly. cljs-http is an HTTP client for ClojureScript, and core.async provides facilities for asynchronous communication in the CSP model, which is especially useful in ClojureScript. Let’s see how it works in practice.

To make an AJAX call we need to call methods from cljs-http.client, so let’s add this in core.cljs:

(ns ^:figwheel-always modern-clj-web.core
  (:require [cljs-http.client :as http]))

(println (http/get "/contacts"))

You should see #object[cljs.core.async.impl.channels.ManyToManyChannel]. What madness is that???

This is where core.async enters the picture. The most common way to deal with asynchronous network calls in JavaScript is using callbacks or promises. The core.async way is to use channels. It makes your code look more like a sequence of synchronous calls and it’s easier to reason about. So the http/get function returns a channel on which the result is published when the response arrives. In order to receive that message we need to read from the channel using the <! function. Since this is a blocking operation, we also need to surround the call with the go macro, just like in the Go language. So the correct way of getting contacts looks like this:

(:require
...
  [cljs.core.async :refer [<! >! chan]])

(go
  (let [response (<! (http/get "/contacts"))]
    (println (:body response))))

Adding Om component

Dealing with frontend code without introducing any structure can quickly become a real nightmare. In late 2015 we have basically two major JS frameworks in the field: Angular and React. ClojureScript paradigms (functional programming, immutable data structures) fit really nicely into the React philosophy. In short, a React application is composed of components taking data as input and rendering HTML as output. The output is not the real DOM, but a so-called virtual DOM, which helps calculate the diff from the current view to the updated one.

Among the many React wrappers for ClojureScript I like using Om with om-tools to reduce some verbosity. Let’s introduce it into our project.clj:

:dependencies 
     ...
   [org.omcljs/om "0.9.0"]
   [prismatic/om-tools "0.3.12"]

To render a “hello world” component we need to add some code in core.cljs:

(:require 
   ...
            [om.core :as om]
            [om-tools.core :refer-macros [defcomponent]]
            [om-tools.dom :as dom :include-macros true]))

(def app-state (atom {:message "hello from om"}))

(defcomponent app [data owner]
  (render [_]
    (dom/div (:message data))))

(om/root app app-state
         {:target (.getElementById js/document "main")})

What’s going on here? The main concept of Om is keeping the whole application state in one global atom, which is the Clojure way of managing state. So we pass this app-state map (wrapped in an atom) as a parameter to om/root, which mounts the component into the real DOM (the <div id="main"/> from index.html). The app component just displays the :message value, so you should see “hello from om” rendered. If you have figwheel running, you can change the message value and it should be updated instantly.

And finally let’s render our contacts with Om:

(defn get-contacts []
  (go
    (let [response (<! (http/get "/contacts"))]
      (:body response))))

(defcomponent contact-comp [contact _]
  (render [_]
    (dom/li (str (:firstname contact) " " (:lastname contact)))))

(defcomponent app [data _]
  (will-mount [_]
    (go
      (let [contacts (<! (get-contacts))]
        (om/update! data :contacts contacts))))
  (render [_]
    (dom/div
      (dom/h2 (:message data))
      (dom/ul
        (om/build-all contact-comp (:contacts data))))))

So contact-comp just renders a single contact. We use om/build-all to render all contacts in the :contacts field of the global state. And the trickiest part - we use the will-mount lifecycle method to fetch contacts from the server when the app component is about to be mounted into the DOM.

Again, this commit contains a working version in case of any problems.

And if you liked Om, I highly recommend the official tutorials and the Zero to Om series.

Getting started with Haskell, stack and spacemacs

It has been a very long time since my last blog post. During this period I have become a big enthusiast of functional programming, especially using the Haskell language. In this and the next posts I am going to show that Haskell can be very pleasant to use, and that with proper tools we are able to develop applications without unnecessary burden.

Recently, many useful tools and editors have emerged, and they are really easy and convenient to use. In this post I intend to present the toolchain that I am using in my everyday Haskell programming.

This post is not an introduction to the Haskell language. It is meant to describe how to set up Haskell with the stack build tool and spacemacs as an editor. I am also planning to write about Haskell basics and their usage in my little project in the next posts of this series.

The only necessary prerequisite is having the most recent version of Emacs installed on your system.

New project build/management tool - stack

Managing dependencies and the build process is always a gruesome task, and there are many tools to ease this work. In Haskell the most popular dependency management tool is cabal. It is based on the Hackage repository (https://hackage.haskell.org).

One of the desired features of a build tool is reproducible builds. We would like to build a project in a new environment or on a new developer's machine and have the same outcome in every situation. This requires the same compiler version, the same libraries, etc.

Lately, a new tool came out - stack (https://github.com/commercialhaskell/stack). It is aimed at reproducible builds and simple project management. Stack takes care of the proper configuration of your project environment.

Stack achieves reproducible builds by using curated package snapshots managed by special versioned resolvers. It uses cabal as the package manager underneath. Packages are grouped into resolvers. There are two types of resolvers: LTS (long term support) and nightly. The latter contains packages in their freshest versions, but there is also a drawback - potential instability. On the other hand, LTS resolvers contain fixed versions of packages which are tested and should not cause any problems. If you do not need the latest package versions, an LTS resolver should entirely satisfy your project's needs.

What is more, stack can also download and set up locally the Haskell compiler in the version required by your project.

stack in action

Using stack to create a new project is really easy. After installing it on our machine (installation instructions are included in the documentation on the project's GitHub page linked above), all we need to do to create a new project is execute the steps below in our terminal:
stack new hello-haskell
cd hello-haskell
stack setup
stack build
stack exec hello-haskell
These commands create a new project named hello-haskell. stack setup initialises the environment, installing the compiler (if required) and the libraries necessary for the project. stack build compiles and builds the project. stack exec … runs the executable program built earlier.

If you would like to play around with your project's code, type stack ghci in your terminal - this will launch the Haskell interactive console, ghci, in the version specified in the project configuration.

Another stack command worth mentioning is stack test, which executes the test suites declared in the test/ directory.

Dependencies and project settings are placed in the hello-haskell.cabal file. It is a standard cabal configuration file where we can add desired libraries, set the project version, licence, link to the repository and so on. I suggest reading some cabal documentation if you have any doubts, but in my opinion this file is very easy and straightforward to edit.

Settings specific to stack are placed in the stack.yaml file. The most important option is resolver, which determines the version of the GHC compiler and the libraries your project will be using.

There is one thing you might encounter while setting up project dependencies: what if you need a library that is not present in any of the stack resolvers? Well, then we must go to the stack.yaml file and edit or add the section:
extra-deps:
- Vec-1.0.5
With this information stack will download and build the desired package from the Hackage repositories. In my case I needed the Vec library, so I added it to the list using its full name including the version number.

All details and gotchas are described in stack's wiki on GitHub. Be sure to check it frequently, as stack is still a very young tool and can change quite often. Documentation is a very strong point of stack, as it describes many aspects of its usage very well.

Powerful editor in new edition - spacemacs

I have spent a lot of time searching for an editor that is easy to use with Haskell and that integrates well with its tools, like the REPL. I've been working with Sublime Text for some time, as it integrates quite well with Haskell when using the SublimeHaskell package. However, I've recently discovered the spacemacs project.

spacemacs (https://github.com/syl20bnr/spacemacs) is an easy-to-use kit for Emacs focused on ergonomics. What is great about it is that it embraces Emacs's Evil mode, which mimics Vim-style editing and document navigation. With this feature spacemacs is really straightforward for users who know Vim. It is also possible to mix the Vim and Emacs styles at the same time.

In my opinion, this is a really great feature, as we can use the editor in whichever way we prefer or find more convenient. Whether we are Vim lovers or Emacs fans, or want to mix both, spacemacs lets us work in whatever style we like. I personally use mostly the Vim-like mode, with only a few of the original Emacs commands and with spacemacs shortcuts for many actions.

spacemacs is based on layers which add functionality to the editor. They can enrich our development environment with syntax checking, git integration, code completion and integration with build tools for many languages.

One of these layers is the haskell layer. It supports the language quite well, with syntax checking, code suggestions, a built-in REPL and code templates for common patterns.

I refer you to the official documentation for detailed installation instructions for various platforms. Once we are ready and spacemacs is on our disk, we can proceed.

The entire spacemacs configuration is placed in the .spacemacs file in your home directory. This file is written in a Lisp-like language and contains many options to change or add. Here is my current .spacemacs file, on which this section of the post is based:
https://gist.github.com/rafalnowak/202aba0ee7986515345b

In dotspacemacs-configuration-layers we need to add the haskell layer (I also recommend adding the auto-completion and syntax-checking layers). In order for the layer to work properly, we need to install some additional packages:
stack install stylish-haskell hlint hasktags
The next step is adding these two settings to .spacemacs, just after the text ;; User initialization goes here:
(add-hook 'haskell-mode-hook 'turn-on-haskell-indentation)
(add-to-list 'exec-path "~/.local/bin/")
This makes spacemacs aware of the Haskell indentation style and adds the binaries installed by stack to the path. This is important, as we want our editor to be able to run the Haskell tools.

The full description, as well as platform-specific problems, is listed in the Haskell layer documentation: https://github.com/syl20bnr/spacemacs/tree/master/layers/%2Blang/haskell. There is also a list of useful shortcuts used by this layer.

One essential note: if you wish to use spacemacs with ghc-mod integration, you will need to install ghc-mod in at least version 5.4.0.0. Previous versions do not work properly with the Haskell layer and stack. To install this version of ghc-mod you must add cabal-helper-0.6.1.0 to the extra-deps section in your stack.yaml and run
stack install ghc-mod
which should now proceed without problems.

After this configuration we are ready to use the full power of Haskell and stack in our projects, with solid support from the editor. If you have followed the steps above, you will see that spacemacs is colouring Haskell syntax, checking its correctness and giving you code completion tips. There is also an interactive Haskell console available under the SPC m s s key combination, which makes quick testing of new functions possible.

Unfortunately, spacemacs has some disadvantages. For me, the biggest drawback is its responsiveness. Sometimes, during code completion or syntax checking, it can hang the application for a second or less.

Summary

As we can see, Haskell with stack and spacemacs is really powerful yet still simple to use. With stack we get reproducible builds with specific compiler and library versions, as well as easy project management. spacemacs allows us to write code quickly, with support for Haskell syntax, build tools and code completion.

In my next post I am going to describe my experiences with my first bigger Haskell project - a functional ray tracer I have been working on recently: https://github.com/rafalnowak/RaytracaH

Kotlin, Callable and ExecutorService

I've recently written about using Callable in Groovy, but what does it look like in Kotlin?

Callable instance as a separate class

First, let's create a class which implements the Callable interface:
class MyJob : Callable<Int> {
    override fun call() = 42
}
Now we can pass an instance of this class to the executorService:
fun callableAsClassInstance() =
    executorService.submit(MyJob()).get()
When we run it, as expected, it returns 42 (the tests are written in Groovy, not in Kotlin).
def 'should submit callable as class instance from kotlin'() {
    expect:
        callableExample.callableAsClassInstance() == 42
}

Casting to Callable

If we create a map from the string "call" to a closure, or just a closure, and try to cast it to Callable, we always get a ClassCastException:
fun callableAsMap() =
    executorService.submit(mapOf("call" to { 42 }) as Callable<Int>).get()

fun callableAsClosure() =
    executorService.submit({ 42 } as Callable<Int>).get()
def 'should submit callable as closure from kotlin'() {
    when:
        callableExample.callableAsClosure() == 42
    then:
        thrown(ClassCastException)
}

def 'should submit callable as map from kotlin'() {
    when:
        callableExample.callableAsMap() == 42
    then:
        thrown(ClassCastException)
}
It does not work as it does in Groovy...

Pass instance method

So maybe passing an instance method which produces a value will work?
private fun callMe() = 42

fun callableAsPassedFunction(): Int? {
    return executorService.submit(::callMe).get()
}
It does not even compile. Why? The compiler complains that there is no submit method which can be called with such an argument... But, interestingly, if we create a value to which we assign a closure and pass it to the submit method, then everything is fine and get returns 42.
private val callMe = { 42 }

fun callableAsPassedLocalFunction(): Int? {
    return executorService.submit(callMe).get()
}

Inline implementation

Of course, there is also an option to create Callable inline:
fun callableAsInlineImplementation() =
    executorService.submit(Callable<Int> { 42 }).get()
And this is IMHO the best and most comfortable way to create a Callable in Kotlin, because its syntax is much nicer than in Java or Groovy. Source code is available here.

Easy configuration usage with ConfigSlurper

What's the problem?

We have to deal with properties in almost every project that we write. The Properties class, which we use in these cases, is just a mapping from key to value. Sometimes that is fine, but in many cases the properties look like a tree. An example properties file is shown below:
systemName=test
endpoint.first.protocol=http
endpoint.first.address=localhost
endpoint.first.port=8080
endpoint.first.path=test
endpoint.second.protocol=ftp
endpoint.second.address=localhost
endpoint.second.port=21
endpoint.second.user=admin
endpoint.second.password=pass
Here we have simple properties like systemName, a complex endpoints definition (all properties which start with endpoint) and single endpoint definitions (each endpoint's properties start with endpoint.<ENDPOINT_NAME>). How simple would it be to treat these properties like a tree and just extract a subset of them? The answer is to use ConfigSlurper.

ConfigSlurper from properties

To use ConfigSlurper, just parse a Properties object:
def 'should import configuration from properties'() {
    given:
        Properties p = new Properties()
        p.load(ConfigSlurperTest.getResourceAsStream('/configuration.properties'))
    expect:
        new ConfigSlurper().parse(p).systemName as String == 'test'
}
The parse method returns a ConfigObject, which is just a very clever Map. Now you can get a property using dot notation:
def 'should get nested property'() {
    expect:
        fromProperties.endpoint.first.protocol == 'http'
}
But there is a catch. If you use ConfigObject then you cannot use it like normal Properties and get a property with dots in its name:
def 'should not used nested property as one string'() {
    expect:
        fromProperties.'endpoint.first.protocol' != 'http'
}
ConfigObject allows you to extract a subtree as Properties:
def 'should get first endpoint info from properties'() {
    expect:
        fromProperties.endpoint.first.toProperties() == [
            protocol: 'http',
            address : 'localhost',
            port    : '8080',
            path    : 'test'
        ]
}
and even:
def 'should allow for nested property as one string when toProperties called'() {
    expect:
        fromProperties.endpoint.toProperties()['first.protocol'] == 'http'
}
If you want to know how many endpoints you have and what they are named, you can use the keySet method:
def 'should get list of endpoints'() {
    expect:
        fromProperties.endpoint.keySet() == ['first', 'second'] as Set
}
ConfigSlurper does not return null even if a property is not found, so you can get nested properties without fear:
def 'should not throw exception when missing property'() {
    expect:
        fromProperties.endpoint.third.port.toProperties() == [:] as Properties
}
You only have to be careful when a property is named like the beginning of another property:
def 'should throw exception when asking for too nested property'() {
    when:
        fromProperties.endpoint.first.port.test
    then:
        thrown(MissingPropertyException)
}
fromProperties.endpoint.first.port returns a String, which does not have a test property. You can also print the properties from a ConfigObject:
println fromProperties.prettyPrint()
The output looks like this:
endpoint {
    first {
        path='test'
        port='8080'
        protocol='http'
        address='localhost'
    }
    second {
        password='pass'
        protocol='ftp'
        address='localhost'
        port='21'
        user='admin'
    }
}
systemName='test'
Hmm... It looks like a DSL. Why not keep your configuration in this form?

ConfigSlurper from script

Your configuration can be a Groovy script:
systemName = 'test'
endpoint {
    first {
        path = 'test'
        port = 8080
        protocol = 'http'
        address = 'localhost'
    }
    second {
        password = 'pass'
        protocol = 'ftp'
        address = 'localhost'
        port = 21
        user = 'admin'
    }
}
test.key = ['really': 'nested?'] as Properties
You can pass such a configuration as a resource URL or as file content:
def 'should get config from script as url'() {
    given:
        ConfigObject config = new ConfigSlurper().parse(ConfigSlurperTest.getResource('/configuration.groovy'))
    expect:
        config.systemName == 'test'
}

def 'should get config from script as string'() {
    given:
        ConfigObject config = new ConfigSlurper().parse(ConfigSlurperTest.getResource('/configuration.groovy').text)
    expect:
        config.systemName == 'test'
}
What is interesting, your properties do not have to be strings. They can be any object: String, long, int, etc.
def 'should get nested properties from script as int'() {
    expect:
        fromScript.endpoint.first.port == 8080
}

def 'should get really nested properties from script and continue digging'() {
    expect:
        fromScript.test.key.really == 'nested?'
}

Conclusion

You can deal with properties as a simple Map, but why do that if you can instead use them as a tree of properties? Sources are available here.

Clojure web development – state of the art

It’s now more than a year since I started getting familiar with Clojure, and the more I dive into it, the more it becomes the language for me. Once you defeat the “parentheses fear”, everything else just makes the difference: tooling, community, good engineering practices. So it’s now time for me to convince others. In this post I’ll walk through a simple web application from scratch to show the key tools and libraries used to develop with Clojure in late 2015.

Note for Clojurians: This material is rather elementary and may be useful for you if you already know Clojure a bit but have never done anything bigger than a hello world application.

Note for Java developers: This material shows how to replace Spring, Angular, grunt, live-reload with a bunch of Clojure tools and libraries and a bit of code.

The repo with final code and individual steps is here.

Bootstrap

I think everyone agrees that component is the industry standard for managing the lifecycle of Clojure applications. If you are a Java developer, you may think of it as a Spring (DI) replacement - you declare dependencies between “components”, which are resolved on “system” startup. So you just say “my component needs a repository/database pool” and the component library “injects” it for you.

To keep things simple I like to start with the duct web app template. It’s a nice starter component application following the 12-factor philosophy. So let’s start with it:

lein new duct clojure-web-app +example

The +example parameter tells duct to create an example endpoint with HTTP routes - this will be helpful. To finish bootstrapping, run lein setup inside the clojure-web-app directory.

Ok, let’s dive into the code. Component and injection related code lives in the system.clj file:

(defn new-system [config]
  (let [config (meta-merge base-config config)]
    (-> (component/system-map
         :app  (handler-component (:app config))
         :http (jetty-server (:http config))
         :example (endpoint-component example-endpoint))
        (component/system-using
         {:http [:app]
          :app  [:example]
          :example []}))))

In the first section you instantiate components without dependencies, which are then resolved in the second section. So in this example, the “http” component (server) requires “app” (the application abstraction), which in turn is injected with “example” (the actual routes). If your component needs others, you can just get them by name (precisely: by Clojure keyword).

To start the system you must fire up a REPL - an interactive environment running within the context of your application:

lein repl

After seeing the prompt, type (go). The application should start and you can visit http://localhost:3000 to see an example page.

A huge benefit of the component approach is that you get a fully reloadable application. When you change literally anything - configuration, endpoints, implementation - you can just type (reset) in the REPL and your application is up to date with the code. It’s a feature of the language; no JRebel or spring-loaded needed.

Adding REST endpoint

Ok, in the next step let’s add a basic REST endpoint returning JSON. We need to add two dependencies in the project.clj file:

:dependencies
 ...
  [ring/ring-json "0.3.1"]
  [cheshire "5.1.1"]

ring-json adds JSON support to your routes (in Ring it’s called middleware) and cheshire is a Clojure JSON parser (like Jackson in Java). Modifying project dependencies is one of the few tasks that require restarting the REPL, so hit CTRL-C and type lein repl again.

To configure JSON middleware we have to add wrap-json-body and wrap-json-response just before wrap-defaults in system.clj:

(:require 
 ...
 [ring.middleware.json :refer [wrap-json-body wrap-json-response]])

(def base-config
   {:app {:middleware [[wrap-not-found :not-found]
                      [wrap-json-body {:keywords? true}]
                      [wrap-json-response]
                      [wrap-defaults :defaults]]

And finally, in endpoint/example.clj we must add some route with JSON response:

(:require 
 ...
 [ring.util.response :refer [response]]))

(defn example-endpoint [config]
  (routes
    (GET "/hello" [] (response {:hello "world"}))
    ...

Reload the app with (reset) in the REPL and test the new route with curl:

curl -v http://localhost:3000/hello

< HTTP/1.1 200 OK
< Date: Tue, 15 Sep 2015 21:17:37 GMT
< Content-Type: application/json; charset=utf-8
< Set-Cookie: ring-session=37c337fb-6bbc-4e65-a060-1997718d03e0;Path=/;HttpOnly
< X-XSS-Protection: 1; mode=block
< X-Frame-Options: SAMEORIGIN
< X-Content-Type-Options: nosniff
< Content-Length: 151
* Server Jetty(9.2.10.v20150310) is not blacklisted
< Server: Jetty(9.2.10.v20150310)
<
* Connection #0 to host localhost left intact
{"hello": "world"}

It works! In case of any problems you can find a working version in this commit.

Adding frontend with figwheel

Coding the backend in Clojure is great, but what about the frontend? As you may already know, Clojure can be compiled not only to JVM bytecode, but also to JavaScript. This may sound familiar if you have used e.g. CoffeeScript. But the ClojureScript philosophy is not only to provide some syntactic sugar, but to improve your development cycle with great tooling and fully interactive development. Let’s see how to achieve it.

The best way to introduce ClojureScript to a project is figwheel. First let’s add the figwheel plugin and configuration to project.clj:

:plugins
   ...
   [lein-figwheel "0.3.9"]

And cljsbuild configuration:

:cljsbuild
    {:builds [{:id "dev"
               :source-paths ["src-cljs"]
               :figwheel true
               :compiler {:main       "clojure-web-app.core"
                          :asset-path "js/out"
                          :output-to  "resources/public/js/clojure-web-app.js"
                          :output-dir "resources/public/js/out"}}]}

In short, this tells the ClojureScript compiler to take sources from src-cljs, with figwheel support, and put the resulting JavaScript into the resources/public/js/clojure-web-app.js file. So we need to include this file in a simple HTML page:

<!DOCTYPE html>
<html>
<head>
</head>
<body>
  <div id="main">
  </div>
  <script src="js/clojure-web-app.js" type="text/javascript"></script>
</body>
</html>

To serve this static file we need to change some defaults and add a corresponding route. In system.clj change api-defaults to site-defaults, both in the require section and in the base-config function. In example.clj add the following route:

(GET "/" [] (io/resource "public/index.html"))

Again, (reset) in the REPL window should reload everything.

But where is our ClojureScript source file? Let’s create the file core.cljs in the src-cljs/clojure-web-app directory:

(ns ^:figwheel-always clojure-web-app.core)

(enable-console-print!)

(println "hello from clojurescript")

Open another terminal and run lein figwheel. It should compile the ClojureScript and print ‘Prompt will show when figwheel connects to your application’. Open http://localhost:3000. The figwheel window should prompt:

To quit, type: :cljs/quit
cljs.user=>

Type (js/alert "hello"). Boom! If everything worked you should see an alert in your browser. Open the developer console in your browser: you should see hello from clojurescript printed there. Change it in core.cljs to (println "figwheel rocks") and save the file. Without reloading the page you should see the updated message. Figwheel rocks! Again, in case of any problems, refer to this commit.

UPDATE: In the latest duct release there is a +cljs option which makes it possible to use the reloaded repl and figwheel in a single REPL. Highly recommended!

In the next post I’ll show how to fetch data from MongoDB, serve it over REST to the browser and write ReactJS/Om components to render it. Stay tuned!