Integration testing custom validation constraints in Jersey 2


I recently joined a team trying to break a monolithic legacy system up into a set of RESTful services in Java. They decided to use the latest 2.x version of Jersey as the REST container, which was not my first choice, since I'm not a big fan of the JSR-* specs. But now I must admit that JAX-RS 2.x does things right: it requires almost zero boilerplate code, supports auto-discovery of features, and prefers convention over configuration like other modern frameworks. Since the spec is still young, it's hard to find good tutorials and kick-off projects with some working code. I created the jersey2-starter project on GitHub, which can be used as a starting point for your own production-ready RESTful service. In this post I'd like to cover how to implement and integration-test your own validation constraints for REST resources.

Custom constraints

One of the things that bothers me when coding REST in Java is littering your class model with annotations. Suppose you want to build a simple Todo list REST service; using Jackson, validation, and Spring Data, you can easily end up with this as your entity class:

@Document
public class Todo {
    private Long id;
    @NotNull
    private String description;
    @NotNull
    private Boolean completed;
    @NotNull
    private DateTime dueDate;

    @JsonCreator
    public Todo(@JsonProperty("description") String description, @JsonProperty("dueDate") DateTime dueDate) {
        this.description = description;
        this.dueDate = dueDate;
        this.completed = false;
    }
    // getters and setters
}

Your domain model is now effectively blurred by messy annotations almost everywhere. Let's see what we can do with the validation constraints (the @NotNulls). Some may say you could introduce a DTO layer with its own validation rules, but to me that conflicts with pure REST API design, which says you operate on resources that should map to your domain classes. On the other hand, what does it mean for a Todo object to be valid? When you create a Todo you should provide a description and a due date, but what about when you're updating? You should be able to change any of the description, the due date (postponing) or the completion flag (marking as done), but you should provide at least one of these for the modification to be valid. So my idea is to introduce custom validation constraints, different ones for creation and modification:

@Target({TYPE, PARAMETER})
@Retention(RUNTIME)
@Constraint(validatedBy = ValidForCreation.Validator.class)
public @interface ValidForCreation {
    //...
    class Validator implements ConstraintValidator<ValidForCreation, Todo> {
    //...
        @Override
        public boolean isValid(Todo todo, ConstraintValidatorContext constraintValidatorContext) {
            return todo != null
                && todo.getId() == null
                && todo.getDescription() != null
                && todo.getDueDate() != null;
        }
    }
}

@Target({TYPE, PARAMETER})
@Retention(RUNTIME)
@Constraint(validatedBy = ValidForModification.Validator.class)
public @interface ValidForModification {
    //...
    class Validator implements ConstraintValidator<ValidForModification, Todo> {
    //...
        @Override
        public boolean isValid(Todo todo, ConstraintValidatorContext constraintValidatorContext) {
            return todo != null
                && todo.getId() == null
                && (todo.getDescription() != null || todo.getDueDate() != null || todo.isCompleted() != null);
        }
    }
}
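
The //... placeholders above stand for the attributes that the Bean Validation spec requires every constraint annotation to declare (Payload here is javax.validation.Payload). For completeness, here is a minimal sketch of what the elided part looks like; the message key is just an example:

// required by the Bean Validation spec on every constraint annotation
String message() default "{ValidForCreation.message}";
Class<?>[] groups() default {};
Class<? extends Payload>[] payload() default {};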

Now you can move the validation annotations to the definition of a REST endpoint:

@POST
@Consumes(APPLICATION_JSON)
public Response create(@ValidForCreation Todo todo) {...}

@PUT
@Consumes(APPLICATION_JSON)
public Response update(@ValidForModification Todo todo) {...}

And now you can remove those @NotNulls from your model. Note that for the constraints to be enforced, Jersey's bean validation module (jersey-bean-validation) has to be on the classpath; with auto-discovery enabled, Jersey registers it automatically and responds with a 400 Bad Request when validation of a request entity fails.

Integration testing

There are in general two approaches to integration testing:

  • the tests run in a separate JVM from the app, which is deployed to some other integration environment,
  • the tests deploy the application programmatically in the setup block.

Both of these have their pros and cons, but for small enough services I personally prefer the second approach. It's much easier to set up and you have only one JVM running, which makes debugging really easy. You can use a generic framework like Arquillian to start your application in a container environment, but I prefer simple solutions and just use embedded Jetty. To make the test setup 100% production-equivalent, I create a full Jetty WebAppContext, and I have to resolve all runtime dependencies for Jersey auto-discovery to work. This can be achieved quite simply with the Maven resolver from ShrinkWrap, an Arquillian subproject:

    // 'server' is an org.eclipse.jetty.server.Server instance created earlier
    // (see the test-harness sketch below)
    WebAppContext webAppContext = new WebAppContext();
    webAppContext.setResourceBase("src/main/webapp");
    webAppContext.setContextPath("/");

    // resolve all compile- and runtime-scoped dependencies from the project's pom.xml
    File[] mavenLibs = Maven.resolver().loadPomFromFile("pom.xml")
                .importCompileAndRuntimeDependencies()
                .resolve().withTransitivity().asFile();

    // make the resolved jars visible to the webapp as WEB-INF/lib entries
    for (File file : mavenLibs) {
        webAppContext.getMetaData().addWebInfJar(new FileResource(file.toURI()));
    }
    // expose the project's own classes for annotation scanning
    webAppContext.getMetaData().addContainerResource(new FileResource(new File("./target/classes").toURI()));

    webAppContext.setConfigurations(new Configuration[] {
        new AnnotationConfiguration(),   // scans for Servlet 3.0 annotations
        new WebXmlConfiguration(),
        new WebInfConfiguration()
    });
    server.setHandler(webAppContext);

(this Stack Overflow thread inspired me a lot here)
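
For context, here is a minimal sketch of how the snippet above can be wrapped in a JUnit lifecycle, together with the createTarget() helper used in the tests below. The class name, port, and /todos path are my assumptions for illustration, not necessarily what the real project uses:

import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.client.WebTarget;
import org.eclipse.jetty.server.Server;
import org.junit.AfterClass;
import org.junit.BeforeClass;

public class TodoResourceIT {

    // assumed port; pick any free one
    private static final Server server = new Server(8080);

    @BeforeClass
    public static void startServer() throws Exception {
        // ... the WebAppContext setup shown above goes here ...
        server.start();
    }

    @AfterClass
    public static void stopServer() throws Exception {
        server.stop();
    }

    // JAX-RS 2 client pointing at the resource under test ('/todos' is assumed)
    private WebTarget createTarget() {
        return ClientBuilder.newClient().target("http://localhost:8080/todos");
    }
}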

Now it's time for the last part of the post: parameterizing our integration tests. Since we want to test validation constraints, there are many edge cases to check (and to bring your code coverage close to 100%). Writing one test method per case would be a bad idea. Among the many solutions for JUnit, I'm most convinced by JUnitParams from the Pragmatists team. It's really simple and has a nice jQuery-like helper for creating providers. Here is my test code (I'm also using the builder pattern here to create various kinds of Todos):

@Test
@Parameters(method = "provideInvalidTodosForCreation")
public void shouldRejectInvalidTodoWhenCreate(Todo todo) {
    Response response = createTarget().request().post(Entity.json(todo));

    assertThat(response.getStatus()).isEqualTo(BAD_REQUEST.getStatusCode());
}

private static Object[] provideInvalidTodosForCreation() {
    return $(
        new TodoBuilder().withDescription("test").build(),
        new TodoBuilder().withDueDate(DateTime.now()).build(),
        new TodoBuilder().withId(123L).build(),
        new TodoBuilder().build()
    );
}
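
One thing the snippet doesn't show: the test class has to run with the JUnitParams runner, and the $(...) helper is a static import from it (TodoBuilder is just a hand-rolled builder for Todo). Assuming the test class is called TodoResourceIT, as in the sketch above:

import static junitparams.JUnitParamsRunner.$;

import junitparams.JUnitParamsRunner;
import junitparams.Parameters;
import org.junit.runner.RunWith;

@RunWith(JUnitParamsRunner.class)
public class TodoResourceIT {
    // ... tests and providers as above ...
}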

OK, enough reading; feel free to clone the project and start writing your own REST services!
