Testing Kotlin with Spock Part 2 – Enum with instance method

A Kotlin enum class with an instance method is quite similar to its Java counterpart, but the two look a bit different in the bytecode. Let’s see the difference by writing some tests using Spock.

What do we want to test?

Let’s see the code that we want to test:

enum class EnumWithInstanceMethod {
    PLUS {
        override fun sign(): String = "+"
    },
    MINUS {
        override fun sign(): String = "-"
    };

    abstract fun sign(): String
}

Obviously, it could be written in a more compact way (e.g. using an instance field instead of the abstract method), but this example shows the case we want to test in the simplest way.

How to test it with Spock?

The simplest test (that does not work)

First, we can write the test like we would do it with a Java enum:

def "should use enum method like in java"() {
    expect:
        EnumWithInstanceMethod.MINUS.sign() == '-'
}

The test fails:

Condition failed with Exception:

EnumWithInstanceMethod.MINUS.sign() == '-'
                             |
                             groovy.lang.MissingMethodException: No signature of method: static com.github.alien11689.testingkotlinwithspock.EnumWithInstanceMethod$MINUS.sign() is applicable for argument types: () values: []
                             Possible solutions: sign(), sign(), is(java.lang.Object), find(), with(groovy.lang.Closure), find(groovy.lang.Closure)


    at com.github.alien11689.testingkotlinwithspock.EnumWithInstanceMethodTest.should use enum method like in java(EnumWithInstanceMethodTest.groovy:11)
Caused by: groovy.lang.MissingMethodException: No signature of method: static com.github.alien11689.testingkotlinwithspock.EnumWithInstanceMethod$MINUS.sign() is applicable for argument types: () values: []
Possible solutions: sign(), sign(), is(java.lang.Object), find(), with(groovy.lang.Closure), find(groovy.lang.Closure)
    ... 1 more

Interesting… Why is Groovy telling us that we are trying to call a static method? Maybe we are not using the enum instance but something else? Let’s create a test where we pass the enum instance to a method:

static String consume(EnumWithInstanceMethod e) {
    return e.sign()
}

def "should pass enum as parameter"() {
    expect:
        consume(EnumWithInstanceMethod.MINUS) == '-'
}

Error message:

Condition failed with Exception:

consume(EnumWithInstanceMethod.MINUS) == '-'
|
groovy.lang.MissingMethodException: No signature of method: static com.github.alien11689.testingkotlinwithspock.EnumWithInstanceMethodTest.consume() is applicable for argument types: (java.lang.Class) values: [class com.github.alien11689.testingkotlinwithspock.EnumWithInstanceMethod$MINUS]
Possible solutions: consume(com.github.alien11689.testingkotlinwithspock.EnumWithInstanceMethod)


    at com.github.alien11689.testingkotlinwithspock.EnumWithInstanceMethodTest.should pass enum as parameter(EnumWithInstanceMethodTest.groovy:29)
Caused by: groovy.lang.MissingMethodException: No signature of method: static com.github.alien11689.testingkotlinwithspock.EnumWithInstanceMethodTest.consume() is applicable for argument types: (java.lang.Class) values: [class com.github.alien11689.testingkotlinwithspock.EnumWithInstanceMethod$MINUS]
Possible solutions: consume(com.github.alien11689.testingkotlinwithspock.EnumWithInstanceMethod)
    ... 1 more

Now we see that we passed the class com.github.alien11689.testingkotlinwithspock.EnumWithInstanceMethod$MINUS, not the enum instance.
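
To confirm that the helper itself is fine and only the MINUS reference resolves unexpectedly, we can hand it an instance obtained another way (a quick sketch, which also hints at the workaround described later):

def "should pass enum instance obtained via valueOf"() {
    expect:
        // valueOf returns the real enum instance, so consume() works as expected
        consume(EnumWithInstanceMethod.valueOf('MINUS')) == '-'
}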

But it works in Java…

Analogous code in JUnit works perfectly and the test passes:

@Test
public void shouldReturnSign() {
    assertEquals("-", EnumWithInstanceMethod.MINUS.sign());
}

Java can access Kotlin’s instance method without problems, so maybe something is wrong with Groovy…

But a Java enum with an instance method, e.g.

public enum EnumWithInstanceMethodInJava {
    PLUS {
        public String sign() {
            return "+";
        }
    },
    MINUS {
        public String sign() {
            return "-";
        }
    };

    public abstract String sign();
}

works correctly in the Spock test:

def "should use enum method"() {
    expect:
        EnumWithInstanceMethodInJava.MINUS.sign() == '-'
}

What’s the difference?

We can spot the difference just by looking at the compiled classes:

$ tree build/classes/main/
build/classes/main/
└── com
    └── github
        └── alien11689
            └── testingkotlinwithspock
                ├── AdultValidator.class
                ├── EnumWithInstanceMethod.class
                ├── EnumWithInstanceMethodInJava$1.class
                ├── EnumWithInstanceMethodInJava$2.class
                ├── EnumWithInstanceMethodInJava.class
                ├── EnumWithInstanceMethod$MINUS.class
                ├── EnumWithInstanceMethod$PLUS.class
                ├── Error.class
                ├── Ok.class
                ├── ValidationStatus.class
                └── Validator.class

Java generates anonymous classes (EnumWithInstanceMethodInJava$1 and EnumWithInstanceMethodInJava$2) for the enum instances, but Kotlin names those classes after the enum constants (EnumWithInstanceMethod$MINUS and EnumWithInstanceMethod$PLUS).

How does this tie into the problem with Groovy? Groovy does not require .class when referring to a class in code, so when we access EnumWithInstanceMethod.MINUS, Groovy resolves it to the nested class EnumWithInstanceMethod$MINUS (as if we had written EnumWithInstanceMethod.MINUS.class), not to the enum instance. The same problem does not occur with the Java enum, since there is no EnumWithInstanceMethodInJava$MINUS class.
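
If you want to see this resolution directly, a condition like the following should hold (a small sketch, not part of the original test class):

def "bare MINUS reference is a Class, not an enum instance"() {
    expect:
        // the bare reference resolves to the nested class...
        EnumWithInstanceMethod.MINUS instanceof Class
        // ...and not to the enum instance we actually wanted
        !(EnumWithInstanceMethod.MINUS instanceof EnumWithInstanceMethod)
}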

Solution

Knowing the difference, we can solve the problem of accessing Kotlin’s enum instance in our Groovy code.

The first solution is to access the enum instance with the valueOf method:

def "should use enum method working"() {
    expect:
        EnumWithInstanceMethod.valueOf('MINUS').sign() == '-'
}

The second way is to tell Groovy explicitly that we want to access the static field that holds the enum instance:

def "should use enum method"() {
    expect:
        EnumWithInstanceMethod.@MINUS.sign() == '-'
}

You can choose either solution, depending on the style of your code and your preferences.
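
Whichever workaround you pick, it combines naturally with Spock’s data-driven tests. Here is a small sketch (not from the original code) that exercises both constants through valueOf:

def "should return sign for each enum instance"() {
    expect:
        EnumWithInstanceMethod.valueOf(constantName).sign() == expectedSign

    where:
        constantName | expectedSign
        'PLUS'       | '+'
        'MINUS'      | '-'
}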

Show me the code

Code is available here.
