Multi-module Gradle project with IDE support

This article is a short how-to about multi-module project setup using the Gradle build automation tool.

Here’s how Rich Seller, a StackOverflow user, describes Gradle:

Gradle promises to hit the sweet spot between Ant and Maven. It uses Ivy’s approach for dependency resolution. It allows for convention over configuration but also includes Ant tasks as first class citizens. It also wisely allows you to use existing Maven/Ivy repositories.

So why would one use yet another JVM build tool such as Gradle? The answer is simple: to avoid the frustration caused by Ant or Maven.

Short story

I was fooling around with a fresh proof of concept and needed a build tool. I’m pretty familiar with Maven, so I created a project from an archetype and opened the build file, pom.xml, for further tuning.
By then I had been using Grails, with its own build system (similar to Gradle, btw), for quite some time, so after a long break from Maven I looked at the pom.xml and found it really repulsive.

Once again I felt clearly: XML is not for humans.

After some quick googling I found Gradle. It was still in beta (version 0.8) back then, but it’s configured with a Groovy DSL, and that’s what a human likes :)

Where are we

These days Ant is met only among IT guerrillas, Maven is still on top, and a couple of others such as Ivy fight for the best position, while Gradle has smoothly entered its mature age. It’s now available in version 1.3, released on the 20th of November 2012. I’m glad to recommend it to anyone looking for relief from XML-configured tools, or to anyone just looking for a simple, elastic and powerful build tool.

Let’s build

I have already written about the basic project structure, so I’ll skip the details here and only recall how it looks:

<project root>
│
├── build.gradle
└── src
    ├── main
    │   ├── java
    │   └── groovy
    │
    └── test
        ├── java
        └── groovy

Have I just referenced myself for the first time? Achievement unlocked! ;)

Gradle, like most build tools, is run from the command line with parameters. The main parameter for Gradle is a ‘task name’; for example, we can run the command gradle build.
There is no ‘create project’ task, so the directory structure has to be created by hand. This isn’t a hassle, though.
The java and groovy sub-folders aren’t always mandatory; which of them are needed depends on the compile plugin being used.
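
If you’d rather not create the folders by hand at all, a throwaway helper task can bootstrap the tree. This is just a hedged convenience sketch, not anything built into Gradle; the task name and folder list are my own:

task initDirs << {
    // drop the groovy folders below if only the 'java' plugin will be applied
    ['src/main/java', 'src/main/groovy', 'src/test/java', 'src/test/groovy'].each {
        file(it).mkdirs()
    }
}

Running gradle initDirs once from the project root creates the folders; the task can be deleted afterwards.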

Parent project

Consider an example project ‘the-app’ consisting of three modules, let’s say:

  1. database communication layer
  2. domain model and services layer
  3. web presentation layer

Our project directory tree will look like:

the-app
│
├── dao-layer
│   └── src
│
├── domain-model
│   └── src
│
├── web-frontend
│   └── src
│
├── build.gradle
└── settings.gradle

the-app itself has no src sub-folder, as its only purpose is to contain the sub-projects and the build configuration. If needed, it could be given its own src folder, though.

To glue the modules together, we need to fill the settings.gradle file under the the-app directory with a single line specifying the module names:

include 'dao-layer', 'domain-model', 'web-frontend'

Now the gradle projects command can be executed to obtain the following result:

:projects
------------------------------------------------------------
Root project
------------------------------------------------------------
Root project 'the-app'
+--- Project ':dao-layer'
+--- Project ':domain-model'
\--- Project ':web-frontend'

…so we know that Gradle has noticed the modules. However, the gradle build command won’t run successfully yet because the build.gradle file is still empty.

Sub project

As in Maven, we can create a separate build config file for each module. Let’s say we start with the DAO layer.
So we create a new file, the-app/dao-layer/build.gradle, with a single line of basic build info (notice the new build.gradle is created under the sub-project directory):

apply plugin: 'java'

This single line of config for any of the modules is enough to execute the gradle build command under the-app directory with the following result:

:dao-layer:compileJava
:dao-layer:processResources UP-TO-DATE
:dao-layer:classes
:dao-layer:jar
:dao-layer:assemble
:dao-layer:compileTestJava UP-TO-DATE
:dao-layer:processTestResources UP-TO-DATE
:dao-layer:testClasses UP-TO-DATE
:dao-layer:test
:dao-layer:check
:dao-layer:build

BUILD SUCCESSFUL

Total time: 3.256 secs

To use the Groovy plugin, slightly more configuration is needed:

apply plugin: 'groovy'

repositories {
    mavenLocal()
    mavenCentral()
}

dependencies {
    groovy 'org.codehaus.groovy:groovy-all:2.0.5'
}

In the repositories block the Maven repositories are set, and in the dependencies block the Groovy library version is specified. Of course plugins such as ‘java’, ‘groovy’ and many more can be mixed with each other.
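
For instance, a hedged sketch of mixing plugins in a single module’s build.gradle (the ‘war’ line here is only an illustration and would make sense for a web module):

apply plugin: 'groovy'   // also brings in the 'java' plugin
apply plugin: 'war'      // additionally packages the module as a web archive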

If we have the settings.gradle file and a build.gradle file for each module, there is no need for a parent the-app/build.gradle file at all. That’s true, but we can go another, better way.

One file to rule them all

Instead of creating many build.gradle config files, one per module, we can use only the parent’s one and make it a bit more juicy. So let’s move the-app/dao-layer/build.gradle a level up to the-app/build.gradle and fill it with new statements to achieve the full project configuration:

def langLevel = 1.7

allprojects {

    apply plugin: 'idea'

    group = 'com.tamashumi'
    version = '0.1'
}

subprojects {

    apply plugin: 'groovy'

    sourceCompatibility = langLevel
    targetCompatibility = langLevel

    repositories {
        mavenLocal()
        mavenCentral()
    }

    dependencies {
        groovy 'org.codehaus.groovy:groovy-all:2.0.5'
        testCompile 'org.spockframework:spock-core:0.7-groovy-2.0'
    }
}

project(':dao-layer') {
    dependencies {
        compile 'org.hibernate:hibernate-core:4.1.7.Final'
    }
}

project(':domain-model') {
    dependencies {
        compile project(':dao-layer')
    }
}

project(':web-frontend') {
    apply plugin: 'war'
    dependencies {
        compile project(':domain-model')
        compile 'org.springframework:spring-webmvc:3.1.2.RELEASE'
    }
}

idea {
    project {
        jdkName = langLevel
        languageLevel = langLevel
    }
}

At the beginning a simple variable, langLevel, is declared. It’s worth knowing that we can use almost any Groovy code inside a build.gradle file: if conditions, for/while loops, closures, switch-case statements and so on (see the sketch below). Quite an advantage over inflexible XML, isn’t it?
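
As a hedged illustration (the task names and the project property are made up for this example), plain Groovy loops and conditions can drive the build script directly:

def environments = ['dev', 'prod']

// dynamically create one task per environment: gradle distDev, gradle distProd
environments.each { env ->
    task("dist${env.capitalize()}") << {
        println "Building distribution for ${env}"
    }
}

// react to a project property passed on the command line, e.g. gradle -PverboseBuild
if (project.hasProperty('verboseBuild')) {
    println 'Verbose build requested'
}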

Next comes the allprojects block. Any configuration placed in it will influence – what a surprise – all projects, i.e. the parent itself and the sub-projects (modules). Inside the block the IDE (IntelliJ IDEA) plugin is applied, which I wrote more about in a previous article (look under the “IDE Integration” heading). Suffice it to say that with this plugin applied here, the command gradle idea will generate IDEA’s project files with the module structure and dependencies. This works really well, and plugins for other IDEs are available too.
The remaining two lines in this block define the group and version for the project, similar to how it’s done in Maven.

After that the subprojects block appears. It applies to all modules but not to the parent project. Here the Groovy language plugin is applied, as all modules are assumed to be written in Groovy.
Below that, the source and target language levels are set.
After that come references to the standard Maven repositories.
At the end of the block come the dependencies on the Groovy version and the test library – the Spock framework.

The following blocks, project(':module-name'), are responsible for each module’s configuration. They may be omitted if allprojects or subprojects already configure everything a specific module needs. In the example the per-module configuration goes as follows:

  • The dao-layer module has a dependency on an ORM library – Hibernate.
  • The domain-model module relies on dao-layer as a dependency. The project keyword is used here again as a reference to another module.
  • web-frontend applies the ‘war’ plugin, which builds this module into a Java web archive. Besides that, it refers to the domain-model module and also uses a Spring MVC framework dependency.

At the end, the idea block holds basic info for the IDE plugin. These are parameters corresponding to IDEA’s general project settings, visible on the following screenshot.

jdkName should match the IDE’s SDK name; otherwise it has to be set manually in the IDE after each (re)generation of IDEA’s project files with the gradle idea command.

Is that it?

In the matter of simplicity – yes. That’s enough to automate a modular application build with custom configuration per module. Not rocket science, huh? Think about Maven’s XML: it would take more effort to set up the same thing, and you’d still end up with a less expressive configuration that’s quite far from user-friendly.

Check the online user guide for many more configuration possibilities, or better yet, download Gradle and look at the sample projects.
As a tasty bait, take a look at this short selection of available plugins:

  • java
  • groovy
  • scala
  • cpp
  • eclipse
  • netbeans
  • idea
  • maven
  • osgi
  • war
  • ear
  • sonar
  • project-report
  • signing

and more, plus 3rd-party plugins…

You May Also Like

Log4j and MDC in Grails

Log4j provides a very useful feature: MDC – the mapped diagnostic context. It can be used to store data in the context of the current thread. It may sound a bit scary, but the idea is simple.

My post is based on the post http://burtbeckwith.com/blog/?p=521 from Burt Beckwith's excellent blog; it's definitely worth checking out if you are interested in Grails.

Short background story...


Suppose we want to add logging to our brand new shopping system, and we want each log entry to include the customer's shopping basket number. Our system can be used by many users at once, who can perform many transactions and actions like adding items and so on. How can we achieve that? Of course we can add the basket number in every place where we do some logging, but that would be boring and error-prone.

Instead, we can use MDC to store a variable with the basket number in a map.

In fact, MDC can be treated as a map of custom values for the current thread that can be used by the logger.
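
Stripped of Grails entirely, the raw mechanism is just a couple of static calls on log4j's MDC class. A minimal sketch, with the logger name and values hard-coded for illustration:

import org.apache.log4j.Logger
import org.apache.log4j.MDC

def log = Logger.getLogger('shop.BasketService')

MDC.put("basketNumber", "42")       // stored for the current thread only
log.info("item added to basket")    // a layout containing %X{basketNumber} can now print it
MDC.remove("basketNumber")          // clean up so the value doesn't leak to other work on this thread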


How to do that with Grails?


Using MDC with Grails is quite simple. All we need to do is create our own custom filter which works for the given URLs and puts our data into MDC.

Filters in Grails are classes under the grails-app/conf/ directory whose names end with the *Filters.groovy suffix. We can create this class manually or use the Grails command:
grails create-filters info.rnowak.App.Basket

As a result, a class named BasketFilters will be created in grails-app/conf/info/rnowak/UberApp.

Initially the filter class looks a bit empty:
class BasketFilters {
    def filters = {
        all(controller: '*', action: '*') {
            before = {

            }
            after = { Map model ->

            }
            afterView = { Exception e ->

            }
        }
    }
}
All we need to do is fill in the empty closures, modify the filter properties and put some data into MDC.

all is the general name of our filter, as the BasketFilters class (plural!) can contain many different filters. You can name it whatever you want; for this post let's assume it will be named basketFilter.

Another thing is the choice of filter parameters. According to the official documentation (link) we can customize our filter in many ways. You can specify the controller to be filtered, its actions, the filtered URLs and so on. In our example you can stay with the default option, where the filter is applied to every action of every controller. If you are interested in filtering only some URLs, use the uri parameter with an expression describing the URLs to be filtered.
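
For example, a hedged sketch of the filter restricted to basket-related URLs; the /basket/** pattern is made up for illustration:

class BasketFilters {
    def filters = {
        basketFilter(uri: '/basket/**') {
            before = {
                // only requests matching /basket/** reach this closure
            }
        }
    }
}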

The three closures already defined in the template each have their function and are invoked under these conditions:

  • before - as the name says, it is executed before the filtered action takes place
  • after - similarly, it is called after the action
  • afterView - called after the rendering of the action's view
OK, so now we know what these mysterious methods are and when they are called. But what can be done within them? In the official Grails docs (link again), under section 7.6.3, there is a list of properties that are available for use in a filter.
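
As a small, hedged example of those properties in action (the filter name and message are mine), controllerName and actionName can be used directly inside the closures:

class AuditFilters {
    def filters = {
        auditFilter(controller: '*', action: '*') {
            before = {
                // controllerName and actionName are two of the properties Grails exposes inside filters
                println "About to run ${controllerName}.${actionName} with params ${params}"
            }
        }
    }
}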

With that knowledge, we can proceed to implementing the filter.

Putting something into MDC in filter


What we want to do is quite easy: we want to retrieve the basket number from the parameters and put it into MDC in our filter:
import org.apache.log4j.MDC

class BasketFilters {
    def filters = {
        basketFilter(controller: '*', action: '*') {
            before = {
                MDC.put("basketNumber", params.basketNumber ?: "")
            }
            after = { Map model ->
                MDC.remove("basketNumber")
            }
        }
    }
}

We retrieve the basket number from the Grails params map and then put it into the MDC map under a specified key ("basketNumber" in this case), which will later be used in the logger conversion pattern. It is important to remove the custom value after the action is processed, to avoid leaks.

So we are putting something into MDC. But how do we make use of it in the logs?


We can refer to custom data in MDC in the conversion pattern using the syntax %X{key}, where key is the key we used in the filter to put the data, like:
def conversionPattern = "%d{yyyy-MM-dd HH:mm:ss} %-5p %t [%c{1}] %X{basketNumber} - %m%n"

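For completeness, a hedged sketch of where such a pattern could live in a Grails app, using the log4j DSL in grails-app/conf/Config.groovy (the appender name and log levels are illustrative):

log4j = {
    appenders {
        console name: 'stdout',
                layout: pattern(conversionPattern: "%d{yyyy-MM-dd HH:mm:ss} %-5p %t [%c{1}] %X{basketNumber} - %m%n")
    }
    root {
        info 'stdout'
    }
}
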

And that's it :) We've put custom data into log4j's MDC and successfully used it in the logs to display interesting values.

Inconsistent Dependency Injection to domains with Grails

I've encountered strange behavior with a domain class in my project: services that should have been injected were null. I became suspicious: why is that? Services are injected properly in other domain classes, so why is this one different?

Constructors experiment

I've set up an experiment: an empty LibraryService that should be injected, and a Book domain class like this:

class Book {
    def libraryService

    String author
    String title
    int pageCount

    Book() {
        println("Finished constructor Book()")
    }

    Book(String author) {
        this()
        this.@author = author
        println("Finished constructor Book(String author)")
    }

    Book(String author, String title) {
        super()
        this.@author = author
        this.@title = title
        println("Finished constructor Book(String author, String title)")
    }

    Book(String author, String title, int pageCount) {
        this.@author = author
        this.@title = title
        this.@pageCount = pageCount
        println("Finished constructor Book(String author, String title, int pageCount)")
    }

    void logInjectedService() {
        println("  Service libraryService is injected? -> $libraryService")
    }
}

class LibraryService {
    def serviceMethod() {
    }
}

Book has 4 explicit constructors. I want to check which constructors inject dependencies. This is my method that constructs Book objects; I called it from a controller:

class BookController {
    def index() {
        constructAndExamineBooks()
    }

    static constructAndExamineBooks() {
        println("Started constructAndExamineBooks")
        Book book1 = new Book().logInjectedService()
        Book book2 = new Book("foo").logInjectedService()
        Book book3 = new Book("foo", 'bar').logInjectedService()
        Book book4 = new Book("foo", 'bar', 100).logInjectedService()
        Book book5 = new Book(author: "foo", title: 'bar')
        println("Finished constructor Book(Map params)")
        book5.logInjectedService()
    }
}

Analysis

Output looks like this:

Started constructAndExamineBooks
Finished constructor Book()
Service libraryService is injected? -> eu.spoonman.refaktor.LibraryService@2affcce2
Finished constructor Book()
Finished constructor Book(String author)
Service libraryService is injected? -> eu.spoonman.refaktor.LibraryService@2affcce2
Finished constructor Book(String author, String title)
Service libraryService is injected? -> null
Finished constructor Book(String author, String title, int pageCount)
Service libraryService is injected? -> null
Finished constructor Book()
Finished constructor Book(Map params)
Service libraryService is injected? -> eu.spoonman.refaktor.LibraryService@2affcce2

What do we see?

  1. The empty constructor injects dependencies.
  2. A constructor that explicitly invokes the empty constructor injects dependencies.
  3. A constructor that explicitly invokes the parent's constructor does not inject dependencies.
  4. A constructor without any explicit call does not invoke the empty constructor, thus it does not inject dependencies.
  5. The constructor provided by Grails with a map as a parameter invokes the empty constructor and injects dependencies.

Conclusion

Always explicitly invoke the empty constructor in your Grails domain classes to ensure dependency injection! I didn't know that until today either!
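
To make the takeaway concrete, here is a minimal sketch (a trimmed-down Book class) of a constructor written the safe way:

class Book {
    def libraryService

    String author
    String title

    Book() { }

    Book(String author, String title) {
        this()                 // explicit call to the empty constructor, so injection still happens
        this.@author = author
        this.@title = title
    }
}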

Atom Feeds with Spring MVC

How to add feeds (Atom) to your web application with just two classes?
How about Spring MVC?

Here are my assumptions:
  • you are using Spring framework
  • you have some entity, say “News”, that you want to publish in your feeds
  • your "News" entity has creationDate, title, and shortDescription
  • you have some repository/dao, say "NewsRepository", that will return the news from your database
  • you want to write as little as possible
  • you don't want to format Atom (xml) by hand
You actually do NOT need to use Spring MVC in your application already. If you do, skip to step 3.


Step 1: add Spring MVC dependency to your application
With maven that will be:
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-webmvc</artifactId>
    <version>3.1.0.RELEASE</version>
</dependency>

Step 2: add Spring MVC DispatcherServlet
With web.xml that would be:
<servlet>
    <servlet-name>dispatcher</servlet-name>
    <servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class>
    <init-param>
        <param-name>contextConfigLocation</param-name>
        <param-value>classpath:spring-mvc.xml</param-value>
    </init-param>
    <load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
    <servlet-name>dispatcher</servlet-name>
    <url-pattern>/feed</url-pattern>
</servlet-mapping>
Notice that I set the url-pattern to “/feed”, which means I don't want Spring MVC to handle any other URLs in my app (I'm using a different web framework for the rest of the app). I also give it a brand new contextConfigLocation, where only the MVC configuration is kept.

Remember that when you add a DispatcherServlet to an app that already has Spring (from a ContextLoaderListener, for example), your context inherits from the global one, so you should not create beans that already exist there, or include XML that defines them. Watch out for the Spring context starting up twice, and refer to the Spring or Servlet documentation to understand what's happening.

Step 3. add ROME – a library to handle Atom format
With maven that is:
<dependency>
    <groupId>net.java.dev.rome</groupId>
    <artifactId>rome</artifactId>
    <version>1.0.0</version>
</dependency>

Step 4. write your very simple controller
@Controller
public class FeedController {
    static final String LAST_UPDATE_VIEW_KEY = "lastUpdate";
    static final String NEWS_VIEW_KEY = "news";
    private NewsRepository newsRepository;
    private String viewName;

    protected FeedController() {} //required by cglib

    public FeedController(NewsRepository newsRepository, String viewName) {
        notNull(newsRepository); hasText(viewName);
        this.newsRepository = newsRepository;
        this.viewName = viewName;
    }

    @RequestMapping(value = "/feed", method = RequestMethod.GET)        
    @Transactional
    public ModelAndView feed() {
        ModelAndView modelAndView = new ModelAndView();
        modelAndView.setViewName(viewName);
        List<News> news = newsRepository.fetchPublished();
        modelAndView.addObject(NEWS_VIEW_KEY, news);
        modelAndView.addObject(LAST_UPDATE_VIEW_KEY, getCreationDateOfTheLast(news));
        return modelAndView;
    }

    private Date getCreationDateOfTheLast(List<News> news) {
        if(news.size() > 0) {
            return news.get(0).getCreationDate();
        }
        return new Date(0);
    }
}
And here's a test for it, in case you want to copy&paste (who doesn't?):
@RunWith(MockitoJUnitRunner.class)
public class FeedControllerShould {
    @Mock private NewsRepository newsRepository;
    private Date FORMER_ENTRY_CREATION_DATE = new Date(1);
    private Date LATTER_ENTRY_CREATION_DATE = new Date(2);
    private ArrayList<News> newsList;
    private FeedController feedController;

    @Before
    public void prepareNewsList() {
        News news1 = new News().title("title1").creationDate(FORMER_ENTRY_CREATION_DATE);
        News news2 = new News().title("title2").creationDate(LATTER_ENTRY_CREATION_DATE);
        newsList = newArrayList(news2, news1);
    }

    @Before
    public void prepareFeedController() {
        feedController = new FeedController(newsRepository, "viewName");
    }

    @Test
    public void returnViewWithNews() {
        //given
        given(newsRepository.fetchPublished()).willReturn(newsList);
        
        //when
        ModelAndView modelAndView = feedController.feed();
        
        //then
        assertThat(modelAndView.getModel())
                .includes(entry(FeedController.NEWS_VIEW_KEY, newsList));
    }

    @Test
    public void returnViewWithLastUpdateTime() {
        //given
        given(newsRepository.fetchPublished()).willReturn(newsList);

        //when
        ModelAndView modelAndView = feedController.feed();

        //then
        assertThat(modelAndView.getModel())
                .includes(entry(FeedController.LAST_UPDATE_VIEW_KEY, LATTER_ENTRY_CREATION_DATE));
    }

    @Test
    public void returnTheBeginningOfTimeAsLastUpdateInViewWhenListIsEmpty() {
        //given
        given(newsRepository.fetchPublished()).willReturn(new ArrayList<News>());

        //when
        ModelAndView modelAndView = feedController.feed();

        //then
        assertThat(modelAndView.getModel())
                .includes(entry(FeedController.LAST_UPDATE_VIEW_KEY, new Date(0)));
    }
}
Notice: here, I'm using fest-assert and mockito. The dependencies are:
<dependency>
 <groupId>org.easytesting</groupId>
 <artifactId>fest-assert</artifactId>
 <version>1.4</version>
 <scope>test</scope>
</dependency>
<dependency>
 <groupId>org.mockito</groupId>
 <artifactId>mockito-all</artifactId>
 <version>1.8.5</version>
 <scope>test</scope>
</dependency>

Step 5. write your very simple view
Here's where all the magic formatting happens. Be sure to take a look at all the methods of the Entry class, as there is quite a lot you may want to use/fill.
import org.springframework.web.servlet.view.feed.AbstractAtomFeedView;
[...]

public class AtomFeedView extends AbstractAtomFeedView {
    private String feedId = "tag:yourFantastiSiteName";
    private String title = "yourFantastiSiteName: news";
    private String newsAbsoluteUrl = "http://yourfanstasticsiteUrl.com/news/"; 

    @Override
    protected void buildFeedMetadata(Map<String, Object> model, Feed feed, HttpServletRequest request) {
        feed.setId(feedId);
        feed.setTitle(title);
        setUpdatedIfNeeded(model, feed);
    }

    private void setUpdatedIfNeeded(Map<String, Object> model, Feed feed) {
        @SuppressWarnings("unchecked")
        Date lastUpdate = (Date)model.get(FeedController.LAST_UPDATE_VIEW_KEY);
        // update the feed timestamp only when we have a newer date than the current one
        if (lastUpdate != null && (feed.getUpdated() == null || lastUpdate.compareTo(feed.getUpdated()) > 0)) {
            feed.setUpdated(lastUpdate);
        }
    }

    @Override
    protected List<Entry> buildFeedEntries(Map<String, Object> model, HttpServletRequest request, HttpServletResponse response) throws Exception {
        @SuppressWarnings("unchecked")
        List<News> newsList = (List<News>)model.get(FeedController.NEWS_VIEW_KEY);
        List<Entry> entries = new ArrayList<Entry>();
        for (News news : newsList) {
            addEntry(entries, news);
        }
        return entries;
    }

    private void addEntry(List<Entry> entries, News news) {
        Entry entry = new Entry();
        entry.setId(feedId + ", " + news.getId());
        entry.setTitle(news.getTitle());
        entry.setUpdated(news.getCreationDate());
        entry = setSummary(news, entry);
        entry = setLink(news, entry);
        entries.add(entry);
    }

    private Entry setSummary(News news, Entry entry) {
        Content summary = new Content();
        summary.setValue(news.getShortDescription());
        entry.setSummary(summary);
        return entry;
    }

    private Entry setLink(News news, Entry entry) {
        Link link = new Link();
        link.setType("text/html");
        link.setHref(newsAbsoluteUrl + news.getId()); //because I have a different controller to show news at http://yourfanstasticsiteUrl.com/news/ID
        entry.setAlternateLinks(newArrayList(link));
        return entry;
    }

}

Step 6. add your classes to your Spring context
I'm using the XML approach, because I'm old and I love XML. No, seriously, I use XML because I may want to declare the FeedController a few times with different views (RSS 1.0, RSS 2.0, etc.).

So this is the aforementioned spring-mvc.xml:

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">

    <bean class="org.springframework.web.servlet.view.ContentNegotiatingViewResolver">
        <property name="mediaTypes">
            <map>
                <entry key="atom" value="application/atom+xml"/>
                <entry key="html" value="text/html"/>
            </map>
        </property>
        <property name="viewResolvers">
            <list>
                <bean class="org.springframework.web.servlet.view.BeanNameViewResolver"/>
            </list>
        </property>
    </bean>

    <bean class="eu.margiel.pages.confitura.feed.FeedController">
        <constructor-arg index="0" ref="newsRepository"/>
        <constructor-arg index="1" value="atomFeedView"/>
    </bean>

    <bean id="atomFeedView" class="eu.margiel.pages.confitura.feed.AtomFeedView"/>
</beans>

And you are done.

I've been asked a few times before to put all the working code in some public repo, so this time it's the other way around: I've described things that I had already published, and you can grab the commit from Bitbucket.

Hope that helps.