In Relation To Gunnar Morling

Updating Hibernate ORM in WildFly

Posted by Gunnar Morling | Tagged as Hibernate ORM

In this post I’ll show you how easy it is to use the latest and greatest version of Hibernate ORM with WildFly 10.

Traditionally, updating Hibernate in WildFly required some good knowledge of the server’s module system and the structure of the ORM modules. It certainly was doable, but it involved search/replace in existing module descriptors and generally wasn’t very convenient.

This has become much simpler with last week’s release of Hibernate ORM 5.2.1!

We now provide a ZIP archive containing all the required modules, making it a breeze to add the latest version of Hibernate to an existing WildFly instance. And what’s best: the version of Hibernate packaged with the application server remains untouched; switching between this and the new version is just a matter of setting one small configuration option, and you can go back at any time.


The ZIP file is available in Maven Central, so you can automate its download as part of your build if needed. These are the GAV coordinates:

  • groupId: org.hibernate

  • artifactId: hibernate-orm-modules

  • version: 5.2.1.Final

  • classifier: wildfly-10-dist

  • type: zip

Unzip the archive into the modules directory of your WildFly instance. If done correctly, you should see two sub-directories under modules: system (the server’s original modules) and org (the new Hibernate ORM modules).

Choosing the right version of Hibernate ORM

Having added the Hibernate ORM 5.2 modules to the server, you need to configure your application so it uses that specific Hibernate version instead of the default one coming with the server. To do so, just add the following property to your META-INF/persistence.xml file:

    <property name="jboss.as.jpa.providerModule" value="org.hibernate:5.2" />

In case you have several Hibernate releases added to your WildFly server, you can also pin a specific micro version:

    <property name="jboss.as.jpa.providerModule" value="org.hibernate:5.2.1.Final" />

Example project

As an example for using the module ZIP, I’ve created a small Maven project. You can find it in the hibernate-demos repository. To run the example project, simply execute mvn clean verify. Let’s take a quick look at some interesting parts.

First, in the Maven POM file, the maven-dependency-plugin is used to

  • download the WildFly server and unpack it into the target directory

  • download the Hibernate ORM module ZIP and extract its contents into the modules directory of WildFly:

                    <!-- WildFly server; Unpacked into target/wildfly-10.0.0.Final -->
                    <!-- Hibernate ORM modules; Unpacked into target/wildfly-10.0.0.Final/modules -->
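The POM isn't reproduced in full here; a minimal sketch of the second execution (the phase and execution id are assumptions of mine, the GAV coordinates are the ones given above) could look like this:

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>unpack-hibernate-orm-modules</id>
            <phase>pre-integration-test</phase>
            <goals>
                <goal>unpack</goal>
            </goals>
            <configuration>
                <artifactItems>
                    <artifactItem>
                        <groupId>org.hibernate</groupId>
                        <artifactId>hibernate-orm-modules</artifactId>
                        <version>5.2.1.Final</version>
                        <classifier>wildfly-10-dist</classifier>
                        <type>zip</type>
                        <!-- unpack straight into the server's modules directory -->
                        <outputDirectory>${project.build.directory}/wildfly-10.0.0.Final/modules</outputDirectory>
                    </artifactItem>
                </artifactItems>
            </configuration>
        </execution>
    </executions>
</plugin>
```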

Next, Hibernate ORM 5.2 must be enabled for the application using the property as discussed above:

<?xml version="1.0" encoding="UTF-8"?>
<persistence xmlns="http://xmlns.jcp.org/xml/ns/persistence" version="2.1">
    <persistence-unit name="testPu" transaction-type="JTA">
        <properties>
            <property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
            <property name="jboss.as.jpa.providerModule" value="org.hibernate:5.2"/>
        </properties>
    </persistence-unit>
</persistence>

This persistence unit uses the example datasource configured by default in WildFly, which is backed by an in-memory H2 database.

Finally, we need a test to ensure that Hibernate ORM 5.2 is actually used, not the 5.0 version coming with WildFly. To do so, we can simply invoke one of the EntityManager methods which are exposed through Hibernate’s classic Session API as of 5.2:
@RunWith( Arquillian.class )
public class HibernateModulesOnWildflyIT {

    @Deployment
    public static WebArchive createDeployment() {
        return ShrinkWrap.create( WebArchive.class )
                .addClass( Kryptonite.class )
                .addAsWebInfResource( EmptyAsset.INSTANCE, "beans.xml" )
                .addAsResource( "META-INF/persistence.xml" );
    }

    @PersistenceContext
    private EntityManager entityManager;

    @Test
    public void shouldUseHibernateOrm52() {
        Session session = entityManager.unwrap( Session.class );

        Kryptonite kryptonite1 = new Kryptonite();
        kryptonite1.id = 1L;
        kryptonite1.description = "Some Kryptonite";
        session.persist( kryptonite1 );

        // EntityManager methods exposed through Session only as of 5.2
        Kryptonite loaded = session.find( Kryptonite.class, 1L );

        assertThat( loaded.description, equalTo( "Some Kryptonite" ) );
    }
}

The call to find() would fail with a NoSuchMethodError if Hibernate ORM 5.0 instead of 5.2 was used. To try it out, simply remove the property from persistence.xml.

The test itself is executed using the great Arquillian tool. It creates a test WAR file with the contents configured in the createDeployment() method, starts up the WildFly server, deploys the WAR file and runs the test within the running container. Using the Arquillian Transactional Extension, the test method is executed within a transaction which is rolled back afterwards.

Feedback welcome!

The Hibernate ORM module ZIP file is a very recent addition to Hibernate. Should you run into any issues, please let us know and we’ll be happy to help.

You can learn more in this topical guide. Hibernate ORM 5.2.1 is the first release to provide a module ZIP; the next 5.1.x release will provide one, too.

Give it a try and let us know about your feedback!

Bean Validation and the Jigsaw Liaison

Posted by Gunnar Morling | Tagged as Discussions

Unless you’ve been living under a rock for the last months and years, you’ve probably heard about the efforts for adding a module system to the Java platform, code-named "Project Jigsaw".

Defining a module system and modularizing a huge system like the JDK is by no means a trivial task, so it’s no surprise that the advent of Jigsaw has been delayed several times. But I think by now it’s a rather safe bet to expect Jigsaw to be released as part of JDK 9 eventually (the exact release date remains to be defined), especially since it became part of the early access builds a while ago.

This means that if you are an author of a library or framework, you should grab the latest JDK preview build and make sure your lib can be used a) on Java 9 and b) within modularized applications using Jigsaw.

The latter is what we are going to discuss in more detail in the following, taking Bean Validation and its reference implementation, Hibernate Validator as an example. We’ll see what is needed to convert them into Jigsaw modules and use them in a modularized environment.

Now one might ask why a module system, and libraries provided as modules on top of it, are a good thing in the first place. There are many facets to that, but I think a good answer is that modularization is a great tool for building software systems from encapsulated, loosely-coupled and re-usable components with clearly defined interfaces. It makes API design a very conscious decision and, on the other hand, gives library authors the freedom to change internal implementation aspects of their module without risking compatibility issues with clients.

If you are not yet familiar with Jigsaw at all, it’s recommended to take a look at the project home page. It contains links to many useful resources, especially check out "The State of the Module System" which gives a great overview. I also found this two-part introduction very helpful.

Getting started

In order to follow our little experiment of "Jigsaw-ifying" Bean Validation, you should be using a Bash-compatible shell and be able to run commands such as wget. On systems lacking support for Bash by default, Cygwin can be used. You also need git to download some source code from GitHub.

Let’s get started by downloading and installing the latest JDK 9 early access build (build 122 has been used when writing this post). Then run java -version to confirm that JDK 9 is enabled. You should see an output like this:

java version "9-ea"
Java(TM) SE Runtime Environment (build 9-ea+122)
Java HotSpot(TM) 64-Bit Server VM (build 9-ea+122, mixed mode)

After that, create a base directory for our experiments:

mkdir beanvalidation-with-jigsaw

Change into that directory and create some more sub-directories for storing the required modules and 3rd-party libraries:

cd beanvalidation-with-jigsaw

mkdir sources
mkdir modules
mkdir automatic-modules
mkdir tools

As tooling support for Java 9 / Jigsaw still is rather limited at this point, we are going to use plain javac and java commands to compile and test the code. Although that’s not as bad as it sounds and it’s indeed a nice exercise to learn about the existing and new compiler options, I admit I’m looking forward to the point where the known build tools such as Maven will fully support Jigsaw and allow compiling and testing modularized source code. But for now, the plain CLI tools will do the trick :)

Download the source code for Bean Validation and Hibernate Validator from GitHub:

git clone sources/beanvalidation-api
git clone sources/hibernate-validator

As we cannot leverage Maven’s dependency management, we fetch the dependencies required by Hibernate Validator via wget, storing them in the automatic-modules (dependencies) and tools (the JBoss Logging annotation processor needed for generating logger implementations) directory, respectively:

wget -P automatic-modules
wget -P automatic-modules
wget -P automatic-modules
wget -P automatic-modules
wget -P automatic-modules
wget -P automatic-modules
wget -P automatic-modules
wget -P automatic-modules

wget -P tools
wget -P tools
wget -P tools

Automatic modules are Jigsaw’s way of dealing with libraries in a modularized environment which have not yet been modularized themselves. Essentially, an automatic module is a module which exports all its packages and reads all other named modules.

Its module name is derived from the JAR file name, applying some rules for splitting artifact name and version and replacing hyphens with dots. So e.g. jboss-logging-annotations-2.0.1.Final.jar will have the automatic module name jboss.logging.annotations.
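As a rough illustration of these naming rules, the following sketch derives the automatic module name from a JAR file name. The class and method names are my own, and the real algorithm in the JDK handles more corner cases:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Simplified sketch of the automatic-module-name derivation described above.
public class AutomaticModuleNames {

    public static String deriveName(String jarFileName) {
        // drop the ".jar" extension
        String name = jarFileName.replaceAll("\\.jar$", "");
        // drop the version: everything from the first hyphen followed by a digit
        Matcher m = Pattern.compile("-\\d").matcher(name);
        if (m.find()) {
            name = name.substring(0, m.start());
        }
        // replace remaining non-alphanumeric characters with dots
        return name.replaceAll("[^A-Za-z0-9]+", ".");
    }

    public static void main(String[] args) {
        // e.g. "jboss-logging-annotations-2.0.1.Final.jar" -> "jboss.logging.annotations"
        System.out.println(deriveName("jboss-logging-annotations-2.0.1.Final.jar"));
    }
}
```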

Creating modules for Bean Validation API and Hibernate Validator

Currently, the Bean Validation API and Hibernate Validator are in a state where they can be compiled out of the box using Java 9. But what’s still missing are the required module descriptors which describe a module’s name, its public API, its dependencies to other modules and some other things.

Module descriptors are Java files named module-info.java and live in the root of a given module. Create the descriptor for the Bean Validation API with the following contents:

module javax.validation {(1)
    exports javax.validation;(2)
    exports javax.validation.bootstrap;
    exports javax.validation.constraints;
    exports javax.validation.constraintvalidation;
    exports javax.validation.executable;
    exports javax.validation.groups;
    exports javax.validation.metadata;
    exports javax.validation.spi;

    uses javax.validation.spi.ValidationProvider;(3)
}
1 Module name
2 All the packages the module exports (as this is an API module, all contained packages are exported)
3 The usage of the ValidationProvider service


Services have been present in Java for a long time. Originally added as an internal component in the JDK, the service loader mechanism became an official part of the platform as of Java 6.

Since then it has seen wide adoption for building extensible applications from loosely coupled components. With its help, service consumers can solely be implemented against a well-defined service contract, without knowing upfront about a specific service provider and its implementation. Jigsaw embraces the existing service concept and makes services first-class citizens of the modularized world.

Luckily, Bean Validation has been using the service mechanism for locating providers (such as Hibernate Validator) from the get go, so things play out nicely with Jigsaw. As we’ll see in a minute, Hibernate Validator provides an implementation of the ValidationProvider service, allowing the user to bootstrap it without depending on this specific implementation.
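The consumer side of the service mechanism can be sketched in isolation. GreetingProvider is a made-up interface for this illustration; with no provider registered (via META-INF/services or a `provides … with …` clause), the lookup simply yields nothing:

```java
import java.util.ServiceLoader;

// Hypothetical service contract, standing in for ValidationProvider.
interface GreetingProvider {
    String greet();
}

public class ServiceLoaderDemo {

    public static void main(String[] args) {
        // The consumer only knows the contract; concrete providers are
        // discovered at runtime via their registration.
        ServiceLoader<GreetingProvider> loader = ServiceLoader.load(GreetingProvider.class);

        int found = 0;
        for (GreetingProvider provider : loader) {
            found++;
        }
        // No provider is registered in this self-contained sketch.
        System.out.println("providers found: " + found);
    }
}
```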

But for now let’s compile the Bean Validation module:

export BASE=`pwd`
cd sources/beanvalidation-api
javac -d $BASE/modules/javax.validation $(find src/main/java -name "*.java")
cd $BASE

After compilation, the built module can be found under modules/javax.validation. Note that modules usually will be packaged and redistributed as JAR files, but to keep things simple let’s just work with class directory structures here.

Things get a bit more interesting when it comes to Hibernate Validator. Its module descriptor should look like this:

module org.hibernate.validator.engine {(1)
    exports org.hibernate.validator;(2)
    exports org.hibernate.validator.cfg;
    exports org.hibernate.validator.cfg.context;
    exports org.hibernate.validator.cfg.defs;
    exports org.hibernate.validator.constraints;
    exports org.hibernate.validator.constraintvalidation;
    exports org.hibernate.validator.constraintvalidators;
    exports org.hibernate.validator.engine;
    exports org.hibernate.validator.messageinterpolation;
    exports org.hibernate.validator.parameternameprovider;
    exports org.hibernate.validator.path;
    exports org.hibernate.validator.resourceloading;
    exports org.hibernate.validator.spi.cfg;
    exports org.hibernate.validator.spi.resourceloading;
    exports org.hibernate.validator.spi.time;
    exports org.hibernate.validator.spi.valuehandling;
    exports org.hibernate.validator.valuehandling;

    exports org.hibernate.validator.internal.util.logging to jboss.logging;(3)
    exports org.hibernate.validator.internal.xml to java.xml.bind;

    requires javax.validation;(4)
    requires joda.time;
    requires javax.el.api;
    requires jsoup;
    requires jboss.logging.annotations;
    requires jboss.logging;
    requires classmate;
    requires paranamer;
    requires hibernate.jpa;
    requires java.xml.bind;
    requires java.xml;
    requires java.scripting;
    requires javafx.base;

    provides javax.validation.spi.ValidationProvider with
        org.hibernate.validator.HibernateValidator;(5)

    uses javax.validation.ConstraintValidator;(6)
}
1 The module name
2 All the packages the module exports; Hibernate Validator always had a very well defined public API, with all the code parts not meant for public usage living in an internal package. Naturally, only the non-internal parts are exported. Things will be more complex when modularizing an existing component without such clearly defined public API. Likely you’ll need to move some classes around first, untangling public API and internal implementation parts.
3 Two noteworthy exceptions are o.h.v.internal.util.logging and o.h.v.internal.xml which are exported via a so-called "qualified exports". This means that only the jboss.logging module may access the logging package and only java.xml.bind may access the XML package. This is needed as these modules require reflective access to the logging and XML classes, respectively. Using qualified exports, this exposure of internal classes can be limited to the smallest degree possible.
4 All the modules which this module requires. These are the javax.validation module we just built, all the automatic modules we downloaded before and some modules coming with the JDK itself (java.xml.bind, javafx.base etc).
Some of these dependencies might be considered optional at runtime, e.g. Joda Time would only be needed at runtime when actually validating Joda Time types with @Past or @Future. Unfortunately - and in contrast to OSGi or JBoss Modules - Jigsaw doesn’t support the notion of optional module requirements, meaning that all module requirements must be satisfied at compile time as well as runtime. That’s a pity, as it prevents a common pattern for libraries which expose certain functionality depending on what dependencies/classes are available at runtime or not.
The "right answer" with Jigsaw would be to extract these optional features into their own modules (e.g. hibernate.validator.joda.time, hibernate.validator.jsoup etc.) but this comes at the price of making things more complex for users which then need to deal with all these modules.
5 The module provides an implementation of the ValidationProvider service
6 The module uses the ConstraintValidator service, see below

With the module descriptor in place, we can compile the Jigsaw-enabled Hibernate Validator module:

cd sources/hibernate-validator/engine
mkdir -p target/generated-sources/jaxb

xjc -enableIntrospection -p org.hibernate.validator.internal.xml \(1)
    -extension \
    -target 2.1 \
    -d target/generated-sources/jaxb \
    src/main/xsd/validation-configuration-1.1.xsd src/main/xsd/validation-mapping-1.1.xsd \
    -b src/main/xjb/binding-customization.xjb

javac -addmods java.xml.bind,java.annotations.common \(2)
    -g \
    -modulepath $BASE/modules:$BASE/automatic-modules \
    -processorpath $BASE/tools/jboss-logging-processor-2.0.1.Final.jar:$BASE/tools/jdeparser-2.0.0.Final.jar:$BASE/tools/jsr250-api-1.0.jar:$BASE/automatic-modules/jboss-logging-annotations-2.0.1.Final.jar:$BASE/automatic-modules/jboss-logging-3.3.0.Final.jar \
    -d $BASE/modules/org.hibernate.validator.engine \
    $(find src/main/java -name "*.java") $(find target/generated-sources/jaxb -name "*.java")

cp -r src/main/resources/* $BASE/modules/org.hibernate.validator.engine;(3)

cp -r src/main/xsd/* $BASE/modules/org.hibernate.validator.engine/META-INF;(4)
cd $BASE
1 The xjc utility is used to create some JAXB types from the XML constraint descriptor schemas
2 Compile the source code via javac
3 Copy error message resource bundle into the module directory
4 Copy XML schema files into the module directory

Note how the module path used for compilation refers to the modules directory (containing the javax.validation module) and the automatic-modules directory (containing all the dependencies such as Joda Time etc.).

The resulting module is located under modules/org.hibernate.validator.engine.

Giving it a test ride

Having converted Bean Validation API and Hibernate Validator into proper Jigsaw modules, it’s about time to give these modules a test ride. Create a new compilation unit for that:

mkdir -p sources/com.example.acme/src/main/java/com/example/acme

Within that directory structure, create a very simple domain class and a class with a main method for validating it:

package com.example.acme;

import java.util.List;

import javax.validation.constraints.Min;

public class Car {

    public int seatCount;

    public List<String> passengers;

    public Car(int seatCount, List<String> passengers) {
        this.seatCount = seatCount;
        this.passengers = passengers;
    }
}

package com.example.acme;

import java.util.Collections;
import java.util.Set;

import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;

public class ValidationTest {

    public static void main(String... args) {
        Validator validator = Validation.buildDefaultValidatorFactory()
            .getValidator();

        Set<ConstraintViolation<Car>> violations = validator.validate( new Car( 0, Collections.emptyList() ) );

        System.out.println( "Validation error: " + violations.iterator().next().getMessage() );
    }
}

This obtains a Validator object via Validation#buildDefaultValidatorFactory() (which internally uses the service mechanism described above) and performs a simple validation of a Car object.

Of course we need a module-info.java, too:

module com.example.acme {
    exports com.example.acme;

    requires javax.validation;
}

That should look familiar by now: we just export the single package (so Hibernate Validator can access the state of the Car object) and depend on the Bean Validation API module.

Also compilation of this module isn’t much news:

cd sources/com.example.acme
javac \
    -g \
    -modulepath $BASE/modules:$BASE/automatic-modules \
    -d $BASE/modules/com.example.acme $(find src/main/java -name "*.java")
cd $BASE

And with that, we finally can run a first test of Bean Validation under Jigsaw:

java \
    -modulepath modules:automatic-modules \
    -m com.example.acme/com.example.acme.ValidationTest

Similar to javac, there is a new modulepath option for the java command, pointing to one or more directories with Jigsaw modules. The -m switch specifies the main class to run by giving its module name and fully qualified class name.

Mh, that was not really successful:

HV000149: An exception occurred during message interpolation
Caused by: java.lang.UnsupportedOperationException: ResourceBundle.Control not supported in named modules
    at java.util.ResourceBundle.checkNamedModule(java.base@9-ea/
    at java.util.ResourceBundle.getBundle(java.base@9-ea/
    at org.hibernate.validator.resourceloading.PlatformResourceBundleLocator.loadBundle(org.hibernate.validator.engine/

What’s that about? Hibernate Validator is using the Control class in order to merge the contents (error messages) of several resource bundles with the same name found on the classpath. This is not supported in the modularized environment any longer, hence the exception above is raised. Eventually, Hibernate Validator should handle this situation automatically (this is tracked under HV-1073).

For now let’s hack around it and disable the troublesome bundle aggregation in Hibernate Validator’s AbstractMessageInterpolator. To do so, change true to false in the constructor invocation on line 165:

new PlatformResourceBundleLocator(

Re-compile the Hibernate Validator module. After running the test again, you should now see the following output on the console:

Validation error: must be greater than or equal to 1

Tada, the first successful bean validation in the Jigsaw environment :)

Let me quickly recap what has happened so far:

  • We added a module descriptor to the Bean Validation API, making it a proper Jigsaw module

  • We added a module descriptor to Hibernate Validator; this Bean Validation provider will be discovered by the API module using the service mechanism

  • We created a test module with a main method, which uses the Bean Validation API to perform a simple object validation

(Not) overstepping boundaries

Now let’s be nasty and see whether the module system actually is doing its job as expected. For that, add the module requirement to Hibernate Validator to the module descriptor (so it’s considered for compilation at all) and cast the validator to the internal implementation type in ValidationTest:

module com.example.acme {
    exports com.example.acme;

    requires javax.validation;
    requires org.hibernate.validator.engine;
}

package com.example.acme;

import javax.validation.Validation;

import org.hibernate.validator.internal.engine.ValidatorImpl;

public class ValidationTest {

    public static void main(String... args) throws Exception {
        ValidatorImpl validator = (ValidatorImpl) Validation.buildDefaultValidatorFactory()
            .getValidator();
    }
}

Running javac again, you should now get a compilation error, complaining about the type not being found. So Jigsaw prevents accesses to non-exported types. If you like, try referencing anything from the packages exported by Hibernate Validator, which will work.

That’s a great advantage over the traditional flat classpath, where you might have organized your code base into public and internal parts but then had to hope for users of your library not to step across the line and - accidentally or intentionally - access internal classes.

Custom constraints

With the modules basically working, it’s time to get a bit more advanced and create a custom Bean Validation constraint. This one should make sure that a car does not have more passengers than seats available.

For that we need an annotation type:

package com.example.acme;

import static java.lang.annotation.ElementType.TYPE;
import static java.lang.annotation.RetentionPolicy.RUNTIME;

import java.lang.annotation.Documented;
import java.lang.annotation.Retention;
import java.lang.annotation.Target;

import javax.validation.Constraint;
import javax.validation.Payload;

@Constraint(validatedBy = { PassengersDontExceedSeatCountValidator.class })
@Target({ TYPE })
@Retention(RUNTIME)
@Documented
public @interface PassengersDontExceedSeatCount {

    String message() default "{com.example.acme.PassengersDontExceedSeatCount.message}";
    Class<?>[] groups() default { };
    Class<? extends Payload>[] payload() default { };
}

And also a constraint validator implementation:

package com.example.acme;

import javax.validation.ConstraintValidator;
import javax.validation.ConstraintValidatorContext;

import com.example.acme.PassengersDontExceedSeatCount;

public class PassengersDontExceedSeatCountValidator implements
        ConstraintValidator<PassengersDontExceedSeatCount, Car> {

    @Override
    public void initialize(PassengersDontExceedSeatCount constraintAnnotation) {}

    @Override
    public boolean isValid(Car car, ConstraintValidatorContext constraintValidatorContext) {
        if ( car == null ) {
            return true;
        }

        return car.passengers == null || car.passengers.size() <= car.seatCount;
    }
}

A resource bundle with the error message for the constraint is needed, too:

com.example.acme.PassengersDontExceedSeatCount.message=Passenger count must not exceed seat count

Now we can apply the new constraint type to the Car class and finally validate it:

@PassengersDontExceedSeatCount
public class Car {
    // ...
}

package com.example.acme;

import java.util.Arrays;
import java.util.Set;

import javax.validation.Validation;
import javax.validation.Validator;
import javax.validation.ConstraintViolation;

public class ValidationTest {

    public static void main(String... args) throws Exception {
        Validator validator = Validation.buildDefaultValidatorFactory()
            .getValidator();

        Set<ConstraintViolation<Car>> violations = validator.validate(
            new Car( 2, Arrays.asList( "Anna", "Bob", "Alice" ) )
        );

        System.out.println( "Validation error: " + violations.iterator().next().getMessage() );
    }
}

Compile the example module again; don’t forget to copy the resource bundle to the module directory:

cd sources/com.example.acme
javac \
    -g \
    -modulepath $BASE/modules:$BASE/automatic-modules \
    -d $BASE/modules/com.example.acme $(find src/main/java -name "*.java")
cp -r src/main/resources/* $BASE/modules/com.example.acme
cd $BASE

Run it as before, and you should get a nice error message. But what’s that:

Validation error: {com.example.acme.PassengersDontExceedSeatCount.message}

It seems the error message wasn’t resolved properly, so the raw interpolated message key from the annotation definition has been returned. Now why is this?

Bean Validation error messages are loaded through java.util.ResourceBundle, and due to the strong encapsulation of the modularized environment the Hibernate Validator module cannot "see" the resource bundle provided in the example module.

The updated JavaDocs of ResourceBundle make it clear that only bundles located in the same module as the caller of ResourceBundle#getBundle() can be accessed. In order to access resource bundles from other modules, the service loader mechanism is to be used as per Java 9; a new SPI interface, ResourceBundleProvider, has been added to the JDK for that purpose.
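The caller-sensitivity of the lookup can be illustrated with a small self-contained sketch (the bundle class and key are made up). The bundle is visible here because it lives alongside the caller; across a module boundary the same call would not find it:

```java
import java.util.ListResourceBundle;
import java.util.Locale;
import java.util.ResourceBundle;

public class BundleLookupDemo {

    // A class-based resource bundle, standing in for ValidationMessages.
    public static class Messages extends ListResourceBundle {
        @Override
        protected Object[][] getContents() {
            return new Object[][] { { "greeting", "Hello, modules!" } };
        }
    }

    public static void main(String[] args) {
        // The bundle lives in the same module (here: the unnamed module of
        // the classpath) as the caller, so getBundle() can see it.
        ResourceBundle bundle = ResourceBundle.getBundle(
                Messages.class.getName(), Locale.getDefault() );

        System.out.println( bundle.getString( "greeting" ) );
    }
}
```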

Ultimately, Bean Validation should take advantage of that mechanism, but how can we make things work out for now? As it turns out, Hibernate Validator has its own extension point for customizing the retrieval of resource bundles, ResourceBundleLocator.

This comes in very handy now: we just need to create an implementation of that SPI in the example module:

package com.example.acme.internal;

import java.util.Locale;
import java.util.ResourceBundle;

import org.hibernate.validator.spi.resourceloading.ResourceBundleLocator;

public class MyResourceBundleLocator implements ResourceBundleLocator {

    @Override
    public ResourceBundle getResourceBundle(Locale locale) {
        return ResourceBundle.getBundle( "ValidationMessages", locale );
    }
}

When bootstrapping the validator factory, configure a message interpolator using that bundle locator like this:

import org.hibernate.validator.messageinterpolation.ResourceBundleMessageInterpolator;
import com.example.acme.internal.MyResourceBundleLocator;


Validator validator = Validation.byDefaultProvider()
    .configure()
    .messageInterpolator( new ResourceBundleMessageInterpolator( new MyResourceBundleLocator() ) )
    .buildValidatorFactory()
    .getValidator();

As the call to ResourceBundle#getBundle() now originates from the same module that declares the ValidationMessages bundle, the bundle can be found and the error message will be interpolated correctly. Success!

Keeping your privacy

With the custom constraint in place, let’s think about encapsulation a bit more. Wouldn’t it be nice if the constraint validator implementation didn’t live in the exported package but rather somewhere under internal? After all, that class is an implementation detail and should not be referenced directly by users of the @PassengersDontExceedSeatCount constraint.

Another feature of Hibernate Validator is helpful here: service-loader based discovery of constraint validators.

This allows us to remove the reference from the constraint annotation to its validator (just declare it with an empty list, @Constraint(validatedBy = { })) and relocate the validator implementation to the internal package:

mv sources/com.example.acme/src/main/java/com/example/acme/ \

Also adapt the package declaration in the source file accordingly and add an import for the Car type. We then need to declare the constraint validator as service provider in the module descriptor:

module com.example.acme {
    // ...

    provides javax.validation.ConstraintValidator
        with com.example.acme.internal.PassengersDontExceedSeatCountValidator;
}

Compile and run the example module again. You should get an error like this:

java.lang.IllegalAccessException: class org.hibernate.validator.internal.util.privilegedactions.NewInstance (in module org.hibernate.validator.engine) cannot access class com.example.acme.internal.PassengersDontExceedSeatCountValidator (in module com.example.acme) because module com.example.acme does not export com.example.acme.internal to module org.hibernate.validator.engine

This originates from the fact that Hibernate Validator is using the service loader mechanism only for detecting validator types and then instantiates them for each specific constraint usage. As the internal package has not been exported, this instantiation is bound to fail. You have two options now:

  • Use a qualified export in module-info.java to expose that package to the Hibernate Validator module

  • Use the new -XaddExports option of the java command to dynamically add this export when running the module

Following the latter approach, the java invocation would look like this:

java \
    -modulepath modules:automatic-modules \
    -XaddExports:com.example.acme/com.example.acme.internal=org.hibernate.validator.engine \
    -m com.example.acme/com.example.acme.ValidationTest

While this approach works, it can become a bit tedious once other libraries that need to perform reflective operations on non-exported types enter the picture. JPA providers such as Hibernate ORM and dependency injection frameworks are just two examples.

Luckily, the OpenJDK team is aware of that issue and there is an entry for it in the requirements list for the Java Module System: ReflectiveAccessToNonExportedTypes. I sincerely hope that this one gets addressed before Java 9 gets finalized.

XML configuration

As the last part of our journey of "Jigsaw-ifying" Bean Validation, let’s take a look at XML-based configuration of constraints. This is a useful alternative if you cannot add constraint metadata to a model via annotations or e.g. want to override existing annotation-based constraints externally.

The Bean Validation spec defines a validation mapping file for this, which in turn can point to one or more constraint mapping XML files. Create the following files in order to override the @Min constraint of the Car class:

<?xml version="1.0" encoding="UTF-8"?>
<validation-config
    xmlns="http://jboss.org/xml/ns/javax/validation/configuration" version="1.1">

    <constraint-mapping>META-INF/constraints-car.xml</constraint-mapping>
</validation-config>

<?xml version="1.0" encoding="UTF-8"?>
<constraint-mappings
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://jboss.org/xml/ns/javax/validation/mapping validation-mapping-1.1.xsd"
    xmlns="http://jboss.org/xml/ns/javax/validation/mapping" version="1.1">

    <bean class="com.example.acme.Car" ignore-annotations="true">
        <field name="seatCount">
            <constraint annotation="javax.validation.constraints.Min">
                <element name="value">2</element>
            </constraint>
        </field>
    </bean>
</constraint-mappings>

Traditionally, Bean Validation will look for META-INF/validation.xml on the classpath and resolve any linked constraint mapping files relative to it. If you’ve followed this article this far, you won’t be surprised that this is not going to work with Jigsaw. There is no notion of a "flat classpath" any longer, and thus a validation provider cannot see XML files in other modules, akin to the case of error message bundles discussed above.

More specifically, the method ClassLoader#getResourceAsStream(), which is used by Hibernate Validator to open mapping files, won’t work for named modules as of JDK 9. That change will be a tough nut to crack for many projects when migrating to Java 9, as it renders strategies for resource loading known from existing modular environments such as OSGi inoperable. E.g. Hibernate Validator allows passing in a classloader for loading user-provided resources. In OSGi this can be used to pass in what’s called the bundle loader, allowing Hibernate Validator to access constraint mapping files and other things provided by the user. Unfortunately, this pattern cannot be employed with Jigsaw, as getResourceAsStream() "does not find resources in named modules".

But Bean Validation has a way out for this issue, too, as it allows to pass in constraint mappings as InputStream opened by the bootstrapping code. Class#getResourceAsStream() continues to work for resources from the same module, so things will work out as expected when bootstrapping the validator factory like this (don’t forget to close the stream afterwards):

InputStream constraintMapping = ValidationTest.class.getResourceAsStream( "/META-INF/constraints-car.xml" );

Validator validator = Validation.byDefaultProvider()
    .configure()
    .addMapping( constraintMapping )
    .buildValidatorFactory()
    .getValidator();

That way the constraint mapping is opened by code from the same module and thus can be accessed and passed to the Bean Validation provider.

In the longer term, APIs such as Bean Validation should foresee some kind of SPI contract for conveying all required configuration information as per Jigsaw spec lead Mark Reinhold. The user would then expose an instance of that contract via a service implementation.

Wrapping it up

With that, we conclude our experiment of making Bean Validation ready for the Jigsaw module system. Overall, things work out pretty well, and without too much effort Bean Validation and Hibernate Validator can be made first-class citizens in a fully-modularized world.

Some observations from the experiment:

  • The lack of optional module requirements poses a usability issue in my opinion, as it prevents libraries from exposing additional functionality based on which classes and modules are present at runtime in a given environment. This means that users of a library either need to provide modules they don’t actually need or - if the library has been cut into several modules, one per optional dependency - need to add several modules where a single one would have sufficed before.

  • The need to explicitly expose internal packages for reflective access by libraries such as Hibernate Validator, but also JPA providers or DI containers, can become tedious. I hope there will be a way to enable such access in a more global way, e.g. by whitelisting "trustworthy modules" such as the aforementioned libraries.

  • The changed behaviors around loading of resources such as configuration files or resource bundles provided by the user of a library will potentially affect many applications when migrating to Java 9. The established pattern of accepting an external classloader for loading user resources will not work anymore, so libraries need to adapt either by providing dedicated extension points (akin to addMapping(InputStream) in Bean Validation) or by migrating to service based approaches as envisioned by the makers of Jigsaw.

  • Tools such as Maven (including plug-ins, e.g. Surefire for running tests) or Gradle, but also IDEs, still need to catch up with Jigsaw. Using plain javac and java can be fun for a while, but you quickly find yourself wishing for more powerful tooling again :)

  • Converting Hibernate Validator into a Jigsaw module is relatively easy, as we luckily were very careful about a proper API/implementation split from the beginning. Modularizing existing libraries or applications without such clear distinction will be a much tougher exercise, as it may require lots of types to be moved around and unwanted (package) dependencies to be broken up. There are some tools that can help with that, but that might be a topic for a future blog post by itself.

One thing is for sure: interesting times lie ahead! While migration might be painful here and there, I think it’s overdue that Java gets its proper module system and I look forward to seeing it as an integrated part of the platform very much.

Got feedback from following the steps described above or from your own experiments with Jigsaw? Let’s all together learn from our different experiences and insights, so please share any thoughts on the topic below.

Many thanks to Sander Mak, Sanne Grinovero and Guillaume Smet for reviewing this post!

Emulating property literals with Java 8 method references

Posted by    |       |    Tagged as Discussions

One of the things library developers often miss in Java is property literals. In this post I’m going to show how to make creative use of Java 8 method references to emulate property literals, with the help of some byte code generation.

Akin to class literals (e.g. Customer.class), property literals would allow referring to the properties of a bean class in a type-safe manner. This would be useful for designing APIs that run actions on specific bean properties or apply some means of configuration to them. E.g. consider the API for programmatic configuration of index mappings in Hibernate Search:

new SearchMapping().entity( Address.class )
    .property( "city", ElementType.METHOD )

Or the validateValue() method from the Bean Validation API for validating a value against the constraints of a single property:

Set<ConstraintViolation<Address>> violations = validator.validateValue( Address.class, "city", "Purbeck" );

In both cases a String is used to represent the city property of the Address class.

That’s error-prone on several levels:

  • The Address class might not have a property city at all. Or one could forget to update that String reference when renaming the property.

  • In the case of validateValue(), there is no way for making sure that the passed value actually satisfies the type of the city property.

Users of the APIs will only find out about these issues when actually running their application. Wouldn’t it be nice if instead the compiler and the language’s type system prevented such wrong usages from the beginning? If Java had property literals, that’d be exactly what you’d get (invented syntax, this does not compile!):

mapping.entity( Address.class )
    .property( Address::city, ElementType.METHOD )


validator.validateValue( Address.class, Address::city, "Purbeck" );

The issues mentioned above would be avoided: Having a typo in a property literal would cause a compilation error which you’d notice right in your IDE. It’d allow designing the configuration API in Hibernate Search in a way that accepts only properties of Address when configuring the Address entity. And in the case of Bean Validation’s validateValue(), literals could help to ensure that only values assignable to the property type in question can be passed.

Java 8 method references

While Java 8 has no real property literals (and their introduction isn’t planned for Java 9 either), it provides an interesting way to emulate them to some degree: method references. Introduced to improve the developer experience when using lambda expressions, method references can also be leveraged as poor man’s property literals.

The idea is to consider a reference to a getter method as a property literal:

validator.validateValue( Address.class, Address::getCity, "Purbeck" );

Obviously, this will only work if there actually is a getter. But if your classes are following JavaBeans conventions - which often is the case - that’s fine.

Now how would the definition of the validateValue() method look like? The key is using the new Function type:

public <T, P> Set<ConstraintViolation<T>> validateValue(Class<T> type, Function<? super T, P> property, P value);

By using two type parameters, we make sure that the bean type, the property and the value passed for the property all correctly match. So API-wise, we got what we need: It’s safe to use, and the IDE will even auto-complete method names after starting to write Address::. But how do we derive the property name from the Function in the implementation of validateValue()?

That’s where the fun begins, as the Function interface just defines a single method, apply(), which runs the function against a given instance of T. That doesn’t seem exactly helpful, does it?

ByteBuddy to the rescue

As it turns out, applying the function actually does the trick! By creating a proxy instance of the T type, we have a target for invoking the method and can obtain its name in the proxy’s method invocation handler.

Java comes with support for dynamic proxies out of the box, but that’s limited to proxying interfaces. As our API should work with any kind of bean, including actual classes, I’m going to use a neat tool called ByteBuddy instead. ByteBuddy provides an easy-to-use DSL for creating classes on the fly, which is exactly what we need.
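Before reaching for byte code generation, the capturing trick itself can be demonstrated with the JDK’s built-in proxies, which is enough as long as the bean type is an interface. The following is a self-contained sketch; the Named interface and all method names are illustrative and not part of the actual implementation shown below:

```java
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;
import java.util.function.Function;

public class PropertyLiteralDemo {

    // Example bean interface; JDK proxies only work for interfaces,
    // which is why the real implementation resorts to ByteBuddy for classes
    public interface Named {
        String getName();
        boolean isActive();
    }

    // Derives the JavaBeans property name from a getter reference by
    // invoking it on a throw-away proxy and capturing the Method's name
    public static <T> String propertyName(Class<T> type, Function<? super T, ?> property) {
        String[] captured = new String[1];

        @SuppressWarnings("unchecked")
        T proxy = (T) Proxy.newProxyInstance(
                type.getClassLoader(),
                new Class<?>[] { type },
                (p, method, args) -> {
                    captured[0] = toPropertyName( method );
                    // return a value compatible with the getter's return type
                    return method.getReturnType() == boolean.class ? false : null;
                } );

        property.apply( proxy );
        return captured[0];
    }

    // "getName" -> "name", "isActive" -> "active"
    static String toPropertyName(Method method) {
        String name = method.getName();
        if ( name.startsWith( "get" ) ) {
            return Character.toLowerCase( name.charAt( 3 ) ) + name.substring( 4 );
        }
        if ( name.startsWith( "is" ) ) {
            return Character.toLowerCase( name.charAt( 2 ) ) + name.substring( 3 );
        }
        throw new IllegalArgumentException( "Not a getter: " + name );
    }

    public static void main(String[] args) {
        System.out.println( propertyName( Named.class, Named::getName ) );  // name
        System.out.println( propertyName( Named.class, Named::isActive ) ); // active
    }
}
```

The two-step idea - invoke the reference on a throw-away instance, record the invoked Method’s name - is exactly what the ByteBuddy-based implementation below generalizes to arbitrary classes.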

Let’s begin by defining an interface which just allows to store and obtain the name of a property we obtained from a method reference:

public interface PropertyNameCapturer {

    String getPropertyName();

    void setPropertyName(String propertyName);
}

Now let’s use ByteBuddy to programmatically create a proxy class which is assignable to the type of interest (e.g. Address) and PropertyNameCapturer:

public <T> T /* & PropertyNameCapturer */ getPropertyNameCapturer(Class<T> type) {
    DynamicType.Builder<?> builder = new ByteBuddy()                                       (1)
            .subclass( type.isInterface() ? Object.class : type );

    if ( type.isInterface() ) {                                                            (2)
        builder = builder.implement( type );
    }

    Class<?> proxyType = builder
        .implement( PropertyNameCapturer.class )                                           (3)
        .defineField( "propertyName", String.class, Visibility.PRIVATE )
        .method( ElementMatchers.any() )                                                   (4)
            .intercept( MethodDelegation.to( PropertyNameCapturingInterceptor.class ) )
        .method( named( "setPropertyName" ).or( named( "getPropertyName" ) ) )             (5)
            .intercept( FieldAccessor.ofBeanProperty() )
        .make()
        .load(                                                                             (6)
            PropertyNameCapturer.class.getClassLoader(),
            ClassLoadingStrategy.Default.WRAPPER
        )
        .getLoaded();

    try {
        @SuppressWarnings("unchecked")
        Class<T> typed = (Class<T>) proxyType;
        return typed.newInstance();                                                        (7)
    }
    catch (InstantiationException | IllegalAccessException e) {
        throw new HibernateException(
            "Couldn't instantiate proxy for method name retrieval", e
        );
    }
}
The code may appear a bit dense, so let me run you through it. First we obtain a new ByteBuddy instance (1) which is the entry point into the DSL. It is used to create a new dynamic type that either extends the given type (if it is a class) or extends Object and implements the given type if it is an interface (2).

Next, we let the type implement the PropertyNameCapturer interface and add a field for storing the name of the specified property (3). Then we say that invocations to all methods should be intercepted by PropertyNameCapturingInterceptor (we’ll come to that in a moment) (4). Only setPropertyName() and getPropertyName() (as declared in the PropertyNameCapturer interface) should be routed to write and read access of the field created before (5). Finally, the class is built, loaded (6) and instantiated (7).

That’s all that’s needed to create the proxy type; thanks to ByteBuddy, this is done in a few lines of code. Now let’s take a look at the interceptor we configured before:

public class PropertyNameCapturingInterceptor {

    public static Object intercept(@This PropertyNameCapturer capturer, @Origin Method method) {         (1)
        capturer.setPropertyName( getPropertyName( method ) );                                           (2)

        if ( method.getReturnType() == byte.class ) {                                                    (3)
            return (byte) 0;
        }
        else if ( ... ) { // ... handle all other primitive types
            // ...
        }
        else {
            return null;
        }
    }

    private static String getPropertyName(Method method) {                                               (4)
        final boolean hasGetterSignature = method.getParameterTypes().length == 0
                && method.getReturnType() != null;

        String name = method.getName();
        String propName = null;

        if ( hasGetterSignature ) {
            if ( name.startsWith( "get" ) ) {
                propName = name.substring( 3, 4 ).toLowerCase() + name.substring( 4 );
            }
            else if ( name.startsWith( "is" ) ) {
                propName = name.substring( 2, 3 ).toLowerCase() + name.substring( 3 );
            }
        }
        else {
            throw new HibernateException( "Only property getter methods are expected to be passed" );    (5)
        }

        return propName;
    }
}

intercept() accepts the Method being invoked as well as the target of the invocation (1). The annotations @Origin and @This are used to designate the respective parameters so ByteBuddy can generate the correct invocations of intercept() into the dynamic proxy type.

Note that there is no strong dependency from this interceptor to any types of ByteBuddy, meaning that ByteBuddy is only needed when creating that dynamic proxy type but not later on, when actually using it.

Via getPropertyName() (4) we then obtain the name of the property represented by the passed method object and store it in the PropertyNameCapturer (2). If the given method doesn’t represent a getter method, an exception is raised (5). The return value of the invoked getter is irrelevant, so we just make sure to return a sensible "null value" matching the property type (3).

With that, we got everything in place to get hold of the property represented by a method reference passed to validateValue():

public <T, P> Set<ConstraintViolation<T>> validateValue(Class<T> type, Function<? super T, P> property, P value) {
    T capturer = getPropertyNameCapturer( type );
    property.apply( capturer );
    String propertyName = ( (PropertyNameCapturer) capturer ).getPropertyName();

    // perform validation of the property value...
}

When applying the function to the property name capturing proxy, the interceptor will kick in, obtain the property name from the Method object and store it in the capturer instance, from where it can be retrieved finally.

And there you have it, some byte code magic lets us make creative use of Java 8 method references for emulating property literals.

That said, having real property literals as part of the language (dreaming for a moment, maybe Java 10?) would still be very beneficial. It’d allow dealing with private properties and, hopefully, one could refer to property literals from within annotations. Real property literals would also be more concise (no "get" prefix) and it’d generally feel a tad less hackish ;)

During my talk at VoxxedVienna on using Hibernate Search with Elasticsearch earlier this week, there was an interesting question which I couldn’t answer right away:

"When running a full-text query with a projection of fields, is it possible to return the result as a list of POJOs rather than as a list of arrays of Object?"

The answer is: yes, it is possible; result transformers are the right tool for this.

Let’s assume you want to convert the result of a projection query against the VideoGame entity shown in the talk into the following DTO (data transfer object):

public static class VideoGameDto {

    private String title;
    private String publisherName;
    private Date release;

    public VideoGameDto(String title, String publisherName, Date release) {
        this.title = title;
        this.publisherName = publisherName;
        this.release = release;
    }

    // getters...
}

This is how you could do it via a result transformer:

FullTextEntityManager ftem = ...;

QueryBuilder qb = ftem.getSearchFactory()
    .buildQueryBuilder()
    .forEntity( VideoGame.class )
    .get();

FullTextQuery query = ftem.createFullTextQuery(
        qb.keyword()
            .onField( "tags" )
            .matching( "round-based" )
            .createQuery(),
        VideoGame.class
    )
    .setProjection( "title", "", "release" )
    .setResultTransformer( new BasicTransformerAdapter() {
        public VideoGameDto transformTuple(Object[] tuple, String[] aliases) {
            return new VideoGameDto( (String) tuple[0], (String) tuple[1], (Date) tuple[2] );
        }
    } );

List<VideoGameDto> results = query.getResultList();

I’ve pushed this example to the demo repo on GitHub.

There are also some ready-made implementations of the ResultTransformer interface which you might find helpful. So be sure to check out its type hierarchy. For this example I found it easiest to extend BasicTransformerAdapter and implement the transformTuple() method by hand.
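If you find yourself writing many such transformers, the tuple-to-constructor mapping can be generalized with a bit of reflection. The following is a hypothetical helper, not part of Hibernate Search; you could call it from a transformTuple() implementation (the nested VideoGameDto is a trimmed stand-in for the DTO above, kept here so the sketch compiles on its own):

```java
import java.lang.reflect.Constructor;

public class TupleMapper {

    // Trimmed stand-in for the DTO from the post, for a self-contained sketch
    public static class VideoGameDto {
        public final String title;
        public final String publisherName;

        public VideoGameDto(String title, String publisherName) {
            this.title = title;
            this.publisherName = publisherName;
        }
    }

    // Instantiates the DTO type via the first constructor whose arity
    // matches the projected tuple's length
    public static <T> T map(Object[] tuple, Class<T> dtoType) {
        for ( Constructor<?> constructor : dtoType.getDeclaredConstructors() ) {
            if ( constructor.getParameterCount() == tuple.length ) {
                try {
                    @SuppressWarnings("unchecked")
                    T dto = (T) constructor.newInstance( tuple );
                    return dto;
                }
                catch (ReflectiveOperationException e) {
                    throw new IllegalArgumentException( e );
                }
            }
        }
        throw new IllegalArgumentException( "No matching constructor on " + dtoType );
    }

    public static void main(String[] args) {
        VideoGameDto dto = map( new Object[] { "F-Zero", "Nintendo" }, VideoGameDto.class );
        System.out.println( dto.title + " by " + dto.publisherName ); // F-Zero by Nintendo
    }
}
```

Note this matches constructors by arity only; a production version would also check parameter types against the tuple elements.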

To the person asking the question: Thanks, and I hope this answer is helpful to you!

Hibernate Validator 5.2.4.Final is out

Posted by    |       |    Tagged as Hibernate Validator Releases

It’s my pleasure to announce the release of Hibernate Validator 5.2.4.Final!

This is a rather small bugfix release which addresses two nasty issues around one of the more advanced features of Bean Validation, redefined default group sequences.

Please refer to the issues themselves (HV-1055, HV-1057) for all the details.

Where do I get it?

Use the GAV coordinates org.hibernate:{hibernate-validator|hibernate-validator-cdi|hibernate-validator-annotation-processor}:5.2.4.Final to fetch the release with Maven, Gradle etc. Alternatively, you can find distribution bundles containing all the bits on SourceForge (TAR.GZ, ZIP).

Found a bug? Have a feature request? Then let us know through the following channels:

Hibernate Validator 5.2.3.Final is out

Posted by    |       |    Tagged as Hibernate Validator Releases

It’s my pleasure to announce the release of Hibernate Validator 5.2.3.Final!

Wait, didn’t we already do another Hibernate Validator release earlier this month? That’s right, indeed we pushed out the first Alpha of the 5.3 family a couple of days ago. And normally that’d mean that there would be no further releases of earlier version families.

But in this case we decided to make an exception to the rule, as we noticed that Hibernate Validator couldn’t be used with Java 9 (check out issue HV-1048 if you are interested in the details). As we don’t want to keep integrators and users of Hibernate Validator from testing their own software on Java 9, we decided to fix that issue on the current stable release line (in fact, we strongly encourage you to test your applications on Java 9 to learn as early as possible about any potential changes you might need to make).

While we were at it, we backported some further bugfixes from 5.3 to 5.2, amongst them one ensuring compatibility with Google App Engine. As always, you can find the complete list of fixes in the changelog.

Where do I get it?

Use the GAV coordinates org.hibernate:{hibernate-validator|hibernate-validator-cdi|hibernate-validator-annotation-processor}:5.2.3.Final to fetch the release with Maven, Gradle etc. Alternatively, you can find distribution bundles containing all the bits on SourceForge (TAR.GZ, ZIP).

Found a bug? Have a feature request? Then let us know through the following channels:

Hibernate Validator 5.3.0.Alpha1 is out

Posted by    |       |    Tagged as Hibernate Validator Releases

It’s my pleasure to announce the first release of Hibernate Validator 5.3!

The overarching idea for the 5.3 timeline is to prototype several features which may potentially be standardized in the Bean Validation 2.0 specification. For instance we’ll work on a solution for the long-standing request for sorting the constraints on single properties.

If you’d like to see any specific features addressed in that prototyping work (and eventually included in BV 2.0), then please get in touch and let us know which are the most important things you are missing from the spec. We’ve compiled a first list of issues we are considering for inclusion in BV 2.0. For sure we cannot address all of them, so it’ll greatly help if you tell us what would be most helpful to you.

Dynamic payloads for constraints

To get things rolling, the Alpha 1 allows you to enrich custom constraint violations with additional context data. Code examining constraint violations can access and interpret this data in a safer way than by parsing string-based constraint violation messages. Think of it as a dynamic variant of the existing Bean Validation payload feature.

As an example, let’s assume we have a constraint @Matches for making sure a long property matches a given value with some tolerance:

public static class Package {

    @Matches(value=1000, tolerance=100)
    public long weight;
}

If the annotated value is invalid, the resulting constraint violation should have a specific severity, depending on whether the value lies within the given tolerance or not. That severity value could then for instance be used for formatting the error specifically in a UI.
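The severity decision itself is plain arithmetic. Here is a minimal, self-contained sketch of that check (class and method names are illustrative); the validator shown below performs the same comparison:

```java
public class SeverityDemo {

    public enum Severity { WARN, ERROR }

    // WARN if the actual value lies within the tolerance around the
    // expected value, ERROR otherwise
    public static Severity severityFor(long expected, long actual, long tolerance) {
        return Math.abs( expected - actual ) < tolerance ? Severity.WARN : Severity.ERROR;
    }

    public static void main(String[] args) {
        System.out.println( severityFor( 1000, 1050, 100 ) ); // WARN
        System.out.println( severityFor( 1000, 2000, 100 ) ); // ERROR
    }
}
```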

The definition of the @Matches constraint is nothing new, it’s just a regular custom constraint annotation:

@Constraint(validatedBy = { MatchesValidator.class })
public @interface Matches {

    public enum Severity { WARN, ERROR; }

    String message() default "Must match {value} with a tolerance of {tolerance}";
    Class<?>[] groups() default {};
    Class<? extends Payload>[] payload() default {};

    long value();
    long tolerance();
}

The constraint validator is where it’s getting interesting:

public class MatchesValidator implements ConstraintValidator<Matches, Long> {

    private long tolerance;
    private long value;

    public void initialize(Matches constraintAnnotation) {
        this.value = constraintAnnotation.value();
        this.tolerance = constraintAnnotation.tolerance();
    }

    public boolean isValid(Long value, ConstraintValidatorContext context) {
        if ( value == null ) {
            return true;
        }

        if ( this.value == value.longValue() ) {
            return true;
        }

        HibernateConstraintValidatorContext hibernateContext = context.unwrap(
                HibernateConstraintValidatorContext.class
        );
        hibernateContext.withDynamicPayload(
                Math.abs( this.value - value ) < tolerance ? Severity.WARN : Severity.ERROR
        );

        return false;
    }
}
In isValid() the severity object is set via HibernateConstraintValidatorContext#withDynamicPayload(). Note that the payload object must be serializable in case constraint violations are sent to remote clients, e.g. via RMI.

Validation clients may then access the dynamic payload like so:

HibernateConstraintViolation<?> violation = violations.iterator()
    .next()
    .unwrap( HibernateConstraintViolation.class );

if ( violation.getDynamicPayload( Severity.class ) == Severity.ERROR ) {
    // ...
}

What else is there?

Other features of the Alpha 1 release are improved OSGi support (many thanks to Benson Margulies for this!), optional relaxation of parameter constraints (kudos to Chris Beckey!) and several bug fixes and improvements, amongst them support for cross-parameter constraints in the annotation processor (cheers to Nicola Ferraro!).

You can find the complete list of all addressed issues in the change log. To get the release with Maven, Gradle etc. use the GAV coordinates org.hibernate:{hibernate-validator|hibernate-validator-cdi|hibernate-validator-annotation-processor}:5.3.0.Alpha1. Alternatively, a distribution bundle containing all the bits is provided on SourceForge (TAR.GZ, ZIP).

To get in touch, use the following channels:

Hibernate Validator 5.2.2 released

Posted by    |       |    Tagged as Hibernate Validator Releases

I am happy to announce the availability of Hibernate Validator 5.2.2.Final.

This release fixes several bugs, including a nasty regression around private property declarations in inheritance hierarchies and a tricky issue related to classloading in environments such as OSGi.

We also closed a gap in the API for constraint declaration, which allows ignoring annotation-based constraints for specific methods, parameters etc.:

HibernateValidatorConfiguration config = Validation.byProvider( HibernateValidator.class ).configure();

ConstraintMapping mapping = config.createConstraintMapping();
mapping.type( OrderService.class )
    .method( "placeOrder", Item.class, int.class )
        .ignoreAnnotations( true )
        .parameter( 0 )
            .ignoreAnnotations( false );
config.addMapping( mapping );

Validator validator = config.buildValidatorFactory().getValidator();

Please refer to the change log for the complete list of all issues. You can get the release with Maven, Gradle etc. using the GAV coordinates org.hibernate:hibernate-validator:5.2.2.Final. Alternatively, a distribution bundle is provided on SourceForge (TAR.GZ, ZIP).

Get in touch through the following channels:

Order, ooorder! Sorting results in Hibernate Search 5.5

Posted by    |       |    Tagged as Hibernate Search

"Order, ooorder!" - Sometimes not only the honourable members of the House of Commons need to be called to order, but also the results of Hibernate Search queries need to be ordered in a specific way.

To do so, just pass a Sort object to your full-text query before executing it, specifying the field(s) to sort on:

FullTextSession session = ...;
QueryParser queryParser = ...;

FullTextQuery query = session.createFullTextQuery( queryParser.parse( "summary:lucene" ), Book.class );
Sort sort = new Sort( new SortField( "title", SortField.Type.STRING, false ) );
query.setSort( sort );
List<Book> result = query.list();

As of Lucene 5 (which is what Hibernate Search 5.5 is based on), there is a big performance gain if the fields to sort on are known up front. In this case these fields can be stored as so-called "doc value fields", which is much faster and less memory-consuming than the traditional approach of index un-inverting.

For that purpose, Hibernate Search provides the new annotation @SortableField (and its multi-valued companion, @SortableFields) for tagging those fields that should be available for sorting. The following example shows how to do it:

@Indexed(index = "Book")
public class Book {

    private Integer id;

    @Field
    @SortableField
    @DateBridge(resolution = Resolution.DAY)
    private Date publicationDate;

    @Fields({
        @Field,
        @Field(name = "sortTitle", analyze = Analyze.NO, store = Store.NO, index = Index.NO)
    })
    @SortableField(forField = "sortTitle")
    private String title;

    @Field
    private String summary;

    // constructor, getters, setters ...
}

@SortableField is used next to the known @Field annotation. In case a single field exists for a given property (e.g. publicationDate) just specifying the @SortableField annotation is enough to make that field sortable. If several fields exist (see the title property), specify the field name via @SortableField#forField().

Note that fields used for sorting must not be analyzed. In case you want to index a given property in analyzed form for searching purposes, just add another, un-analyzed field for sorting, as shown for the title property. If the field is only needed for sorting and nothing else, you may configure it as un-indexed and un-stored, thus avoiding unnecessary index growth.

For using the configured sort fields when querying nothing has changed. Just specify a Sort with the required field(s):

FullTextQuery query = ...;
Sort sort = new Sort(
    new SortField( "publicationDate", SortField.Type.LONG, false ),
    new SortField( "sortTitle", SortField.Type.STRING, false )
query.setSort( sort );

Now what happens if you sort on fields which you have not explicitly declared as sortable, e.g. when migrating an existing application over to Hibernate Search 5.5? The good news is that the sort will be applied as expected from a functional perspective. Hibernate Search detects the missing sort fields and transparently falls back to index uninverting.

But be aware that this comes with a performance penalty (it is also quite memory-intensive, as uninverting is a RAM-only operation) and it might even happen that this functionality will be removed from Lucene in a future version altogether. Thus watch out for messages like the following in your log files and follow the advice to declare the missing sort fields:

WARN ManagedMultiReader:142 - HSEARCH000289: Requested sort field(s) summary_forSort are not configured for entity \
type mapped to index Book, thus an uninverting reader must be created. You \
should declare the missing sort fields using @SortableField.

When migrating an existing application, be sure to rebuild the affected index(es) as described in the reference guide.

With all the required sort fields configured, your search results will be in order, just like the British parliament members after being called to order by Mr. Speaker.

It’s my pleasure to announce the first Alpha release of Hibernate OGM 5!

This release is based on Hibernate ORM 5.0 Final which we released just last week. The update should be smooth in general, but you should be prepared for some changes if you are bootstrapping Hibernate OGM manually through the Hibernate API and not via JPA. If you are using Hibernate OGM on WildFly, you need to adapt your application to the changed module/slot name of the Hibernate OGM core module which has changed from org.hibernate:ogm to org.hibernate.ogm:main.

Check out the Hibernate OGM migration notes to learn more about migrating from earlier versions of Hibernate OGM to 5.x. Also the Hibernate ORM migration guide is a recommended read.

Experimental support for Redis

Hibernate OGM 5 brings tech preview support for Redis which is a high-performance key/value store with many interesting features.

A huge thank you goes out to community member Mark Paluch for this fantastic contribution! Originally started by Seiya Kawashima, Mark took up the work on this backend and delivered a great piece of work in no time. Looking forward to many more of his contributions to come!

The general mapping approach is to store JSON documents as values in Redis. For instance consider the following entity and embeddables:

public class Account {

    private String login;
    private String password;
    private Address homeAddress;

    // getters, setters etc.
}

public class Address {

    private String street;
    private String city;
    private String zipCode;
    private String country;
    private AddressType type;

    // getters, setters etc.
}

public class AddressType {

    private String name;

    // getters, setters etc.
}

This will be persisted into Redis as a JSON document like this under the key "Account:piere":

{
    "homeAddress": {
        "country": "France",
        "city": "Paris",
        "postalCode": "75007",
        "street": "1 avenue des Champs Elysees",
        "type": {
            "name": "main"
        }
    },
    "password": "like I would tell ya"
}

Refer to the Redis chapter of the reference guide to learn more about this new dialect and its capabilities. It is quite powerful already (almost all tests of the Hibernate OGM backend TCK pass) and there is support for using it in WildFly, too.

While JSON is a popular choice for storing structured data amongst Redis users, we will investigate alternative mapping approaches, too. Specifically, one interesting approach would be to store entity properties using Redis hashes. This poses some interesting challenges, though, e.g. regarding type conversion (only String values are supported in hashes) as well as handling of embedded objects and associations.

So if you are faced with the challenge of persisting object models into Redis, give this new backend a try and let us know what you think, open feature requests etc.

Improved mapping of Map properties

Map-typed entity properties are persisted in a more natural format now in MongoDB (and also with the new Redis backend). The following shows an example:

public class User {

    private String id;

    @MapKeyColumn(name = "addressType")
    private Map<String, Address> addresses = new HashMap<>();

    // getters, setters etc.
}

In previous versions of Hibernate OGM this would have been mapped to a MongoDB document like this:

{
    "id" : 123,
    "addresses" : [
        { "addressType" : "home", "addresses_id" : 456 },
        { "addressType" : "work", "addresses_id" : 789 }
    ]
}
This is not what one would expect from a document store mapping, though. Therefore Hibernate OGM 5 will create the following, more natural representation instead:

{
    "id" : 123,
    "addresses" : {
        "home" : 456,
        "work" : 789
    }
}
This representation is more concise and should improve interoperability with other clients working on the same database. If needed - e.g. for migration purposes - you can enforce the previous mapping through the hibernate.ogm.datastore.document.map_storage property. Check out the reference guide for the details.
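Assuming the option is set in META-INF/persistence.xml, enforcing the previous list-based mapping could look like the following (the property name is the one mentioned above; AS_LIST is this author's assumption for the value, so verify the accepted values in the reference guide):

```xml
<property name="hibernate.ogm.datastore.document.map_storage" value="AS_LIST" />
```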

The optimized mapping is currently only applied if the map key comprises a single column of type String. For other types, e.g. Long, or for composite map keys, the previous mapping is used, since JSON/BSON only supports string field names.

An open question for us is whether other key column types should be converted into a string or not. If for instance the addresses map was of type Map<Long, Address> one could think of storing the map keys using field names such as "1", "2" etc. Let us know whether that’s something you’d find helpful or not.
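A hypothetical sketch of such a conversion (this is not existing Hibernate OGM behavior, just an illustration of the idea) could turn non-String map keys into the string field names JSON/BSON requires:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch, not Hibernate OGM code: converting Map<Long, V>
// keys into the String field names a JSON/BSON document requires.
public class MapKeyConverter {

    public static <V> Map<String, V> stringifyKeys(Map<Long, V> source) {
        Map<String, V> result = new LinkedHashMap<>();
        for (Map.Entry<Long, V> entry : source.entrySet()) {
            // "1", "2", ... become the document field names
            result.put(Long.toString(entry.getKey()), entry.getValue());
        }
        return result;
    }

    public static void main(String[] args) {
        Map<Long, String> addresses = new LinkedHashMap<>();
        addresses.put(1L, "home");
        addresses.put(2L, "work");
        System.out.println(stringifyKeys(addresses));
    }
}
```

The open question is whether such an implicit conversion is helpful or whether it hides the original key type from other clients of the database.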

Support for multi-get operations

One of the many optimizations in Hibernate ORM is batch fetching of lazily loaded entities. This is controlled using the @BatchSize annotation. So far, Hibernate OGM did not support batch fetching, resulting in more round trips to the datastore than actually needed.

This situation has been improved by introducing MultigetGridDialect, an optional "capability interface" that Hibernate OGM backends can implement. If a backend supports this contract, the Hibernate OGM engine will take advantage of it and fetch entities configured for lazy loading in batches, resulting in better performance.

At the moment the new Redis backend makes use of this, with the MongoDB and Neo4j backends following soon.
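The effect of batch fetching can be sketched as follows; this hypothetical helper (not the actual MultigetGridDialect contract) groups the ids of lazily loaded entities into batches so each batch can be fetched with a single multi-get round trip:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical illustration of batch fetching: instead of one datastore
// round trip per id, ids are grouped into batches of the configured size.
public class BatchFetching {

    public static <T> List<List<T>> partition(List<T> ids, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < ids.size(); i += batchSize) {
            batches.add(ids.subList(i, Math.min(i + batchSize, ids.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        // 5 lazily loaded entities with a @BatchSize-style value of 2
        // yield 3 round trips instead of 5
        List<List<Integer>> batches = partition(List.of(1, 2, 3, 4, 5), 2);
        System.out.println(batches.size());
    }
}
```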

Upgrade to MongoDB driver 3.0

We have upgraded to version 3.0 of the MongoDB driver. Most users of Hibernate OGM should not be affected by this, but down the road it will allow for some nice performance optimizations and support for new functionality.

Together with the driver update we have reorganized the connection-level options of the MongoDB backend. All String, int and boolean MongoDB client options can be configured now through the hibernate.ogm.mongodb.driver.* namespace:
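For example (illustrative values; connectTimeout, socketTimeout and sslEnabled are properties of the MongoDB 3.0 client options builder, but consult the reference guide for the full list of supported options):

```properties
hibernate.ogm.mongodb.driver.connectTimeout=10000
hibernate.ogm.mongodb.driver.socketTimeout=5000
hibernate.ogm.mongodb.driver.sslEnabled=false
```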


These options will be passed on to MongoDB’s client builder as-is. Note that the previously existing option hibernate.ogm.mongodb.connection_timeout has been removed in favor of this new approach.

Where can I get it?

You can retrieve Hibernate OGM 5.0.0.Alpha1 via Maven etc. using the following coordinates:

  • org.hibernate.ogm:hibernate-ogm-core:5.0.0.Alpha1 for the Hibernate OGM core module

  • org.hibernate.ogm:hibernate-ogm-<%BACKEND%>:5.0.0.Alpha1 for the NoSQL backend you want to use, with <%BACKEND%> being one of "mongodb", "redis", "neo4j" etc.
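For example, when using the MongoDB backend with Maven, the dependency declaration built from the coordinates above would be:

```xml
<dependency>
    <groupId>org.hibernate.ogm</groupId>
    <artifactId>hibernate-ogm-mongodb</artifactId>
    <version>5.0.0.Alpha1</version>
</dependency>
```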

Alternatively, you can download archives containing all the binaries, source code and documentation from SourceForge.

As always, we are very much looking forward to your feedback. The change log tells in detail what’s in there for you. Get in touch through the following channels:
