Red Hat

In Relation To

The Hibernate team blog on everything data.

Null and not-null @DiscriminatorValue options

Tagged as: Discussions, Hibernate ORM

Inheritance and discriminator columns

Although it can also be used with JOINED table inheritance, the @DiscriminatorValue annotation is most commonly used for SINGLE_TABLE inheritance. For SINGLE_TABLE, the discriminator column tells Hibernate which subclass entity type is associated with each particular database row.

Without specifying a discriminator column, Hibernate is going to use the default DTYPE column. To visualize how it works, consider the following Domain Model inheritance hierarchy:

@Entity(name = "Account")
@Inheritance(strategy = InheritanceType.SINGLE_TABLE)
public static class Account {

    private Long id;

    private String owner;

    private BigDecimal balance;

    private BigDecimal interestRate;

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public String getOwner() {
        return owner;
    }

    public void setOwner(String owner) {
        this.owner = owner;
    }

    public BigDecimal getBalance() {
        return balance;
    }

    public void setBalance(BigDecimal balance) {
        this.balance = balance;
    }

    public BigDecimal getInterestRate() {
        return interestRate;
    }

    public void setInterestRate(BigDecimal interestRate) {
        this.interestRate = interestRate;
    }
}

@Entity(name = "DebitAccount")
public static class DebitAccount extends Account {

    private BigDecimal overdraftFee;

    public BigDecimal getOverdraftFee() {
        return overdraftFee;
    }

    public void setOverdraftFee(BigDecimal overdraftFee) {
        this.overdraftFee = overdraftFee;
    }
}

@Entity(name = "CreditAccount")
public static class CreditAccount extends Account {

    private BigDecimal creditLimit;

    public BigDecimal getCreditLimit() {
        return creditLimit;
    }

    public void setCreditLimit(BigDecimal creditLimit) {
        this.creditLimit = creditLimit;
    }
}

For this mapping, Hibernate generates the following database table:

create table Account (
    DTYPE varchar(31) not null,
    id bigint not null,
    balance decimal(19,2),
    interestRate decimal(19,2),
    owner varchar(255),
    overdraftFee decimal(19,2),
    creditLimit decimal(19,2),
    primary key (id)
)

So when inserting two subclass entities:

DebitAccount debitAccount = new DebitAccount();
debitAccount.setId( 1L );
debitAccount.setOwner( "John Doe" );
debitAccount.setBalance( BigDecimal.valueOf( 100 ) );
debitAccount.setInterestRate( BigDecimal.valueOf( 1.5d ) );
debitAccount.setOverdraftFee( BigDecimal.valueOf( 25 ) );

CreditAccount creditAccount = new CreditAccount();
creditAccount.setId( 2L );
creditAccount.setOwner( "John Doe" );
creditAccount.setBalance( BigDecimal.valueOf( 1000 ) );
creditAccount.setInterestRate( BigDecimal.valueOf( 1.9d ) );
creditAccount.setCreditLimit( BigDecimal.valueOf( 5000 ) );

Hibernate will populate the DTYPE column with the name of the subclass entity:

INSERT INTO Account (balance, interestRate, owner, overdraftFee, DTYPE, id)
VALUES (100, 1.5, 'John Doe', 25, 'DebitAccount', 1)

INSERT INTO Account (balance, interestRate, owner, creditLimit, DTYPE, id)
VALUES (1000, 1.9, 'John Doe', 5000, 'CreditAccount', 2)

While this is rather straightforward for most use cases, when integrating a legacy database schema, the discriminator column may contain NULLs or values that are not associated with any entity subclass.

Consider that our database contains records like these:

INSERT INTO Account (DTYPE, balance, interestRate, owner, id)
VALUES (NULL, 300, 0.9, 'John Doe', 3)

INSERT INTO Account (DTYPE, active, balance, interestRate, owner, id)
VALUES ('Other', true, 25, 0.5, 'Johnny Doe', 4)

INSERT INTO Account (DTYPE, active, balance, interestRate, owner, id)
VALUES ('Unsupported', false, 35, 0.6, 'John Doe Jr.', 5)

With the previous mappings, when trying to fetch all Account(s):

Map<Long, Account> accounts = entityManager.createQuery(
        "select a from Account a", Account.class )
.getResultList()
.stream()
.collect( Collectors.toMap( Account::getId, Function.identity() ) );

We’d bump into the following kinds of issues:

org.hibernate.WrongClassException: Object [id=3] was not of the specified subclass
[org.hibernate.userguide.inheritance.Account] : Discriminator: null

org.hibernate.WrongClassException: Object [id=4] was not of the specified subclass
[org.hibernate.userguide.inheritance.Account] : Discriminator: Other

org.hibernate.WrongClassException: Object [id=5] was not of the specified subclass
[org.hibernate.userguide.inheritance.Account] : Discriminator: Unsupported

Fortunately, Hibernate allows us to handle these mappings by using NULL and NOT NULL discriminator value mapping.

For the NULL values, we can annotate the base class Account entity as follows:

@Entity(name = "Account")
@Inheritance(strategy = InheritanceType.SINGLE_TABLE)
@DiscriminatorValue( "null" )
public static class Account {

    private Long id;

    private String owner;

    private BigDecimal balance;

    private BigDecimal interestRate;

    // Getters and setters omitted for brevity
}

For the Other and Unsupported discriminator values, we can have a miscellaneous entity that handles all values that were not explicitly mapped:

@Entity(name = "MiscAccount")
@DiscriminatorValue( "not null" )
public static class MiscAccount extends Account {

    private boolean active;

    public boolean isActive() {
        return active;
    }

    public void setActive(boolean active) {
        this.active = active;
    }
}

This way, the aforementioned polymorphic query works and we can even validate the results:

assertEquals(5, accounts.size());
assertEquals( DebitAccount.class, accounts.get( 1L ).getClass() );
assertEquals( CreditAccount.class, accounts.get( 2L ).getClass() );
assertEquals( Account.class, accounts.get( 3L ).getClass() );
assertEquals( MiscAccount.class, accounts.get( 4L ).getClass() );
assertEquals( MiscAccount.class, accounts.get( 5L ).getClass() );
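To recap the lookup semantics these two options give us, here is an illustrative sketch. The `DiscriminatorResolver` class is hypothetical, not Hibernate's actual implementation; it merely mimics the resolution order: an explicit discriminator value wins, NULL maps to the "null" entity, and any other unmapped value falls back to the "not null" entity.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch (not Hibernate's actual code) of the resolution order
// implied by the "null" and "not null" discriminator value mappings.
public class DiscriminatorResolver {

    private final Map<String, Class<?>> explicitMappings = new HashMap<>();
    private Class<?> nullMapping;
    private Class<?> notNullMapping;

    public DiscriminatorResolver map(String value, Class<?> entityType) {
        if ( "null".equals( value ) ) {
            nullMapping = entityType;
        }
        else if ( "not null".equals( value ) ) {
            notNullMapping = entityType;
        }
        else {
            explicitMappings.put( value, entityType );
        }
        return this;
    }

    public Class<?> resolve(String discriminatorValue) {
        if ( discriminatorValue == null ) {
            return nullMapping;
        }
        Class<?> explicit = explicitMappings.get( discriminatorValue );
        return explicit != null ? explicit : notNullMapping;
    }
}
```

With the mappings from this post, `resolve( null )` yields Account, `resolve( "Other" )` and `resolve( "Unsupported" )` yield MiscAccount, and the explicit values yield their respective subclasses.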

I have also updated the Hibernate 5.0, 5.1, and 5.2 documentation with these two very useful mapping options.

How we fixed all database connection leaks

Tagged as: Discussions, Hibernate ORM

The context

By default, all Hibernate tests run on H2. However, we have lots of database-specific tests as well, so we should also be testing on Oracle, PostgreSQL, MySQL, and possibly SQL Server.

When we tried to set up a Jenkins job that uses PostgreSQL, we realized that the job fails because we ran out of connections. Knowing that the PostgreSQL server has a max_connections setting of 30, we realized the connection leak issue was significant.

Needle in a haystack

Just the hibernate-core module alone has over 5000 tests, and hibernate-envers has around 2500 tests as well. But there are many more modules: hibernate-c3p0, hibernate-ehcache, hibernate-jcache, etc. All in all, we couldn’t just browse the code and spot issues. We needed an automated connection leak detector.

In the end, I came up with a solution that works on H2, Oracle, PostgreSQL, and MySQL alike. Luckily, no problem was spotted in the actual framework code base; all issues were caused by unit tests which did not handle database resources properly.
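The detector itself is not shown in the post. As an illustration only, here is a hypothetical sketch of the core idea: wrap every Connection handed out in a counting proxy, so that a non-zero balance of acquired-versus-closed connections after a test run points at the leaking test. The `ConnectionLeakUtil` name and design are assumptions, not the actual Hibernate test infrastructure.

```java
import java.lang.reflect.Proxy;
import java.sql.Connection;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical leak detector sketch: count connections handed out vs. closed.
// A non-zero count after a test class finishes indicates a leak.
public class ConnectionLeakUtil {

    private final AtomicInteger openConnections = new AtomicInteger();

    // Wrap a Connection in a dynamic proxy that decrements the counter on close()
    public Connection track(Connection target) {
        openConnections.incrementAndGet();
        return (Connection) Proxy.newProxyInstance(
                Connection.class.getClassLoader(),
                new Class<?>[] { Connection.class },
                (proxy, method, args) -> {
                    if ( "close".equals( method.getName() ) ) {
                        openConnections.decrementAndGet();
                    }
                    return method.invoke( target, args );
                } );
    }

    // Should be zero once all sessions and registries have been cleaned up
    public int openConnectionCount() {
        return openConnections.get();
    }
}
```

A real detector would also need a hook into the ConnectionProvider so every acquired connection goes through `track()`, and an assertion run after each test class.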

The most common issues

One of the most widespread issues was caused by improper bootstrapping logic:

public void testInvalidMapping() {
    try {
        new MetadataSources( )
                .addAnnotatedClass( TheEntity.class )
                .buildMetadata();
        fail( "Was expecting failure" );
    }
    catch (AnnotationException ignore) {
    }
}

The issue here is that MetadataSources creates a BootstrapServiceRegistry behind the scenes, which in turn triggers the initialization of the underlying ConnectionProvider. Without closing the BootstrapServiceRegistry explicitly, the ConnectionProvider will not get a chance to close all the currently pooled JDBC Connection(s).

The fix is as simple as that:

public void testInvalidMapping() {
    MetadataSources metadataSources = new MetadataSources( )
        .addAnnotatedClass( TheEntity.class );
    try {
        metadataSources.buildMetadata();
        fail( "Was expecting failure" );
    }
    catch (AnnotationException ignore) {
    }
    finally {
        ServiceRegistry metaServiceRegistry = metadataSources.getServiceRegistry();
        if ( metaServiceRegistry instanceof BootstrapServiceRegistry ) {
            BootstrapServiceRegistryBuilder.destroy( metaServiceRegistry );
        }
    }
}

Another recurring issue was improper transaction handling such as in the following example:

protected void cleanup() {
    Session s = getFactory().openSession();

    TestEntity testEntity = s.get( TestEntity.class, "foo" );
    Assert.assertTrue( testEntity.getParams().isEmpty() );

    TestOtherEntity testOtherEntity = s.get( TestOtherEntity.class, "foo" );
    Assert.assertTrue( testOtherEntity.getParams().isEmpty() );
}

The first thing to notice is the lack of a try/finally block which should be closing the session even if there is an exception being thrown. But that’s not all.

Not long ago, I fixed HHH-7412, meaning that, for RESOURCE_LOCAL (e.g. JDBC Connection-bound transactions), the logical or physical Connection is closed only when the transaction is ended (either commit or rollback).

Before HHH-7412 was fixed, the Connection was closed automatically when the Hibernate Session was closed as well, but this behavior is not supported anymore. Nowadays, aside from closing the underlying Session, you have to commit/rollback the current running Transaction as well:

protected void cleanup() {
    Session s = getFactory().openSession();
    s.beginTransaction();
    try {
        TestEntity testEntity = s.get( TestEntity.class, "foo" );
        Assert.assertTrue( testEntity.getParams().isEmpty() );

        TestOtherEntity testOtherEntity = s.get( TestOtherEntity.class, "foo" );
        Assert.assertTrue( testOtherEntity.getParams().isEmpty() );

        s.getTransaction().commit();
    }
    catch ( RuntimeException e ) {
        s.getTransaction().rollback();
        throw e;
    }
    finally {
        s.close();
    }
}

If you are curious about all the changes that were required, you can check the following two commits: da9c6e1 and f5e10c2. The good news is that the PostgreSQL job is running fine now, and soon we will add jobs for Oracle and MySQL, too.

Updating Hibernate ORM in WildFly

Tagged as: Hibernate ORM

In this post I’ll show you how easy it is to use the latest and greatest version of Hibernate ORM with WildFly 10.

Traditionally, updating Hibernate in WildFly required some good knowledge of the server’s module system and the structure of the ORM modules. It certainly was doable, but it involved search/replace in existing module descriptors and generally wasn’t very convenient.

This has become much simpler with last week’s release of Hibernate ORM 5.2.1!

We now provide a ZIP archive containing all the required modules, making it a breeze to add the latest version of Hibernate to an existing WildFly instance. And what’s best: the version of Hibernate packaged with the application server remains untouched; switching between this and the new version is just a matter of setting one small configuration option, and you can go back at any time.


The ZIP file is available in Maven Central, so you can automate its download as part of your build if needed. These are the GAV coordinates:

  • groupId: org.hibernate

  • artifactId: hibernate-orm-modules

  • version: 5.2.1.Final

  • classifier: wildfly-10-dist

  • type: zip

Unzip the archive into the modules directory of your WildFly instance. If done correctly, you should see two sub-directories under modules: system (the server’s original modules) and org (the new Hibernate ORM modules).

Choosing the right version of Hibernate ORM

Having added the Hibernate ORM 5.2 modules to the server, you need to configure your application so it uses that specific Hibernate version instead of the default one coming with the server. To do so, just add the following property to your META-INF/persistence.xml file:

    <property name="jboss.as.jpa.providerModule" value="org.hibernate:5.2" />

In case you have several Hibernate releases added to your WildFly server, you can also select a specific micro version:

<property name="jboss.as.jpa.providerModule" value="org.hibernate:5.2.1.Final" />

Example project

As an example for using the module ZIP, I’ve created a small Maven project. You can find it in the hibernate-demos repository. To run the example project, simply execute mvn clean verify. Let’s take a quick look at some interesting parts.

First, in the Maven POM file, the maven-dependency-plugin is used to

  • download the WildFly server and unpack it into the target directory

  • download the Hibernate ORM module ZIP and extract its contents into the modules directory of WildFly:

                    <!-- WildFly server; Unpacked into target/wildfly-10.0.0.Final -->
                    <!-- Hibernate ORM modules; Unpacked into target/wildfly-10.0.0.Final/modules -->

Next, Hibernate ORM 5.2 must be enabled for the application using the property as discussed above:

<?xml version="1.0" encoding="UTF-8"?>
<persistence xmlns="http://xmlns.jcp.org/xml/ns/persistence" version="2.1">
    <persistence-unit name="testPu" transaction-type="JTA">
        <properties>
            <property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
            <property name="jboss.as.jpa.providerModule" value="org.hibernate:5.2"/>
        </properties>
    </persistence-unit>
</persistence>
This persistence unit is using the example datasource configured by default in WildFly, which is using an in-memory H2 database.

Finally, we need a test to ensure that Hibernate ORM 5.2 is actually used and not the 5.0 version coming with WildFly. To do so, we can simply invoke one of the EntityManager methods that are exposed through Hibernate’s classic Session API as of 5.2:
@RunWith( Arquillian.class )
public class HibernateModulesOnWildflyIT {

    @Deployment
    public static WebArchive createDeployment() {
        return ShrinkWrap.create( WebArchive.class )
                .addClass( Kryptonite.class )
                .addAsWebInfResource( EmptyAsset.INSTANCE, "beans.xml" )
                .addAsResource( "META-INF/persistence.xml" );
    }

    @PersistenceContext
    private EntityManager entityManager;

    @Test
    public void shouldUseHibernateOrm52() {
        Session session = entityManager.unwrap( Session.class );

        Kryptonite kryptonite1 = new Kryptonite();
        kryptonite1.id = 1L;
        kryptonite1.description = "Some Kryptonite";
        session.persist( kryptonite1 );

        // EntityManager methods exposed through Session only as of 5.2
        Kryptonite loaded = session.find( Kryptonite.class, 1L );

        assertThat( loaded.description, equalTo( "Some Kryptonite" ) );
    }
}

The call to find() would fail with a NoSuchMethodError if Hibernate ORM 5.0 instead of 5.2 was used. To try it out, simply remove the property from persistence.xml.

The test itself is executed using the great Arquillian tool. It creates a test WAR file with the contents configured in the createDeployment() method, starts up the WildFly server, deploys the WAR file and runs the test within the running container. Using the Arquillian Transactional Extension, the test method is executed within a transaction which is rolled back afterwards.

Feedback welcome!

The Hibernate ORM module ZIP file is a very recent addition to Hibernate. Should you run into any issues, please let us know and we’ll be happy to help.

You can learn more in this topical guide. Hibernate ORM 5.2.1 is the first release to provide a module ZIP; the next 5.1.x release will provide one, too.

Give it a try and let us know about your feedback!

First Hibernate OGM 5 maintenance release

Tagged as: Hibernate OGM, Releases

We released Hibernate OGM 5.0.1.Final!

What’s new?

Here are some of the most interesting bug fixes and improvements in this release:

  • OGM-818 - Autodetection support for @Entity annotated classes will now work

  • OGM-356 - Object comparison in JPQL queries for MongoDB and Neo4j

  • OGM-1065 - You can now use Hibernate OGM with Cassandra 3 (Thanks joexner!)

You can find all the details in the changelog.

This release is backward compatible with Hibernate OGM 5.0.0.Final, but if you need to upgrade from a previous version, you can find help in the migration notes.

Where can I get it?

You can get Hibernate OGM 5.0.1.Final core via Maven using the following coordinates:

  • org.hibernate.ogm:hibernate-ogm-core:5.0.1.Final

and these are the back-ends currently available:

  • Cassandra: org.hibernate.ogm:hibernate-ogm-cassandra:5.0.1.Final

  • CouchDB: org.hibernate.ogm:hibernate-ogm-couchdb:5.0.1.Final

  • Infinispan: org.hibernate.ogm:hibernate-ogm-infinispan:5.0.1.Final

  • Ehcache: org.hibernate.ogm:hibernate-ogm-ehcache:5.0.1.Final

  • MongoDB: org.hibernate.ogm:hibernate-ogm-mongodb:5.0.1.Final

  • Neo4j: org.hibernate.ogm:hibernate-ogm-neo4j:5.0.1.Final

  • Redis: org.hibernate.ogm:hibernate-ogm-redis:5.0.1.Final

Alternatively, you can download archives containing all the binaries, source code and documentation from SourceForge.

How can I get in touch?

You can find us through the following channels:

We are looking forward to hearing your feedback!

First bug-fix release for ORM 5.2

Tagged as: Hibernate ORM, Releases

The first bug-fix release for Hibernate ORM 5.2 has just been published. It is tagged at

The complete list of changes can be found here.

For information on consuming the release via your favorite dependency-management-capable build tool, see

The release bundles can be obtained from SourceForge or BinTray.

Hibernate Search 5.5.4.Final released

We are making good progress on our next major release, which focuses on Elasticsearch integration, but we haven’t forgotten our beloved users of Hibernate Search 5.5.x, and here is a new stable release to prove it!

This bugfix release is entirely based on user feedback so keep it coming!

Hibernate Search version 5.5.4.Final is available now and fixes the following issues:

  • HSEARCH-2301 - CriteriaObjectInitializer is suboptimal when we query only one subtype of a hierarchy

  • HSEARCH-2286 - DistanceSortField should support reverse sorting

  • HSEARCH-2306 - Upgrade 5.5.x to Hibernate ORM 5.0.9

  • HSEARCH-2307 - Documentation shouldn’t suggest need for @Indexed of embedded association fields

A quick focus on HSEARCH-2301, as it might significantly improve your performance if you index a complex hierarchy of objects. Prior to this fix, when querying the database to hydrate the objects, Hibernate Search was using the root type of the hierarchy, potentially leading to queries with a lot of joins. Hibernate Search now builds the most efficient query possible depending on the effective results.

You can see two instances of this issue on Stack Overflow here and here.


How to get this release

Everything you need is available on Hibernate Search’s web site. Download the full distribution from here, or get it from Maven Central using the above coordinates, and don’t hesitate to reach out to us in our forums or mailing lists.

We also monitor closely the hibernate-search tag on Stack Overflow.

Hibernate Community Newsletter 13/2016

Tagged as: Discussions, Hibernate ORM

Welcome to the Hibernate community newsletter in which we share blog posts, forum, and StackOverflow questions that are especially relevant to our users.



When I started writing High-Performance Java Persistence, I decided to install four database systems on my current machine:

  • Oracle XE

  • SQL Server Express Edition

  • PostgreSQL

  • MySQL

These four relational databases are the ones most commonly referenced on our forum, on StackOverflow, and in most JIRA issues. However, these four top-ranked databases are not enough because, from time to time, we need to integrate Pull Requests for other database systems, like Informix or DB2.

Since installing a plethora of databases on a single machine is not very practical, we can do better than that. Many database providers have published Docker images for their products, and this post is going to show you how easily we can start an Informix database.

Running Informix on Docker

IBM offers Docker images for both Informix Innovator-C and DB2 Express-C.

As explained on Docker Hub, you have to start the container using the following command:

docker run -it --name iif_innovator_c --privileged -p 9088:9088 -p 27017:27017 -p 27018:27018 -p 27883:27883 -e LICENSE=accept ibmcom/informix-innovator-c:latest

Once the container has been created, you can start it again at any time with the following command:

docker start iif_innovator_c

After the Docker container is started, we can attach a new shell to it:

docker exec -it iif_innovator_c bash

We have a databases.gradle configuration file which contains the connection properties for all databases we use for testing, and, for Informix, we have the following entry:

informix : [
    'db.dialect' : 'org.hibernate.dialect.InformixDialect',
    'jdbc.driver': 'com.informix.jdbc.IfxDriver',
    'jdbc.user'  : 'informix',
    'jdbc.pass'  : 'in4mix',
    'jdbc.url'   : 'jdbc:informix-sqli://;user=informix;password=in4mix'
]

With this configuration in place, I only need to set up the build to use the Informix configuration:

gradle clean testClasses -Pdb=informix

Now I can run any Informix integration test right from my IDE.

When I’m done, I stop the Docker container with the following command:

docker stop iif_innovator_c

As simple as that!

Bean Validation and the Jigsaw Liaison

Tagged as: Discussions

Unless you’ve been living under a rock for the last months and years, you’ve probably heard about the efforts for adding a module system to the Java platform, code-named "Project Jigsaw".

Defining a module system and modularizing a huge system like the JDK is by no means a trivial task, so it’s no surprise that the advent of Jigsaw has been delayed several times. But I think by now it’s a rather safe bet to expect Jigsaw to be released as part of JDK 9 eventually (the exact release date remains to be defined), especially since it became part of the early access builds a while ago.

This means that if you are an author of a library or framework, you should grab the latest JDK preview build and make sure your lib can be used a) on Java 9 and b) within modularized applications using Jigsaw.

The latter is what we are going to discuss in more detail in the following, taking Bean Validation and its reference implementation, Hibernate Validator as an example. We’ll see what is needed to convert them into Jigsaw modules and use them in a modularized environment.

Now one might ask why having a module system, and providing libraries as modules based on it, is a good thing. There are many facets to that, but I think a good answer is that modularization is a great tool for building software systems from encapsulated, loosely coupled and re-usable components with clearly defined interfaces. It makes API design a very conscious decision and, on the other hand, gives library authors the freedom to change internal implementation aspects of their module without risking compatibility issues with clients.

If you are not yet familiar with Jigsaw at all, it’s recommended to take a look at the project home page. It contains links to many useful resources, especially check out "The State of the Module System" which gives a great overview. I also found this two-part introduction very helpful.

Getting started

In order to follow our little experiment of "Jigsaw-ifying" Bean Validation, you should be using a Bash-compatible shell and be able to run commands such as wget. On systems lacking support for Bash by default, Cygwin can be used. You also need git to download some source code from GitHub.

Let’s get started by downloading and installing the latest JDK 9 early access build (build 122 was used when writing this post). Then run java -version to confirm that JDK 9 is enabled. You should see output like this:

java version "9-ea"
Java(TM) SE Runtime Environment (build 9-ea+122)
Java HotSpot(TM) 64-Bit Server VM (build 9-ea+122, mixed mode)

After that, create a base directory for our experiments:

mkdir beanvalidation-with-jigsaw

Change into that directory and create some more sub-directories for storing the required modules and 3rd-party libraries:

cd beanvalidation-with-jigsaw

mkdir sources
mkdir modules
mkdir automatic-modules
mkdir tools

As tooling support for Java 9 / Jigsaw still is rather limited at this point, we are going to use plain javac and java commands to compile and test the code. Although that’s not as bad as it sounds and it’s indeed a nice exercise to learn about the existing and new compiler options, I admit I’m looking forward to the point where the known build tools such as Maven will fully support Jigsaw and allow compiling and testing modularized source code. But for now, the plain CLI tools will do the trick :)

Download the source code for Bean Validation and Hibernate Validator from GitHub:

git clone sources/beanvalidation-api
git clone sources/hibernate-validator

As we cannot leverage Maven’s dependency management, we fetch the dependencies required by Hibernate Validator via wget, storing them in the automatic-modules (dependencies) and tools (the JBoss Logging annotation processor needed for generating logger implementations) directory, respectively:

wget -P automatic-modules
wget -P automatic-modules
wget -P automatic-modules
wget -P automatic-modules
wget -P automatic-modules
wget -P automatic-modules
wget -P automatic-modules
wget -P automatic-modules

wget -P tools
wget -P tools
wget -P tools

Automatic modules are Jigsaw’s means of working, in a modularized environment, with libraries which have not yet been modularized themselves. Essentially, an automatic module is a module which exports all its packages and reads all other named modules.

Its module name is derived from the JAR file name, applying some rules for splitting artifact name and version and replacing hyphens with dots. So e.g. jboss-logging-annotations-2.0.1.Final.jar will have the automatic module name jboss.logging.annotations.
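That naming rule can be sketched as follows. This is a deliberate simplification for illustration: the real algorithm (documented for `java.lang.module.ModuleFinder`) also normalizes other non-alphanumeric characters and collapses repeated dots.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Simplified sketch of automatic module name derivation: drop the ".jar"
// extension, cut off the version (everything from the first hyphen that is
// followed by a digit), then replace remaining hyphens with dots.
public class AutomaticModuleNames {

    private static final Pattern VERSION = Pattern.compile( "-(\\d.*)" );

    public static String derive(String jarFileName) {
        String name = jarFileName.replaceAll( "\\.jar$", "" );
        Matcher matcher = VERSION.matcher( name );
        if ( matcher.find() ) {
            name = name.substring( 0, matcher.start() );
        }
        return name.replace( '-', '.' );
    }

    public static void main(String[] args) {
        // prints "jboss.logging.annotations"
        System.out.println( derive( "jboss-logging-annotations-2.0.1.Final.jar" ) );
    }
}
```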

Creating modules for Bean Validation API and Hibernate Validator

Currently, the Bean Validation API and Hibernate Validator are in a state where they can be compiled out of the box using Java 9. But what’s still missing are the required module descriptors which describe a module’s name, its public API, its dependencies to other modules and some other things.

Module descriptors are Java files named module-info.java and live in the root of a given module. Create the descriptor for the Bean Validation API with the following contents:

module javax.validation {(1)
    exports javax.validation;(2)
    exports javax.validation.bootstrap;
    exports javax.validation.constraints;
    exports javax.validation.constraintvalidation;
    exports javax.validation.executable;
    exports javax.validation.groups;
    exports javax.validation.metadata;
    exports javax.validation.spi;

    uses javax.validation.spi.ValidationProvider;(3)
}
1 Module name
2 All the packages the module exports (as this is an API module, all contained packages are exported)
3 The usage of the ValidationProvider service


Services have been present in Java for a long time. Originally added as an internal component in the JDK, the service loader mechanism became an official part of the platform as of Java 6.

Since then it has seen wide adoption for building extensible applications from loosely coupled components. With its help, service consumers can solely be implemented against a well-defined service contract, without knowing upfront about a specific service provider and its implementation. Jigsaw embraces the existing service concept and makes services first-class citizens of the modularized world.

Luckily, Bean Validation has been using the service mechanism for locating providers (such as Hibernate Validator) from the get go, so things play out nicely with Jigsaw. As we’ll see in a minute, Hibernate Validator provides an implementation of the ValidationProvider service, allowing the user to bootstrap it without depending on this specific implementation.
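The lookup Bean Validation does under the hood boils down to the standard ServiceLoader API. Here is a generic sketch; it uses the JDK’s own FileSystemProvider service interface as a stand-in (an assumption made purely so the example needs no extra dependencies), but the same pattern applies to ValidationProvider.

```java
import java.nio.file.spi.FileSystemProvider;
import java.util.ArrayList;
import java.util.List;
import java.util.ServiceLoader;

// Generic sketch of service lookup: the consumer only knows the service
// interface; implementations are discovered on the class/module path at runtime.
public class ServiceLookupDemo {

    public static <S> List<S> providersOf(Class<S> service) {
        List<S> providers = new ArrayList<>();
        for ( S provider : ServiceLoader.load( service ) ) {
            providers.add( provider );
        }
        return providers;
    }

    public static void main(String[] args) {
        // e.g. the JDK's zipfs file system provider is typically discovered this way
        for ( FileSystemProvider provider : providersOf( FileSystemProvider.class ) ) {
            System.out.println( provider.getScheme() );
        }
    }
}
```

In the Jigsaw world, the `uses` and `provides` clauses in the module descriptors make this consumer/provider relationship explicit to the module system.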

But for now let’s compile the Bean Validation module:

export BASE=`pwd`
cd sources/beanvalidation-api
javac -d $BASE/modules/javax.validation $(find src/main/java -name "*.java")
cd $BASE

After compilation, the built module can be found under modules/javax.validation. Note that modules usually will be packaged and redistributed as JAR files, but to keep things simple let’s just work with class directory structures here.

Things get a bit more interesting when it comes to Hibernate Validator. Its module descriptor should look like this:

module org.hibernate.validator.engine {(1)
    exports org.hibernate.validator;(2)
    exports org.hibernate.validator.cfg;
    exports org.hibernate.validator.cfg.context;
    exports org.hibernate.validator.cfg.defs;
    exports org.hibernate.validator.constraints;
    exports org.hibernate.validator.constraintvalidation;
    exports org.hibernate.validator.constraintvalidators;
    exports org.hibernate.validator.engine;
    exports org.hibernate.validator.messageinterpolation;
    exports org.hibernate.validator.parameternameprovider;
    exports org.hibernate.validator.path;
    exports org.hibernate.validator.resourceloading;
    exports org.hibernate.validator.spi.cfg;
    exports org.hibernate.validator.spi.resourceloading;
    exports org.hibernate.validator.spi.time;
    exports org.hibernate.validator.spi.valuehandling;
    exports org.hibernate.validator.valuehandling;

    exports org.hibernate.validator.internal.util.logging to jboss.logging;(3)
    exports org.hibernate.validator.internal.xml to java.xml.bind;

    requires javax.validation;(4)
    requires joda.time;
    requires javax.el.api;
    requires jsoup;
    requires jboss.logging.annotations;
    requires jboss.logging;
    requires classmate;
    requires paranamer;
    requires hibernate.jpa;
    requires java.xml.bind;
    requires java.xml;
    requires java.scripting;
    requires javafx.base;

    provides javax.validation.spi.ValidationProvider with
        org.hibernate.validator.HibernateValidator;(5)

    uses javax.validation.ConstraintValidator;(6)
}
1 The module name
2 All the packages the module exports; Hibernate Validator always had a very well defined public API, with all the code parts not meant for public usage living in an internal package. Naturally, only the non-internal parts are exported. Things will be more complex when modularizing an existing component without such clearly defined public API. Likely you’ll need to move some classes around first, untangling public API and internal implementation parts.
3 Two noteworthy exceptions are o.h.v.internal.util.logging and o.h.v.internal.xml which are exported via a so-called "qualified exports". This means that only the jboss.logging module may access the logging package and only java.xml.bind may access the XML package. This is needed as these modules require reflective access to the logging and XML classes, respectively. Using qualified exports, this exposure of internal classes can be limited to the smallest degree possible.
4 All the modules which this module requires. These are the javax.validation module we just built, all the automatic modules we downloaded before and some modules coming with the JDK itself (java.xml.bind, javafx.base etc).
Some of these dependencies might be considered optional at runtime, e.g. Joda Time would only be needed at runtime when actually validating Joda Time types with @Past or @Future. Unfortunately - and in contrast to OSGi or JBoss Modules - Jigsaw doesn’t support the notion of optional module requirements, meaning that all module requirements must be satisfied at compile time as well as runtime. That’s a pity, as it prevents a common pattern for libraries which expose certain functionality depending on what dependencies/classes are available at runtime or not.
The "right answer" with Jigsaw would be to extract these optional features into their own modules (e.g. hibernate.validator.joda.time, hibernate.validator.jsoup etc.) but this comes at the price of making things more complex for users which then need to deal with all these modules.
5 The module provides an implementation of the ValidationProvider service.
6 The module uses the ConstraintValidator service (see below).
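To recap callouts (5) and (6) in one place, the two service-related clauses of the descriptor boil down to the following sketch (just these clauses, not the full module-info.java; the fully-qualified names javax.validation.spi.ValidationProvider and org.hibernate.validator.HibernateValidator are assumed here for illustration):

```java
// Sketch of the service clauses only - the actual descriptor
// contains the exports and requires directives as well.
module org.hibernate.validator.engine {
    // (5) advertise Hibernate Validator as a Bean Validation provider
    provides javax.validation.spi.ValidationProvider
        with org.hibernate.validator.HibernateValidator;

    // (6) consume constraint validator implementations contributed by others
    uses javax.validation.ConstraintValidator;
}
```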

With the module descriptor in place, we can compile the Jigsaw-enabled Hibernate Validator module:

cd sources/hibernate-validator/engine
mkdir -p target/generated-sources/jaxb

xjc -enableIntrospection -p org.hibernate.validator.internal.xml \(1)
    -extension \
    -target 2.1 \
    -d target/generated-sources/jaxb \
    src/main/xsd/validation-configuration-1.1.xsd src/main/xsd/validation-mapping-1.1.xsd \
    -b src/main/xjb/binding-customization.xjb

javac -addmods java.xml.bind,java.annotations.common \(2)
    -g \
    -modulepath $BASE/modules:$BASE/automatic-modules \
    -processorpath $BASE/tools/jboss-logging-processor-2.0.1.Final.jar:$BASE/tools/jdeparser-2.0.0.Final.jar:$BASE/tools/jsr250-api-1.0.jar:$BASE/automatic-modules/jboss-logging-annotations-2.0.1.Final.jar:$BASE/automatic-modules/jboss-logging-3.3.0.Final.jar \
    -d $BASE/modules/org.hibernate.validator.engine \
    $(find src/main/java -name "*.java") $(find target/generated-sources/jaxb -name "*.java")

cp -r src/main/resources/* $BASE/modules/org.hibernate.validator.engine;(3)

cp -r src/main/xsd/* $BASE/modules/org.hibernate.validator.engine/META-INF;(4)
cd $BASE
1 The xjc utility is used to create some JAXB types from the XML constraint descriptor schemas
2 Compile the source code via javac
3 Copy error message resource bundle into the module directory
4 Copy XML schema files into the module directory

Note how the module path used for compilation refers to the modules directory (containing the javax.validation module) and the automatic-modules directory (containing all the dependencies such as Joda Time etc.).

The resulting module is located under modules/org.hibernate.validator.engine.

Giving it a test ride

Having converted Bean Validation API and Hibernate Validator into proper Jigsaw modules, it’s about time to give these modules a test ride. Create a new compilation unit for that:

mkdir -p sources/com.example.acme/src/main/java/com/example/acme

Within that directory structure, create a very simple domain class and a class with a main method for validating it:

package com.example.acme;

import java.util.List;

import javax.validation.constraints.Min;

public class Car {

    public int seatCount;

    public List<String> passengers;

    public Car(int seatCount, List<String> passengers) {
        this.seatCount = seatCount;
        this.passengers = passengers;
    }
}

package com.example.acme;

import java.util.Collections;
import java.util.Set;

import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;

public class ValidationTest {

    public static void main(String... args) {
        Validator validator = Validation.buildDefaultValidatorFactory()
            .getValidator();

        Set<ConstraintViolation<Car>> violations = validator.validate( new Car( 0, Collections.emptyList() ) );

        System.out.println( "Validation error: " + violations.iterator().next().getMessage() );
    }
}

This obtains a Validator object via Validation#buildDefaultValidatorFactory() (which internally uses the service mechanism described above) and performs a simple validation of a Car object.
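The provider discovery hidden behind buildDefaultValidatorFactory() can be pictured in plain Java. This is only a conceptual sketch: DummyProviderSpi is a made-up stand-in for the real ValidationProvider SPI, and since no provider is registered for it here, the lookup finds nothing:

```java
import java.util.ServiceLoader;

public class ProviderLookupSketch {

    // Made-up stand-in for javax.validation.spi.ValidationProvider;
    // no provider is registered for it in this sketch.
    public interface DummyProviderSpi {
    }

    // Conceptual mirror of the lookup: iterate all providers registered
    // for the SPI via the service mechanism and count them (the real code
    // picks the first one and bootstraps a ValidatorFactory from it).
    public static long countProviders() {
        long count = 0;
        for (DummyProviderSpi provider : ServiceLoader.load(DummyProviderSpi.class)) {
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println("providers found: " + countProviders());
    }
}
```

In a Jigsaw world the registration happens via the `provides`/`uses` clauses in the module descriptors instead of META-INF/services files on the classpath.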

Of course we need a module-info.java, too:

module com.example.acme {
    exports com.example.acme;

    requires javax.validation;
}

That should look familiar by now: we just export the single package (so Hibernate Validator can access the state of the Car object) and depend on the Bean Validation API module.

Also compilation of this module isn’t much news:

cd sources/com.example.acme
javac \
    -g \
    -modulepath $BASE/modules:$BASE/automatic-modules \
    -d $BASE/modules/com.example.acme $(find src/main/java -name "*.java")
cd $BASE

And with that, we finally can run a first test of Bean Validation under Jigsaw:

java \
    -modulepath modules:automatic-modules \
    -m com.example.acme/com.example.acme.ValidationTest

Similar to javac, there is a new modulepath option for the java command, pointing to one or more directories with Jigsaw modules. The -m switch specifies the main class to run by giving its module name and fully qualified class name.

Mh, that was not really successful:

HV000149: An exception occurred during message interpolation
Caused by: java.lang.UnsupportedOperationException: ResourceBundle.Control not supported in named modules
    at java.util.ResourceBundle.checkNamedModule(java.base@9-ea/
    at java.util.ResourceBundle.getBundle(java.base@9-ea/
    at org.hibernate.validator.resourceloading.PlatformResourceBundleLocator.loadBundle(org.hibernate.validator.engine/

What’s that about? Hibernate Validator is using the Control class in order to merge the contents (error messages) of several resource bundles with the same name found on the classpath. This is not supported in the modularized environment any longer, hence the exception above is raised. Eventually, Hibernate Validator should handle this situation automatically (this is tracked under HV-1073).

For now let’s hack around it and disable the troublesome bundle aggregation in Hibernate Validator’s AbstractMessageInterpolator. To do so, change true to false in the constructor invocation on line 165:

new PlatformResourceBundleLocator(

Re-compile the Hibernate Validator module. After running the test again, you should now see the following output on the console:

Validation error: must be greater than or equal to 1

Tada, the first successful bean validation in the Jigsaw environment :)

Let me quickly recap what has happened so far:

  • We added a module descriptor to the Bean Validation API, making it a proper Jigsaw module

  • We added a module descriptor to Hibernate Validator; this Bean Validation provider will be discovered by the API module using the service mechanism

  • We created a test module with a main method, which uses the Bean Validation API to perform a simple object validation

(Not) overstepping boundaries

Now let’s be nasty and see whether the module system actually is doing its job as expected. For that, add the module requirement on Hibernate Validator to the module descriptor (so it’s considered for compilation at all) and cast the validator to the internal implementation type in ValidationTest:

module com.example.acme {
    exports com.example.acme;

    requires javax.validation;
    requires org.hibernate.validator.engine;
}

package com.example.acme;

import javax.validation.Validation;

import org.hibernate.validator.internal.engine.ValidatorImpl;

public class ValidationTest {

    public static void main(String... args) throws Exception{
        ValidatorImpl validator = (ValidatorImpl) Validation.buildDefaultValidatorFactory()
            .getValidator();
    }
}

Running javac again, you should now get a compilation error, complaining about the type not being found. So Jigsaw prevents accesses to non-exported types. If you like, try referencing anything from the packages exported by Hibernate Validator, which will work.

That’s a great advantage over the traditional flat classpath, where you might have organized your code base into public and internal parts but then had to hope for users of your library not to step across the line and - accidentally or intentionally - access internal classes.

Custom constraints

With the modules basically working, it’s time to get a bit more advanced and create a custom Bean Validation constraint. This one should make sure that a car does not have more passengers than seats available.

For that we need an annotation type:

package com.example.acme;

import static java.lang.annotation.ElementType.TYPE;
import static java.lang.annotation.RetentionPolicy.RUNTIME;

import java.lang.annotation.Documented;
import java.lang.annotation.Retention;
import java.lang.annotation.Target;

import javax.validation.Constraint;
import javax.validation.Payload;

@Documented
@Constraint(validatedBy = { PassengersDontExceedSeatCountValidator.class })
@Target({ TYPE })
@Retention(RUNTIME)
public @interface PassengersDontExceedSeatCount {

    String message() default "{com.example.acme.PassengersDontExceedSeatCount.message}";
    Class<?>[] groups() default { };
    Class<? extends Payload>[] payload() default { };
}

And also a constraint validator implementation:

package com.example.acme;

import javax.validation.ConstraintValidator;
import javax.validation.ConstraintValidatorContext;

import com.example.acme.PassengersDontExceedSeatCount;

public class PassengersDontExceedSeatCountValidator implements
        ConstraintValidator<PassengersDontExceedSeatCount, Car> {

    public void initialize(PassengersDontExceedSeatCount constraintAnnotation) {}

    public boolean isValid(Car car, ConstraintValidatorContext constraintValidatorContext) {
        if ( car == null ) {
            return true;
        }

        return car.passengers == null || car.passengers.size() <= car.seatCount;
    }
}

A resource bundle with the error message for the constraint is needed, too:

com.example.acme.PassengersDontExceedSeatCount.message=Passenger count must not exceed seat count
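Just to illustrate how the fully-qualified key in this bundle resolves to the error text, here is a minimal sketch that builds the same content in memory via PropertyResourceBundle (the class name BundleSketch is made up; the real bundle is of course loaded from the ValidationMessages.properties file):

```java
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.PropertyResourceBundle;

public class BundleSketch {

    // Builds the bundle content shown above in memory and resolves the
    // fully-qualified constraint message key against it.
    public static String resolve(String key) {
        try {
            PropertyResourceBundle bundle = new PropertyResourceBundle( new StringReader(
                    "com.example.acme.PassengersDontExceedSeatCount.message="
                            + "Passenger count must not exceed seat count" ) );
            return bundle.getString( key );
        }
        catch (IOException e) {
            throw new UncheckedIOException( e );
        }
    }

    public static void main(String[] args) {
        System.out.println( resolve( "com.example.acme.PassengersDontExceedSeatCount.message" ) );
    }
}
```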

Now we can put the new constraint type to the Car class and finally validate it:

@PassengersDontExceedSeatCount
public class Car {
    // ... fields and constructor unchanged ...
}

package com.example.acme;

import java.util.Arrays;
import java.util.Set;

import javax.validation.Validation;
import javax.validation.Validator;
import javax.validation.ConstraintViolation;

public class ValidationTest {

    public static void main(String... args) throws Exception{
        Validator validator = Validation.buildDefaultValidatorFactory()
            .getValidator();

        Set<ConstraintViolation<Car>> violations = validator.validate(
            new Car( 2, Arrays.asList( "Anna", "Bob", "Alice" ) )
        );

        System.out.println( "Validation error: " + violations.iterator().next().getMessage() );
    }
}

Compile the example module again; don’t forget to copy the resource bundle to the module directory:

cd sources/com.example.acme
javac \
    -g \
    -modulepath $BASE/modules:$BASE/automatic-modules \
    -d $BASE/modules/com.example.acme $(find src/main/java -name "*.java")
cp -r src/main/resources/* $BASE/modules/com.example.acme
cd $BASE

Run it as before, and you should get a nice error message. But what’s that:

Validation error: {com.example.acme.PassengersDontExceedSeatCount.message}

It seems the error message wasn’t resolved properly, so the raw, un-interpolated message key from the annotation definition has been returned. Now why is this?

Bean Validation error messages are loaded through java.util.ResourceBundle, and due to the strong encapsulation of the modularized environment the Hibernate Validator module cannot "see" the resource bundle provided in the example module.
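A rough model of that fallback behaviour, with a hypothetical interpolate() helper standing in for the real message interpolator: if no visible bundle supplies a value for the key, the raw token is returned unchanged, which is exactly what we saw on the console:

```java
import java.util.Map;

public class InterpolationSketch {

    // If no visible bundle supplies a value for the {key} token,
    // the token is left as-is - the raw key leaks through.
    public static String interpolate(String template, Map<String, String> messages) {
        if ( template.startsWith( "{" ) && template.endsWith( "}" ) ) {
            String key = template.substring( 1, template.length() - 1 );
            return messages.getOrDefault( key, template );
        }
        return template;
    }

    public static void main(String[] args) {
        // bundle invisible to this "module" -> raw key comes back
        System.out.println( interpolate(
                "{com.example.acme.PassengersDontExceedSeatCount.message}", Map.of() ) );
    }
}
```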

The updated JavaDocs of ResourceBundle make it clear that only bundles located in the same module as the caller of ResourceBundle#getBundle() can be accessed. In order to access resource bundles from other modules, Java 9 mandates the service loader mechanism; a new SPI interface, ResourceBundleProvider, has been added to the JDK for that purpose.
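For illustration, a minimal sketch of such a provider; a real one would be registered as a service and load the bundle from its own module, whereas this hypothetical MyMessagesProvider just fabricates an in-memory bundle:

```java
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.Locale;
import java.util.PropertyResourceBundle;
import java.util.ResourceBundle;
import java.util.spi.ResourceBundleProvider;

// Hypothetical implementation of the JDK 9 ResourceBundleProvider SPI.
// A real provider would be declared via "provides ... with ..." in the
// module descriptor and load the bundle from the providing module.
public class MyMessagesProvider implements ResourceBundleProvider {

    @Override
    public ResourceBundle getBundle(String baseName, Locale locale) {
        try {
            // fabricate a single bundle purely for illustration
            return new PropertyResourceBundle( new StringReader( "greeting=Hello" ) );
        }
        catch (IOException e) {
            throw new UncheckedIOException( e );
        }
    }
}
```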

Ultimately, Bean Validation should take advantage of that mechanism, but how can we make things work out for now? As it turns out, Hibernate Validator has its own extension point for customizing the retrieval of resource bundles, ResourceBundleLocator.

This comes in very handy now: we just need to create an implementation of that SPI in the example module:

package com.example.acme.internal;

import java.util.Locale;
import java.util.ResourceBundle;

import org.hibernate.validator.spi.resourceloading.ResourceBundleLocator;

public class MyResourceBundleLocator implements ResourceBundleLocator {

    public ResourceBundle getResourceBundle(Locale locale) {
        return ResourceBundle.getBundle( "ValidationMessages", locale );
    }
}

When bootstrapping the validator factory, configure a message interpolator using that bundle locator like this:

import org.hibernate.validator.messageinterpolation.ResourceBundleMessageInterpolator;
import com.example.acme.internal.MyResourceBundleLocator;


Validator validator = Validation.byDefaultProvider()
    .configure()
    .messageInterpolator( new ResourceBundleMessageInterpolator( new MyResourceBundleLocator() ) )
    .buildValidatorFactory()
    .getValidator();

As the call to ResourceBundle#getBundle() now originates from the same module that declares the ValidationMessages bundle, the bundle can be found and the error message will be interpolated correctly. Success!

Keeping your privacy

With the custom constraint in place, let’s think about encapsulation a bit more. Wouldn’t it be nice if the constraint validator implementation didn’t live in the exported package but rather somewhere under internal? After all, that class is an implementation detail and should not be referenced directly by users of the @PassengersDontExceedSeatCount constraint.

Another feature of Hibernate Validator is helpful here: service-loader based discovery of constraint validators.

This allows us to remove the reference from the constraint annotation to its validator (just declare an empty array, @Constraint(validatedBy = { })) and relocate the validator implementation to the internal package:

mv sources/com.example.acme/src/main/java/com/example/acme/PassengersDontExceedSeatCountValidator.java \
    sources/com.example.acme/src/main/java/com/example/acme/internal/

Also adapt the package declaration in the source file accordingly and add an import for the Car type. We then need to declare the constraint validator as service provider in the module descriptor:

module com.example.acme {

    provides javax.validation.ConstraintValidator
        with com.example.acme.internal.PassengersDontExceedSeatCountValidator;
}

Compile and run the example module again. You should get an error like this:

java.lang.IllegalAccessException: class org.hibernate.validator.internal.util.privilegedactions.NewInstance (in module org.hibernate.validator.engine) cannot access class com.example.acme.internal.PassengersDontExceedSeatCountValidator (in module com.example.acme) because module com.example.acme does not export com.example.acme.internal to module org.hibernate.validator.engine

This originates from the fact that Hibernate Validator uses the service loader mechanism only for detecting validator types and then instantiates them for each specific constraint usage. As the internal package has not been exported, this instantiation is bound to fail. You have two options now:

  • Use a qualified export in module-info.java to expose that package to the Hibernate Validator module

  • Use the new -XaddExports option of the java command to dynamically add this export when running the module

Following the latter approach, the java invocation would look like this:

java \
    -modulepath modules:automatic-modules \
    -XaddExports:com.example.acme/com.example.acme.internal=org.hibernate.validator.engine \
    -m com.example.acme/com.example.acme.ValidationTest

While this approach works, it can become a bit tedious when other libraries that need to perform reflective operations on non-exported types enter the picture. JPA providers such as Hibernate ORM and dependency injection frameworks are just two examples.
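The reflective step that makes these exports necessary can be sketched in plain Java. Here instantiate() is a made-up helper mirroring what Hibernate Validator's NewInstance action does, with ArrayList standing in for a validator class; it is this reflective constructor call that fails when the target class lives in a non-exported package of another module:

```java
import java.util.ArrayList;

public class ReflectiveInstantiationSketch {

    // The service loader yields the validator *class*; a fresh instance is
    // then created reflectively per constraint usage. This call requires the
    // class's package to be exported (possibly qualified) to the caller.
    public static Object instantiate(Class<?> validatorType) {
        try {
            return validatorType.getDeclaredConstructor().newInstance();
        }
        catch (ReflectiveOperationException e) {
            throw new IllegalStateException( e );
        }
    }

    public static void main(String[] args) {
        Object first = instantiate( ArrayList.class );
        Object second = instantiate( ArrayList.class );
        System.out.println( first != second ); // a new instance per usage
    }
}
```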

Luckily, the OpenJDK team is aware of that issue and there is an entry for it in the requirements list for the Java Module System: ReflectiveAccessToNonExportedTypes. I sincerely hope that this one gets addressed before Java 9 gets finalized.

XML configuration

As the last part of our journey of "Jigsaw-ifying" Bean Validation, let’s take a look at XML-based configuration of constraints. This is a useful alternative if you cannot add constraint metadata to a model via annotations or e.g. want to override existing annotation-based constraints externally.

The Bean Validation spec defines a validation mapping file for this, which in turn can point to one or more constraint mapping XML files. Create the following files in order to override the @Min constraint of the Car class:

<?xml version="1.0" encoding="UTF-8"?>
<validation-config
    xmlns="http://jboss.org/xml/ns/javax/validation/configuration"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://jboss.org/xml/ns/javax/validation/configuration validation-configuration-1.1.xsd"
    version="1.1">

    <constraint-mapping>META-INF/constraints-car.xml</constraint-mapping>
</validation-config>

<?xml version="1.0" encoding="UTF-8"?>
<constraint-mappings
    xmlns="http://jboss.org/xml/ns/javax/validation/mapping"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://jboss.org/xml/ns/javax/validation/mapping validation-mapping-1.1.xsd"
    version="1.1">

    <bean class="com.example.acme.Car" ignore-annotations="true">
        <field name="seatCount">
            <constraint annotation="javax.validation.constraints.Min">
                <element name="value">2</element>
            </constraint>
        </field>
    </bean>
</constraint-mappings>

Traditionally, Bean Validation will look for META-INF/validation.xml on the classpath and resolve any linked constraint mapping files relatively to that. If you’ve followed this article that far, you won’t be surprised that this is not going to work in Jigsaw. There is no notion of a "flat classpath" any longer, and thus a validation provider cannot see XML files in other modules, akin to the case of error message bundles discussed above.

More specifically, the method ClassLoader#getResourceAsStream(), which Hibernate Validator uses to open mapping files, won’t work for named modules as of JDK 9. That change will be a tough nut to crack for many projects migrating to Java 9, as it renders resource-loading strategies known from existing modular environments such as OSGi inoperable. E.g. Hibernate Validator allows passing in a classloader for loading user-provided resources. In OSGi this can be used to pass in what’s called the bundle loader, allowing Hibernate Validator to access constraint mapping files and other things provided by the user. Unfortunately this pattern cannot be employed with Jigsaw, as getResourceAsStream() "does not find resources in named modules".

But Bean Validation has a way out for this issue, too, as it allows passing in constraint mappings as InputStreams opened by the bootstrapping code. Class#getResourceAsStream() continues to work for resources from the same module, so things will work out as expected when bootstrapping the validator factory like this (don’t forget to close the stream afterwards):

InputStream constraintMapping = ValidationTest.class.getResourceAsStream( "/META-INF/constraints-car.xml" );

Validator validator = Validation.byDefaultProvider()
    .configure()
    .addMapping( constraintMapping )
    .buildValidatorFactory()
    .getValidator();

That way the constraint mapping is opened by code from the same module and thus can be accessed and passed to the Bean Validation provider.
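The same-module rule can be demonstrated with plain Java: ".class" resources are never encapsulated, so a class can always open its own class file via Class#getResourceAsStream, just as the example module can open its own /META-INF/constraints-car.xml (ResourceLookupSketch is a made-up name for this sketch):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;

public class ResourceLookupSketch {

    // Class#getResourceAsStream keeps working for resources co-located
    // with the calling code, even in a named module; ".class" resources
    // are never encapsulated at all.
    public static boolean canSeeOwnResource() {
        try ( InputStream in = ResourceLookupSketch.class
                .getResourceAsStream( "ResourceLookupSketch.class" ) ) {
            return in != null;
        }
        catch (IOException e) {
            throw new UncheckedIOException( e );
        }
    }

    public static void main(String[] args) {
        System.out.println( canSeeOwnResource() );
    }
}
```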

In the longer term, APIs such as Bean Validation should foresee some kind of SPI contract for conveying all required configuration information as per Jigsaw spec lead Mark Reinhold. The user would then expose an instance of that contract via a service implementation.

Wrapping it up

With that, we conclude our experiment of making Bean Validation ready for the Jigsaw module system. Overall, things work out pretty well, and without too much effort Bean Validation and Hibernate Validator can be made first-class citizens in a fully-modularized world.

Some observations from the experiment:

  • The lack of optional module requirements poses a usability issue in my opinion, as it prevents libraries from exposing additional functionality based on what classes and modules are present at runtime in a given environment. This means that users of the library either need to provide other modules they don’t actually need or - if the library has been cut into several modules, one per optional dependency - need to add several modules now where they could have worked with a single one before.

  • The need to explicitly expose internal packages for reflective access by libraries such as Hibernate Validator, but also JPA providers or DI containers, can become tedious. I hope there will be a way to enable such access in a more global way, e.g. by whitelisting "trustworthy modules" such as the aforementioned libraries.

  • The changed behaviors around loading of resources such as configuration files or resource bundles provided by the user of a library will potentially affect many applications when migrating to Java 9. The established pattern of accepting an external classloader for loading user resources will not work anymore, so libraries need to adapt either by providing dedicated extension points (akin to addMapping(InputStream) in Bean Validation) or by migrating to service based approaches as envisioned by the makers of Jigsaw.

  • Tools such as Maven (including plug-ins, e.g. Surefire for running tests) or Gradle, but also IDEs, still need to catch up with Jigsaw. Using plain javac and java can be fun for a while, but you find yourself wishing for more powerful tools rather quickly :)

  • Converting Hibernate Validator into a Jigsaw module is relatively easy, as we luckily were very careful about a proper API/implementation split from the beginning. Modularizing existing libraries or applications without such clear distinction will be a much tougher exercise, as it may require lots of types to be moved around and unwanted (package) dependencies to be broken up. There are some tools that can help with that, but that might be a topic for a future blog post by itself.

One thing is for sure: interesting times lie ahead! While migration might be painful here and there, I think it’s overdue that Java gets its proper module system and I look forward to seeing it as an integrated part of the platform very much.

Got feedback from following the steps described above or from your own experiments with Jigsaw? Let’s all together learn from our different experiences and insights, so please share any thoughts on the topic below.

Many thanks to Sander Mak, Sanne Grinovero and Guillaume Smet for reviewing this post!
