In Relation To Gunnar Morling

Building Multi-Release JARs with Maven

Posted by    |       |    Tagged as Discussions

Java 9 comes with a new feature that's very useful to library authors: multi-release JARs (JEP 238).

A multi-release JAR (MR JAR) may contain multiple variants of one and the same class, each targeting a specific Java version. At runtime, the right variant of the class will be loaded automatically, depending on the Java version being used.

This allows library authors to take advantage of new Java versions early on, while keeping compatibility with older versions at the same time. If, for instance, your library performs atomic compare-and-set operations on variables, you may currently be doing so using the sun.misc.Unsafe class. As Unsafe was never meant for use outside the JDK itself, Java 9 comes with a supported alternative for CAS logic in the form of var handles. By providing your library as an MR JAR, you can benefit from var handles when running on Java 9 while sticking to Unsafe when running on older platforms.
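To make the var handle idea a bit more concrete, here is a minimal sketch of what such a Java 9 variant could look like (the Counter class and its field are made up for illustration, they are not taken from any actual library):

```java
import java.lang.invoke.MethodHandles;
import java.lang.invoke.VarHandle;

public class Counter {

    // Obtained once per class; gives atomic access to the "value" field below
    private static final VarHandle VALUE;

    static {
        try {
            VALUE = MethodHandles.lookup()
                    .findVarHandle( Counter.class, "value", int.class );
        }
        catch (ReflectiveOperationException e) {
            throw new ExceptionInInitializerError( e );
        }
    }

    private volatile int value;

    // Atomically sets the field to newValue if it currently holds expected
    public boolean compareAndSetValue(int expected, int newValue) {
        return VALUE.compareAndSet( this, expected, newValue );
    }

    public int getValue() {
        return value;
    }
}
```

In an MR JAR, a variant like this would live under META-INF/versions/9, while the JAR root would keep an Unsafe-based implementation with the same public API for Java 8 and earlier.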

In the following we’ll discuss how to create an MR JAR using Apache Maven.

Structure of a Multi-Release JAR

Multi-Release JARs contain several trees of class files. The main tree is at the root of the JAR, whereas version-specific trees are located under META-INF/versions, e.g. like this:

JAR root
- Foo.class
- Bar.class
+ META-INF
   - MANIFEST.MF
   + versions
      + 9
         - Bar.class

Here, the Foo and Bar classes from the JAR root will be used on Java runtimes which are not aware of MR JARs (i.e. Java 8 and earlier), whereas Foo from the JAR root and Bar from META-INF/versions/9 will be used on Java 9 and later. The JAR manifest must contain an entry Multi-Release: true to indicate that the JAR is an MR JAR.
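If you want to check programmatically whether a given JAR carries that manifest entry, the JarFile API added in Java 9 exposes it directly; here is a small sketch (the class name and file path are just examples):

```java
import java.io.File;
import java.io.IOException;
import java.util.jar.JarFile;

public class MrJarCheck {

    // Returns true if the JAR's manifest declares "Multi-Release: true"
    public static boolean isMultiRelease(File jar) throws IOException {
        // Passing a runtime version makes JarFile apply the versioned view
        // of the archive and report the Multi-Release manifest flag
        try ( JarFile jarFile = new JarFile( jar, true, JarFile.OPEN_READ, Runtime.version() ) ) {
            return jarFile.isMultiRelease();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println( isMultiRelease( new File( args[0] ) ) );
    }
}
```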

Example: Getting the Id of the Current Process

As an example, let’s assume we have a library which defines a class providing the id of the process (PID) it is running in. PIDs shall be represented by a descriptor comprising the actual PID and a String describing the provider of the PID:

src/main/java/com/example/ProcessIdDescriptor.java
package com.example;

public class ProcessIdDescriptor {

    private final long pid;
    private final String providerName;

    // constructor, getters ...
}

Up to Java 8, there is no easy way to obtain the id of the running process. One rather hacky approach is to parse the return value of RuntimeMXBean#getName() which is "pid@hostname" in the OpenJDK / Oracle JDK implementation. While that behavior is not guaranteed to be portable across implementations, let’s use it as the basis for our default ProcessIdProvider:

src/main/java/com/example/ProcessIdProvider.java
package com.example;

import java.lang.management.ManagementFactory;

public class ProcessIdProvider {

    public ProcessIdDescriptor getPid() {
        String vmName = ManagementFactory.getRuntimeMXBean().getName();
        long pid = Long.parseLong( vmName.split( "@" )[0] );
        return new ProcessIdDescriptor( pid, "RuntimeMXBean" );
    }
}

Also let’s create a simple main class for displaying the PID and the provider it was retrieved from:

src/main/java/com/example/Main.java
package com.example;

public class Main {

    public static void main(String[] args) {
        ProcessIdDescriptor pid = new ProcessIdProvider().getPid();

        System.out.println( "PID: " + pid.getPid() );
        System.out.println( "Provider: " + pid.getProviderName() );
    }
}

Note how the source files created so far are located in the regular src/main/java source directory.

Now let’s create another variant of ProcessIdProvider, based on Java 9’s new ProcessHandle API, which finally provides a portable way of obtaining the current PID. This source file is located in another source directory, src/main/java9:

src/main/java9/com/example/ProcessIdProvider.java
package com.example;

public class ProcessIdProvider {

    public ProcessIdDescriptor getPid() {
        long pid = ProcessHandle.current().pid();
        return new ProcessIdDescriptor( pid, "ProcessHandle" );
    }
}

Setting up the build

With all the source files in place, it’s time to configure Maven so an MR JAR gets built.

Three steps are required for that. The first is to compile the additional Java 9 sources under src/main/java9. I had hoped I could simply set up another execution of the Maven compiler plug-in for that, but I could not find a way to compile only the sources from src/main/java9 without also compiling those from src/main/java a second time.

As a work-around, the Maven Antrun plug-in can be used for configuring a second javac run just for the Java 9 specific sources:

pom.xml
...
<properties>
    <java9.sourceDirectory>${project.basedir}/src/main/java9</java9.sourceDirectory>
    <java9.build.outputDirectory>${project.build.directory}/classes-java9</java9.build.outputDirectory>
</properties>
...
<build>
    ...
    <plugins>
        ...
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-antrun-plugin</artifactId>
            <executions>
                <execution>
                    <id>compile-java9</id>
                    <phase>compile</phase>
                    <configuration>
                        <tasks>
                            <mkdir dir="${java9.build.outputDirectory}" />
                            <javac srcdir="${java9.sourceDirectory}" destdir="${java9.build.outputDirectory}"
                                classpath="${project.build.outputDirectory}" includeantruntime="false" />
                        </tasks>
                    </configuration>
                    <goals>
                        <goal>run</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        ...
    </plugins>
    ...
</build>
...

This uses the target/classes directory (containing the class files emitted by the default compilation) as the classpath, allowing us to refer to classes common to all Java versions supported by our MR JAR, e.g. ProcessIdDescriptor. The compiled classes go into target/classes-java9.

The next step is to copy the compiled Java 9 classes into target/classes so they will later end up in the right place within the resulting JAR. The Maven resources plug-in can be used for that:

pom.xml
...
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-resources-plugin</artifactId>
    <executions>
        <execution>
            <id>copy-resources</id>
            <phase>prepare-package</phase>
            <goals>
                <goal>copy-resources</goal>
            </goals>
            <configuration>
                <outputDirectory>${project.build.outputDirectory}/META-INF/versions/9</outputDirectory>
                <resources>
                    <resource>
                        <directory>${java9.build.outputDirectory}</directory>
                    </resource>
                </resources>
            </configuration>
        </execution>
    </executions>
</plugin>
...

This will copy the Java 9 class files from target/classes-java9 to target/classes/META-INF/versions/9.

Finally, the Maven JAR plug-in needs to be configured so the Multi-Release entry is added to the manifest file:

pom.xml
...
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <configuration>
        <archive>
            <manifestEntries>
                <Multi-Release>true</Multi-Release>
                <Main-Class>com.example.Main</Main-Class>
            </manifestEntries>
        </archive>
        <finalName>mr-jar-demo</finalName>
    </configuration>
</plugin>
...

And that’s it: we have everything in place to build a multi-release JAR. Trigger the build via mvn clean package (using Java 9) to create the JAR in the target directory.

In order to check whether the JAR contents are alright, list them via jar -tf target/mr-jar-demo.jar. You should see the following:

...
com/example/Main.class
com/example/ProcessIdDescriptor.class
com/example/ProcessIdProvider.class
META-INF/versions/9/com/example/ProcessIdProvider.class
...

Finally, let’s execute the JAR via java -jar target/mr-jar-demo.jar and examine its output. When using Java 8 or earlier, you’ll see the following:

PID: <some pid>
Provider: RuntimeMXBean

Whereas on Java 9, it’ll be this:

PID: <some pid>
Provider: ProcessHandle

That is, the ProcessIdProvider class from the JAR root is used on Java 8 and earlier, and the one from META-INF/versions/9 on Java 9.

Conclusion

While javac, jar, java and other JDK tools already support multi-release JARs, build tools like Maven still need to catch up. Luckily, it can be done using some plug-ins for the time being, but it’s my hope that Maven et al. will provide proper support for creating MR JARs out of the box some time soon.

Others have been thinking about the creation of MR JARs, too. E.g. check out this post by my colleague David M. Lloyd. David uses a separate Maven project for the Java 9 specific classes, which are then copied back into the main project using the Maven dependency plug-in. Personally, I prefer to have all the sources within one single project, as I find that a tad simpler, though it’s not without quirks either. Specifically, if you have both src/main/java and src/main/java9 configured as source directories within your IDE, you’ll get an error about the duplicated class ProcessIdProvider. This can be ignored (you might also remove src/main/java9 as a source directory from the IDE if you don’t need to touch it), but it may be annoying to some.

One could think about having the Java 9 classes in another package, e.g. java9.com.example, and then using the Maven shade plug-in to relocate them to com.example when building the project, though this seems like a lot of effort for a small gain. Ultimately, it’d be desirable if IDEs also added support for MR JARs and for multiple compilations with different source and target directories within a single project.

Any feedback on this or other approaches for creating MR JARs is welcome in the comments section below. The complete source code of this blog post can be found on GitHub.

Tool Time: Preventing leaky APIs with jQAssistant

Posted by    |       |    Tagged as Discussions

If you’ve ever watched the great show "Home Improvement", you’ll know that a fool with a tool is still a fool. At the same time though, the right tool used in the right way can be very effective for solving complex issues.

In this post I’d like to introduce a tool called jQAssistant which I’ve found very useful for running all sorts of analyses of a project’s code base, e.g. for preventing the leakage of internal types in the public API of a library. This is planned to be the first post in a blog series on developer-centric tools we’ve come to value when working on the different libraries of the Hibernate family.

Keeping your API clean

When providing a library it is an established best practice to clearly distinguish between those parts of the code base intended to be accessed by users (the API of the library) and those parts not meant for outside access.

Having a clearly defined API helps to reduce complexity for the user (they only need to learn and understand the API classes but not all the implementation details), while giving the library authors freedom to refactor and alter implementation classes as they deem it necessary. Usually, the separation of API and implementation types is achieved by using specific package names, with all the non-public parts being located under a package named internal, impl or similar.

One thing you have to be very careful about though is to not leak any internal types in the API. E.g. a method definition like the following is something which should be avoided:

package com.example;

import com.example.internal.Foo;

public interface MyPublicService {

    Foo doFoo();
}

MyPublicService is part of the public API (as it is not part of an internal package), but the doFoo() method returns the internal type Foo. Users of doFoo() thus would have to deal with an implementation type, which is exactly what you wanted to prevent when splitting the API and implementation parts of the library.

Unfortunately, inconsistent APIs like this are easily introduced if one isn’t very careful. This is where tooling helps: by searching for such malformed APIs in an automated way, they can be spotted early on and avoided.

Introducing jQAssistant

jQAssistant is an open-source tool allowing you to do exactly that. It parses a project’s code base and creates a model of all the classes, methods, fields etc. in a Neo4j graph database. Using Neo4j’s powerful Cypher query language, you then can execute queries to detect specific patterns and structures in your code base which you are interested in.

Setting up jQAssistant for your project is fairly simple. Assuming you are working with Maven, all that’s needed is the following plug-in configuration in your pom.xml (refer to the official documentation for further details and configuration options):

pom.xml
...
<build>
    <plugins>
        <plugin>
            <groupId>com.buschmais.jqassistant.scm</groupId>
            <artifactId>jqassistant-maven-plugin</artifactId>
            <version>1.1.4</version>
            <executions>
                <execution>
                    <goals>
                        <goal>scan</goal>
                        <goal>analyze</goal>
                    </goals>
                    <configuration>
                        <failOnViolations>true</failOnViolations>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
...

With the plug-in in place, we can define a Cypher query which finds any API method with an internal return type:

jqassistant/rules.xml
<?xml version="1.0" encoding="UTF-8"?>
<jqa:jqassistant-rules xmlns:jqa="http://www.buschmais.com/jqassistant/core/analysis/rules/schema/v1.0">

    <constraint id="my-rules:PublicMethodsMayNotExposeInternalTypes">
        <description>API/SPI methods must not expose internal types.</description>
        <cypher><![CDATA[
            MATCH
                (class)-[:`DECLARES`]->(method)-[:`RETURNS`]->(returntype)
            WHERE
                NOT class.fqn =~ ".*\\.internal\\..*"
                AND (method.visibility="public" OR method.visibility="protected")
                AND returntype.fqn =~ ".*\\.internal\\..*"
            RETURN
                method
        ]]></cypher>
    </constraint>

    <group id="default">
        <includeConstraint refId="my-rules:PublicMethodsMayNotExposeInternalTypes" />
    </group>

</jqa:jqassistant-rules>

jQAssistant rules are given in a file named jqassistant/rules.xml by default.

The rules themselves are defined using the Cypher query language. If you haven’t used Cypher before, it may feel a bit unfamiliar at first, but from my own experience I can tell you that you get the hang of it pretty quickly. Essentially, you describe patterns of graph nodes, their properties, their types (expressed via "labels", a kind of tag) and their relationships.

In the query above, we use a pattern including three nodes ("class", "method" and "returntype") and two relationships. There must be a relationship of type "DECLARES" from the "class" node to the "method" node and another relationship of type "RETURNS" from the "method" to the "returntype" node.

As we are only interested in methods leaking internal types through the API, the WHERE clause is used to further refine the selection:

  • The declaring class must be part of the API (it must not be in an internal package, as expressed by filtering on the fully-qualified class name),

  • the method must either have public or protected visibility and

  • the return type must be located in an internal package.

To execute the rule, simply build the project with the jQAssistant Maven plug-in configured as above. The plug-in will automatically execute all rules of the default group given in the rules file. If there is any result for any of the executed rules, the build will fail, displaying the result(s) of the affected rules:

[INFO] --- jqassistant-maven-plugin:1.1.4:analyze (default) @ jqassistant-demo ---
...
[ERROR] --[ Constraint Violation ]-----------------------------------------
[ERROR] Constraint: my-rules:PublicMethodsMayNotExposeInternalTypes
[ERROR] Severity: INFO
[ERROR] API/SPI methods must not expose internal types.
[ERROR]   method=com.example.MyPublicService#com.example.internal.Foo doFoo()
[ERROR] -------------------------------------------------------------------
...
[INFO] BUILD FAILURE

API methods should only return API types, but they also should only take API types as parameters. Let’s expand the Cypher query to cover this case, too:

jqassistant/rules.xml
...
<constraint id="my-rules:PublicMethodsMayNotExposeInternalTypes">
    <description>API/SPI methods must not expose internal types.</description>
    <cypher><![CDATA[
      // return values
      MATCH
          (class)-[:`DECLARES`]->(method)-[:`RETURNS`]->(returntype)
      WHERE
          NOT class.fqn =~ ".*\\.internal\\..*"
          AND (method.visibility="public" OR method.visibility="protected")
          AND returntype.fqn =~ ".*\\.internal\\..*"
      RETURN
          method

      // parameters
      UNION ALL
      MATCH
          (class)-[:`DECLARES`]->(method)-[:`HAS`]->(parameter)-[:`OF_TYPE`]->(parametertype)
      WHERE
          NOT class.fqn =~ ".*\\.internal\\..*"
          AND (method.visibility="public" OR method.visibility="protected")
          AND parametertype.fqn =~ ".*\\.internal\\..*"
      RETURN
          method
    ]]></cypher>
</constraint>
...

Similar to SQL, we can add further results using the UNION ALL clause. Searching for leaking parameters works much like the return value check; the only difference is the node pattern to detect: there must be a node (the class) with a DECLARES relationship to another node (the method), which has a HAS relationship to a third node (the parameter), which in turn has an OF_TYPE relationship to a fourth node representing the parameter’s type. The same rules for package names and method visibility apply as for the return value check.

Browsing the model of your project

When declaring rules as the one above, it is vital to know the meta-model of your project’s graph representation, e.g. which types of nodes there are (i.e. which labels they have), what types of relationships there are, which properties the nodes have and so on. This is described in great depth in the jQAssistant reference documentation.

But as jQAssistant is based on Neo4j, you also can use the browser app coming with the database for interactively exploring your project’s structure. To do so, simply run the following command:

mvn jqassistant:scan jqassistant:server

This will populate jQAssistant’s embedded Neo4j database with your project’s structure and start a Neo4j server. In a browser, go to http://localhost:7474/browser/ and you can explore the project code base, run Cypher queries etc. Start by selecting a node label or relationship type on the left or by submitting a Cypher query:

[Figure: Browsing a jQAssistant model in Neo4j]

Trying it out yourself

You can find a complete example of using jQAssistant in a Maven project here on GitHub. You may also take a look at the rules file used by Hibernate Validator. Besides the check for methods exposing internal types, this file defines some more rules:

  • public fields in API types must not expose implementation types

  • API types must not extend implementation types

These checks run regularly on our CI server, preventing the accidental introduction of leaky APIs very effectively.

Creating a model of a software project in a graph database is a great idea. The powerful Cypher query language allows you to search for interesting structures and patterns in your project in a rather intuitive way. The detection of leaky APIs is just one example. For instance, you may define a layered architecture for your business application and ensure that there are no illegal dependencies between the application layers.

jQAssistant is not limited to Java classes, either. Besides the Java plug-in, the tool provides many other scanners, e.g. for Maven POM files, JPA persistence units or XML files, allowing you to run all kinds of analyses tailored to the specific needs of your project.

Updating Hibernate ORM in WildFly

Posted by    |       |    Tagged as Hibernate ORM

In this post I’ll show you how easy it is to use the latest and greatest version of Hibernate ORM with WildFly 10.

Traditionally, updating Hibernate in WildFly required some good knowledge of the server’s module system and the structure of the ORM modules. It certainly was doable, but it involved search/replace in existing module descriptors and generally wasn’t very convenient.

This has become much simpler with last week’s release of Hibernate ORM 5.2.1!

We now provide a ZIP archive containing all the required modules, making it a breeze to add the latest version of Hibernate to an existing WildFly instance. And best of all: the version of Hibernate packaged with the application server remains untouched; switching between it and the new version is just a matter of setting one small configuration option, and you can go back at any time.

Preparations

The ZIP file is available in Maven Central, so you can automate its download as part of your build if needed. These are the GAV coordinates:

  • groupId: org.hibernate

  • artifactId: hibernate-orm-modules

  • version: 5.2.1.Final

  • classifier: wildfly-10-dist

  • type: zip

Unzip the archive into the modules directory of your WildFly instance. If done correctly, you should see two sub-directories under modules: system (the server’s original modules) and org (the new Hibernate ORM modules).

Choosing the right version of Hibernate ORM

Having added the Hibernate ORM 5.2 modules to the server, you need to configure your application so it uses that specific Hibernate version instead of the default one coming with the server. To do so, just add the following property to your META-INF/persistence.xml file:

...
<properties>
    <property name="jboss.as.jpa.providerModule" value="org.hibernate:5.2" />
    ...
</properties>
...

In case you have several Hibernate releases added to your WildFly server, you can also specify an exact micro version:

<property name="jboss.as.jpa.providerModule" value="org.hibernate:5.2.1.Final" />

Example project

As an example for using the module ZIP, I’ve created a small Maven project. You can find it in the hibernate-demos repository. To run the example project, simply execute mvn clean verify. Let’s take a quick look at some interesting parts.

First, in the Maven POM file, the maven-dependency-plugin is used to

  • download the WildFly server and unpack it into the target directory

  • download the Hibernate ORM module ZIP and extract its contents into the modules directory of WildFly:

pom.xml
...
<properties>
    <hibernate.version>5.2.1.Final</hibernate.version>
    <wildfly.version>10.0.0.Final</wildfly.version>
    <jboss.home>${project.build.directory}/wildfly-${wildfly.version}</jboss.home>
    ...
</properties>
...
<plugin>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>unpack</id>
            <phase>pre-integration-test</phase>
            <goals>
                <goal>unpack</goal>
            </goals>
            <configuration>
                <artifactItems>
                    <!-- WildFly server; Unpacked into target/wildfly-10.0.0.Final -->
                    <artifactItem>
                        <groupId>org.wildfly</groupId>
                        <artifactId>wildfly-dist</artifactId>
                        <version>${wildfly.version}</version>
                        <type>zip</type>
                        <overWrite>false</overWrite>
                        <outputDirectory>${project.build.directory}</outputDirectory>
                    </artifactItem>
                    <!-- Hibernate ORM modules; Unpacked into target/wildfly-10.0.0.Final/modules -->
                    <artifactItem>
                        <groupId>org.hibernate</groupId>
                        <artifactId>hibernate-orm-modules</artifactId>
                        <version>${hibernate.version}</version>
                        <classifier>wildfly-10-dist</classifier>
                        <type>zip</type>
                        <overWrite>false</overWrite>
                        <outputDirectory>${jboss.home}/modules</outputDirectory>
                    </artifactItem>
                </artifactItems>
            </configuration>
        </execution>
    </executions>
</plugin>
...

Next, Hibernate ORM 5.2 must be enabled for the application using the jboss.as.jpa.providerModule property as discussed above:

META-INF/persistence.xml
<?xml version="1.0" encoding="UTF-8"?>
<persistence
    xmlns="http://xmlns.jcp.org/xml/ns/persistence"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence
        http://xmlns.jcp.org/xml/ns/persistence/persistence_2_1.xsd"
    version="2.1">

    <persistence-unit name="testPu" transaction-type="JTA">
        <jta-data-source>java:jboss/datasources/ExampleDS</jta-data-source>

        <properties>
            <property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
            <property name="jboss.as.jpa.providerModule" value="org.hibernate:5.2"/>
        </properties>
    </persistence-unit>
</persistence>

This persistence unit uses the example datasource configured by default in WildFly, which is backed by an in-memory H2 database.

Finally, we need a test to ensure that Hibernate ORM 5.2 is actually used and not the 5.0 version coming with WildFly. To do so, we can simply invoke one of the EntityManager methods which are exposed through Hibernate’s classic Session API as of 5.2:

HibernateModulesOnWildflyIT.java
@RunWith(Arquillian.class)
public class HibernateModulesOnWildflyIT {

    @Deployment
    public static WebArchive createDeployment() {
        return ShrinkWrap.create( WebArchive.class )
                .addClass( Kryptonite.class )
                .addAsWebInfResource( EmptyAsset.INSTANCE, "beans.xml" )
                .addAsResource( "META-INF/persistence.xml" );
    }

    @PersistenceContext
    private EntityManager entityManager;

    @Test
    @Transactional(value=TransactionMode.ROLLBACK)
    public void shouldUseHibernateOrm52() {
        Session session = entityManager.unwrap( Session.class );

        Kryptonite kryptonite1 = new Kryptonite();
        kryptonite1.id = 1L;
        kryptonite1.description = "Some Kryptonite";
        session.persist( kryptonite1 );

        session.flush();
        session.clear();

        // EntityManager methods exposed through Session only as of 5.2
        Kryptonite loaded = session.find( Kryptonite.class, 1L );

        assertThat( loaded.description, equalTo( "Some Kryptonite" ) );
    }
}

The call to find() would fail with a NoSuchMethodError if Hibernate ORM 5.0 instead of 5.2 was used. To try it out, simply remove the jboss.as.jpa.providerModule property from persistence.xml.

The test itself is executed using the great Arquillian tool. It creates a test WAR file with the contents configured in the createDeployment() method, starts up the WildFly server, deploys the WAR file and runs the test within the running container. Using the Arquillian Transactional Extension, the test method is executed within a transaction which is rolled back afterwards.

Feedback welcome!

The Hibernate ORM module ZIP file is a very recent addition to Hibernate. Should you run into any issues, please let us know and we’ll be happy to help.

You can learn more in this topical guide. Hibernate ORM 5.2.1 is the first release to provide a module ZIP; the next 5.1.x release will provide one, too.

Give it a try and let us know about your feedback!

Bean Validation and the Jigsaw Liaison

Posted by    |       |    Tagged as Discussions

Unless you’ve been living under a rock for the last months and years, you’ve probably heard about the efforts for adding a module system to the Java platform, code-named "Project Jigsaw".

Defining a module system and modularizing a huge system like the JDK is by no means a trivial task, so it’s no surprise that the advent of Jigsaw has been delayed several times. But I think by now it’s a rather safe bet to expect Jigsaw to be released as part of JDK 9 eventually (the exact release date remains to be defined), especially since it became part of the early access builds a while ago.

This means that if you are an author of a library or framework, you should grab the latest JDK preview build and make sure your lib can be used a) on Java 9 and b) within modularized applications using Jigsaw.

The latter is what we are going to discuss in more detail in the following, taking Bean Validation and its reference implementation, Hibernate Validator as an example. We’ll see what is needed to convert them into Jigsaw modules and use them in a modularized environment.

Now one might ask why having a module system and providing libraries as modules based on such system is a good thing? There are many facets to that, but I think a good answer is that modularization is a great tool for building software systems from encapsulated, loosely-coupled and re-usable components with clearly defined interfaces. It makes API design a very conscious decision and on the other hand gives library authors the freedom to change internal implementation aspects of their module without risking compatibility issues with clients.
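The central artifact expressing this contract in Jigsaw is the module-info.java descriptor, which spells out exactly which packages a module exports and which other modules it requires. A hypothetical sketch (the module and package names below are made up for illustration, they are not the actual descriptors of the libraries discussed here):

```java
// module-info.java - hypothetical example of a module descriptor
module com.example.library {
    // only these packages form the API; everything else stays encapsulated
    exports com.example.library.api;

    // dependencies on other modules are stated explicitly
    requires java.logging;
}
```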

If you are not yet familiar with Jigsaw at all, it’s recommended to take a look at the project home page. It contains links to many useful resources, especially check out "The State of the Module System" which gives a great overview. I also found this two-part introduction very helpful.

Getting started

In order to follow our little experiment of "Jigsaw-ifying" Bean Validation, you should be using a Bash-compatible shell and be able to run commands such as wget. On systems lacking support for Bash by default, Cygwin can be used. You also need git to download some source code from GitHub.

Let’s get started by downloading and installing the latest JDK 9 early access build (build 122 has been used when writing this post). Then run java -version to confirm that JDK 9 is enabled. You should see an output like this:

java version "9-ea"
Java(TM) SE Runtime Environment (build 9-ea+122)
Java HotSpot(TM) 64-Bit Server VM (build 9-ea+122, mixed mode)

After that, create a base directory for our experiments:

mkdir beanvalidation-with-jigsaw

Change into that directory and create some more sub-directories for storing the required modules and 3rd-party libraries:

cd beanvalidation-with-jigsaw

mkdir sources
mkdir modules
mkdir automatic-modules
mkdir tools

As tooling support for Java 9 / Jigsaw is still rather limited at this point, we are going to use the plain javac and java commands to compile and test the code. That’s not as bad as it sounds, and it’s indeed a nice exercise to learn about the existing and new compiler options. Still, I admit I’m looking forward to the point where the known build tools such as Maven fully support Jigsaw and allow compiling and testing modularized source code. For now, though, the plain CLI tools will do the trick :)

Download the source code for Bean Validation and Hibernate Validator from GitHub:

git clone https://github.com/beanvalidation/beanvalidation-api.git sources/beanvalidation-api
git clone https://github.com/hibernate/hibernate-validator.git sources/hibernate-validator

As we cannot leverage Maven’s dependency management, we fetch the dependencies required by Hibernate Validator via wget, storing them in the automatic-modules (dependencies) and tools (the JBoss Logging annotation processor needed for generating logger implementations) directories, respectively:

wget http://repo1.maven.org/maven2/joda-time/joda-time/2.9/joda-time-2.9.jar -P automatic-modules
wget http://repo1.maven.org/maven2/javax/el/javax.el-api/2.2.4/javax.el-api-2.2.4.jar -P automatic-modules
wget http://repo1.maven.org/maven2/org/jsoup/jsoup/1.8.3/jsoup-1.8.3.jar -P automatic-modules
wget http://repo1.maven.org/maven2/org/jboss/logging/jboss-logging-annotations/2.0.1.Final/jboss-logging-annotations-2.0.1.Final.jar -P automatic-modules
wget http://repo1.maven.org/maven2/org/jboss/logging/jboss-logging/3.3.0.Final/jboss-logging-3.3.0.Final.jar -P automatic-modules
wget http://repo1.maven.org/maven2/com/fasterxml/classmate/1.1.0/classmate-1.1.0.jar -P automatic-modules
wget http://repo1.maven.org/maven2/com/thoughtworks/paranamer/paranamer/2.5.5/paranamer-2.5.5.jar -P automatic-modules
wget http://repo1.maven.org/maven2/org/hibernate/javax/persistence/hibernate-jpa-2.1-api/1.0.0.Final/hibernate-jpa-2.1-api-1.0.0.Final.jar -P automatic-modules

wget http://repo1.maven.org/maven2/org/jboss/logging/jboss-logging-processor/2.0.1.Final/jboss-logging-processor-2.0.1.Final.jar -P tools
wget http://repo1.maven.org/maven2/org/jboss/jdeparser/jdeparser/2.0.0.Final/jdeparser-2.0.0.Final.jar -P tools
wget http://repo1.maven.org/maven2/javax/annotation/jsr250-api/1.0/jsr250-api-1.0.jar -P tools

Automatic modules are Jigsaw’s way of integrating libraries which have not yet been modularized themselves into a modularized environment. Essentially, an automatic module is a module which exports all its packages and reads all other named modules.

Its module name is derived from the JAR file name by applying some rules for splitting artifact name and version and replacing hyphens with dots. E.g. jboss-logging-annotations-2.0.1.Final.jar will have the automatic module name jboss.logging.annotations.
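The derivation can be approximated in a few lines of Java. This is a simplified sketch only; the authoritative algorithm is implemented by the JDK’s module system (see ModuleFinder) and handles further edge cases:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class AutoModuleName {

    // Simplified sketch of automatic module name derivation:
    // strip ".jar", cut off the version (first "-" followed by a digit),
    // then turn the remaining separators into dots
    static String automaticModuleName(String jarFileName) {
        String name = jarFileName.replaceAll("\\.jar$", "");

        Matcher versionStart = Pattern.compile("-(\\d+(\\.|$))").matcher(name);
        if (versionStart.find()) {
            name = name.substring(0, versionStart.start());
        }

        return name.replaceAll("[^A-Za-z0-9]", ".");
    }

    public static void main(String[] args) {
        // prints jboss.logging.annotations
        System.out.println(automaticModuleName("jboss-logging-annotations-2.0.1.Final.jar"));
    }
}
```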

Creating modules for Bean Validation API and Hibernate Validator

Currently, the Bean Validation API and Hibernate Validator are in a state where they can be compiled out of the box using Java 9. But what’s still missing are the required module descriptors, which describe a module’s name, its public API, its dependencies on other modules and some other things.

Module descriptors are Java files named module-info.java and live in the root of a given module. Create the descriptor for the Bean Validation API with the following contents:

sources/beanvalidation-api/src/main/java/module-info.java
module javax.validation {(1)
    exports javax.validation;(2)
    exports javax.validation.bootstrap;
    exports javax.validation.constraints;
    exports javax.validation.constraintvalidation;
    exports javax.validation.executable;
    exports javax.validation.groups;
    exports javax.validation.metadata;
    exports javax.validation.spi;

    uses javax.validation.spi.ValidationProvider;(3)
}
1 Module name
2 All the packages the module exports (as this is an API module, all contained packages are exported)
3 The usage of the ValidationProvider service

 

Services have been present in Java for a long time. Originally added as an internal component in the JDK, the service loader mechanism became an official part of the platform as of Java 6.

Since then it has seen wide adoption for building extensible applications from loosely coupled components. With its help, service consumers can solely be implemented against a well-defined service contract, without knowing upfront about a specific service provider and its implementation. Jigsaw embraces the existing service concept and makes services first-class citizens of the modularized world.

Luckily, Bean Validation has been using the service mechanism for locating providers (such as Hibernate Validator) from the get-go, so things play out nicely with Jigsaw. As we’ll see in a minute, Hibernate Validator provides an implementation of the ValidationProvider service, allowing the user to bootstrap it without depending on this specific implementation.
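For reference, this is how the pre-Jigsaw registration of the provider looks on the classpath: a plain text file named after the service interface, containing the name of the implementation class. For Hibernate Validator this is:

```
# src/main/resources/META-INF/services/javax.validation.spi.ValidationProvider
org.hibernate.validator.HibernateValidator
```

The provides ... with ... clause in a module descriptor is the modularized counterpart of exactly this file.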

But for now let’s compile the Bean Validation module:

export BASE=`pwd`
cd sources/beanvalidation-api
javac -d $BASE/modules/javax.validation $(find src/main/java -name "*.java")
cd $BASE

After compilation, the built module can be found under modules/javax.validation. Note that modules usually will be packaged and redistributed as JAR files, but to keep things simple let’s just work with class directory structures here.

Things get a bit more interesting when it comes to Hibernate Validator. Its module descriptor should look like this:

sources/hibernate-validator/engine/src/main/java/module-info.java
module org.hibernate.validator.engine {(1)
    exports org.hibernate.validator;(2)
    exports org.hibernate.validator.cfg;
    exports org.hibernate.validator.cfg.context;
    exports org.hibernate.validator.cfg.defs;
    exports org.hibernate.validator.constraints;
    exports org.hibernate.validator.constraints.br;
    exports org.hibernate.validator.constraintvalidation;
    exports org.hibernate.validator.constraintvalidators;
    exports org.hibernate.validator.engine;
    exports org.hibernate.validator.group;
    exports org.hibernate.validator.messageinterpolation;
    exports org.hibernate.validator.parameternameprovider;
    exports org.hibernate.validator.path;
    exports org.hibernate.validator.resourceloading;
    exports org.hibernate.validator.spi.cfg;
    exports org.hibernate.validator.spi.group;
    exports org.hibernate.validator.spi.resourceloading;
    exports org.hibernate.validator.spi.time;
    exports org.hibernate.validator.spi.valuehandling;
    exports org.hibernate.validator.valuehandling;

    exports org.hibernate.validator.internal.util.logging to jboss.logging;(3)
    exports org.hibernate.validator.internal.xml to java.xml.bind;

    requires javax.validation;(4)
    requires joda.time;
    requires javax.el.api;
    requires jsoup;
    requires jboss.logging.annotations;
    requires jboss.logging;
    requires classmate;
    requires paranamer;
    requires hibernate.jpa;
    requires java.xml.bind;
    requires java.xml;
    requires java.scripting;
    requires javafx.base;

    provides javax.validation.spi.ValidationProvider with
        org.hibernate.validator.HibernateValidator;(5)

    uses javax.validation.ConstraintValidator;(6)
}
1 The module name
2 All the packages the module exports; Hibernate Validator has always had a very well-defined public API, with all the code parts not meant for public usage living in an internal package. Naturally, only the non-internal parts are exported. Things will be more complex when modularizing an existing component without such a clearly defined public API. Likely you’ll need to move some classes around first, untangling public API and internal implementation parts.
3 Two noteworthy exceptions are o.h.v.internal.util.logging and o.h.v.internal.xml, which are exported via so-called "qualified exports". This means that only the jboss.logging module may access the logging package and only java.xml.bind may access the XML package. This is needed as these modules require reflective access to the logging and XML classes, respectively. Using qualified exports, this exposure of internal classes can be limited to the smallest degree possible.
4 All the modules which this module requires. These are the javax.validation module we just built, all the automatic modules we downloaded before and some modules coming with the JDK itself (java.xml.bind, javafx.base etc).
Some of these dependencies might be considered optional, e.g. Joda Time would only be needed at runtime when actually validating Joda Time types with @Past or @Future. Unfortunately - and in contrast to OSGi or JBoss Modules - Jigsaw doesn’t support the notion of optional module requirements, meaning that all module requirements must be satisfied at compile time as well as at runtime. That’s a pity, as it prevents a common pattern for libraries which expose certain functionality depending on which dependencies/classes are available at runtime.
The "right answer" with Jigsaw would be to extract these optional features into their own modules (e.g. hibernate.validator.joda.time, hibernate.validator.jsoup etc.), but this comes at the price of making things more complex for users, who then need to deal with all these modules.
5 The module provides an implementation of the ValidationProvider service
6 The module uses the ConstraintValidator service, see below

 
With the module descriptor in place, we can compile the Jigsaw-enabled Hibernate Validator module:

cd sources/hibernate-validator/engine
mkdir -p target/generated-sources/jaxb

xjc -enableIntrospection -p org.hibernate.validator.internal.xml \(1)
    -extension \
    -target 2.1 \
    -d target/generated-sources/jaxb \
    src/main/xsd/validation-configuration-1.1.xsd src/main/xsd/validation-mapping-1.1.xsd \
    -b src/main/xjb/binding-customization.xjb

javac -addmods java.xml.bind,java.annotations.common \(2)
    -g \
    -modulepath $BASE/modules:$BASE/automatic-modules \
    -processorpath $BASE/tools/jboss-logging-processor-2.0.1.Final.jar:$BASE/tools/jdeparser-2.0.0.Final.jar:$BASE/tools/jsr250-api-1.0.jar:$BASE/automatic-modules/jboss-logging-annotations-2.0.1.Final.jar:$BASE/automatic-modules/jboss-logging-3.3.0.Final.jar \
    -d $BASE/modules/org.hibernate.validator.engine \
    $(find src/main/java -name "*.java") $(find target/generated-sources/jaxb -name "*.java")

cp -r src/main/resources/* $BASE/modules/org.hibernate.validator.engine;(3)

cp -r src/main/xsd/* $BASE/modules/org.hibernate.validator.engine/META-INF;(4)
cd $BASE
1 The xjc utility is used to create some JAXB types from the XML constraint descriptor schemas
2 Compile the source code via javac
3 Copy error message resource bundle into the module directory
4 Copy XML schema files into the module directory

 
Note how the module path used for compilation refers to the modules directory (containing the javax.validation module) and the automatic-modules directory (containing all the dependencies such as Joda Time etc.).

The resulting module is located under modules/org.hibernate.validator.engine.

Giving it a test ride

Having converted Bean Validation API and Hibernate Validator into proper Jigsaw modules, it’s about time to give these modules a test ride. Create a new compilation unit for that:

mkdir -p sources/com.example.acme/src/main/java/com/example/acme

Within that directory structure, create a very simple domain class and a class with a main method for validating it:

sources/com.example.acme/src/main/java/com/example/acme/Car.java
package com.example.acme;

import java.util.List;

import javax.validation.constraints.Min;

public class Car {

    @Min(1)
    public int seatCount;

    public List<String> passengers;

    public Car(int seatCount, List<String> passengers) {
        this.seatCount = seatCount;
        this.passengers = passengers;
    }
}
sources/com.example.acme/src/main/java/com/example/acme/ValidationTest.java
package com.example.acme;

import java.util.Collections;
import java.util.Set;

import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;

public class ValidationTest {

    public static void main(String... args) {
        Validator validator = Validation.buildDefaultValidatorFactory()
            .getValidator();

        Set<ConstraintViolation<Car>> violations = validator.validate( new Car( 0, Collections.emptyList() ) );

        System.out.println( "Validation error: " + violations.iterator().next().getMessage() );
    }
}

This obtains a Validator object via Validation#buildDefaultValidatorFactory() (which internally uses the service mechanism described above) and performs a simple validation of a Car object.

Of course we need a module-info.java, too:

sources/com.example.acme/src/main/java/module-info.java
module com.example.acme {
    exports com.example.acme;

    requires javax.validation;
}

That should look familiar by now: we just export the single package (so Hibernate Validator can access the state of the Car object) and depend on the Bean Validation API module.

Also compilation of this module isn’t much news:

cd sources/com.example.acme
javac \
    -g \
    -modulepath $BASE/modules:$BASE/automatic-modules \
    -d $BASE/modules/com.example.acme $(find src/main/java -name "*.java")
cd $BASE

And with that, we finally can run a first test of Bean Validation under Jigsaw:

java \
    -modulepath modules:automatic-modules \
    -m com.example.acme/com.example.acme.ValidationTest

Similar to javac, there is a new modulepath option for the java command, pointing to one or more directories with Jigsaw modules. The -m switch specifies the main class to run by giving its module name and fully qualified class name.

Mh, that was not really successful:

HV000149: An exception occurred during message interpolation
    ...
Caused by: java.lang.UnsupportedOperationException: ResourceBundle.Control not supported in named modules
    at java.util.ResourceBundle.checkNamedModule(java.base@9-ea/ResourceBundle.java:1551)
    at java.util.ResourceBundle.getBundle(java.base@9-ea/ResourceBundle.java:1533)
    at org.hibernate.validator.resourceloading.PlatformResourceBundleLocator.loadBundle(org.hibernate.validator.engine/PlatformResourceBundleLocator.java:135)

What’s that about? Hibernate Validator is using the Control class in order to merge the contents (error messages) of several resource bundles with the same name found on the classpath. This is not supported in the modularized environment any longer, hence the exception above is raised. Eventually, Hibernate Validator should handle this situation automatically (this is tracked under HV-1073).
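To illustrate what this bundle aggregation boils down to, here is a simplified, hypothetical sketch (not Hibernate Validator’s actual implementation): entries from several same-named bundles are merged into a single view; in this sketch, earlier bundles win on conflicting keys.

```java
import java.util.*;

public class BundleAggregation {

    // Merge several same-named bundles into one view; entries from
    // earlier bundles take precedence over later ones
    static Map<String, String> aggregate(List<ResourceBundle> bundles) {
        Map<String, String> merged = new HashMap<>();
        for (ResourceBundle bundle : bundles) {
            for (String key : Collections.list(bundle.getKeys())) {
                merged.putIfAbsent(key, bundle.getString(key));
            }
        }
        return merged;
    }

    static Map<String, String> demo() {
        // Two in-memory bundles standing in for two ValidationMessages
        // bundles found in different classpath entries
        ResourceBundle first = new ListResourceBundle() {
            protected Object[][] getContents() {
                return new Object[][] { { "greeting", "hello from first" } };
            }
        };
        ResourceBundle second = new ListResourceBundle() {
            protected Object[][] getContents() {
                return new Object[][] { { "greeting", "hello from second" }, { "other", "x" } };
            }
        };
        return aggregate(List.of(first, second));
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```

The real implementation additionally deals with locales and classloaders via ResourceBundle.Control, which is exactly the part JDK 9 rejects for callers in named modules.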

For now let’s hack around it and disable the troublesome bundle aggregation in Hibernate Validator’s AbstractMessageInterpolator. To do so, change true to false in the constructor invocation on line 165:

sources/hibernate-validator/engine/src/main/java/org/hibernate/validator/messageinterpolation/AbstractMessageInterpolator.java
...
new PlatformResourceBundleLocator(
    CONTRIBUTOR_VALIDATION_MESSAGES,
    null,
    false
);
...

Re-compile the Hibernate Validator module. After running the test again, you should now see the following output on the console:

Validation error: must be greater than or equal to 1

Tada, the first successful bean validation in the Jigsaw environment :)

Let me quickly recap what has happened so far:

  • We added a module descriptor to the Bean Validation API, making it a proper Jigsaw module

  • We added a module descriptor to Hibernate Validator; this Bean Validation provider will be discovered by the API module using the service mechanism

  • We created a test module with a main method, which uses the Bean Validation API to perform a simple object validation

(Not) overstepping boundaries

Now let’s be nasty and see whether the module system actually is doing its job as expected. For that, add a module requirement on Hibernate Validator to the module descriptor (so it’s considered for compilation at all) and cast the validator to the internal implementation type in ValidationTest:

sources/com.example.acme/src/main/java/module-info.java
module com.example.acme {
    exports com.example.acme;

    requires javax.validation;
    requires org.hibernate.validator.engine;
}
sources/com.example.acme/src/main/java/com/example/acme/ValidationTest.java
package com.example.acme;

import javax.validation.Validation;

import org.hibernate.validator.internal.engine.ValidatorImpl;

public class ValidationTest {

    public static void main(String... args) throws Exception {
        ValidatorImpl validator = (ValidatorImpl) Validation.buildDefaultValidatorFactory()
            .getValidator();
    }
}

Running javac again, you should now get a compilation error, complaining about the type not being found. So Jigsaw prevents accesses to non-exported types. If you like, try referencing anything from the packages exported by Hibernate Validator, which will work.

That’s a great advantage over the traditional flat classpath, where you might have organized your code base into public and internal parts but then had to hope for users of your library not to step across the line and - accidentally or intentionally - access internal classes.

Custom constraints

With the modules basically working, it’s time to get a bit more advanced and create a custom Bean Validation constraint. This one should make sure that a car does not have more passengers than seats available.

For that we need an annotation type:

sources/com.example.acme/src/main/java/com/example/acme/PassengersDontExceedSeatCount.java
package com.example.acme;

import static java.lang.annotation.ElementType.TYPE;
import static java.lang.annotation.RetentionPolicy.RUNTIME;

import java.lang.annotation.Documented;
import java.lang.annotation.Retention;
import java.lang.annotation.Target;

import javax.validation.Constraint;
import javax.validation.Payload;

@Documented
@Constraint(validatedBy = { PassengersDontExceedSeatCountValidator.class })
@Target({ TYPE })
@Retention(RUNTIME)
public @interface PassengersDontExceedSeatCount {

    String message() default "{com.example.acme.PassengersDontExceedSeatCount.message}";
    Class<?>[] groups() default { };
    Class<? extends Payload>[] payload() default { };
}

And also a constraint validator implementation:

sources/com.example.acme/src/main/java/com/example/acme/PassengersDontExceedSeatCountValidator.java
package com.example.acme;

import javax.validation.ConstraintValidator;
import javax.validation.ConstraintValidatorContext;

import com.example.acme.PassengersDontExceedSeatCount;

public class PassengersDontExceedSeatCountValidator implements
        ConstraintValidator<PassengersDontExceedSeatCount, Car> {

    @Override
    public void initialize(PassengersDontExceedSeatCount constraintAnnotation) {}

    @Override
    public boolean isValid(Car car, ConstraintValidatorContext constraintValidatorContext) {
        if ( car == null ) {
            return true;
        }

        return car.passengers == null || car.passengers.size() <= car.seatCount;
    }
}

A resource bundle with the error message for the constraint is needed, too:

sources/com.example.acme/src/main/resources/ValidationMessages.properties
com.example.acme.PassengersDontExceedSeatCount.message=Passenger count must not exceed seat count

Now we can put the new constraint type to the Car class and finally validate it:

sources/com.example.acme/src/main/java/com/example/acme/Car.java
@PassengersDontExceedSeatCount
public class Car {
    ...
}
sources/com.example.acme/src/main/java/com/example/acme/ValidationTest.java
package com.example.acme;

import java.util.Arrays;
import java.util.Set;

import javax.validation.Validation;
import javax.validation.Validator;
import javax.validation.ConstraintViolation;

public class ValidationTest {

    public static void main(String... args) throws Exception {
        Validator validator = Validation.buildDefaultValidatorFactory()
            .getValidator();

        Set<ConstraintViolation<Car>> violations = validator.validate(
            new Car( 2, Arrays.asList( "Anna", "Bob", "Alice" ) )
        );

        System.out.println( "Validation error: " + violations.iterator().next().getMessage() );
    }
}

Compile the example module again; don’t forget to copy the resource bundle to the module directory:

cd sources/com.example.acme
javac \
    -g \
    -modulepath $BASE/modules:$BASE/automatic-modules \
    -d $BASE/modules/com.example.acme $(find src/main/java -name "*.java")
cp -r src/main/resources/* $BASE/modules/com.example.acme
cd $BASE

Run it as before, and you should get a nice error message. But what’s that:

Validation error: {com.example.acme.PassengersDontExceedSeatCount.message}

It seems the error message wasn’t resolved properly, so the raw, un-interpolated message key from the annotation definition has been returned. Now why is this?

Bean Validation error messages are loaded through java.util.ResourceBundle, and due to the strong encapsulation of the modularized environment the Hibernate Validator module cannot "see" the resource bundle provided in the example module.

The updated JavaDocs of ResourceBundle make it clear that only bundles located in the same module as the caller of ResourceBundle#getBundle() can be accessed. In order to access resource bundles from other modules, the service loader mechanism is to be used as of Java 9; a new SPI interface, ResourceBundleProvider, has been added to the JDK for that purpose.
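As a rough idea of that SPI, a provider implementation could look like the following. This is a hypothetical sketch with illustrative names only; in practice the JDK derives a bundle-specific provider interface from the bundle’s base name, which the provider module implements and declares via provides ... with ... in its module descriptor:

```java
import java.util.Locale;
import java.util.ResourceBundle;
import java.util.spi.ResourceBundleProvider;

// Hypothetical provider exposing the ValidationMessages bundle of its own
// module to callers living in other modules
public class MyValidationMessagesProvider implements ResourceBundleProvider {

    @Override
    public ResourceBundle getBundle(String baseName, Locale locale) {
        // Only serve the bundle this module actually owns
        if ("ValidationMessages".equals(baseName)) {
            // The lookup originates from this module, so the bundle is visible
            return ResourceBundle.getBundle(baseName, locale);
        }
        return null;
    }
}
```

The example module could then declare such a provider in its module-info.java, making its bundle reachable without resorting to qualified exports.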

Ultimately, Bean Validation should take advantage of that mechanism, but how can we make things work out for now? As it turns out, Hibernate Validator has its own extension point for customizing the retrieval of resource bundles, ResourceBundleLocator.

This comes in very handy now: we just need to create an implementation of that SPI in the example module:

sources/com.example.acme/src/main/java/com/example/acme/internal/MyResourceBundleLocator.java
package com.example.acme.internal;

import java.util.Locale;
import java.util.ResourceBundle;

import org.hibernate.validator.spi.resourceloading.ResourceBundleLocator;

public class MyResourceBundleLocator implements ResourceBundleLocator {

    @Override
    public ResourceBundle getResourceBundle(Locale locale) {
        return ResourceBundle.getBundle( "ValidationMessages", locale );
    }
}

When bootstrapping the validator factory, configure a message interpolator using that bundle locator like this:

sources/com.example.acme/src/main/java/com/example/acme/ValidationTest.java
import org.hibernate.validator.messageinterpolation.ResourceBundleMessageInterpolator;
import com.example.acme.internal.MyResourceBundleLocator;

...

Validator validator = Validation.byDefaultProvider()
    .configure()
    .messageInterpolator( new ResourceBundleMessageInterpolator( new MyResourceBundleLocator() ) )
    .buildValidatorFactory()
    .getValidator();

As the call to ResourceBundle#getBundle() now originates from the same module that declares the ValidationMessages bundle, the bundle can be found and the error message will be interpolated correctly. Success!

Keeping your privacy

With the custom constraint in place, let’s think about encapsulation a bit more. Wouldn’t it be nice if the constraint validator implementation didn’t live in the exported package but rather somewhere under internal? After all, that class is an implementation detail and should not be referenced directly by users of the @PassengersDontExceedSeatCount constraint.

Another feature of Hibernate Validator is helpful here: service-loader based discovery of constraint validators.

This allows us to remove the reference from the constraint annotation to its validator (just specify an empty validator list: @Constraint(validatedBy = { })) and relocate the validator implementation to the internal package:

mv sources/com.example.acme/src/main/java/com/example/acme/PassengersDontExceedSeatCountValidator.java \
    sources/com.example.acme/src/main/java/com/example/acme/internal

Also adapt the package declaration in the source file accordingly and add an import for the Car type. We then need to declare the constraint validator as service provider in the module descriptor:

sources/com.example.acme/src/main/java/module-info.java
module com.example.acme {
    ...

    provides javax.validation.ConstraintValidator
        with com.example.acme.internal.PassengersDontExceedSeatCountValidator;
}

Compile and run the example module again. You should get an error like this:

java.lang.IllegalAccessException: class org.hibernate.validator.internal.util.privilegedactions.NewInstance (in module org.hibernate.validator.engine) cannot access class com.example.acme.internal.PassengersDontExceedSeatCountValidator (in module com.example.acme) because module com.example.acme does not export com.example.acme.internal to module org.hibernate.validator.engine

This originates from the fact that Hibernate Validator is using the service loader mechanism only for detecting validator types and then instantiates them for each specific constraint usage. As the internal package has not been exported, this instantiation is bound to fail. You have two options now:

  • Use a qualified export in module-info.java to expose that package to the Hibernate Validator module

  • Use the new -XaddExports option of the java command to dynamically add this export when running the module

Following the latter approach, the java invocation would look like this:

java \
    -modulepath modules:automatic-modules \
    -XaddExports:com.example.acme/com.example.acme.internal=org.hibernate.validator.engine \
    -m com.example.acme/com.example.acme.ValidationTest

While this approach works, it can become a bit tedious once other libraries enter the picture that need to perform reflective operations on non-exported types. JPA providers such as Hibernate ORM and dependency injection frameworks are just two examples.

Luckily, the OpenJDK team is aware of that issue and there is an entry for it in the requirements list for the Java Module System: ReflectiveAccessToNonExportedTypes. I sincerely hope that this one gets addressed before Java 9 gets finalized.

XML configuration

As the last part of our journey of "Jigsaw-ifying" Bean Validation, let’s take a look at XML-based configuration of constraints. This is a useful alternative if you cannot add constraint metadata to a model via annotations or e.g. want to override existing annotation-based constraints externally.

The Bean Validation spec defines a validation mapping file for this, which in turn can point to one or more constraint mapping XML files. Create the following files in order to override the @Min constraint of the Car class:

sources/com.example.acme/src/main/resources/META-INF/validation.xml
<?xml version="1.0" encoding="UTF-8"?>

<validation-config
    xmlns="http://jboss.org/xml/ns/javax/validation/configuration"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://jboss.org/xml/ns/javax/validation/configuration">

    <constraint-mapping>META-INF/constraints-car.xml</constraint-mapping>
</validation-config>
sources/com.example.acme/src/main/resources/META-INF/constraints-car.xml
<?xml version="1.0" encoding="UTF-8"?>

<constraint-mappings
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://jboss.org/xml/ns/javax/validation/mapping validation-mapping-1.1.xsd"
    xmlns="http://jboss.org/xml/ns/javax/validation/mapping" version="1.1">

    <bean class="com.example.acme.Car" ignore-annotations="true">
        <field name="seatCount">
            <constraint annotation="javax.validation.constraints.Min">
                <element name="value">2</element>
            </constraint>
        </field>
    </bean>
</constraint-mappings>

Traditionally, Bean Validation looks for META-INF/validation.xml on the classpath and resolves any linked constraint mapping files relative to it. If you’ve followed this article that far, you won’t be surprised that this is not going to work with Jigsaw. There is no notion of a "flat classpath" any longer, and thus a validation provider cannot see XML files in other modules, akin to the case of error message bundles discussed above.

More specifically, the method ClassLoader#getResourceAsStream(), which is used by Hibernate Validator to open mapping files, won’t work for named modules as of JDK 9. That change will be a tough nut to crack for many projects when migrating to Java 9, as it renders strategies for resource loading known from existing modular environments such as OSGi inoperable. E.g. Hibernate Validator allows passing in a classloader for loading user-provided resources. In OSGi this can be used to pass in what’s called the bundle loader, allowing Hibernate Validator to access constraint mapping files and other things provided by the user. Unfortunately, this pattern cannot be employed with Jigsaw, as getResourceAsStream() "does not find resources in named modules".

But Bean Validation has a way out for this issue, too, as it allows passing in constraint mappings as InputStreams opened by the bootstrapping code. Class#getResourceAsStream() continues to work for resources from the same module, so things will work out as expected when bootstrapping the validator factory like this (don’t forget to close the stream afterwards):

sources/com.example.acme/src/main/java/com/example/acme/ValidationTest.java
InputStream constraintMapping = ValidationTest.class.getResourceAsStream( "/META-INF/constraints-car.xml" );

Validator validator = Validation.byDefaultProvider()
    .configure()
    .addMapping( constraintMapping )
    .buildValidatorFactory()
    .getValidator();

That way the constraint mapping is opened by code from the same module and thus can be accessed and passed to the Bean Validation provider.

In the longer term, APIs such as Bean Validation should foresee some kind of SPI contract for conveying all required configuration information, as suggested by Jigsaw spec lead Mark Reinhold. The user would then expose an instance of that contract via a service implementation.

Wrapping it up

With that, we conclude our experiment of making Bean Validation ready for the Jigsaw module system. Overall, things work out pretty well, and without too much effort Bean Validation and Hibernate Validator can be made first-class citizens in a fully-modularized world.

Some observations from the experiment:

  • The lack of optional module requirements poses a usability issue in my opinion, as it prevents libraries from exposing additional functionality based on what classes and modules are present at runtime in a given environment. This means that users of the library either need to provide modules they don’t actually need or - if the library has been cut into several modules, one per optional dependency - need to add several modules where they could have worked with a single one before.

  • The need to explicitly expose internal packages for reflective access by libraries such as Hibernate Validator, but also JPA providers or DI containers, can become tedious. I hope there will be a way to enable such access in a more global way, e.g. by whitelisting "trustworthy modules" such as the aforementioned libraries.

  • The changed behaviors around loading of resources such as configuration files or resource bundles provided by the user of a library will potentially affect many applications when migrating to Java 9. The established pattern of accepting an external classloader for loading user resources will not work anymore, so libraries need to adapt either by providing dedicated extension points (akin to addMapping(InputStream) in Bean Validation) or by migrating to service based approaches as envisioned by the makers of Jigsaw.

  • Tools such as Maven (including plug-ins, e.g. Surefire for running tests) or Gradle, but also IDEs, still need to catch up with Jigsaw. Using plain javac and java can be fun for a while, but you quickly start wishing for more powerful tools again :)

  • Converting Hibernate Validator into a Jigsaw module is relatively easy, as we luckily were very careful about a proper API/implementation split from the beginning. Modularizing existing libraries or applications without such clear distinction will be a much tougher exercise, as it may require lots of types to be moved around and unwanted (package) dependencies to be broken up. There are some tools that can help with that, but that might be a topic for a future blog post by itself.

One thing is for sure: interesting times lie ahead! While migration might be painful here and there, I think it’s overdue that Java gets its proper module system and I look forward to seeing it as an integrated part of the platform very much.

Got feedback from following the steps described above or from your own experiments with Jigsaw? Let's all learn from each other's experiences and insights, so please share any thoughts on the topic below.

Many thanks to Sander Mak, Sanne Grinovero and Guillaume Smet for reviewing this post!

Emulating property literals with Java 8 method references

Posted by    |       |    Tagged as Discussions

One of the things library developers often miss in Java is property literals. In this post I’m going to show how to make creative use of Java 8 method references to emulate property literals, with the help of some byte code generation.

Akin to class literals (e.g. Customer.class), property literals would allow referring to the properties of a bean class in a type-safe manner. This would be useful for designing APIs that run actions on specific bean properties or apply some means of configuration to them. E.g. consider the API for programmatic configuration of index mappings in Hibernate Search:

new SearchMapping().entity( Address.class )
    .indexed()
    .property( "city", ElementType.METHOD )
        .field();

Or the validateValue() method from the Bean Validation API for validating a value against the constraints of a single property:

Set<ConstraintViolation<Address>> violations = validator.validateValue( Address.class, "city", "Purbeck" );

In both cases a String is used to represent the city property of the Address class.

That’s error-prone on several levels:

  • The Address class might not have a property city at all. Or one could forget to update that String reference when renaming the property.

  • In the case of validateValue(), there is no way to make sure that the passed value actually satisfies the type of the city property.

Users of the APIs will only find out about these issues when actually running their application. Wouldn’t it be nice if instead the compiler and the language’s type system prevented such wrong usages from the beginning? If Java had property literals, that’d be exactly what you’d get (invented syntax, this does not compile!):

mapping.entity( Address.class )
    .indexed()
    .property( Address::city, ElementType.METHOD )
        .field();

And:

validator.validateValue( Address.class, Address::city, "Purbeck" );

The issues mentioned above would be avoided: a typo in a property literal would cause a compilation error which you’d notice right in your IDE. It’d allow designing the configuration API in Hibernate Search in a way that accepts only properties of Address when configuring the Address entity. And in the case of Bean Validation’s validateValue(), literals could help ensure that only values assignable to the property type in question can be passed.

Java 8 method references

While Java 8 has no real property literals (and their introduction isn’t planned for Java 9 either), it provides an interesting way to emulate them to some degree: method references. Introduced to improve the developer experience when working with lambda expressions, method references can also be leveraged as a poor man’s property literal.

The idea is to consider a reference to a getter method as a property literal:

validator.validateValue( Address.class, Address::getCity, "Purbeck" );

Obviously, this will only work if there actually is a getter. But if your classes are following JavaBeans conventions - which often is the case - that’s fine.

Now what would the definition of the validateValue() method look like? The key is using the new Function type:

public <T, P> Set<ConstraintViolation<T>> validateValue(Class<T> type, Function<? super T, P> property, P value);

By using two type parameters, we make sure that the bean type, the property and the value passed for the property all correctly match. So API-wise, we got what we need: It’s safe to use, and the IDE will even auto-complete method names after starting to write Address::. But how do we derive the property name from the Function in the implementation of validateValue()?
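The guarantee those two type parameters buy can be seen in a small self-contained sketch; the describe() method and this Address stub are hypothetical stand-ins for the actual API, not part of Bean Validation:

```java
import java.util.function.Function;

public class TypedPropertyDemo {

    public static class Address {
        public String getCity() { return "Purbeck"; }
    }

    // The two type parameters tie bean type, property and value together:
    // P is inferred from the getter's return type, so the value must match it.
    public static <T, P> String describe(Class<T> type, Function<? super T, P> property, P value) {
        return type.getSimpleName() + ": " + value;
    }

    public static void main(String[] args) {
        // Compiles: getCity() returns String, and "Purbeck" is a String
        System.out.println( describe( Address.class, Address::getCity, "Purbeck" ) );
        // describe( Address.class, Address::getCity, 42 ) would be rejected by the compiler,
        // as an int value doesn't match the property type String
    }
}
```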

That’s where the fun begins, as the Function interface just defines a single method, apply(), wich runs the function against a given instance of T. That seems not exactly helpful, does it?

ByteBuddy to the rescue

As it turns out, applying the function actually does the trick! By creating a proxy instance of the T type, we have a target for invoking the method and can obtain its name in the proxy’s method invocation handler.

Java comes with support for dynamic proxies out of the box, but that’s limited to proxying interfaces only. As our API should work with any kind of bean - not only interfaces but also actual classes - I’m going to use a neat tool called ByteBuddy instead. ByteBuddy provides an easy-to-use DSL for creating classes on the fly, which is exactly what we need.
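To see the name-capturing idea in isolation, here is a minimal sketch using only the JDK’s built-in Proxy. It works for interfaces only, which is exactly the limitation that makes ByteBuddy necessary for the general case; the class and method names here are hypothetical:

```java
import java.lang.reflect.Proxy;
import java.util.function.Function;

public class ProxyCaptureDemo {

    public interface Named {
        String getName();
    }

    // Applies the function to a throw-away proxy and records the invoked method's name
    public static <T> String capturedMethodName(Class<T> iface, Function<? super T, ?> property) {
        final String[] captured = new String[1];

        Object proxy = Proxy.newProxyInstance(
                iface.getClassLoader(),
                new Class<?>[] { iface },
                (p, method, args) -> {
                    captured[0] = method.getName();
                    return null; // the return value of the getter is irrelevant
                } );

        property.apply( iface.cast( proxy ) );
        return captured[0];
    }

    public static void main(String[] args) {
        System.out.println( capturedMethodName( Named.class, Named::getName ) ); // getName
    }
}
```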

Let’s begin by defining an interface which just allows to store and obtain the name of a property we obtained from a method reference:

public interface PropertyNameCapturer {

    String getPropertyName();

    void setPropertyName(String propertyName);
}

Now let’s use ByteBudy to programmatically create a proxy class which is assignable to the type of interest (e.g. Address) and PropertyNameCapturer:

public <T> T /* & PropertyNameCapturer */ getPropertyNameCapturer(Class<T> type) {
    DynamicType.Builder<?> builder = new ByteBuddy()                                       (1)
            .subclass( type.isInterface() ? Object.class : type );

    if ( type.isInterface() ) {                                                            (2)
        builder = builder.implement( type );
    }

    Class<?> proxyType = builder
        .implement( PropertyNameCapturer.class )                                           (3)
        .defineField( "propertyName", String.class, Visibility.PRIVATE )
        .method( ElementMatchers.any() )                                                   (4)
            .intercept( MethodDelegation.to( PropertyNameCapturingInterceptor.class ) )
        .method( named( "setPropertyName" ).or( named( "getPropertyName" ) ) )             (5)
            .intercept( FieldAccessor.ofBeanProperty() )
        .make()
        .load(                                                                             (6)
             PropertyNameCapturer.class.getClassLoader(),
             ClassLoadingStrategy.Default.WRAPPER
        )
        .getLoaded();

    try {
        @SuppressWarnings("unchecked")
        Class<T> typed = (Class<T>) proxyType;
        return typed.newInstance();                                                        (7)
    }
    catch (InstantiationException | IllegalAccessException e) {
        throw new HibernateException(
            "Couldn't instantiate proxy for method name retrieval", e
        );
    }
}

The code may appear a bit dense, so let me run you through it. First we obtain a new ByteBuddy instance (1) which is the entry point into the DSL. It is used to create a new dynamic type that either extends the given type (if it is a class) or extends Object and implements the given type if it is an interface (2).

Next, we let the type implement the PropertyNameCapturer interface and add a field for storing the name of the specified property (3). Then we say that invocations to all methods should be intercepted by PropertyNameCapturingInterceptor (we’ll come to that in a moment) (4). Only setPropertyName() and getPropertyName() (as declared in the PropertyNameCapturer interface) should be routed to write and read access of the field created before (5). Finally, the class is built, loaded (6) and instantiated (7).

That’s all that’s needed to create the proxy type; Thanks to ByteBuddy, this is done in a few lines of code. Now let’s take a look at the interceptor we configured before:

public class PropertyNameCapturingInterceptor {

    @RuntimeType
    public static Object intercept(@This PropertyNameCapturer capturer, @Origin Method method) {         (1)
        capturer.setPropertyName( getPropertyName( method ) );                                           (2)

        if ( method.getReturnType() == byte.class ) {                                                    (3)
            return (byte) 0;
        }
        else if ( ... ) {
            // ... handle all other primitive types analogously
        }
        else {
            return null;
        }
    }

    private static String getPropertyName(Method method) {                                               (4)
        final boolean hasGetterSignature = method.getParameterTypes().length == 0
                && method.getReturnType() != void.class;

        String name = method.getName();
        String propName = null;

        if ( hasGetterSignature ) {
            if ( name.startsWith( "get" ) ) {
                propName = name.substring( 3, 4 ).toLowerCase() + name.substring( 4 );
            }
            else if ( name.startsWith( "is" ) ) {
                propName = name.substring( 2, 3 ).toLowerCase() + name.substring( 3 );
            }
        }
        else {
            throw new HibernateException( "Only property getter methods are expected to be passed" );    (5)
        }

        return propName;
    }
}

intercept() accepts the Method being invoked as well as the target of the invocation (1). The annotations @Origin and @This are used to designate the respective parameters so ByteBuddy can generate the correct invocations of intercept() into the dynamic proxy type.

Note that there is no strong dependency from this interceptor to any types of ByteBuddy, meaning that ByteBuddy is only needed when creating that dynamic proxy type but not later on, when actually using it.

Via getPropertyName() (4) we then obtain the name of the property represented by the passed method object and store it in the PropertyNameCapturer (2). If the given method doesn’t represent a getter method, an exception is raised (5). The return value of the invoked getter is irrelevant, so we just make sure to return a sensible "null value" matching the property type (3).
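The prefix stripping done by getPropertyName() is just the usual JavaBeans naming rule; a stand-alone version of that rule alone (class and method names hypothetical) looks like this:

```java
public class PropertyNames {

    // Strips the "get"/"is" prefix and decapitalizes the first remaining character,
    // following the JavaBeans naming convention; returns null for non-getter names
    public static String fromGetterName(String methodName) {
        if ( methodName.startsWith( "get" ) && methodName.length() > 3 ) {
            return Character.toLowerCase( methodName.charAt( 3 ) ) + methodName.substring( 4 );
        }
        else if ( methodName.startsWith( "is" ) && methodName.length() > 2 ) {
            return Character.toLowerCase( methodName.charAt( 2 ) ) + methodName.substring( 3 );
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println( fromGetterName( "getCity" ) );   // city
        System.out.println( fromGetterName( "isEnabled" ) ); // enabled
    }
}
```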

With that, we got everything in place to get hold of the property represented by a method reference passed to validateValue():

public <T, P> Set<ConstraintViolation<T>> validateValue(Class<T> type, Function<? super T, P> property, P value) {
    T capturer = getPropertyNameCapturer( type );
    property.apply( capturer );
    String propertyName = ( (PropertyNameCapturer) capturer ).getPropertyName();

    // perform validation of the property value...
}

When applying the function to the property name capturing proxy, the interceptor will kick in, obtain the property name from the Method object and store it in the capturer instance, from where it can be retrieved finally.

And there you have it, some byte code magic lets us make creative use of Java 8 method references for emulating property literals.

That said, having real property literals as part of the language (dreaming for a moment, maybe Java 10?) would still be very beneficial. It’d allow dealing with private properties and, hopefully, one could refer to property literals from within annotations. Real property literals would also be more concise (no "get" prefix) and it’d generally feel a tad less hackish ;)

During my talk at VoxxedVienna on using Hibernate Search with Elasticsearch earlier this week, there was an interesting question which I couldn’t answer right away:

"When running a full-text query with a projection of fields, is it possible to return the result as a list of POJOs rather than as a list of arrays of Object?"

The answer is: Yes, it is possible, result transformers are the right tool for this.

Let’s assume you want to convert the result of a projection query against the VideoGame entity shown in the talk into the following DTO (data transfer object):

public static class VideoGameDto {

    private String title;
    private String publisherName;
    private Date release;

    public VideoGameDto(String title, String publisherName, Date release) {
        this.title = title;
        this.publisherName = publisherName;
        this.release = release;
    }

    // getters...
}

This is how you could do it via a result transformer:

FullTextEntityManager ftem = ...;

QueryBuilder qb = ftem.getSearchFactory()
    .buildQueryBuilder()
    .forEntity( VideoGame.class )
    .get();

FullTextQuery query = ftem.createFullTextQuery(
    qb.keyword()
        .onField( "tags" )
        .matching( "round-based" )
        .createQuery(),
    VideoGame.class
    )
    .setProjection( "title", "publisher.name", "release" )
    .setResultTransformer( new BasicTransformerAdapter() {
        @Override
        public VideoGameDto transformTuple(Object[] tuple, String[] aliases) {
            return new VideoGameDto( (String) tuple[0], (String) tuple[1], (Date) tuple[2] );
        }
    } );

List<VideoGameDto> results = query.getResultList();

I’ve pushed this example to the demo repo on GitHub.

There are also some ready-made implementations of the ResultTransformer interface which you might find helpful. So be sure to check out its type hierarchy. For this example I found it easiest to extend BasicTransformerAdapter and implement the transformTuple() method by hand.

To the person asking the question: Thanks, and I hope this answer is helpful to you!

Hibernate Validator 5.2.4.Final is out

Posted by    |       |    Tagged as Hibernate Validator Releases

It’s my pleasure to announce the release of Hibernate Validator 5.2.4.Final!

This is a rather small bugfix release which addresses two nasty issues around one of the more advanced features of Bean Validation, redefined default group sequences.

Please refer to the issues themselves (HV-1055, HV-1057) for all the details.

Where do I get it?

Use the GAV coordinates org.hibernate:{hibernate-validator|hibernate-validator-cdi|hibernate-validator-annotation-processor}:5.2.4.Final to fetch the release with Maven, Gradle etc. Alternatively, you can find distribution bundles containing all the bits on SourceForge (TAR.GZ, ZIP).

Found a bug? Have a feature request? Then let us know through the following channels:

It’s my pleasure to announce the release of Hibernate Validator 5.2.3.Final!

Wait, didn’t we already do another Hibernate Validator release earlier this month? That’s right, indeed we pushed out the first Alpha of the 5.3 family a couple of days ago. And normally, that’d mean that there would be no further releases of earlier version families.

But in this case we decided to make an exception to the rule, as we noticed that Hibernate Validator couldn’t be used with Java 9 (check out issue HV-1048 if you are interested in the details). As we don’t want to keep integrators and users of Hibernate Validator from testing their own software on Java 9, we decided to fix that issue on the current stable release line (in fact we strongly encourage you to test your applications on Java 9 to learn as early as possible about any potential changes you might need to make).

While we were at it, we backported some further bugfixes from 5.3 to 5.2, amongst them one ensuring compatibility with Google App Engine. As always, you can find the complete list of fixes in the changelog.

Where do I get it?

Use the GAV coordinates org.hibernate:{hibernate-validator|hibernate-validator-cdi|hibernate-validator-annotation-processor}:5.2.3.Final to fetch the release with Maven, Gradle etc. Alternatively, you can find distribution bundles containing all the bits on SourceForge (TAR.GZ, ZIP).

Found a bug? Have a feature request? Then let us know through the following channels:

Hibernate Validator 5.3.0.Alpha1 is out

Posted by    |       |    Tagged as Hibernate Validator Releases

It’s my pleasure to announce the first release of Hibernate Validator 5.3!

The overarching idea for the 5.3 timeline is to prototype several features which may potentially be standardized in the Bean Validation 2.0 specification. For instance we’ll work on a solution for the long-standing request for sorting the constraints on single properties.

If you’d like to see any specific features addressed in that prototyping work (and eventually included in BV 2.0), then please get in touch and let us know which are the most important things you are missing from the spec. We’ve compiled a first list of issues we are considering for inclusion in BV 2.0. For sure we cannot address all of them, so it’ll greatly help if you tell us what would be most helpful to you.

Dynamic payloads for constraints

To get things rolling, the Alpha 1 release allows you to enrich custom constraint violations with additional context data. Code examining constraint violations can access and interpret this data in a safer way than by parsing string-based constraint violation messages. Think of it as a dynamic variant of the existing Bean Validation payload feature.

As an example, let’s assume we have a constraint @Matches for making sure a long property matches a given value with some tolerance:

public static class Package {

    @Matches(value=1000, tolerance=100)
    public long weight;
}

If the annotated value is invalid, the resulting constraint violation should have a specific severity, depending on whether the value lies within the given tolerance or not. That severity value could then for instance be used for formatting the error specifically in a UI.

The definition of the @Matches constraint is nothing new, it’s just a regular custom constraint annotation:

@Retention(RUNTIME)
@Constraint(validatedBy = { MatchesValidator.class })
@Documented
public @interface Matches {

    public enum Severity { WARN, ERROR; }

    String message() default "Must match {value} with a tolerance of {tolerance}";
    Class<?>[] groups() default {};
    Class<? extends Payload>[] payload() default {};

    long value();
    long tolerance();
}

The constraint validator is where it’s getting interesting:

public class MatchesValidator implements ConstraintValidator<Matches, Long> {

    private long tolerance;
    private long value;

    @Override
    public void initialize(Matches constraintAnnotation) {
        this.value = constraintAnnotation.value();
        this.tolerance = constraintAnnotation.tolerance();
    }

    @Override
    public boolean isValid(Long value, ConstraintValidatorContext context) {
        if ( value == null ) {
            return true;
        }

        if ( this.value == value.longValue() ) {
            return true;
        }

        HibernateConstraintValidatorContext hibernateContext = context.unwrap(
                HibernateConstraintValidatorContext.class
        );
        hibernateContext.withDynamicPayload(
                Math.abs( this.value - value ) < tolerance ? Severity.WARN : Severity.ERROR
        );

        return false;
    }
}

In isValid() the severity object is set via HibernateConstraintValidatorContext#withDynamicPayload(). Note that the payload object must be serializable in case constraint violations are sent to remote clients, e.g. via RMI.
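The severity decision itself can be isolated into a tiny helper to make the boundary behavior explicit; the class and method names here are hypothetical, but the condition mirrors the one in isValid() above:

```java
public class SeverityDemo {

    public enum Severity { WARN, ERROR }

    // Deviation strictly within the tolerance -> WARN, otherwise -> ERROR,
    // mirroring the ternary expression in MatchesValidator#isValid()
    public static Severity severityFor(long expected, long tolerance, long actual) {
        return Math.abs( expected - actual ) < tolerance ? Severity.WARN : Severity.ERROR;
    }

    public static void main(String[] args) {
        System.out.println( severityFor( 1000, 100, 1050 ) ); // WARN  (off by 50)
        System.out.println( severityFor( 1000, 100, 1200 ) ); // ERROR (off by 200)
    }
}
```

Note that the comparison is strict, so a deviation of exactly the tolerance value (e.g. off by 100 here) already yields ERROR.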

Validation clients may then access the dynamic payload like so:

HibernateConstraintViolation<?> violation = violations.iterator()
    .next()
    .unwrap( HibernateConstraintViolation.class );

if ( violation.getDynamicPayload( Severity.class ) == Severity.ERROR ) {
    // ...
}

What else is there?

Other features of the Alpha 1 release are improved OSGi support (many thanks to Benson Margulies for this!), optional relaxation of parameter constraints (kudos to Chris Beckey!) and several bug fixes and improvements, amongst them support for cross-parameter constraints in the annotation processor (cheers to Nicola Ferraro!).

You can find the complete list of all addressed issues in the change log. To get the release with Maven, Gradle etc. use the GAV coordinates org.hibernate:{hibernate-validator|hibernate-validator-cdi|hibernate-validator-annotation-processor}:5.3.0.Alpha1. Alternatively, a distribution bundle containing all the bits is provided on SourceForge (TAR.GZ, ZIP).

To get in touch, use the following channels:

Hibernate Validator 5.2.2 released

Posted by    |       |    Tagged as Hibernate Validator Releases

I am happy to announce the availability of Hibernate Validator 5.2.2.Final.

This release fixes several bugs, including a nasty regression around private property declarations in inheritance hierarchies and a tricky issue related to classloading in environments such as OSGi.

We also closed a gap in the API for constraint declaration which allows to ignore annotation-based constraints for specific methods, parameters etc.:

HibernateValidatorConfiguration config = Validation.byProvider( HibernateValidator.class ).configure();

ConstraintMapping mapping = config.createConstraintMapping();
mapping.type( OrderService.class )
    .method( "placeOrder", Item.class, int.class )
        .ignoreAnnotations( true )
        .parameter( 0 )
            .ignoreAnnotations( false );
config.addMapping( mapping );

Validator validator = config.buildValidatorFactory().getValidator();

Please refer to the change log for the complete list of all issues. You can get the release with Maven, Gradle etc. using the GAV coordinates org.hibernate:hibernate-validator:5.2.2.Final. Alternatively, a distribution bundle is provided on SourceForge (TAR.GZ, ZIP).

Get in touch through the following channels:

back to top