In Relation To Discussions

Hibernate Community Newsletter 15/2017


Welcome to the Hibernate community newsletter in which we share blog posts, forum, and StackOverflow questions that are especially relevant to our users.

Articles

On Baeldung, you can find a very good article about JPA and Hibernate pagination. While JPA 2.2 defines support for Java 1.8 Stream operations, pagination is still the preferred way to control the amount of data being fetched.
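
As a quick illustration, here is a minimal sketch of offset-based pagination with plain JPA (the Post entity and the page boundaries are assumptions):

List<Post> page = entityManager.createQuery(
    "select p from Post p order by p.id", Post.class)
.setFirstResult(20)  // skip the first 20 rows
.setMaxResults(10)   // fetch at most 10 rows
.getResultList();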

Have you ever wondered how you can map a JPA many-to-many association with an extra column? If you are interested in the best way to map such a relationship, then you should definitely read this article.
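
For context, the approach usually boils down to mapping the join table as a dedicated entity. A condensed sketch, with hypothetical Post and Tag entities and a createdOn extra column:

@Entity
public class PostTag {

    @EmbeddedId
    private PostTagId id; // @Embeddable holding postId and tagId

    @ManyToOne(fetch = FetchType.LAZY)
    @MapsId("postId")
    private Post post;

    @ManyToOne(fetch = FetchType.LAZY)
    @MapsId("tagId")
    private Tag tag;

    // the extra column carried by the join table
    private LocalDateTime createdOn = LocalDateTime.now();

    // constructors and equals/hashCode based on id omitted for brevity
}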

If you’re using a relational database, then you should be using a connection pool as well. Check out this article for an performance analysis of the most common Java connection pools. You will also need connection pool monitoring, and the FlexyPool open source framework allows you to do so.

Hibernate offers a dirty checking mechanism which automatically detects changes to managed entities. While the default mechanism is suitable for most use cases, you can even customize it as explained in this article.
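
As a quick illustration of the default mechanism (entity name assumed), no explicit save or update call is needed; changes to managed entities are detected at flush time:

entityManager.getTransaction().begin();

Post post = entityManager.find(Post.class, 1L);
post.setTitle("High-Performance Java Persistence");

// at commit time, Hibernate compares the loaded snapshot with the
// current state and issues an UPDATE only if something has changed
entityManager.getTransaction().commit();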

If you ever wondered why you got the HHH000179: Narrowing proxy to class this operation breaks == warning message, or wondered how to fix it, then you should read this article written by Marcin Chwedczuk.

Traditionally, storing EAV (Entity-Attribute-Value) data in a RDBMS has required various tricks to handle multiple value types. Now that most relational databases support JSON column types, you can use a custom Hibernate type to store your EAV data as a JsonNode object. Check out this article for a step-by-step tutorial that will show you how you can accomplish this task.
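
To give a flavor of the mapping (a sketch, not the article's exact code): assuming a custom Hibernate type for Jackson's JsonNode, registered here under the name "jsonb", the EAV payload collapses to a single JSON attribute:

@Entity
@TypeDef(name = "jsonb", typeClass = JsonNodeBinaryType.class) // the type class is an assumption
public class Book {

    @Id
    private Long id;

    @Type(type = "jsonb")
    @Column(columnDefinition = "jsonb")
    private JsonNode properties; // arbitrary attribute/value pairs stored as JSON
}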

Joe Nelson wrote a great article about the differences between the various SQL isolation levels, with examples of phenomena like read skew, write skew, or lost updates.
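
If you want to reproduce these phenomena yourself, the isolation level can be switched per connection via plain JDBC (sketch; connection acquisition is assumed):

try (Connection connection = dataSource.getConnection()) {
    // run the next transaction under SERIALIZABLE instead of the default level
    connection.setTransactionIsolation(Connection.TRANSACTION_SERIALIZABLE);
    connection.setAutoCommit(false);
    // ... execute statements and observe which anomalies still occur ...
    connection.commit();
}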

Thorben Janssen gives you some tips about mapping Many-To-One and One-To-Many associations. For more details, check out the best way to map a @OneToMany relationship with JPA and Hibernate article as well.
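
One recurring tip in this area is to make the @ManyToOne side lazy, since EAGER is the JPA default there. A minimal sketch (entity names assumed):

@Entity
public class PostComment {

    @Id
    @GeneratedValue
    private Long id;

    // LAZY avoids fetching the parent Post every time a comment is loaded
    @ManyToOne(fetch = FetchType.LAZY)
    private Post post;
}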

Questions and answers

Hibernate Community Newsletter 14/2017


Welcome to the Hibernate community newsletter in which we share blog posts, forum, and StackOverflow questions that are especially relevant to our users.

Articles

In this article, Arpit Jain writes about the differences between persist and merge in relation to JPA transaction boundaries. For more details about the persist and merge entity state transitions, check out this article as well.

For functional programming aficionados, TehDev wrote a very interesting article about refactoring towards a transaction monad.

If you’re using Payara Server, check out this article about how you can integrate it with Hibernate 5.

Baeldung published an article about the differences between persist, merge, update, as well as saveOrUpdate.
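
In a nutshell (sketch, Post entity assumed): persist makes the given instance managed, while merge copies the state of a detached instance onto a managed copy and returns that copy:

Post post = new Post("JPA pitfalls");
entityManager.persist(post);      // post itself becomes managed

entityManager.detach(post);
post.setTitle("JPA pitfalls, revisited");

Post managed = entityManager.merge(post); // returns a managed copy
assert managed != post;                   // the detached instance stays detached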

If you’re using Grails, Michael Scharhag shows you how you can make use of Hibernate filters.

JPA 2.2 has been released, but Hibernate has been supporting Java 1.8 Date and Time types and Hibernate-specific @Repeatable annotations for a while, and, since 5.1, Java 1.8 streams are supported as well.
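
With that support in place, java.time types map directly, without any AttributeConverter (sketch):

@Entity
public class Meeting {

    @Id
    private Long id;

    // mapped natively to DATE / TIMESTAMP columns
    private LocalDate scheduledOn;
    private LocalDateTime createdAt;
}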

If you’re using MySQL, Thorben Janssen has written a list of tips to take into consideration when using Hibernate. If you are interested in more details, then check out the following articles as well:

Debezium is an open-source project developed by Red Hat which allows you to capture transaction events from relational databases like MySQL and PostgreSQL, or NoSQL solutions such as MongoDB, and push them to Apache Kafka. For more details, check out this tutorial about using Debezium, MySQL, and Kafka.

Hibernate Community Newsletter 13/2017


Welcome to the Hibernate community newsletter in which we share blog posts, forum, and StackOverflow questions that are especially relevant to our users.

Presentations

Check out the Virtual JUG presentation about High-Performance Java Persistence and Hibernate. If you are using a relational database, then you should definitely watch this session and learn how to run your enterprise application at warp speed.

Articles

The @Transactional annotation allows for a clear separation between business logic and transaction handling. However, just because you are using a very convenient abstraction, it does not mean you don’t have to understand how it works behind the scenes. Check out the "how does Spring @Transactional really work" article from jHades to learn more about this topic.

If you’re using Payara Java EE Application Server and want to make it work with Hibernate 5, then you should definitely check out this tutorial.

I also wrote three articles that cover JPA 2.2 Date and Time types, Hibernate Array types, as well as CDC using Debezium.

Time to upgrade

Hibernate Validator 6.0.0.CR1 is out with Bean Validation 2.0.0.CR1 support.

Hibernate ORM 5.1.8 has been released, so, if you’re using the 5.1 branch, you should definitely give it a try.

Hibernate Community Newsletter 12/2017


Welcome to the Hibernate community newsletter in which we share blog posts, forum, and StackOverflow questions that are especially relevant to our users.

Presentations

Don’t miss the Virtual JUG presentation about High-Performance Java Persistence and Hibernate. If you are using a relational database, then you should definitely attend this session, and, the best thing about it, you can watch it in the comfort of your home.

Articles

The pick of this edition is this article by Arnold Galovics which reiterates the benefits of using projections when fetching data.
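
For reference, a JPQL constructor-expression projection looks like this (the DTO and entity names are assumptions):

List<PostSummary> summaries = entityManager.createQuery(
    "select new com.example.PostSummary(p.id, p.title) from Post p",
    PostSummary.class)
.getResultList();
// only id and title are selected, not the whole entity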

JPA inheritance is a very useful addition to the standard. However, sometimes entity inheritance is not very well understood or applied, so, in this series of articles, I tried to offer a different perspective on why we need entity inheritance in the first place and what the best way to do it is:

Time to upgrade

Hibernate Search has managed to release three final versions:

  • 5.5.7.Final

  • 5.6.2.Final

  • 5.7.1.Final

as well as a 5.8.0.Beta3 release.

Hibernate Community Newsletter 11/2017


Welcome to the Hibernate community newsletter in which we share blog posts, forum, and StackOverflow questions that are especially relevant to our users.

Articles

The pick of this edition is this article by Heap’s Engineering blog which demonstrates the benefits of using batch updates even for reducing database index overhead.

As previously explained, you can speed up integration tests considerably using a RAM disk or tmpfs. Mark Rotteveel tried this approach, and it looks like it works for Firebird as well.

Hibernate 5.2.10 comes with a very handy connection management optimization for RESOURCE_LOCAL transactions. If you don’t use JTA and you disabled auto-commit at the connection pool level, then it’s worth setting the hibernate.connection.provider_disables_autocommit configuration property as well.
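
A minimal sketch, assuming HikariCP: disable auto-commit at the pool level and tell Hibernate about it, so it can skip the auto-commit check on every transaction:

HikariConfig config = new HikariConfig();
config.setJdbcUrl("jdbc:postgresql://localhost:5432/app"); // URL is an assumption
config.setAutoCommit(false); // auto-commit disabled by the pool

Map<String, Object> settings = new HashMap<>();
// lets Hibernate skip checking/resetting the auto-commit flag per transaction
settings.put("hibernate.connection.provider_disables_autocommit", "true");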

When using Oracle, the fastest way to access a database record is to use the ROWID pseudocolumn. If using ROWID is suitable for your application, then you can annotate your entities with the @RowId annotation and Hibernate will use the ROWID pseudocolumn for UPDATE statements.
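
Usage is a one-liner on the entity (sketch; org.hibernate.annotations.RowId):

@Entity
@RowId("ROWID") // Oracle-specific pseudocolumn
public class Post {

    @Id
    private Long id;

    private String title;
}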

The best way to manage a database schema is to use incremental update scripts, and a tool like Flyway. Even in this case, you can still benefit from the hbm2ddl tool to validate the entity mappings. Check out how you can deal with schema mismatch exceptions, especially for non-trivial mappings.
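
Enabling mapping validation is a single configuration property; a sketch using the programmatic bootstrap (the persistence unit name is an assumption):

Map<String, Object> settings = new HashMap<>();
// fail fast at bootstrap if the entity mappings and the actual schema diverge
settings.put("hibernate.hbm2ddl.auto", "validate");

EntityManagerFactory emf =
    Persistence.createEntityManagerFactory("my-unit", settings);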

You can use Hibernate statistics to log query execution time. However, in reality, many enterprise applications are better off using a JDBC DataSource or Driver proxy which not only allows you to log JDBC statements along with their parameters, but also lets you detect N+1 query issues automatically during testing.
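
A sketch using the open-source datasource-proxy library (builder method names as per its documentation; treat them as assumptions for your version):

DataSource proxy = ProxyDataSourceBuilder
    .create(originalDataSource)          // the real pool, e.g. HikariCP
    .name("DS-Proxy")
    .logQueryBySlf4j(SLF4JLogLevel.INFO) // logs statements with bind parameters
    .countQuery()                        // collects counts, handy for N+1 assertions in tests
    .build();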

Presentations

Jakub Kubryński has a very good presentation about JPA common pitfalls and how you should handle them effectively.

Book discount

Until the 1st of June, High-Performance Java Persistence is 33% OFF. Considering the reader testimonials as well as the Goodreads and Amazon reviews, it’s quite a deal!

Time to upgrade

  • Hibernate Validator 6.0.0 Beta1 and Beta2 were released.

  • Hibernate ORM 5.1.7 is out, so you should consider updating if you are running the 5.1 version.

Creating Patch Files for WildFly


The WildFly application server comes with a patching mechanism which makes it very easy to upgrade existing modules of the server or add new ones. E.g. Hibernate Validator provides patch files which let you upgrade WildFly 10.1 to the preview releases of Bean Validation 2.0.

But you can also use the patching mechanism to add your own custom libraries to WildFly, making them available to deployed applications. Even if you only ever deploy a single application to one WildFly instance, this can be very useful, as it results in a smaller deployment unit (WAR etc.) and thus faster build and deployment times.

How are WildFly patches created, though? Patch files are generally just ZIP files which contain the module(s) to be added or updated as well as some additional metadata. So in theory you could create them by hand, but there’s the patch-gen tool which greatly simplifies this task.

In the following we’ll describe step by step how to create a WildFly patch using the patch-gen-maven-plugin. As an example, we’ll produce a patch file that adds the Eclipse Collections library to a WildFly instance.

The Module Descriptors

The first things we need are the module descriptors for the JBoss Modules system, which is the underlying basis of WildFly. Eclipse Collections is split into two JARs, one for its API and one for the implementation. So we’ll create the following two descriptors:

src/main/modules/system/layers/base/org/eclipse/collections/api/main/module.xml
<?xml version="1.0" encoding="UTF-8"?>

<module xmlns="urn:jboss:module:1.3" name="org.eclipse.collections.api">
    <resources>
        <resource-root path="eclipse-collections-api-${eclipse.collections.version}.jar" />
    </resources>
</module>
src/main/modules/system/layers/base/org/eclipse/collections/main/module.xml
<?xml version="1.0" encoding="UTF-8"?>

<module xmlns="urn:jboss:module:1.3" name="org.eclipse.collections">
    <resources>
        <resource-root path="eclipse-collections-${eclipse.collections.version}.jar" />
    </resources>

    <dependencies>
        <module name="org.eclipse.collections.api" />
    </dependencies>
</module>

Each descriptor specifies a resource for the corresponding JAR (the version property placeholders are replaced using Maven resource filtering). The implementation module also declares a dependency on the API module.

The Patch Tool Configuration File

The patch-gen tool needs a small configuration file which describes some patch metadata (e.g. the server version to which the patch applies and the type of the patch - one-off vs. cumulative) as well as the patched module(s):

src/main/patch/patch.xml
<?xml version='1.0' encoding='UTF-8'?>

<patch-config xmlns="urn:jboss:patch-config:1.0">
    <name>wildfly-${wildfly.version}-eclipse-collections-${eclipse.collections.version}</name>
    <description>This patch adds Eclipse Collections ${eclipse.collections.version} to a WildFly ${wildfly.version} installation</description>
    <element patch-id="layer-base-wildfly-${wildfly.version}-eclipse-collections-${eclipse.collections.version}">
        <one-off name="base" />
        <description>This patch adds Eclipse Collections ${eclipse.collections.version} to a WildFly ${wildfly.version} installation</description>
        <specified-content>
            <modules>
                <added name="org.eclipse.collections.api" />
                <added name="org.eclipse.collections" />
            </modules>
        </specified-content>
    </element>
    <specified-content/>
</patch-config>

Preparing the Patch Creation

The patch-gen tool takes two directories of the distribution to be patched as input: one directory with the original, unpatched WildFly structure and another directory which contains the original WildFly structure and the added (or updated) modules. We can use the Maven dependency plug-in for creating the two directories by extracting the WildFly distribution twice:

pom.xml
...
<plugin>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>unpack-wildfly</id>
            <phase>prepare-package</phase>
            <goals>
                <goal>unpack</goal>
            </goals>
            <configuration>
                <artifactItems>
                    <artifactItem>
                        <groupId>org.wildfly</groupId>
                        <artifactId>wildfly-dist</artifactId>
                        <version>${wildfly.version}</version>
                        <type>tar.gz</type>
                        <overWrite>false</overWrite>
                        <outputDirectory>${project.build.directory}/wildfly-original</outputDirectory>
                    </artifactItem>
                    <artifactItem>
                        <groupId>org.wildfly</groupId>
                        <artifactId>wildfly-dist</artifactId>
                        <version>${wildfly.version}</version>
                        <type>tar.gz</type>
                        <overWrite>false</overWrite>
                        <outputDirectory>${project.build.directory}/wildfly-patched</outputDirectory>
                    </artifactItem>
                </artifactItems>
            </configuration>
        </execution>
    </executions>
</plugin>
...

Now we need to add the Eclipse Collections JARs to the second directory. Let’s configure another execution of the Maven dependency plug-in for that:

pom.xml
...
<execution>
    <id>add-eclipse-collections</id>
    <phase>prepare-package</phase>
    <goals>
        <goal>copy</goal>
    </goals>
    <configuration>
        <artifactItems>
            <artifactItem>
                <groupId>org.eclipse.collections</groupId>
                <artifactId>eclipse-collections-api</artifactId>
                <version>${eclipse.collections.version}</version>
                <overWrite>false</overWrite>
                <outputDirectory>${wildflyPatched}/modules/system/layers/base/org/eclipse/collections/api/main</outputDirectory>
            </artifactItem>
            <artifactItem>
                <groupId>org.eclipse.collections</groupId>
                <artifactId>eclipse-collections</artifactId>
                <version>${eclipse.collections.version}</version>
                <overWrite>false</overWrite>
                <outputDirectory>${wildflyPatched}/modules/system/layers/base/org/eclipse/collections/main</outputDirectory>
            </artifactItem>
        </artifactItems>
    </configuration>
</execution>
...

We also need to add the module.xml descriptors so they are located next to the corresponding JARs. The Maven resources plug-in helps with that. It can also be used to replace the placeholders in the patch.xml descriptor. The following two plug-in executions are needed:

pom.xml
...
<plugin>
    <artifactId>maven-resources-plugin</artifactId>
    <executions>
        <execution>
            <id>copy-module-descriptors</id>
            <phase>prepare-package</phase>
            <goals>
                <goal>copy-resources</goal>
            </goals>
            <configuration>
                <outputDirectory>${wildflyPatched}/modules</outputDirectory>
                <resources>
                    <resource>
                        <directory>src/main/modules</directory>
                        <filtering>true</filtering>
                    </resource>
                </resources>
            </configuration>
        </execution>
        <execution>
            <id>filter-patch-descriptor</id>
            <phase>prepare-package</phase>
            <goals>
                <goal>copy-resources</goal>
            </goals>
            <configuration>
                <outputDirectory>${project.build.directory}/</outputDirectory>
                <resources>
                    <resource>
                        <directory>src/main/patch</directory>
                        <filtering>true</filtering>
                    </resource>
                </resources>
            </configuration>
        </execution>
    </executions>
</plugin>
...

Configuring the Patch-Gen Maven Plug-in

After all these preparations it’s time to configure the patch-gen Maven plug-in which will eventually assemble the patch file:

pom.xml
...
<plugin>
    <groupId>org.jboss.as</groupId>
    <artifactId>patch-gen-maven-plugin</artifactId>
    <executions>
        <execution>
            <id>create-patch-file</id>
            <phase>prepare-package</phase>
            <goals>
                <goal>generate-patch</goal>
            </goals>
            <configuration>
                <appliesToDist>${wildflyOriginal}</appliesToDist>
                <updatedDist>${wildflyPatched}</updatedDist>
                <patchConfig>${project.build.directory}/patch.xml</patchConfig>
                <outputFile>${patchFile}</outputFile>
            </configuration>
        </execution>
    </executions>
</plugin>
...

The plug-in requires the following configuration:

  • The path to the unpatched WildFly directory

  • The path to the patched WildFly directory

  • The path to the patch.xml descriptor

  • The output path for the patch file

As a last step we need to make sure that the created patch file is added as an artifact to the Maven build. That way, the created ZIP file can be installed to the local Maven repository and be deployed to repository servers such as Nexus. The build helper Maven plug-in helps with this last task:

pom.xml
...
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>build-helper-maven-plugin</artifactId>
    <executions>
        <execution>
            <id>attach-patch-artifact</id>
            <phase>package</phase>
            <goals>
                <goal>attach-artifact</goal>
            </goals>
            <configuration>
                <artifacts>
                    <artifact>
                        <file>${patchFile}</file>
                        <type>zip</type>
                        <classifier>wildfly-${wildfly.version}-patch</classifier>
                    </artifact>
                </artifacts>
            </configuration>
        </execution>
    </executions>
</plugin>
...

Running the Build

With all the configuration in place, the patch file can be built by running mvn clean install. The created patch file should have a structure like this:

target/eclipse-collections-8.1.0-wildfly-10.1.0.Final-patch.zip
├── META-INF
├── README.txt
├── layer-base-wildfly-10.1.0.Final-eclipse-collections-8.1.0
│   └── modules
│       └── org
│           └── eclipse
│               └── collections
│                   ├── api
│                   │   └── main
│                   │       ├── eclipse-collections-api-8.1.0.jar
│                   │       └── module.xml
│                   └── main
│                       ├── eclipse-collections-8.1.0.jar
│                       └── module.xml
├── misc
└── patch.xml

As we’d expect, the patch contains the Eclipse Collections JARs as well as the corresponding module.xml descriptors. The patch.xml descriptor contains metadata for the patching infrastructure, e.g. the WildFly version to which this patch can be applied, as well as hash checksums for the added modules.

Applying and Using the Patch

Once the patch is created, we can apply it using the jboss-cli tool that comes with WildFly:

<JBOSS_HOME>/bin/jboss-cli.sh "patch apply --path path/to/eclipse-collections-8.1.0-wildfly-10.1.0.Final-patch.zip"

You should see the following output if the patch has been applied successfully:

{
    "outcome" : "success",
    "result" : {}
}

And with that you can use the Eclipse Collections API from within your deployed applications. Just make sure to expose the two new modules to your application. To do so, add a descriptor named META-INF/jboss-deployment-structure.xml to your deployment unit:

src/main/resources/META-INF/jboss-deployment-structure.xml
<?xml version="1.0" encoding="UTF-8"?>
<jboss-deployment-structure
    xmlns="urn:jboss:deployment-structure:1.2"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">

    <deployment>
        <dependencies>
            <module name="org.eclipse.collections.api" />
            <module name="org.eclipse.collections" />
        </dependencies>
    </deployment>
</jboss-deployment-structure>

If you’d like to try it out and create your own WildFly patch, check out this example project on GitHub. It contains the complete pom.xml for creating the Eclipse Collections patch. There is also an integration test module, which takes the patch file, applies it to a WildFly instance and runs a small test (using Arquillian) that calls the Eclipse Collections API on the server.

If you have any feedback on this blog post or would like to share your experiences with the WildFly patching infrastructure, let us know in the comments below.

Hibernate Community Newsletter 10/2017


Welcome to the Hibernate community newsletter in which we share blog posts, forum, and StackOverflow questions that are especially relevant to our users.

Articles

Mapping JPA relationships is a trivial thing to do. However, not all associations are equal in terms of performance. Check out this series of articles about the best way to map them:

If you’re using TomEE 7, you can easily switch to using Hibernate ORM as the JPA provider. Check out this article which shows you how you can do that, and how you can also speed up application server startup time.

Docker is extremely useful for running database containers that you need when doing integration testing. Check out this article about running IBM DB2 Express-C as a Docker container, and how to set up a JDBC connection to DB2.

Although collections like List and Set are more common when using JPA and Hibernate, you can easily use Maps as explained in this article.
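
For example, a basic Map attribute can be mapped with @ElementCollection (sketch):

@Entity
public class Person {

    @Id
    private Long id;

    @ElementCollection
    @MapKeyColumn(name = "phone_type")
    @Column(name = "phone_number")
    private Map<String, String> phones = new HashMap<>(); // e.g. "home" -> "555-1234"
}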

Time to upgrade

Hibernate ORM 5.1.6 has been released, as well as Hibernate Search 5.8.0 Beta 2.

Hibernate Community Newsletter 9/2017


Welcome to the Hibernate community newsletter in which we share blog posts, forum, and StackOverflow questions that are especially relevant to our users.

Articles

Hibernate uses Proxy objects for lazy associations. If you want to understand how Proxy objects work and how you can unproxy them to the actual entities they represent, then you should definitely read this article.
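
For reference, recent Hibernate versions (5.2.10 and later) offer a built-in for this (sketch, entity name assumed):

Post reference = entityManager.getReference(Post.class, 1L); // returns a lazy proxy

// initializes the proxy if needed and returns the underlying entity instance
Post entity = (Post) Hibernate.unproxy(reference);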

Romain Manni-Bucau wrote a very interesting article which points out how you should be merging incoming and existing association collections when using Hibernate.

Getting access to the underlying database metadata is very easy with Hibernate 5. You just have to know how to make use of the Integrator SPI mechanism, as explained in this article.

Knowing how to implement equals and hashCode is of paramount importance when using JPA and Hibernate. Read this article by Steven Schwenke to find out why business keys are very suitable for this task.
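
A condensed sketch of the business-key approach (the isbn natural key is an assumption):

@Entity
public class Book {

    @Id
    @GeneratedValue
    private Long id;

    @NaturalId
    private String isbn;

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Book)) return false;
        // the business key is assigned at creation time and never changes,
        // so equality stays stable across all entity state transitions
        return Objects.equals(isbn, ((Book) o).isbn);
    }

    @Override
    public int hashCode() {
        return Objects.hash(isbn);
    }
}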

Bean Validation is a very convenient mechanism for validating entity state. Check out this article to find out how you can make sure that an integer value is within a bounded range.
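
For a bounded integer range, @Min and @Max are enough (sketch):

public class Player {

    @Min(1)
    @Max(100)
    private int level; // validation fails if the value falls outside [1, 100]
}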

Hibernate Community Newsletter 8/2017


Welcome to the Hibernate community newsletter in which we share blog posts, forum, and StackOverflow questions that are especially relevant to our users.

Articles

The Hibernate ResultTransformer is extremely useful for customizing the way you fetch data from the database. Check out this article to learn more about this topic.
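
For instance, the built-in alias-to-bean transformer maps query aliases onto a DTO (sketch; the PostDTO class is an assumption):

List<PostDTO> dtos = session.createNativeQuery(
    "SELECT p.id AS id, p.title AS title FROM post p")
.setResultTransformer(Transformers.aliasToBean(PostDTO.class))
.getResultList();
// each row is turned into a PostDTO by matching aliases to bean properties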

JPA and Hibernate use the first-level cache as a staging buffer for every read/write operation, and understanding its inner workings is very important if you want to use JPA effectively. For more details, you should definitely read this article.
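
A quick illustration of that staging-buffer behavior (entity name assumed): repeated lookups within the same persistence context are served from the first-level cache, not the database:

Post first = entityManager.find(Post.class, 1L);  // hits the database
Post second = entityManager.find(Post.class, 1L); // served from the first-level cache

assert first == second; // same persistence context, same instance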

Marco Behler offers 6 very practical video tutorials for Hibernate beginners.

Dealing with difficult software problems is easier than you might think. Check out this article for more details on how you can solve any software issue with the help of our wonderful software community.

If you wonder why you should choose Hibernate over plain JDBC, this article gives you 15 reasons why Hibernate is worth using.

This very short article offers a quick introduction to mapping a bidirectional one-to-many association. If you want to know the best way to map a one-to-many database relationship, then you should definitely read this article as well.
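
For reference, the canonical shape of such a mapping looks like this (entity names assumed), with a helper method keeping both sides in sync:

@Entity
public class Post {

    @Id
    private Long id;

    @OneToMany(mappedBy = "post", cascade = CascadeType.ALL, orphanRemoval = true)
    private List<PostComment> comments = new ArrayList<>();

    public void addComment(PostComment comment) {
        comments.add(comment);
        comment.setPost(this); // keep the owning side in sync
    }
}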

Database concurrency control is a very complex topic. PostgreSQL advisory locks are a very useful concurrency control API which you can use to implement multi-node coordination mechanisms. Check out this article for more details on this topic.
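
Advisory locks can be taken from plain JDBC (sketch; the lock key 1234 is arbitrary):

try (Connection connection = dataSource.getConnection();
     Statement statement = connection.createStatement()) {

    statement.execute("SELECT pg_advisory_lock(1234)"); // blocks until granted
    try {
        // ... critical section coordinated across application nodes ...
    }
    finally {
        statement.execute("SELECT pg_advisory_unlock(1234)");
    }
}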

Time to upgrade

Hibernate ORM 5.2.10 has been released, as well as Hibernate Search 5.8.0.Beta1 which is now compatible with ElasticSearch 5.

Accessing private state of Java 9 modules


Data-centric libraries often need to access private state of classes provided by the library user.

An example is Hibernate ORM. When the @Id annotation is given on a field of an entity, Hibernate will by default directly access fields - as opposed to calling property getters and setters - to read and write the entity’s state.

Usually, such fields are private. Accessing them from outside code has never been a problem, though. The Java reflection API allows you to make private members accessible and subsequently access them from other classes. With the advent of the module system in Java 9, the rules for this change a bit, though.

In the following we’ll explore the options authors of a library provided as a Java 9 module have to access private state of classes defined in other modules.

An example

As an example, let’s consider a simple method which takes an object - e.g. an instance of an entity type defined by the user - and a field name and returns the value of the object’s field of that name. Using reflection, this method could be implemented like this (for the sake of simplicity, we are ignoring the fact that a security manager could be present):

package com.example.library;

public class FieldValueAccessor {

    public Object getFieldValue(Object object, String fieldName) {
        try {
            Class<?> clazz = object.getClass();
            Field field = clazz.getDeclaredField( fieldName );
            field.setAccessible( true );
            return field.get( object );
        }
        catch (NoSuchFieldException | SecurityException | IllegalArgumentException | IllegalAccessException e) {
            throw new RuntimeException( e );
        }
    }
}

By calling Field#setAccessible() we can subsequently obtain the field’s value, even if it has been declared private. The module descriptor for the library module is trivial; it just exports the package of the accessor class:

module com.example.library {
    exports com.example.library;
}

In a second module, representing our application, let’s define a simple "entity":

package com.example.entities;

public class MyEntity {

    private String name;

    public MyEntity(String name) {
        this.name = name;
    }

    // ...
}

And also a simple main method which makes use of the accessor to read a field from the entity:

package com.example.entities;

public class Main {
    public static void main(String... args) {
        FieldValueAccessor accessor = new FieldValueAccessor();
        Object fieldValue = accessor.getFieldValue( new MyEntity( "hey there" ), "name" );
        assert "hey there".equals( fieldValue );
    }
}

As this module uses the library module, we need to declare that dependency in the entity module’s descriptor:

module com.example.myapp {
    requires com.example.library;
}

With the example classes in place, let’s run the code and see what happens. It would have been fine on Java 8, but as of Java 9 we’ll see this exception instead:

java.lang.reflect.InaccessibleObjectException:
Unable to make field private final java.lang.String com.example.entities.MyEntity.name accessible:
module com.example.myapp does not "opens com.example.entities" to module com.example.library

The call to setAccessible() fails, as by default code in one module isn’t allowed to perform so-called "deep reflection" on code in another (named) module.

Open this module!

Now what can be done to overcome this issue? The error message is giving us the right hint already: the package with the type to reflect on must be opened to the module containing the code invoking setAccessible().

If a package has been opened to another module, that module can access the package’s types reflectively at runtime. Note that opening a package will not make it accessible to other modules at compile time; this would require the package to be exported instead (as in the case of the library module above).

There are several options for opening a package. The first is to make the module an open module:

open module com.example.myapp {
    requires com.example.library;
}

This opens up all packages in this module for reflection by all other modules (i.e. this would be the behavior as known from other module systems such as OSGi). In case you’d like some more fine-grained control, you can opt to open specific packages only:

module com.example.myapp {
    opens com.example.entities;
    requires com.example.library;
}

This will allow for deep reflection on the entities package, but not on other packages within the application module. Finally, there is the possibility to limit an opens clause to one or more specific target modules:

module com.example.myapp {
    opens com.example.entities to com.example.library;
    requires com.example.library;
}

That way the library module is allowed to perform deep reflection, but not any other module.

No matter which of these options we use, the library module can now make private fields of types in the entities package of the entities module accessible and subsequently read or write their value.

Opening up packages in one way or another lets library code written prior to Java 9 continue to function as before. It requires some implicit knowledge, though: application developers need to know which libraries need reflective access to which types so they can open up the right packages. This can become tough to manage in more complex applications with multiple libraries performing reflection.

Luckily, there’s a more explicit approach in the form of variable handles.

Can you handle the var?

Variable handles - defined by JEP 193 - are a very powerful addition to the Java 9 API, providing "read and write access to [variables] under a variety of access modes". Describing them in detail would go far beyond the scope of this post (refer to the JEP and this article if you would like to learn more). For our purposes let’s focus on their capability for accessing fields, representing an alternative to the traditional reflection-based approach.

So how could our FieldValueAccessor class be implemented using variable handles?

Var handles are obtained via the MethodHandles.Lookup class. If such a lookup has "private access" to the entities module, it will let us access private fields of that module’s types. To get hold of such a lookup, we let the client code pass one in when bootstrapping the library code:

Lookup lookup = MethodHandles.lookup();
FieldValueAccessor accessor = new FieldValueAccessor( lookup );

In FieldValueAccessor#getFieldValue() we can now use the method MethodHandles#privateLookupIn() which will return a new lookup granting private access to the given entity instance. From that lookup we can eventually obtain a VarHandle which allows us to get the object’s field value:

public class FieldValueAccessor {

    private final Lookup lookup;

    public FieldValueAccessor(Lookup lookup) {
        this.lookup = lookup;
    }

    public Object getFieldValue(Object object, String fieldName) {
        try {
            Class<?> clazz = object.getClass();
            Field field = clazz.getDeclaredField( fieldName );

            MethodHandles.Lookup privateLookup = MethodHandles.privateLookupIn( clazz, lookup );
            VarHandle handle = privateLookup.unreflectVarHandle( field );

            return handle.get( object );
        }
        catch (NoSuchFieldException | IllegalAccessException e) {
            throw new RuntimeException( e );
        }
    }
}

Note that this only works if the original lookup has been created by code in the entities module.

This is because MethodHandles#lookup() is a caller sensitive method, i.e. the returned value will depend on the direct caller of that method. privateLookupIn() checks whether the given lookup is allowed to perform deep reflection on the given class. Thus a lookup obtained in the entities module will do the trick, whereas a lookup retrieved in the library module wouldn’t be of any use.

Which route to take?

Both discussed approaches let libraries access private state from Java 9 modules.

The var handle approach makes the requirements of the library module more explicit, which I like. Expecting a lookup instance during bootstrap should be less error-prone than the rather implicit requirement for opening up packages or modules.

Mails by the OpenJDK team also seem to suggest that - together with their siblings, method handles - var handles are the way to go in the long term. Of course it requires the application module to be cooperative and pass the required lookup. It remains to be seen what this could look like in container / app server scenarios, where libraries typically aren’t bootstrapped by the application code but by the server runtime. Injecting some helper code for obtaining the lookup object upon deployment may be one possible solution.

As var handles are only introduced in Java 9, you might want to refrain from using them if your library is supposed to run with older Java versions, too (actually, you can do both by building multi-release JARs). A very similar approach can be implemented in earlier Java versions using method handles (see MethodHandles.Lookup#findGetter()). Unfortunately, though, there’s no official way to obtain method handles with private access prior to Java 9 and the introduction of privateLookupIn(). Ironically, the only way to get such handles is employing some reflection.
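
Such a pre-Java 9 variant could look like this (a sketch; exception handling simplified): make the field accessible reflectively once, then unreflect it into a handle whose invocations skip further access checks:

Field field = clazz.getDeclaredField( fieldName );
field.setAccessible( true );

// the access check happens here, once; invoking the handle afterwards doesn't repeat it
MethodHandle getter = MethodHandles.lookup().unreflectGetter( field );
Object value = getter.invoke( object ); // invoke() throws Throwable; wrap as needed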

The final thing to note is that there may be a performance advantage to using var and method handles, as access checking is done only once when getting them (as opposed to every invocation). Some proper benchmarking would be in order, though, to see what the difference is in a given use case.

As always, your feedback is very welcome. Which approach do you prefer? Or perhaps you’ve found yet another solution we’ve missed so far? Let us know in the comments below.
