
In Relation To Bean Validation


How to start learning Java EE 6

Posted by    |       |    Tagged as Bean Validation CDI Java EE JSF Weld

A developer new to Java EE posted in the Weld forum asking for advice on getting started learning EE 6. I've decided to promote part of my response in the forum to the blog.

So, what should I start learning. Java EE 5 and Seam? Java EE 6 and Weld? Is there any learning material about 6 and Weld right now?

Unless you plan on putting your system into production inside the next 2-3 months, you should start learning CDI and Weld. CDI defines the basic infrastructure you'll use to get the various kinds of Java EE components working together (and much more).

You also need to decide what web framework to use. This is the hard bit for most people.

  • You could use plain servlets and JSP, which in EE 6 both include out-of-the-box CDI integration. However, most people prefer to use something with more features. But honestly, if you're new to enterprise Java, it's not a bad place to start. The latest servlet spec lets you use annotations, which is a big step forward in usability.
  • JSF2 is included as part of the platform, and is integrated with CDI out of the box. Many people find JSF difficult to learn. We believe that for some kinds of applications, it pays off in the end. Unfortunately, most of the books and articles about JSF make it appear much more complex than it actually is. In particular, the specification is horribly written and not worth reading (so I won't link to it). Take a look at the Weld examples to get a taste of what JSF2 is really like.
  • An option we recommend is Wicket. Wicket is an alternative to JSF that is easier to learn. Weld comes with Wicket integration.
  • There are many other possibilities, including about a thousand frameworks which are basically alternatives to the servlet API. Struts2 and Stripes seem like popular options. I don't find any of these kinds of frameworks very interesting or sexy, but they're usually easy to learn. It should be relatively easy to integrate any of them with Weld.
  • A final option worth mentioning is GWT. GWT is an entirely different beast, with its own Java to JavaScript compiler. It's great for certain kinds of applications. We should have GWT integration for Weld available very soon.

My personal view is that you should start out by looking at JSF and/or Wicket, unless you're truly a beginner at this, in which case start with writing some plain servlets. If these options don't suit your needs, cast a wider net.

You'll also need a persistence solution. For most people writing EE 6 applications that means JPA2. There are at least three excellent implementations of this specification to choose from, including ours. You'll probably want to use JPA2 together with Bean Validation.

Once you've got a handle on CDI, JPA2, and whatever web framework you decide to use, take a look at EJB 3.1, and the more advanced functionality it offers for integrating with data stores and messaging, and managing asynchronicity. Unfortunately, the EJB spec itself is quite hard to read, so you should probably try to find a good tutorial. Be careful, there is still a lot of information about EJB 2 out there on the web. EJB 3.1 is a quite different beast.

Hibernate Validator 4 unleashed

Posted by    |       |    Tagged as Bean Validation Hibernate Validator

After many months of polishing, we are happy to release Hibernate Validator 4. This is a major milestone for Hibernate Validator, with tons of new features and full compliance with the specification.

For the newcomers

Hibernate Validator lets you declare constraints on your domain model using annotations like @NotNull or @Size and returns the list of constraint failures found in an object graph. Instead of duplicating constraint declarations across application layers, constraints are centralized in your domain model and shared by all layers and frameworks: declare once, validate anywhere, if you will.

What's new and cool?

Hibernate Validator 4 is a complete rewrite and has many, many new features. Let me describe a handful of them:

  • constraint composition: a custom constraint can be composed of smaller constraints, avoiding code duplication, improving readability and increasing portability especially when combined with the set of built-in constraints.
  • groups: groups allow you to define a subset of the constraints you want validated at a given time. This is useful in many situations: validating partially filled data, checking the state of an object (can this user buy in one click?), or ordering constraint validations. Oh, and groups are not mere strings; they are a type-safe construct built on top of Java interfaces.
  • type-safe constraint declaration: Hibernate Validator 4 ensures that constraints set on a property are compatible with the property's type. This can even be theoretically checked at compile time.
  • more powerful custom constraints: as easy as before, more powerful than before. You can now customize the constraint violation messages reported by custom constraints and potentially return several violations if needed. A violation can point to a sub-property (useful for cross-property validations).
  • native integration with JPA 2 and JSF 2: Hibernate Validator 4 natively integrates with Java Persistence 2 and JavaServer Faces 2 thanks to its Bean Validation compliance. This integration is in the works in JBoss AS 5.2. People using Seam and Hibernate Core have been familiar with this style of integration since Hibernate Validator 3.
  • fluent type-safe bootstrap API: you can refine many aspects of Hibernate Validator like the message interpolation logic using the new bootstrap API.
  • metadata API: frameworks that need to query the constraints on a domain model can use the metadata API. This is for example used by Hibernate Core to propagate constraints to the database schema.
  • XML: XML configuration can be used in combination with or instead of annotations allowing for example redefinition of some constraints depending on the deployment environment.
  • a much improved test suite: we have completely rewritten the test suite which is now the base of the specification TCK.
  • compliance with JSR-303 Bean Validation: last but not least, Hibernate Validator 4 is the reference implementation; we couldn't make it more compliant ;) Practically, it means you code against the specification API, making your code more portable, and your constraints will be visible to the whole Java ecosystem, provided it integrates Bean Validation (like Java EE 6 does, for example).

Of course there are many other new features, check out the documentation here. You can download Hibernate Validator 4 from there and reach us in our forum.

The second good news is that the Bean Validation specification is now finished. I will hand it over to the JCP for final approval ballot today. More on that in a few days.

Many thanks to Hardy, the person behind Hibernate Validator 4's implementation for coping with last month/day/minute/second enhancements to the specification. A special thanks to Gunnar Morling and Alaa Nassef for their contribution despite a few administrative hiccups :)

Here is the latest draft before sending Bean Validation (JSR 303) to final stage (in pdf[1]). For the few who don't know yet ( ;) ), Bean Validation standardizes constraint declaration, definition, validation and metadata for the Java platform. Said otherwise, add an annotation to a property and hop it's validated.

class User {
   @NotEmpty @Size(max=50)
   String getName() { ... }

   String getEmail() { ... }
}

Please give us last minute feedback in our forums.

You can already use the spec as Hibernate Validator 4 implements it. Overall, 38 coarse grained bugs and tasks were fixed or implemented in this beta. Hibernate Validator artifacts can be downloaded on Sourceforge or from the JBoss Maven repository.

Back to the spec, let's discuss some of the enhancements.

A type-safe Path representation

A typesafe way to express navigation paths to the failing property has been added. Before, paths were expressed as strings like "addresses[0].street1" and libraries were forced to parse this string by hand. The Path object now exposes all individual nodes via the Iterable interface.

Here is the routine to build the old String based form.

StringBuilder stringPath = new StringBuilder();
Path path = constraintViolation.getPropertyPath();
boolean isRoot = true;
for ( Node node : path ) {
    // node with a name
    if ( node.getName() != null ) {
        // likely a collection, add [index or key]
        if ( node.isInIterable() ) {
            stringPath.append( "[" );
            // a List or an array
            if ( node.getIndex() != null ) {
                stringPath.append( node.getIndex() );
            }
            // a Map
            else if ( node.getKey() != null ) {
                stringPath.append( node.getKey() );
            }
            stringPath.append( "]" );
        }
        // dot between properties
        if ( isRoot ) {
            isRoot = false;
        }
        else {
            stringPath.append( "." );
        }
        stringPath.append( node.getName() );
    }
}

Most usages are much simpler and only involve the node names.
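For the common case, that simpler usage can be sketched in a few lines. SimpleNode and PathUtil below are illustrative stand-ins for the spec's Path.Node, ignoring indexes and keys:

```java
import java.util.List;
import java.util.Objects;
import java.util.stream.Collectors;

// Illustrative stand-in for the spec's Path.Node, reduced to just a name
// (indexes and keys are ignored here).
class SimpleNode {
    final String name;
    SimpleNode(String name) { this.name = name; }
}

class PathUtil {
    // Join the non-null node names with dots, e.g. "addresses.street1".
    static String render(List<SimpleNode> path) {
        return path.stream()
                   .map(n -> n.name)
                   .filter(Objects::nonNull)
                   .collect(Collectors.joining("."));
    }
}
```

With the real API, the same loop would simply iterate the Path and append each node's getName().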

The ConstraintValidatorContext used to customize error messages inside validators has a nice fluent API to add subnodes.

context.buildErrorWithMessageTemplate( "this detail is wrong" )
            .addSubNode( "addresses" )
            .addSubNode( "country" )
                .inIterable().atKey( "home" )
            .addSubNode( "name" )

Bootstrap API for provider specific usages

The bootstrap API now takes a Bean Validation provider class rather than the Configuration class to select a specific provider:


Opening doors for the future

While we unfortunately could not include method and parameter validation in this release, we have opened the door for provider specific extensions and potential standardization in the next revision of Bean Validation.

  • built-in constraint annotations can be hosted on parameters and constructors
  • unwrap allows access to provider specific extensions

Here is a possible implementation for Hibernate Validator (to be implemented ;) )

class AtYourService {
   void doMeAFavor(@Valid Favor favor, @NotEmpty String owner) { ... }
}

HibernateValidator hVal = validator.unwrap(HibernateValidator.class);
Set<ConstraintViolation> failures = hVal.validateMethod(doMeAFavor);

We have also added the notion of constraint payload. While ignored by Bean Validation and most Bean Validation providers, payloads can be used by validation clients to associate metadata with particular constraints. The use case driving this inclusion was to attach a severity level to error messages.

class User {
   @Size(max=50, payload=Severity.ERROR)
   String getName() { ... }

   String getEmail() { ... }
}

This information can then be read by the presentation framework to display error messages differently. If you define constraints, don't forget the new mandatory payload parameter!
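As an illustration of how a presentation framework might use this, here is a minimal sketch mapping a severity payload to a display style. Severity, Violation and MessageRenderer are hypothetical stand-ins, not spec types:

```java
// Hypothetical stand-ins, not spec types: a severity attached to a
// violation, and a renderer picking a display style from it.
enum Severity { INFO, ERROR }

class Violation {
    final String message;
    final Severity severity;   // in real code, read from the constraint's payload
    Violation(String message, Severity severity) {
        this.message = message;
        this.severity = severity;
    }
}

class MessageRenderer {
    // Choose a CSS class based on the severity carried by the payload.
    static String cssClassFor(Violation v) {
        return v.severity == Severity.ERROR ? "msg-error" : "msg-info";
    }
}
```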


We have also done various enhancements:

  • rethought the Java (sub)packages and moved interfaces around
  • added support for unbounded wildcards in ConstraintValidators
  • added support for returning the list of ConstraintDescriptors matching a given set of groups
  • enhanced the TraversableResolver contract
  • renamed message template keys to match f.q.c.NameOfTheConstraint.message (e.g. javax.validation.NotNull.message)

Many thanks for all the feedback received, from both within and outside the JCP. Feel free to drop more last minute feedback in our forums, and try Hibernate Validator 4!

PS: this draft was supposed to go through the regular JCP process but due to legal dirty work at play, this is not currently possible (I might expand on the subject in a different post depending on how frustrated I end up being). Our legal team is at work but in the meantime, I wanted to give you an early look (download the spec here[1]).

Aaron and I will be talking about Hibernate Search and how it can complement your database when you need to scale big, like in... ahem a cloud. It's on Wednesday, June 3 at 9:45 am in Hall 134. I know it's early, someone in the program committee did not like us so much ;)

I will also do an author signing session of Hibernate Search in Action the same day Wed, June 3 at the JavaOne bookstore.

I will also discuss Bean Validation (JSR-303), what it does and how it integrates in Java EE 6 (which I will demo on stage) and any other architecture. This will be Thursday, June 4 at 13:30 in Hall 134. The latest version of the spec is always available here at least till we make it final. Hibernate Validator 4, the reference implementation is well underway, give it a try.

Bean Validation's public draft has now been approved by the EC (all members voting yes except Nortel and SpringSource who did not vote).

But we have not stopped there!

Reference implementation

We are pleased to release the first alpha of the Bean Validation Reference Implementation: Hibernate Validator 4 Alpha (download).

This is a big milestone: the core features are present, and the implementation is already used in the JSF 2.0 RI prototype. This release is not feature complete though; here are the main missing parts:

  • Group inheritance using interfaces or implicit grouping
  • Group sequence is implemented but not fully tested
  • Overriding the Default group of a class is not implemented yet
  • TraversableResolver does not integrate with JPA yet
  • XML configuration (META-INF/validation.xml)
  • XML mapping
  • some metadata APIs are not fully compliant
  • ConstraintValidatorContext is not operational

The distribution includes source code, jars and a getting started guide.

Otherwise, everything should work as described in the spec. For help, come to our forum. Please file bugs in JIRA. For more information on Hibernate Validator 4, see here. Many thanks to Hardy for leading the effort and to Alaa and Gunnar for stepping up.


This implementation is based on the specification 1.0 Beta3[1] (post public draft). This version improves:

  • the bootstrap API names are much more intuitive
  • type-safe validator implementations: a constraint is attached to several validators, the right one being chosen based on the targeted type
  • XML configuration via META-INF/validation.xml
  • XML constraint mapping (allowing annotation overriding)

Give us feedback and tell us how it goes.

The public draft of Bean Validation is finally out and available here on the JCP website. We have made a lot of improvements and took a lot of feedback from you since the last draft:

  • interface-based groups
  • constraint composition
  • built-in constraints
  • multiple reports per constraint violation
  • bootstrap API
  • method level validation
  • integration with JPA, JSF and Java EE

Please go check the spec and give us your feedback in our forum. We plan to get a final spec early February, so don't delay :)


The fundamental notion of group is basically unchanged but it has morphed into a much more type-safe and cleaner beast. Instead of declaring groups as strings, groups are now interfaces. This has several advantages including:

  • easier to find a group usage in IDEs
  • errors due to typos will be visible at compilation time rather than runtime
  • groups can inherit from each other using the natural interface inheritance model. This allows group composition.
  • partial validation based on interfaces hosting constraint definitions
/**
 * Customer can buy bypassing the harassing checking process
 *
 * @author Emmanuel Bernard
 */
public interface BuyInOneClick extends Default, Billable {}

/**
 * User representation
 */
public class User {
    @NotNull private String firstname;
    @NotNull(groups = Default.class) private String lastname;
    @NotNull(groups = {Billable.class}) private CreditCard defaultCreditCard;
}

Set<ConstraintViolation> violations = validator.validate(user, BuyInOneClick.class);

The spec is very readable and has a lot of real-life examples, check section 3.4.


Constraints can now declaratively be composed of more primitive constraints. This both helps to reduce code duplication and to allow primitive constraints to be exposed in the metadata and used by tools like JavaScript generators.

@Size(min=5, max=5)
public @interface FrenchZipCode {
    String message() default "Wrong zipcode";
    Class<?>[] groups() default {};
}

I described the idea in more details in this previous blog entry.

Speaking of constraint definition, the specification now comes with a set of built-in constraints: @Null, @NotNull, @AssertTrue, @AssertFalse, @Min, @Max, @Size, @Digits, @Past, @Future. These are the fundamental bricks that can be understood by all tools accessing metadata.
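To make the "tools understand the fundamental bricks" idea concrete, here is a toy sketch of how a tool or provider might evaluate a @Size-style constraint reflectively. The Size annotation, Account bean and ToyValidator below are illustrative stand-ins, not the javax.validation types:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.List;

// Illustrative stand-in for a @Size-like constraint (NOT javax.validation's).
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
@interface Size {
    int min() default 0;
    int max() default Integer.MAX_VALUE;
}

// A bean with a deliberately violating field: "ABCDEFG" is longer than max=5.
class Account {
    @Size(min = 2, max = 5)
    String code = "ABCDEFG";
}

class ToyValidator {
    // Return the names of String fields whose length violates their @Size bounds.
    static List<String> validate(Object bean) {
        List<String> failures = new ArrayList<>();
        for (Field f : bean.getClass().getDeclaredFields()) {
            Size s = f.getAnnotation(Size.class);
            if (s == null) continue;
            try {
                f.setAccessible(true);
                String value = (String) f.get(bean);
                int len = value == null ? 0 : value.length(); // toy rule: null counts as empty
                if (len < s.min() || len > s.max()) {
                    failures.add(f.getName());
                }
            } catch (IllegalAccessException e) {
                throw new RuntimeException(e);
            }
        }
        return failures;
    }
}
```

A real provider does much more (groups, message interpolation, graph navigation), but the core mechanism, reading constraint metadata and checking values against it, is the same.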

Finally, constraint implementations can now raise more than one violation report per constraint. This is particularly useful when a bean-level constraint is targeting two specific properties.

@CheckAddressConsistency
public class Address {
  private String zipCode;
  private String city;
}

@CheckAddressConsistency probably wants to raise two violation reports, one targeting zipCode and one targeting city. This is also useful to define fine grained error messages depending on the failure. See section 2.4 for more info.


The Validator interface can now validate any type of object and is no longer tied to a given JavaBean type.

Validator validator = validatorFactory.getValidator();
Set<ConstraintViolation<Address>> addressViolations = validator.validate(address);
Set<ConstraintViolation<User>> userViolations = validator.validate(user);

The specification has a bootstrap API letting you customize various validation components and optionally letting you choose the specific Bean Validation provider you seek. An older version of the bootstrap API is described in this blog entry, the latest version is of course available in the spec section 4.4.

In appendix, you will find a proposal for method level validations: parameter and returned values can be decorated with constraints declarations. This feature has been widely requested, let us know what you think.


The JPA, JSF and Java EE integration proposals are also available in appendix.

Thanks to everyone for your past feedback and your future one :)

Until now, it was not possible or easy to reuse constraints to make more complex constraints.

The new specification draft introduces the notion of constraint composition. Composition is useful for three main things:

  • reuse more primitive constraints to build new constraints and avoid duplication
  • define fine grained error reports for a given constraint
  • expose how a constraint is composed and describe its primitive blocks

The last point is particularly interesting. Constraint implementations are black boxes answering yes or no to the question: Is this value valid or not? When constraints need to be applied outside the Java world or on a different metadata model, black boxes do no good. There is no way to know that @OrderNumber actually applies some restriction on the length of the number as well as some CRC verification.

Composition helps solve the problem by giving access to the primitive constituents of a constraint. Let's first see how to define a composed constraint.

Defining a composed constraint

To define the list of constraints composing a main constraint, simply annotate the main constraint annotation with the composing constraint annotations.

@Numerical
@Size(min=5, max=5)
@ConstraintValidator(FrenchZipcodeValidator.class)
public @interface FrenchZipCode {
    String message() default "Wrong zipcode";
    String[] groups() default {};
}

When @FrenchZipCode is placed on a property, its value is validated against @Numerical, @Size(min=5, max=5) and the constraint implementation FrenchZipcodeValidator: all the composing constraints are validated as well as the logic of the main constraint. Note that a composing constraint can itself be composed of constraints.

Each failing constraint will generate an individual error report, which is useful when you want to display fine grained reports to your user. But this might be quite confusing, and a single error report is more appropriate in some situations. You can force Bean Validation to raise a single error report (the composed constraint error report) if any of its composing constraints fails, by using the @ReportAsSingleInvalidConstraint annotation.

@ReportAsSingleInvalidConstraint
@Numerical
@Size(min=5, max=5)
@ConstraintValidator(FrenchZipcodeValidator.class)
public @interface FrenchZipCode {
    String message() default "Wrong zipcode";
    String[] groups() default {};
}

In the past two examples, none of the composing annotation parameters can be adjusted at declaration time. This is fine if the zip code is always of size 5. But what happens if the size can be adjusted depending on the property? The spec offers a way to override a parameter of a composing annotation based on a parameter of the composed annotation, using the @OverridesParameter annotation.

@Numerical
@Size //arbitrary parameter values
public @interface FrenchZipCode {
    String message() default "Wrong zipcode";
    String[] groups() default {};

    @OverridesParameters( {
        @OverridesParameter(constraint=Size.class, parameter="min"),
        @OverridesParameter(constraint=Size.class, parameter="max") } )
    int size() default 5;

    @OverridesParameter(constraint=Size.class, parameter="message")
    String sizeMessage() default "{error.zipcode.size}";

    @OverridesParameter(constraint=Numerical.class, parameter="message")
    String numericalMessage() default "{error.zipcode.numerical}";
}

The following declaration

@FrenchZipCode(size=9, sizeMessage="Zipcode should be of size {value}")

is equivalent to the following definition / declaration combination

@Size(min=9, max=9, message="Zipcode should be of size {value}")
public @interface FrenchZipCode {
    String message() default "Wrong zipcode";
    String[] groups() default {};
}

Let's now see how a tool would use this extra information.

Exploring composed constraints with the metadata API

Using the metadata API, you can explore the list of constraints on a given object or property. Each constraint is described by a ConstraintDescriptor. It lists all the composing constraints and provides a ConstraintDescriptor object for each of them. The ConstraintDescriptor honors overridden parameters (i.e. using @OverridesParameter): the annotation and parameter values returned contain the overridden values.

ElementDescriptor ed = addressValidator.getConstraintsForProperty("zipcode");
for ( ConstraintDescriptor cd : ed.getConstraintDescriptors() ) {
	processConstraintDescriptor( cd ); //check all constraints on zip code
}

public void processConstraintDescriptor(ConstraintDescriptor cd) {
	//Size.class is understood by the tool
	if ( cd.getAnnotation().annotationType().equals( Size.class ) ) {
		Size m = (Size) cd.getAnnotation();
		column.setLength( m.max() );  //read and use the metadata
	}
	for ( ConstraintDescriptor composingCd : cd.getComposingConstraints() ) {
		processConstraintDescriptor( composingCd ); //check composing constraints recursively
	}
}

When using the following declaration

@FrenchZipCode(size=10) public String zipCode;

the tool will set the zipCode column length to 10.

While the tool does not know what @FrenchZipCode and @Numerical mean, it knows how to make use of @Size. JavaScript generation libraries or persistence tools typically understand a core subset of constraints. If a complex constraint is composed of one or several of these core constraints, it can be partially understood and processed by Java Persistence, for example.

This is one of the reasons why it is strongly recommended to build complex constraints on top of more primitive ones. The Bean Validation specification will come with a core set of constraints that tools will be able to rely upon.

Let us know what you think in our forum.

We have been recently working a lot on the Bean Validation spec (JSR 303) and have two good news for you:

  • the bootstrap API proposal is out
  • the reference implementation is available too (end of this blog)

Please have a look at both and give us feedback. There is still time to make it better ;)

Bootstrap API

The primary goal of the bootstrap API is simple: provide access to a ValidatorFactory capable of building Validator instances. These validator instances are then used to validate the domain model.

Some additional goals have been pursued as well:

  • simple to use
  • type safe
  • extensible and flexible to the container environment (Java EE, Java SE, dependency injection framework, OSGi-esque containers and so on)
  • support multiple implementations


The best way to learn an API is to see it in action.

Everyday bootstrap

The first example shows the simplest and most common way.

ValidatorFactory factory = Validation.getValidatorBuilder().build();

//cache the factory somewhere
Validator<Address> addressValidator = factory.getValidator(Address.class);

This creates a thread-safe ValidatorFactory object that should be cached. In this example, the default Bean Validation provider is used unless an explicit provider implementation is defined in the Bean Validation XML configuration file.
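Caching the factory can be done with the classic initialization-on-demand holder idiom; the sketch below uses a stand-in ExpensiveFactory in place of the real ValidatorFactory:

```java
// Initialization-on-demand holder idiom: the factory is created lazily,
// exactly once, and published safely by class-initialization guarantees.
// ExpensiveFactory is a stand-in for the thread-safe ValidatorFactory.
class ExpensiveFactory {
    static int constructed = 0;            // counts constructions, for illustration
    ExpensiveFactory() { constructed++; }
}

class FactoryHolder {
    private FactoryHolder() {}
    private static class Lazy {
        static final ExpensiveFactory INSTANCE = new ExpensiveFactory();
    }
    static ExpensiveFactory get() { return Lazy.INSTANCE; }
}
```

Every caller then shares the same instance without explicit synchronization.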

Container driven bootstrap

The second example refines some of the configuration elements.

ValidatorFactory factory = Validation
          .getValidatorBuilder()
          .messageResolver( new WebBeansMessageResolver() )
          .constraintFactory( new WebBeansComponentConstraintFactory() )
          .build();

//cache the factory somewhere
Validator<Address> addressValidator = factory.getValidator(Address.class);

This example shows a typical bootstrap in a container. The container has the ability to refine the message resolution strategy to adhere to its standards. In the case of Web Beans, one can imagine the message resolver resolving contextual components and EL expressions.

A container is also responsible for the lifecycle of its components. A custom constraint implementation factory can be provided; it will be responsible for instantiating constraint implementations. In the case of Web Beans, one can imagine a constraint factory properly injecting components into the constraint implementation.

Some containers change the standard Java rules when it comes to classloader and resource discovery. It can make Bean Validation provider discovery challenging in such environments. To work around this problem, the specification lets a container override the provider resolution strategy. This strategy can be injected at bootstrap time.

//OSGi environment is picky when it comes to class loaders.
ValidatorFactory factory = Validation
          .getValidatorBuilder()
          .providerResolver( new OSGiServiceDiscoverer() )
          .build();

//cache the factory somewhere
Validator<Address> addressValidator = factory.getValidator(Address.class);

OSGiServiceDiscoverer knows the OSGi isolation rules and can resolve the list of available providers accordingly.

Specific provider bootstrap

The third example shows how to select a specific provider programmatically. Each provider is uniquely identified by a sub interface of ValidatorBuilder. Hibernate Validator for example provides org.hibernate.validator.HibernateValidatorBuilder.

//get Hibernate Validator provider
ValidatorFactory factory = Validation
       .builderType( HibernateValidatorBuilder.class )
       .build();

//cache the factory somewhere
Validator<Address> addressValidator = factory.getValidator(Address.class);

Using a specific sub interface of ValidatorBuilder as a unique identifier has another advantage: this sub interface can host provider specific configuration and still be called in a type safe way.

HibernateValidatorBuilder hibernateBuilder = Validation
       .builderType( HibernateValidatorBuilder.class );

ValidatorFactory factory = hibernateBuilder
          .messageResolver( new ContainerMessageResolver() ) //default configuration option
          .build();

Or written in a more compact way

ValidatorFactory factory = Validation
       .builderType( HibernateValidatorBuilder.class )
       .messageResolver( new ContainerMessageResolver() )
       .build();

Where HibernateValidatorBuilder looks like:

public interface HibernateValidatorBuilder extends ValidatorBuilder<HibernateValidatorBuilder> {
    HibernateValidatorBuilder disableDDLAlteration();
    HibernateValidatorBuilder enableConstraintHotRedeploy();
}

Note that nowhere did we need to downcast objects!

Main APIs

The main artifacts involved in the bootstrap process are:

Validation: API entry point. Lets you optionally define the Bean Validation provider targeted as well as a provider resolution strategy. Validation generates ValidatorBuilder objects and can bootstrap any provider implementation.

ValidationProvider: contract between the bootstrap procedure and a Bean Validation provider implementation.

ValidationProviderResolver: returns a list of all Bean Validation providers available in the execution context (generally the classpath).

ValidatorBuilder: collects the configuration details that will be used to build a ValidatorFactory. A specific sub interface of ValidatorBuilder must be provided by each Bean Validation provider as a unique identifier. This sub interface typically hosts provider specific configurations.

ValidatorFactory: result of the bootstrap process; builds Validator instances for a given Bean Validation provider. This object is thread-safe.


The bits of the specification describing the bootstrap process are available here[1] and the APIs are available as part of the reference implementation. Please provide feedback either in our forum or in the JIRA issue tracker (project component spec-general).

Reference implementation

The reference implementation is available in SVN. Simply do

svn checkout ri
to check it out, so to speak. The checkout comprises the validation API and the actual reference implementation. Both projects are Maven projects. They are distributed under the Apache Software License 2.0.

Many thanks to Hardy for taking over my bits and pieces of prototype, todos and complaints and making it an actual project. Hardy will probably write a blog entry sometime in the future about the RI architecture.

You can provide feedback to our forum. If you have suggestions or bug reports, a JIRA issue tracker project has also been created.

It is still a work in progress but is good enough to start playing with it. Most of the ideas behind the spec are already present, so have a look!

This is the third installment of a series of blog entries on the Bean Validation specification (JSR 303). If you have not done it yet, go read the first and second part.

The Bean Validation specification lets you define constraints on your domain model. In a couple of situations, validating a subset of constraints is needed.

In a wizard-style screen, the information is partially distilled over time from screen to screen. Validating data for each screen is useful but as a whole, the data is not ready for a complete constraint validation.

Some constraints need to run after others. There are several use cases for ordering constraints:

  • higher level constraints may expect data pre-validated by basic constraints before being triggered themselves
  • some constraints are expensive in CPU, memory, time or even money; running the cheaper constraints first and failing fast is desirable
  • some use cases need to validate additional or specific constraints not necessary for other use cases

To solve this class of problems, the Bean Validation specification uses the notion of group. Each constraint can define a set of groups it belongs to. When it comes to validate an object graph, one or several groups can be requested.
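The group selection mechanics can be modeled in a few lines. This is a self-contained sketch with string-based groups (as in this draft), not the actual specification API:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Set;

// Toy model of group-based constraint selection (string groups, as in this
// draft); not the javax.validation API.
class GroupedConstraint {
    final String name;
    final Set<String> groups;   // the groups this constraint belongs to
    GroupedConstraint(String name, Set<String> groups) {
        this.name = name;
        this.groups = groups;
    }
}

class GroupFilter {
    // Keep only the constraints belonging to at least one requested group.
    static List<String> select(List<GroupedConstraint> constraints, Set<String> requested) {
        List<String> selected = new ArrayList<>();
        for (GroupedConstraint c : constraints) {
            if (!Collections.disjoint(c.groups, requested)) {
                selected.add(c.name);
            }
        }
        return selected;
    }
}
```

Requesting the basic group would thus run only the constraints declared as belonging to basic, leaving the rest untouched.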

The principles

Let's first describe the theory behind groups

Declare a constraint as belonging to a group

Each constraint has a groups element whose default value is an empty array. When this element is not explicitly declared on a constraint annotation, the default group is assumed.

public class Address {
        //not null belongs to the "default" group
        @NotNull String street1;
}

Any constraint declaration can define one or several groups it belongs to. Note that default is the special default group.

public class Address {
        @NotNull(groups={"basic", "default"}) private String street1;
}

In this example, @NotNull is checked any time either the basic or default group is requested.
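To make the matching rule concrete, here is a toy sketch (my own code, not the JSR 303 API) of how a provider could decide whether a constraint applies for a set of requested groups, with the implicit default group standing in when no groups are declared:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class GroupMatch {
    /**
     * A constraint is checked when at least one requested group appears in
     * its declared groups. An empty declaration means the constraint
     * belongs to the "default" group.
     */
    public static boolean applies(Set<String> requested, Set<String> declared) {
        Set<String> effective = declared.isEmpty()
                ? new HashSet<>(Arrays.asList("default"))
                : declared;
        for (String group : requested) {
            if (effective.contains(group)) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        Set<String> notNullGroups = new HashSet<>(Arrays.asList("basic", "default"));
        // requesting "basic" triggers the constraint
        System.out.println(applies(new HashSet<>(Arrays.asList("basic")), notNullGroups));
        // requesting "oneclick" skips it
        System.out.println(applies(new HashSet<>(Arrays.asList("oneclick")), notNullGroups));
    }
}
```

A real provider of course works from annotation metadata rather than string sets; this only illustrates the intersection semantics.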

Defining constraint ordering

Ordering constraints for the sake of ordering execution does not make much sense, since the Bean Validation implementation ends up executing all the constraints and returning all the failures (for a given set of groups). However, it is useful to be able to avoid the execution of a constraint if another constraint fails.

The Bean Validation specification uses the notion of a group sequence. A group sequence declares a list of groups. Each group must be validated sequentially following the list order. Validating a group means validating all constraints of an object graph belonging to the group. If one or more constraints fail in a given group, the following groups in the group sequence are not processed.
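The stop-on-failure semantics can be sketched with a toy model (the types and method below are illustrative, not part of the specification): groups are processed in sequence order and validation aborts at the first group that reports failures:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

public class GroupSequenceSketch {
    /**
     * Validate an object group by group, following the sequence order.
     * If a group produces failures, the later groups are not processed.
     */
    public static <T> List<String> validateSequence(
            T object, Map<String, List<Predicate<T>>> checksByGroup, List<String> sequence) {
        for (String group : sequence) {
            List<String> failures = new ArrayList<>();
            for (Predicate<T> check : checksByGroup.getOrDefault(group, List.of())) {
                if (!check.test(object)) failures.add(group);
            }
            if (!failures.isEmpty()) return failures; // fail fast: skip later groups
        }
        return new ArrayList<>();
    }

    public static void main(String[] args) {
        Map<String, List<Predicate<String>>> checks = new LinkedHashMap<>();
        checks.put("basic", List.of(s -> s != null && !s.isEmpty()));
        // the "complex" (expensive) check never runs when "basic" fails
        checks.put("complex", List.of(s -> s.length() <= 50));

        // an empty street fails "basic", so "complex" is skipped entirely
        System.out.println(validateSequence("", checks, List.of("basic", "complex")));
    }
}
```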

A group sequence is declared with @GroupSequence and is applied on the class it is set on.

@GroupSequence(name="default", sequence={"basic", "complex"})
public class Address {
        @NotNull(groups="basic") private String street1;
}

In an object graph (@Valid), group sequences from the parent object are inherited by the child objects. @GroupSequences is used to declare more than one sequence per class. A group sequence can contain the name of another group sequence.

Validating a group

By default, the validate method validates the default group. You can decide to validate one or several groups:

validator.validate(address, "basic");
validator.validate(address, "firstscreen", "secondscreen");

The groups passed to the validate method are not processed in a particular order (if you need ordering, use a group sequence).

Practical uses

Enough theory, let's see how groups and group sequences solve our use cases.

Use case specific validation

Defining constraints based on specific use cases is sometimes useful. A typical example comes from reusing the same object model for different purposes, in different contexts.

In our example, an account can be in three states: default, canbuy and oneclick. A canbuy account has enough information to buy items; a oneclick account can buy in... ahem, one click.

public class Account {

        @NotEmpty @Length(max=50) private String firstname;
        @Length(max=50) private String middleName;
        @NotEmpty @Length(max=50) private String lastname;
        @NotNull(groups="canbuy") @Phone private String cellPhone;

        @Valid @NotEmpty(groups="canbuy")
        private Set<Address> addresses;

        @Valid @NotNull(groups="canbuy")
        private Address defaultAddress;
        @Valid @NotEmpty(groups="canbuy")
        private Set<PaymentMethod> paymentMethod;

        @Valid @NotNull(groups="oneclick")
        private PaymentMethod defaultPaymentMethod;
}

Let's dissect the previous example. A default account must have a first name and a last name. If it has addresses, payment methods or a phone number, each of these properties must be valid. However, these properties do not have to be filled in for the account to be valid.

An account can buy if it has a default address and at least one payment method. An account can buy in one click if it has a default payment method.

We could imagine the following logic:

Set<InvalidConstraint> accountIssues = validator.validate(account);
if ( accountIssues.size() > 0 ) {
        //push that error list to the user
}
else {
        if ( validator.validate(account, "canbuy").size() == 0 ) {
                //enable the buying operations
                if ( validator.validate(account, "oneclick").size() == 0 ) {
                        //enable the one-click operations
                }
        }
}

The data consistency is ensured by the default group, and some additional state checking is provided by the additional groups. Note that in an integrated world, your application framework or web framework would let you declaratively express which groups need to be validated in a given page context, page, conversation etc. The application then avoids the extra work of manually executing the validation.

Partial data validation: the wizard screen model

In a wizard-style user interface, data is provided partially and gets more and more complete as the user moves from one wizard screen to the next.

Let's take the previous example and let's imagine a wizard building a one click account from the ground up:

  • the first screen acquires the firstname, lastname and phone
  • the second screen adds one or several addresses
  • the third screen adds one or several payment methods
  • the fourth screen makes sure the user selects a default address and a default payment method and that everything is set up for a one click process
  • the fifth screen is a summary and makes sure that everything is in place
public class Account {

        @NotEmpty(groups={"firstscreen", "default"})
        @Length(max=50, groups={"firstscreen", "default"})
        private String firstname;
        @NotEmpty(groups={"firstscreen", "default"})
        @Length(max=50, groups={"firstscreen", "default"})
        private String lastname;
        @NotNull(groups={"firstscreen", "canbuy"})
        @Phone(groups={"firstscreen", "default"})
        private String cellPhone;

        @Valid @NotEmpty(groups={"secondscreen", "canbuy"})
        private Set<Address> addresses;

        @Valid @NotNull(groups={"fourthscreen", "canbuy"})
        private Address defaultAddress;
        @Valid @NotEmpty(groups={"thirdscreen", "canbuy"})
        private Set<PaymentMethod> paymentMethod;

        @Valid @NotNull(groups={"fourthscreen", "oneclick"})
        private PaymentMethod defaultPaymentMethod;
}


For each screen but the last, a specific group has been set up. The last screen needs to ensure an account is valid for a oneclick operation: it then needs to validate the default, canbuy and oneclick groups.

While the group validation can be called programmatically, we expect application frameworks and web frameworks to let you define the targeted groups declaratively.

In JSF, it could look like:

<s:validateAll groups="firstscreen">
    <h:inputText id="firstname" value="#{account.firstname}" />
    <h:message for="firstname" styleClass="error" />
    <h:inputText id="lastname" value="#{account.lastname}" />
    <h:message for="lastname" styleClass="error" />
    <h:inputText id="cellPhone" value="#{account.cellPhone}" />
    <h:message for="cellPhone" styleClass="error" />
</s:validateAll>

This is just a proposal / vision. This integration is not part of the JSR 303 specification.

Constraint ordering

Another use case is the ability to stop constraint validation if a group of constraints fails. It is quite useful in (at least) two situations:

  • some constraint validations (especially class-level constraints) expect to work on pre-validated data (for example data pre-validated for nullability, length and global pattern matching): the basic-level constraints must be valid before a higher-level constraint check is called
  • some constraints are expensive to run (long processing, accessing an external resource etc.)

Validating an address can be fairly simple (a few basic constraints), especially when targeting one country. It can also be much more complex and involve address coherence checking (the zipcode must match the city, the street name must match the zipcode and so on). The address coherence check cannot really be applied until we are sure the basic constraints are valid, as it expects pre-normalized data. On top of that, our application has to call an external service to check the coherence. And it turns out calling this service has two drawbacks:

  • it is a bit slow (at least slower than the other constraint validations)
  • it costs a fee per execution

We want to make sure the address coherence constraint is not called unless the other constraints are valid:

@GroupSequence(name="default", sequence={"basic", "complex"})
public class Address {
    @NotNull(groups="basic") @Max(value=50, groups="basic") private String street1;
    @Max(value=50, groups="basic") private String street2;
    @Max(value=10, groups="basic") @NotNull(groups="basic") private String zipCode;
    @Max(value=20, groups="basic") @NotNull(groups="basic") String city;
    @NotNull(groups="basic") private Country country;
}

The group sequence will first validate the constraints marked as basic, then validate the constraints marked as complex unless one (or more) constraint from the basic group failed.


In most cases, the default group will suffice, but more complex use cases need additional flexibility in constraint definition and validation. While declaring groups is the application developer's responsibility, validating a given set of groups will most likely be driven by the client framework (application frameworks, web frameworks and, to a certain extent, persistence frameworks). Either by using method annotations or template declarations, frameworks like Web Beans or JSF will capture the requested constraint validation groups and execute the validation calls for the application.

This part of the specification is probably the most controversial. The current proposal tries to achieve several goals:

  • embrace the declarative model
  • stay as simple and understandable as possible
  • provide enough flexibility to match most use cases

The expert group is seeking feedback in several forms on the groups approach:

  • use cases: describe your use cases, see how (or if) they can be addressed by the current proposal
  • enhancements: propose enhancements on the groups concept to make it more flexible or more simple
  • alternatives: you think you have the perfect solution? go describe it, let's see how it fits the model

We encourage you to read the specification and provide feedback and comments on this forum.

This is the second part of a series of blogs about Bean Validation. For a general introduction, read this entry first. This part focuses on constraint definition.

While Bean Validation will come with a set of predefined basic constraints (like @NotNull, @Length and so on), a key feature of the specification is its extensibility. Application developers can and are strongly encouraged to write their own custom constraints matching a particular business requirement.

Writing a custom constraint

Because writing custom constraints is a core goal of the specification, great care has been taken to make it as simple as possible. Let's walk through the process of creating a custom constraint.

As we have seen in the previous blog entry, a constraint is composed of

  • an annotation
  • an implementation

Constraint annotation

Every constraint is associated to an annotation. You can see it as a type-safe alias and descriptor. Constraint annotations can also hold one or several parameters that help customize the behavior at declaration time:

public class Order {
    @NotNull @OrderNumber private String number;
    @Range(min=0) private BigDecimal totalPrice;
}

Let's have a look at the @OrderNumber annotation definition:

@ConstraintValidator(OrderNumberValidator.class)
@Retention(RUNTIME)
@Target({METHOD, FIELD})
public @interface OrderNumber {
    String message() default "{error.orderNumber}";
    String[] groups() default {};
}

Constraint annotations are just regular annotations with a few extra things:

  • they must use the runtime retention policy: the bean validation provider will inspect your objects at runtime
  • they must be annotated with @ConstraintValidator
  • they must have a message attribute
  • they must have a groups attribute

The @ConstraintValidator annotation indicates to the Bean Validation provider that an annotation is a constraint annotation. It also points to the constraint validation implementation routine (we will describe that in a minute).

The message attribute (which generally is defaulted to a key) provides the ability for a constraint declaration to override the default message returned in the constraint error list. We will cover this particular topic in a later post.

groups lets a constraint declaration define the subset of groups it participates in. Groups enable partial validation and ordered validation. We will cover this particular topic in a later post.

In addition to these mandatory attributes, an annotation can define any additional element to parameterize the constraint logic. The set of parameters is passed to the constraint implementation. For example, a @Range annotation needs min and max attributes.

@Target({METHOD, FIELD})
public @interface Range {
        long max() default Long.MAX_VALUE;

        long min() default Long.MIN_VALUE;

        String message() default "{error.range}";
        String[] groups() default {};
}

Now that we have a way to express a constraint and its parameters, we need to provide the logic to validate the constraint.

Constraint implementation

The constraint implementation is associated to its annotation through the use of @ConstraintValidator. In the first early draft, @ValidatorClass is sometimes used in lieu of @ConstraintValidator: this is a mistake driven by a last-minute change, sorry. The implementation must implement a very simple interface, Constraint<A extends Annotation>, where A is the targeted constraint annotation:

public class OrderNumberValidator implements Constraint<OrderNumber> {
        public void initialize(OrderNumber constraintAnnotation) {
                //no initialization needed
        }

        /**
         * Order numbers are of the form Nnnn-nnn-nnn where n is a digit.
         * The sum of the nnn numbers must be a multiple of 3
         */
        public boolean isValid(Object object) {
                if ( object == null ) return true;
                if ( ! (object instanceof String) )
                        throw new IllegalArgumentException("@OrderNumber only applies to String");
                String orderNumber = (String) object;
                if ( orderNumber.length() != 12 ) return false;
                if ( orderNumber.charAt( 0 ) != 'N'
                                || orderNumber.charAt( 4 ) != '-'
                                || orderNumber.charAt( 8 ) != '-' ) return false;
                try {
                        long result = Integer.parseInt( orderNumber.substring( 1, 4 ) )
                                        + Integer.parseInt( orderNumber.substring( 5, 8 ) )
                                        + Integer.parseInt( orderNumber.substring( 9, 12 ) );
                        return result % 3 == 0;
                }
                catch (NumberFormatException nfe) {
                        return false;
                }
        }
}

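To get a feel for the routine, here is a self-contained sketch of the same checks outside any validation framework (the class and method names are mine, not part of the draft):

```java
public class OrderNumberCheck {
    /** Same rules as the validator: Nnnn-nnn-nnn, digit sum a multiple of 3. */
    public static boolean isValidOrderNumber(String orderNumber) {
        if (orderNumber == null) return true; // nullability is @NotNull's job
        if (orderNumber.length() != 12) return false;
        if (orderNumber.charAt(0) != 'N'
                || orderNumber.charAt(4) != '-'
                || orderNumber.charAt(8) != '-') return false;
        try {
            long result = Integer.parseInt(orderNumber.substring(1, 4))
                    + Integer.parseInt(orderNumber.substring(5, 8))
                    + Integer.parseInt(orderNumber.substring(9, 12));
            return result % 3 == 0;
        }
        catch (NumberFormatException nfe) {
            return false;
        }
    }

    public static void main(String[] args) {
        // 123 + 456 + 789 = 1368, a multiple of 3
        System.out.println(isValidOrderNumber("N123-456-789"));
        // 123 + 456 + 788 = 1367, not a multiple of 3
        System.out.println(isValidOrderNumber("N123-456-788"));
    }
}
```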
The initialize method receives the constraint annotation as a parameter. This method typically:

  • prepares parameters for the isValid method
  • acquires external resources if needed

As you can see the interface entirely focuses on validation and leaves other concerns such as error rendering to the bean validation provider.

isValid is responsible for validating a value. A few interesting things can be noted:

  • isValid must support concurrent calls
  • an exception should be raised when the type of the object received does not match the validation implementation's expectations
  • the null value is not considered invalid: the specification recommends splitting the core constraint validation from the not-null constraint validation, and using @NotNull if the property must not be null
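A hypothetical sketch (names are mine, not from the specification) of how the null convention lets a core check and a nullability check compose cleanly:

```java
public class NullConvention {
    /** Core constraint: null is valid; the length rule only applies to actual values. */
    public static boolean maxLength5(String value) {
        if (value == null) return true; // nullability is @NotNull's job
        return value.length() <= 5;
    }

    /** Separate nullability constraint, playing the role of @NotNull. */
    public static boolean notNull(String value) {
        return value != null;
    }

    public static void main(String[] args) {
        // optional field: only the core constraint is declared
        System.out.println(maxLength5(null));
        // mandatory field: both checks are declared, and null now fails
        System.out.println(notNull(null) && maxLength5(null));
    }
}
```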

This easily customizable approach gives application programmers the freedom they need to express constraints and validate them.

Applying the same constraint type multiple times

Especially when using groups, you will sometimes need to apply the same type of constraint multiple times to the same element. For this, the Bean Validation specification supports annotations containing an array of constraint annotations:

@Target({METHOD, FIELD})
public @interface Patterns {
        Pattern[] value();
}

@Target({METHOD, FIELD})
public @interface Pattern {
        /** regular expression */
        String regex();

        /** regular expression processing flags */
        int flags() default 0;

        String message() default "{validator.pattern}";
        String[] groups() default {};
}

In this example, you can apply multiple patterns to the same property:

public class Engine {
        @Patterns( {
            @Pattern(regex = "^[A-Z0-9-]+$", message = "must contain alphabetical characters only"),
            @Pattern(regex = "^....-....-....$", message = "must match ....-....-....")
        } )
        private String serialNumber;
}
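The two pattern declarations translate directly to java.util.regex, and every declared pattern must match for the property to be valid. A standalone sketch of these ANDed semantics (the sample serial number is invented for illustration):

```java
import java.util.regex.Pattern;

public class SerialNumberCheck {
    // same expressions as the two @Pattern declarations
    private static final Pattern CHARSET = Pattern.compile("^[A-Z0-9-]+$");
    private static final Pattern SHAPE   = Pattern.compile("^....-....-....$");

    /** The property is valid only when every declared pattern matches. */
    public static boolean isValidSerialNumber(String serial) {
        if (serial == null) return true; // nullability is @NotNull's job
        return CHARSET.matcher(serial).matches() && SHAPE.matcher(serial).matches();
    }

    public static void main(String[] args) {
        // matches both the character set and the dashed shape
        System.out.println(isValidSerialNumber("AB12-CD34-EF56"));
        // lowercase letters fail the first pattern, so the whole check fails
        System.out.println(isValidSerialNumber("ab12-cd34-ef56"));
    }
}
```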

Building constraints

By default, Bean Validation providers instantiate constraint validation implementations using a no-arg constructor. However, the specification offers an extension point to delegate the instantiation process to a dependency management library such as Web Beans, Guice, Spring, JBoss Seam or even the JBoss Microcontainer.

Depending on the capacity of the dependency management tool, we expect validation implementations to be able to receive injected resources if needed: this mechanism will be entirely dependent on the dependency management tool.

Class-level constraints

Some of you have expressed concerns about the ability to apply a constraint spanning multiple properties, or to express a constraint which depends on several properties. The classical example is address validation. Addresses have intricate rules:

  • a street name is somewhat standard and must certainly have a length limit
  • the zip code structure entirely depends on the country
  • the city can often be correlated with a zipcode and some error checking can be done (provided that a validation service is accessible)
  • because of these interdependencies, a simple property-level constraint does not fit the bill

The solution offered by the Bean Validation specification is two-fold:

  • it offers the ability to force a set of constraints to be applied before another set of constraints, through the use of groups and group sequences. This subject will be covered in the next blog entry
  • it allows the definition of class-level constraints

Class level constraints are regular constraints (annotation / implementation duo) which apply on a class rather than a property. Said differently, class-level constraints receive the object instance (rather than the property value) in isValid.

public class Address {
    @NotNull @Max(50) private String street1;
    @Max(50) private String street2;
    @Max(10) @NotNull private String zipCode;
    @Max(20) @NotNull String city;
    @NotNull private Country country;
}

@Target(TYPE)
@ConstraintValidator(MultiCountryAddressValidator.class)
public @interface Address {
    String message() default "{error.address}";
    String[] groups() default {};
}

public class MultiCountryAddressValidator implements Constraint<Address> {
        public void initialize(Address constraintAnnotation) {
                //initialize the zipcode/city/country correlation service
        }

        /**
         * Validate zipcode and city depending on the country
         */
        public boolean isValid(Object object) {
                if ( ! (object instanceof Address) )
                        throw new IllegalArgumentException("@Address only applies to Address");
                Address address = (Address) object;
                Country country = address.getCountry();
                if ( country.getISO2().equals("FR") ) {
                    //check address.getZipCode() structure for France (5 numbers)
                    //check zipcode and city correlation (calling an external service?)
                    return isValid;
                }
                else if ( country.getISO2().equals("GR") ) {
                    //check address.getZipCode() structure for Greece
                    //no zipcode / city correlation available at the moment
                    return isValid;
                }
                return true;
        }
}


The advanced address validation rules have been left out of the address object and implemented by MultiCountryAddressValidator. By accessing the object instance, class level constraints have a lot of flexibility and can validate multiple correlated properties. Note that ordering is left out of the equation here, we will come back to it in the next post.
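As a toy illustration of such a country-dependent correlation (rules deliberately simplified and invented here, not from the specification; France does use 5-digit codes), the structure check could look like:

```java
public class ZipCodeRules {
    /** Simplified country-dependent zipcode structure check. */
    public static boolean isZipCodeValid(String iso2Country, String zipCode) {
        if (iso2Country == null || zipCode == null) return true; // nullability handled elsewhere
        if (iso2Country.equals("FR")) {
            return zipCode.matches("\\d{5}"); // France: exactly 5 digits
        }
        // no structure rule known for other countries: do not reject
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isZipCodeValid("FR", "75015")); // a Paris zipcode, valid
        System.out.println(isZipCodeValid("FR", "ABCDE")); // not digits, invalid
    }
}
```

A real implementation would also consult the external correlation service mentioned above, which is exactly why the next post puts such checks behind a group sequence.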

The expert group has discussed various multiple properties support approaches: we think the class level constraint approach provides both enough simplicity and flexibility compared to other property level approaches involving dependencies. Your feedback is welcome.


Custom constraints are at the heart of JSR 303 Bean Validation's flexibility. Writing a custom constraint should not be considered awkward:

  • the validation routine captures the exact validation semantic you expect
  • a carefully chosen annotation name will make constraints extremely readable in the code

Please let us know what you think here. You can download the full specification draft there. The next blog entry will cover groups, constraints subsets and validation ordering.
