Monday, July 09, 2018

Mario Fusco is the new Drools Project Lead

It is my honor to announce that Mario Fusco will be taking over as the new Drools Project Lead.

Mario is a Principal Software Engineer at Red Hat, working on the development of the Drools core. He has vast experience as a Java developer, and among other accomplishments, was nominated a Java Champion in 2016. Mario previously created and led the open source Lambdaj project and has been involved in (and often leads) many enterprise-level projects in several industries ranging from media companies to the financial sector.

Mario is a frequent speaker at conferences such as Red Hat Summit, Devoxx, Voxxed, JavaOne, LambdaWorld and others. Mario has authored several articles for InfoQ and DZone, and co-authored the “Java 8 in Action” book published by Manning. His Twitter following is another hint at his popularity, and if you would like to keep up with his latest insights, I suggest you hit the follow button.

Mario joined Red Hat in 2011 to work on the core engine of Drools and has since made invaluable contributions, including the development and improvement of the latest core algorithm. Among his interests are also functional programming and domain specific languages.

If you ever have the opportunity to interact with him in person, you will experience firsthand how nice of a person he is and how pleasant it is to have a talk with him. You can even offer him a beer, he will like that, but whatever you do, make sure you follow proper Italian etiquette (or is that Mario’s etiquette?): no pineapple on your pizza and no cappuccinos after meals.

Please join me in congratulating Mario on his new role as the Drools Project lead.

Edson Tirelli


Wednesday, April 18, 2018

The DMN Cookbook has been published

The Decision Model and Notation (DMN) standard offers something no previous attempt at standardizing decision modelling did: a simple, graphical, effective language for the documentation and modelling of business decisions. It defines both the syntax and the semantics of the model, allowing IT and business teams to "speak the same language". It also ensures interoperability between vendor tools that support the standard, protecting customers' investment and IP.

It was an honour to work with accomplished author Bruce Silver to write the "DMN Cookbook", a book that explains the features of the standard through examples, showing solutions to real modelling problems. It discusses what DMN offers that is different from traditional rule authoring languages, as well as how to leverage its features to create robust solutions.

Topics covered include:

  • What is DMN?
  • How DMN differs from traditional rule languages
  • DMN Basics
    • DRG elements and DRDs
    • Decision tables and other boxed expressions
    • FEEL
  • Decision services
  • Practical examples
    • Uniform Residential Loan Application: validation, handling null values, handling XML input
    • GSE Mortgage Eligibility: variations using a central registry
    • Canadian Sales Tax: variations without a central registry (dynamic and static composition)
    • Timing the Stock Market: modeling a state chart with DMN
    • Land Registry: DMN-enhanced Smart Contract
    • Decision Service Deployment: automated and manual
    • Decision Service Orchestration: BPMN or Microsoft Flow

More information on the book website.

Available on Amazon.


bpmNEXT 2018 day 1 videos are already online!

The organizers of bpmNEXT 2018 are outdoing themselves! The videos from the first day of the conference are already available.

In particular, the presentations from Denis Gagné, Bruce Silver and Edson Tirelli are directly related to Drools with content related to DMN. I also recommend the presentation from Vanessa Bridge, as it is related to BPM and the research we've been doing on blockchain.

Smarter Contracts with DMN: Edson Tirelli, Red Hat

Timing the Stock Market with DMN: Bruce Silver

Decision as a Service (DaaS): The DMN Platform Revolution: Denis Gagné, Trisotech

Secure, Private, Decentralized Business Processes for Blockchains: Vanessa Bridge, ConsenSys

The Future of Process in Digital Business: Jim Sinur, Aragon Research

A New Architecture for Automation: Neil Ward-Dutton, MWD Advisors

Turn IoT Technology into Operational Capability: Pieter van Schalkwyk, XMPro

Business Milestones as Configuration: Joby O'Brien and Scott Menter, BPLogix

Designing the Data-Driven Company: Elmar Nathe, MID GmbH

Using Customer Journeys to Connect Theory with Reality: Till Reiter and Enrico Teterra, Signavio

Discovering the Organizational DNA: Jude Chagas Pereira, IYCON; Frank Kowalkowski, KCI



Monday, February 26, 2018

The Drools Executable Model is alive


The purpose of the executable model is to provide a pure Java-based representation of a rule set, together with a convenient Java DSL to programmatically create such a model. The model is low level and designed for the user to provide all the information it needs, such as the lambdas used for index evaluation. This keeps it fast and avoids building too many assumptions into this level. It is expected that higher-level, more end-user-focused representations can be layered on top of it in the future. This work also nicely complements the work on rule units, which provides a Java-oriented way to define data and control orchestration.


This model is generic enough to be independent from Drools, but it can be compiled into a plain Drools knowledge base. For this reason the implementation of the executable model has been split into two subprojects:
  1. drools-canonical-model provides the canonical representation of a rule set model, totally independent from Drools
  2. drools-model-compiler compiles the canonical model into the Drools internal data structures, making it executable by the engine
The introduction of the executable model brings a set of benefits in different areas:
  • Compile time: in Drools 6 a kjar contained the drl files and the other Drools artifacts defining the rule base, together with some pre-generated classes implementing the constraints and the consequences. Those drl files had to be parsed and compiled from scratch when the kjar was downloaded from the Maven repository and installed in a KieContainer, making this process quite slow, especially for large rule sets. It is now possible instead to package inside the kjar the Java classes implementing the executable model of the project's rule base, and to recreate the KieContainer and its KieBases out of them in a much faster way. The kie-maven-plugin automatically generates the executable model sources from the drl files during the compilation process.
  • Runtime: in the executable model all constraints are defined as Java lambda expressions. The same lambdas are also used for constraint evaluation, which makes it possible to get rid of both mvel-based interpreted evaluation and the jitting process that transformed the mvel-based constraints into bytecode and caused a slow warm-up.
  • Future research: the executable model will make it possible to experiment with new rule engine features without having to encode them in the drl format and modify the drl parser to support them.

Executable Model DSLs

One goal while designing the first iteration of the DSL for the executable model was to get rid of the notion of a pattern and to consider a rule as a flow of expressions (constraints) and actions (consequences). For this reason we called it the Flow DSL. Some examples of this DSL are available here.
However, after having implemented the Flow DSL, it became clear that avoiding the explicit use of patterns obliged us to implement some extra logic that had both a complexity and a performance cost: in order to properly recreate the data structures expected by the Drools compiler, it is necessary to reassemble the patterns out of those apparently unrelated expressions.
For this reason we decided to reintroduce patterns in a second DSL that we called the Pattern DSL. This makes it possible to bypass the expression-grouping algorithm, which had to fill an artificial semantic gap and was also time consuming at runtime.
We believe both DSLs are valid for different use cases, so we decided to keep and support both. In particular, the Pattern DSL is safer and faster (even if more verbose), so this is the DSL that will be automatically generated when creating a kjar through the kie-maven-plugin. Conversely, the Flow DSL is more succinct and closer to the way a user may want to programmatically define a rule in Java, and we plan to make it even less verbose by automatically generating, through a post-processor, the parts of the model defining indexing and property reactivity. In other terms, we expect the Pattern DSL to be written by machines and the Flow DSL, eventually, by humans.

Programmatic Build

As shown by the test cases linked in the previous section, it is possible to programmatically define one or more rules in Java and then add them to a Model with a fluent API:

Model model = new ModelImpl().addRule( rule );

Once you have this model, which as explained is totally independent from Drools algorithms and data structures, it is possible to create a KieBase out of it as follows:

KieBase kieBase = KieBaseBuilder.createKieBaseFromModel( model );
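Putting the two steps together, a minimal end-to-end sketch (the rule variable is assumed to have been built with one of the two DSLs beforehand, and Person is a hypothetical fact class, not part of the API):

```java
// Wrap the programmatically built rule in a canonical model,
// compile it into a KieBase and open a session on it.
Model model = new ModelImpl().addRule( rule );
KieBase kieBase = KieBaseBuilder.createKieBaseFromModel( model );
KieSession ksession = kieBase.newKieSession();

// Insert a fact and fire the rules as with any other KieSession.
ksession.insert( new Person( "Mark", 37 ) );
ksession.fireAllRules();
ksession.dispose();
```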

Alternatively, it is also possible to create an executable-model-based kieproject starting from plain drl files, adding them to a KieFileSystem as usual:

KieServices ks = KieServices.Factory.get();
KieFileSystem kfs = ks.newKieFileSystem()
                      .write( "src/main/resources/r1.drl", createDrl( "R1" ) );
KieBuilder kieBuilder = ks.newKieBuilder( kfs );

and then building the project using a new overload of the buildAll() method that accepts a class specifying which kind of project you want to build:

kieBuilder.buildAll( ExecutableModelProject.class );

Doing so, the KieBuilder will generate the executable model (based on the Pattern DSL), and the resulting KieSession

KieSession ksession = ks.newKieContainer( ks.getRepository().getDefaultReleaseId() ).newKieSession();

will work with lambda-expression-based constraints as described in the first section of this document. In the same way it is also possible to generate the executable model from the Flow DSL by passing a different project class to the KieBuilder:

kieBuilder.buildAll( ExecutableModelFlowProject.class );

but, for the reasons explained when discussing the two DSLs, it is better to use the pattern-based one for this purpose.

Kie Maven Plugin

In order to generate a kjar embedding the executable model using the kie-maven-plugin, it is necessary to add the dependencies for the two formerly mentioned subprojects, implementing the model and its compiler, to the pom.xml file:
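The dependency snippet did not survive extraction; a minimal sketch, assuming the standard org.drools group id and a ${drools.version} property defined elsewhere in the pom, would be:

```xml
<dependencies>
  <!-- Canonical representation of the rule set model -->
  <dependency>
    <groupId>org.drools</groupId>
    <artifactId>drools-canonical-model</artifactId>
    <version>${drools.version}</version>
  </dependency>
  <!-- Compiles the canonical model into Drools internal data structures -->
  <dependency>
    <groupId>org.drools</groupId>
    <artifactId>drools-model-compiler</artifactId>
    <version>${drools.version}</version>
  </dependency>
</dependencies>
```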


also add the plugin to the plugin section
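This snippet is also missing; a sketch of the plugin section, assuming the usual org.kie coordinates for the kie-maven-plugin and the same ${drools.version} placeholder, would be:

```xml
<build>
  <plugins>
    <!-- kie-maven-plugin packages the project as a kjar;
         extensions must be enabled for the kjar packaging type -->
    <plugin>
      <groupId>org.kie</groupId>
      <artifactId>kie-maven-plugin</artifactId>
      <version>${drools.version}</version>
      <extensions>true</extensions>
    </plugin>
  </plugins>
</build>
```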

An example of a pom.xml file already prepared to generate the executable model is available here. By default the kie-maven-plugin still generates a drl-based kjar, so it is necessary to run the plugin with the following argument:
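The argument itself is missing here. In the kie-maven-plugin of the Drools 7 era the switch was a build property passed on the command line; treat the exact property name as an assumption if you are on a different version:

```shell
mvn clean install -DgenerateModel=<VALUE>
```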

Where <VALUE> can be one of three values:


Both YES and WITHDRL will generate the Java classes implementing the executable model corresponding to the drl files in the original project and add them to the kjar. The difference is that the first excludes the drl files from the generated kjar, while the second also includes them. In this second case, however, the drl files play only a documentation role, since the KieBase will be built from the executable model regardless.

Future developments

As anticipated, one of the next goals is to make the DSLs, especially the Flow one, more user friendly, in particular by generating with a post-processor all the parts that can be automatically inferred, like the ones related to indexes and property reactivity.
Orthogonally to the executable model, we improved the modularity and orchestration of rules, especially through the work done on rule units. This focus on pojo-ification complements this direction of research around pure Java DSLs, and we already have a few simple examples of how the executable model and rule units can be mixed for this purpose.


Wednesday, February 07, 2018

Running multi Workbench modules on the latest IntelliJ Idea with live reloading (client side)

NOTE: The instructions below apply only to the old version of the gwt-maven-plugin

At some point in the past, IntelliJ released an update that made it impossible to run the Workbench using the GWT plugin. After exchanging ideas with people on the team and summing up solutions, some workarounds have emerged. This guide provides information on running any Errai-based application in the latest version of IntelliJ, along with other modules, to take advantage of IntelliJ's (unfortunately limited) live-reloading capabilities and speed up the development workflow.

Table of contents

1. Running Errai-based apps in the latest IntelliJ
2. Importing other modules and using live reload for client side code
3. Advanced configurations
3.1. Configuring your project's pom.xml to download and unpack Wildfly for you
3.2. Alternative workaround for non-patched Wildfly distros

1. Running Errai-based apps in the latest IntelliJ

As Max Barkley described on #logicabyss a while ago, IntelliJ has decided to hardcode gwt-dev classes into the classpath when launching Super Dev Mode in the GWT plugin. Since we're using the EmbeddedWildflyLauncher to deploy the Workbench apps, these dependencies are now deployed inside our Wildfly instance. Nothing too wrong with that, except that the gwt-dev jar depends on apache-jsp, which has a ServletContainerInitializer marker file that causes the deploy to fail.

To solve that issue, the code that looks for the ServletContainerInitializer file and fails the deploy was removed in custom patched versions of Wildfly, which are available in Maven Central under the org.jboss.errai group id.

The following steps provide a quick guide to running any Errai-based application on the latest version of IntelliJ.

1. Download a patched version of Wildfly and unpack it into any directory you like
- For Wildfly 11.0.0.Final go here

2. Import the module you want to work on (I tested with drools-wb)
  - Open IntelliJ, go to File -> Open.. and select the pom.xml file, hit Open then choose Open as Project

3. Configure the GWT plugin execution like you normally would on previous versions of IntelliJ

- VM Options:

- Dev Mode parameters:
  -server org.jboss.errai.cdi.server.gwt.EmbeddedWildFlyLauncher

4. Hit the Play button and wait for the application to be deployed

2. Importing other modules and using live reload for client side code

After being able to run a single webapp inside the latest version of IntelliJ, it might be very useful to import some of its dependencies as well, so that after changing client code in one of those dependencies, you don't have to wait (way) too long for GWT to compile and bundle your application's JavaScript code again.

Simply go to File > New > Module from existing sources.. and choose the pom.xml of the module you want to import.
If you have kie-wb-common or appformer imported alongside another project, you'll most certainly have to apply a patch to the beans.xml file of your webapp.

For drools-wb you can download the patch here. For other projects such as jbpm-wb, optaplanner-wb or kie-wb-distributions, you'll have to essentially do the same thing, but changing the directories inside the .diff file.

If your webapp is up, hit the Stop button and then hit Play again. Now you should be able to re-compile any code changed inside IntelliJ much faster.

3.1. Configuring your project's pom.xml to download and unpack Wildfly for you

If you are used to a less manual workflow, you can use the maven-dependency-plugin to download and unpack a Wildfly instance of your choice to any directory you like.

After you've added the snippet below to your pom.xml file, remember to add a "Run Maven Goal" step before the build of your application in the "Before launch" section of your GWT configuration. Here I'm using the process-resources phase, but other phases are OK too.

              <!-- Using a patched version of Wildfly -->
              <!-- Unpacking it into /target/wildfly-11.0.0.Final -->
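The configuration itself was lost in extraction; a sketch using the standard unpack goal of the maven-dependency-plugin is shown below. The patched Wildfly's artifactId and packaging type are assumptions; only the org.jboss.errai group id is stated in section 1.

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack-wildfly</id>
      <!-- Bound to process-resources, as suggested above -->
      <phase>process-resources</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <!-- Patched Wildfly published under org.jboss.errai (see section 1);
                 artifactId and type are placeholders -->
            <groupId>org.jboss.errai</groupId>
            <artifactId>wildfly-dist</artifactId>
            <version>11.0.0.Final</version>
            <type>zip</type>
            <!-- Unpacks into target/wildfly-11.0.0.Final -->
            <outputDirectory>${project.build.directory}</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
```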

3.2. Alternative workaround for non-patched Wildfly distros

If you want to try a different version of Wildfly, or if you simply don't want to depend on any patched versions, you can still use official distros and exclude the ServletContainerInitializer file from the apache-jsp jar in your M2_REPO folder.

If you're working on a Unix system, the following commands should do the job.

1. cd ~/.m2/repository/

2. zip -d org/eclipse/jetty/apache-jsp/{version}/apache-jsp-{version}.jar META-INF/services/javax.servlet.ServletContainerInitializer

By excluding it manually from the apache-jsp jar, Maven won't try to download it again after you remove the file. That makes this workaround permanent as long as you don't erase your ~/.m2/ folder. Keep in mind that if you ever need the apache-jsp jar to have this file back, the best option is to delete the apache-jsp dependency directory and let Maven download it again.

New instructions for the new version of the gwt-maven-plugin are on their way, stay tuned!