Saturday, December 29, 2012

Rolo The Robot says: Happy New Year!


Hi Everyone! This is probably my last post of the year, and on this occasion I want to share with you all a side project that I've been working on in my spare time with my father. In less than 10 days we managed to build a small robot which runs all its logic inside the Drools Rule Engine. The main goal of this first step was to prove that it is possible to build a low-cost robot entirely controlled by the rule engine and the CEP features provided by Drools Fusion. We didn't include any process execution in this first stage, but it is definitely on the roadmap. This post briefly explains what we have working now and what's planned for the future, because this is just the beginning. The interesting side of the project is making the robot completely autonomous, which means that it runs entirely on its own, without needing to be connected to a computer or to a power outlet in the wall.


Introduction


We started building this first prototype with the following goals in mind:
  • Demonstrate that the Drools & jBPM Platform can help us to build a reliable and declarative environment to code the robot internal knowledge. 
  • Demonstrate that a robot can be constructed on top of the Rule & Process Engine in a reduced and portable platform. Some important points from our perspective are:
    • It needs to run without an external computer
    • It needs to be autonomous and run on batteries to have freedom of mobility
    • It can be monitored and contacted via wireless
  • It needs to react in near real time and process information without long delays. We are doing the tests with 100-millisecond lapses now (more than enough at this stage), but performance can be improved to support lower latency.
  • Test different hardware options to decide which are the best components to use to build different types of robots
  • Push the limits and mix the Rule/Process Engine arena with the Hardware/Electronics/Robotics arena.
  • Incrementally build a framework to speed up the initial steps
  • and of course, make it open source to improve collaboration and to join forces with other people who are interested in the same topics.


Hardware


Rolo!
My father (Jose Salatino), "the electronic geek", helped me with the whole hardware side of the robot. I started by looking at the Lego NXT and WEDO platforms to see if we could reuse some of the cool things that they have designed, but the NXT runs J2ME and it's old at this point. I'm looking forward to seeing if they release something new soon. The Lego WEDO and Power Functions look promising, but they have several limitations: only a reduced number of devices can be handled via the USB port, and there are some expensive pieces that you need to buy to make it work. When my father started playing with Arduino we found a lot of advantages which helped us get everything working in almost no time. On the other hand, the Lego motors & sensors provide a great and scalable environment to create advanced prototypes. For that reason, we chose to use Arduino as a central hub to control a set of sensors, motors and actuators, no matter whether they are Lego or not.
The following figure shows the wiring between the different components that we are using, most of the components can be changed without affecting the software architecture:
Hardware Wiring

This list summarizes the components that we are using; please note that there are a lot of things to improve, so this infrastructure is in no way set in stone:
  • 1 x Raspberry Pi Model B
  • 1 x Arduino Uno
  • 2 x Lego NXT Servo Motor
  • 1 x SR04 Ultra Sonic Sensor (Distance Sensor)
  • 1 x SG90 180 Servo Motor
  • 2 Battery Packs (10 AA batteries) -> we are working on this, don't worry ;)
  • 1 x USB Wireless Dongle
  • 1 x LDR Sensor (Light Sensor)


Hardware Roadmap


From the hardware perspective there are a lot of things to do. We will start by researching the I2C protocol to replace all the serial communications. We know that I2C is the way to go, but we haven't had time yet to do all the necessary tests. We currently have a hardware/physical limitation on the number of devices that we can set up. We want to push the platform's limits, so we will be looking to add more motors and sensors to increase the robot's complexity and see how far we can go.


Software


From the software perspective we have a bunch of things to solve, but this section gives a quick overview of what has been done until now. We need to understand that the Raspberry Pi is not a PC; it's an ARM machine, which is a completely different infrastructure. For Java that's not supposed to be a problem, but it is: when you want to access the serial port or use the USB port to transmit data, you start facing the usual issues with native libraries which are not compiled for the ARM platform. Once we managed to solve those issues, we needed to find a way to interact with the Arduino board, which is programmed in C/C++. Luckily for us there is software called Firmata which exposes the whole board via the serial port. Using this software we can read and write digital/analog information from the board pins. This helps us a lot, because we can run a standard sketch inside the Arduino which allows us to write/read all the information that we need to control the motors and read the sensors. Unfortunately, as with every standard, we hit a sensor it doesn't cover (the SR04 ultrasonic sensor), and for that reason we provide a slightly modified version of the Firmata sketch, which can be found inside the project source repository. From the Java perspective, there is a library called Processing (an open source programming language and environment for people who want to create images, animations, and interactions) which has a number of sub-libraries, one of them for interacting with Firmata. I borrowed two classes from Processing in order to customize them to my particular needs. From the beginning I wanted to use Processing because I believe that it has a lot of potential to be mixed with the Process and Rules Engine, but this initial stage is not taking advantage of it yet.
The following figure shows, from a high-level perspective, the different software components that run in order to bring Rolo to life:
Software Components
As you can see, the Rolo Server exposes and receives information via JMS, which allows us to build a monitor to see that information and send imperative commands or information about the world to the robot. Rolo Server is basically a Drools/jBPM knowledge session right now, but a more robust schema with multiple sessions for different purposes will be adopted in future stages.
The rules currently have access to all the motor and sensor information, allowing us to write rules using those values. All the sensor input data is treated as events, and for this reason we can use all the Drools Fusion temporal operators.
The following rules are simple examples of what is being done inside the robot right now:

rule "Something too close - Robot Go Back"
when
    $r: RoloTheRobot()
    $m: Motor()
    UltraSonicSensor( $sensor: name )
    $n: Number( doubleValue < 30 ) from accumulate(
            DistanceReport( sensorName == $sensor, $d: distance )
                over window:time( 300ms )
                from entry-point "distance-sensor",
            average( $d ) )
then
    notifications.write( "Process-SOMETHING_TOO_CLOSE:" + $n );
    $m.start( 120, DIRECTION.BACKWARD );
    Match item = (Match) kcontext.getMatch();
    final Motor motor = $m;
    final HornetQSessionWriter notif = notifications;
    ((AgendaItem) item).setActivationUnMatchListener( new ActivationUnMatchListener() {
        public void unMatch( Session session, Match match ) {
            System.out.println( "Stop Motor" );
            motor.stop();
            try {
                notif.write( "Stopping Motor because avg over: 30" );
            } catch ( Exception e ) {
                System.out.println( "ERROR sending notification!!!" );
            }
        }
    } );
end
This rule checks the average distance received from a distance sensor (in this case the ultrasonic sensor): if the average distance over the last 300ms is less than 30cm, all the motors are started at a fixed speed to move away from that object. Using the average makes sure that there really is something in front of the robot, instead of reacting to the first measurement that matches the condition. Different functions can be used to correct wrong reads from the sensors and to improve the overall behavior. Notice that after starting the motor we register an ActivationUnMatchListener, which causes the motor to be stopped as soon as the rule no longer matches. You will see in the video that the robot goes backward until the average received from the distance sensor in the last 300ms is over 30cm.
There is another rule which uses the light sensor to find the way out of dark places.
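That rule follows the same pattern as the distance rule above. A minimal sketch of what it could look like is shown below; note that the LightSensor and LightReport types, the "light-sensor" entry point and the threshold value are illustrative guesses, and the actual rule in the project repository may use different names and values:

```
rule "Too dark - Robot searches for light"
when
    $r: RoloTheRobot()
    $m: Motor()
    LightSensor( $sensor: name )
    // average light level over the last 300ms, read from the LDR sensor
    $n: Number( doubleValue < 20 ) from accumulate(
            LightReport( sensorName == $sensor, $l: level )
                over window:time( 300ms )
                from entry-point "light-sensor",
            average( $l ) )
then
    notifications.write( "Process-TOO_DARK:" + $n );
    // move forward until the averaged light level rises above the threshold
    $m.start( 120, DIRECTION.FORWARD );
end
```

As with the distance rule, averaging over a sliding window filters out single bad reads from the sensor.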

Software Roadmap

After a well-deserved holiday, I will be working on improving the code base to allow all the software to run without an Arduino board or any specific hardware. The main idea is to have an environment where we can simulate virtual motors and sensors. This will allow us to improve the software without being tied to the hardware improvements. It will also allow you to collaborate with the project; if I get enough contributions I can do weekly videos showing how the robot behaves using them :)
So, take the following list as a brain dump of the things that I need to do on the project:
  1. Improve the infrastructural code: JMS messages encoding, Monitor App, Simulation App
  2. Create more rules and processes to enable Rolo to do different things, such as recognizing the environment/room where it's running and interacting with different objects
  3. Mock a coordinate system and a model to store the different objects recognized from the environment
  4. Use Processing to draw in real time, in a 3D environment, what is being sensed by Rolo
  5. Enable Rolo to ask questions using the Human Task Services provided by jBPM
  6. Define the requirements for actuators and how to use them
  7. Video Streaming and image analysis

Video

Finally, let me introduce you: Rolo The Robot!



Notice that in the last 20 seconds of the video you can see the Rolo Client/Monitor application, which shows all the notifications being sent from the robot. You can also see a small control panel which allows us to send commands and see the values being captured from the sensors.
Rolo says: Happy New Year to you all!
Stay tuned!


Monday, December 17, 2012

jBPM Console NG (Update): Rules + Processes + Events


Hi everyone! I'm back with another update about the jBPM Console NG. Yesterday we did a quick demo of the console's current features at the JBUG London meetup. Today I've decided to explain the demo in more depth and also go through the last slides of the presentation, which describe some scenarios where events and rules influence the execution of our business processes.

Introduction


The main idea of the demo is to show how rules, processes and events can be used to monitor our business processes and influence their execution. In order to understand the runtime behavior we obviously need to understand how rules and events work, but I will start by explaining the business use case, to make clear what we are trying to achieve.
The Business Process that we want to execute looks like the following image:
Release Process
This is just a normal process: it includes human interactions and system interactions. We will handle the human interactions with the Human Task Services, and the system-to-system interactions will be handled with different WorkItemHandler implementations.
The process is about releasing artifacts. In order to make a release, the files of a specific artifact need to be staged. We have three directories through which we will move the files to be released, and they will be processed accordingly. Basically, we will pick a set of files from a repository that has the following directory structure:
Directory Structure
The sequence will be: Origin (where the original files will be placed for the release process) -> Stage (reviewed by a Person) -> Test (automatically tested) -> Production.
Notice that if the automatic tests fail, a special path is followed and a person will be in charge of fixing the issues and moving the files back to the Staging area.

Keeping our process as simple as possible

We don't want to complicate our business process; we want to keep the process definition as clear and simple as possible. We don't want to add tons of activities to check situations that don't describe the normal flow of actions, but at the same time we want to enforce some extra requirements and deal with exceptional business situations. To recognize situations where we want to enforce different business policies, or to recognize business exceptions, we can start using rules; if we want to recognize situations that involve time intervals, we can bring Fusion into the picture.
As I've explained in previous posts, there are several ways to analyze our process executions using rules, but from a very high-level perspective we can do the following:
  • Analyze a single process and the process contextual information to execute some actions or influence the process state
  • Analyze a group of processes running in the same context as a logical group, and execute an action that can be related to one particular instance: create one or a group of new instances, terminate/abort one or a group of running instances, create one or a group of human tasks, or execute one or a set of actions.
To demonstrate, we have chosen three different things that we can do without adding more complexity to our process definition:
  1. If an instance of the process goes 2 or more times through the Fix Issues branch, we want to get a warning or notify someone about this situation so an action can be taken. Imagine the pain of doing this kind of check inside the business process, probably adding a new process variable to count the executions of each path: a real nightmare that complicates the process definition.
    Paths and Activities Evaluations
  2. If an instance is doing a release with a set of files, or pointing to a specific repository, we must not allow two process instances to work with the same resources. If you think about this restriction, which involves multiple process instances, it is clear that the logic for checking it cannot be placed inside a process definition, because it's not a restriction that applies per instance. If you think about these kinds of situations, you will see that there are a lot of similar cases where you can apply more intelligent restrictions to a set of process instances. The main problem is that with a "normal/old" process engine your application needs to handle those kinds of things, or once again you need to start doing hacks to make it work. Most of the time, using traditional BPMSs, you don't even think about how to handle these scenarios, because the tooling doesn't support them.
    Multi Process Instance Evaluations
  3. In some situations we want to solve cross-cutting concerns that appear in multiple processes in the same way. Sometimes we have tasks that are done in several business processes, but we don't want to include such a task as part of the process definition, because it's a generic task that is not related to the business goal of that business process; it's related to the work that needs to be done to keep things running. In such cases, we can create an ad-hoc task to deal with the particular situation. The example shows a task that is created to improve the performance of an automated task when its execution takes longer than expected. We can define the SLAs using rules and dynamically create a human task if it's needed.
    Ad-Hoc Task
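To make the first scenario more concrete, a rule along these lines could count how many times an instance has entered the Fix Issues branch. This is only a sketch: it reuses the fact and event types shown later in this post, but the notify() method on RulesNotificationService is a hypothetical name, and the real service API may differ:

```
rule "Warn when Fix Issues was executed 2 or more times"
when
    $p: WorkflowProcessInstanceImpl( $id: id )
    // count how many times the Fix Issues node was triggered for this instance
    $c: Number( intValue >= 2 ) from accumulate(
            $e: ProcessNodeTriggeredEvent( processInstance.id == $id,
                                           nodeInstance.nodeName == "Fix Issues" )
                    from entry-point "process-events",
            count( $e ) )
then
    rulesNotificationService.notify( "Instance " + $id + " entered Fix Issues " + $c + " times" );
end
```

The process definition stays untouched; the counting lives entirely in the rule.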

jBPM Console NG - technical side

Let's analyze, from the technical perspective, how the infrastructure provides us with a way to handle situations like the ones described before. Before going into the rules that identify and react to different situations, we need to understand how to generate the data that the rule engine will use.
First of all we need to notify the rule engine about the process instances, so it handles them as facts. For this reason we attach the following process event listener to our sessions:
This process event listener is in charge of inserting, updating and retracting the process instance from the knowledge session where the process is running. It also keeps the process variables that are modified inside the process up to date. Finally, this listener generates and inserts Drools Fusion events that can be used for temporal reasoning.
The expected results when we attach this ProcessEventListener to our sessions are:
  • Every time that we create a process instance, the ProcessInstance object becomes available to the rule engine, so we can write rules about it.
  • When a process instance is completed, it is automatically retracted from the rule engine context.
  • When a process variable is modified/updated, the ProcessInstance fact is updated as well.
  • Every time that an activity is executed, Drools Fusion events are created and inserted into the session before and after the task is executed. We as users have the responsibility of declaring these types as events, so the engine can tag them with the corresponding timestamp (look at the rules file).
Inside a session that has this listener attached, we will be able to:
  • Write rules about Process Instances and their internal status, including process variables
  • Write rules that identify situations where we want to measure time between different activities of the same process or a group of processes
  • Influence the business processes execution based on different scenarios
  • If we insert more business context into the session, we will be able to mix all the information generated by the process executions with our business context to recognize more advanced scenarios
  • Mix all of the above
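As an example of rules over process instances and their variables, the second scenario from the previous section (two instances must not work on the same resources) could be sketched like this. Again, this is illustrative only: the "repository" process variable name and the notify() method are assumptions, and in a real session you would probably abort or postpone one of the instances through the appropriate service instead of just notifying:

```
rule "Two release instances must not point to the same repository"
when
    $p1: WorkflowProcessInstanceImpl( $repo: getVariable("repository") != null )
    // a second, different instance working on the same repository
    $p2: WorkflowProcessInstanceImpl( id > $p1.id, getVariable("repository") == $repo )
then
    rulesNotificationService.notify( "Instances " + $p1.getId() + " and " + $p2.getId()
                                     + " are both releasing from: " + $repo );
end
```

The `id > $p1.id` constraint keeps the rule from firing twice for the same pair of instances.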

Resources and Git Backend

One more important thing, if you want to try this alpha version, is to understand that we now pick up all the resources used by the runtime from a Git repository. This means that our backend repository, in this case, is github.com. We store all our assets in this repository, and we build up different sessions using the resources located in the remote repository. This gives us a lot of advantages, but the integration is not finished yet. In the future you will be able to point to different repositories and fetch resources on demand to build new runtimes. For now, you need to understand that forms, processes, rules and all the configuration resources are picked up from a remote repository, abstracting our application from where the resources are stored.

Rules, Processes & Events

Once we have all the data inside our session we can start writing our rules.
The complete rules file used for this demo can be found here:
Here are some things that we need to understand from this drl file:
  • Event Declarations: We need to inform the rule engine which facts will be treated as Events. Notice the first lines after the imports:
    declare ProcessStartedEvent
         @role(event)
    end
    In this case we are declaring that all insertions of ProcessStartedEvent need to be handled as events, which are a special type of fact.
  • We can make services available for the rules to use. For this example I'm injecting the services as globals:
    global RulesNotificationService rulesNotificationService;
    global TaskServiceEntryPoint taskService;
    The TaskServiceEntryPoint allows us to create and manage tasks from rules. The RulesNotificationService exposes the rules execution to the outside world. It's a quick way to notify users about certain situations; you can think of it as a simple log service for what is happening inside our sessions.
  • Then you can write rules about Processes and the Events generated by the processes:
    rule "Fix Issues Task pending for more than 30 seconds"
    when
        $w1: WorkflowProcessInstanceImpl( $id: id )
        $onEntry: ProcessNodeTriggeredEvent(
                      processInstance.id == $id,
                      $nid: nodeInstance.id,
                      nodeInstance.nodeName == "Fix Issues" ) from entry-point "process-events"
        $onExit: ProcessNodeLeftEvent(
                      this after[30s] $onEntry,
                      processInstance.id == $id,
                      nodeInstance.id == $nid,
                      nodeInstance.nodeName == "Fix Issues" ) from entry-point "process-events"
    then
        ....
    This rule matches situations where a particular node of our business process (Fix Issues) takes more than 30 seconds to be executed. Notice that the process instance events are inserted into a special entry-point called "process-events". I suggest you take a look at the other rules analyzed in the demo, so you can get an idea of the kind of things that can be done in this environment.

DEMO


jBPM Console NG update 14/12/2012 from salaboy on Vimeo.

Full Presentation at JBUG London




Stay tuned for more updates about the console and the book!


Sunday, December 16, 2012

Barcelona JUG - jBPM5 Developer Guide Presentation (19/12/12)


Hi everyone, I'm going to give a presentation in Barcelona about the jBPM5 Developer Guide book. There is no defined venue yet, but it will be next Wednesday (the 19th) at 7pm somewhere in the city. I will keep you posted! If you are interested in attending, please drop me a comment so we can make the necessary adjustments. This will be a Barcelona JUG meetup, so feel free to invite as many friends as you want, and please help us spread the word.


Here are some links from the Barcelona JUG Group that you can follow to see updates about this and future meetups:
Google groups - http://bit.ly/BarcelonaJUG 

Update

Hi everyone, the meetup for tomorrow is confirmed and we now have a venue. The meetup will take place at the Facultad de Informatica de Barcelona:
Edifici B6 del Campus Nord C/Jordi Girona Salgado,1-3 08034 BARCELONA Espanya
The talk will start at 7pm, so see you there with all your friends!

Spanish announcement (translated):

Hi everyone! I will be in Barcelona presenting the jBPM5 Developer Guide book. There is no venue defined yet, but we will publish the location tomorrow. We are sure it will be on Wednesday, December 19th, at 7pm, somewhere in the city. The event is organized by the Barcelona JUG, so feel free to invite as many friends as you can and help us spread the word.


Update:
The event and the venue are confirmed!
See you tomorrow, Wednesday, at 7pm, in Edificio A6, Aula 102 of the Facultad de Informática de Barcelona!
Edifici B6 del Campus Nord, C/Jordi Girona Salgado 1-3, 08034 Barcelona, Spain
Bring your friends!




Saturday, December 15, 2012

6.0 Alpha - Annotation Driven development with Multi Version Loading

Drools & jBPM 6.0 alpha should be out by the end of next week. 6.0 introduces convention-based projects that remove the need for boilerplate code: literally just drop in the drl or bpmn2 and get going. Further, we now allow rules and processes to be published as Maven artifacts, in Maven repositories. These artifacts can either be resolved via the classpath or downloaded dynamically on the fly. We even support side-by-side version loading out of the box, via the Maven ReleaseId conventions.

As a little taster, here is a new screenshot showing the annotation-driven development. The lines below are all that's needed to dynamically load a module from a local or remote Maven repository and start working with it. KieSession is the new, shorter name for StatefulKnowledgeSession. Kie is an acronym for "Knowledge Is Everything"; I'll talk about Kie in another blog post, but expect to start hearing a lot about it soon :)



And here is a complete example screen shot. Create the drl, define the kmodule and start using them.





Friday, December 14, 2012

Score flexibility in Planner, shown with vehicle routing


Do we want to minimize distance or minimize time? Should trucks return to their depot after delivering their items?
It depends on what's best for your business. Luckily, changing the score function in Planner is easy, as shown in this demo.



Tuesday, November 27, 2012

jBPM Designer 2.4.0.Final released!


We are very happy to announce the new 2.4.0.Final release of jBPM Designer, the web-based business process editor for jBPM 5.
Here is an overview of new features and most notable bug fixes in this release:

New Features
Notable Bug Fixes
You can download jBPM Designer version 2.4.0.Final from SourceForge. If you are upgrading from an older Designer version, make sure to clear your browser cache before starting to use the new one.
You can clone jBPM Designer or just browse its source at GitHub.
 
Roadmap
For the next release we will strongly focus on
  • Add enhancements to Process Simulation capabilities
  • “Smart Properties” – more usable ways for users to enter execution properties into their models
  • Alternative asset storage options
  • Overall usability enhancements
jBPM Designer is open source and of course free! If you would like to be part of Designer development and discussions, or just want to ask questions, feel free to talk to us on the User Forum, the Mailing List, or IRC.

You can also follow the latest news about jBPM Designer on its Blog.

Enjoy :)



jBPM Console NG - Alpha Dev Access


Hi everyone! I'm writing this post to introduce the jBPM Console NG project, which will provide a new integrated workbench for handling process-related activities. We are now at a very early stage of development and we are looking for contributors. We know that there are a lot of companies out there implementing their own solutions, and at this point we encourage you all to give us feedback about the direction that we are taking for the BPM tooling. As usual, this tooling will be integrated with all the Drools and Guvnor tooling to provide an integrated knowledge development environment.
jBPM Console NG

Introduction

As always, we are developing the jBPM Console NG in a public github repository: https://github.com/droolsjbpm/jbpm-console-ng
You can clone this repository, build the source code and deploy the jBPM Console NG in your own container following these steps:
1) git clone https://github.com/droolsjbpm/jbpm-console-ng.git
2) cd jbpm-console-ng
3) mvn clean install
4) cd jbpm-console-ng-showcase
5) mvn gwt:run -> This will display the GWT Development Mode console, which will give you a URL to access via your browser (for development purposes you need to use Firefox, which provides a GWT Developer Plugin that allows us to debug the application)

Technology

The application is being developed using Uberfire, which is based on GWT (Google Web Toolkit), Errai and CDI. This mix of technologies gives us the ultimate environment to build flexible applications using a rock-solid component model. I will shortly be posting some examples showing how to get started creating new panels and adding customizations to the existing code base, but feel free to clone/fork the repository to take a look at the current status.

Goals

The main goal behind the application is to provide an integrated environment to discover, design, deploy, execute, monitor and improve our business processes. In order to provide all this functionality, we have started the development by integrating our existing components inside the Uberfire infrastructure.
There is an ongoing effort to integrate the jBPM Process Designer inside this platform, but I've started working on the Process Runtime panels and on the Task Lists, with the help of Maciej.
The following screenshots show the current status of the application:

Home Screen

The home screen shows us important information about the things that the user is allowed to do. The jBPM lifecycle chart allows the user to select in which phase he/she wants to work. Right now I'm focused on improving the "Work" stage, as shown in the following screenshots.
Home Screen
The home screen also contains a suggestion box that allows you to quickly type different "Commands" to access the different sections of the application. To return to the home screen, we can use the shortcut CTRL+H.

Tasks List

The Tasks List screen allows us to interact with the tasks assigned to us or to the groups we are included in. As you can see in the previous screenshot, my user (salaboy) was included in the [Writer] group. This means that all the tasks associated with the Writer group will appear in my personal task list. Notice that each row in the list has a set of actions to interact with that task. The following screenshot shows the Start button inside the Actions column; we can also edit/view the Task Details, and we can work on a particular task via its associated Task Form.
Tasks List

Quick Tasks Creation

Clicking the Create New Task button, we can create a new task for ourselves or for another person inside the organization. The task will be created assigned to us, but we can forward it later. Notice that we can also create a Quick Task, which means that the task will be automatically started; it can be used as a simple TODO item. No matter where we are in the application, we can use the shortcut CTRL+T to create a new task.
Quick Task Creation

Task Details

The Task Details popup allows us to see the most important information about a particular task. If we want a more detailed view of that task, we can click the Full button, which will open more panels related to it. Notice that this task is not associated with any business process; for tasks which are associated with a business process instance, we can see the process instance details using the "Process Instance Details" button at the bottom.
Task Details

Forms

The current version allows us to interact with tasks via Task Forms, which are dynamically generated based on the task content and the expected outputs. This task is already in progress, and for that reason you can see the Complete button at the bottom of the form. If the task is in a different state, different buttons will be displayed. The Save button allows the user to store intermediate versions of the information being filled in inside the form. The Full button can be used to see the form with more contextual data, for example Task Attachments or Task Comments.
Task Form (Based on a Template)

Process Management

The Process Management panels allow us to see all the available process definitions and to create new process instances. As you can see in the following screenshot, you can inspect the Process Definition Details to see the process diagram and relevant information about each process.
Process Management

Process Instance Details

Inside the Process Instance Details you will be able to see the current status of the Process Variables, the activities that are being executed and also the Log for that particular instance.
Process Instance Details

Signaling Events

From the Process Instance List you will be able to signal events. The event list is retrieved from the process definition, and the Signal Ref suggestion box lists all the events related to the selected Process Instance.
Signal Events
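The same kind of signal can also be sent programmatically through the jBPM 5 session API. Below is a minimal sketch, assuming an already configured session with the process definition loaded; the process id "com.sample.evaluation" and the event name "DocumentReceived" are hypothetical examples:

```java
import org.drools.runtime.StatefulKnowledgeSession;
import org.drools.runtime.process.ProcessInstance;

public class SignalExample {

    // ksession is assumed to be an already configured session with the
    // process definition loaded; the ids used here are made up.
    public static void signal(StatefulKnowledgeSession ksession) {
        ProcessInstance instance =
                ksession.startProcess("com.sample.evaluation");

        // Signal only this instance (the same effect as picking an event
        // from the Signal Ref suggestion box for a selected instance)
        ksession.signalEvent("DocumentReceived", "some payload",
                instance.getId());

        // Or broadcast the signal to every instance waiting for it
        ksession.signalEvent("DocumentReceived", "some payload");
    }
}
```

Note that this fragment requires the jBPM/Drools runtime on the classpath and is not runnable on its own.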

Roadmap

During the following months we will be working on polishing the current panels and the services behind the application to provide an error-free environment that allows you to execute your business processes and interact with the Human Task Services. During this initial phase of development we are focused on improving the user experience, so we encourage you to try the latest source code and let us know what you think. This initial version can also be deployed in the cloud, for example on OpenShift. We believe this will greatly help new users who want to try out an existing installation.
There are a lot of things that need to be done, so take a look at the following section: if you want to get involved in the development of open source tooling, this is a very good opportunity to learn and to join the project.

Contributions

The following list is a set of things that can be done by Java developers and doesn't require any advanced knowledge of the technologies we are using. You will see that our technology stack is extremely simple and agile:
  • Task Comments Panels
  • Task Attachments Panels
  • Shortcuts Mappings Panels
  • Notifications Panels
  • Avatar and Meta information about users and groups Administration Panels
  • Domain Specific Suggestions Phrases Administration Panels
  • I18N Translations (We already have Spanish (es_AR) and English, so if you are a native speaker of a different language feel free to drop us a line)
  • Extending the GWT DataGrid component to support prioritized lists (Knowledge about GWT is required)
  • Create a Custom GWT Calendar Component to display the pending tasks in a Calendar. (Knowledge about GWT required)
  • Experiment with m-gwt (http://www.m-gwt.com/) (Knowledge about GWT and motivation to learn m-gwt is required)
  • Any idea that you may have and want to propose
Drop us a line with your ideas/requirements; we are happy to guide anyone interested in contributing and help them learn what they need to get started. I will be posting some videos in the next few days about how to create a simple panel and how the internal services work, but feel free to ask questions if you are interested in this development. You already know where to find me :)


Monday, November 26, 2012

jBPM5 Developer Guide Official Presentation @London


Hi all, with the book about to be printed, we have organized an official presentation with the help of the London JBug. We (Esteban Aliverti and I, "Salaboy") will be giving a talk about the book on the 12th of December. Seats are limited, so make sure to reserve yours today:
We will be giving away a free copy of the book to the attendee who asks the most interesting questions during the talk. Feel free to drop us a line with comments or expectations about the talk. You can start looking at the examples provided with the book here:
We will be posting a detailed agenda about the presentation shortly.
Stay Tuned!
PS: you can follow us on twitter: @salaboy and @ilesteban


Friday, November 16, 2012

JBUGs Sydney & Melbourne 19th & 20th November 2012

I'm visiting Australia next week, and I'll be giving two JBUG presentations:
  • in Sydney on Monday November 19th
  • in Melbourne on Tuesday November 20th
I'll give a quick jBPM overview and focus on some of the new features that we're developing and some of the changes you'll see in the near future.
Important: each session will be followed by pizza and drinks! :)
So if you're in the vicinity, feel free to join us. Registration is necessary.
Hope to see you there !


Tuesday, November 13, 2012

Drools 5.5.0.Final released

We're happy to announce the release of Drools (Expert, Fusion, Planner, Guvnor) 5.5.0.Final.

Documentation, Release Notes and Downloads are detailed below:
  • Download the zips from the bottom of the Drools download page
  • To try out the examples, just unzip one and run the runExamples.sh or runExamples.bat script.
Try it out and give us some feedback (user list, issue tracker).

[* Note: At the time of posting Maven Central has not been synchronized]

Friday, November 09, 2012

Hackergarten day: hacking Drools Planner (Antwerp, Tuesday)

If you're near Antwerp this Tuesday (13th Nov 2012), join us at the Hackergarten day (free entrance) at Devoxx between 9:30am - 6:30pm. We'll be hacking Drools Planner and many other open source projects (Hibernate, Arquillian, ...).

Bring your laptop and sit shoulder-to-shoulder with other open source developers. We'll guide you in creating pull requests and proof of concepts.

If you're interested in hacking Planner, but don't know what to do yet, here's a list of suggestions:
  • Create a new built-in ScoreDefinition (1-2 hours)
  • Improve the benchmarker report (1 hour per improvement)
  • Create an example for Portfolio optimization (full day)
Unlike most of Devoxx, the entrance is free. Hackergarten will take place in the two extra rooms (Area 1 & 2, each approx. 50m2) on the exhibition floor of Devoxx (not in the BOF rooms). So if you want to get your hands dirty, join us (except of course between 5:25 and 5:55 when you should come to my Devoxx presentation on Maven Dependency Puzzlers :).

Preparation

There's a good network, but the Devoxx crowd never fails to slow it down. So avoid waiting for your downloads to finish and set up your environment at home:

Set up Git and clone drools-planner from GitHub (or alternatively, download the zipball):
$ git clone https://github.com/droolsjbpm/drools-planner.git
...

Then do a Maven 3 build:
$ cd drools-planner
$ mvn -DskipTests clean install
...

Try running the examples directly from the command line:
$ cd drools-planner-examples
$ mvn exec:exec
...
Next, open the drools-planner/pom.xml file with your favorite IDE (IntelliJ, Eclipse, NetBeans).



Thursday, November 01, 2012

Announcing UberFire

Today we're pleased to announce the first public release of UberFire, a web-based workbench framework inspired by the Eclipse Rich Client Platform.

What is it?

UberFire is a new independent project developed and maintained by the Drools & jBPM team. This is a very strategic project for us, since it's the base technology for our next generation of web tooling.

One key aspect of UberFire is the compile-time composition of plugins. Every plugin can be a Maven module; when building a distribution, you simply add those Maven modules as dependencies. Those plugins then become available as panels that can be placed in perspectives via drag and drop, with docking.

The clean and powerful design was made possible by GWT, Errai and CDI.

The 0.1.0.Alpha1 version includes the following features:

Workbench API
  • Perspectives
  • Panels
  • Drag and Drop and Docking
  • Model-View-Presenter framework

Core Widgets
  • File Explorer
  • Text Editor
  • Markdown Preview & Live Editor

Virtual File System API (backport of NIO2)
  • File System default backend
  • GIT backend

Security API/Framework
  • Authentication
  • Authorization

Another important aspect of the UberFire APIs is that you can use them for both client- and server-side development. Please remember this is our first alpha release: expect bugs and unfinished features, and we'll work on improving the look and feel over the coming months.

Video

1. Quick tour
2. Rich Client App

Screenshots

Here are some screenshots of our Showcase demo application

1. Login



2. Home perspective with some panels, including YouTube videos.



3. Selecting a new perspective.


4. Dashboard composed of several panels, including mounted Google gadgets.



5. Notice the "File explorer" any GIT repo can be created or cloned, with seamless server side storage. Also notice the context sensitive toolbar and menu bar entries, because the "File Explorer" panel has focus.


6. Markdown editor with live preview.



7. Panel drag and drop - the compass gives a visual indication of the drop zone. Panels can either be dropped onto the current panel and added as a tab, or dragged to a new panel area; for example, dropping below a panel adds it at the bottom.


Important Note:
No effort has yet been spent on its Look & Feel. 


What to expect in the next releases

We already have other new features scheduled for next releases (some already under development, others still just a PoC):

  • Metadata engine (index/search)
  • Embeddable infrastructure
  • Panels exposed as IDE Plugins (Eclipse is the first target)

Eating our own dog food

There's no better way to show you believe in your own ideas than to use them for your own work. Drools Guvnor is currently being ported to the UberFire framework. Anybody can see the extent of our work so far by downloading the latest SNAPSHOT.

Furthermore the jBPM Console is also being ported to UberFire. 


Community Call

UberFire is also a great opportunity if you're looking to make an open source contribution; as this is a very new project, we have lots of things to do. Click here to get some ideas for contributions.


Getting started now

With all that said, we invite you to visit our landing page at http://droolsjbpm.github.com/uberfire

Get the latest artifacts from JBoss Nexus, or download our binary distribution direct from here.

Try it out and give us some feed-back on our user list.


Wednesday, October 24, 2012

Drools 5.5.0.CR1 released

We're happy to announce the release of Drools (Expert, Fusion, Planner, Guvnor) 5.5.0.CR1.

Documentation, Release Notes and Downloads are detailed below:
  • Download the zips from the bottom of the Drools download page
  • To try out the examples, just unzip one and run the runExamples.sh or runExamples.bat script.
Try it out and give us some feedback (user list, issue tracker).

[* Note: At the time of posting Maven Central has not been synchronized]

Saturday, October 20, 2012

Scorecards and PMML4.1 support for Drools 5.5

Thanks to our superstar community contributor, Vinod Kiran, scorecards are coming to Drools 5.5. Initially, the PMML 4.1 standard is embedded in the Scorecards module. We have a full standalone PMML implementation coming for 6.0, being worked on by Dr. Davide Sottara. I hope that Vinod will write a full tutorial on this blog soon, explaining the feature in more detail.

If you don't know what a scorecard is, here is a tutorial I found on Google:
http://www.primatek.ca/blog/2010/08/16/a-quick-introduction-to-scorecards/

Below is a text and image excerpt from the New and Noteworthy docs for the upcoming Drools 5.5 release:

A scorecard is a graphical representation of a formula used to calculate an overall score. A scorecard can be used to predict the likelihood or probability of a certain outcome. Drools now supports additive scorecards. An additive scorecard calculates an overall score by adding all partial scores assigned to individual rule conditions.


Additionally, Drools Scorecards will allow reason codes to be set, which help identify the specific rules (buckets) that contributed to the overall score. Drools Scorecards will be based on the PMML 4.1 standard.
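The additive calculation can be sketched in plain Java. This is a hypothetical illustration of the idea, not the actual Drools Scorecards API; the conditions and partial scores are made up:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class AdditiveScorecard {

    // Hypothetical partial scores assigned to individual rule conditions
    static final Map<String, Double> PARTIAL_SCORES =
            new LinkedHashMap<String, Double>();
    static {
        PARTIAL_SCORES.put("age between 18 and 25", 10.0);
        PARTIAL_SCORES.put("occupation = TEACHER", 25.0);
        PARTIAL_SCORES.put("residence = RURAL", 5.0);
    }

    // The overall score is the sum of the partial scores of the
    // conditions (buckets) that matched.
    static double score(List<String> matchedConditions) {
        double total = 0.0;
        for (String condition : matchedConditions) {
            Double partial = PARTIAL_SCORES.get(condition);
            if (partial != null) {
                total += partial;
            }
        }
        return total;
    }

    public static void main(String[] args) {
        // 10.0 + 25.0 = 35.0
        System.out.println(score(Arrays.asList(
                "age between 18 and 25", "occupation = TEACHER")));
    }
}
```

Reason codes then simply record which of those buckets contributed, so the result can be explained as well as computed.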
   

