Planet BPM

November 27, 2014

Sandy Kemsley: Activiti BPM Suite – Sweet!

There are definitely changes afoot in the open source BPM market, with both Alfresco’s Activiti and camunda releasing out-of-the-box end-user interfaces and model-driven development tools to augment...

[Content summary only, click through for full article and links]

by sandy at November 27, 2014 04:42 PM

November 21, 2014

Thomas Allweyer: BPMN certification course with ARIS

The BPMN certification course I co-developed, which has been running very successfully in Switzerland for quite some time, is coming to Germany at the beginning of next year. Together with the company aproo, which among other things offers numerous services around ARIS, the course has been adapted for use with Software AG's ARIS modeling tools. For a long time the Event-driven Process Chain (EPC) dominated among ARIS users as the notation for modeling business processes, but BPMN is gaining importance here as well, as an alternative or additional representation method. Besides process modeling, ARIS offers numerous methods for the integrated modeling of different views, e.g. data, organization, or IT landscapes. The course therefore also explains how BPMN models can be linked with other model types.

The focus of the training, however, is a solid introduction to BPMN itself. Through many examples and practical exercises, participants learn to understand the notation and to apply it themselves. The two-day course is therefore aimed not only at ARIS users but at anyone who wants to learn process modeling with BPMN. Optionally, a certificate can be obtained after the course. The first course takes place on January 27 and 28 in Frankfurt; further dates in other regions are in preparation.

Further information and registration

by Thomas Allweyer at November 21, 2014 10:35 AM

Drools & JBPM: Red Hat JBoss BRMS and BPMS Rich Client Framework demonstrating Polyglot Integration with GWT/Errai/UberFire and AngularJS

Last week I published a blog post highlighting a presentation I gave on our rich client platform, which has resulted from the work we have done within the BRMS and BPMS platforms, the productised versions of the Drools and jBPM projects. The presentation is all screenshots and videos; you can find the post and the link to the SlideShare deck here:
"Red Hat JBoss BRMS and BPMS Workbench and Rich Client Technology"

The presentation highlighted the wider scope of our UI efforts, demonstrating what we've done within the BRMS and BPMS platforms and the flexibility and adaptability provided by our UI technology. It is a great testament to the power of GWT, Errai and UberFire, the three technologies driving all of this. We can't wait for the GWT 2.7 upgrade :)

As mentioned in the last post, the UberFire website is just a placeholder and there is no release yet. The plan is first to publish our 0.5 release, but that is aimed mainly at our BRMS and BPMS platforms. We will then move it to GWT 2.7 and work towards a UF 1.0, which will be suitable for wider consumption. With 1.0 we will add examples and documentation and work on making things more easily understood and consumable for end users. Of course there is nothing to stop the adventurous trying 0.5; the code is robust and already productized within BRMS and BPMS - we are always on IRC to help, Freenode #uberfire.

That presentation itself built on the earlier videos showing our new Apps framework:
The Drools and jBPM KIE Apps Framework

The above video already demonstrates our polyglot capabilities, building AngularJS components and using them within the UF environment. It also shows off our spiffy new JSFiddle-inspired RAD environment.

I'd now like to share with you the work we've done on the other side of polyglot development - this time using GWT and UF from within AngularJS. It was important to allow for an AngularJS-first approach that works with the tool chain AngularJS developers are familiar with. By AngularJS-first, I mean that AngularJS is the outermost container, whereas in the above video UF is already running and is the outer container in which individual AngularJS components can be used.

Before I detail the work we've done, it's best to first cover the concepts of Screens and Perspectives, the two main components that provide our polyglot interoperability - there are others, but this is enough to understand the videos and examples that come next. A Screen is our simplest component: a DIV plus optional life cycle callbacks. A Perspective is also a DIV, but it contains a composition of 1..n Screens, with different possible layout managers and persistence of the layout. (A minimal code sketch follows the two lists below.)

Screen
  • CDI discovery, or programmatically registered.
  • DIV on a page.
  • Life cycle callbacks.
    • OnStart, OnClose, OnFocus, OnLostFocus, OnMayClose, OnReveal.
  • Decoupling via Errai Bus.
    • Components do not invoke each other; all communication is handled by a bus.
  • Editors extend Screens, are associated with resource types, and provide additional life cycle callbacks.
    • onSave, isDirty.
Perspective
  • CDI discovery, or programmatically registered.
  • Composition of 1..n Screens, but is itself a DIV.
  • Supports pluggable window management of Screens.
      • North, East, South, West (NESW).
      • Drag and Drop docking capabilities.
    • Bootstrap Grid Views.
      • Separate design time and runtime.
    • Templates (ErraiUI or AngularJS).
      • Absolute control of Perspective content and layout.
  • Supports persistence of Perspective layout, should the user re-design it.
    • Only applicable to NESW and Bootstrap Grid views.
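
To make the Screen concept concrete, here is a minimal sketch of a Screen in GWT/Errai code, based on UberFire's annotation-driven model. The identifier and widget contents are invented for illustration, and annotation and package names have shifted across early UberFire versions (the OnStart/OnReveal callbacks above became OnStartup/OnOpen in later versions), so treat this as indicative rather than definitive:

    import javax.enterprise.context.ApplicationScoped;
    import com.google.gwt.user.client.ui.IsWidget;
    import com.google.gwt.user.client.ui.Label;
    import org.uberfire.client.annotations.WorkbenchPartTitle;
    import org.uberfire.client.annotations.WorkbenchPartView;
    import org.uberfire.client.annotations.WorkbenchScreen;
    import org.uberfire.lifecycle.OnClose;
    import org.uberfire.lifecycle.OnStartup;

    @ApplicationScoped                           // discovered by CDI; no manual registration
    @WorkbenchScreen(identifier = "HelloScreen") // hypothetical identifier
    public class HelloScreen {

        private final Label view = new Label();

        @WorkbenchPartTitle
        public String getTitle() {
            return "Hello"; // tab / title bar text
        }

        @WorkbenchPartView
        public IsWidget getView() {
            return view;    // the DIV content that the workbench docks and lays out
        }

        @OnStartup
        public void onStartup() {
            view.setText("Hello from a UF Screen");
        }

        @OnClose
        public void onClose() {
            // release resources; OnFocus, OnLostFocus, OnMayClose and the
            // other callbacks hook the remaining life cycle events
        }
    }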

A picture is worth a thousand words, so here is a screenshot of the Perspective Builder in action. Here it uses the Bootstrap Grid View layout manager; within each grid cell is a Screen. The Perspective is saved and is then available from within the application. If the NESW layout manager is used there is no separate design time: all dragging is done in place, and persistence happens in the background after each change. Although it's not shown in the screenshot below, we also support both list (drop list) and tab stacks for Screens.



Now back to what an AngularJS-first approach means. Seven points were identified as necessary to demonstrate that this is possible.
  1. UF Screens and Perspectives should be available seamlessly as AngularJS Directives.
  2. Bower packaging for a pre-compiled UFJS. UFJS is the pre-compiled, client-only version of UF.
  3. UFJS can work standalone - from file://, for example. UFJS can optionally work with a UF war backend, allowing persistence of perspectives and of other state UFJS might need to save, as well as access to our full range of provided services, such as identity management.
  4. Support live refresh during development.
  5. Nested Controllers.
  6. Persistence and routing.
  7. Work with tools such as Yeoman, Grunt and Karma.
Eder has produced a number of examples that you can run yourself, demonstrating that all of the points have been solved. You can find the code here, along with the README to get you started. We did not provide videos for point 7, as I believe the videos for points 1 to 6 show that this would not be a problem.

Eder has also created several short videos running the examples for each of the use cases, and put them into a YouTube playlist. He has added text and callouts to make it clear what's going on:
AngularJS + UF PlayList
  1. Overview explaining what each video demonstrates (33s).
  2. AngularJS App + UFJS, client only, distribution using Bower. (2m30s).
    • Install and play with UFJS through Bower
    • Create a Native AngularJS App
    • Integrate this app with UFJS
      • Show UF Screen Directives
      • Show UF Perspective Directives
  3. AngularJS App + UFJS client and UF Server.
    • 1 of 2 (3m58s).
      • Download UF War
      • Install and run on EAP
      • Download and run our Angular demo on Apache
      • Show AngularJS Routes + UF Integration
    • 2 of 2 (4m06s).
      • Use UF to create Dynamic Screens and Perspectives
      • Encapsulate an AngularJS template in a UF Screen
      • Show an AngularJS App (inside a UF screen) nested in a parent controller.
        • Demonstrating multiple levels of controller nesting.
  4. KIE UF Workbench RAD environment with AngularJS component.
  5. Uberfire Editor working seamlessly as an Eclipse editor.
For completeness, the original videos showing the JSFiddle-inspired RAD environment, which demonstrates a UF-first polyglot environment, have been added to the playlist; see point 4 above.

Finally, just to show off, and because we can, we added a bonus video demonstrating a UF editor component running seamlessly in Eclipse. This demonstrates the power of our component model, which has been designed to allow our components to work standalone in any environment. We use Errai to intercept all the RPC calls and bridge them to Eclipse. Because the virtual file system our editors use is, like our other services, decoupled and abstracted, we can adapt it to Eclipse file I/O. For the end user the result is a seamless editor that appears native. This allows the development of components that can work on the web and in Eclipse, or even IntelliJ. We'll work on making this example public at a later date.
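
To make that decoupling concrete, here is a hedged sketch of the Errai remote-call pattern that makes such bridging possible. The VFSService interface and its method are invented for illustration, and package locations differ between Errai 2.x and 3.x; the point is that the client only ever talks to a Caller proxy, so the transport behind it can be swapped - HTTP to a war backend, or an Eclipse adapter - without touching client code:

    import javax.inject.Inject;
    import org.jboss.errai.bus.server.annotations.Remote;
    import org.jboss.errai.common.client.api.Caller;
    import org.jboss.errai.common.client.api.RemoteCallback;

    // Hypothetical remote interface (lives in shared code, in its own file);
    // the real workbench services differ.
    @Remote
    interface VFSService {
        String readAllString(String path);
    }

    // Client-side consumer: never references a concrete transport.
    public class FileOpener {

        @Inject
        private Caller<VFSService> vfs;

        public void open(final String path) {
            vfs.call(new RemoteCallback<String>() {
                @Override
                public void callback(String contents) {
                    // render the file contents in the editor widget
                }
            }).readAllString(path); // proxied call; transport decided elsewhere
        }
    }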

Here are some screenshots taken from the videos.

Finally, to all those who said it couldn't be done!!!!


by Mark Proctor (noreply@blogger.com) at November 21, 2014 01:30 AM

November 18, 2014

Thomas Allweyer: Fraunhofer IESE presents the second BPMS study on December 10

Last year, the Fraunhofer Institute for Experimental Software Engineering (IESE) conducted a detailed investigation of Business Process Management Systems (BPMS), i.e. systems for process execution, for the first time. Owing to the strong response to that study, around twenty vendors took part in the second edition. They were really put through their paces: each tool was analyzed in a full-day workshop, in which, among other things, a scenario specified by the study's authors had to be demonstrated and then modified and extended during the workshop.

I had the opportunity to attend some of these workshops as a guest. One could really gain a comprehensive insight into each tool. Occasionally the vendors' representatives broke quite a sweat when asked to implement one or another tricky requirement. Workshops of this kind are very labor-intensive, but they allow a much deeper analysis than the many studies that merely work with questionnaires filled out by the vendors.

The study will be presented on December 10 in Kaiserslautern at an event where many of the participating vendors will be on site with their systems. Attendance is free of charge. Further information and registration at www.iese.fraunhofer.de/bpm2014

by Thomas Allweyer at November 18, 2014 08:06 AM

November 05, 2014

Sandy Kemsley: ActiveMatrix BPM at Citibank Brazil

Roberto Mercadante, SVP of operations and technology at Citibank Brazil, presented a session on their journey with AMX BPM. I also had a chance to talk to him yesterday about their projects, so have...

[Content summary only, click through for full article and links]

by sandy at November 05, 2014 11:31 PM

Sandy Kemsley: Event Analytics in Oil and Gas at TIBCONOW

Michael O’Connell, TIBCO’s chief data scientist, and Hayden Schultz, a TIBCO architect, discussed and demonstrated an event-handling example using remote sensor data with Spotfire and...

[Content summary only, click through for full article and links]

by sandy at November 05, 2014 10:15 PM

Sandy Kemsley: AMX BPM and Analytics at TIBCONOW

Nicolas Marzin, from the TIBCO BPM field group, presented a breakout session on the benefits of combining BPM and analytics — I’m not sure that anyone really needs to be convinced of the...

[Content summary only, click through for full article and links]

by sandy at November 05, 2014 07:44 PM

Sandy Kemsley: TIBCONOW ActiveMatrix BPM Roadmap

On Monday, we heard an update on the current state of AMX BPM from Roger King; today, he gave us more on the new release and future plans in his “BPM for Tomorrow” breakout session. He...

[Content summary only, click through for full article and links]

by sandy at November 05, 2014 06:50 PM

Sandy Kemsley: Case Management at TIBCONOW 2014

Yesterday, I attended the analyst sessions (which were mostly Q&A with Matt Quinn on the topics that he covered in the keynote), then was on the “Clash of the BPM Titans” panel, so...

[Content summary only, click through for full article and links]

by sandy at November 05, 2014 05:44 PM

Keith Swenson: Five Ways ‘Planning Before Doing’ can be Bad

Before you do something, plan it. Figure out what you are going to do, and then do it. If you failed to succeed, then you didn't plan well enough. Next time, plan better. How many times have you heard these sayings from traditional scientific management? They are so ingrained in our working behaviors that they seem beyond questioning. But there are times when planning is a bad idea - and this post describes five such situations.

Planning takes many forms, and when I talk about a plan, I mean any way of defining in advance what you or others are going to do in the future. The plan has to be persistent, usually written, but you can also share a memorized plan if it is not too complicated. Some plans are a detailed list of instructions. Others might be a flow chart describing various options, and what one must do if certain possibilities materialize. A plan is not simply a command to do something; it is a description of what to do, worked out in advance. That is the key: a plan is created at one time, but the actual action is done later.

Planning helps to coordinate the actions of people, and it is particularly important when those people cannot be constantly communicating. If you want to meet someone at a coffee shop, you have to make a plan, otherwise you are unlikely to meet. You want to build a skyscraper? You need a very large, elaborate plan, or else the resulting building might be poorly built. But there are some situations where a plan is a bad idea.

  • Volatility can make a plan worthless – All of the examples where a plan helps are ones where the patterns of behavior can be accurately predicted and are repeatable enough. Imagine tomorrow's stock market prices. To make a detailed plan today about what you are going to trade tomorrow, and when, is pointless. It is easy to see how stock prices are chaotic enough on a day-by-day basis, but the same principle applies to many other things as the time scale increases. A lunch reservation planned for six months from now would have to be for a very special occasion, and would still be highly tentative at best. The famous Franco-Prussian War general Helmuth von Moltke said "no plan survives contact with the enemy", which reflects this sentiment perfectly given the volatility of a war situation.
  • A plan can be more trouble than the benefit – Imagine ten carloads of people attending an ordinary evening event. Nobody would bother making a plan for the exact place each car will park, because there is no substantial benefit in such a plan. Planning is never free; it always takes at least some time and effort. If the benefit of the plan is not greater than the cost of planning, then the planning itself is a waste and should be avoided.
  • Following the plan may be a distraction – There are situations where creating a plan is a helpful exercise, but taking the plan literally is harmful. Creating a plan can force a team to think through all the possible things that might happen, and how one might respond. As an exercise in preparedness this can be a benefit, but it is important for the team to remember that the specific plan may contain details that are not justified and do not properly account for how reality has unfolded. The team needs to know to ignore those details in the plan. This is the idea behind General Eisenhower's statement that "plans are useless, but planning is indispensable."
  • A plan can give false confidence – Just because someone threw together a plan does not mean that it is a good plan.  Without a plan, people will be searching for possibilities.  Those same people may stop looking for solutions if they have been convinced that a good planning job has already been done.
  • A plan made too early can lead you down the wrong path – A plan that is made too early might be based on misunderstood conditions. Such a naïve plan is actually harmful because it locks the team into a particular direction at a time when the real value of that direction is unclear. Later, when the actual situation has become clear, it may be impossible to change to the better direction. This is the result of serious research by Danish economic geographer Bent Flyvbjerg. He studied hundreds of public works projects, and found that the projects that waited longer before making a plan tended to do better. He says: "Often there is 'lock in' or 'capture' of a certain project concept at an early stage, leaving analysis of alternatives weak or absent [from the plan]."

The conclusion from all this is that plans can be costly, so make sure you are getting an appropriate amount of benefit. In some cases you are actually better off not making any plan, and simply figuring it out as you go. In cases where a plan does make sense, there can be a right time to make it. A plan made too late might not have enough effect on the work already in progress, while a plan made too early can be flawed in ways that prevent you from finding the right route. A Late-Structured Process is a strong aspect of case management, where the case manager is allowed to plan after the work has started.

So don’t think that a full and complete plan is the obvious and indisputable requirement for success in all projects.   Doing more planning does not necessarily make the project better.  This is not to say that one should never plan.  A majority of cases need a plan, but a simple plan may be more effective than an elaborate one.  The amount of planning should be appropriate to the need.   There is a right time.  Delaying planning to a time where you know more about the specifics may be a good idea, may give you a better plan, which may allow you to outperform those who plan too much & too early.


by kswenson at November 05, 2014 10:53 AM

Drools & JBPM: Red Hat JBoss BRMS and BPMS Workbench and Rich Client Technology

Last week I published a blog post highlighting the recent R&D we are doing to make our web platform easily extensible and to allow the development of a self-service apps platform. The post had two video links showing progress.

Today I gave a presentation that highlighted the wider scope of our UI efforts, demonstrating what we've done within the BRMS and BPMS platforms and the flexibility and adaptability provided by our UI technology. It is a great testament to the power of GWT, Errai and UberFire, the three technologies driving all of this. We can't wait for the GWT 2.7 upgrade :)



As mentioned in the last post, the UberFire website is just a placeholder and there is no release yet; we'll be working on that over Christmas, along with documentation to make things more easily understandable and consumable for end users.

The presentation is now live on SlideShare, and I've taken the time to embed all the videos within it.
http://www.slideshare.net/MarkProctor/red-hat-jboss-brms-and-bpms-workbench-as-a-platform

The presentation is mostly self-explanatory screenshots with headers, with videos scattered throughout - many of which you might have seen before. It provides a very good overview, in a scattergun way, of what we have done, what we are doing and where we are going. Two of the videos, around the extensible workbench and the technical web IDE, have audio commentary.

It starts by showing the existing BRMS and BPMS functionality before moving on to the new extensibility efforts. Finally it covers a new demo we've done with the workbench reskinned for a more technical audience. It also shows ACE integration for Java and XML editing, as well as real-time provisioning and running of the Spring Pet Clinic application within the workbench.

by Mark Proctor (noreply@blogger.com) at November 05, 2014 03:14 AM

November 04, 2014

Sandy Kemsley: TIBCONOW 2014 Day 2 Keynote: Product Direction

Yesterday’s keynote was less about TIBCO products and customers, and more about discussions with industry thought leaders about disruptive innovation. This morning’s keynote continued...

[Content summary only, click through for full article and links]

by sandy at November 04, 2014 06:50 PM

Sandy Kemsley: Spotfire Content Analytics At TIBCONOW

(This session was from late yesterday afternoon, but I didn’t remember to post until this morning. Oops.) Update: the speakers were Thomas Blomberg from TIBCO and Rik Tamm-Daniels from Attivio....

[Content summary only, click through for full article and links]

by sandy at November 04, 2014 04:59 PM

Thomas Allweyer: What do BPM tools offer for monitoring business processes?

This study is one of three in-depth studies on BPM tools that the Stuttgart-based Fraunhofer Institute IAO conducted this year. Of the 28 vendors represented in the preceding market overview, a mere five took part in the topic of "monitoring business processes". That is quite astonishing, since monitoring and evaluating what actually happens in processes holds great potential: problems can be detected early, workloads optimized, and opportunities for improvement uncovered. Especially for BPMS vendors whose products include a process engine, the analysis possibilities opened up by process automation are an important selling point.

The authors structure the monitoring functionality into three task areas: data collection, evaluation, and presentation. The topic spans the entire process management life cycle: process mining algorithms can be used for process identification; during process modeling, key indicators and their target values must be defined and their collection planned; during process execution, the data of the individual process instances is evaluated, then aggregated into indicators and, as part of process monitoring, presented and analyzed in dashboards.

Two of the tools represented have a process engine and offer functions for monitoring process instances in real time. Timely evaluations are particularly important for customer-facing processes, ticket processes in support, and compliance-relevant indicators. System availability for core processes should also be monitored in real time.

Two further tools have their focus on process modeling. Their approach is to extract data from various third-party systems, relate it to the business process models, and thus enable process-related analyses. Finally, a process mining tool is represented, in which the processes are not defined up front. Instead, the system reconstructs, from data in various application systems, how the process instances actually ran. Here, too, various indicators can be determined and compared across different evaluations.

So quite different concepts are in use. Nevertheless, there are commonalities. All tools offer browser-based dashboards that are role-based and can be individually personalized, with content and layout adaptable via graphical modeling or configuration. All likewise offer the ability to define custom indicators and evaluations alongside predefined ones, to aggregate and filter indicators, and to create various statistics. For presentation, process models are sometimes used in addition to the usual bar charts and the like: selected indicators are displayed directly at the relevant places in the diagram, e.g. highlighted in color.

Even though a study with only five vendors is far from covering the market, it does make clear the essential facets and possibilities of process monitoring.


Falko Kötter, Monika Kochanowski, Thomas Renner:
Business Process Management Tools 2014 – Überwachung von Geschäftsprozessen.
Fraunhofer Verlag 2014.
More information and ordering from the IAO

by Thomas Allweyer at November 04, 2014 09:10 AM

Sandy Kemsley: BPM For Today At TIBCONOW

Roger King, who heads up TIBCO’s BPM product strategy, gave us an update on ActiveMatrix BPM, and some of the iProcess to AMX BPM tooling (there is a separate session on this tomorrow that I...

[Content summary only, click through for full article and links]

by sandy at November 04, 2014 12:26 AM

November 03, 2014

Sandy Kemsley: BPM COE at TIBCONOW 2014

Raisa Mahomed of TIBCO presented a breakout session on best practices for building a BPM center of excellence. She started with a description of different types of COEs based on Forrester’s...

[Content summary only, click through for full article and links]

by sandy at November 03, 2014 10:17 PM

Sandy Kemsley: TIBCONOW 2014 Opening Keynote: @Gladwell and More. Much More.

San Francisco! Finally, a large vendor figured out that they really can do a 2,500-person conference here rather than Las Vegas, it just means that attendees are spread out in a number of local...

[Content summary only, click through for full article and links]

by sandy at November 03, 2014 08:12 PM

November 01, 2014

Tom Debevoise: By the book: How DMN is connected to BPMN

Throughout our new book, 'The Microguide to Process and Decision Modeling in DMN and BPMN', we treat DMN as an integral notation for process modeling in BPMN. Even though Decision Model and Notation is a separate domain within the OMG, the DMN spec provides an explicit way to connect to processes in BPMN. DMN provides a schema model in XML format that includes two connection points. First, there is an explicit list that denotes the processes and tasks that use the decisions. Second, DMN provides input and output data types that implicitly correspond to the rule activity that invokes the decision's knowledge bases.

In Table 7 of the proposed Decision Model and Notation (DMN) specification, the class model for the decision defines the BPMN processes and tasks that require the decision to be made (usingProcesses and usingTasks).

Appendix B of the DMN specification says:

“The interface to the decision service will consist of:

  • Input: a list of contexts, providing instances of all the Input Data required by the encapsulated decisions
  • Output: a context, providing (at least) the results of evaluating all the decisions in the minimal output set, using the provided instance data.

When the service is called, providing the input, it returns the output.

In its simplest form a decision service would always evaluate all decisions in the encapsulation set and return all the results.”

Here we are assuming that the decision is created by business rules from input processes and accessed through a decision service. Other decisions can be manual or can require user input. The business rule task shape can denote the place within the process model that calls up a DMN model with the needed input and obtains the decision output.
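
As a rough illustration of that Appendix B contract, here is a hedged Java sketch of what calling such a decision service from a business rule task might look like. The DecisionService interface, the context maps, and the customerDiscount name are all invented for the example; DMN defines only the abstract interface, not a Java API:

    import java.util.Collections;
    import java.util.List;
    import java.util.Map;

    // Hypothetical rendering of the Appendix B contract: contexts in, context out.
    interface DecisionService {
        Map<String, Object> evaluate(List<Map<String, Object>> inputContexts);
    }

    public class CustomerDiscountTask {

        private final DecisionService decisionService; // e.g. backed by a BRMS

        public CustomerDiscountTask(DecisionService decisionService) {
            this.decisionService = decisionService;
        }

        // Invoked where the business rule task shape appears in the BPMN model:
        // the customer context goes in, the customer discount comes back out.
        public double decideDiscount(Map<String, Object> customerContext) {
            Map<String, Object> output =
                    decisionService.evaluate(Collections.singletonList(customerContext));
            return (Double) output.get("customerDiscount"); // invented output name
        }
    }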

As seen in the figure below, most DMN decision modelers utilize the rule shape to denote a connection to DMN. The inputs of a rule task are processed by the logic defined in the DMN model and then output for use in downstream gateways, participants, events and activities.

The customer is the input for the decision and the output is the customer discount. The process in the diagram can be made explicit according to the execution semantics. The figure shows the usage of the message shape. The association lines (dotted) are used to create the relationship between the message and the data type that is used in the process schema. When a decision for a customer discount is requested, a customer message is sent to the decision service. This is an initiating message, so the envelope is white. After the decision is completed, the BRMS returns the customer discount; this message is shown as a non-initiating message with light shading.

To summarize: the DMN spec explicitly defines how a BPMN process is connected to the decision through the usingProcesses and usingTasks metadata for the decision shape. The input and output are attributes of the decision, created by expressions and decision tables (more on that later).

by Tom Debevoise at November 01, 2014 06:09 PM

October 30, 2014

Drools & JBPM: The Drools and jBPM KIE Apps Framework

With the Drools and jBPM (KIE) 6 series came a new workbench, with the promise of eventual end-user extensibility. I finally have some teaser videos showing this working and what's in store. Make sure you select 1080p and go full screen to see them at their best.

What you see in these videos is the same workbench available on the Drools videos page. Once this stuff is released you'll be able to extend an existing Drools or jBPM (KIE) installation, or make a new one from scratch that doesn't have Drools or jBPM in it - i.e. the workbench and its extension support are available standalone, and you get to choose which plugins you do or don't want.

Here is a demo showing the new Bootstrap dynamic grid view builder used to build a perspective, which now doubles as an app. It uses the new RAD, JSFiddle-inspired environment to author a simple AngularJS plugin extension. This all writes to a Git backend, so you could author these with IntelliJ or Eclipse and just push them back into the Git repo. The demo then shows the creation of a dynamic menu, where our app is registered, and the new app directory: apps are given labels and can then be discovered in the directory - instead of, or as well as, via top menu entries. Over 2015 we'll be building a case management system which will complement this perfectly as the domain front end - all creating a fantastic self-service software platform.
http://youtu.be/KoJ5A5g7y4E

Here is a slightly earlier video showing our app builder working with DashBuilder:
http://youtu.be/Yhg31m4kRsM

Other components such as our Human Tasks and Forms will be available too. We also have some cool infrastructure coming for event publication and capture, and for timeline reporting, so you can visualise social activity within your organization - you'll be able to place the timeline components you see in this blog post on your own app pages:
http://blog.athico.com/2014/09/activity-insight-coming-in-drools-jbpm.html

All of this is driven by our new project UberFire, which provides the workbench infrastructure. The project is not yet announced or released, but will be soon - the website is currently just a placeholder; we'll blog as soon as there is something to see :)

by Mark Proctor (noreply@blogger.com) at October 30, 2014 07:35 PM

October 29, 2014

October 28, 2014

Sandy Kemsley: SAP’s Bigger Picture: The AppDev Play

Although I attended some sessions related to BPM and operational process intelligence, last week’s trip to SAP TechEd && d-code 2014 gave me a bit more breathing room to look at the bigger...

[Content summary only, click through for full article and links]

by sandy at October 28, 2014 12:55 PM

October 27, 2014

BPinPM.net: Want to be one step ahead in BPM? Get one of our very last conference tickets for free!

We are proud to announce that we have managed to engage Dr. Bernhard Krusche from Stiftung Nächste Gesellschaft (Foundation Next Society) as keynote speaker for this year's BPinPM.net Conference! :-)

Bernhard Krusche is an anthropologist, author, and explorer of the Next Society, and he will help us keep up the pace and continue to be one step ahead!

Last year we started to tackle the Digital Age and ran workshops to dive into Digital Age BPM - the results will be presented at the conference.

But this year's keynote is designed to go even further in this direction by challenging BPM with insights from the Next Society: Connect, Co-Create, Collaborate!

For more information about the Next Society, have a look at: http://x-society.net

If you have not registered yet, please don't hesitate to sign up for the conference. Only 5 tickets are left!

And you won't believe it: we are going to give away one of these very last tickets for free! Even if you are already holding a ticket, join the lottery now and spread the word - we will refund your ticket if you win.

Enter the lottery now - and, to be on the safe side, buy your conference ticket right away!

We look forward to welcoming you in Seeheim!

Best regards,
Mirko

by Mirko Kloppenburg at October 27, 2014 08:31 PM

Sandy Kemsley: What’s New With SAP Operational Process Intelligence

Just finishing up some notes from my trip to SAP TechEd && d-code last week with the latest on their Operational Process Intelligence product, which can pull events and data from multiple...

[Content summary only, click through for full article and links]

by sandy at October 27, 2014 01:14 PM

October 24, 2014

Thomas Allweyer: Process management guide with practical working aids

Füermann regards process management as a long-term organizational principle that cannot be introduced in the form of a single project but requires a comprehensive program. He divides this program into four phases, which also structure the book. First, however, the fundamentals of process organization are covered. Of the various possible organizational forms, the asymmetric matrix organization is highlighted: it resolves the tension between process and function through a matrix while giving precedence to the processes. Organizational units not directly involved in the customer-facing processes are set up as internal service providers, or they second their specialists into the core processes.

In the "infrastructure" phase, the various process management roles are filled and the program is planned. This phase is also about identifying the processes and representing them in a process map. Then comes the "description" phase, in which the process details are specified, documented as flow diagrams, and made binding. In the "steering" phase, indicators for measuring processes are established, interface agreements made, process audits conducted, and corrective measures initiated. The "Hoshin Kanri" method provides a framework for planning goals and measures. Finally comes the "improvement" phase: the principle of continuous improvement ensures the ongoing evolution of the processes, and Six Sigma and process re-engineering are described as approaches for larger changes.

For each phase, a set of working aids is provided, which can also be downloaded from the book's website, mostly as Excel files. These include templates for program plans, for SIPOC charts (Supplier – Input – Process – Output – Customer), and for concise process reports in A3 format. The book explains the use of each of these aids in detail.

The book focuses purely on organizational questions; all IT aspects are deliberately left out. Whether this still makes sense today, given the ever deeper penetration of IT into all processes, is at least debatable. Some of the presented working aids should also not be used in serious process management initiatives without appropriate software support; purely Excel-based process modeling, for example, quickly becomes unmanageable. And that program flowcharts are used for the graphical representation instead of widespread notations such as BPMN is not quite state of the art.

While the book contains nothing completely new, the individual topics are presented in a clear and comprehensible way. Many of the provided working aids can be very useful for the practical work of process consultants and managers.


Timo Füermann:
Prozessmanagement – Kompaktes Wissen, Konkrete Umsetzung, Praktische Arbeitshilfen.
Hanser 2014
The book at amazon.

by Thomas Allweyer at October 24, 2014 09:27 AM

Drools & JBPM: Red Hat Job Opening - Software Sustaining Engineer

We are looking to hire someone to help improve the quality of the BRMS and BPMS platforms. These are the productised versions of the Drools and jBPM open source projects.

The role will involve improving our test coverage, diagnosing problems, creating reproducers for them, and helping to fix them. You'll also be responsible for helping to set up and maintain our continuous integration environment, to help streamline the various aspects involved in getting timely, high-quality releases out.

So if you love Drools and jBPM, and want to help make them even better and even more robust - then this is the job for you :)

The role is remote, so you can be based almost anywhere.

URL to apply now http://jobs.redhat.com/jobs/descriptions/software-engineer-brno-jihomoravsky-kraj-czech-republic-job-1-4759718

Mark

by Mark Proctor (noreply@blogger.com) at October 24, 2014 05:10 AM

October 21, 2014

Sandy Kemsley: SAP TechEd Keynote with @_bgoerke

I spent yesterday getting to Las Vegas for SAP TechEd && d-code and missed last night’s keynote with Steve Lucas, but up this morning to watch Björn Goerke — head of SAP Product...

[Content summary only, click through for full article and links]

by sandy at October 21, 2014 05:44 PM

October 20, 2014

Thomas Allweyer: Which project management practices really lead to success?

Although numerous project management approaches and methods exist, there is little systematic research into which factors actually determine project success. The BPM lab at Koblenz University of Applied Sciences is therefore conducting the study "Success factors in project management" together with the Deutsche Gesellschaft für Projektmanagement (German Project Management Association). Anyone with practical project experience is invited to take part in the online survey. Structured questions capture the context, type, and practices used in one successful and one less successful project per respondent. In addition to the study report, participants receive special evaluations and key findings for their own industry, and can win a place in a workshop on agile project management.

The survey is open until November 26 at www.erfolgsfaktoren-projektmanagement.de.

by Thomas Allweyer at October 20, 2014 08:30 AM

October 16, 2014

Tom Debevoise: New Book: The Microguide to Process and Decision Modeling in BPMN/DMN

 

The Microguide to Process and Decision Modeling in BPMN/DMN is now available on Amazon. A little bit about the book: the landscape of process modeling has evolved, as have the best practices. The smartest companies are using decision modeling in combination with process modeling. The principal reason is that decisions and processes are discovered and managed in separate, yet interrelated, ways.

Decision Model and Notation (DMN) is an evolution of Business Process Model and Notation (BPMN) 2.0 into an even more powerful and capable tool set, and the Microguide covers both specifications. It also focuses on best practices in decision and process modeling. A number of these best practices have emerged, creating robust, agile, and traceable solutions. Decision management and decision modeling are critical, allowing for simpler, smarter, and more agile processes.

A simple decision and gateway controlling an execution path in response to a purchasing decision.

As the figure above shows, the proper use of decision modeling uncovers critical issues that the process must address to comply with the decision. Decision-driven processes act on the directives of decision logic: decision outputs affect the sequence of things that happen, the paths taken, and who should perform the work. Processes provide critical input into decisions, including data for validation and identification of events or process-relevant conditions. The combination of process and decision modeling is a powerful one.

In most business processes, an operational decision is the controlling factor driving processes. This is powerful, as many governments and enterprises focus on minimizing the event response lag because there is often a financial benefit to faster responses. Straight-through processing and automated decision making, not just automated processes, is also emphasizing the importance of decisions in processes. Developing a decision model in DMN provides a detailed, standardized approach that precisely directs the process and creates a new level of traceability.

Decision modeling can therefore be considered an organizing principle for designing many business processes. Most process modeling in BPMN is accomplished by matching a use case, written or otherwise, with workflow patterns. Process modeling is critical to the creation of a robust and sustainable solution. Without decision modeling, however, such an approach can result in decision logic becoming a sequence of gateways and conditions such that the decision remains hidden and scattered among the process steps.

Without decision modeling, critical decisions, such as how to source a requisition when financial or counter-party risk is unacceptable, or what to offer a customer, are lost to the details of the process. When the time comes to change or improve a decision, a process model in BPMN alone might not meet the need. Providing a notation for modeling decisions separately from processes is the objective of DMN.

by Tom Debevoise at October 16, 2014 09:19 PM

October 15, 2014

Sandy Kemsley: AIIM Information Chaos Rescue Mission – Toronto Edition

AIIM is holding a series of ECM-related seminars across North America, and since today’s is practically in my back yard, I decided to check it out. It’s a free seminar so heavily...

[Content summary only, click through for full article and links]

by sandy at October 15, 2014 05:34 PM

Bruce Silver: BPMN Explained – Part 2

Yesterday I tried to explain BPMN to those who don’t know what it is.  OK, they are probably saying, if BPMN is so great, why do I hear these complaints about it?  Yes, that’s a good question.

First, you need to understand exactly who is complaining.  If it’s a legacy tool vendor wedded to their proprietary (“much better!”) notation, well that speaks for itself.  Ditto if it’s a gray-haired process improvement consultant whose idea of a modern tool is a whiteboard that prints.  Which is most of them.  But even if you cross those guys off the list, there are normal end users who complain about it.

One complaint is there are too many shapes and symbols.  Actually, there are only three primary shapes, called flow nodes: activity, the rounded rectangle, denoting an action step in the process; gateway, the diamond, denoting conditional branching and merging in the flow; and event, the circle, denoting either the start or end of a process or subprocess, or possibly the process’s reaction to a signal that something happened.  Just three, far fewer than in a legacy flowcharting notation.  In BPMN, the solid arrow, called sequence flow, must connect at both head and tail to one of these three shape types.

The problem is that the detailed behavior of the flow nodes is actually determined by their icons, markers, and border styles.  There are way too many of those, I will readily admit.  Only a small fraction of them are widely used and important to know; the rest you can simply ignore.  When I started my BPMN training many years ago, I identified a basic working set of shapes and symbols called the Level 1 palette, mostly carried over from traditional flowcharting.  The purpose was to eliminate the need to learn useless BPMN vocabulary that would never be used.  When BPMN 2.0 came out 4 years ago, they did a similar thing, officially this time, but for a different purpose.  The so-called Descriptive process modeling conformance class is essentially the Level 1 working set.  Its purpose, from OMG’s standpoint, was to limit the set of shapes and symbols a tool vendor must support in order to claim BPMN support.  So… if you are new to BPMN, just stick to the Level 1/Descriptive working set.  It will handle most everything you are trying to show, and good BPMN tools in fact let you restrict the palette to just those elements.

I sometimes hear the opposite complaint, that BPMN does not have a standard way to visualize important information, like systems, organizational units, task completion times, or resource costs, available in their current process modeling tool.  Actually, many BPMN tools do have ways to include these things, but each in their own tool-specific way.  BPMN just describes the process logic, that is, how the process starts and ends and the order of the steps.  It doesn’t describe the internal details of a task, like its data or user interface, or decision logic, or systems involved, or important simulation parameters.  Its scope is quite limited.  There are some emerging standards for those other things that will eventually link up with BPMN, but they are not yet widely adopted.  Anyway, it’s important to distinguish the information a BPMN tool can support from information that is part of BPMN itself.

Finally, some people don’t like the fact that BPMN has rules.  A tool validating models against those rules might determine, for instance, that the way you’ve been modeling something for years is invalid in BPMN.  You can ignore that, of course, but remember the goal of BPMN is clear communication of the process logic.  A diagram that violates the rules of the specification probably does not do that very well.  Like any new language, BPMN asks that you take a little time to learn it.  It’s actually not that hard.

The post BPMN Explained – Part 2 appeared first on Business Process Watch.

by bruce at October 15, 2014 05:22 PM

October 14, 2014

Bruce Silver: BPMN Explained

On Twitter someone posted to me: “Have you ever seen a short overview of BPMN that makes sense to people who have never heard of it?”  Hmmm… Probably not.  So here is my attempt.

Business Process Modeling Notation, or BPMN, is a process diagramming language.  It describes, in a picture, the steps in a business process from start to end, an essential starting point whether you are simply documenting the process, analyzing it for possible improvement, or defining business requirements for an IT solution to a process problem. Dozens of process diagramming languages have existed since the 1980s at least, so what’s so special about BPMN?

First, BPMN is an open industry standard, under the auspices of the Object Management Group.  It is not owned by a particular tool or consulting company.  A wide variety of tools support it, and the meaning of the business process diagram is independent of the tool used to create it. With BPMN you don’t need to standardize on a single tool for everyone in the organization, since they all share a common process modeling language.

Second, unlike flowcharts created in a tool like Visio or Powerpoint, the meaning of each BPMN shape and symbol is quite precise – it’s defined in a specification – and in principle independent of the personal interpretation of the person who drew it.  I say “in principle” because it is possible to violate the rules of the BPMN specification, just like it is possible to write an English sentence that violates accepted rules of grammar or spelling.  Nothing drastic happens in that case, but the diagram’s effectiveness at communication is decreased.

Third, BPMN is a language shared by business and IT, the first process modeling language able to make that claim.  When BPMN was first developed about 10 years ago, the only available process modeling standards at that time – UML activity diagrams and IDEF, among others – were rejected as “IT standards” that would not be accepted by business users.  To business users, a process diagram looked like a swimlane flowchart, widely used by BPM practitioners but lacking precise definition in a specification.  BPMN adopted the basic look and feel of a swimlane flowchart, and added to it the precision and expressiveness required by IT.  In fact, that precision and expressiveness is sufficient to drive a process automation engine in a BPM Suite (BPMS).  The fact that the visual language used by the business to describe a proposed To-Be process is the same as the language used by developers to build that process in a BPMS has opened up a new era of business-empowered process solutions in which business and IT collaborate closely throughout a faster and more agile process improvement cycle.

Even if you have no intention to create an automated process solution in a BPMS, BPMN diagrams can reveal information critical to process documentation and analysis that is missing in traditional swimlane flowcharts: exactly how the process starts and ends, what each instance of the process represents, how various exceptions are handled, and the interactions between the process and the customer, external service providers, and other processes.  The rules of the BPMN specification do not require these elements, but use of best-practice modeling conventions in conjunction with a structured methodology can ensure they are included.  My book BPMN Method and Style and my BPMessentials training of the same name are based on such an approach.

So yes, there is a cost to adopting BPMN, whether you are moving from casual tooling like Powerpoint or Visio flowcharts or from a powerful but proprietary language like ARIS EPC.  There is a new diagram vocabulary to learn, diagramming rules, as well as the aforementioned conventions and methodology such as Method and Style.  But the benefits of speaking a common process language are tremendous.  The investment in process discovery and analysis is far more than the cost of a tool or the time required to draw the diagrams.  It involves hundreds of man-hours of meetings, information gathering from stakeholders, workshops, and presentations to management.  The process diagram is a distillation of all that time and effort.  If it cannot be shared across the whole project team – business and IT – or to other project teams across the enterprise, now or in the future, you are throwing away much of that investment.  BPMN provides a way to share it, without requiring everyone to standardize on a single tool.

The post BPMN Explained appeared first on Business Process Watch.

by bruce at October 14, 2014 05:24 PM

Thomas Allweyer: Social BPM capabilities of process management tools

The term "Social BPM" is not easy to pin down. Given the spread of social networks and social software in companies, it stands to reason that these can also be very useful for business process management - especially since defining and executing processes almost always requires several participants to work together successfully. But what concrete uses are there for newsfeeds, contacts, comment functions, wikis, etc. in process management, and what benefit do they bring?

The authors of this study first work out the potential in the different phases of the process management cycle. In the process identification and modeling phase, for example, Social BPM offers the advantage that many participants can contribute actively: several people can work on a model together, be notified of changes, and leave comments. In process implementation and execution, targeted information on role-based process portals helps, for example; workflow tasks can be fed into company-internal social networks, and workflows can be triggered by social media events. In process monitoring and continuous improvement, emerging problems can quickly be communicated to everyone affected, and virtual communities can serve to evolve processes further.

The ten BPM software vendors that took part in this in-depth study are mostly makers of modeling tools. Although almost all state that they support at least the execution of approval workflows and the like, the focus for most is clearly on modeling and analysis. Accordingly, the Social BPM functions on offer relate mainly to the process identification and modeling phase. While the majority of vendors only began building dedicated social software functionality into their products around 2010, many have long offered collaboration capabilities, such as central repositories for distributed modeling; the term "Collaborative BPM" is often used in this context.

Practically all of the products examined provide process portals, comment and rating functions, newsfeeds and subscriptions, as well as task management for the activities in the model life cycle. A few offer simplified modeling, e.g. by means of tabular representations, which enables employees without modeling training to capture their own processes. Integration of wikis and blogs is rarer, although several vendors have announced the integration of wikis into their modeling platforms for the future.

Beyond these predefined categories, the vendors could name further social functions they provide. The answers range from voting functions through knowledge and idea management to career planning and case management. This wide span shows how diverse the field of Social BPM is.

Usually the social functions are fully integrated into the process management tool. Several vendors instead, or additionally, offer integration with other platforms, above all Microsoft SharePoint.

One thing is somewhat surprising when reading the study: the use of social functions during process execution hardly appears at all. This is certainly due in large part to the field of participants, which contains hardly any execution-focused BPMS vendors. On the other hand, the preceding overview study did include a number of BPMS vendors among its 28 participants. One can speculate why hardly any of them took part in the Social BPM study: either they have little to offer in this area, or the topic of Social BPM is viewed almost exclusively through the lens of collaborative modeling. Yet the potential in process execution is considerably higher than in process modeling, since far more employees execute processes than model them.

Possibly, though, a strict separation is still being made between highly structured processes and collaborative tasks. When a BPMS is used for highly structured processes, social functions are left out; for collaborative tasks, company-internal social networks may be used, but independently of the BPMS. Since highly structured processes often require collaboration too, and certain coordination tasks within collaborative work could be automated, closer integration would certainly make sense. The much-discussed concepts of Adaptive Case Management offer useful approaches here as well - but there, too, practical adoption still lags behind the discussion.


Jens Drawehn, Oliver Höß:
Business Process Management Tools 2014 – Social BPM.
Fraunhofer Verlag 2014.
Further information and ordering from the IAO

by Thomas Allweyer at October 14, 2014 09:50 AM

Drools & JBPM: Decision Camp - 2014 - What you are missing out on

Here is the Decision Camp 2014 agenda, so you can see what you are missing out on if you aren't there :)

Tuesday

9:30 - 10:00 am
Registration
11 am - 12 pm
CTO Panel
Mark Proctor, Red Hat
Dr. Jacob Feldman, OpenRules
Carlos Serrano-Morales, Sparkling Logic
Moderated by James Taylor
12 - 1 pm
Lunch

General Sessions

We will host Best Practices sessions all day, presented by fellow practitioners or technology providers.
General sessions will have breakout tracks for rules writers and software architects.
1 - 2 pm
An Intelligence Led Approach to Decision Management in Tax Administration
Dr. Marcia Gottgtroy, Inland Revenue New Zealand
Decision Tables as a Programming tool
Howard Rogers, RapidGen Software
Are Business Rules Obsolete?
Kenny Shi, UBER Technologies
2 - 3 pm
Customer Support Process Automation
Erwin De Ley, iSencia Belgium
Building Domain-Specific Decision Models
Dr. Jacob Feldman, OpenRules
4 - 5 pm
Explanation-based E-Learning for Business Decision Making and Education 
Benjamin Grosof & Janine Bloomfield, Coherent Knowledge Systems

Wednesday

9:30 - 10 am
Registration

Vertical Day
Healthcare

Davide Sottara is our chair for the Healthcare day 

Vertical Day
Financial Services

10 - 11 am
TBA
Dr. Davide Sottara, PhD
12 - 1 pm
Lunch & Networking
1 - 2 pm
Cloud-based CEP in Healthcare 
Mariano Nicolas De Maio, PlugTree
Analytics for Payment Fraud
Carlos Serrano-Morales, Sparkling Logic
4 - 5 pm
Speaker Panel
All Speakers

by Mark Proctor (noreply@blogger.com) at October 14, 2014 08:08 AM

Drools & JBPM: Classic Games Development with Drools

I realised I didn't upload my slides from Decision Camp 2013, where I talked about using games to learn rule-based programming. Sorry about that; here they are, better late than never:
Learning Rule Based Programming Using Games

The talk provides a gentle introduction to rule engines and then covers a number of different games, all of which are available to run from the drools examples project.
  • Number Guess
  • Adventure Game
  • Space Invaders
  • Pong
  • Wumpus World
While there is no video of the presentation I gave, I have made videos for some of these games in the past. Beware, though, that some of them may be a little out of date compared to the versions in our current examples project.

by Mark Proctor (noreply@blogger.com) at October 14, 2014 02:33 AM

October 10, 2014

Drools & JBPM: 3 Days until Decision Camp 2014, San Jose (13-15 Oct)

Only 3 days to go until Decision Camp 2014 arrives, a free conference in the San Jose area for business rules and decision management practitioners. The conference is multi-track, with 3 concurrent tracks. The full Decision Camp agenda can be found here.

Like last year, RuleML will be participating and presenting, with speakers such as Dr. Benjamin Grosof, which is a great opportunity to catch up on the latest happenings in the rules standards industry.

Last year I did a games talk; this year I'm doing something a little more technical, to reflect my current research. Here are my title and abstract.
Demystifying Truth Maintenance and Belief Systems
Basic Truth Maintenance is a common feature, available in many production rule systems, but it is one that is not generally well understood. This talk will start by introducing the mechanics of rule engines and how they are extended for the common TMS implementation. It will discuss the limitations of these systems and introduce Justification-based Truth Maintenance (JTMS) as a way to add contradictions that trigger retractions. This will lead on to Defeasible Logic, which, while sounding complex, facilitates the resolution of conflicting rules, premises and contradictions in a way that follows typical argumentation theory. Finally we will demonstrate how the core of this can be abstracted to allow pluggable beliefs, so that JTMS and Defeasible can be swapped in and out, along with other systems such as Bayesian belief systems.




by Mark Proctor (noreply@blogger.com) at October 10, 2014 05:11 PM

Thomas Allweyer: BizDevs – the new way for business and IT to work together?

The English edition of Volker Stiehl's book on process-driven applications has recently become available. The author's remarks in the SAP Community Network, in which he describes the background and approach of the book, are worth reading. He separates an application into a business process layer and an implementation layer, both of which are described with BPMN. The processes of both layers are executed and interact with one another. Business and IT share equal responsibility for the executable model of the business process layer.

In his blog post Stiehl coins the term "BizDevs" for the close cooperation between business and developers that is needed to build process-driven applications. In doing so he draws on the now widespread term "DevOps" for the close integration of software development and operations. Such closer collaboration would certainly be desirable. And the architectural approach presented in the book could point the way for many BPMS-based applications.


Stiehl, Volker:
Process-Driven Applications with BPMN
Springer 2014
The book at amazon

Review of the German edition

by Thomas Allweyer at October 10, 2014 07:52 AM

October 06, 2014

Thomas Allweyer: More flexible BPM through graph databases

Guest post by Helmut Heptner. We are witnessing a transition from traditional business process management systems based on relational databases, which would not be up to the demands of the "Big Data" era even after conversion and reprogramming, to systems based on graph databases. The forerunners of this development are popular social network providers such as Facebook, Google+ and Twitter, to name just a few. What they all have in common are large user numbers and a vast number of relationships between users, which can nevertheless be combined on demand into the most varied analyses within seconds.

What is a graph database, and how does it differ from classical databases?

A relational database is, simply put, a collection of tables (the relations) whose rows store records. Because of growing data volumes and the ever-increasing number of existing and possible relationships between the data, this model is not well suited for many areas, especially as the foundation for business process management systems (BPMS). The larger the data volume and the more complex the relationships between the data, the longer computations take.

Today's requirements are better met by graph databases. So-called NoSQL technologies are gaining popularity at breakneck speed. The best-known and largest provider of this technology is probably Neo Technology with Neo4j (http://www.neo4j.org), an open-source graph database implemented in Java. The developers themselves describe Neo4j on their website as a transactional database engine that stores data in graphs instead of tables. For a practical introduction to the theory and practice of graph databases, neo4j.org with its information and videos is a helpful starting point.

The best-known example of the use of graph databases is Facebook's "Social Graph". This graph exploits relationships between people. The nodes typical of a graph represent people, and each node is assigned the person's name. The edges (the second element of graph databases) represent relationships. These are characterized, among other things, by a type such as "likes", "is friends with", "dislikes" and so on. Simple examples of such graphs are family trees with family members as nodes and parent-child relationships as edges, public transport route maps, IT network structures, or indeed process flows in BPM.
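
To make the node-and-edge idea concrete, here is a minimal sketch using Neo4j's embedded Java API as it looked in the 2.x line; the database path, the names and the relationship type are illustrative, not taken from any product mentioned in this post.

import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Node;
import org.neo4j.graphdb.RelationshipType;
import org.neo4j.graphdb.Transaction;
import org.neo4j.graphdb.factory.GraphDatabaseFactory;

public class SocialGraphSketch {
    // Relationship types are typically declared as enums implementing
    // the RelationshipType interface.
    enum RelTypes implements RelationshipType { IS_FRIENDS_WITH }

    public static void main(String[] args) {
        GraphDatabaseService db =
                new GraphDatabaseFactory().newEmbeddedDatabase("data/social");
        try (Transaction tx = db.beginTx()) {
            Node alice = db.createNode();     // nodes represent people
            alice.setProperty("name", "Alice");
            Node bob = db.createNode();
            bob.setProperty("name", "Bob");
            // the edge carries the relationship type
            alice.createRelationshipTo(bob, RelTypes.IS_FRIENDS_WITH);
            tx.success();                     // mark the transaction successful
        }
        db.shutdown();
    }
}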

Emil Eifrem, CEO of Neo Technology, puts it this way: "The world's most innovative companies – among them Google, Facebook, Twitter, Adobe and American Express – have already switched to graph technologies to tackle the challenges of complex data at their core."

When it comes to large ("Big Data"), distributed and unstructured data volumes, as in business process management among other fields, graph database systems are usually vastly superior to traditional database systems.

Advantages: why and in which scenarios graph databases trump traditional databases

There are currently three trends in data processing:

  • As the number of users, systems and data to be captured grows, data volumes rise exponentially. This development is captured in the buzzword "Big Data", established since around 2013.
  • Data no longer resides on a single central system but is often distributed, to ensure redundancy, optimize performance and manage load. Well-known examples of this development are Amazon, Google and other cloud providers.
  • Data structures are becoming more complex and more interconnected through the Internet, social networks and open interfaces to data from a wide variety of systems.

These trends can no longer be mastered with established database systems. Graph database systems are not just an answer to these challenges; the challenges are, in fact, their strength:

  • In contrast to designing a relational database, data modeling for a graph is considerably simpler. Essentially it suffices to record business process steps as elements, connect them with arrows, and finally define conditions and properties. A data model created this way can usually be transferred into the database unchanged. You no longer have to be a programmer or database specialist: all participants can understand the model and adapt it to changing requirements without compromising the integrity of the graph and the associated infrastructure.
  • For business processes, the flexible data model of graph databases is considerably more agile than other systems. That lies in the nature of the matter: business processes are modeled as graphs. Decisions based on evolving business-critical data can likewise be represented with dependencies and rules. Modeling business processes as graphs supports agility, because process changes and process management can be reacted to quickly and repeatably.
  • Graph databases outperform competing technologies because they do not recompute relationships at query time but follow them (see the sketch after this list). The reason: relationships are created in the graph database when the data is inserted and are immediately available from then on. Queries start at a start node and follow the relationships between nodes. This enables real-time queries, for example, and thus immediate, exact and useful interactions. Graph database systems are on the rise above all because they do not compute existing relations between data at runtime, but simply use them.
  • Further strengths of graph databases lie in their design (reliability) and in mathematical and epistemological foundations matured over centuries (data insight).
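
As a small companion sketch of the "follow, don't join" point: starting from a node, related nodes are reached by walking the relationship pointers stored on it. This continues the hypothetical example above (same "name" property and IS_FRIENDS_WITH relationship type) and is again only an illustration of the idea.

import org.neo4j.graphdb.Direction;
import org.neo4j.graphdb.Node;
import org.neo4j.graphdb.Relationship;
import org.neo4j.graphdb.RelationshipType;

public class FollowRelationships {
    // Prints the direct friends of a person node. No join is computed:
    // each relationship on the node is followed as a direct pointer.
    // In Neo4j 2.x this must be called inside an open transaction.
    static void printFriends(Node person, RelationshipType friendsType) {
        for (Relationship rel
                : person.getRelationships(friendsType, Direction.OUTGOING)) {
            Node friend = rel.getEndNode();
            System.out.println(person.getProperty("name")
                    + " is friends with " + friend.getProperty("name"));
        }
    }
}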

The advantages for managing business processes

That graph databases offer advantages above all in large companies with many complex processes is mainly due to the natural way business processes can be mapped onto graphs and to the flexible design of graphs. Experience shows that business processes are subject to continual change and must be adapted "on the fly". With relational and other database models this is not readily possible; the BPMS may have to be laboriously updated by specialists and may not remain available without interruption. With a BPMS based on graph database technology, such as Comindware Tracker, such changes are possible without interruption during live operation.

A further advantage lies in how well graph databases adapt to developments in the company. Business processes mature over time: more and more employees become involved, and ever new conditions and dependencies must be taken into account. In a graph database, new nodes are simply defined, transitions with conditions are added and properties are defined – and the underlying business processes are correctly represented.

The main advantage of graph databases is their ability not only to manage the data and make it available for analysis, but also to store the business rules. This relieves the employees involved in the processes of routine work, and the potential of the knowledge workers remains available throughout the processes.

Conclusion

While relational database applications can often fall back on familiar ground, developers of graph database applications are breaking new ground. Comindware, provider of the BPM solution Comindware Tracker, was founded in 2008 and is a young, modern company that faces the demands placed on modern BPM systems and has quickly grown to 70 employees. The founders recognized that most business processes are unstructured or change over time, and that a modern BPM solution must be up to these demands. Comindware used the Neo4j graph database and, on that basis, developed the patented ElasticData technology for its own solutions.


About the author

The author, former Acronis managing director Helmut Heptner, has been managing director of Comindware GmbH since March 2012 and is responsible for operations in Central Europe. Comindware is among the pioneers of adaptive business process management and currently employs over 70 people worldwide. Comindware Tracker is used by companies such as Gazprom Avia and a large German car maker.

by Thomas Allweyer at October 06, 2014 08:38 AM

October 03, 2014

Drools & JBPM: 10 Days until Decision Camp 2014, San Jose (13-15 Oct)

Only 10 days to go until Decision Camp 2014 arrives, a free conference in the San Jose area for business rules and decision management practitioners. The conference is multi-track, with 3 concurrent tracks. The full Decision Camp agenda can be found here.

Like last year, RuleML will be participating and presenting, with speakers such as Dr. Benjamin Grosof, which is a great opportunity to catch up on the latest happenings in the rules standards industry.

Last year I did a games talk; this year I'm doing something a little more technical, to reflect my current research. Here are my title and abstract.
Demystifying Truth Maintenance and Belief Systems
Basic Truth Maintenance is a common feature, available in many production rule systems, but it is one that is not generally well understood. This talk will start by introducing the mechanics of rule engines and how they are extended for the common TMS implementation. It will discuss the limitations of these systems and introduce Justification-based Truth Maintenance (JTMS) as a way to add contradictions that trigger retractions. This will lead on to Defeasible Logic, which, while sounding complex, facilitates the resolution of conflicting rules, premises and contradictions in a way that follows typical argumentation theory. Finally we will demonstrate how the core of this can be abstracted to allow pluggable beliefs, so that JTMS and Defeasible can be swapped in and out, along with other systems such as Bayesian belief systems.




by Mark Proctor (noreply@blogger.com) at October 03, 2014 04:12 PM

October 02, 2014

Drools & JBPM: Trace output with Drools

Drools 6 includes trace output that can help you get an idea of what is going on in your system: how often things are executed, and with how much data.

It can also help you understand that Drools 6 now uses a goal-based algorithm, with a linking mechanism to link rules in for evaluation. More details on that here:
http://blog.athico.com/2013/11/rip-rete-time-to-get-phreaky.html

The first thing to do is set your slf4j logger to trace mode:
<configuration>
  <appender name="consoleAppender" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <!-- %l lowers performance -->
      <!--<pattern>%d [%t] %-5p %l%n %m%n</pattern>-->
      <pattern>%d [%t] %-5p %m%n</pattern>
    </encoder>
  </appender>

  <logger name="org.drools" level="trace"/>

  <root level="info"><!-- TODO We probably want to set default level to warn instead -->
    <appender-ref ref="consoleAppender" />
  </root>
</configuration>

Let's take the shopping example; you can find the Java and DRL files for it here:
https://github.com/droolsjbpm/drools/blob/master/drools-examples/src/main/resources/org/drools/examples/shopping/Shopping.drl
https://github.com/droolsjbpm/drools/blob/master/drools-examples/src/main/java/org/drools/examples/shopping/ShoppingExample.java


Running the example will output a very detailed and long execution log. Initially you'll see objects being inserted, which causes linking. Linking of nodes and rules is explained in the Drools 6 algorithm link above. In summary, 1..n nodes link in a segment as objects are inserted.
2014-10-02 02:35:09,009 [main] TRACE Insert [fact 0:1:1455177644:1455177644:1:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Customer@56bc3fac]
2014-10-02 02:35:09,020 [main] TRACE LinkNode notify=false nmask=1 smask=1 spos=0 rules=

Then 1..n segments link in a rule. When a rule is linked in, it's scheduled on the agenda for evaluation.
2014-10-02 02:35:09,043 [main] TRACE  LinkRule name=Discount removed notification
2014-10-02 02:35:09,043 [main] TRACE Queue RuleAgendaItem [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,043 [main] TRACE Queue Added 1 [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]

When it eventually evaluates a rule, the log indents as it visits each node, evaluating from root to tip. Each node reports how much data is being inserted, updated or deleted at that point.
2014-10-02 02:35:09,046 [main] TRACE Rule[name=Apply 10% discount if total purchases is over 100] segments=2 TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,047 [main] TRACE 1 [ AccumulateNode(12) ] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,047 [main] TRACE Segment 1
2014-10-02 02:35:09,047 [main] TRACE 1 [ AccumulateNode(12) ] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,047 [main] TRACE rightTuples TupleSets[insertSize=2, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,056 [main] TRACE 2 [RuleTerminalNode(13): rule=Apply 10% discount if total purchases is over 100] TupleSets[insertSize=1, deleteSize=0, updateSize=0]

You can use this information to see how often rules evaluate, how much linking and unlinking happens, how much data propagates and, more importantly, how much wasted work is done. Here is the full log:
2014-10-02 02:35:08,889 [main] DEBUG Starting Engine in PHREAK mode
2014-10-02 02:35:08,927 [main] TRACE Adding Rule Purchase notification
2014-10-02 02:35:08,929 [main] TRACE Adding Rule Discount removed notification
2014-10-02 02:35:08,931 [main] TRACE Adding Rule Discount awarded notification
2014-10-02 02:35:08,933 [main] TRACE Adding Rule Apply 10% discount if total purchases is over 100
2014-10-02 02:35:09,009 [main] TRACE Insert [fact 0:1:1455177644:1455177644:1:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Customer@56bc3fac]
2014-10-02 02:35:09,020 [main] TRACE LinkNode notify=false nmask=1 smask=1 spos=0 rules=
2014-10-02 02:35:09,020 [main] TRACE LinkSegment smask=2 rmask=2 name=Discount removed notification
2014-10-02 02:35:09,025 [main] TRACE LinkSegment smask=2 rmask=2 name=Apply 10% discount if total purchases is over 100
2014-10-02 02:35:09,028 [main] TRACE LinkNode notify=true nmask=1 smask=1 spos=0 rules=[RuleMem Purchase notification], [RuleMem Discount removed notification], [RuleMem Discount awarded notification], [RuleMem Apply 10% discount if total purchases is over 100]
2014-10-02 02:35:09,028 [main] TRACE LinkSegment smask=1 rmask=1 name=Purchase notification
2014-10-02 02:35:09,028 [main] TRACE LinkSegment smask=1 rmask=3 name=Discount removed notification
2014-10-02 02:35:09,043 [main] TRACE LinkRule name=Discount removed notification
2014-10-02 02:35:09,043 [main] TRACE Queue RuleAgendaItem [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,043 [main] TRACE Queue Added 1 [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,043 [main] TRACE LinkSegment smask=1 rmask=1 name=Discount awarded notification
2014-10-02 02:35:09,043 [main] TRACE LinkSegment smask=1 rmask=3 name=Apply 10% discount if total purchases is over 100
2014-10-02 02:35:09,043 [main] TRACE LinkRule name=Apply 10% discount if total purchases is over 100
2014-10-02 02:35:09,043 [main] TRACE Queue RuleAgendaItem [Activation rule=Apply 10% discount if total purchases is over 100, act#=1, salience=0, tuple=null]
2014-10-02 02:35:09,043 [main] TRACE Queue Added 2 [Activation rule=Apply 10% discount if total purchases is over 100, act#=1, salience=0, tuple=null]
2014-10-02 02:35:09,043 [main] TRACE Added Apply 10% discount if total purchases is over 100 to eager evaluation list.
2014-10-02 02:35:09,044 [main] TRACE Insert [fact 0:2:14633842:14633842:2:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Product@df4b72]
2014-10-02 02:35:09,044 [main] TRACE Insert [fact 0:3:732189840:732189840:3:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Product@2ba45490]
2014-10-02 02:35:09,044 [main] TRACE Insert [fact 0:4:939475028:939475028:4:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Purchase@37ff4054]
2014-10-02 02:35:09,045 [main] TRACE BetaNode insert=1 stagedInsertWasEmpty=true
2014-10-02 02:35:09,045 [main] TRACE LinkNode notify=true nmask=1 smask=1 spos=1 rules=[RuleMem Purchase notification]
2014-10-02 02:35:09,045 [main] TRACE LinkSegment smask=2 rmask=3 name=Purchase notification
2014-10-02 02:35:09,045 [main] TRACE LinkRule name=Purchase notification
2014-10-02 02:35:09,046 [main] TRACE Queue RuleAgendaItem [Activation rule=Purchase notification, act#=2, salience=10, tuple=null]
2014-10-02 02:35:09,046 [main] TRACE Queue Added 1 [Activation rule=Purchase notification, act#=2, salience=10, tuple=null]
2014-10-02 02:35:09,046 [main] TRACE BetaNode insert=1 stagedInsertWasEmpty=true
2014-10-02 02:35:09,046 [main] TRACE LinkNode notify=true nmask=1 smask=1 spos=1 rules=[RuleMem Apply 10% discount if total purchases is over 100]
2014-10-02 02:35:09,046 [main] TRACE LinkSegment smask=2 rmask=3 name=Apply 10% discount if total purchases is over 100
2014-10-02 02:35:09,046 [main] TRACE LinkRule name=Apply 10% discount if total purchases is over 100
2014-10-02 02:35:09,046 [main] TRACE Added Apply 10% discount if total purchases is over 100 to eager evaluation list.
2014-10-02 02:35:09,046 [main] TRACE Insert [fact 0:5:8996952:8996952:5:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Purchase@894858]
2014-10-02 02:35:09,046 [main] TRACE BetaNode insert=2 stagedInsertWasEmpty=false
2014-10-02 02:35:09,046 [main] TRACE BetaNode insert=2 stagedInsertWasEmpty=false
2014-10-02 02:35:09,046 [main] TRACE Rule[name=Apply 10% discount if total purchases is over 100] segments=2 TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,047 [main] TRACE 1 [ AccumulateNode(12) ] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,047 [main] TRACE Segment 1
2014-10-02 02:35:09,047 [main] TRACE 1 [ AccumulateNode(12) ] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,047 [main] TRACE rightTuples TupleSets[insertSize=2, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,056 [main] TRACE 2 [RuleTerminalNode(13): rule=Apply 10% discount if total purchases is over 100] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,057 [main] TRACE Segment 1
2014-10-02 02:35:09,057 [main] TRACE 2 [RuleTerminalNode(13): rule=Apply 10% discount if total purchases is over 100] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,057 [main] TRACE Rule[name=Apply 10% discount if total purchases is over 100] segments=2 TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,057 [main] TRACE 3 [ AccumulateNode(12) ] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,057 [main] TRACE Rule[name=Purchase notification] segments=2 TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,057 [main] TRACE 4 [JoinNode(5) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Purchase]] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,057 [main] TRACE Segment 1
2014-10-02 02:35:09,057 [main] TRACE 4 [JoinNode(5) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Purchase]] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,058 [main] TRACE rightTuples TupleSets[insertSize=2, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,058 [main] TRACE 5 [RuleTerminalNode(6): rule=Purchase notification] TupleSets[insertSize=2, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,058 [main] TRACE Segment 1
2014-10-02 02:35:09,058 [main] TRACE 5 [RuleTerminalNode(6): rule=Purchase notification] TupleSets[insertSize=2, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,058 [main] TRACE Fire "Purchase notification"
[[ Purchase notification active=false ] [ [fact 0:4:939475028:939475028:4:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Purchase@37ff4054]
[fact 0:1:1455177644:1455177644:1:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Customer@56bc3fac] ] ]
Customer mark just purchased shoes
2014-10-02 02:35:09,060 [main] TRACE Fire "Purchase notification"
[[ Purchase notification active=false ] [ [fact 0:5:8996952:8996952:5:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Purchase@894858]
[fact 0:1:1455177644:1455177644:1:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Customer@56bc3fac] ] ]
Customer mark just purchased hat
2014-10-02 02:35:09,061 [main] TRACE Removing RuleAgendaItem [Activation rule=Purchase notification, act#=2, salience=10, tuple=null]
2014-10-02 02:35:09,061 [main] TRACE Queue Removed 1 [Activation rule=Purchase notification, act#=2, salience=10, tuple=null]
2014-10-02 02:35:09,061 [main] TRACE Rule[name=Discount removed notification] segments=2 TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,061 [main] TRACE 6 [NotNode(8) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,061 [main] TRACE Segment 1
2014-10-02 02:35:09,061 [main] TRACE 6 [NotNode(8) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,061 [main] TRACE rightTuples TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,061 [main] TRACE 7 [RuleTerminalNode(9): rule=Discount removed notification] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,061 [main] TRACE Segment 1
2014-10-02 02:35:09,061 [main] TRACE 7 [RuleTerminalNode(9): rule=Discount removed notification] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,061 [main] TRACE Fire "Discount removed notification"
[[ Discount removed notification active=false ] [ null
[fact 0:1:1455177644:1455177644:1:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Customer@56bc3fac] ] ]
Customer mark now has a discount of 0
2014-10-02 02:35:09,063 [main] TRACE Removing RuleAgendaItem [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,063 [main] TRACE Queue Removed 1 [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,063 [main] TRACE Fire "Apply 10% discount if total purchases is over 100"
[[ Apply 10% discount if total purchases is over 100 active=false ] [ [fact 0:6:2063009760:1079902208:6:null:NON_TRAIT:120.0]
[fact 0:1:1455177644:1455177644:1:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Customer@56bc3fac] ] ]
2014-10-02 02:35:09,071 [main] TRACE Insert [fact 0:7:874153561:874153561:7:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Discount@341a8659]
2014-10-02 02:35:09,071 [main] TRACE LinkSegment smask=2 rmask=3 name=Discount removed notification
2014-10-02 02:35:09,071 [main] TRACE LinkRule name=Discount removed notification
2014-10-02 02:35:09,071 [main] TRACE Queue RuleAgendaItem [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,071 [main] TRACE Queue Added 1 [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,071 [main] TRACE BetaNode insert=1 stagedInsertWasEmpty=true
2014-10-02 02:35:09,071 [main] TRACE LinkNode notify=true nmask=1 smask=1 spos=1 rules=[RuleMem Discount awarded notification]
2014-10-02 02:35:09,071 [main] TRACE LinkSegment smask=2 rmask=3 name=Discount awarded notification
2014-10-02 02:35:09,071 [main] TRACE LinkRule name=Discount awarded notification
2014-10-02 02:35:09,071 [main] TRACE Queue RuleAgendaItem [Activation rule=Discount awarded notification, act#=7, salience=0, tuple=null]
2014-10-02 02:35:09,071 [main] TRACE Queue Added 3 [Activation rule=Discount awarded notification, act#=7, salience=0, tuple=null]
Customer mark now has a shopping total of 120.0
2014-10-02 02:35:09,071 [main] TRACE Removing RuleAgendaItem [Activation rule=Apply 10% discount if total purchases is over 100, act#=1, salience=0, tuple=null]
2014-10-02 02:35:09,071 [main] TRACE Queue Removed 2 [Activation rule=Apply 10% discount if total purchases is over 100, act#=1, salience=0, tuple=null]
2014-10-02 02:35:09,071 [main] TRACE Rule[name=Discount removed notification] segments=2 TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,072 [main] TRACE 8 [NotNode(8) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,072 [main] TRACE Segment 1
2014-10-02 02:35:09,072 [main] TRACE 8 [NotNode(8) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,072 [main] TRACE rightTuples TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,073 [main] TRACE 9 [RuleTerminalNode(9): rule=Discount removed notification] TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,073 [main] TRACE Segment 1
2014-10-02 02:35:09,073 [main] TRACE 9 [RuleTerminalNode(9): rule=Discount removed notification] TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,073 [main] TRACE Removing RuleAgendaItem [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,073 [main] TRACE Queue Removed 1 [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,073 [main] TRACE Rule[name=Discount awarded notification] segments=2 TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,073 [main] TRACE 10 [JoinNode(10) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,073 [main] TRACE Segment 1
2014-10-02 02:35:09,073 [main] TRACE 10 [JoinNode(10) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,074 [main] TRACE rightTuples TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,074 [main] TRACE 11 [RuleTerminalNode(11): rule=Discount awarded notification] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,074 [main] TRACE Segment 1
2014-10-02 02:35:09,074 [main] TRACE 11 [RuleTerminalNode(11): rule=Discount awarded notification] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,074 [main] TRACE Fire "Discount awarded notification"
[[ Discount awarded notification active=false ] [ [fact 0:7:874153561:874153561:7:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Discount@341a8659]
[fact 0:1:1455177644:1455177644:1:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Customer@56bc3fac] ] ]
Customer mark now has a discount of 10
2014-10-02 02:35:09,074 [main] TRACE Removing RuleAgendaItem [Activation rule=Discount awarded notification, act#=7, salience=0, tuple=null]
2014-10-02 02:35:09,074 [main] TRACE Queue Removed 1 [Activation rule=Discount awarded notification, act#=7, salience=0, tuple=null]
2014-10-02 02:35:09,074 [main] TRACE Delete [fact 0:5:8996952:8996952:5:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Purchase@894858]
2014-10-02 02:35:09,074 [main] TRACE LinkSegment smask=2 rmask=3 name=Purchase notification
2014-10-02 02:35:09,074 [main] TRACE LinkRule name=Purchase notification
2014-10-02 02:35:09,074 [main] TRACE Queue RuleAgendaItem [Activation rule=Purchase notification, act#=2, salience=10, tuple=null]
2014-10-02 02:35:09,074 [main] TRACE Queue Added 1 [Activation rule=Purchase notification, act#=2, salience=10, tuple=null]
2014-10-02 02:35:09,075 [main] TRACE LinkSegment smask=2 rmask=3 name=Apply 10% discount if total purchases is over 100
2014-10-02 02:35:09,075 [main] TRACE LinkRule name=Apply 10% discount if total purchases is over 100
2014-10-02 02:35:09,075 [main] TRACE Queue RuleAgendaItem [Activation rule=Apply 10% discount if total purchases is over 100, act#=1, salience=0, tuple=null]
2014-10-02 02:35:09,075 [main] TRACE Queue Added 2 [Activation rule=Apply 10% discount if total purchases is over 100, act#=1, salience=0, tuple=null]
2014-10-02 02:35:09,075 [main] TRACE Added Apply 10% discount if total purchases is over 100 to eager evaluation list.
Customer mark has returned the hat
2014-10-02 02:35:09,075 [main] TRACE Rule[name=Apply 10% discount if total purchases is over 100] segments=2 TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,075 [main] TRACE 12 [ AccumulateNode(12) ] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,075 [main] TRACE Segment 1
2014-10-02 02:35:09,075 [main] TRACE 12 [ AccumulateNode(12) ] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,075 [main] TRACE rightTuples TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,075 [main] TRACE 13 [RuleTerminalNode(13): rule=Apply 10% discount if total purchases is over 100] TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,075 [main] TRACE Segment 1
2014-10-02 02:35:09,075 [main] TRACE 13 [RuleTerminalNode(13): rule=Apply 10% discount if total purchases is over 100] TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,075 [main] TRACE Delete [fact 0:7:874153561:874153561:7:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Discount@341a8659]
2014-10-02 02:35:09,075 [main] TRACE LinkSegment smask=2 rmask=3 name=Discount removed notification
2014-10-02 02:35:09,075 [main] TRACE LinkRule name=Discount removed notification
2014-10-02 02:35:09,075 [main] TRACE Queue RuleAgendaItem [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,075 [main] TRACE Queue Added 3 [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,075 [main] TRACE UnlinkNode notify=true nmask=1 smask=0 spos=1 rules=[RuleMem Discount awarded notification]
2014-10-02 02:35:09,076 [main] TRACE UnlinkSegment smask=2 rmask=1 name=[RuleMem Discount awarded notification]
2014-10-02 02:35:09,076 [main] TRACE UnlinkRule name=Discount awarded notification
2014-10-02 02:35:09,076 [main] TRACE Queue RuleAgendaItem [Activation rule=Discount awarded notification, act#=7, salience=0, tuple=null]
2014-10-02 02:35:09,076 [main] TRACE Queue Added 2 [Activation rule=Discount awarded notification, act#=7, salience=0, tuple=null]
2014-10-02 02:35:09,076 [main] TRACE Rule[name=Purchase notification] segments=2 TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE 14 [JoinNode(5) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Purchase]] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE Segment 1
2014-10-02 02:35:09,076 [main] TRACE 14 [JoinNode(5) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Purchase]] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE rightTuples TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE 15 [RuleTerminalNode(6): rule=Purchase notification] TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE Segment 1
2014-10-02 02:35:09,076 [main] TRACE 15 [RuleTerminalNode(6): rule=Purchase notification] TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE Removing RuleAgendaItem [Activation rule=Purchase notification, act#=2, salience=10, tuple=null]
2014-10-02 02:35:09,076 [main] TRACE Queue Removed 1 [Activation rule=Purchase notification, act#=2, salience=10, tuple=null]
2014-10-02 02:35:09,076 [main] TRACE Rule[name=Discount removed notification] segments=2 TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE 16 [NotNode(8) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE Segment 1
2014-10-02 02:35:09,076 [main] TRACE 16 [NotNode(8) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE rightTuples TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE 17 [RuleTerminalNode(9): rule=Discount removed notification] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE Segment 1
2014-10-02 02:35:09,077 [main] TRACE 17 [RuleTerminalNode(9): rule=Discount removed notification] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE Fire "Discount removed notification"
[[ Discount removed notification active=false ] [ null
[fact 0:1:1455177644:1455177644:1:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Customer@56bc3fac] ] ]
Customer mark now has a discount of 0
2014-10-02 02:35:09,077 [main] TRACE Removing RuleAgendaItem [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,077 [main] TRACE Queue Removed 1 [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,077 [main] TRACE Rule[name=Discount awarded notification] segments=2 TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE 18 [JoinNode(10) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE Segment 1
2014-10-02 02:35:09,077 [main] TRACE 18 [JoinNode(10) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE rightTuples TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE 19 [RuleTerminalNode(11): rule=Discount awarded notification] TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE Segment 1
2014-10-02 02:35:09,077 [main] TRACE 19 [RuleTerminalNode(11): rule=Discount awarded notification] TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE Removing RuleAgendaItem [Activation rule=Discount awarded notification, act#=7, salience=0, tuple=null]
2014-10-02 02:35:09,077 [main] TRACE Queue Removed 1 [Activation rule=Discount awarded notification, act#=7, salience=0, tuple=null]
2014-10-02 02:35:09,077 [main] TRACE Removing RuleAgendaItem [Activation rule=Apply 10% discount if total purchases is over 100, act#=1, salience=0, tuple=null]
2014-10-02 02:35:09,077 [main] TRACE Queue Removed 1 [Activation rule=Apply 10% discount if total purchases is over 100, act#=1, salience=0, tuple=null]

by Mark Proctor (noreply@blogger.com) at October 02, 2014 01:56 AM

October 01, 2014

Keith Swenson: Process Mining MOOC on Coursera

Whether you call it Process Mining or Automated Process Discovery, nobody can deny that this field, which combines big data analytics with business processes, is at the center of an important transformation in the workplace. Process mining is useful to kickstart the implementation of predefined BPM diagrams, and it is also useful in unpredictable case management to see what has been done and whether it is compliant with all the rules. What would you give to attend a complete, college-level course on process mining? What if it was free?

What if it was free, and it was being taught by Wil van der Aalst, arguably the foremost expert on workflow and process mining? What if it started next month, and you could attend from anyplace in the world?  Would you sign up?  I would.  And I have.

Prof. van der Aalst from Eindhoven University of Technology is teaching the course "Process Mining: Data science in Action" on Coursera starting Nov 12. It is available to everyone everywhere. It will last 6 weeks and require about 4-6 hours of work per week. It is not just an important part of data science, it is data science in action:

Data science is the profession of the future, because organizations that are unable to use (big) data in a smart way will not survive. It is not sufficient to focus on data storage and data analysis. The data scientist also needs to relate data to process analysis. Process mining bridges the gap between traditional model-based process analysis (e.g., simulation and other business process management techniques) and data-centric analysis techniques such as machine learning and data mining. Process mining seeks the confrontation between event data (i.e., observed behavior) and process models (hand-made or discovered automatically). This technology has become available only recently, but it can be applied to any type of operational processes (organizations and systems). Example applications include: analyzing treatment processes in hospitals, improving customer service processes in a multinational, understanding the browsing behavior of customers using a booking site, analyzing failures of a baggage handling system, and improving the user interface of an X-ray machine. All of these applications have in common that dynamic behavior needs to be related to process models. Hence, we refer to this as “data science in action”.
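
As a toy illustration of what confronting event data with process models can mean in practice (my own sketch, not course material): discovery algorithms typically start from an event log grouped by case and extract which activity directly follows which. The case IDs and activity names below are made up.

import java.util.*;

public class DirectlyFollows {
    public static void main(String[] args) {
        // a tiny event log: each case is an ordered trace of activities
        Map<String, List<String>> log = new LinkedHashMap<>();
        log.put("case1", Arrays.asList("register", "check", "approve", "archive"));
        log.put("case2", Arrays.asList("register", "check", "reject", "archive"));

        // extract the directly-follows relation, the raw material from
        // which discovery algorithms (e.g. the alpha algorithm) build models
        Set<String> follows = new TreeSet<>();
        for (List<String> trace : log.values()) {
            for (int i = 0; i < trace.size() - 1; i++) {
                follows.add(trace.get(i) + " -> " + trace.get(i + 1));
            }
        }
        for (String pair : follows) {
            System.out.println(pair);
        }
    }
}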

Many of you have seen one of my many talks on process mining, so you know that I believe this is an important, emerging field, one which Fujitsu has been a part of. This course will be a chance to get below the surface of what we normally present in a 45-minute webinar, and to learn more than you can get from reading the Process Mining Manifesto. It is the first major MOOC on process mining. There are two reasons why this is notable:

  • First of all, BPM is becoming more evidence-based and the MOOC “Process Mining: Data science in Action” provides a concrete starting point for more fact-driven process management. It fits nicely with the development of data science as a new profession. There is a need for “process scientists”, now and in the future.
  • Second, it is interesting to reflect on MOOCs as a new medium to train BPM professionals and to make end users aware of new BPM technologies. Such online courses allow much more specialized BPM courses to be offered to thousands of participants.

I for one am looking forward to it. Here is a short video to explain:


by kswenson at October 01, 2014 10:48 AM

September 30, 2014

Keith Swenson: BPM Poster

Here it is, the poster on the definition of BPM, with all the terms defined and explained!

This is based on the effort to gain consensus around a single common definition for BPM. The definition by itself cannot convey the meaning if the terms are not explained. You have seen this before in my post "One Common Definition for BPM." What we have done is to put all the information together into a single poster.

Click here to access the PDF of the poster

It looks best printed 36 inches by 24 inches (90cm by 60cm). Most of us don't have printers that big. You can print it in Acrobat across multiple pieces of paper and tape them together, but that can be a lot of work. I am looking for a way to let you simply order the poster and have it sent to you in a tube. Once I have found one, I will update the post here.

Or come by the Fujitsu booth and ask for one.


by kswenson at September 30, 2014 11:45 AM

September 29, 2014

Keith Swenson: 3 Innovative Approaches to Process Modeling

In a post titled "Business Etiquette Modeling" I made a plea for modeling business processes such that they naturally deform themselves as needed to accommodate changes. If we model a fixed process diagram, it is too fragile and can be costly to maintain manually. While I was at the EDOC conference and the BPM conference, I saw three papers that introduce innovations. They are not completely defined solutions, but they represent solid research on steps in the right direction. Here is a quick summary of each.

(1) Implementation Framework for Production Case Management: Modeling and Execution

(Andreas Meyer, Nico Herzberg, Mathias Weske of the Hasso Plattner Institute and Frank Puhlmann of Bosch, EDOC 2014 pages 190-199)

This approach is aimed specifically at production case management, which means it supports a knowledge worker who has to decide in real time what to do, although the kinds of things such a worker might do are well known in advance. The example used is that of a travel agent: we can identify all the various things that a travel agent might be able to do, but they might combine these actions in an unlimited variety of ways. If we draw a fixed diagram, we end up restricting the travel agent unnecessarily. Think about it: a travel agent might book one hotel one day, book flights the next, book another hotel, then change the flights, then cancel one of the hotel bookings — it is simply not possible to say that there is a single, simple process that a travel agent will always follow.

Instead of drawing a single diagram, the approach suggested is to draw separate little process snippets of all the things that a travel agent might do. Here is the interesting part: the same activity might appear in multiple snippets. At run time the system combines the snippets dynamically based on conditions. Each task in each snippet is linked to the things that are required before that task would be triggered, so based on the current case instance information, a particular task might or might not appear as needed. Dynamic instance data determines how the current process is constructed. Activities have required inputs and produce outputs, which is part of the conditions on whether they are included in a particular instance.
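
To make the mechanism tangible, here is a toy sketch of the enablement idea; it is my reading of the approach, not the authors' code. Two task names are borrowed from the paper's travel-agent example, while the data object names are invented.

import java.util.*;

public class SnippetAssembly {
    static class Task {
        final String name;
        final Set<String> requires;
        final Set<String> produces;
        Task(String name, Set<String> requires, Set<String> produces) {
            this.name = name;
            this.requires = requires;
            this.produces = produces;
        }
    }

    static Set<String> set(String... items) {
        return new HashSet<>(Arrays.asList(items));
    }

    public static void main(String[] args) {
        List<Task> snippets = Arrays.asList(
                new Task("Create Offer", set("request"), set("offer")),
                new Task("Validate Offer", set("offer"), set("validated offer")),
                new Task("Book Hotel", set("validated offer"), set("booking")));

        // the case currently holds only the initial request
        Set<String> caseData = set("request");

        // a task is offered only when all of its required inputs exist in
        // the case; completing it adds its outputs and the check is re-run,
        // so the process assembles itself at run time instead of being
        // drawn up front as one fixed diagram
        for (Task t : snippets) {
            if (caseData.containsAll(t.requires)) {
                System.out.println("Enabled: " + t.name);
            }
        }
    }
}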

Above are some examples of the process snippets that might be used for a travel agent. Note that "Create Offer" and "Validate Offer" appear in two different snippets with slightly different conditions. The ultimate process would be assembled at run time in a way that depends upon the details of the case. I would have to refer you to the paper for the full details on how this works, but I was impressed by Andreas' presentation. I am not sure this is exactly the right approach, but I am sure that we need this kind of research in this direction.

(2) Informal Process Essentials

(C. Timurhan Sungur, Tobias Binz, Uwe Breitenbücher, Frank Leymann, University of Stuttgart, EDOC 2014 pages 200-209)

They describe the need to support "informal processes", which is not exactly what I am looking for. Informal means "having a relaxed, friendly, or unofficial style, manner, or nature; a style of writing or conversational speech characterized by simple grammatical structures." What I am looking for are processes that are well crafted, official, meaningful, accurate, and at the same time responsive to external changes. Formal/informal is not the same relationship as fixed/adaptive. However, they do cover some interesting ideas that are relevant. They specify four properties:

  1. Implicit Business Logic – the logic is not explicit until run time
  2. Different Relationships Among Resources – interrelated sets of individuals are used to accomplish more complex goals
  3. Resource Participation in Multiple Processes – people are not dedicated to a single project.
  4. Changing Resources – dynamic teams assembled as needed.

These properties look a lot like innovative knowledge worker patterns, so this research is likely to be relevant. They find the following requirements must be met to address the need:

  1. Enactable Informal Process Representation
  2. Resource Relationships Definition
  3. Resource Visibility Definition
  4. Support for Dynamically Changing Resources

It seems that these approaches need to focus more on resources, roles, and relationships, and less on the specific sequences of activities.  Then from that, one should be able to generate the actual process needed for a particular instance.

The tricky part is how to find an expert who can model this. One of the reasons for drawing a BP diagram is that drawing a diagram simplifies the job of creating the process automation. Getting to the underlying relationships might be more accurate and adaptive, but it is not simpler.

(3) oBPM – An Opportunistic Approach to Business Process Modeling and Execution

(David Grünert, Elke Brucker-Kley and Thomas Keller, Institute for Business Information Management, Winterthur, Switzerland, BPMS2 Workshop at BPM 2014)

This paper comes the closest to Business Etiquette Modeling, because it is specifically about the problem of building a business process as a strict sequence of user tasks. This top-down approach tends to be over-constrained. Since this is the BPM and Social Software Workshop, the paper tries to find a way to be more connected to social technology and to take a more bottom-up approach. They call it "opportunistic" BPM because the idea is that the actual process flow can be generated after the details of the situation are known. Such a process can take advantage of the opportunities automatically, without needing a process designer to tweak the process every time.

The research has centered on modeling roles, the activities that those roles typically do, and the artifacts that are either generated or consumed. They leverage an extension of the UML use case modeling notation, and it might look a little like this:

The artifacts (documents, etc.) have a state themselves. When a particular document enters a particular state, it enables a particular activity for a particular role. To me this shows a lot of promise. Upon examination, there are weaknesses to this approach: modeling the state diagram for a document would seem to be a challenge, because the states that a document can be in are intricately tied to the process you want to perform. It might be that our preconception of the process overly restricts the state chart, which in turn limits what processes could be generated. Also, there is a data model that Grünert admitted would have to be modeled by a data model expert, but perhaps there are a limited number of data models, and maybe they don't change that often. Somehow, all of this would have to be discoverable automatically from the working of the knowledge workers in order to eliminate the huge up-front cost of having to model all this explicitly. Again, I refer you to the actual paper for the details.
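
A minimal sketch of that sentence in code, assuming invented document states, roles and activities; this is my illustration of the idea, not the oBPM implementation.

import java.util.*;

public class ArtifactStateEnablement {
    public static void main(String[] args) {
        // (artifact state, role) -> activity that becomes available
        Map<String, String> enablement = new HashMap<>();
        enablement.put("offer:drafted|reviewer", "Review offer");
        enablement.put("offer:reviewed|manager", "Approve offer");

        // the offer document has just entered the "drafted" state
        String key = "offer:drafted" + "|" + "reviewer";
        String activity = enablement.get(key);
        if (activity != null) {
            // the role now sees this activity in its worklist
            System.out.println("reviewer may now: " + activity);
        }
    }
}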

Net-Net

What this shows is that there is research being done to take process to the next level. Perhaps a combination of these approaches might leave us with the ultimate solution: a system that can generate process maps on demand that are appropriate for a specific situation. This would be exactly like your GPS unit, which can generate a route from point A to point B given the underlying map of what is possible. What we are looking for is a way to map what the underlying role interactions could possibly be, along with a set of rules about what might be appropriate when. Just as in a GPS, when a new highway is added, you might add a new rule, and all the existing business processes would automatically change if that new rule applies to their case. We are not there yet, but this research shows promise.


by kswenson at September 29, 2014 04:48 PM

September 23, 2014

Thomas Allweyer: From the pyramid to the house – new edition of the Praxishandbuch BPMN

The fourth edition of the widely used Praxishandbuch BPMN 2.0 by Jakob Freund and Bernd Rücker has recently been published. The essential difference from the third edition: the camunda method framework, previously depicted as a pyramid, has been changed and is now visualized as a house. The reason for the change was a number of misunderstandings that occasionally arose in connection with the pyramid. In it, the level of the technical, i.e. executable, process model was placed below the level of the operational process model. This led many readers to believe that the technical level must necessarily be a refinement of the operational level. With that came the expectation that the technical process models always had to be created after the operational models, and that responsibility for these levels was neatly divided between the business department and IT.

These views, however, do not match the authors' intentions. In the new depiction as a house, the roof still contains the level of the strategic process model. The house itself, however, consists of only one floor, the operational process model. It is divided into a "human process flow" and a "technical process flow", both of which sit on the same level. The human process flow is carried out by the process participants. The technical process flow is executed by a software system, typically a process engine. In most cases there are close interactions between the human and the technical flow. In agile process development, both flows are developed together, with business and IT experts working closely side by side.

Otherwise only minor changes have been made to the book. Since the XML-based BPEL standard for executable processes has lost much of its significance, it is now covered only briefly. Finally, a short overview of the open source platform "camunda BPM", developed under the authors' leadership, has been added.


Freund, J.; Rücker, B.:
Praxishandbuch BPMN 2.0. 4th edition.
Hanser 2014
The book at amazon.

by Thomas Allweyer at September 23, 2014 06:36 AM

September 22, 2014

BPM-Guide.de: Thanks for an awesome BPMCon 2014

Awesome location, awesome talks and most of all: awesome attendees. This year's BPMCon was indeed the "schönste BPM-Konferenz" (the nicest BPM conference) I've ever seen. Thank you so much to all who made it happen, including Guido Fischermanns for the moderation, Sandy Kemsley for her keynote about the Zero-Code BPM Myth, and all those BPM practitioners who presented their lessons [...]

by Jakob Freund at September 22, 2014 05:46 PM

September 19, 2014

Drools & JBPM: The Birth of Drools Pojo Rules

A few weeks back I blogged about our plans for a clean low-level executable model; you can read about that here.

We now have our first rules working, and you can find the project with unit tests here. None of this requires drools-compiler any more, which allows people to write DSLs without ever going through DRL and heavy compilation stages.

It's far off our eventual plans for the executable model, but it's a good start that fits our existing problem domain. Here is a code snippet from the example in the project above; it uses the classic Fire Alarm example from the documentation.

We plan to build Scala and Clojure DSLs in the near future too, using the same technique as below.

public static class WhenThereIsAFireTurnOnTheSprinkler {
    // Pattern variables: match any Fire and any Sprinkler in working memory
    Variable<Fire> fire = any(Fire.class);
    Variable<Sprinkler> sprinkler = any(Sprinkler.class);

    // Conditions: the sprinkler is off and sits in the same room as the fire
    Object when = when(
        input(fire),
        input(sprinkler),
        expr(sprinkler, s -> !s.isOn()),
        expr(sprinkler, fire, (s, f) -> s.getRoom().equals(f.getRoom()))
    );

    // Consequence: turn the sprinkler on and tell the engine the fact changed
    public void then(Drools drools, Sprinkler sprinkler) {
        System.out.println("Turn on the sprinkler for room " + sprinkler.getRoom().getName());
        sprinkler.setOn(true);
        drools.update(sprinkler);
    }
}

public static class WhenTheFireIsGoneTurnOffTheSprinkler {
    Variable<Fire> fire = any(Fire.class);
    Variable<Sprinkler> sprinkler = any(Sprinkler.class);

    // Conditions: the sprinkler is on, and there is no fire left in its room
    Object when = when(
        input(sprinkler),
        expr(sprinkler, Sprinkler::isOn),
        input(fire),
        not(fire, sprinkler, (f, s) -> f.getRoom().equals(s.getRoom()))
    );

    // Consequence: turn the sprinkler off and tell the engine the fact changed
    public void then(Drools drools, Sprinkler sprinkler) {
        System.out.println("Turn off the sprinkler for room " + sprinkler.getRoom().getName());
        sprinkler.setOn(false);
        drools.update(sprinkler);
    }
}

by Mark Proctor (noreply@blogger.com) at September 19, 2014 06:03 PM

September 18, 2014

Sandy Kemsley: What’s Next In camunda – Wrapping Up Community Day

We finished the camunda community day with an update from camunda on features coming in 7.2 next month, and the future roadmap. camunda releases the community edition in advance of the commercial...

[Content summary only, click through for full article and links]

by sandy at September 18, 2014 04:12 PM

Sandy Kemsley: camunda Community Day technical presentations

The second customer speaker at camunda’s community day was Peter Hachenberger from 1&1 Internet, describing how they use Signavio and camunda BPM to create their Process Platform, which is...

[Content summary only, click through for full article and links]

by sandy at September 18, 2014 02:59 PM

Sandy Kemsley: Australia Post at camunda Community Day

I am giving the keynote at camunda’s BPMcon conference tomorrow, and since I arrived in Berlin a couple of days early, camunda invited me to attend their community day today, which is the open...

[Content summary only, click through for full article and links]

by sandy at September 18, 2014 11:53 AM

September 17, 2014

Drools & JBPM: Decision Camp is just 1 Month away (SJC 13 Oct)

Decision Camp, San Jose (CA), October 2014, is only one month away, and is free for all attendees who register. Follow the link here for more details on the agenda and registration.

by Mark Proctor (noreply@blogger.com) at September 17, 2014 02:50 AM

September 16, 2014

Drools & JBPM: Workbench Multi Module Project Structure Support

The upcoming Drools and jBPM community 6.2 release will add support for Maven multi-module projects. Walter has prepared a video showing the work in progress. While not shown in this video, multi-module projects will have managed support to assist with automating version updates and releases, and will fully support multiple version streams across Git branches.

There is no audio, but it's fairly self-explanatory. The video starts by creating a single project and then shows how the wizard can convert it to a multi-module project. It then proceeds to add and edit modules, also demonstrating how the parent pom information is configured. The video also shows how this works across different repositories without a problem, each with its own project structure page. Repositories can also be unmanaged, which allows for user-created single projects, much as we have now with 6.0 and 6.1, meaning previous repositories will continue to work as they did before.

Don't forget to switch the video to 720p and watch it full screen. YouTube does not always select that by default, and the video is fuzzy without it.




by Mark Proctor (noreply@blogger.com) at September 16, 2014 10:25 PM

September 15, 2014

Sandy Kemsley: Survey on Mobile BPM and DM

James Taylor of Decision Management Solutions and I are doing some research into the use and integration of BPM (business process management) and DM (decision management) technology into mobile...

[Content summary only, click through for full article and links]

by sandy at September 15, 2014 04:52 PM

Drools & JBPM: Setting up the Kie Server (6.2.Beta version)

Roger Parkinson wrote a nice blog on how to set up the Kie Server 6.2.Beta version to play with.

This is still under development (hence the Beta), and we are working on improving both setup and features before the final release, but by following his blog steps you can easily set up your environment to play with it.

One clarification: while the workbench can connect to and manage/provision multiple remote kie-servers, they are designed to work independently, and one can use the REST services exclusively to manage/provision a kie-server. In that case it is not necessary to use the workbench.

Here are a few test cases showing how to use the client API (a helper wrapper around the REST calls) in case you want to try:

https://github.com/droolsjbpm/droolsjbpm-integration/blob/master/kie-server/kie-server-services/src/test/java/org/kie/server/integrationtests/KieServerContainerCRUDIntegrationTest.java

https://github.com/droolsjbpm/droolsjbpm-integration/blob/master/kie-server/kie-server-services/src/test/java/org/kie/server/integrationtests/KieServerIntegrationTest.java
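
For a quick feel for that client API before diving into the tests, here is a minimal sketch of provisioning a container over REST. Treat it as an assumption-laden illustration: the class and method names follow the linked integration tests as far as I can tell, the URL, credentials, and GAV are placeholders, and the 6.2.Beta API may still change before Final.

import org.kie.server.api.model.KieContainerResource;
import org.kie.server.api.model.ReleaseId;
import org.kie.server.api.model.ServiceResponse;
import org.kie.server.client.KieServicesClient;
import org.kie.server.client.KieServicesConfiguration;
import org.kie.server.client.KieServicesFactory;

public class KieServerClientSketch {
    public static void main(String[] args) {
        // Point the client at a running kie-server; URL and credentials are placeholders
        KieServicesConfiguration config = KieServicesFactory.newRestConfiguration(
                "http://localhost:8080/kie-server/services/rest/server",
                "kieserver", "kieserver1!");
        KieServicesClient client = KieServicesFactory.newKieServicesClient(config);

        // Deploy a kjar (the GAV is a placeholder) into a new container...
        ReleaseId releaseId = new ReleaseId("org.example", "sample-kjar", "1.0.0");
        ServiceResponse<KieContainerResource> reply = client.createContainer(
                "sample-container", new KieContainerResource("sample-container", releaseId));
        System.out.println(reply.getMsg());

        // ...and remove it again when done
        client.disposeContainer("sample-container");
    }
}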

Thanks Roger!

by Edson Tirelli (noreply@blogger.com) at September 15, 2014 03:59 PM

Thomas Allweyer: Should the "classic" BPMS concept still be taught?

So far I have received very positive reactions to my new BPMS book. Among other things, however, the question came up whether the classic, process-model-driven BPMS concept that I explain in the book with many example processes is still appropriate at all. Given the ever-growing share of knowledge workers, shouldn't one rather look at newer and more flexible approaches instead, such as Adaptive Case Management (ACM)?

Certainly, the classic BPMS philosophy must be critically examined with regard to its suitability for different areas of use. For most weakly structured and knowledge-intensive processes it is indeed not sensible, and usually not even possible, to define the complete flow in advance in the form of a BPMN model. Adaptive Case Management is better suited for such processes. But that does not mean that the traditional BPM approach is completely obsolete. The book is meant to provide a solid introduction to the field. There are a number of reasons why I restricted it to process-model-based BPMS:

  • The vast majority of all BPMS available on the market today use the process-model-based approach. There certainly are pure ACM systems, but for now at least they are in the minority. Case management is frequently offered as additional functionality on classic BPM platforms.
  • The classic BPM concept is quite well developed in theory and practice. The corresponding systems have reached a high level of maturity. It is thus an established approach that forms a foundation of this field.
  • ACM, by contrast, is a rather new approach that is still very much under development. It is therefore difficult to identify fundamentals that will not already be outdated in a few years.
  • Knowing the classic BPM fundamentals helps in understanding ACM and other new approaches. Concepts such as process definitions and process instances reappear in ACM in the form of case templates and cases. Likewise, one should understand what the correlation of messages is about; whether a message is assigned to a process instance or to a case makes little difference. Some advantages of the ACM approach only really become apparent when compared with the classic concept, where, for example, workers cannot simply add entirely new working steps while a process is being carried out.
  • Case handling, too, often contains parts that run as structured processes. The classic BPM concept will therefore probably not be replaced completely. Instead, ACM and BPMS functionality will complement each other in useful ways.
  • The number of structured and standardized processes is unlikely to decrease in the future. For one thing, there are more and more fully automated processes, which are necessarily highly structured – at least until highly intelligent, autonomous software agents become widely established. For another, more and more processes must be scalable in order to be handled efficiently over the internet, and for that they must be highly structured and standardized. When somebody orders something from a large online retailer, nobody first ponders individually how this customer's wishes might be fulfilled; a completely standardized process runs instead. It may be that processes with strong human participation will increasingly be supported by ACM, while classic process engines will more likely be found controlling fully automated processes. But that does not reduce the number of places where they can be used.

So anyone who starts out with the fundamentals of classic BPMS is definitely on the right track. And the best way to understand them is to try them out yourself. That is why there are numerous example processes accompanying the book, which can be downloaded and executed with the open source system "Bonita".

by Thomas Allweyer at September 15, 2014 10:54 AM

September 13, 2014

Keith Swenson: BPM2014 Keynote: Keith Swenson

I was honored to give the keynote on the second day of the BPM2014 conference, and promised to answer questions, so here are the slides and summary.

Slides are at slideshare:

(Slideshare no longer has the ability to put audio together with the slides, so I apologize that the slides alone probably don’t make a lot of sense.  I hope to get invited to present the same talk at another event where they record video.)

Twitter Responses

[embedded tweets]

Nice of you to notice!  The talk went on schedule and as far as I know there was nothing that I forgot to say.

[embedded tweet]

excellent!

[embedded tweet]

It is a little of both.  There is a tendency for managers of all types, especially less experienced managers, to want to over-constrain the processes.  At the same time, programmers tend to implement restrictions very literally and without any wiggle room.  I don’t think we can let either one off the hook.

[embedded tweets]

This was one of my key points: if our goal is to make ‘business’ successful, maybe there is more to it than just increasing raw efficiency in terms of reducing expenses.  Maybe an excellent business needs to keep its knowledge workers experienced, and possibly our IT systems should be helping to exercise the knowledge workers.

[embedded tweet]

This tweet got the most favorites and retweets.  I had not realized that this was not clear before, so let me state it here.  I included in the presentation the definition of BPM that was gathered earlier this year.  I mentioned that this was not exactly the definition that I had formerly held, but the discussion included a broad spectrum of BPM experts, and so I am willing to go along with it.

Under this new definition, ANYTHING and EVERYTHING that makes your business processes better is included.  Some of you thought this all the time.  Previously, I had subscribed to a different (and wrong) definition of BPM, which was a bit more restrictive, and that is why in the past I have stressed the distinction between BPM and ACM.  However, this new, agreed-upon definition allows a BPM method to have or not have models, to have or not have execution, etc.  So BPM clearly includes ACM, because ACM also is a way of supporting business and processes.  This is the definition that so many have now pledged to support, and I can support it as well.

I am still teaching myself to say “Workflow-style BPM” or “traditional-BPM” instead of simply ‘BPM’, and I have not completely mastered that change.

[embedded tweets]

There is no doubt: knowledge work is more satisfying to do.  I spoke to some attendees afterwards who felt I was being ‘unfair’ to the routine workers: they are doing their jobs too, why pick on them just because their job is routine?  I am not sure how to respond to that.  I think most people find routine work dull and boring.  Sure, it is a job, but most people would like to be doing more interesting things, and that generally is knowledge work that depends upon expertise you acquire.  In general, automating routine work will allow a typical business to employ more knowledge workers, particularly if the competitors are doing so.  It is somewhat unlikely that all routine workers will switch and become knowledge workers, but some will, and for the most part the transition will occur by hiring exclusively knowledge workers and losing routine workers by attrition.



by kswenson at September 13, 2014 08:00 AM

September 11, 2014

Tom Baeyens: 5 Types Of Cloud Workflow

Last Wednesday, Box Workflow was announced. It was an expected move for them to go higher up the stack as the cost of storage “races very quickly toward zero”.  It made me realize there are actually five different types of workflow solutions available on the cloud.

Box, Salesforce, Netsuite and many others have bolted workflow on top of their products.  In this case workflow is offered as a feature on a product with a different focus.  The advantage is that the workflow is well integrated with the product and available when you already have the product.  The downside can be that its scope is mostly limited to the product.

Another type is BPM as a service (aka cloud-enabled BPM).  BPM as a service is an online service for which you can register an account and use the product online without setting up or maintaining any IT infrastructure for it.  The cloud poses a different set of challenges and opportunities for BPM.  We at Effektif provide a product that is independent, focused on BPM, and born and raised in the cloud.  In our case, we could say that our on-premise version is actually the afterthought.  Usually it’s the other way round: most cloud-enabled BPM products were created for on-premise first and have since been tweaked to run on the cloud.  My opinion ‘might’ be a bit biased, but I believe that today’s hybrid enterprise environments are very different from the on-premise-only days.  Ensuring that a BPM solution integrates seamlessly with other cloud services is non-trivial, especially when it needs to integrate just as well with existing on-premise products.

BPM platform as a service (bpmPaaS) is an extension of virtualization.  These are prepackaged images of BPM solutions that can be deployed at a hosting provider: you rent a virtual machine with a hosting provider, and you then have a ready-to-go image that you can deploy on that machine to run your BPM engine.  As an example, you can have a look at Red Hat’s bpmPaaS cartridge.

Amazon Simple Workflow Service is in many ways unique and a category of its own, in my opinion.  It is a developer service that in essence stores the process instance data and takes care of the distributed locking of activity instances.  All the rest is up to the user to code.  The business logic in the activities has to be coded.  But what makes Amazon’s workflow really unique is that you can (well.. have to) code the logic between the activities yourself as well.  There's no diagram involved.  So when an activity is completed, your code has to calculate which activities have to be done next.  I think it provides a lot of freedom, but it’s also courageous of them to fight the uphill battle against users’ expectations of a visual workflow diagram builder.

Then there are IFTTT and Zapier.  These are, in my opinion, iconic online services because they define a new product category.  At the core, they provide an integration service.  Integration has traditionally been one of the most low-level technical aspects of software automation.  Yet they managed to provide it as an online service, enabling everyone to accomplish significant integrations without IT or developer involvement.  I refer to those services a lot because they have transformed something that was complex into something simple.  That, I believe, is a significant accomplishment.  We at Effektif are on a similar mission.  BPM has been quite technical and complex.  Our mission is also to remove the need for technical expertise so that you can build your own processes.

by Tom Baeyens (noreply@blogger.com) at September 11, 2014 09:57 AM

September 10, 2014

BPinPM.net: BPM meets Digital Age – Win the new book „Management by Internet“ by Willms Buhse

Together with eight fearless BPM experts from four different organizations, we went on an exciting journey to bring together the Digital Age and BPM. Supported by Dr. Willms Buhse and his experts from doubleYUU, we developed a number of possibilities for combining web 2.0 and social media features as well as digital leadership aspects with business process management.

Today, we are going to introduce the results of this workshop series in more detail – and please don’t miss the chance to win a copy of the inspiring book “Management by Internet” by Willms Buhse at the end of this article. The book covers many of the aspects we combined with BPM and provides practical examples of how to benefit from the Digital Age as a manager.

The overall goal of the workshop series was to increase the acceptance and benefit of BPM through the implementation of Digital Age elements. In the first workshop session, we developed more than 70 ideas, which we clustered into six areas of interest for further evaluation: ‘Participation’, ‘Training and Communication’, ‘Feedback and Exchange’, ‘Search Engine’, ‘Process Transparency’, and ‘Mobile Access’.

Based on an evaluation of these six areas by BPM experts from the participating organizations, we developed prototypes for the eleven highest-ranked ideas during the second workshop, in an overnight delivery session. Afterwards, these prototypes went through a second evaluation cycle by employees of the participating organizations.

The biggest winners of the evaluation by the employees were the ideas related to the ‘Search Engine’. Obviously, employees expect the search engine of the BPM system to be as fast and precise as Google. But, as we learned from Willms and his team, it is not really fair to compare Google with the search engine of a BPM system: Google processes far more search requests that can be analyzed, and it invests an immense amount of money in optimizing its algorithms. Still, employees expect a Google-like search. Thus, we discussed ideas like tagging, result ranking, and previews to push the BPM search engine towards these expectations.

The "Like-Button" failed...Biggest looser of the evaluation was the “Like-Button” which was represented by a “heart” in our prototypes. By having a closer look onto the results, we realized that it probably doesn’t make sense to “like” a process. Result of our discussion was to redesign the button to a “Helpful”-Button which can be clicked by users to indicate that the process description was helpful for them.

Now we are going to wrap up all the learnings for a more detailed presentation of the results at our BPinPM.net Conference in November, and to prepare the prototypes for further evaluation. In addition, we will present detailed insights into the current implementation status of Digital Age BPM at one of the participating organizations at the conference. So if you are interested in more details, please meet us at the conference. :-)

To provide even more insight into the Digital Age elements we discussed during the workshops, we are raffling a copy of the new “Management by Internet” book by Willms Buhse. So don’t wait and enter the lottery here…

Best regards,
Mirko

by Mirko Kloppenburg at September 10, 2014 12:48 PM

September 09, 2014

Keith Swenson: Business Etiquette Modeling: a new paradigm for process

The AdaptiveCM 2014 workshop this past Monday provided a very interesting discussion of the state of the art in adaptive case management and other non-workflow-oriented ways of supporting knowledge work. While there I presented, and we discussed, an intriguing new way to think about processes which I call “Business Etiquette Modelling”.

Processes Emerge from Individual Interaction

The key to this approach is to treat a business process as an epiphenomenon: a secondary effect that results from business interactions, but is not primary to them.  The primary thing that is happening is interactions between people.  If those interactions are tuned properly, business results.

I have found the following video to be helpful in giving a concrete idea of emergent behavior that we can discuss.  Watch the video, particularly between 0:30 and 1:30.  The behavior of the flock of birds, called murmuration, is the way that groups of birds appear to bunch, expand, and swirl.  The birds themselves have no idea they are doing this.  Take a look (click this link to access the video – strange problem with WordPress at the moment):

The behavior of the flock is analogous to the business that an organization is engaged in.  With regular top-down or outside-in processes, you start with the emergent business behavior that you want to support, and model that directly.  To refer to the analogy, you draw descriptions of the bunching, flowing, and swirling of the flock, and from that you would come up with specific flight paths that individual birds would need to follow to get that overall behavior.  However, that is not how the birds actually do it!

You can simulate this murmuration behavior by endowing individual birds with a few simple rules: match speed with nearby birds, try to stay near the group, and leave enough space to avoid hitting other birds.  Computer simulation using these rules produces flock behavior very similar to that of the starlings shown in the video.

[figure: simulated murmuration – emergent flock behavior on the left, the individual-bird rules that produce it on the right]

On the left you see the emergent flock behavior, and on the right the rules that produce that, but there is no known way to derive the rules from the flock behavior.  (These rules were found by trial & error experimentation in the simulator.)

The behavior of the birds in a flock emerges from the behaviors of the individual bird interactions — there is no guidance at the flock level.  This is very much like business: an organization has many individual people interacting, and the business emerges as a result.  Obviously the interaction of people is far more complex than that of the birds, and business equally more complex than flock behavior, but the analogy holds: business can be modified indirectly by changing the rules of behavior of individuals.
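
As an aside, those three rules are essentially the classic boids model, and they are small enough to sketch in a few lines of code.  The class name, neighborhood radius, and rule weights below are all invented for this illustration; as noted above, workable weights are found by trial and error in the simulator.

import java.util.List;

// Minimal boids-style sketch of the three rules above: match speed with
// nearby birds, stay near the group, and keep enough space to avoid
// collisions. All names and weights here are illustrative.
class Bird {
    double x, y;    // position
    double vx, vy;  // velocity

    void step(List<Bird> neighbors) {
        double avgVx = 0, avgVy = 0; // rule 1: alignment (match speed)
        double cx = 0, cy = 0;       // rule 2: cohesion (stay near the group)
        double sepX = 0, sepY = 0;   // rule 3: separation (avoid collisions)
        for (Bird b : neighbors) {
            avgVx += b.vx; avgVy += b.vy;
            cx += b.x; cy += b.y;
            double dx = x - b.x, dy = y - b.y;
            double dist = Math.hypot(dx, dy);
            if (dist > 0 && dist < 2.0) { // too close: push away
                sepX += dx / dist;
                sepY += dy / dist;
            }
        }
        int n = neighbors.size();
        if (n > 0) {
            // Nudge the velocity toward each rule; the weights were picked by
            // hand, just as the post says the real rules were found by trial and error.
            vx += 0.05 * (avgVx / n - vx) + 0.01 * (cx / n - x) + 0.1 * sepX;
            vy += 0.05 * (avgVy / n - vy) + 0.01 * (cy / n - y) + 0.1 * sepY;
        }
        x += vx;
        y += vy;
    }
}

Note that nothing in this code describes the flock; the swooping shapes emerge entirely from each bird running step() on its own.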

Top-Down Design Runs Into Trouble

Consider the bird flock again, and try to reproduce its behavior the way we do with a typical BPM approach.  In BPM we would define the overall process that is desired, and then we would determine the steps of everyone along the way to make that happen.  For the bird flock, that would be like outlining the shape of the flock, stating that the goal is a particular shape and a particular swooping, and then calculating the flight paths of each of the birds in order to get the desired output.  That might seem like a daunting task for so many birds, but it is doable.  The result is that you will have a precisely defined flock flying pattern.

This pattern would be very fragile.  If you tried to fly where a tree was in the way, some of the pre-calculated bird trajectories would hit the tree.  If there was a hawk in the region, some of the birds would quite likely be captured, because the path is fixed.  To fix this, you would have to go back to the overall flock design, come up with a shape that avoids the specific tree or makes a hole for the predator, and then calculate all the bird trajectories again.  The bird flock behavior has become fragile because any small perturbation in the context requires manually revisiting, and modifying, the overall plan.

With the bottom-up approach, these situations are cleanly handled by adding a couple more rules: avoid trees and other stationary things, and always keep a certain distance from a predator.  By adding those rules, the behavior of the flock becomes stable in the face of those perturbations.  If we design the rules properly, the birds are able to determine their own flight paths.  They do so as they fly, and automatically take into account any need to change the overall flock structure.  Flocks automatically avoid trees, and they automatically make a hole where a predator flies.  Note of course that we cannot be 100% sure of exactly what the flock will look like when it is flying, but we do know that it will have the swooping behavior, as well as avoiding trees and predators.

The problem with modeling the high-level epiphenomenon directly is that once you specify the exact flight paths of the birds, the result is very fragile.  Yes, you get a precise overall behavior, but you get only that exact behavior.  When the conditions change, you are stuck, and it is hard to change.  If, however, you model the micro-level rules, the resulting macro-level behavior automatically adapts to the new, unanticipated situation without any additional work.

What is an Etiquette Model?

Etiquette is a term that refers to the rules of interactions between individuals.  Each individual follows their own rules, and if these rules are defined well enough, proper business behavior will emerge.  We can’t call this “Business Rule Modeling” because that already exists and means something quite different.  The term ‘etiquette’ implies that the rules are specifically for guiding the behavior of individuals at the interpersonal level.

The etiquette model defines explicitly how individuals in particular roles interact with others.  There would be a set of tasks that might be performed, as well as conditions for when to perform each task, structured as a kind of heuristic that can be used as needed.  Selection criteria might include specific goals that an individual might have (such as “John is responsible for customer X”) as well as global utilities (such as “try to minimize costs” or “assure that the customer goes away satisfied”).  The set of heuristics is over-constrained, meaning that the individual does not simply follow all the rules, but has to weigh the options and choose the best guess for the specific situation.


For example, a role like “Purchasing Agent” would be fully defined by all the actions that a purchasing agent might take, and the conditions that would be necessary for such a role player to take action.   They might purchase something only when the requesting party presents a properly formed “purchase request” document which carries the proper number of approvals from the right people in the organization.  Defined this way, any number of different business processes might have a “purchase by purchaser” within them, and the rules for purchasing would be consistent across all of them.  If there is a need to change the behavior of the purchaser, those ‘etiquette’ rules could be changed, and as a result all of the processes that involve purchasing would be automatically modified in a consistent way.
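
To make the idea of an over-constrained heuristic set concrete, here is a purely hypothetical sketch of how a role’s etiquette might be encoded.  Situation, Heuristic, and RoleEtiquette are names invented for this illustration; no existing modeling tool is implied.

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.Optional;
import java.util.function.Predicate;
import java.util.function.ToDoubleFunction;

// The facts visible to a role player, e.g. a pending purchase request
// with its approvals. Left empty here for brevity.
class Situation { }

// One heuristic: a condition for when the action may be taken, plus a
// utility estimating how good it is here (cost, speed, satisfaction, ...).
class Heuristic {
    final String name;
    final Predicate<Situation> applies;
    final ToDoubleFunction<Situation> utility;

    Heuristic(String name, Predicate<Situation> applies, ToDoubleFunction<Situation> utility) {
        this.name = name;
        this.applies = applies;
        this.utility = utility;
    }
}

// A role's etiquette is an over-constrained set of heuristics: the role
// player does not follow every rule, but weighs the applicable options
// and picks the best guess for the specific situation.
class RoleEtiquette {
    final List<Heuristic> heuristics = new ArrayList<>();

    Optional<Heuristic> choose(Situation s) {
        return heuristics.stream()
                .filter(h -> h.applies.test(s))
                .max(Comparator.comparingDouble(h -> h.utility.applyAsDouble(s)));
    }
}

Changing the purchaser’s behavior then means editing this one rule set, and every process that happens to route through a purchasing step picks up the change automatically.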

Isn’t this the Functional Orientation that BPM avoids?

The answer is yes and no.   Yes, it is modeling very fine-grained behavior with a set of heuristics that tell what one individual will do, to the exclusion of all others.  There is a real danger that the rules for one role might be defined in such a way as to throw a tremendous burden on everyone else in the organization.  This could decrease the overall efficiency of the organization.  We cannot optimize one role’s etiquette rules to the exclusion of all other roles — we need to consider how the resulting end-to-end business process appears.

Given the heuristics and guidelines for all the individuals that will be involved in a process, it is possible to simulate what the resulting business processes will be.  Using predictive analytics, we can estimate the efficiency of that process, and particular waste points can be identified.  This can be used to modify the etiquette of the individual participants so that overloaded individuals do slightly fewer things, and underloaded individuals do a bit more, and so that the overall end-to-end process is optimized.

The result is that you achieve the goals of BPM: you are engaged in a practice of continually improving your business processes.  But you do so without directly dictating the form of the process!  You dictate how individuals interact, and the business process naturally emerges from that.

Is this Worth the Trouble?

The amazing result of this approach is that the resulting business process is anti-fragile!   When a perturbation appears in the organization, the business processes can automatically, and instantly, change to adapt to the situation.  A simple example is a heuristic for one role to pick up some tasks from another role if that other role is overloaded.  Normally it is more efficient for Role X to do that task, but if, because of an accident, several of the people who normally play Role X end up in the hospital for a few weeks, the business process automatically, and instantly, adjusts to the new configuration, without any involvement of a process designer or anyone else.

Consider a sales example.  There can be multiple heuristics for closing a deal: one that explores all possible product configurations to identify the ideal match with the customer and maximizes revenue for the company, and another heuristic that gets things approximately right but closes very quickly.  As you get closer to the end of the month, the priority to close business in the month might shift from the more accurate heuristic to the quick-and-dirty heuristic, in order to get business into that month’s accounting results.  These kinds of adaptations are incredibly hard to model using the standard workflow-diagram type of approach.

The Amazon Example

Wil van der Aalst, in his keynote at EDOC 2014, reminded me of a situation that happened to me recently with some orders from Amazon.  On one day I ordered two books and one window sticker from Amazon.  The next day, I remembered another book, and ordered that.  The result was that a few days later I received all three books in a single shipment, and the window sticker came a week after that separately.  The first order was broken into two parts for shipping, and then the second order was combined together with part of the first order.

This is actually very hard to model using BPMN.  You can make a BPMN process for a particular item, such as a book, which starts by being ordered and is ultimately shipped, but the treatment of the order, splitting when necessary and combining when necessary, will not appear in the BPMN diagram.  It is hard (or impossible) to include the idea of “optimize shipping costs” in a process that represents the behavior of only a single item of the purchase.

When you model the Business Etiquette of the particular roles, it is very easy to include a heuristic to split an order into parts when the parts are coming from different vendors.  Not every order is split up; there are guidelines that dictate when this heuristic should and should not be used.  The same goes for the shipper, who might have a heuristic to combine shipments if they are close enough together, so that shipping costs can be reduced.
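
Here is a purely hypothetical sketch of what such split/combine heuristics could look like in code.  Item, ShipmentPlanner, and the two-day threshold are invented for this illustration, and this is not a claim about how Amazon actually does it.

import java.time.Duration;
import java.time.Instant;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// One orderable item; items from different warehouses must be split.
class Item {
    final String orderId;
    final String warehouse;
    final Instant readyAt;  // when the item can go into a box

    Item(String orderId, String warehouse, Instant readyAt) {
        this.orderId = orderId;
        this.warehouse = warehouse;
        this.readyAt = readyAt;
    }
}

class ShipmentPlanner {
    static final Duration CLOSE_ENOUGH = Duration.ofDays(2); // illustrative guideline

    List<List<Item>> plan(List<Item> pending) {
        // Split heuristic: group by warehouse, since those parts ship separately anyway.
        Map<String, List<Item>> byWarehouse = new HashMap<>();
        for (Item item : pending) {
            byWarehouse.computeIfAbsent(item.warehouse, w -> new ArrayList<>()).add(item);
        }
        // Combine heuristic: within a warehouse, merge items (possibly from
        // different orders) whose ready times are close enough to share one box.
        List<List<Item>> shipments = new ArrayList<>();
        for (List<Item> group : byWarehouse.values()) {
            group.sort(Comparator.comparing(i -> i.readyAt));
            List<Item> box = new ArrayList<>();
            for (Item item : group) {
                if (!box.isEmpty() && Duration.between(box.get(0).readyAt, item.readyAt)
                        .compareTo(CLOSE_ENOUGH) > 0) {
                    shipments.add(box); // waiting any longer isn't worth it: close this box
                    box = new ArrayList<>();
                }
                box.add(item);
            }
            if (!box.isEmpty()) shipments.add(box);
        }
        return shipments;
    }
}

Notice that no end-to-end order process is modeled anywhere here; the split-then-recombined shipments of the Amazon story simply fall out of two local rules.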

This approach also allows for supporting things like the Kanban method, which constrains the number of instances that can be in a particular step at a time.  BPMN has no way to express these kinds of constraints that cross multiple processes.

Summary

Let’s discuss this approach.  My rather cursory search did not turn up any research on representing business processes by way of the interactions between individual roles in the organization, although on Monday at the BPM conference I saw a good paper called “Opportunistic Business Process Modeling” which was a start in this direction.  I will post links to research projects if I find some.

This approach also works well for adaptive case management.  The heuristics and guidelines can be used within a case to guide the behavior of the case manager and other participants.  If this is done, then even though you cannot predict the course of a single instance, you can use predictive analytics to approximate the handling of future cases.  This technique might be a new tool in the BPM toolkit.


by kswenson at September 09, 2014 04:45 AM

September 05, 2014

Keith Swenson: Final Keynote EDOC 2014: Barbara Weber

Barbara Weber is a professor at the University of Innsbruck in Austria.  Next year she will be hosting the BPM 2015 conference at that location.  She gave a talk on how they are studying the difficulties of process modeling.   My notes follow:

Most process model research focuses on the end product of process modeling. Studies have shown that a surprisingly large number of existing models, from 10% to 50%, have errors.  Generally, process models are created and then the quality of the final model is measured, in terms of model complexity, model notation, and secondary notation, measuring accuracy, speed, and mental effort.   Other studies take collections of industrial models, measure size, control flow complexity, and other metrics, and look for errors like deadlocks and livelocks.

The standard process modeling lifecycle is (1) elicitation, and then (2) formalization. Good communication skills are needed in the first part; the second part requires skills in a particular notation. She calls this PPM (the process of process modeling). Understanding this better would help both practice and teaching. It can be captured from a couple of different perspectives:

1) logging of modeling interactions
2) tracking of eye movement
3) video and audio
4) biofeedback collecting heart rate etc.

The Nautilus Project focused on logging the modeling environment. The Cheetah Experimental Platform (CEP) guides modelers through sessions, and it also records the entire session and plays it back later.  The resulting events can be imported into a process mining tool to analyze the process of process modeling.  She showed some details of the log file that is captured.

Logging at the fine-grained level was not going anywhere, because the result looked like a spaghetti diagram.  They broke the formalization stage into five phases:

  • problem understanding: what the problem is, what has to be modeled, what notation to use
  • method finding: how to map the things into the modeling notation
  • Modeling: actually doing the drawing on the canvas
  • Reconciliation: improving the understandability of the model, through factoring, layout, and typographic clues, all of which make maintenance easier
  • Validation: searching for quality issues by comparing external and internal representations, covering syntactic, semantic, and pragmatic quality issues

They validated this with users doing “think aloud” work.  They could then map the different kinds of events to these phases.  For example, creating elements belongs to the modeling phase, while moving and editing existing elements is more often the reconciliation phase.  She showed two charts from two users: one spent a lot of time in problem understanding and then built quickly; the other proceeded quite a bit more slowly, adding and removing things over time.

Looking at different users, they found (unsurprisingly) that less experienced users take a lot more time in the ‘problem understanding’ phase.  In ‘method finding’ they found that people with a lot of domain knowledge were significantly more effective.  At the end, there are long understanding phases that occur around the clean-up.  They did not look at ‘working memory capacity’ as a factor, even though it is well known to be a factor in most kinds of modeling.

The second project, “Modeling Mind”, took a look at eye movements and other biofeedback while modeling.  These additional events in the log will add more dimensions of analysis.  With eye tracking you measure the number of fixations and the mean fixation duration, then define areas of interest (modeling canvas, text description, etc.).  They found that eye trace patterns matched well to the phases of modeling.  During initial understanding, modelers spend a lot of time on the text description with quick glances elsewhere.  During the building of the model, naturally, they look at the canvas and the tool bar.  During reconciliation there is a lot of looking from model to text and back.

What they would like next is a continuous measure of mental effort.  That would give an indication of when people are working hard, and when that changes.  These might give some important clues.  Nothing is available at the moment to make this easy, but they are trying to capture it, for example by measuring the size of the pupil.  Heart rate variability is another way to approximate this.

Conclusion: it is not sufficient to look only at the results of process modeling — the process maps that result — but we really need to look at the process of process modeling: what people are actually doing at the time, and how they accomplish the outcome.  This is the important thing you need to know in order to build better modeling environments, better notations and tools, and ultimately increase the quality of process models.  This might also produce a way to detect errors that are being made during the modeling, and possibly ways to avoid those errors.

Note that there was no discussion today of the elicitation phase (process discovery), but that is an area of study they are pursuing as well.

The tool they use (Cheetah) is open source, so there are opportunities for others to become involved.

Q&A

Can the modeling tool simulate a complete modeling environment?  Some of the advanced tools check at run time and don’t allow certain syntactic errors.  Can you simulate this? –  The editor models BPMN, and there is significant ability to configure the way it interacts with the user.

Sometimes it is unclear what is the model, and what is the description of the model.  Is this kept clearly separated in your studies?  Do we need more effort to distinguish these in modelers?  – We consider that modeling consists of everything, including understanding what you have to do, sense making, and then the drawing of the model.

This is similar to cognitive modeling.  Have you considered using brain imaging techniques?  – We will probably explore that.  There is a student now starting to look at these things. We need to think carefully about whether the subject is important enough for such a large investment.

Have you considered making small variations in the description, for example a tricky keyword, and seeing how this affects the task?  – We did do one study where we had the same, slightly modified requirements to model.  These can have a large effect.

Starting from a greenfield scenario, right?  What about using these techniques for studying process improvement on existing models? – There has been a little bit of study of this.  The same approach should work well.  It would definitely be interesting to do more work on this.

 


by kswenson at September 05, 2014 08:15 AM

Thomas Allweyer: BPM in Practice discusses ACM, the Internet of Things, and more

“Enterprise BPM 2.0, Adaptive Case Management and the Internet of Things – how does it all fit together?” asks Dirk Slama, author of the recommendable book “Enterprise BPM”, in his keynote at the “BPM in Practice” workshop on October 9 in Hamburg. Adaptive case management and its practical application will then be taken up and explored in depth by several speakers in the parallel tracks. Further topics include the validation of process models using scenarios, process mining, collaboration across tools and organizations, decision management, and the practical path from model to automation in 45 minutes.

The full program and a registration form can be found here.

by Thomas Allweyer at September 05, 2014 08:08 AM

September 04, 2014

BPM-Guide.de: New Edition: Praxishandbuch BPMN 2.0

The latest edition is now available in stores – for example at Amazon. Unfortunately, as always, all reviews of the previous edition are lost on Amazon, meaning we are starting again from zero. So if anyone has the time and inclination to share their opinion of the book there (again), we would be more [...]

by Jakob Freund at September 04, 2014 08:13 AM