Planet BPM

December 20, 2014

BPM-Guide.de: CMMN – The BPMN for Case Management?

Scott Francis recently asked a legitimate question: Will BPM vendors adopt CMMN, or will they focus instead on topics like mobile/social/local/cloud? (Read the complete blog post here)

As the obviously biased CEO of one of the few vendors that have already implemented CMMN, I am actually sceptical. My experience with BPMN over the last 8 years has been that most of the established BPM vendors balked at implementing it, arguing that they already had something “better” for the same use case. I suppose it’s the same with Case Management now, so I predict they will only make the effort of supporting it when …

by Jakob Freund at December 20, 2014 03:25 PM

December 19, 2014

Drools & JBPM: UI Front-end Developer Job Opening in the Drools and jBPM team


We are looking to hire a front-end developer to work closely with our UXT team to make our web UIs look and work beautifully. The ideal candidate will have an artistic flair and a solid grounding in what makes a good UI. Our existing web tooling can be seen here.

The developer will be working with Java, GWT, Errai, HTML5 and JS. The role is remote.

http://jobs.redhat.com/jobs/descriptions/jboss-software-engineer-job-1-4960459



by Mark Proctor (noreply@blogger.com) at December 19, 2014 10:21 PM

December 16, 2014

Keith Swenson: When Thinking Matters in the Workplace

In the four and a half years since “Mastering the Unpredictable” introduced the idea of Adaptive Case Management to the world, a growing group of people has struggled to define what it really means to make use of this emerging trend.  This new book, “When Thinking Matters in the Workplace,” takes it one step further — outlining what a manager needs to know to lead a team of innovative knowledge workers, and how to put in place a system to best support them.

We know that innovation is the key to success, not just in high tech but in all industries.  Innovation is what happens when knowledge workers are successful.  Innovation is the result of thinking “how can we do this better?”  Until recent years, innovation was a largely manual activity.  The emergence of BPM technology finally enabled the automation of knowledge worker processes — but then we ran into a problem.  Innovation is not a routine process.  Innovation happens differently every time.  You can’t just define the process of innovation and automate it.

The decision of how to support workers in an office should not be left to technologists.  At the level of knowledge workers, the technology defines how the business will operate.  This is a core business decision that must be made by the executives of the organization, not the IT department.  The way that knowledge workers are supported strongly affects the way the entire business operates.

That is why we wrote this book to address the problem at the executive management level.  This is not a book for technologists.  It is a book for managers, to help them understand the choices they must make and how those choices will affect their ability to create new products and services.

Knowledge workers are quite flexible and capable in many ways, so there is no single prescription for how to support them.  However, it is easy to identify technology that is bad for knowledge workers.  It is easy to put in place technology that restricts workers and eliminates innovation.  You won’t notice that innovation is gone until it is too late.

This project is the culmination of years of study on how successful teams work together to innovate.  We go into detail on what innovation and knowledge work are, why they are difficult to understand, how they differ from routine work, what good management is, and how technology needs to support all this.  After reading this book, you will be prepared with good reasons for choosing one approach over another, with solid references to support your position.

For readers of this blog post, for a limited time: sign up for the mailing list and get the first chapter for free.  I will be sending all the chapters out to this mailing list for review over time as well, so stay on the list if the first chapter is of value to you.

Outline of the Contents

1. Innovation Management Challenge.  What is innovation?  How do innovators work, including some examples.  Some misconceptions about innovation: it is rarely a sudden epiphany, and it does not require incredible mental powers.  Distinguishing routine work from knowledge work.  Why innovators need leaders, and how leaders differ from managers.  How innovators need autonomy and accept unpredictability.

2. Understanding Complexity.  The biggest mistake in supporting innovation is tied to misunderstanding complexity.  Innovation is not simple.  This high-level overview of complexity science gives us an understanding of why reducing a business to its simplest form tends to eliminate the ability to innovate, which is necessary at all levels.  We need to leave behind thinking of an organization as a machine, and understand that organizations are complex systems.  Understanding how complex systems behave prepares us for the choices we have to make in how to support them.

3. Management and Leadership.  Knowledge workers are not like machines.  Experienced managers already know how to lead people who think.  We examine the evidence from management science supporting this, and at the same time contrast such wisdom with the mechanistic approaches that IT departments tend to favor.  We show how sound management principles are often discarded when designing systems for supporting workers.

4. Agile Management.  Building on this, we give a brief overview of the ways one should lead and support knowledge workers so as to give them the freedom to innovate.  Many of these ideas come from other fields, and we are repurposing them for the general work environment.  We examine how Toyota did this for the manufacturing industry.  We examine how Agile approaches have been used in software and high-tech fields.  We also briefly cover Lean and Six Sigma recommendations for leaders, as well as the power of a checklist.  This gives us an idealized vision of how knowledge workers might be enabled to work best together.

5. Business Architecture.  How do you define your business?  How does your business succeed?  This is not about IT systems, but about the organization itself.  Every executive knows that the shape of the organization is critical to its long-term success.  In this chapter we focus on some new ways of organizing that have only recently become possible because of advances in information technology.  How push organizations differ from pull organizations.  We look at a company that has eliminated management at all levels, and how that works.  We look at flow-to-work organizations, hyper-social organizations, and wirearchies.  There is no single best architecture, but most of us should consider whether some aspects of these radical new approaches should be embraced.

6. Business Technology.  With a solid grounding in where you want to take your organization, we then discuss recently emerged technical options and how they might affect your organization.  What collaboration technology is, how email is both a benefit and a vice, how social networks might be used, enterprise 2.0, and systems of engagement.  We present a range of 7 different process technologies, with a focus on two types of case management.  There are then a couple of other technological aspects you need to be concerned with: identity, security, and some challenges with mobile technology.

7. Roadblocks to Innovation.  Here we delve into problems that you are likely to run into if you take a naive approach to supporting knowledge workers.  These are things that look good in concept, but when deployed they turn against you.  Some are promises that technology vendors make, stated in confusing and misleading ways.  Beware the ‘snake oil’!  Some approaches are very attractive but don’t work.  This chapter prepares you to avoid the misleading approaches.

8. What is Case Management?  The way to support innovation is case management, but once again, not everything that carries this name will necessarily fit the need.  This chapter goes into depth on the features and capabilities that you should expect to find, as well as a discussion of why these features are important.  If you are considering the purchase of case management technology, you should certainly read this chapter first.

9. Patterns of Innovation.  As you already know, the technology is not the whole story.  This chapter talks about how to successfully use technology in support of innovation.  It touches on culture change, adoption strategies, how to design for change, how to leverage the intelligence of the workers, how to reduce the cost of supporting work, and how to best leverage the transparency and social ties of case management.

10. Fumbling Innovation.  The flip side of the coin: this chapter covers how technology can be used poorly and effectively stop innovation.  These are patterns to look for, and avoid, among the workers.  Don’t micromanage, don’t ask for too much detail up front, don’t prevent changes to plans, don’t punish people who try and fail.  In general, how to avoid making a working environment that is unfriendly to innovation.

11. Leading the Innovators.  Tips and techniques to help knowledge workers understand how these approaches help them innovate, and why that is important for the business.  Some hints on how to inspire workers to make the best use of the technology.

12. Lessons from the Field.  Covers some real-life cases that used case management technology: how they used it, what they did right, and what they did wrong.

Summary

At 370 pages, this book is packed with references that will help you back up the approach you take to supporting knowledge workers.  Whether you want to try out a radical new style of organization, or whether you want to stick with your reliable existing organization and simply wish to eliminate unnecessary paperwork, this book will give you the needed details.  Knowledge is power, and we have tried to bring together everything you might need to make a decision about Case Management for your organization.

If thinking does matter in your organization:

  • You already know that smart people need autonomy in order to innovate and create.
  • Automation, done haphazardly, can put straitjackets on your most creative people.
  • You don’t want to put in place technology that micro-manages your knowledge workers.
  • Robots don’t innovate.
  • This book provides some guidelines to help you avoid the worst pitfalls and to take best advantage of this adaptive approach for supporting teams of knowledge workers.

by kswenson at December 16, 2014 10:15 AM

December 15, 2014

Thomas Allweyer: There Is No Perfect BPMS – New Study from Fraunhofer IESE

A total of 18 vendors of BPM suites took part in the new study by Fraunhofer IESE, which was conducted for the second time this year. Each tool was put through its paces in a full-day workshop. The representatives of each vendor had to demonstrate the implementation of a predefined process. Over the course of the workshop, numerous changes that had not been announced in advance had to be carried out, and further features had to be shown. I was able to take part in some of the workshops as a guest. This gave a fairly comprehensive picture of each system – apart from a few topics that were explicitly excluded from the examination, such as case management functionality.

In the overall evaluation, the tools examined are relatively close together. There was neither a truly bad tool, nor could a perfect BPMS be identified as a lone front-runner. In particular, the power of the tools, i.e. the degree of coverage of the examined features, is mostly quite high, whereas there is often still room for improvement in ease of use. Bizagi achieved the highest overall rating. SoftProject is the tool with the greatest feature coverage. Axon IVY and IBM were rated “good” in most individual categories.

In the individual categories, such as process execution, process modeling, or process controlling, considerably larger differences showed up than in the overall evaluation. When selecting a suitable BPMS, one must therefore consider carefully which functionalities matter most. The governance capabilities of the examined tools, for example, differ widely. As a first orientation aid, the study therefore contains a decision tree for finding the tools that cover the desired aspects to an above-average degree. The individual analyses of each of the 18 BPM suites provide a more detailed picture.


Adam, S.; Koch, M.; Neffgen, F.; Riegel, N.; Weidenbach, J.:
Business Process Management – Marktanalyse 2014
BPM Suites im Test
Fraunhofer IESE, Kaiserslautern 2014
Download the summary and ordering information

by Thomas Allweyer at December 15, 2014 09:50 AM

December 05, 2014

Keith Swenson: Drucker Forum 2014 Update

It is time again for the Global Peter Drucker Forum.  Here are some highlights of talks from John Hagel, Clayton Christensen, Gary Hamel and others.

All of the talks are recorded and available on the web at http://www.druckerforum.org/2014/

  • John Hagel – The Dark Side of Technology.  Gave a quick overview of ‘The Big Shift.’  Increasing pressure in three levels:
    (1) removing barriers to entry/movement giving more competition,
    (2) accelerating pace of change,
    (3) connectivity means disruptions can come from anywhere on Earth.
    Analysis of all public companies shows that the return on assets has declined 75%.  There is a mindset disconnect.  Shift from scalable efficiency to scalable learning.  A CEO needs to ask three basic questions:
    (1) what is our business?
    (2) what should it be?
    (3) what should it not be?
    There are three types of business:
    (1) High volume routine processes,
    (2) product innovation,
    (3) customer relationship.  These three business types are very different; we should not lump them together.  The democratization of the means of production of content means that content has grown tremendously.  The same will happen with 3D printing.  The dark side of technology becomes a catalyst for change.
  • Lawrence Crosby – Drucker School of Management has some strategic initiatives
    (1) A Drucker Index as a measure of organizational effectiveness
    (2) Invent/innovation extend to challenge of managing creative workers
    (3) customerific,
    (4) bringing goals of Drucker into economy.
    How to deal with an uncertain future?  Agility, resilience, foresight, and a solid foundation.
  • Clayton Christensen – The Capitalist’s Dilemma.  What causes managers to invest to grow?  Growth comes from innovation, and the link is investment.  Three types of innovation:
    (1) market creating innovation/disruption (growth)
    (2) sustaining innovation
    (3) efficiency innovation.
    If these are all in balance the economy does well.  Finance gives us metrics which are ratios (e.g. IRR, PE, RONA), either increase the numerator, or decrease the denominator.  Investing in market creating innovation destroys these measures.  All free cash goes to efficiency innovation.  The show is being run by the people who dictate the metrics.  Gave the example of Japan.  Not a small problem.
  • Gary Hamel – Hacking Management.  How do you know if you have an evolutionary advantage?  Most companies don’t.  How to build a self-renewing organization.  Not one company of 100 makes innovation intrinsic.  Ask a random selection of employees
    (1) are you trained in innovation?  have you made investment in innovative capital?
    (2) how quickly can you take a small amount of experimental capital and produce something new?
    (3) are you measuring innovation?
    Enormous problem: only 13% of employees are engaged in their work.  Our organizations are less capable than the people inside them.  Core incompetency.  Every organization today is still run on principles of hierarchy/bureaucracy.  Management is a mashup of military command structures and the disciplines of industrial engineering; we are stuck with management that is the love child of Julius Caesar and Frederick Winslow Taylor. Organizations fail when the leaders fail to write off their own depreciating intellectual competence. All of us would be hard pressed to imagine the structure of some of the world’s most amazing companies.  It will take:
    (1) models of managing without managers (mentioned MorningStar)
    (2) motivation, admit that no alternative to move to new management model
    (3) change mindset and residual beliefs away from hierarchy
    (4) migration paths to get there from here
    Where do you see evidence of bureaucratic management drag?  The organizations that win in the next few years will be the ones that learn how to evolve their management models.  Fiddling at the margins will not work.  The goal is not just to wrap theory around reality, but to change reality.  Nobody should be sitting on the sidelines.

(more to come)


by kswenson at December 05, 2014 06:03 PM

December 02, 2014

Thomas Allweyer: BPMN Tools Fall Short at Enterprise Modeling – Study Available for Download

Most modeling tools have implemented at least the largest part of the BPMN standard, so that they no longer differ greatly with respect to process modeling. However, there are very large differences in how BPMN models can be linked to other models, such as organization charts, data models, or IT landscapes.

Since a company's processes do not stand in isolation, it is essential for consistent process management to consider their interplay with other aspects of the enterprise. Approaches to integrated enterprise modeling and Enterprise Architecture Management (EAM) therefore aim at linking the different models. Each model represents a view of an integrated overall model. If business processes are modeled in BPMN, one must decide how to link them with other models in a methodically sound way. Because this is not regulated by the BPMN standard, quite different linkages between BPMN models, organization charts, strategy models, etc. have emerged in practice.

For the present study, I evaluated integration approaches from the academic literature on the one hand, and on the other hand examined a total of 13 modeling tools with respect to linking and extension options for BPMN models. As it turned out, the topic does not play a major role in academia. While quite a number of researchers have developed special BPMN extensions, the linkage with models of other views has rarely been examined. The analysis of the modeling tools was considerably more productive. It included business-oriented modeling platforms (also known as Business Process Analysis or BPA tools), modeling tools from software development, and modeling components of Business Process Management Systems (BPMS) for process execution.

It turned out that the options for method integration differ greatly and often cover only a few aspects. There is a certain base set of aspects that can frequently be linked to BPMN models: in at least six of the 13 tools each, process models can be linked with process landscapes, data, IT systems, risks, or the organizational structure. However, there is hardly a system that covers even these five aspects in a single tool. Moreover, the same aspects are modeled and linked in methodically quite different ways in different tools. The result is a colorful variety of methods that is hard for users to see through. If, for example, two tools advertise that BPMN models can be linked with the organizational structure, data structures, and IT landscapes, one can by no means assume that the sets of methods actually offered are equally powerful.

In the case of some comprehensive modeling platforms, the rather weak integration of BPMN diagrams with other model types is surprising. Links sometimes have to be made in a cumbersome way via separate assignment diagrams. The extension options of BPMN, e.g. additional artifacts, are also used only sparingly. With few exceptions, the documentation of the model integration is poor and incomplete. Due to the heterogeneity of the methods offered, the decision for a particular tool is at the same time a decision for a particular methodology.

It would therefore be desirable for both researchers and standardization organizations to engage more intensively with the integration of models from different standards. Tool vendors are called upon to critically review their method portfolios in conjunction with BPMN and, where appropriate, to add frequently needed model types and connections to BPMN models. Before selecting a tool, users should analyze quite precisely which matters they want to model beyond pure BPMN process modeling and which connection options to other model types they accordingly require. They should be aware that by choosing a particular tool they also commit to a set of possible notations and their connections.

Download the study “BPMN-Prozessmodelle und Unternehmensarchitekturen” (BPMN Process Models and Enterprise Architectures).

by Thomas Allweyer at December 02, 2014 07:06 AM

December 01, 2014

Sandy Kemsley: BPM Cyber Monday: Camunda 7.2 Adds Tasklist And CMMN

I caught up with Jakob Freund and Daniel Meyer of camunda last week in advance of their 7.2 release; with 1,700 person-days of work invested in this April-November release cycle, this includes a new...

[Content summary only, click through for full article and links]

by sandy at December 01, 2014 08:24 PM

November 27, 2014

Sandy Kemsley: Activiti BPM Suite – Sweet!

There are definitely changes afoot in the open source BPM market, with both Alfresco’s Activiti and camunda releasing out-of-the-box end-user interfaces and model-driven development tools to augment...

[Content summary only, click through for full article and links]

by sandy at November 27, 2014 04:42 PM

November 21, 2014

Thomas Allweyer: BPMN Certification Course with ARIS

The BPMN certification course I co-developed, which has been running very successfully in Switzerland for quite some time, is coming to Germany at the beginning of next year. Together with the company aproo, which among other things offers numerous services in the ARIS environment, the course has been adapted for use with Software AG's ARIS modeling tools. For a long time, the Event-driven Process Chain (EPC) dominated among ARIS users as the notation for modeling business processes, but BPMN is gaining importance here as well, as an alternative or additional representation method. In addition to process modeling, ARIS offers numerous methods for the integrated modeling of different views, such as data, organization, or IT landscapes. The course therefore also explains how BPMN models can be connected with other model types.

The focus of the training, however, is a thorough introduction to BPMN itself. Through many examples and practical exercises, participants learn to understand the notation and apply it themselves. The two-day course is therefore aimed not only at ARIS users, but at anyone who wants to learn process modeling with BPMN. Optionally, a certificate can be obtained after the course. The first course takes place on January 27 and 28 in Frankfurt. Further dates in other regions are in preparation.

Further information and registration

by Thomas Allweyer at November 21, 2014 10:35 AM

Drools & JBPM: Red Hat JBoss BRMS and BPMS Rich Client Framework demonstrating Polyglot Integration with GWT/Errai/UberFire and AngularJS

Last week I published a blog post highlighting a presentation I gave showing the rich client platform that has resulted from the work we have done within the BRMS and BPMS platforms, the productised versions of the Drools and jBPM projects. The presentation is all screenshots and videos; you can find the blog post and the link to the SlideShare here:
"Red Hat JBoss BRMS and BPMS Workbench and Rich Client Technology"

The presentation highlighted the wider scope of our UI efforts, demonstrating what we've done within the BRMS and BPMS platforms and the flexibility and adaptability provided by our UI technology. It provides a great testimony to the power of GWT, Errai and UberFire, the three technologies driving all of this. We can't wait for the GWT 2.7 upgrade :)

As mentioned in the last blog post, the UberFire website is just a placeholder and there is no release yet. The plan is first to publish our 0.5 release, but that is more for our BRMS and BPMS platforms. We will then move to GWT 2.7 and work towards a UF 1.0, which will be suitable for wider consumption. With 1.0 we will add examples and documentation and work on making things more easily understood and consumable for end users. Of course there is nothing to stop the adventurous trying 0.5; the code is robust and already productized within BRMS and BPMS - we are always on IRC to help, Freenode #uberfire.

That presentation itself built on the earlier videos showing our new Apps framework:
The Drools and jBPM KIE Apps Framework

The above video already demonstrates our polyglot capabilities, building AngularJS components and using them within the UF environment. It also shows off our spiffy new JSFiddle-inspired RAD environment.

I'd now like to share with you the work we've done on the other side of polyglot development - this time using GWT and UF from within AngularJS. It was important that we allow for an AngularJS-first approach that works with the tool chain AngularJS people are familiar with. By AngularJS first, I mean that AngularJS is the outermost container, whereas in the above video UF is already running and is the outer container in which individual AngularJS components can be used.

Before I detail the work we've done, it is best to first cover the concepts of Screens and Perspectives, our two main components providing our polyglot interoperability - there are others, but this is enough to understand the videos and examples that come next. A Screen is our simplest component: it is a DIV plus optional life-cycle callbacks. A Perspective is also a DIV, but it contains a composition of 1..n Screens with different possible layout managers and persistence of the layout.

Screen
  • CDI discovery, or programmatically registered.
  • DIV on a page.
  • Life cycle callbacks.
    • OnStart, OnClose, OnFocus, OnLostFocus, OnMayClose, OnReveal.
  • Decoupling via Errai Bus.
    • Components do not invoke each other, all communication handled by a bus.
  • Editors extend Screens, are associated with resource types, and provide additional life-cycle callbacks
    • onSave, isDirty.
Perspective
  • CDI discovery, or programmatically registered.
  • Composition of 1..n Screens, but is itself a DIV.
  • Supports pluggable window management of Screens.
    • North, East, South, West (NESW).
      • Drag and Drop docking capabilities.
    • Bootstrap Grid Views.
      • Separate design time and runtime.
    • Templates (ErraiUI or AngularJS).
      • Absolute control of Perspective content and layout.
  • Supports persistence of Perspective layout, should the user re-design it.
    • Only applicable to NESW and Bootstrap Grid views.
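The bus-based decoupling described above can be illustrated with a minimal, self-contained sketch. This is plain JavaScript showing the concept only; `Bus` and `Screen` here are hypothetical stand-ins for illustration, not the actual Errai or UberFire API:

```javascript
// Conceptual sketch: screens never invoke each other directly; all
// communication goes through a publish/subscribe bus.
// NOTE: Bus and Screen are hypothetical stand-ins, not the Errai/UberFire API.
function Bus() {
  this.handlers = {};
}
Bus.prototype.subscribe = function (topic, fn) {
  (this.handlers[topic] = this.handlers[topic] || []).push(fn);
};
Bus.prototype.publish = function (topic, msg) {
  (this.handlers[topic] || []).forEach(function (fn) { fn(msg); });
};

// A "screen" is conceptually a DIV plus optional life-cycle callbacks.
function Screen(id, bus) {
  this.id = id;
  this.events = [];
  var self = this;
  // Listen on the bus rather than holding a reference to any other screen.
  bus.subscribe('status', function (msg) { self.events.push(msg); });
}
Screen.prototype.onStart = function () { this.events.push('onStart'); };
Screen.prototype.onClose = function () { this.events.push('onClose'); };

var bus = new Bus();
var list = new Screen('task-list', bus);
var detail = new Screen('task-detail', bus);
list.onStart();
detail.onStart();
// The list screen announces a change without knowing who is listening.
bus.publish('status', 'task-selected');
console.log(detail.events); // → ["onStart", "task-selected"]
```

The point of the sketch is that the two screens never hold references to each other; one can be removed or replaced without breaking the other, which is what makes mixing GWT-built and AngularJS-built components in the same page tractable.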

A picture is worth a thousand words, so here is a screenshot of the Perspective Builder in action. Here it uses the Bootstrap Grid View layout manager. Within each grid cell is a Screen. The Perspective is saved and is then available from within the application. If the NESW layout manager is used, there is no separate design time; all dragging is done in place and persistence happens in the background after each change. Although it's not shown in the screenshot below, we also support both list (drop-list) and tab stacks for Screens.



Now back to what an AngularJS-first approach means. Seven different points were identified as necessary to demonstrate that this is possible.
  1. UF Screens and Perspectives should be available seamlessly as AngularJS Directives.
  2. Bower packaging for a pre-compiled UFJS. UFJS is the pre-compiled, client-only version of UF.
  3. UFJS can work standalone, from file:// for example. UFJS can optionally work with a UF WAR backend, allowing persistence of perspectives and of other state that UFJS might need to save, as well as access to our full range of provided services, like identity management.
  4. Support live refresh during development.
  5. Nested Controllers.
  6. Persistence and routing.
  7. Work with tools such as Yeoman, Grunt and Karma.
Eder has produced a number of examples that you can run yourself, demonstrating that all of these points have been solved. You can find the code here, along with the README to get you started. We did not provide videos for point 7, as I believe the videos for points 1 to 6 show that this would not be a problem.

Eder has also created several short videos running the examples, one for each of the use cases, and put them into a YouTube playlist. He has added text and callouts to make it clear what's going on:
AngularJS + UF PlayList
  1. Overview explaining what each video demonstrates (33s).
  2. AngularJS App + UFJS, client only, distribution using Bower. (2m30s).
    • Install and play with UFJS through Bower
    • Create a Native AngularJS App
    • Integrate this app with UFJS
      • Show UF Screen Directives
      • Show UF Perspective Directives
  3. AngularJS App + UFJS client and UF Server.
    • 1 of 2 (3m58s).
      • Download UF War
      • Install and run on EAP
      • Download and run our Angular demo on Apache
      • Show AngularJS Routes + UF Integration
    • 2 of 2 (4m06s).
      • Use UF to create Dynamic Screens and Perspectives
      • Encapsulate an AngularJS template in a UF Screen
      • Show an AngularJS App (inside a UF screen) nested in a parent controller.
        • Demonstrates multiple levels of controller nesting.
  4. KIE UF Workbench RAD environment with AngularJS component.
  5. Uberfire Editor working seamlessly as an Eclipse editor.
For completeness, the original videos showing the JSFiddle-inspired RAD environment, which demonstrate a UF-first polyglot environment, have been added to the playlist; see point 4 above.

Finally, just to show off, and because we can, we added a bonus video demonstrating a UF editor component running seamlessly in Eclipse. This demonstrates the power of our component model, which has been designed to allow our components to work standalone in any environment. We use Errai to intercept all the RPC calls and bridge them to Eclipse. Because the virtual file system our editors use is, like our other services, decoupled and abstracted, we can adapt it to Eclipse file I/O. For the end user the result is a seamless editor that appears native. This allows the development of components that can work on the web and in Eclipse, or even IntelliJ. We'll work on making this example public at a later date.

Here are some screenshots taken from the videos.

Finally to all those that said it couldn't be done!!!!


by Mark Proctor (noreply@blogger.com) at November 21, 2014 01:30 AM

November 18, 2014

Thomas Allweyer: Fraunhofer IESE Presents Its Second BPMS Study on December 10

Last year, the Fraunhofer Institute for Experimental Software Engineering (IESE) conducted its first detailed evaluation of Business Process Management Systems (BPMS), i.e. systems for process execution. Thanks to the strong response to that study, around twenty vendors took part in the second edition. Each of them was put thoroughly through its paces: every tool was analyzed in a full-day workshop, during which, among other things, a scenario specified by the study authors had to be demonstrated and then modified and extended on the spot.

I had the opportunity to attend some of these workshops as a guest, and they really did provide a comprehensive insight into each tool. Occasionally the vendors' representatives broke into a sweat when asked to implement one tricky requirement or another. Workshops of this kind are very labor-intensive, but they allow a much deeper analysis than the many other studies that rely solely on questionnaires filled out by the vendors themselves.

The study will be presented on December 10 in Kaiserslautern at an event where many of the participating vendors will demonstrate their systems on site. Attendance is free. Further information and registration at www.iese.fraunhofer.de/bpm2014

by Thomas Allweyer at November 18, 2014 08:06 AM

November 05, 2014

Sandy Kemsley: ActiveMatrix BPM at Citibank Brazil

Roberto Mercadante, SVP of operations and technology at Citibank Brazil, presented a session on their journey with AMX BPM. I also had a chance to talk to him yesterday about their projects, so have...

[Content summary only, click through for full article and links]

by sandy at November 05, 2014 11:31 PM

Sandy Kemsley: Event Analytics in Oil and Gas at TIBCONOW

Michael O’Connell, TIBCO’s chief data scientist, and Hayden Schultz, a TIBCO architect, discussed and demonstrated an event-handling example using remote sensor data with Spotfire and...

[Content summary only, click through for full article and links]

by sandy at November 05, 2014 10:15 PM

Sandy Kemsley: AMX BPM and Analytics at TIBCONOW

Nicolas Marzin, from the TIBCO BPM field group, presented a breakout session on the benefits of combining BPM and analytics — I’m not sure that anyone really needs to be convinced of the...

[Content summary only, click through for full article and links]

by sandy at November 05, 2014 07:44 PM

Sandy Kemsley: TIBCONOW ActiveMatrix BPM Roadmap

On Monday, we heard an update on the current state of AMX BPM from Roger King; today, he gave us more on the new release and future plans in his “BPM for Tomorrow” breakout session. He...

[Content summary only, click through for full article and links]

by sandy at November 05, 2014 06:50 PM

Sandy Kemsley: Case Management at TIBCONOW 2014

Yesterday, I attended the analyst sessions (which were mostly Q&A with Matt Quinn on the topics that he covered in the keynote), then was on the “Clash of the BPM Titans” panel, so...

[Content summary only, click through for full article and links]

by sandy at November 05, 2014 05:44 PM

Keith Swenson: Five Ways ‘Planning Before Doing’ can be Bad

Before you do something, plan it.  Figure out what you are going to do, and then do it.  If you failed to succeed, then you didn't plan well enough.  Next time, do better planning.  How many times have you heard these sayings from traditional scientific management?  They are so ingrained in our working behaviors, they seem beyond questioning.  But there are times when planning is a bad idea — this post describes five such situations.

Planning takes many forms, and when I talk about a plan, I mean any kind of way that you make a definition of what you or others are going to do in the future.  The plan has to be persistent, usually written, but you can also share a memorized plan if it is not too complicated.  Some plans are a detailed list of instructions.  Others might be a flow chart describing various options, and what one must do if certain possibilities materialize.  A plan is not simply a command to do something, but a description of what to do that is worked out in advance.  That is the key:  a plan is created at one time, but the actual action is done later.

Planning helps to coordinate the actions of people, and it is particularly important when those people cannot be constantly communicating.  If you want to meet someone at a coffee shop, you have to make a plan; otherwise you are unlikely to meet.  You want to build a skyscraper?  You need a very large, elaborate plan for that, or else the resulting building might be poorly built.  But there are some situations where a plan is a bad idea.

  • Volatility can make a plan worthless – All of the examples where a plan helps are ones where the patterns of behavior can be predicted accurately and are repeatable enough.  Imagine tomorrow's stock market prices.  To make a detailed plan today about what you are going to trade, and when you are going to trade it tomorrow, is pointless.  It is easy to see how stock prices are chaotic enough on a day-by-day basis, but the same principle applies to many other things if the time scale is increased.  Planning a lunch reservation for 6 months from now would have to be a very special occasion, and still would be highly tentative at best.  The famous Franco-Prussian War general Helmuth von Moltke said “no plan survives contact with the enemy”, which reflects this sentiment perfectly given the volatility of a war situation.
  • A plan can be more trouble than the benefit – Imagine ten carloads of people attending an ordinary evening event.  Nobody would bother making a plan for the exact places that each car will park, because there is no substantial benefit in making this plan.  Planning is never free; it always takes at least some time and effort.  If the benefit of the plan is not greater than the cost of planning, then the planning itself is a waste, and should be avoided.
  • Following the plan may be a distraction – There are some situations where creating a plan is a helpful exercise, but taking the plan literally is harmful.  Creating a plan can force a team to think through all the possible things that might happen, and to consider how one might respond.  As an exercise in preparedness this can be a benefit, but it is important for the team to remember that the specific plan may contain details that are not justified and do not properly account for how the reality has unfolded.  The team needs to know to ignore those details in the plan.  This is the idea behind General Eisenhower's statement that “plans are useless, but planning is indispensable.”
  • A plan can give false confidence – Just because someone threw together a plan does not mean that it is a good plan.  Without a plan, people will be searching for possibilities.  Those same people may stop looking for solutions if they have been convinced that a good planning job has already been done.
  • A plan made too early can lead you down the wrong path – A plan that is made too early might be based on misunderstood conditions.  This naïve plan is actually harmful because it locks the team into a particular direction at a time when the real value of that direction is unclear.  Later, when the actual situation has cleared, it may be impossible to change to the better direction.  This is the result of some serious research by Danish economic geographer Bent Flyvbjerg.  He studied hundreds of public works projects, and found that those projects that waited longer before making a plan tended to do better.  He says: “Often there is ‘lock in’ or ‘capture’ of a certain project concept at an early stage, leaving analysis of alternatives weak or absent [from the plan].”

The conclusion from all this is that plans can be costly, so make sure you are getting an appropriate amount of benefit.  In some cases, you are actually better off not making any plan, and simply figuring it out as you go.  In cases where a plan does make sense, there can be a right time to make it.  A plan made too late might not have enough effect on the work already in progress, while a plan made too early can be flawed in ways that prevent you from finding the right route.  A Late-Structured Process is a strong aspect of case management, where the case manager is allowed to plan after the work is started.

So don’t think that a full and complete plan is the obvious and indisputable requirement for success in all projects.   Doing more planning does not necessarily make the project better.  This is not to say that one should never plan: a majority of cases need a plan, but a simple plan may be more effective than an elaborate one.  The amount of planning should be appropriate to the need, and there is a right time.  Delaying planning to a time when you know more about the specifics may give you a better plan, allowing you to outperform those who plan too much and too early.


by kswenson at November 05, 2014 10:53 AM

Drools & JBPM: Red Hat JBoss BRMS and BPMS Workbench and Rich Client Technology

Last week I did a blog highlighting the recent R&D we are doing to make our web platform easily extensible and allow the development of a self service apps platform. The blog had two video links showing progress.

Today I gave a presentation that highlighted the wider scope of our UI efforts, demonstrating what we've done within the BRMS and BPMS platform and the flexibility and adaptability provided by our UI technology. It provides a great testimony to the power of GWT, Errai and UberFire, the three technologies driving all of this. We can't wait for the GWT 2.7 upgrade :)



As mentioned in the last blog, the UberFire website is just a placeholder and there is no release yet; we'll be working on that over xmas, along with documentation to make things more easily understandable and consumable for end users.

The presentation is now live on Slideshare and I've taken time to embed all the videos within it.
http://www.slideshare.net/MarkProctor/red-hat-jboss-brms-and-bpms-workbench-as-a-platform

The presentation is mostly self-explanatory screenshots with headers, with videos scattered throughout - many of which are previous videos you might have seen before. It provides a very good overview, using a scatter-gun approach, of what we have done, what we are doing and where we are going. Two of the videos, around the extensible workbench and the technical web IDE, have audio commentary.

It starts by showing the existing BRMS and BPMS stuff before moving on to the new extensibility efforts. Finally it covers a new demo we've done with the workbench reskinned for a more technical audience. It also shows ACE integration for Java and XML editing, as well as real-time provisioning and running of the Spring Pet Clinic application within the workbench.

by Mark Proctor (noreply@blogger.com) at November 05, 2014 03:14 AM

November 04, 2014

Sandy Kemsley: TIBCONOW 2014 Day 2 Keynote: Product Direction

Yesterday’s keynote was less about TIBCO products and customers, and more about discussions with industry thought leaders about disruptive innovation. This morning’s keynote continued...

[Content summary only, click through for full article and links]

by sandy at November 04, 2014 06:50 PM

Sandy Kemsley: Spotfire Content Analytics At TIBCONOW

(This session was from late yesterday afternoon, but I didn’t remember to post until this morning. Oops.) Update: the speakers were Thomas Blomberg from TIBCO and Rik Tamm-Daniels from Attivio....

[Content summary only, click through for full article and links]

by sandy at November 04, 2014 04:59 PM

Thomas Allweyer: What Do BPM Tools Offer for Monitoring Business Processes?

This study is one of three in-depth studies on BPM tools that the Stuttgart-based Fraunhofer Institute IAO conducted this year. Of the 28 vendors represented in the preceding market overview, only five took part in the study on "monitoring business processes". That is quite surprising, since monitoring and analyzing what actually happens in processes holds considerable potential: problems can be detected early, workloads optimized, and improvement opportunities uncovered. For BPMS vendors with a process engine in particular, the analysis capabilities opened up by process automation are an important selling point.

The authors divide monitoring functionality into three areas: data collection, analysis, and presentation. The topic spans the entire process management lifecycle: process mining algorithms can be used for process identification; during process modeling, KPIs and their target values must be defined and their collection planned; during process execution, data from the individual process instances is collected and then aggregated into KPIs, which are presented and analyzed in dashboards as part of process monitoring.
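The aggregation step described here - rolling raw process instance data up into a KPI for a dashboard - can be sketched as follows. This is a purely illustrative example; the instance record fields and the cycle-time KPI are invented, not taken from any of the tools reviewed:

```javascript
// Hypothetical sketch: aggregate per-instance execution data into one KPI.
// Each instance record carries startedAt/endedAt as millisecond timestamps.
function cycleTimeKpi(instances) {
  // Cycle time per instance in hours.
  const hours = instances.map(
    (i) => (i.endedAt - i.startedAt) / (1000 * 60 * 60)
  );
  // Aggregate: average cycle time across all instances.
  const avg = hours.reduce((a, b) => a + b, 0) / hours.length;
  return { count: instances.length, avgCycleTimeHours: avg };
}
```

A dashboard would then render this aggregate (and compare it against the target value defined during modeling) rather than the raw instance records.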

Two of the tools covered include a process engine and offer functions for monitoring process instances in real time. Timely analysis is especially important for customer-facing processes, ticket processes in the support area, and compliance-relevant KPIs. System availability for core processes should also be monitored in real time.

Two further tools have their focus in process modeling. Their approach is to extract data from various third-party systems, relate it to the business process models, and thereby enable process-oriented analyses. Finally, the study includes a process mining tool, for which processes are not defined up front. Instead, the system reconstructs how the process instances actually ran from data in various application systems. Here, too, various KPIs can be computed and compared across different analyses.

So quite different concepts are in use, but there are also commonalities. All of the tools offer browser-based dashboards that are role-based and individually personalizable; content and layout can be adapted through graphical modeling or configuration. All of them likewise allow users to define their own KPIs and analyses in addition to predefined ones, to aggregate and filter KPIs, and to create various statistics. For presentation, process models are sometimes used alongside the usual bar charts and the like: selected KPIs are displayed directly at the relevant points in the diagram and, for example, highlighted in color.

Even though a study with only five vendors falls far short of covering the market, it does make the essential facets and possibilities of process monitoring clear.


Falko Kötter, Monika Kochanowski, Thomas Renner:
Business Process Management Tools 2014 – Überwachung von Geschäftsprozessen.
Fraunhofer Verlag 2014.
Further information and ordering from the IAO

by Thomas Allweyer at November 04, 2014 09:10 AM

Sandy Kemsley: BPM For Today At TIBCONOW

Roger King, who heads up TIBCO’s BPM product strategy, gave us an update on ActiveMatrix BPM, and some of the iProcess to AMX BPM tooling (there is a separate session on this tomorrow that I...

[Content summary only, click through for full article and links]

by sandy at November 04, 2014 12:26 AM

November 03, 2014

Sandy Kemsley: BPM COE at TIBCONOW 2014

Raisa Mahomed of TIBCO presented a breakout session on best practices for building a BPM center of excellence. She started with a description of different types of COEs based on Forrester’s...

[Content summary only, click through for full article and links]

by sandy at November 03, 2014 10:17 PM

Sandy Kemsley: TIBCONOW 2014 Opening Keynote: @Gladwell and More. Much More.

San Francisco! Finally, a large vendor figured out that they really can do a 2,500-person conference here rather than Las Vegas, it just means that attendees are spread out in a number of local...

[Content summary only, click through for full article and links]

by sandy at November 03, 2014 08:12 PM

November 01, 2014

Tom Debevoise: By the book: How DMN is connected to BPMN

Throughout our new book 'The Microguide to Process and Decision Modeling in DMN and BPMN' we treat DMN as an integral notation for process modeling in BPMN. Even though decision modeling is a separate domain within the OMG, the DMN spec provides an explicit way to connect to processes in BPMN. DMN provides a schema model in XML format that includes two connection points. First, there is an explicit list that denotes the processes and tasks that use the decisions. Next, DMN provides input and output data types that implicitly correspond to the rule activity that invokes the knowledge bases of the decision.

In table 7 of the proposed Decision Model and Notation (DMN) Specification, the class model for the decision defines the BPMN processes and tasks that require the decision to be made (usingProcesses and usingTasks).

Appendix B of the DMN specification says:

“The interface to the decision service will consist of:

  • Input: a list of contexts, providing instances of all the Input Data required by the encapsulated decisions
  • Output: a context, providing (at least) the results of evaluating all the decisions in the minimal output set, using the provided instance data.

When the service is called, providing the input, it returns the output.

In its simplest form a decision service would always evaluate all decisions in the encapsulation set and return all the results.”
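Read literally, the contract quoted above is just a function from an input context to an output context: the service evaluates all encapsulated decisions and returns all the results. The sketch below is a hypothetical stand-in for that contract; the discount rule and all names are invented for illustration, not taken from the DMN specification or any BRMS:

```javascript
// Hypothetical sketch of a decision service per the Appendix B contract:
// input is a context providing all required Input Data; output is a context
// providing the results of every decision in the encapsulated set.
function decisionService(inputContext) {
  const { customer } = inputContext;
  // Stand-in for an encapsulated DMN decision (e.g. a decision table):
  // loyal customers get a larger discount. Purely invented logic.
  const customerDiscount = customer.loyaltyYears >= 5 ? 0.1 : 0.02;
  // In its simplest form, all decisions are evaluated and all results returned.
  return { customerDiscount };
}
```

In the customer-discount example discussed below, the input context would carry the customer and the output context the customer discount.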

Here we are assuming that the decision is created by business rules from input processes and accessed through a decision service. Other decisions can be manual or can require user input. The business rule task shape can denote the place within the process model that calls up a DMN model with the needed input and obtains the decision output.

As seen in the figure below, most DMN decision modelers utilize the rule shape to denote a connection to DMN. The inputs of a rule task are processed by the logic defined in the DMN model and then output for use in downstream gateways, participants, events and activities.

The customer is the input for the decision and the output is the customer discount. The process in the diagram can be made explicit according to the execution semantics. The figure shows the usage of the message shape. The association lines (dotted) are used to create the relationship between the message and the data type that is used in the process schema. When a decision for a customer discount is requested, a customer message is sent to the decision service. This is an initiating message, so the envelope is white. After the decision is completed, the BRMS returns the customer discount. The message is shown as a non-initiating message with light shading.

To summarize: the DMN spec explicitly defines how a BPMN process is connected to the decision through the usingProcesses and usingTasks metadata for the decision shape. The input and output are attributes of the decision and are created by expressions and decision tables (more on that later).

by Tom Debevoise at November 01, 2014 06:09 PM

October 30, 2014

Drools & JBPM: The Drools and jBPM KIE Apps Framework

With the Drools and jBPM (KIE) 6 series came a new workbench, with the promise of eventual end user extensibility. I finally have some teaser videos to show this working and what's in store. Make sure you select 1080p and go full screen to see them at their best.

What you see in these videos is the same workbench available on the Drools videos page. Once this stuff is released you'll be able to extend an existing Drools or jBPM (KIE) installation, or make a new one from scratch that doesn't have Drools or jBPM in it - i.e. the workbench and its extension stuff is available standalone, and you get to choose which plugins you do or don't want.

Here is a demo showing the new Bootstrap dynamic grid view builder used to build a perspective, which now doubles as an app. It uses the new RAD, JSFiddle-inspired, environment to author a simple AngularJS plugin extension. This all writes to a Git backend, so you could author these with IntelliJ or Eclipse and just push them back into the Git repo. It then demonstrates the creation of a dynamic menu and registers our app there, and also demonstrates the new app directory. Apps are given labels and can then be discovered in the apps directory - instead of, or as well as, top menu entries. Over 2015 we'll be building a case management system which will complement this perfectly as the domain front end - all creating a fantastic Self Service Software platform.
http://youtu.be/KoJ5A5g7y4E
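The label-based app discovery described above can be sketched in a few lines. To be clear, this is not the UberFire API - every name below is invented purely to illustrate the idea of registering apps under labels so they can be found in a directory instead of, or as well as, a top menu:

```javascript
// Illustrative-only sketch of an app directory: apps register under a
// human-readable label, and the directory is searched by label text.
class AppDirectory {
  constructor() {
    this.apps = new Map(); // label -> perspective/app id (hypothetical)
  }
  register(label, perspectiveId) {
    this.apps.set(label, perspectiveId);
  }
  // Discover apps whose label contains the given text.
  find(text) {
    return [...this.apps.keys()].filter((label) => label.includes(text));
  }
}
```

A dynamic menu entry and a directory listing could then both be generated from the same registration, which is the point of labeling apps rather than hard-wiring menu items.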

Here is a slightly early video showing our app builder working with DashBuilder:
http://youtu.be/Yhg31m4kRsM

Other components such as our Human Tasks and Forms will be available too. We also have some cool infrastructure coming for event publication and capture and timeline reporting, so you can visualise social activity within your organization - you'll be able to place the timeline components you see in this blog on your app pages:
http://blog.athico.com/2014/09/activity-insight-coming-in-drools-jbpm.html

All this is driven by our new project UberFire, which provides the workbench infrastructure for all of this. The project is not yet announced or released, but will be soon - the website is currently just a placeholder; we'll blog as soon as there is something to see :)

by Mark Proctor (noreply@blogger.com) at October 30, 2014 07:35 PM

October 29, 2014

October 28, 2014

Sandy Kemsley: SAP’s Bigger Picture: The AppDev Play

Although I attended some sessions related to BPM and operational process intelligence, last week’s trip to SAP TechEd && d-code 2014 gave me a bit more breathing room to look at the bigger...

[Content summary only, click through for full article and links]

by sandy at October 28, 2014 12:55 PM

October 27, 2014

BPinPM.net: Want to be one step ahead in BPM? Get one of our very last conference tickets for free!

We are proud to announce that we have secured Dr. Bernhard Krusche from Stiftung Nächste Gesellschaft (Foundation Next Society) as keynote speaker for this year’s BPinPM.net Conference! :-)

Bernhard Krusche is an anthropologist, author, and explorer of the Next Society, and he will help us keep the pace and continue to be one step ahead!

Last year we started to tackle the Digital Age and ran workshops to dive into Digital Age BPM - the results will be presented at the conference.

But this year’s keynote is designed to go even further in this direction by challenging BPM with insights from the Next Society: Connect, Co-Create, Collaborate!

For more information about the Next Society, have a look at: http://x-society.net

If you have not registered yet, please don’t hesitate to sign up for the conference. Only the very last 5 tickets are still available!

And: you won’t believe it, but we are going to give away one of these very last tickets for free! Even if you are already holding a ticket, join the lottery now and spread the word - we will refund your ticket if you win.

Enter the lottery now… …and – to be on the safe side – buy your conference ticket right away!

We are looking forward to welcoming you in Seeheim!

Best regards,
Mirko

by Mirko Kloppenburg at October 27, 2014 08:31 PM

Sandy Kemsley: What’s New With SAP Operational Process Intelligence

Just finishing up some notes from my trip to SAP TechEd && d-code last week with the latest on their Operational Process Intelligence product, which can pull events and data from multiple...

[Content summary only, click through for full article and links]

by sandy at October 27, 2014 01:14 PM

October 24, 2014

Thomas Allweyer: A Process Management Guide with Practical Working Aids

Füermann views process management as a long-term organizational principle that cannot be introduced as a single project but requires a comprehensive program. He divides this program into four phases, which also structure the book. First, however, the fundamentals of process organization are covered. Of the various possible organizational forms, the asymmetric matrix organization is highlighted: it resolves the tension between process and function through a matrix, but gives precedence to the processes. Organizational units not directly involved in the customer-facing processes are set up as internal service providers, or they second specialists into the core processes.

In the "infrastructure" phase, the various process management roles are staffed and the program is planned. Processes are also identified and depicted in a process map. Next comes the "description" phase, in which the process details are specified, documented as flow diagrams, and made binding. In the "steering" phase, indicators for measuring processes are defined, interface agreements are made, process audits are conducted, and corrective actions are initiated. The "Hoshin Kanri" method provides a framework for planning goals and measures. Finally comes the "improvement" phase: the principle of continuous improvement ensures the ongoing evolution of the processes, and Six Sigma and process re-engineering are described as approaches for larger changes.

Each phase comes with a set of working aids, mostly Excel files, which can also be downloaded from the book's website. They include templates for program plans, for SIPOC charts (Supplier – Input – Process – Output – Customer), and for concise process reports in A3 format. The use of each of these aids is explained in detail in the book.

The book focuses purely on organizational questions, so all IT aspects are left out. Whether that is still sensible, given how thoroughly IT now pervades all processes, is at least debatable. Some of the working aids presented should also not be used in serious process management initiatives without appropriate software support; purely Excel-based process modeling, for example, quickly becomes unmanageable. And using program flowcharts for the graphical representation instead of widespread notations such as BPMN is not exactly state of the art.

While the book contains nothing entirely new, the individual topics are presented clearly and comprehensibly, and many of the working aids provided can be very useful in the practical work of process consultants and managers.


Timo Füermann:
Prozessmanagement – Kompaktes Wissen, Konkrete Umsetzung, Praktische Arbeitshilfen.
Hanser 2014.
The book on Amazon.

by Thomas Allweyer at October 24, 2014 09:27 AM

Drools & JBPM: Red Hat Job Opening - Software Sustaining Engineer

We are looking to hire someone to help improve the quality of BRMS and BPMS platforms. These are the productised versions of the Drools and jBPM open source projects.

The role will involve improving our test coverage, diagnosing problems, and creating reproducers for problems as well as helping fix them. You’ll also be responsible for helping to set up and maintain our continuous integration environment, to help streamline the various aspects involved in getting timely, high-quality releases out.

So if you love Drools and jBPM, and want to help make them even better and even more robust - then this is the job for you :)

The role is remote, so you can be based almost anywhere.

URL to apply now http://jobs.redhat.com/jobs/descriptions/software-engineer-brno-jihomoravsky-kraj-czech-republic-job-1-4759718

Mark

by Mark Proctor (noreply@blogger.com) at October 24, 2014 05:10 AM

October 21, 2014

Sandy Kemsley: SAP TechEd Keynote with @_bgoerke

I spent yesterday getting to Las Vegas for SAP TechEd && d-code and missed last night’s keynote with Steve Lucas, but up this morning to watch Björn Goerke — head of SAP Product...

[Content summary only, click through for full article and links]

by sandy at October 21, 2014 05:44 PM

October 20, 2014

Thomas Allweyer: Which Project Management Practices Really Drive Success?

Although numerous project management approaches and methods exist, there has been little systematic research into which factors actually determine project success. The BPM lab at Koblenz University of Applied Sciences, together with the Deutsche Gesellschaft für Projektmanagement (German Project Management Association), is therefore conducting the study "Success Factors in Project Management". Everyone with practical project experience is invited to take part in the online survey. Using structured questions, it captures the context, type, and practices used for one successful and one less successful project per respondent. In addition to the study report, participants receive special analyses and key findings for their own industry, and can win a place in a workshop on agile project management.

The survey is open until November 26 at www.erfolgsfaktoren-projektmanagement.de.

by Thomas Allweyer at October 20, 2014 08:30 AM

October 16, 2014

Tom Debevoise: New Book: The Microguide to Process and Decision Modeling in BPMN/DMN

 

The Microguide to Process and Decision Modeling in BPMN/DMN is now available on Amazon.  A little bit about the book: the landscape of process modeling has evolved, as have the best practices. The smartest companies are using decision modeling in combination with process modeling. The principal reason is that decisions and processes are discovered and managed in separate, yet interrelated, ways.

Decision Model and Notation (DMN) is an evolution of Business Process Model and Notation (BPMN) 2.0 into an even more powerful and capable tool set, and the Microguide covers both specifications. It also focuses on best practices in decision and process modeling; a number of these have emerged, creating robust, agile, and traceable solutions. Decision management and decision modeling are critical, allowing for simpler, smarter, and more agile processes.

A simple decision and gateway control of an execution path to respond to a purchasing decision.

As the figure above shows, the proper use of decision modeling uncovers critical issues that the process must address to comply with the decision. Decision-driven processes act on the directives of decision logic: decision outputs affect the sequence of things that happen, the paths taken, and who should perform the work. Processes provide critical input into decisions, including data for validation and identification of events or process-relevant conditions. The combination of process and decision modeling is a powerful one.

In most business processes, an operational decision is the controlling factor driving processes. This is powerful, as many governments and enterprises focus on minimizing the event response lag because there is often a financial benefit to faster responses. Straight-through processing and automated decision making, not just automated processes, is also emphasizing the importance of decisions in processes. Developing a decision model in DMN provides a detailed, standardized approach that precisely directs the process and creates a new level of traceability.

Decision modeling can therefore be considered an organizing principle for designing many business processes. Most process modeling in BPMN is accomplished by matching a use case, written or otherwise, with workflow patterns. Process modeling is critical to the creation of a robust and sustainable solution. Without decision modeling, however, such an approach can result in decision logic becoming a sequence of gateways and conditions such that the decision remains hidden and scattered among the process steps.

Without decision modeling, critical decisions, such as how to source a requisition when financial or counter-party risk is unacceptable, or what to offer a customer, are lost to the details of the process. When the time comes to change or improve a decision, a process model in BPMN alone might not meet the need. Providing a notation for modeling decisions separately from processes is the objective of DMN.
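To make the separation concrete, here is a minimal Java sketch of decision logic kept apart from the process that consumes its output. The names and thresholds (SourcingDecision, the 10000 limit) are purely hypothetical and are not taken from the book or from the DMN specification.

```java
// Hypothetical sketch only: an operational decision modeled separately
// from the process flow. Names and thresholds are illustrative.
public class SourcingDecision {

    // The decision: inputs in, one decision output out, gathered in a
    // single place, as a DMN decision would be.
    public static String decide(double amount, boolean riskAcceptable) {
        if (!riskAcceptable) return "REJECT";
        if (amount > 10000.0) return "MANUAL_APPROVAL";
        return "AUTO_SOURCE";
    }

    // The process only branches on the decision output; the rules are
    // not scattered across a chain of gateways.
    public static String nextStep(String decision) {
        switch (decision) {
            case "REJECT":          return "notify requester";
            case "MANUAL_APPROVAL": return "route to purchasing manager";
            default:                return "create purchase order";
        }
    }

    public static void main(String[] args) {
        // prints "route to purchasing manager"
        System.out.println(nextStep(decide(15000.0, true)));
    }
}
```

Changing the sourcing rules now means editing only the decision, while the process keeps branching on its single output, which is the traceability benefit described above.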

by Tom Debevoise at October 16, 2014 09:19 PM

October 15, 2014

Sandy Kemsley: AIIM Information Chaos Rescue Mission – Toronto Edition

AIIM is holding a series of ECM-related seminars across North America, and since today’s is practically in my back yard, I decided to check it out. It’s a free seminar so heavily...

[Content summary only, click through for full article and links]

by sandy at October 15, 2014 05:34 PM

Bruce Silver: BPMN Explained – Part 2

Yesterday I tried to explain BPMN to those who don’t know what it is.  OK, they are probably saying, if BPMN is so great, why do I hear these complaints about it?  Yes, that’s a good question.

First, you need to understand exactly who is complaining.  If it’s a legacy tool vendor wedded to their proprietary (“much better!”) notation, well that speaks for itself.  Ditto if it’s a gray-haired process improvement consultant whose idea of a modern tool is a whiteboard that prints.  Which is most of them.  But even if you cross those guys off the list, there are normal end users who complain about it.

One complaint is there are too many shapes and symbols.  Actually, there are only three primary shapes, called flow nodes: activity, the rounded rectangle, denoting an action step in the process; gateway, the diamond, denoting conditional branching and merging in the flow; and event, the circle, denoting either the start or end of a process or subprocess, or possibly the process’s reaction to a signal that something happened.  Just three, much fewer than a legacy flowcharting notation.  In BPMN, the solid arrow, called sequence flow, must connect at both head and tail to one of these three shape types.

The problem is that the detailed behavior of the flow nodes is actually determined by their icons, markers, and border styles.  There are way too many of those, I will readily admit.  Only a small fraction of them are widely used and important to know; the rest you can simply ignore.  When I started my BPMN training many years ago, I identified a basic working set of shapes and symbols called the Level 1 palette, mostly carried over from traditional flowcharting.  The purpose was to eliminate the need to learn useless BPMN vocabulary that would never be used.  When BPMN 2.0 came out 4 years ago, they did a similar thing, officially this time, but for a different purpose.  The so-called Descriptive process modeling conformance class is essentially the Level 1 working set.  Its purpose, from OMG’s standpoint, was to limit the set of shapes and symbols a tool vendor must support in order to claim BPMN support.  So… if you are new to BPMN, just stick to the Level 1/Descriptive working set.  It will handle most everything you are trying to show, and good BPMN tools in fact let you restrict the palette to just those elements.
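As a rough illustration of how small the core vocabulary is, the rule that sequence flow must connect flow nodes at both ends can be sketched as a validation check. The element names below are my own simplification, not the BPMN metamodel.

```java
import java.util.EnumSet;
import java.util.Set;

// Simplified illustration: a sequence flow is valid only when both ends
// attach to one of the three flow-node types: activity, gateway, or event.
public class SequenceFlowRule {

    enum Element { ACTIVITY, GATEWAY, EVENT, DATA_OBJECT, TEXT_ANNOTATION }

    static final Set<Element> FLOW_NODES =
            EnumSet.of(Element.ACTIVITY, Element.GATEWAY, Element.EVENT);

    static boolean validSequenceFlow(Element source, Element target) {
        return FLOW_NODES.contains(source) && FLOW_NODES.contains(target);
    }

    public static void main(String[] args) {
        System.out.println(validSequenceFlow(Element.ACTIVITY, Element.GATEWAY));     // true
        System.out.println(validSequenceFlow(Element.ACTIVITY, Element.DATA_OBJECT)); // false
    }
}
```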

I sometimes hear the opposite complaint, that BPMN does not have a standard way to visualize important information, like systems, organizational units, task completion times, or resource costs, available in their current process modeling tool.  Actually, many BPMN tools do have ways to include these things, but each in their own tool-specific way.  BPMN just describes the process logic, that is, how the process starts and ends and the order of the steps.  It doesn’t describe the internal details of a task, like its data or user interface, or decision logic, or systems involved, or important simulation parameters.  Its scope is quite limited.  There are some emerging standards for those other things that will eventually link up with BPMN, but they are not yet widely adopted.  Anyway, it’s important to distinguish the information a BPMN tool can support from information that is part of BPMN itself.

Finally, some people don’t like the fact that BPMN has rules.  A tool validating models against those rules might determine, for instance, that the way you’ve been modeling something for years is invalid in BPMN.  You can ignore that, of course, but remember the goal of BPMN is clear communication of the process logic.  A diagram that violates the rules of the specification probably does not do that very well.  Like any new language, BPMN asks that you take a little time to learn it.  It’s actually not that hard.

The post BPMN Explained – Part 2 appeared first on Business Process Watch.

by bruce at October 15, 2014 05:22 PM

October 14, 2014

Bruce Silver: BPMN Explained

On Twitter someone posted to me: “Have you ever seen a short overview of BPMN that makes sense to people who have never heard of it?”  Hmmm… Probably not.  So here is my attempt.

Business Process Modeling Notation, or BPMN, is a process diagramming language.  It describes, in a picture, the steps in a business process from start to end, an essential starting point whether you are simply documenting the process, analyzing it for possible improvement, or defining business requirements for an IT solution to a process problem. Dozens of process diagramming languages have existed since the 1980s at least, so what’s so special about BPMN?

First, BPMN is an open industry standard, under the auspices of the Object Management Group.  It is not owned by a particular tool or consulting company.  A wide variety of tools support it, and the meaning of the business process diagram is independent of the tool used to create it. With BPMN you don’t need to standardize on a single tool for everyone in the organization, since they all share a common process modeling language.

Second, unlike flowcharts created in a tool like Visio or Powerpoint, the meaning of each BPMN shape and symbol is quite precise – it’s defined in a specification – and in principle independent of the personal interpretation of the person who drew it.  I say “in principle” because it is possible to violate the rules of the BPMN specification, just like it is possible to write an English sentence that violates accepted rules of grammar or spelling.  Nothing drastic happens in that case, but the diagram’s effectiveness at communication is decreased.

Third, BPMN is a language shared by business and IT, the first process modeling language able to make that claim.  When BPMN was first developed about 10 years ago, the only available process modeling standards at that time – UML activity diagrams and IDEF, among others – were rejected as “IT standards” that would not be accepted by business users.  To business users, a process diagram looked like a swimlane flowchart, widely used by BPM practitioners but lacking precise definition in a specification.  BPMN adopted the basic look and feel of a swimlane flowchart, and added to it the precision and expressiveness required by IT.  In fact, that precision and expressiveness is sufficient to drive a process automation engine in a BPM Suite (BPMS).  The fact that the visual language used by the business to describe a proposed To-Be process is the same as the language used by developers to build that process in a BPMS has opened up a new era of business-empowered process solutions in which business and IT collaborate closely throughout a faster and more agile process improvement cycle.

Even if you have no intention to create an automated process solution in a BPMS, BPMN diagrams can reveal information critical to process documentation and analysis that is missing in traditional swimlane flowcharts: exactly how the process starts and ends, what each instance of the process represents, how various exceptions are handled, and the interactions between the process and the customer, external service providers, and other processes.  The rules of the BPMN specification do not require these elements, but use of best-practice modeling conventions in conjunction with a structured methodology can ensure they are included.  My book BPMN Method and Style and my BPMessentials training of the same name are based on such an approach.

So yes, there is a cost to adopting BPMN, whether you are moving from casual tooling like Powerpoint or Visio flowcharts or from a powerful but proprietary language like ARIS EPC.  There is a new diagram vocabulary to learn, diagramming rules, as well as the aforementioned conventions and methodology such as Method and Style.  But the benefits of speaking a common process language are tremendous.  The investment in process discovery and analysis is far more than the cost of a tool or the time required to draw the diagrams.  It involves hundreds of man-hours of meetings, information gathering from stakeholders, workshops, and presentations to management.  The process diagram is a distillation of all that time and effort.  If it cannot be shared across the whole project team – business and IT – or to other project teams across the enterprise, now or in the future, you are throwing away much of that investment.  BPMN provides a way to share it, without requiring everyone to standardize on a single tool.

The post BPMN Explained appeared first on Business Process Watch.

by bruce at October 14, 2014 05:24 PM

Thomas Allweyer: Social BPM Capabilities of Process Management Tools

The term "Social BPM" is not easy to pin down. Given the spread of social networks and social software in the enterprise, it seems obvious that they could also be very useful for business process management, especially since defining and executing processes almost always requires several participants to collaborate successfully. But what concrete uses are there for news feeds, contacts, comment functions, wikis, and the like in process management, and what benefits do they bring?

The authors of the study first work out the existing potential in the different phases of the process management life cycle. In the process identification and modeling phase, Social BPM offers the advantage that many participants can contribute actively: several people can work on a model together, be notified of changes, and leave comments. In process implementation and execution, targeted information on role-based process portals helps; workflow tasks can be fed into internal social networks, and workflows can be started by social media events. In process monitoring and continuous improvement, problems can be communicated quickly to everyone affected, and virtual communities can serve to evolve processes further.

The ten BPM software vendors that took part in this focused study are mostly makers of modeling tools. Although almost all state that they support the execution of at least approval workflows and the like, the clear emphasis for most is on modeling and analysis. Accordingly, the Social BPM functions on offer relate mainly to the process identification and modeling phase. While the majority of vendors only began building dedicated social software functionality into their products around 2010, many have long offered collaboration features, such as central repositories for distributed modeling; the term "Collaborative BPM" is often used in this context.

Practically all of the products examined provide process portals, comment and rating functions, news feeds and subscriptions, and task management for the activities in the model life cycle. A few offer simplified modeling, for example via tabular views, which lets employees without modeling training capture their own processes. Integration of wikis and blogs is rather rare, although several vendors have announced wiki integration in their modeling platforms for the future.

Beyond these predefined categories, the vendors could name further social functions they provide. The answers range from voting functions through knowledge and idea management to career planning and case management. This wide range shows how diverse the field of Social BPM is.

In most cases the social functions are fully integrated into the process management tool. Several vendors instead, or additionally, offer integration with other platforms, above all Microsoft SharePoint.

One thing that is somewhat surprising when reading the study: the use of social functions during process execution barely appears. This is certainly due in large part to the field of participants, which contains hardly any BPMS vendors focused on execution. On the other hand, the preceding overview study did include a number of BPMS vendors among its 28 participants. One can speculate why hardly any of them took part in the Social BPM study: either they have little to offer in this area, or the topic of Social BPM is viewed almost exclusively through the lens of collaborative modeling. Yet the potential in process execution is considerably higher than in process modeling, since far more employees execute processes than model them.

Possibly, though, a strict separation is still being made between highly structured processes and collaborative tasks: when a BPMS is used for highly structured processes, social functions are left out, while for collaborative tasks internal social networks may be used, but independently of the BPMS. Since highly structured processes often require collaboration too, and since certain coordination tasks in collaborative work could be automated, a stronger integration would certainly make sense. The much-discussed concepts of Adaptive Case Management offer useful approaches here as well, but in that area, too, practical adoption still lags behind the discussion.


Jens Drawehn, Oliver Höß:
Business Process Management Tools 2014 – Social BPM.
Fraunhofer Verlag 2014.
More information and ordering from the IAO

by Thomas Allweyer at October 14, 2014 09:50 AM

Drools & JBPM: Decision Camp - 2014 - What you are missing out on

Here is the Decision Camp 2014 agenda, so you can see what you are missing out on if you aren't there :)

Tuesday

9:30 - 10:00 am
Registration
11 am - 12 pm
CTO Panel
Mark Proctor, Red Hat
Dr. Jacob Feldman, OpenRules
Carlos Serrano-Morales, Sparkling Logic
Moderated by James Taylor
12 - 1 pm
Lunch

General Sessions

We will host Best Practices sessions all day, presented by fellow practitioners or technology providers
General sessions will have break out tracks for rules writers and software architects
1 - 2 pm
An Intelligence Led Approach to Decision Management in Tax Administration
Dr. Marcia Gottgtroy, Inland Revenue New Zealand
Decision Tables as a Programming tool
Howard Rogers, RapidGen Software
Are Business Rules Obsolete?
Kenny Shi, UBER Technologies
2 - 3 pm
Customer Support Process Automation
Erwin De Ley, iSencia Belgium
Building Domain-Specific Decision Models
Dr. Jacob Feldman, OpenRules
4 - 5 pm
Explanation-based E-Learning for Business Decision Making and Education 
Benjamin Grosof & Janine Bloomfield, Coherent Knowledge Systems

Wednesday

9:30 - 10 am
Registration

Vertical Day
Healthcare

Davide Sottara is our chair for the Healthcare day 

Vertical Day
Financial Services

10 - 11 am
TBA
Dr. Davide Sottara, PhD
12 - 1 pm
Lunch & Networking
1 - 2 pm
Cloud-based CEP in Healthcare 
Mariano Nicolas De Maio, PlugTree
Analytics for Payment Fraud
Carlos Serrano-Morales, Sparkling Logic
4 - 5 pm
Speaker Panel
All Speakers

by Mark Proctor (noreply@blogger.com) at October 14, 2014 08:08 AM

Drools & JBPM: Classic Games Development with Drools

I realised I didn't upload my slides from Decision Camp 2013, where I talked about using games to learn rule-based programming. Sorry about that; here they are, better late than never:
Learning Rule Based Programming Using Games

The talk provides a gentle introduction into rule engines and then covers a number of different games, all of which are available to run from the drools examples project.
  • Number Guess
  • Adventure Game
  • Space Invaders
  • Pong
  • Wumpus World
While there is no video of the presentation I gave, I have made videos for some of these games in the past. Be aware, though, that some of them may be a little out of date compared to the versions in our current examples project.

by Mark Proctor (noreply@blogger.com) at October 14, 2014 02:33 AM

October 10, 2014

Drools & JBPM: 3 Days until Decision Camp 2014, San Jose (13-15 Oct)

Only 3 days to go until Decision Camp 2014 arrives, a free conference in the San Jose area for business rules and decision management practitioners. The conference runs three concurrent tracks. The full Decision Camp agenda can be found here.

Like last year, RuleML will be participating and presenting, with speakers such as Dr. Benjamin Grosof, which makes it a great opportunity to catch up on the latest happenings in the rules standards industry.

Last year I did a games talk, this year I'm doing something a little more technical, to reflect my current research. Here is my title and abstract.
Demystifying Truth Maintenance and Belief Systems
Basic Truth Maintenance is a common feature, available in many production rule systems, but one that is not generally well understood. This talk will start by introducing the mechanics of rule engines and how they are extended for the common TMS implementation. It will discuss the limitations of these systems and introduce Justification-based Truth Maintenance (JTMS) as a way to add contradictions that trigger retractions. This will lead on to Defeasible Logic, which, while it sounds complex, facilitates resolving conflicting rules, premises, and contradictions in a way that follows typical argumentation theory. Finally, we will demonstrate how the core of this can be abstracted to allow pluggable beliefs, so that JTMS and Defeasible Logic can be swapped in and out, along with other systems such as Bayesian belief systems.
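For readers new to the topic, the core JTMS idea, that a fact remains asserted only while at least one justification supports it, can be sketched in a few lines of Java. This is a deliberately naive model of the concept, not the Drools implementation.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Naive sketch of justification-based truth maintenance: each fact
// tracks the set of rules currently justifying it, and is retracted
// automatically once the last justification is removed.
public class SimpleJtms {
    private final Map<String, Set<String>> justifications = new HashMap<>();

    public void justify(String fact, String rule) {
        justifications.computeIfAbsent(fact, k -> new HashSet<>()).add(rule);
    }

    // Removing a justification retracts the fact once no support remains.
    public void unjustify(String fact, String rule) {
        Set<String> support = justifications.get(fact);
        if (support != null) {
            support.remove(rule);
            if (support.isEmpty()) justifications.remove(fact);
        }
    }

    public boolean holds(String fact) {
        return justifications.containsKey(fact);
    }

    public static void main(String[] args) {
        SimpleJtms tms = new SimpleJtms();
        tms.justify("customer is gold", "rule A");
        tms.justify("customer is gold", "rule B");
        tms.unjustify("customer is gold", "rule A");
        System.out.println(tms.holds("customer is gold")); // true, rule B still supports it
        tms.unjustify("customer is gold", "rule B");
        System.out.println(tms.holds("customer is gold")); // false, no support left
    }
}
```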




by Mark Proctor (noreply@blogger.com) at October 10, 2014 05:11 PM

Thomas Allweyer: BizDevs – the New Form of Collaboration between Business and IT?

The English edition of Volker Stiehl's book on process-driven applications has recently become available. The author's remarks in the SAP Community Network, where he describes the background and approaches of the book, are worth reading. He separates an application into a business process layer and an implementation layer, both described with BPMN. The processes of both layers are executed and interact with each other, with business and IT sharing equal responsibility for the executable model of the business process layer.

In his blog post Stiehl coins the term "BizDevs" for the close cooperation between business and developers that the development of process-driven applications requires, echoing the now widespread term "DevOps" for the close integration of software development and operations. Such closer collaboration would certainly be desirable, and the architectural approach presented in the book could point the way for many BPMS-based applications.


Stiehl, Volker:
Process-Driven Applications with BPMN
Springer 2014
The book at amazon

Review of the German edition

by Thomas Allweyer at October 10, 2014 07:52 AM

October 06, 2014

Thomas Allweyer: More Flexible BPM through Graph Databases

Guest post by Helmut Heptner. We are witnessing a transition from traditional business process management systems based on relational databases, which even after redesign and reimplementation would not be up to the demands of the "Big Data" era, to systems based on graph databases. The pioneers of this development are popular social network providers such as Facebook, Google+ and Twitter, to name just a few. What they all have in common are large user numbers and a vast number of relationships between users, which can nevertheless be combined on demand into the most varied analyses within seconds.

What is a graph database, and how does it differ from classical databases?

Put simply, a relational database is a collection of tables (the relations) whose rows store records. Because of growing data volumes and the ever-increasing number of existing and potential relationships between data items, this model is not well suited for many areas, above all as the basis for business process management systems (BPMS). Computations take longer the larger the data volume and the more complex the relationships between the data.

Today's requirements are better met by graph databases. So-called NoSQL technologies are rapidly gaining popularity. The best-known and largest provider of this technology is probably Neo Technology with Neo4j (http://www.neo4j.org), an open-source graph database implemented in Java. The developers themselves describe Neo4j on their website as a transactional database engine that stores data in graphs instead of tables. For a practical introduction to the theory and practice of graph databases, neo4j.org, with its information and videos, is a helpful starting point.

The best-known application of graph databases is Facebook's "social graph", which captures relationships between people. The nodes typical of a graph represent people, with each node carrying the person's name. The edges (the second element of graph databases) represent relationships, characterized among other things by a type such as "likes", "is friends with", or "dislikes". Simple examples of such graphs are family trees, with family members as nodes and parent-child relationships as edges; public transport route maps; IT network topologies; or indeed process flows in BPM.
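The node-and-edge structure described above maps naturally onto code. Here is a minimal Java sketch (hypothetical names, no particular database product) of process steps as nodes connected by typed relationship edges:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: process steps as graph nodes, typed transitions
// as edges, the shape a graph database stores directly instead of rows
// and join tables.
public class ProcessGraph {

    static class Node {
        final String name;
        final List<Edge> outgoing = new ArrayList<>();
        Node(String name) { this.name = name; }
    }

    static class Edge {
        final String type;   // e.g. "NEXT", "ESCALATES_TO"
        final Node target;
        Edge(String type, Node target) { this.type = type; this.target = target; }
    }

    static void connect(Node from, String type, Node to) {
        from.outgoing.add(new Edge(type, to));
    }

    public static void main(String[] args) {
        Node submit = new Node("Submit request");
        Node review = new Node("Review");
        Node approve = new Node("Approve");
        connect(submit, "NEXT", review);
        connect(review, "NEXT", approve);
        connect(review, "ESCALATES_TO", new Node("Manager review"));
        System.out.println(review.outgoing.size()); // 2
    }
}
```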

Emil Eifrem, CEO of Neo Technology, puts it this way: "The world's most innovative companies, including Google, Facebook, Twitter, Adobe and American Express, have already switched to graph technologies to tackle the challenges of complex data at their core."

When it comes to large ("Big Data"), distributed, and unstructured data volumes, as in business process management, graph database systems are usually vastly superior to traditional database systems.

Advantages: why, and in which scenarios, graph databases outdo traditional databases

There are currently three trends in data processing:

  • With the growth in the number of users, systems, and data to be captured, data volumes are rising exponentially. This development is reflected in the buzzword "Big Data", established since around 2013.
  • Data no longer resides on a single central system but is often distributed, to ensure redundancy, optimize performance, and balance load. Well-known examples of this development are Amazon, Google, and other cloud providers.
  • Data structures are becoming more complex and more interconnected through the internet, social networks, and open interfaces to data from a wide variety of systems.

These trends can no longer be mastered with established database systems. Graph database systems are not just an answer to these challenges; the challenges actually play to their strengths:

  • Compared to designing a relational schema, data modeling for a graph is considerably simpler. Essentially it is enough to record the business process steps as elements, connect them with arrows, and then define conditions and properties. A data model created this way can usually be transferred into the database unchanged. You no longer need to be a programmer or database specialist: all stakeholders can understand the model and adapt it to changing requirements without compromising the integrity of the graph and its associated infrastructure.
  • For business processes, the flexible data model of graph databases is considerably more agile than other systems. That lies in the nature of the thing: business processes are themselves modeled as graphs. Decisions based on evolving business-critical data can likewise be represented using dependencies and rules. Modeling business processes as graphs supports agility, because process changes can be handled quickly and repeatably.
  • Graph databases outperform competing technologies because queries follow relationships instead of recomputing them. The relationships are created in the graph database at insert time and are immediately available from then on. Queries start at a node and follow the relationships between nodes, which enables real-time queries and thus immediate, exact, and useful interactions. Graph database systems are on the rise above all because they simply use the existing relations between data items rather than computing them at query time.
  • Further strengths of graph databases lie in their design (reliability) and in mathematical and epistemological foundations refined over centuries (data insight).
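The point about following stored relationships instead of recomputing joins can be sketched as a simple adjacency-list traversal. This is a simplified illustration of the idea, not Neo4j's actual API.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Relationships are materialized once at insert time as adjacency lists,
// so a query just follows them from a start node.
public class GraphTraversal {

    private final Map<String, List<String>> adjacency = new HashMap<>();

    // Created at insert time; nothing is recomputed later.
    public void relate(String from, String to) {
        adjacency.computeIfAbsent(from, k -> new ArrayList<>()).add(to);
    }

    // Reachability query: follow stored edges breadth-first.
    public Set<String> reachableFrom(String start) {
        Set<String> seen = new HashSet<>();
        Deque<String> queue = new ArrayDeque<>();
        queue.add(start);
        while (!queue.isEmpty()) {
            String node = queue.poll();
            for (String next : adjacency.getOrDefault(node, new ArrayList<>())) {
                if (seen.add(next)) queue.add(next);
            }
        }
        return seen;
    }

    public static void main(String[] args) {
        GraphTraversal g = new GraphTraversal();
        g.relate("Order received", "Check stock");
        g.relate("Check stock", "Ship");
        g.relate("Check stock", "Back-order");
        System.out.println(g.reachableFrom("Order received")); // all three downstream steps
    }
}
```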

The advantages for managing business processes

That graph databases offer advantages especially in large companies with many complex processes is due above all to the natural way business processes map onto graphs and to the flexible design of graphs. Experience shows that business processes undergo continuous change and must be adapted on the fly. With relational and other database models this is not readily possible: the BPMS may have to be laboriously updated by specialists and may not remain available without interruption. With a BPMS based on graph database technology, such as Comindware Tracker, such changes can be made without interruption during live operation.

Another advantage lies in how graph databases adapt to developments in the company. Business processes mature over time: more and more employees become involved, and ever new conditions and dependencies must be taken into account. In a graph database you simply define new nodes, add transitions with conditions, and define properties, and the underlying business processes are correctly represented.

The main advantage of graph databases is their ability not only to manage the data and make it available for analysis, but also to store the business rules. This relieves the employees involved in the processes of routine work and makes the potential of knowledge workers available during the processes.

Conclusion

While developers of relational database applications can often fall back on familiar territory, developers of graph database applications are breaking new ground. Comindware, vendor of the BPM solution Comindware Tracker, was founded in 2008 and is a young, modern company that has taken on the demands placed on modern BPM systems, quickly growing to 70 employees. Its founders recognized that most business processes are unstructured or change over time, and that a modern BPM solution must be able to cope with these demands. Comindware used the Neo4j graph database as the basis for its own solutions and developed its patented ElasticData technology on top of it.


Author profile

The author, former Acronis managing director Helmut Heptner, has been managing director of Comindware GmbH since March 2012, responsible for operations in Central Europe. Comindware is one of the pioneers in adaptive business process management and currently employs over 70 people worldwide. Comindware Tracker is used by companies such as Gazprom Avia and a major German carmaker.

by Thomas Allweyer at October 06, 2014 08:38 AM

October 03, 2014

Drools & JBPM: 10 Days until Decision Camp 2014, San Jose (13-15 Oct)

Only 10 days to go until Decision Camp 2014 arrives, a free conference in the San Jose area for business rules and decision management practitioners. The conference runs three concurrent tracks. The full Decision Camp agenda can be found here.

Like last year, RuleML will be participating and presenting, with speakers such as Dr. Benjamin Grosof, which makes it a great opportunity to catch up on the latest happenings in the rules standards industry.

Last year I did a games talk, this year I'm doing something a little more technical, to reflect my current research. Here is my title and abstract.
Demystifying Truth Maintenance and Belief Systems
Basic Truth Maintenance is a common feature, available in many production rule systems, but one that is not generally well understood. This talk will start by introducing the mechanics of rule engines and how they are extended for the common TMS implementation. It will discuss the limitations of these systems and introduce Justification-based Truth Maintenance (JTMS) as a way to add contradictions that trigger retractions. This will lead on to Defeasible Logic, which, while it sounds complex, facilitates resolving conflicting rules, premises, and contradictions in a way that follows typical argumentation theory. Finally, we will demonstrate how the core of this can be abstracted to allow pluggable beliefs, so that JTMS and Defeasible Logic can be swapped in and out, along with other systems such as Bayesian belief systems.




by Mark Proctor (noreply@blogger.com) at October 03, 2014 04:12 PM

October 02, 2014

Drools & JBPM: Trace output with Drools

Drools 6 includes trace output that can help you get an idea of what is going on in your system: how often things are getting executed, and with how much data.

It can also help you understand how Drools 6 now works as a goal-based algorithm, using a linking mechanism to bring rules in for evaluation. More details on that here:
http://blog.athico.com/2013/11/rip-rete-time-to-get-phreaky.html

The first thing to do is set your slf4j logger to trace mode:
<configuration>
  <appender name="consoleAppender" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <!-- %l lowers performance -->
      <!--<pattern>%d [%t] %-5p %l%n %m%n</pattern>-->
      <pattern>%d [%t] %-5p %m%n</pattern>
    </encoder>
  </appender>

  <logger name="org.drools" level="trace"/>

  <root level="info"><!-- TODO We probably want to set default level to warn instead -->
    <appender-ref ref="consoleAppender" />
  </root>
</configuration>

Let's take the shopping example; you can find the Java and DRL files for it here:
https://github.com/droolsjbpm/drools/blob/master/drools-examples/src/main/resources/org/drools/examples/shopping/Shopping.drl
https://github.com/droolsjbpm/drools/blob/master/drools-examples/src/main/java/org/drools/examples/shopping/ShoppingExample.java


Running the example will output a very detailed and long log of execution. Initially you'll see objects being inserted, which causes linking. Linking of nodes and rules is explained in the Drools 6 algorithm link above. In summary, 1..n nodes link in a segment when objects are inserted.
2014-10-02 02:35:09,009 [main] TRACE Insert [fact 0:1:1455177644:1455177644:1:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Customer@56bc3fac]
2014-10-02 02:35:09,020 [main] TRACE LinkNode notify=false nmask=1 smask=1 spos=0 rules=

Then 1..n segments link in a rule. When a rule is linked in, it's scheduled on the agenda for evaluation.
2014-10-02 02:35:09,043 [main] TRACE  LinkRule name=Discount removed notification
2014-10-02 02:35:09,043 [main] TRACE Queue RuleAgendaItem [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,043 [main] TRACE Queue Added 1 [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
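The smask and rmask values in these lines are bitmasks. A minimal sketch of the idea, under a deliberately simplified model (one bit per node in a segment's mask, one bit per segment in a rule's mask; this is not the actual Drools source):

```java
// Illustrative sketch of the LinkNode / LinkSegment / LinkRule mechanics
// seen in the trace: a segment is linked when all of its node bits are
// set, and a rule is scheduled for evaluation only when all of its
// segment bits are set.
class LinkingSketch {
    // Set the bit at position pos, i.e. mark one node (or segment) as linked.
    static long linkBit(long mask, int pos) {
        return mask | (1L << pos);
    }

    // True once every required bit is set, i.e. fully linked.
    static boolean allLinked(long mask, long required) {
        return (mask & required) == required;
    }
}
```

Reading the trace with this in mind, `LinkSegment smask=1 rmask=3` says segment bit 1 just linked and the rule needs bits 1 and 2 before it will queue a RuleAgendaItem.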

When it eventually evaluates a rule, the log will indent as it visits each node, evaluating from root to tip. Each node will attempt to tell you how much data is being inserted, updated or deleted at that point.
2014-10-02 02:35:09,046 [main] TRACE Rule[name=Apply 10% discount if total purchases is over 100] segments=2 TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,047 [main] TRACE 1 [ AccumulateNode(12) ] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,047 [main] TRACE Segment 1
2014-10-02 02:35:09,047 [main] TRACE 1 [ AccumulateNode(12) ] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,047 [main] TRACE rightTuples TupleSets[insertSize=2, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,056 [main] TRACE 2 [RuleTerminalNode(13): rule=Apply 10% discount if total purchases is over 100] TupleSets[insertSize=1, deleteSize=0, updateSize=0]

You can use this information to see how often rules evaluate, how much linking and unlinking happens, how much data propagates and, more importantly, how much wasted work is done. Here is the full log:
2014-10-02 02:35:08,889 [main] DEBUG Starting Engine in PHREAK mode
2014-10-02 02:35:08,927 [main] TRACE Adding Rule Purchase notification
2014-10-02 02:35:08,929 [main] TRACE Adding Rule Discount removed notification
2014-10-02 02:35:08,931 [main] TRACE Adding Rule Discount awarded notification
2014-10-02 02:35:08,933 [main] TRACE Adding Rule Apply 10% discount if total purchases is over 100
2014-10-02 02:35:09,009 [main] TRACE Insert [fact 0:1:1455177644:1455177644:1:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Customer@56bc3fac]
2014-10-02 02:35:09,020 [main] TRACE LinkNode notify=false nmask=1 smask=1 spos=0 rules=
2014-10-02 02:35:09,020 [main] TRACE LinkSegment smask=2 rmask=2 name=Discount removed notification
2014-10-02 02:35:09,025 [main] TRACE LinkSegment smask=2 rmask=2 name=Apply 10% discount if total purchases is over 100
2014-10-02 02:35:09,028 [main] TRACE LinkNode notify=true nmask=1 smask=1 spos=0 rules=[RuleMem Purchase notification], [RuleMem Discount removed notification], [RuleMem Discount awarded notification], [RuleMem Apply 10% discount if total purchases is over 100]
2014-10-02 02:35:09,028 [main] TRACE LinkSegment smask=1 rmask=1 name=Purchase notification
2014-10-02 02:35:09,028 [main] TRACE LinkSegment smask=1 rmask=3 name=Discount removed notification
2014-10-02 02:35:09,043 [main] TRACE LinkRule name=Discount removed notification
2014-10-02 02:35:09,043 [main] TRACE Queue RuleAgendaItem [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,043 [main] TRACE Queue Added 1 [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,043 [main] TRACE LinkSegment smask=1 rmask=1 name=Discount awarded notification
2014-10-02 02:35:09,043 [main] TRACE LinkSegment smask=1 rmask=3 name=Apply 10% discount if total purchases is over 100
2014-10-02 02:35:09,043 [main] TRACE LinkRule name=Apply 10% discount if total purchases is over 100
2014-10-02 02:35:09,043 [main] TRACE Queue RuleAgendaItem [Activation rule=Apply 10% discount if total purchases is over 100, act#=1, salience=0, tuple=null]
2014-10-02 02:35:09,043 [main] TRACE Queue Added 2 [Activation rule=Apply 10% discount if total purchases is over 100, act#=1, salience=0, tuple=null]
2014-10-02 02:35:09,043 [main] TRACE Added Apply 10% discount if total purchases is over 100 to eager evaluation list.
2014-10-02 02:35:09,044 [main] TRACE Insert [fact 0:2:14633842:14633842:2:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Product@df4b72]
2014-10-02 02:35:09,044 [main] TRACE Insert [fact 0:3:732189840:732189840:3:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Product@2ba45490]
2014-10-02 02:35:09,044 [main] TRACE Insert [fact 0:4:939475028:939475028:4:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Purchase@37ff4054]
2014-10-02 02:35:09,045 [main] TRACE BetaNode insert=1 stagedInsertWasEmpty=true
2014-10-02 02:35:09,045 [main] TRACE LinkNode notify=true nmask=1 smask=1 spos=1 rules=[RuleMem Purchase notification]
2014-10-02 02:35:09,045 [main] TRACE LinkSegment smask=2 rmask=3 name=Purchase notification
2014-10-02 02:35:09,045 [main] TRACE LinkRule name=Purchase notification
2014-10-02 02:35:09,046 [main] TRACE Queue RuleAgendaItem [Activation rule=Purchase notification, act#=2, salience=10, tuple=null]
2014-10-02 02:35:09,046 [main] TRACE Queue Added 1 [Activation rule=Purchase notification, act#=2, salience=10, tuple=null]
2014-10-02 02:35:09,046 [main] TRACE BetaNode insert=1 stagedInsertWasEmpty=true
2014-10-02 02:35:09,046 [main] TRACE LinkNode notify=true nmask=1 smask=1 spos=1 rules=[RuleMem Apply 10% discount if total purchases is over 100]
2014-10-02 02:35:09,046 [main] TRACE LinkSegment smask=2 rmask=3 name=Apply 10% discount if total purchases is over 100
2014-10-02 02:35:09,046 [main] TRACE LinkRule name=Apply 10% discount if total purchases is over 100
2014-10-02 02:35:09,046 [main] TRACE Added Apply 10% discount if total purchases is over 100 to eager evaluation list.
2014-10-02 02:35:09,046 [main] TRACE Insert [fact 0:5:8996952:8996952:5:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Purchase@894858]
2014-10-02 02:35:09,046 [main] TRACE BetaNode insert=2 stagedInsertWasEmpty=false
2014-10-02 02:35:09,046 [main] TRACE BetaNode insert=2 stagedInsertWasEmpty=false
2014-10-02 02:35:09,046 [main] TRACE Rule[name=Apply 10% discount if total purchases is over 100] segments=2 TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,047 [main] TRACE 1 [ AccumulateNode(12) ] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,047 [main] TRACE Segment 1
2014-10-02 02:35:09,047 [main] TRACE 1 [ AccumulateNode(12) ] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,047 [main] TRACE rightTuples TupleSets[insertSize=2, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,056 [main] TRACE 2 [RuleTerminalNode(13): rule=Apply 10% discount if total purchases is over 100] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,057 [main] TRACE Segment 1
2014-10-02 02:35:09,057 [main] TRACE 2 [RuleTerminalNode(13): rule=Apply 10% discount if total purchases is over 100] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,057 [main] TRACE Rule[name=Apply 10% discount if total purchases is over 100] segments=2 TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,057 [main] TRACE 3 [ AccumulateNode(12) ] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,057 [main] TRACE Rule[name=Purchase notification] segments=2 TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,057 [main] TRACE 4 [JoinNode(5) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Purchase]] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,057 [main] TRACE Segment 1
2014-10-02 02:35:09,057 [main] TRACE 4 [JoinNode(5) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Purchase]] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,058 [main] TRACE rightTuples TupleSets[insertSize=2, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,058 [main] TRACE 5 [RuleTerminalNode(6): rule=Purchase notification] TupleSets[insertSize=2, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,058 [main] TRACE Segment 1
2014-10-02 02:35:09,058 [main] TRACE 5 [RuleTerminalNode(6): rule=Purchase notification] TupleSets[insertSize=2, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,058 [main] TRACE Fire "Purchase notification"
[[ Purchase notification active=false ] [ [fact 0:4:939475028:939475028:4:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Purchase@37ff4054]
[fact 0:1:1455177644:1455177644:1:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Customer@56bc3fac] ] ]
Customer mark just purchased shoes
2014-10-02 02:35:09,060 [main] TRACE Fire "Purchase notification"
[[ Purchase notification active=false ] [ [fact 0:5:8996952:8996952:5:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Purchase@894858]
[fact 0:1:1455177644:1455177644:1:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Customer@56bc3fac] ] ]
Customer mark just purchased hat
2014-10-02 02:35:09,061 [main] TRACE Removing RuleAgendaItem [Activation rule=Purchase notification, act#=2, salience=10, tuple=null]
2014-10-02 02:35:09,061 [main] TRACE Queue Removed 1 [Activation rule=Purchase notification, act#=2, salience=10, tuple=null]
2014-10-02 02:35:09,061 [main] TRACE Rule[name=Discount removed notification] segments=2 TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,061 [main] TRACE 6 [NotNode(8) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,061 [main] TRACE Segment 1
2014-10-02 02:35:09,061 [main] TRACE 6 [NotNode(8) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,061 [main] TRACE rightTuples TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,061 [main] TRACE 7 [RuleTerminalNode(9): rule=Discount removed notification] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,061 [main] TRACE Segment 1
2014-10-02 02:35:09,061 [main] TRACE 7 [RuleTerminalNode(9): rule=Discount removed notification] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,061 [main] TRACE Fire "Discount removed notification"
[[ Discount removed notification active=false ] [ null
[fact 0:1:1455177644:1455177644:1:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Customer@56bc3fac] ] ]
Customer mark now has a discount of 0
2014-10-02 02:35:09,063 [main] TRACE Removing RuleAgendaItem [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,063 [main] TRACE Queue Removed 1 [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,063 [main] TRACE Fire "Apply 10% discount if total purchases is over 100"
[[ Apply 10% discount if total purchases is over 100 active=false ] [ [fact 0:6:2063009760:1079902208:6:null:NON_TRAIT:120.0]
[fact 0:1:1455177644:1455177644:1:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Customer@56bc3fac] ] ]
2014-10-02 02:35:09,071 [main] TRACE Insert [fact 0:7:874153561:874153561:7:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Discount@341a8659]
2014-10-02 02:35:09,071 [main] TRACE LinkSegment smask=2 rmask=3 name=Discount removed notification
2014-10-02 02:35:09,071 [main] TRACE LinkRule name=Discount removed notification
2014-10-02 02:35:09,071 [main] TRACE Queue RuleAgendaItem [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,071 [main] TRACE Queue Added 1 [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,071 [main] TRACE BetaNode insert=1 stagedInsertWasEmpty=true
2014-10-02 02:35:09,071 [main] TRACE LinkNode notify=true nmask=1 smask=1 spos=1 rules=[RuleMem Discount awarded notification]
2014-10-02 02:35:09,071 [main] TRACE LinkSegment smask=2 rmask=3 name=Discount awarded notification
2014-10-02 02:35:09,071 [main] TRACE LinkRule name=Discount awarded notification
2014-10-02 02:35:09,071 [main] TRACE Queue RuleAgendaItem [Activation rule=Discount awarded notification, act#=7, salience=0, tuple=null]
2014-10-02 02:35:09,071 [main] TRACE Queue Added 3 [Activation rule=Discount awarded notification, act#=7, salience=0, tuple=null]
Customer mark now has a shopping total of 120.0
2014-10-02 02:35:09,071 [main] TRACE Removing RuleAgendaItem [Activation rule=Apply 10% discount if total purchases is over 100, act#=1, salience=0, tuple=null]
2014-10-02 02:35:09,071 [main] TRACE Queue Removed 2 [Activation rule=Apply 10% discount if total purchases is over 100, act#=1, salience=0, tuple=null]
2014-10-02 02:35:09,071 [main] TRACE Rule[name=Discount removed notification] segments=2 TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,072 [main] TRACE 8 [NotNode(8) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,072 [main] TRACE Segment 1
2014-10-02 02:35:09,072 [main] TRACE 8 [NotNode(8) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,072 [main] TRACE rightTuples TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,073 [main] TRACE 9 [RuleTerminalNode(9): rule=Discount removed notification] TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,073 [main] TRACE Segment 1
2014-10-02 02:35:09,073 [main] TRACE 9 [RuleTerminalNode(9): rule=Discount removed notification] TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,073 [main] TRACE Removing RuleAgendaItem [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,073 [main] TRACE Queue Removed 1 [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,073 [main] TRACE Rule[name=Discount awarded notification] segments=2 TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,073 [main] TRACE 10 [JoinNode(10) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,073 [main] TRACE Segment 1
2014-10-02 02:35:09,073 [main] TRACE 10 [JoinNode(10) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,074 [main] TRACE rightTuples TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,074 [main] TRACE 11 [RuleTerminalNode(11): rule=Discount awarded notification] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,074 [main] TRACE Segment 1
2014-10-02 02:35:09,074 [main] TRACE 11 [RuleTerminalNode(11): rule=Discount awarded notification] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,074 [main] TRACE Fire "Discount awarded notification"
[[ Discount awarded notification active=false ] [ [fact 0:7:874153561:874153561:7:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Discount@341a8659]
[fact 0:1:1455177644:1455177644:1:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Customer@56bc3fac] ] ]
Customer mark now has a discount of 10
2014-10-02 02:35:09,074 [main] TRACE Removing RuleAgendaItem [Activation rule=Discount awarded notification, act#=7, salience=0, tuple=null]
2014-10-02 02:35:09,074 [main] TRACE Queue Removed 1 [Activation rule=Discount awarded notification, act#=7, salience=0, tuple=null]
2014-10-02 02:35:09,074 [main] TRACE Delete [fact 0:5:8996952:8996952:5:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Purchase@894858]
2014-10-02 02:35:09,074 [main] TRACE LinkSegment smask=2 rmask=3 name=Purchase notification
2014-10-02 02:35:09,074 [main] TRACE LinkRule name=Purchase notification
2014-10-02 02:35:09,074 [main] TRACE Queue RuleAgendaItem [Activation rule=Purchase notification, act#=2, salience=10, tuple=null]
2014-10-02 02:35:09,074 [main] TRACE Queue Added 1 [Activation rule=Purchase notification, act#=2, salience=10, tuple=null]
2014-10-02 02:35:09,075 [main] TRACE LinkSegment smask=2 rmask=3 name=Apply 10% discount if total purchases is over 100
2014-10-02 02:35:09,075 [main] TRACE LinkRule name=Apply 10% discount if total purchases is over 100
2014-10-02 02:35:09,075 [main] TRACE Queue RuleAgendaItem [Activation rule=Apply 10% discount if total purchases is over 100, act#=1, salience=0, tuple=null]
2014-10-02 02:35:09,075 [main] TRACE Queue Added 2 [Activation rule=Apply 10% discount if total purchases is over 100, act#=1, salience=0, tuple=null]
2014-10-02 02:35:09,075 [main] TRACE Added Apply 10% discount if total purchases is over 100 to eager evaluation list.
Customer mark has returned the hat
2014-10-02 02:35:09,075 [main] TRACE Rule[name=Apply 10% discount if total purchases is over 100] segments=2 TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,075 [main] TRACE 12 [ AccumulateNode(12) ] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,075 [main] TRACE Segment 1
2014-10-02 02:35:09,075 [main] TRACE 12 [ AccumulateNode(12) ] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,075 [main] TRACE rightTuples TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,075 [main] TRACE 13 [RuleTerminalNode(13): rule=Apply 10% discount if total purchases is over 100] TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,075 [main] TRACE Segment 1
2014-10-02 02:35:09,075 [main] TRACE 13 [RuleTerminalNode(13): rule=Apply 10% discount if total purchases is over 100] TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,075 [main] TRACE Delete [fact 0:7:874153561:874153561:7:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Discount@341a8659]
2014-10-02 02:35:09,075 [main] TRACE LinkSegment smask=2 rmask=3 name=Discount removed notification
2014-10-02 02:35:09,075 [main] TRACE LinkRule name=Discount removed notification
2014-10-02 02:35:09,075 [main] TRACE Queue RuleAgendaItem [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,075 [main] TRACE Queue Added 3 [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,075 [main] TRACE UnlinkNode notify=true nmask=1 smask=0 spos=1 rules=[RuleMem Discount awarded notification]
2014-10-02 02:35:09,076 [main] TRACE UnlinkSegment smask=2 rmask=1 name=[RuleMem Discount awarded notification]
2014-10-02 02:35:09,076 [main] TRACE UnlinkRule name=Discount awarded notification
2014-10-02 02:35:09,076 [main] TRACE Queue RuleAgendaItem [Activation rule=Discount awarded notification, act#=7, salience=0, tuple=null]
2014-10-02 02:35:09,076 [main] TRACE Queue Added 2 [Activation rule=Discount awarded notification, act#=7, salience=0, tuple=null]
2014-10-02 02:35:09,076 [main] TRACE Rule[name=Purchase notification] segments=2 TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE 14 [JoinNode(5) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Purchase]] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE Segment 1
2014-10-02 02:35:09,076 [main] TRACE 14 [JoinNode(5) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Purchase]] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE rightTuples TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE 15 [RuleTerminalNode(6): rule=Purchase notification] TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE Segment 1
2014-10-02 02:35:09,076 [main] TRACE 15 [RuleTerminalNode(6): rule=Purchase notification] TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE Removing RuleAgendaItem [Activation rule=Purchase notification, act#=2, salience=10, tuple=null]
2014-10-02 02:35:09,076 [main] TRACE Queue Removed 1 [Activation rule=Purchase notification, act#=2, salience=10, tuple=null]
2014-10-02 02:35:09,076 [main] TRACE Rule[name=Discount removed notification] segments=2 TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE 16 [NotNode(8) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE Segment 1
2014-10-02 02:35:09,076 [main] TRACE 16 [NotNode(8) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,076 [main] TRACE rightTuples TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE 17 [RuleTerminalNode(9): rule=Discount removed notification] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE Segment 1
2014-10-02 02:35:09,077 [main] TRACE 17 [RuleTerminalNode(9): rule=Discount removed notification] TupleSets[insertSize=1, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE Fire "Discount removed notification"
[[ Discount removed notification active=false ] [ null
[fact 0:1:1455177644:1455177644:1:DEFAULT:NON_TRAIT:org.drools.examples.shopping.ShoppingExample$Customer@56bc3fac] ] ]
Customer mark now has a discount of 0
2014-10-02 02:35:09,077 [main] TRACE Removing RuleAgendaItem [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,077 [main] TRACE Queue Removed 1 [Activation rule=Discount removed notification, act#=0, salience=0, tuple=null]
2014-10-02 02:35:09,077 [main] TRACE Rule[name=Discount awarded notification] segments=2 TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE 18 [JoinNode(10) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE Segment 1
2014-10-02 02:35:09,077 [main] TRACE 18 [JoinNode(10) - [ClassObjectType class=org.drools.examples.shopping.ShoppingExample$Discount]] TupleSets[insertSize=0, deleteSize=0, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE rightTuples TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE 19 [RuleTerminalNode(11): rule=Discount awarded notification] TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE Segment 1
2014-10-02 02:35:09,077 [main] TRACE 19 [RuleTerminalNode(11): rule=Discount awarded notification] TupleSets[insertSize=0, deleteSize=1, updateSize=0]
2014-10-02 02:35:09,077 [main] TRACE Removing RuleAgendaItem [Activation rule=Discount awarded notification, act#=7, salience=0, tuple=null]
2014-10-02 02:35:09,077 [main] TRACE Queue Removed 1 [Activation rule=Discount awarded notification, act#=7, salience=0, tuple=null]
2014-10-02 02:35:09,077 [main] TRACE Removing RuleAgendaItem [Activation rule=Apply 10% discount if total purchases is over 100, act#=1, salience=0, tuple=null]
2014-10-02 02:35:09,077 [main] TRACE Queue Removed 1 [Activation rule=Apply 10% discount if total purchases is over 100, act#=1, salience=0, tuple=null]
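As one illustration of putting this trace to use, a hypothetical helper (my own code, not part of Drools) can count how often each rule fired by scanning the log for the Fire lines shown above:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical trace analyser: counts rule firings by matching the
// TRACE Fire "<rule name>" lines produced by the logger.
class TraceStats {
    private static final Pattern FIRE = Pattern.compile("TRACE Fire \"([^\"]+)\"");

    static Map<String, Integer> countFirings(String log) {
        Map<String, Integer> counts = new HashMap<>();
        Matcher m = FIRE.matcher(log);
        while (m.find()) {
            counts.merge(m.group(1), 1, Integer::sum);
        }
        return counts;
    }
}
```

The same approach extends to counting LinkRule/UnlinkRule pairs, which is a quick way to spot rules that link and unlink repeatedly without ever firing, i.e. wasted work.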

by Mark Proctor (noreply@blogger.com) at October 02, 2014 01:56 AM

October 01, 2014

Keith Swenson: Process Mining MOOC on Coursera

Whether you call it Process Mining, or Automated Process Discovery, nobody can deny that this field that combines big data analytics with business process is at the center of an important transformation in the workplace.  Process mining is useful to kickstart the implementation of predefined BPM diagrams, and it is also useful in unpredictable case management to see what has been done and whether it is compliant with all the rules.  What would you give to attend a complete, college level course on process mining?  What if it was free?

What if it was free, and it was being taught by Wil van der Aalst, arguably the foremost expert on workflow and process mining? What if it started next month, and you could attend from anyplace in the world?  Would you sign up?  I would.  And I have.

Prof. van der Aalst of Eindhoven University of Technology is teaching the course “Process Mining: Data science in Action” on Coursera starting Nov 12.   It is available to everyone everywhere.  It will last 6 weeks, and require about 4-6 hours of work per week.  It is not just an important part of data science, it is data science in action:

Data science is the profession of the future, because organizations that are unable to use (big) data in a smart way will not survive. It is not sufficient to focus on data storage and data analysis. The data scientist also needs to relate data to process analysis. Process mining bridges the gap between traditional model-based process analysis (e.g., simulation and other business process management techniques) and data-centric analysis techniques such as machine learning and data mining. Process mining seeks the confrontation between event data (i.e., observed behavior) and process models (hand-made or discovered automatically). This technology has become available only recently, but it can be applied to any type of operational processes (organizations and systems). Example applications include: analyzing treatment processes in hospitals, improving customer service processes in a multinational, understanding the browsing behavior of customers using a booking site, analyzing failures of a baggage handling system, and improving the user interface of an X-ray machine. All of these applications have in common that dynamic behavior needs to be related to process models. Hence, we refer to this as “data science in action”.
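As a small taste of the techniques the course covers, consider the directly-follows relation, a basic building block of process discovery: from raw traces of observed behavior, record which activity directly followed which. The sketch below uses names of my own choosing, not those of any particular tool.

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Extract the directly-follows relation from an event log. Discovery
// algorithms (such as the alpha algorithm taught in the course) build
// process models from exactly this kind of relation.
class DirectlyFollows {
    static Set<String> relation(List<List<String>> traces) {
        Set<String> pairs = new HashSet<>();
        for (List<String> trace : traces) {
            for (int i = 0; i + 1 < trace.size(); i++) {
                // record that activity i was directly followed by activity i+1
                pairs.add(trace.get(i) + ">" + trace.get(i + 1));
            }
        }
        return pairs;
    }
}
```

Even this trivial relation already confronts observed behavior with a model: if "pay>ship" never appears in the log but the hand-made model requires it, something is off.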

Many of you have seen one of my many talks on process mining, so you know that I believe this is an important, emerging field, one which Fujitsu has been a part of.  This course will be a chance to get below the surface of what we normally present in a 45-minute webinar, and to go deeper than you can by reading the Process Mining Manifesto.  It is the first major MOOC on process mining.  There are two reasons why this is notable:

  • First of all, BPM is becoming more evidence-based and the MOOC “Process Mining: Data science in Action” provides a concrete starting point for more fact-driven process management. It fits nicely with the development of data science as a new profession. There is a need for “process scientists”, now and in the future.
  • Second, it is interesting to reflect on MOOCs as a new medium to train BPM professionals and to make end-users aware of new BPM technologies. Such online courses allow for much more specialized BPM courses offered to thousands of participants.

I for one am looking forward to it.  Here is a short video to explain:


by kswenson at October 01, 2014 10:48 AM

September 30, 2014

Keith Swenson: BPM Poster

Here it is, the poster on the definition of BPM, with all the terms defined and explained!

This is based on the effort to gain consensus around a single common definition for BPM.  The definition by itself cannot convey the meaning if the terms are not explained.  You have seen this before in my post “One Common Definition for BPM.”  What we have done is to put all the information together into a single poster.

Click here to access the PDF of the poster

It looks best printed 36 inches by 24 inches (90cm by 60cm).  Most of us don’t have printers that big.  You can print it in Acrobat across multiple pieces of paper and tape them together, but that can be a lot of work.  I am looking for a way to let you simply order the poster and have it sent to you in a tube.  Once I have found one, I will update this post.

Or come by the Fujitsu booth and ask for one.


by kswenson at September 30, 2014 11:45 AM

September 29, 2014

Keith Swenson: 3 Innovative Approaches to Process Modeling

In a post titled “Business Etiquette Modeling” I made a plea for modeling business processes such that they naturally deform themselves as needed to accommodate changes.  If we model a fixed process diagram, it is too fragile and can be costly to maintain manually.  While I was at the EDOC conference and the BPM conference, I saw three papers that introduce innovations.  They are not completely defined solutions, but they represent solid research on steps in the right direction.  Here is a quick summary of each.

(1) Implementation Framework for Production Case Management: Modeling and Execution

(Andreas Meyer, Nico Herzberg, Mathias Weske of the Hasso Plattner Institute and Frank Puhlmann of Bosch, EDOC 2014 pages 190-199)

This approach is aimed specifically at production case management, which means it supports a knowledge worker who has to decide in real time what to do, while the kinds of things such a worker might do are well known in advance.  The example used is that of a travel agent: we can identify all the various things a travel agent might be able to do, but they might combine these actions in an unlimited variety of ways.  If we draw a fixed diagram, we end up restricting the travel agent unnecessarily.  Think about it: a travel agent might book one hotel one day, book flights the next, book another hotel, then change the flights, then cancel one of the hotel bookings.  It is simply not possible to say that there is a single, simple process that a travel agent will always follow.

Instead of drawing a single diagram, the approach suggested is to draw separate little process snippets for all the things a travel agent might do.  Here is the interesting part: the same activity might appear in multiple snippets.  At run time the system combines the snippets dynamically based on conditions.  Each task in each snippet is linked to the things that are required before that task can be triggered, so based on the current case instance data, a particular task might or might not appear as needed.  Dynamic instance data determines how the current process is constructed.  Activities have required inputs and produce outputs, which is part of the conditions on whether they are included in a particular instance.

Above are some examples of the process snippets that might be used for a travel agent.   Note that “Create Offer” and “Validate Offer” appear in two different snippets with slightly different conditions.  The ultimate process would be assembled at run time in a way that depends upon the details of the case.  I would have to refer you to the paper for the full details of how this works, but I was impressed by Andreas’ presentation.  I am not sure this is exactly the right approach, but I am sure that we need this kind of research in this direction.
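To make the mechanism concrete, here is a minimal sketch in plain Java of how such an engine might decide which tasks to offer.  The class, task, and data-item names are hypothetical, invented for illustration; this is not the implementation from the paper.  Each task declares its required inputs and produced outputs, and the engine enables whichever tasks the current case data satisfies:

```java
import java.util.*;

// Hypothetical sketch: tasks declare required inputs and produced outputs;
// the engine enables a task only when the case data satisfies its inputs.
public class DynamicCaseEngine {

    // A task from a process snippet, with data preconditions and effects.
    record Task(String name, Set<String> requires, Set<String> produces) {}

    // Some of the things a travel agent might do, drawn as separate snippets.
    static final List<Task> SNIPPETS = List.of(
        new Task("Create Offer",   Set.of("request"),        Set.of("offer")),
        new Task("Validate Offer", Set.of("offer"),          Set.of("validatedOffer")),
        new Task("Book Hotel",     Set.of("validatedOffer"), Set.of("hotelBooking")),
        new Task("Book Flight",    Set.of("validatedOffer"), Set.of("flightBooking")),
        new Task("Cancel Hotel",   Set.of("hotelBooking"),   Set.of())
    );

    // The process is assembled at run time: whichever tasks are enabled
    // by the current case data may be offered to the knowledge worker.
    public static List<String> enabledTasks(Set<String> caseData) {
        List<String> enabled = new ArrayList<>();
        for (Task t : SNIPPETS) {
            if (caseData.containsAll(t.requires())) {
                enabled.add(t.name());
            }
        }
        return enabled;
    }

    public static void main(String[] args) {
        // A fresh case only allows creating an offer...
        System.out.println(enabledTasks(Set.of("request")));
        // ...but once the offer is validated, hotel and flight booking
        // become available in any order the agent chooses.
        System.out.println(enabledTasks(Set.of("request", "offer", "validatedOffer")));
    }
}
```

Note how “Book Hotel” and “Book Flight” become available in any order once the offer is validated: the worker, not a fixed diagram, decides the sequence.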

(2) Informal Process Essentials

(C. Timurhan Sungur, Tobias Binz, Uwe Breitenbücher, Frank Leymann, University of Stuttgart, EDOC 2014 pages 200-209)

They describe the need to support “informal processes” which is not exactly what I am looking for.  Informal means “having a relaxed, friendly, or unofficial style, manner, or nature; a style of writing or conversational speech characterized by simple grammatical structures.”  What I am looking for are processes that are well crafted, official, meaningful, accurate, and at the same time responsive to external changes.   Formal/informal is not the same relationship as fixed/adaptive.  However, they do cover some interesting ideas that are relevant.  They specify four properties:

  1. Implicit Business Logic – the logic is not explicit until run time.
  2. Different Relationships Among Resources – interrelated sets of individuals are used to accomplish more complex goals.
  3. Resource Participation in Multiple Processes – people are not dedicated to a single project.
  4. Changing Resources – dynamic teams are assembled as needed.

These properties look a lot like the innovative knowledge worker pattern, so this research is likely to be relevant.  They identify the following requirements to be able to meet the need:

  1. Enactable Informal Process Representation
  2. Resource Relationships Definition
  3. Resource Visibility Definition
  4. Support for Dynamically Changing Resources

It seems that these approaches need to focus more on resources, roles, and relationships, and less on the specific sequences of activities.  From that, one should then be able to generate the actual process needed for a particular instance.

The tricky part is finding an expert who can model this.  One of the reasons for drawing a BPM diagram is that it simplifies the job of creating the process automation.  Getting to the underlying relationships might be more accurate and adaptive, but it is not simpler.

(3) oBPM – An Opportunistic Approach to Business Process Modeling and Execution

(David Grünert, Elke Brucker-Kley and Thomas Keller, Institute for Business Information Management, Winterthur, Switzerland, BPMS2 Workshop at BPM 2014)

This paper comes the closest to Business Etiquette Modeling, because it specifically addresses the problem of creating a business process with a strict sequence of user tasks: this top-down approach tends to be over-constrained.  Since this is the BPM and Social Software Workshop, the paper tries to find a way to be more connected to social technology, and to take a more bottom-up approach.  They call it “opportunistic” BPM because the actual process flow can be generated after the details of the situation are known.  Such a process can take advantage of those opportunities automatically, without needing a process designer to tweak the process every time.

The research has centered on modeling roles, the activities that those roles typically do, and the artifacts that are either generated or consumed.  They leverage an extension of the UML use case modeling notation, and it might look a little like this:

The artifacts (documents, etc.) have a state themselves.  When a particular document enters a particular state, it enables a particular activity for a particular role.  To me this shows a lot of promise.  Upon examination, there are weaknesses to this approach: modeling the state diagram for a document would seem to be a challenge, because the states that a document can be in are intricately tied to the process you want to perform.  It might be that our preconception of the process overly restricts the state chart, which in turn limits what processes could be generated.   Also, there is a data model that Grünert admitted would have to be created by a data modeling expert, but perhaps there are a limited number of data models, and maybe they don’t change that often.  Somehow, all of this would have to be discoverable automatically from the working habits of the knowledge workers in order to eliminate the huge up-front cost of having to model all this explicitly.  Again, I refer you to the actual paper for the details.
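The core oBPM idea, that an artifact entering a state enables an activity for a role, can be sketched in a few lines of Java.  The states, roles, and activity names below are hypothetical, invented only to illustrate the lookup, and do not come from the paper:

```java
import java.util.*;

// Hypothetical sketch of the oBPM idea: an artifact (e.g. a document) carries
// its own state, and entering a state enables one activity for one role.
public class ArtifactDrivenTasks {

    // (artifact state + role) -> activity enabled in that situation.
    // Keys are built as "<artifact>:<STATE>#<role>".
    static final Map<String, String> ENABLEMENTS = Map.of(
        "offer:DRAFTED#reviewer",  "Review Offer",
        "offer:REVIEWED#customer", "Accept Offer",
        "offer:ACCEPTED#agent",    "Book Trip"
    );

    // Returns the activity a role may perform given the artifact's state,
    // or empty when that role has nothing to do in that state.
    public static Optional<String> activityFor(String artifactState, String role) {
        return Optional.ofNullable(ENABLEMENTS.get(artifactState + "#" + role));
    }

    public static void main(String[] args) {
        System.out.println(activityFor("offer:DRAFTED", "reviewer")); // Optional[Review Offer]
        System.out.println(activityFor("offer:DRAFTED", "customer")); // Optional.empty
    }
}
```

The point of the sketch is that no sequence is modeled anywhere: the “process” emerges from artifacts moving through their states and roles reacting to them.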

Net-Net

What this shows is that research is being done to take process management to the next level.  Perhaps a combination of these approaches might leave us with the ultimate solution: a system that can generate process maps on demand that are appropriate for a specific situation.  This would be exactly like your GPS unit, which can generate a route from point A to point B given the underlying map of what is possible.  What we are looking for is a way to map what the underlying role interactions could possibly be, along with a set of rules about what might be appropriate when.  Just as you might add a new highway to a GPS map, you might add a new rule, and all the existing business processes would automatically change wherever that rule applies.  We are not there yet, but this research shows promise.
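The GPS analogy can itself be sketched in code.  Below is a minimal, hypothetical route generator (all names and steps invented for illustration) that searches a “map” of possible steps, each needing some facts and producing others, for a path to a goal.  Adding one new step to the map automatically changes every route generated afterwards, which is exactly the property we want from a process generator:

```java
import java.util.*;

// Hypothetical sketch of the "GPS for processes" idea: given a map of possible
// steps and a goal, generate a route with a breadth-first search over data states.
public class ProcessRouter {

    // A possible step: the facts it needs, and the facts it produces.
    public record Step(String name, Set<String> needs, Set<String> gives) {}

    public static List<String> route(Set<String> start, String goal, List<Step> map) {
        record Node(Set<String> data, List<String> path) {}
        Deque<Node> queue = new ArrayDeque<>();
        queue.add(new Node(start, List.of()));
        Set<Set<String>> seen = new HashSet<>();
        while (!queue.isEmpty()) {
            Node n = queue.poll();
            if (n.data().contains(goal)) return n.path();   // goal fact reached
            if (!seen.add(n.data())) continue;              // state already explored
            for (Step s : map) {
                if (n.data().containsAll(s.needs())) {      // step is applicable
                    Set<String> next = new HashSet<>(n.data());
                    next.addAll(s.gives());
                    List<String> path = new ArrayList<>(n.path());
                    path.add(s.name());
                    queue.add(new Node(next, path));
                }
            }
        }
        return List.of(); // no route: the goal is unreachable from this case state
    }

    // A tiny demo map; add a new Step here and every generated route adapts.
    public static List<Step> demoMap() {
        return List.of(
            new Step("Create Offer",   Set.of("request"),        Set.of("offer")),
            new Step("Validate Offer", Set.of("offer"),          Set.of("validatedOffer")),
            new Step("Book Trip",      Set.of("validatedOffer"), Set.of("booking"))
        );
    }

    public static void main(String[] args) {
        System.out.println(route(Set.of("request"), "booking", demoMap()));
    }
}
```

This is of course only a toy planner; the open research questions above are about discovering the map and the rules, not about the search itself.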


by kswenson at September 29, 2014 04:48 PM

September 23, 2014

Thomas Allweyer: From the Pyramid to the House – New Edition of the Praxishandbuch BPMN

The fourth edition of the widely used Praxishandbuch BPMN 2.0 by Jakob Freund and Bernd Rücker has recently been published. The main difference from the third edition: the camunda method framework, previously depicted as a pyramid, has been reworked and is now visualized as a house. The change was prompted by misunderstandings that occasionally arose in connection with the pyramid. In it, the level of the technical, i.e. executable, process model was placed below the level of the operational process model. This led many readers to conclude that the technical level must necessarily be a refinement of the operational level. With that came the expectation that the technical process models always had to be created after the operational models, and that responsibility for these levels was neatly divided between the business department and IT.

These views, however, do not match the authors’ intentions. In the new depiction as a house, the roof still contains the level of the strategic process model. The house itself, however, consists of only one floor: the operational process model. It is divided into a “human process flow” and a “technical process flow”, both of which sit on the same level. The human process flow is carried out by the process participants. The technical process flow is executed by a software system, typically a process engine. There are usually close interactions between the human and technical flows. In agile process development, both flows are developed together, with business and IT experts working closely side by side.

Otherwise, only minor changes have been made to the book. Since the XML-based BPEL standard for executable processes has lost much of its relevance, it is now covered only briefly. Finally, a short overview of the open source platform “camunda BPM”, developed under the authors’ leadership, has been added.


Freund, J.; Rücker, B.:
Praxishandbuch BPMN 2.0. 4th edition.
Hanser 2014
The book at amazon.

by Thomas Allweyer at September 23, 2014 06:36 AM

September 22, 2014

BPM-Guide.de: Thanks for an awesome BPMCon 2014

Awesome location, awesome talks and most of all: awesome attendees. This year’s BPMCon was indeed the “schönste BPM-Konferenz” (most beautiful BPM conference) I’ve ever seen. Thank you so much to all who made it happen, including Guido Fischermanns for the moderation, Sandy Kemsley for her keynote about the Zero-Code BPM Myth, all those BPM practitioners who presented their lessons learned, and also the Birds of a Feather presenters (running Camunda on a Raspberry Pi and inside the coolest thing in the internet of things).

My personal award for the very best picture goes to Sandy – she took a snapshot of her auditorium during …

by Jakob Freund at September 22, 2014 05:46 PM

September 19, 2014

Drools & JBPM: The Birth of Drools Pojo Rules

A few weeks back I blogged about our plans for a clean low level executable mode, you can read about that here.

We now have our first rules working, and you can find the project with unit tests here. None of this requires drools-compiler any more, and allows people to write DSLs without ever going through DRL and heavy compilation stages.

It's far off our eventual plans for the executable model, but it's a good start that fits our existing problem domain. Here is a code snippet from the example in the project above; it uses the classic Fire Alarm example from the documentation.

We plan to build Scala and Clojure DSLs in the near future too, using the same technique as below.

public static class WhenThereIsAFireTurnOnTheSprinkler {
    Variable<Fire> fire = any(Fire.class);
    Variable<Sprinkler> sprinkler = any(Sprinkler.class);

    Object when = when(
        input(fire),
        input(sprinkler),
        expr(sprinkler, s -> !s.isOn()),
        expr(sprinkler, fire, (s, f) -> s.getRoom().equals(f.getRoom()))
    );

    public void then(Drools drools, Sprinkler sprinkler) {
        System.out.println("Turn on the sprinkler for room " + sprinkler.getRoom().getName());
        sprinkler.setOn(true);
        drools.update(sprinkler);
    }
}

public static class WhenTheFireIsGoneTurnOffTheSprinkler {
    Variable<Fire> fire = any(Fire.class);
    Variable<Sprinkler> sprinkler = any(Sprinkler.class);

    Object when = when(
        input(sprinkler),
        expr(sprinkler, Sprinkler::isOn),
        input(fire),
        not(fire, sprinkler, (f, s) -> f.getRoom().equals(s.getRoom()))
    );

    public void then(Drools drools, Sprinkler sprinkler) {
        System.out.println("Turn off the sprinkler for room " + sprinkler.getRoom().getName());
        sprinkler.setOn(false);
        drools.update(sprinkler);
    }
}

by Mark Proctor (noreply@blogger.com) at September 19, 2014 06:03 PM

September 18, 2014

Sandy Kemsley: What’s Next In camunda – Wrapping Up Community Day

We finished the camunda community day with an update from camunda on features coming in 7.2 next month, and the future roadmap. camunda releases the community edition in advance of the commercial...

[Content summary only, click through for full article and links]

by sandy at September 18, 2014 04:12 PM

Sandy Kemsley: camunda Community Day technical presentations

The second customer speaker at camunda’s community day was Peter Hachenberger from 1&1 Internet, describing how they use Signavio and camunda BPM to create their Process Platform, which is...

[Content summary only, click through for full article and links]

by sandy at September 18, 2014 02:59 PM

Sandy Kemsley: Australia Post at camunda Community Day

I am giving the keynote at camunda’s BPMcon conference tomorrow, and since I arrived in Berlin a couple of days early, camunda invited me to attend their community day today, which is the open...

[Content summary only, click through for full article and links]

by sandy at September 18, 2014 11:53 AM

September 17, 2014

Drools & JBPM: Decision Camp is just 1 Month away (SJC 13 Oct)

Decision Camp, San Jose (CA), October 2014, is only one month away, and is free for all attendees who register. Follow the link here for more details on the agenda and registration.

by Mark Proctor (noreply@blogger.com) at September 17, 2014 02:50 AM

September 16, 2014

Drools & JBPM: Workbench Multi Module Project Structure Support

The upcoming Drools and jBPM community 6.2 release will add support for Maven multi-module projects. Walter has prepared a video showing the work in progress. While not shown in this video, multi-module projects will have managed support to assist with automating version updates and releases, and will fully support multiple version streams across Git branches.

There is no audio, but it's fairly self-explanatory. The video starts by creating a single project, then shows how the wizard can convert it to a multi-module project. It then proceeds to add and edit modules, also demonstrating how the parent pom information is configured. The video also shows how this works across different repositories without a problem, each with its own project structure page. Repositories can also be unmanaged, which allows for user-created single projects, much as in 6.0 and 6.1, so previous repositories will continue to work as they did before.

Don't forget to switch the video to 720p and watch it full screen. YouTube does not always select that by default, and the video is fuzzy without it.




by Mark Proctor (noreply@blogger.com) at September 16, 2014 10:25 PM

September 15, 2014

Sandy Kemsley: Survey on Mobile BPM and DM

James Taylor of Decision Management Solutions and I are doing some research into the use and integration of BPM (business process management) and DM (decision management) technology into mobile...

[Content summary only, click through for full article and links]

by sandy at September 15, 2014 04:52 PM

Drools & JBPM: Setting up the Kie Server (6.2.Beta version)

Roger Parkinson wrote a nice blog post on how to set up the Kie Server 6.2.Beta version to play with.

This is still under development (hence Beta) and we are working on improving both setup and features before the final release, but following the steps in his blog you can easily set up your environment to play with it.

Only one clarification: while the workbench can connect to and manage/provision multiple remote kie-servers, they are designed to work independently, and one can use the REST services exclusively to manage/provision a kie-server. In that case, it is not necessary to use the workbench.

Here are a few test cases showing how to use the client API (a helper wrapper around the REST calls) in case you want to try it:

https://github.com/droolsjbpm/droolsjbpm-integration/blob/master/kie-server/kie-server-services/src/test/java/org/kie/server/integrationtests/KieServerContainerCRUDIntegrationTest.java

https://github.com/droolsjbpm/droolsjbpm-integration/blob/master/kie-server/kie-server-services/src/test/java/org/kie/server/integrationtests/KieServerIntegrationTest.java

Thanks Roger!

by Edson Tirelli (noreply@blogger.com) at September 15, 2014 03:59 PM

Thomas Allweyer: Should the “Classic” BPMS Concept Still Be Taught?

So far I have received very positive reactions to my new BPMS book. Among other things, though, the question came up whether the classic, process-model-driven BPMS concept, which I explain in the book with many example processes, is still up to date at all. Given the ever-growing share of knowledge workers, shouldn’t one rather look into newer and more flexible approaches such as Adaptive Case Management (ACM)?

Certainly, the classic BPMS philosophy must be critically examined regarding its suitability for different areas of application. For most weakly structured and knowledge-intensive processes it is indeed not sensible, and usually not even possible, to define the complete flow in advance in the form of a BPMN model. Adaptive Case Management is better suited to such processes. But that does not mean the conventional BPM approach is completely obsolete. The book is meant to provide a well-founded introduction to the field. There are several reasons why I restricted it to process-model-based BPMS:

  • The vast majority of BPMS available on the market today use the process-model-based approach. There are indeed pure ACM systems, but at least for now they are in the minority. Case management is often offered as additional functionality on classic BPM platforms.
  • The classic BPM concept is quite well developed in both theory and practice. The corresponding systems have reached a high degree of maturity. It is thus an established approach that forms a foundation of this field.
  • ACM, by contrast, is a rather new approach that is still very much in development. It is therefore difficult to identify fundamentals that will not already be outdated within a few years.
  • Knowing the classic BPM fundamentals helps in understanding ACM and other new approaches. Concepts such as process definitions and instances reappear in ACM in the form of case templates and cases. Likewise, one should understand what message correlation is about; whether a message is assigned to a process instance or to a case makes little difference. Some advantages of the ACM approach only become really clear when compared with the classic concept, where, for example, workers cannot simply add entirely new steps while a process is running.
  • Even in case handling there are often parts that run as structured processes. The classic BPM concept will therefore probably not be replaced completely; instead, ACM and BPMS functionality will complement each other.
  • The number of structured and standardized processes is unlikely to decrease in the future. On the one hand, there are more and more fully automated processes, which are necessarily highly structured, at least until highly intelligent, autonomous software agents have prevailed across the board. On the other hand, more and more processes must be scalable in order to be handled efficiently over the internet, which requires them to be highly structured and standardized. When someone orders something from a large online retailer, no one sits down to think individually about how to fulfill this customer’s wishes; a completely standardized process runs instead. Processes with strong human participation may increasingly be supported by ACM, while classic process engines will more likely be found orchestrating fully automated processes. But the number of possible applications will not shrink.

Anyone who starts out by studying the fundamentals of classic BPMS is therefore definitely on the right track. And the best way to understand them is to try them out yourself. That is why there are numerous example processes accompanying the book, which can be downloaded and executed with the open source system “Bonita”.

by Thomas Allweyer at September 15, 2014 10:54 AM