Planet BPM

March 27, 2015

Sandy Kemsley: Going Beyond Process Modeling, Part 1

I recently wrote two white papers for Bizagi on going beyond process modeling to process execution: Bizagi is known for its free downloadable process modeler, but also has a full-featured BPMS for...

[Content summary only, click through for full article and links]

by sandy at March 27, 2015 02:55 PM

March 26, 2015

New Camunda Usergroup in Australia

Camunda is spreading, and Australia is no exception. The first usergroup has already formed, and it will meet for the second time next week.

If you would like to swing by and meet some other Camunda users, here is what you need to know:

Date: Tuesday, March 31
Time: 5pm Melbourne time
Place: Tuscan Bar – 79 Bourke Street, Melbourne

This time you can also meet Bernd Frey, one of our senior consultants who is currently down under and engaged in a fascinating Camunda project.

Many thanks to Phillip Spartalis, who is organizing this. He has agreed to share his email address here in case …

by Jakob Freund at March 26, 2015 01:29 AM

March 23, 2015

Thomas Allweyer: Practitioner Forum on 20 Years of Process Management

More than 20 years have passed since the publication of Hammer and Champy's groundbreaking book "Reengineering the Corporation". The Praxisforum BPM & ERP is therefore devoting a full-day event to the development of process management over these two decades and to the state reached today. Alongside the historical retrospective, the program features numerous practitioner talks, including from MAN, the Landschaftsverband Rheinland, BASF, Globus, and Bayer. Topics covered include process excellence, process landscape maps, data management, process automation, and ERP implementation.
The conference takes place on June 16 near Koblenz. The full program and a registration form can be found here.

by Thomas Allweyer at March 23, 2015 09:21 AM

March 21, 2015

Review: Camunda Community Day in London

Yesterday we had our first Camunda Community Day in the UK. Thanks to our friends at 6point6, who organized it, we were able to meet in the famous Royal Institution. This was definitely the most distinguished venue we have had for a community meeting so far!

It was a great half day of presentations, discussions and networking. Most of the attendees already knew existing BPM products, and when I described the Zero-Code BPM Myth they immediately knew what I was talking about. I also gave a short BPMN crash course, and I did not use a single slide, but just live-modeled everything I explained …

by Jakob Freund at March 21, 2015 09:56 AM

March 18, 2015

Bruce Silver: Process-Driven Applications: A New Approach to Executable BPMN

One of the singular successes of BPM technology is a common language – BPMN – used both for process modeling and executable design.  At least in theory.  In reality, the BPMN created by the business analyst to represent the business requirements for implementation often bears little resemblance to the BPMN created by the BPMS developer, which must cope with real-world details of application integration.  That not only weakens the business-IT collaboration so central to BPM's promise of business agility, but also leads to BPMN that must be revised whenever any backend system is updated or changed.  It doesn't have to be that way, according to an interesting new book by Volker Stiehl of SAP, called Process-Driven Applications with BPMN.

Process-driven applications are executable BPMN processes with these characteristics:

  1. Strategic to the business, not situational apps: they must be worth designing for the long term.
  2. Containing a mix of human and automated activities, not human-only or straight-through processing.
  3. Spanning functional and system boundaries, integrating with multiple systems of record.
  4. Performed (with local variations) in multiple areas of the company.
  5. Subject to change over time, in business functionality, in the underlying technical infrastructure, or both.

Stiehl identifies the following design objectives of process-driven applications:

  • Process-driven applications should be loosely coupled with the called back-end systems. They should be as independent as possible from the system landscape. After all, the composite does not know which systems it will actually run against.
  • Process-driven applications, because of their independence, should have their own lifecycles, which ideally are separate from the lifecycles of the systems involved. It is also desirable that the versions of a composite and the versions of the called back-end systems are independent of one another. This protects a composite from version changes in the involved applications.
  • Process-driven applications should work only with the data that they need to meet their business requirements. The aim is to keep the number of attributes of a business object within a composite to a minimum.
  • Process-driven applications should work with a canonical data type system, which enables a loose coupling with their environment at the data type level. They intentionally abstain from reusing data types and interfaces that may already exist in the back-end systems.
  • Process-driven applications should be non-invasive. They should not require any kind of adaptation or modification in the connected systems in order to use the functionality of a process-driven application.  Services in the systems to be integrated should be used exactly as they are.
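
The minimal-data and canonical-type objectives can be sketched together in a few lines of Java. This is an illustrative sketch only: the class, method, and backend field names (the SAP-style keys in particular) are hypothetical stand-ins, not part of Stiehl's book.

```java
import java.util.Map;

// Hypothetical sketch: a backend ERP order record exposes many attributes,
// but the composite's canonical order type keeps only what the process needs.
public class MinimalData {

    /** Lean canonical order owned by the process-driven application. */
    record CanonicalOrder(String orderId, String customerName, double amount) {}

    /** Projects a (simulated) heavyweight backend record down to the canonical type. */
    public static CanonicalOrder project(Map<String, String> backendOrder) {
        return new CanonicalOrder(
            backendOrder.get("VBELN"),        // order number key (illustrative)
            backendOrder.get("KUNNR_NAME"),   // customer name key (illustrative)
            Double.parseDouble(backendOrder.get("NETWR")));  // net amount (illustrative)
    }
}
```

Any further backend attributes simply never enter the composite, which is the point of the "minimal data" objective.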


Let's look at a very simple example, an Order Booking process.  Here is the process model created by the business analyst in conjunction with the business.  Upon receipt of an order from the customer, an order entry clerk enters it into a form, from which the price is calculated.  Then an automated task charges the credit card.  If the charge does not succeed, a customer service rep contacts the customer to resolve the problem.  Once the charge succeeds, the process books the order in the ERP system, another automated task, and ends by returning a confirmation message to the customer.  If the charge fails and cannot be resolved, the process ends by sending a failure notice to the customer.


In the conventional BPMS scenario, here is the developer's view.  It looks the same except that the simple service tasks have been replaced by subprocesses, and the service providers – the credit card processing and ERP booking services – are shown as black-box pools with the request and response messages visible as message flows.  There are two reasons the service tasks were changed to subprocesses.  The first is to accommodate technical exception handling: what happens if the service returns a fault, or times out?  A system administrator has to intervene, fix the problem, and retry the action.  The BA isn't going to put that in the BPMN, but it needs to be in the solution somewhere.  The second is to allow for asynchronous calls to the services, with separate send and receive steps.  You will also notice that Book order interacts with more than one ERP system.  Don't you wish there were one ERP system that handled everything the customer could buy?  Sometimes there is not, so the process must determine which one to use for each instance.  In fact, an order could have some items booked in system A and other items booked in system B.  The business stakeholders, possibly even the business analyst, may be unaware of these technical details, but the developer must be fully aware of them.


Here is the child level of Charge credit card.  It is invoked asynchronously, submitting a charge request and then waiting for a response.  If the service times out, an administrator must fix the issue and retry the charge.  The service returns either a confirmation if the charge succeeds or an error message if it fails.  Here we modeled this as two different messages; in other circumstances we might have modeled these as two different values of a single message.   If you remember your Method and Style, the child level has two end states, Charge ok and Charge failed, that match up with the gateway in the parent level.


And here is the child level of Book order.  A decision task needs to parse the order and, for each order item, determine whether it is handled by system A or system B.  Then there are separate booking subprocesses to submit the booking request and receive the confirmation for each order item in each system.  Finally an automated task consolidates all the item confirmations into an overall order confirmation.

So you can already see some of the problems with this approach.  The developer's BPMN is no longer recognizable by the business, possibly not even by the BA.  This undermines one of BPMN's most important potential benefits: a common process description shared by business and IT.  Second, the integration details are inside the process model.  Whenever there are changes to the interface of the credit card service, ERP system A, or system B, the process model must be changed as well.  If this process is repeated in various divisions of the company, using different ERP and credit systems, those process models will all be different.  And third, this tight binding of process activities to a SOA-defined interface to specific application systems means the process is manipulating heavyweight business objects that specify many details of no interest to the process.

All three of these problems illustrate what you could call the SOA fallacy in BPM.  In theory, SOA is supposed to maximize reuse of business functions performed on backend systems.  In practice, SOA has succeeded in enabling more consistent communications between processes and these systems, but the reuse as imagined by SOA architects has been difficult to achieve.  The actual reuse by business processes is frequently defeated by variation and change in the specific systems that perform the services.  So, instead the PDA approach seeks to maximize the actual reuse of business-defined functionality provided by services, not across different processes but across variations of the same process, caused by variation and change in the enterprise system landscape.  This is a radical difference in philosophy.

In his book, Volker Stiehl calls this new approach Process-Driven Architecture.  This architecture layers the process design and removes all integration details from the business process model, representing the Process-Driven Application, or PDA. The services specified in the PDA process make no reference to the actual interfaces and endpoints of specific backend systems.  Instead each service in the PDA process defines and references a fixed service contract interface, specifying just the elements needed to perform the required business function, regardless of the actual interface of the backend systems required.  This service contract interface is essentially defined by the business process – by the business, not the SOA architect or integration developer.

The data elements and types used in that interface are based on a canonical data model, not the elements and types specific to a backend system.  Remember, the object is not reuse of SOA endpoints and service interfaces across business processes, but reuse of this particular service contract interface across the system differences found in various divisions of the enterprise and across changes in these systems over time.  Ideally the PDA process, from a business perspective, is universal across the enterprise and stable over time.
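
As a sketch of the idea in Java (interface and type names are hypothetical, not from the book): the service contract interface is owned by the PDA process and expressed purely in canonical business terms; any backend binding must implement it, rather than the process depending on a backend type.

```java
// Hypothetical service contract interface, defined by the PDA process.
// It names only the business function and the canonical data it needs;
// no backend endpoint, WSDL type, or ERP business object appears here.
public interface CreditCardChargeContract {

    /** Canonical request: just the elements the business function requires. */
    record ChargeRequest(String cardholderName, String cardNumber,
                         String expirationDate, double amount) {}

    /** Canonical response: approval status plus a confirmation number. */
    record ChargeResponse(boolean approved, String confirmationNumber) {}

    ChargeResponse charge(ChargeRequest request);
}
```

The SCI layer would supply the implementation of this interface for whichever payment provider a given division actually uses; the PDA process compiles against the contract alone.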

Translation from this stable service contract interface, based on canonical data, to the occasionally changing interfaces and data of real backend systems is the responsibility of the Service Contract implementation layer.  What makes this nice is that BPMN can be used in this layer as well.  Each integration service call from the PDA process is represented in the architecture by a Service Contract Implementation (SCI) process defined in BPMN.  This process effectively binds the system-agnostic call by the PDA process to a specific system or systems used to implement the service.  It performs the data mapping required, issues the requests and waits for responses, and handles technical exceptions.  The PDA process doesn’t deal with any of this.  Moreover, the SCI process is non-invasive, meaning it should not require any change to existing backend systems or existing SOA services.  Everything required to link the PDA to these real systems and services must be designed into the SCI process.

The beauty of this architecture is that, unlike the conventional approach, the PDA process model is the same for the business analyst and the integration developer.  All of the variation and change inherent in the enterprise system landscape is encapsulated in the SCI process; the PDA process doesn’t change.  Effectively the executable process solution becomes truly business-driven.


Here is a diagrammatic representation of the architecture. The steps in a PDA process, modeled in BPMN, represent various user interfaces and service calls. When the service call is implemented by a backend system, a business partner, or an external process, its interface – shown here as the Service Contract Interface – is defined by the PDA, not by the external system or process. For each call to the Service Contract Interface, a Service Contract Implementation process is defined, also in BPMN, to communicate with the backend system, trading partner, or external process, insulating the PDA process from all those details. The Service Contract Interface, based on a canonical data model, defines the interface between the PDA process and the SCI process. This neatly separates the work of the process designer, creating the executable PDA process, from that of the integration designer working in the Service Contract Implementation layer.  Since the PDA process and the SCI processes are both based on BPMN, the simplest thing is to use the same BPM Suite process engine for both, with communication between them using standard BPMN message flows.


Here is what it looks like with our simple order booking process reconfigured using Process-Driven architecture. The details of Charge credit card are no longer modeled in a child level diagram of the business process, but instead are modeled as a separate SCI process.  The charge credit card activity in the PDA process is truly a reusable business service.  It defines the service contract interface using only the business data required: the cardholder name, card number and expiration date, charge amount, return status, and confirmation number. It doesn’t know anything about how or where the credit card service is performed, whether it is performed by a machine or a person, the format of the data inputs and outputs, or the communications to the service provider. All of those integration details could change and the PDA process would not need to change.  The SCI process maps the canonical request to the input parameters of the actual service provider, issues the request, receives the response, maps that back to the canonical response format, and replies to the PDA service task.
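
A minimal sketch of the SCI mapping step in Java (the provider's parameter names and response codes are hypothetical): canonical request in, provider-specific parameters out, and the provider's raw reply mapped back to the canonical response the PDA expects.

```java
import java.util.Map;

// Hypothetical mapping logic inside a Service Contract Implementation process.
// The canonical charge request is translated to the concrete provider's
// parameter names; the provider's raw reply is translated back.
public class ChargeMapper {

    /** Canonical -> provider: this provider happens to want flat string params. */
    public static Map<String, String> toProviderParams(String cardholder,
                                                       String cardNumber,
                                                       String expiry,
                                                       double amount) {
        return Map.of(
            "CC_HOLDER", cardholder,
            "CC_PAN", cardNumber,
            "CC_EXP", expiry,
            "AMT_CENTS", String.valueOf(Math.round(amount * 100)));
    }

    /** Provider -> canonical: reduce the raw reply to a status and confirmation. */
    public static String[] toCanonicalResponse(Map<String, String> providerReply) {
        boolean ok = "00".equals(providerReply.get("RESP_CODE"));
        return new String[] { ok ? "CHARGE_OK" : "CHARGE_FAILED",
                              providerReply.getOrDefault("AUTH_ID", "") };
    }
}
```

If the division later switches payment providers, only this mapping changes; the PDA process and its service contract interface stay untouched.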


If the card service is temporarily unavailable or fails to return a response within a reasonable time period, a system administrator may be required to resolve the problem and resubmit the charge. The business user is not involved in this, and it should not be part of the PDA process. This too is part of the SCI process.  However, if the service returns a business exception, such as invalid credentials or the charge is declined, this must be handled by the business process, so this detail is part of the PDA. And in fact, it should be part of the business analyst’s model worked out in conjunction with the business.

This 2-layer architecture, consisting of a PDA process layer and a Service Contract Implementation layer, succeeds in isolating the business process model from the details of application integration. But there are some problems with it…

  • First, the BPMN engine running the PDA and SCI process must be able to connect directly to all of the backend systems, trading partners, and external processes involved. In many large-scale processes, in particular core processes, this is difficult if not impossible to achieve.
  • Second, a single SCI process may involve multiple backend systems and must be revised whenever any of them changes.
  • And third, things like flexible enterprise-scale communications, guaranteed message delivery, data mapping and message aggregation are handled more easily, reliably, and faster in an enterprise service bus than in a BPMN process. So we’d like to leverage that if possible.

The solution then is to split the SCI layer in two, creating a 3-layer architecture.  The SCI process is divided into a stateful integration process and one or more stateless ESB processes.  Stateless here means short-running and able to run as a single unit of work or transaction: a stateless process cannot include human tasks, waits for a message, or joins, since all of these take time and require maintaining the state of the instance. ESBs are designed to execute such processes very well. A single ESB process can send a message (or possibly N messages all at once) but does not wait for a response; a separate ESB process is instantiated with each response.

The stateful integration process can be long-running, meaning it can contain human tasks, wait for a message, or wait for parallel paths to join.  It can also handle a correlation id, linking an instance of the stateful process to the right process and activity instance in the PDA. The stateless ESB processes cannot do this. More on that in a minute.


To illustrate this let’s look at the activity Book Order, which books the order in the ERP system and generates a confirmation for the customer. Recall that this is what it looked like in our conventional BPMN. We have two ERP systems, and the process needs to look up which system applies to each order item before issuing the booking request. Here you can see some of the defects we have just discussed: The process model must map to the details of system A and system B; the system administrator handling stuck booking requests must be a BPMS user; etc.


Here is what it looks like in the 3-layer PDA architecture. The PDA process is almost exactly as before. The subprocess Book order is simply an asynchronous send followed by a wait for the response, a simple long-running service call. The request message – the order – and the confirmation response message are defined by the PDA, that is, by the business, without regard to the parameters required by the ERP systems. Book order is a reusable business service in the sense that it can be used with any booking system, now or in the future.

The messy integration work is left to the SCI layer, here divided into a stateful integration process and 2 stateless ESB processes, one for sending and the other for processing the response. Here is what they do… Upon receipt of the order message from the PDA process, the SCI process first parses the order and looks up the ERP system associated with each order item. Really it just needs a count of the receivers of the ERP booking request message, so the process knows how many responses to wait for. This could be a service task or a business rule task depending on the implementation. Let’s say this receiver list is simply put in the header of the order message, which is then sent, using a send task, to the stateless ESB send process.

The ESB has the job of dealing with the details of the individual systems. First it splits the order message into separate variables for each system, that is, for each instance of the multi-instance Book in ERP system. For each system, this activity first looks up the interface of the request message, then maps the canonical order data to the system request parameters, providing any additional details required by the system interface, and then sends the ERP booking request to the system. A basic principle of the PDA approach is that the call to the external system or service is non-invasive, i.e. it must not be modified in any way in order to be integrated with the process. The integration process must accommodate its interface as-is.
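
The split step can be sketched in plain Java (the system-lookup rule and SKU naming are hypothetical stand-ins): order items are grouped by their booking system, yielding one request per system, and the number of groups is the receiver count the stateful SCI process later waits on.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Hypothetical sketch of the ESB send process's split step: group order
// items by the ERP system responsible for them. The number of groups is
// the receiver count the stateful SCI process must wait for.
public class OrderSplitter {

    /** Looks up which ERP system books a given item (stand-in rule). */
    static String systemFor(String itemSku) {
        return itemSku.startsWith("HW-") ? "ERP_A" : "ERP_B";
    }

    /** One booking request (here, a list of SKUs) per target system. */
    public static Map<String, List<String>> split(List<String> skus) {
        Map<String, List<String>> bySystem = new TreeMap<>();
        for (String sku : skus) {
            bySystem.computeIfAbsent(systemFor(sku), k -> new ArrayList<>()).add(sku);
        }
        return bySystem;
    }
}
```

In the architecture described above, this lookup lives in the ESB (or in the SCI's receiver-count step), never in the PDA process.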

I have shown the ESB process using BPMN but typically ESBs have their own modeling language and tooling. That’s fine. Since it’s a stateless process by definition, the BPMN is not asking the ESB to do anything that cannot be done in its native tooling.

The ERP system sends back its response, which triggers a second stateless process, ESB Receive. We've marked this as a multi-participant pool, meaning N instances of it will be created for a single order. The ESB does not know the count; each ERP booking response simply triggers a new instance. Now here is something interesting: correlation. We need to correlate the booking response to a particular booking request. In a stateful process you can save a request id and use it to match up with the response, but the ESB processes are stateless: the Send process can't communicate its request id to the Receive process. So the Receive process must parse the order content to uniquely determine the order instance. The receive process must also look up the service interface of the ERP system sending the response and then map the response back to the format expected by the stateful SCI process, which is the same for all of the called systems.
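
Content-based correlation can be sketched in a couple of lines (the key layout is hypothetical): because the stateless receive process cannot remember a request id, both the send side and the receive side apply the same deterministic function to stable business content of the message.

```java
// Hypothetical correlation sketch: the ESB receive process is stateless,
// so it cannot recall a request id from the send side. Instead, both sides
// derive the same key from business content present in request and response.
public class Correlation {

    /**
     * Builds a correlation key from content that appears unchanged in both
     * the booking request and the booking response, e.g. order number plus
     * customer number. Any deterministic, collision-free rule works, as long
     * as the stateful SCI process and the ESB receive process share it.
     */
    public static String keyFrom(String orderNumber, String customerNumber) {
        return orderNumber + "/" + customerNumber;
    }
}
```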

Now back in the stateful integration process, the subprocess Receive booking response receives the message. Because it is stateful, this process can correlate message exchanges, so the message event is triggered only by a receiver response for this particular process instance. This booking service normally completes immediately, so if no response is received in one minute, something is wrong. Here we'll say a system administrator resolves the problem and manually books the order in the external system. Even though this human intervention is required, it is outside the scope of the business user's concerns, and not part of the PDA process. This multi-instance subprocess waits for a message from each receiver. Recall that we derived the count in the first step of this process, so it is quite general: it works for any number of receivers, as long as a receiver can be determined for each order item. This process doesn't even need to know the technical details of the receiver, its endpoint, interface, or communications methods; all of that is delegated to the ESB. The stateful integration process does need to define a way to extract a unique instance id out of the original order message content, as this logic will be used by the ESB Receive process to provide correlation.

Once the ERP booking response is received, it is used to update a cross-reference table. What is that? This is a table that provides a uniform means of confirmation regardless of the physical system used to book each item. Each of those systems will provide a confirmation string in its own format. The Xref table links each system-specific confirmation string with the confirmation string for the order as a whole, in the format defined by the PDA.
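
A minimal sketch of such a cross-reference table (Java; the class and confirmation formats are hypothetical): each system-specific confirmation string is linked to the single canonical confirmation the PDA defines for the order as a whole.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical cross-reference table: maps each backend system's native
// confirmation string to the one canonical order confirmation defined by
// the PDA, so confirmations are uniform however the items were booked.
public class ConfirmationXref {
    private final Map<String, String> systemToCanonical = new HashMap<>();

    /** Record a backend confirmation against the order's canonical one. */
    public void link(String systemConfirmation, String canonicalConfirmation) {
        systemToCanonical.put(systemConfirmation, canonicalConfirmation);
    }

    /** Resolve a backend confirmation to the order-level confirmation. */
    public String canonicalFor(String systemConfirmation) {
        return systemToCanonical.get(systemConfirmation);
    }
}
```

In a real deployment this table would of course be persisted, not held in memory, since the stateful process may be long-running.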

One final detail before we leave this diagram… the message flows. The message flows linking PDA process to the stateful SCI process, as well as those linking the stateful SCI process to the ESB process, are standard message flows as implemented by the BPMS for process-to-process communication within the product and for reading and writing message queues. The message flows between the ESB processes and the backend systems are more flexible. The transport and message format are probably determined by the external system, whether that is a web service call, file transfer, EDI or whatever the ESB can handle. This communications complexity is completely removed from the BPMS, which is the strength of the ESB approach.

There is a bit more to it, but if you are interested, I suggest you get the book.

The post Process-Driven Applications: A New Approach to Executable BPMN appeared first on Business Process Watch.

by bruce at March 18, 2015 09:20 PM

Keith Swenson: ‘Fail fast, fail often’ is essential advice for innovators

Yes, it is a negative statement, but in uttering it, you desensitize the team to a harmful fear of failure.

I am responding today to an article in The Globe and Mail titled “‘Fail fast, fail often’ may be the stupidest business mantra of all time.”  The article criticizes this saying on two points.  First, business people have a hard time saying it, and don’t come across as credible.  Second, the statement focuses on the negative which is … negative.   The author proposed an improved statement: “Succeed fast, adjust or move on.”

Recasting it like this shows that the author does not understand the point of making the statement in the first place.  Psychologists have demonstrated that people naturally have a bias against loss.  In a carefully designed test, people will value a $100 loss as equivalent to a $200 gain.  That is, people are naturally very loss averse, irrationally so.  People also naturally tend to form groups that punish failure as a way to prevent even small failures.

Saying “succeed fast” does not really give the option for failure.  Fear of failure has a powerful inhibiting effect which needs to be countered, particularly in an organization that strives to be innovative.

While success is the goal, there is one thing worse than failure, and that is doing nothing.  If you do nothing, you always lose.  On average, people will come up with good ideas, but not always.  If you fear failure, if you have a culture that punishes failure, then members will not try.  They will wait until they are sure they have a success, and only then act.  Many, many opportunities will be lost, because the risk of failure, though perhaps only a fraction of the benefit of success, prevents action.

When a leader says "fail fast, fail often," they make it clear that failure is a word we can talk about.  Failure is no longer taboo.  It may be hard for them to say it; nobody said that leadership was easy.  They want success, and they don't like talking about failure, but saying it makes clear that the culture would rather see you try and sometimes fail than not try at all.

Some say that you can only learn from your mistakes.  But you can’t learn if you don’t make any mistakes.  If people are too fearful to try, you won’t have a learning organization.

Another Silicon Valley saying is: "Don't ask for permission, ask for forgiveness."  This focuses on the negative as well, but it is essential to the spirit of innovation to make it clear that success is not required 100% of the time, and that action is valued over inaction.

While the 'fail fast, fail often' statement is negative, it inoculates the group against a crippling fear of failure.  Far from being the stupidest mantra of all time, it shows depth of wisdom and skill in leadership.  The writer of that article clearly does not understand the dynamics of an innovative organization.

(See “When Thinking Matters in the Workplace” chapter 4: “Agile Management” on Amazon)

by kswenson at March 18, 2015 03:33 PM

March 16, 2015

Sandy Kemsley: Effektif BPM Goes Open Source

On a call with Tom Baeyens last week, he told me about their decision to turn the engine and APIs of Effektif BPM into an open source project: not a huge surprise since he was a driver behind two...

[Content summary only, click through for full article and links]

by sandy at March 16, 2015 11:05 AM

March 13, 2015

Drools & JBPM: Reactive Incremental Graph Reasoning with Drools

Today Mario got a first working version of incremental reactive graphs in Drools. This means people no longer need to flatten their models to a purely relational representation to get reactivity. It provides a hybrid language and engine for both relational and graph-based reasoning. To differentiate between relational joins and reference traversal, a new XPath-like notation was introduced that can be used inside of patterns. Like XPath, it supports collection iteration.

Here is a simple example that matches against all men in the working memory:
Man( $toy: /wife/children[age > 10]/toys )

For each man it navigates the wife reference and then the children reference; the children reference is a list. For each child in the list who is over ten, it navigates to that child's toys list. With the XPath notation, if the leaf property is a collection it is iterated, and the variable binds to each iteration value. If there are two children over the age of 10 who have 3 toys each, it will execute 6 times.
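
The traversal semantics can be mirrored in plain Java (the model classes below are hypothetical stand-ins for the domain model the rule assumes): the pattern binds once per toy of each child over ten, so two qualifying children with three toys each yield six bindings. Note that Drools evaluates this reactively and incrementally; the loop only illustrates how many bindings result.

```java
import java.util.ArrayList;
import java.util.List;

// Plain-Java equivalent of the traversal in
//   Man( $toy: /wife/children[age > 10]/toys )
// one binding per toy of each child over ten (model classes are hypothetical).
public class TraversalDemo {
    record Toy(String name) {}
    record Child(int age, List<Toy> toys) {}
    record Wife(List<Child> children) {}
    record Man(Wife wife) {}

    public static List<Toy> bindings(Man man) {
        List<Toy> result = new ArrayList<>();
        for (Child child : man.wife().children()) {
            if (child.age() > 10) {
                result.addAll(child.toys());   // one binding per toy
            }
        }
        return result;
    }
}
```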

As it traverses each reference, a hook is injected to support incremental reactivity. If a child is added or removed, or if an age changes, the engine propagates the incremental changes. These hooks are added and removed as needed, which keeps the approach efficient and lightweight.

You can follow some of the unit tests here:

It's still very early pre-alpha work, but I think it is exciting.

by Mark Proctor at March 13, 2015 11:34 PM

March 12, 2015

Thomas Allweyer: Continuous Process Management Is Still Often Lacking

Anyone who regularly deals with process management will not really be surprised by this result: while more and more companies have introduced measures to improve their processes, they pay far less attention to controlling and further developing their process management itself. The recently published study "Reifegrad des Geschäftsprozessmanagements 2015" drew 216 participants from the German-speaking region, who on average have been working in process management for nine years. The survey is based on a maturity model developed by iProcess with five maturity levels. On average, the companies reached level two. There is thus still considerable room for development: in many places the focus remains primarily on modeling and analyzing processes, not on continuously reviewing and evolving process management. Companies may achieve "quick wins" through concrete process improvements, but the far greater potential of a fully closed process management cycle goes unused.

Interestingly, no clear correlation between company size and process management maturity could be found. Small and medium-sized enterprises (SMEs) can certainly keep up with much larger organizations: flat hierarchies and closer customer contact across the board make it easier for SMEs to manage their processes. There were, however, clear differences between industries. Maturity is particularly high in real estate and retail, and the transport sector and banks are also quite well positioned. The study's authors attribute this to these industries' highly personnel- and knowledge-intensive processes, combined with strong competitive pressure. It also emerged that companies with many distributed branches show higher process maturity, since process standardization is especially important to them.

Minonne, C.; Koch, A.; Ginsburg, V.:
Reifegrad des Geschäftsprozessmanagements 2015. Eine empirische Untersuchung.
iProcess AG (Ltd.) Luzern 2015
Reading sample and ordering information available here.

by Thomas Allweyer at March 12, 2015 01:01 PM

March 11, 2015

Sandy Kemsley: KofaxTransform 2015 In Pictures

As I prepared to depart Las Vegas, I flicked through some of my photos from the past couple of days and decided to share. First, the great work of the ImageThink team of graphic recorders:   ...

[Content summary only, click through for full article and links]

by sandy at March 11, 2015 07:30 PM

March 10, 2015

Sandy Kemsley: Analytics For Kofax TotalAgility With @Altosoft

Last session here at Kofax Transform, and as much as I’d like to be sitting around the pool, I also like to squeeze every bit out of these events, and support the speakers who get this most...

[Content summary only, click through for full article and links]

by sandy at March 10, 2015 11:15 PM

Sandy Kemsley: Smarter Processes With Kapow Integration

I’m in a Kofax Transform breakout session on Kapow Integration together with KTA; I missed documenting the first part of the session when my Bluetooth keyboard stopped talking to my Android...

[Content summary only, click through for full article and links]

by sandy at March 10, 2015 09:43 PM

Sandy Kemsley: Process Intelligence at KofaxTransform

It’s after lunch on the second (last) day of Kofax Transform, and the bar for keeping my attention in a session has gone up somewhat. To that end, I’m in a session with Scott Opitz and...

[Content summary only, click through for full article and links]

by sandy at March 10, 2015 08:47 PM

Sandy Kemsley: Kofax Claims Agility SPA

Continuing with breakout sessions at Kofax Transform is a presentation on the Claims Agility smart process application that Kofax is creating for US healthcare claims processing, based on the KTA...

[Content summary only, click through for full article and links]

by sandy at March 10, 2015 06:45 PM

Sandy Kemsley: TotalAgility Product Update At KofaxTransform

In a breakout session at Kofax Transform, Dermot McCauley gave us an update on the TotalAgility product vision and strategy. He described five vital communities impacted by their product innovation:...

[Content summary only, click through for full article and links]

by sandy at March 10, 2015 06:17 PM

Sandy Kemsley: KofaxTransform 2015: Day 2 Customer Keynotes

I had a chance to hear Tom Knapp from Waterstone Mortgage speak yesterday at the analyst briefing here at Kofax Transform, and we have him to kick off this morning’s keynote. They started their...

[Content summary only, click through for full article and links]

by sandy at March 10, 2015 04:21 PM

Drools & JBPM: UF Dashbuilder - Activity monitoring in jBPM

syndicated from

Last week, the jBPM team announced the 6.2.0.Final release (announcement here). In this release (like in previous ones) you can author processes, rules, data models, forms and all the assets of a BPM project. You can also create or clone existing projects from remote Git repositories and group such repositories into different organizational units. Everything can be done from the jBPM authoring console (aka KIE Workbench), a unified UI built using the Uberfire framework & GWT.

In this latest release, they have also added a new perspective for monitoring the activity of the source Git repositories and organizational units managed by the tooling (see screenshot below). The perspective itself is just a dashboard displaying several indicators about the commit activity. From the dashboard controls it is possible to:

  • Show the overall activity on our repositories
  • Select a single organizational unit or repository
  • List the top contributors
  • Show only the activity for a specific time frame

  In this video you can see the dashboard in action (do not forget to select HD).

Contributors Perspective

  Organizational units can be managed from the menu Authoring>Administration>Organizational Units. Every time an organizational unit is added or removed, the dashboard is updated.

Administration - Organizational Units 

   Likewise, from the Authoring>Administration>Repositories view we can create, clone or delete repositories. The dashboard will always feed from the list of repositories available.

Administration - Repositories

   As shown, activity monitoring in jBPM can be applied not only to the process business domain but also to the authoring lifecycle, in order to get a detailed view of the ongoing development activities.

How it's made

The following diagram shows the overall design of the dashboard architecture. Components in grey are platform components, blue ones are specific to the contributors dashboard.

Contributors dashboard architecture

  These are the steps the backend components take to build the contributors data set:

  • The ContributorsManager asks the platform services for the set of available org. units & repos. 
  • Once it has such information, it builds a data set containing the commit activity.
  • The contributors dataset is registered into the Dashbuilder's DataSetManager.

   All the steps above are executed at application start-up time. Once running, the ContributorsManager also receives notifications from the platform services about any changes to the registered org. units & repositories, so that the contributors data set is kept in sync.
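The aggregation step above can be sketched in plain Java. Note that `CommitRecord` and the method below are illustrative stand-ins for this post, not the actual Dashbuilder/jBPM classes:

```java
import java.util.*;
import java.util.stream.*;

// Hypothetical sketch of the ContributorsManager flow: collect commit records
// from each repository and aggregate them into a flat "contributors" data set.
// CommitRecord and commitsPerAuthor are illustrative, not the real API.
public class ContributorsDataSet {

    record CommitRecord(String orgUnit, String repo, String author) {}

    // Step 2 of the flow: turn raw commit records into per-author commit counts,
    // the kind of indicator the dashboard's Displayers would feed from.
    static Map<String, Long> commitsPerAuthor(List<CommitRecord> commits) {
        return commits.stream()
                .collect(Collectors.groupingBy(CommitRecord::author, Collectors.counting()));
    }

    public static void main(String[] args) {
        List<CommitRecord> commits = List.of(
                new CommitRecord("example-org", "jbpm", "alice"),
                new CommitRecord("example-org", "jbpm", "alice"),
                new CommitRecord("example-org", "dashbuilder", "bob"));
        // prints the per-author commit counts for the sample data
        System.out.println(commitsPerAuthor(commits));
    }
}
```

In the real implementation this data set would then be registered with Dashbuilder rather than printed.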

   From the UI perspective, jBPM's contributors dashboard is an example of a hard-coded dashboard built using the Dashbuilder Displayer API, which was introduced in this blog entry. The ContributorsDashboard component is just a GWT composite widget containing several Displayer instances feeding from the contributors data set.

   (The source code of the contributors perspective can be found here)

    This has been a good example of how to leverage the Dashbuilder technology to build activity monitoring dashboards. In the future, we plan to apply the technology to other areas within jBPM, such as an improved version of the jBPM process dashboard. We will keep you posted!

by Mark Proctor at March 10, 2015 03:24 PM

March 09, 2015

Drools & JBPM: Zooming and Panning between Multiple Huge Interconnected Decision Tables

Michael has started the work on revamping our web based decision tables. We've been experimenting with HTML5 canvas with great results, using the excellent Lienzo tool. First we needed to ensure we could scale to really large decision tables, with thousands of rows. Secondly we wanted to be able to pan and zoom between related or interconnected decision tables. We'll be working towards Decision Model and Notation support, which allows networked diagrams of decision tables.

You can watch the video here, don't forget to select HD:

Notice in the video that, while you can manually pan and zoom, there are also links between tables; selecting a link animates the pan and zoom to the linked location. From 25s to 47s the video shows that we can have a really large number of rows and keep excellent performance, while 55s shows the pan speed with these large tables. Initially the example starts with 50% of cells populated; at 1m in we change that to 100% populated and demonstrate that performance is still excellent.
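The pan-and-zoom behaviour shown in the video boils down to a small amount of affine-transform math. Here is a minimal illustrative sketch (plain Java, not Lienzo's actual API): world coordinates map to screen coordinates via `screen = world * zoom + pan`, and zooming about a fixed screen point (the mouse position, or the centre of a linked table) adjusts the pan so that point stays put:

```java
// Minimal sketch of canvas pan/zoom math; illustrative only, not Lienzo code.
public class PanZoom {
    double zoom = 1.0, panX = 0.0, panY = 0.0;

    // Map a world-space point to screen space under the current view.
    double[] toScreen(double wx, double wy) {
        return new double[] { wx * zoom + panX, wy * zoom + panY };
    }

    // Zoom by 'factor' while keeping the screen point (sx, sy) fixed.
    void zoomAt(double sx, double sy, double factor) {
        panX = sx - (sx - panX) * factor;
        panY = sy - (sy - panY) * factor;
        zoom *= factor;
    }

    public static void main(String[] args) {
        PanZoom view = new PanZoom();
        view.zoomAt(100, 100, 2.0);            // zoom in 2x, anchored at (100, 100)
        double[] anchor = view.toScreen(100, 100);
        double[] p = view.toScreen(50, 50);
        // anchor stays at (100, 100); (50, 50) is pushed out to (0, 0)
        System.out.println(anchor[0] + "," + anchor[1] + " / " + p[0] + "," + p[1]);
    }
}
```

The animated link-following in the video can then be done by interpolating `zoom`, `panX` and `panY` between the current view and the target table's view over a few frames.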

by Mark Proctor at March 09, 2015 11:55 PM

Sandy Kemsley: Kofax Altosoft For Operational Intelligence

Wayne Chambliss and Rich Rabin of Kofax Altosoft gave a presentation at Kofax Transform, most of which was a demo, on becoming an operational intelligence guru. This is my first real look at the...

[Content summary only, click through for full article and links]

by sandy at March 09, 2015 09:48 PM

Sandy Kemsley: Tablets And Digital Signatures At AIA Life

Just to maximize confusion, we have a second AIA at the Kofax Transform conference: this morning, Aia referred to the customer communications management company recently acquired by Kofax; this...

[Content summary only, click through for full article and links]

by sandy at March 09, 2015 08:44 PM

Sandy Kemsley: Kofax Analyst Briefing And Portfolio Update

Following the Kofax Transform day 1 keynotes, we had a separate session for financial and industry analysts to be briefed on the products and financials. After a brief introduction from Reynolds...

[Content summary only, click through for full article and links]

by sandy at March 09, 2015 08:03 PM

Insights from German flagship conference Wirtschaftsinformatik 2015

Last week, the team visited the conference Wirtschaftsinformatik 2015 in Osnabrück. The topic of this year’s conference was Smart Enterprise Engineering.

Over three days, several research and business tracks gave visitors insights into emerging trends in information systems. Furthermore, different companies (e.g., Thyssen Krupp or SAP) presented their next steps toward achieving a digitalized business.

For example, Thyssen Krupp wants to use big data and digitalization to revolutionize its elevator business. In the future, elevators won’t travel solely vertically but also horizontally. This video was shown in the keynote. We found it very impressive, and it fits very well with our scheduled innovation workshop “BPM meets the Innovation Helix“, so we want to share it with you:

Besides the business tracks, the German research community presented its recent work. The papers presented dealt with Business Process Management, Information Systems Usage, and Social Media and Collective Intelligence.

We are proud that one of our team members also presented her work at the conference. Janina Kettenbohrer talked about the impact of employees’ attitudes toward their jobs on the acceptance of business process standardization. She and her two colleagues, Dr. Andreas Eckhardt and Prof. Dr. Daniel Beimborn, developed a theoretical model which explains how job-related attributes (e.g., autonomy or skill variety), work-role fit, co-worker relations, and the wider process environment influence employees’ perception of the meaningfulness of their work and, consequently, their acceptance of process standardization. If you are interested in Janina’s latest work, you can find her paper here:

If you’re interested in testing the model in your organization and in finding out how to successfully implement process standards, please contact Janina.

by Mirko Kloppenburg at March 09, 2015 08:02 PM

Sandy Kemsley: Kicking off KofaxTransform 2015: Day 1 Keynotes

I’m in Vegas for a couple of days for the Kofax Transform conference. Kofax has built their business beyond their original scanning and capture capabilities (although many customers still use...

[Content summary only, click through for full article and links]

by sandy at March 09, 2015 04:38 PM

Drools & JBPM: jBPM 6.2.0.Final released

The bits for the jBPM 6.2 release are now available for you to download and try out!

Version 6.2 comes with a few new features and a lot of bug fixes! New features include, among others, EJB support, (improved) OSGi and Camel endpoint support, a new asset management feature (introducing a development and a release branch and promoting assets between them), social profiles and feeds, and the ability to extend the workbench with your own plugins!

More details below, but if you want to jump right in:

Release Notes

Ready to give it a try but not sure how to start?  Take a look at the jbpm-installer chapter.

jBPM 6.2 is released alongside Drools (for business rules) and OptaPlanner (for planning and constraint solving). Check out the new features in the Drools release blog, including a brand new rules execution server, as well as the OptaPlanner release blog.

A big thank you to everyone who contributed to this release!

Some highlights from the release notes.

Core services

  • EJB: the jBPM execution server (that is for example embedded in our web-based workbench) now also comes with an EJB interface.  A refactoring of the underlying jbpm-services now makes the execution services accessible using pure Java, CDI, EJB and Spring. Remote interfaces using REST and JMS are still available as well, of course!  A lot more details are described in Maciej's blog here.
  • Deployments (defining which versions of which projects are currently active in the execution server) are now by default stored in the database.  This greatly simplifies the architecture in a clustered environment in case you are only using the runtime side of our web tooling (for example by having dedicated execution servers in production).
  • Our asynchronous job executor has improved support for requeuing failed jobs and for recurring jobs (e.g. daily tasks).
  • OSGi: Full core engine functionality is now available on top of OSGi.  A significant number of additional jars (including for example the human task service, the runtime managers, full persistence, etc.) were "OSGi-fied". Specific extensions and tests showing it in action are available for Apache Karaf and Aries Blueprint (in the droolsjbpm-integration repository).
  • Camel endpoint URIs: A new out-of-the-box service task has been implemented for using Apache Camel to connect a process to the outside world via some of the numerous Camel endpoint URIs. The service task allows you, for example, to specify how to pass data to an FTP endpoint by configuring properties such as hostname, port, username, payload, etc. for some common endpoints like (S)FTP, File, JMS, XSLT, etc., but you can use virtually any of the available endpoints by defining the URI yourself.

  • Form Modeler comes with improved support for adding custom logic to your forms using JavaScript on change events, support for configurable ComboBox and RadioGroup fields, and simple List types.
  • Asset management: It is now possible to make a repository a "managed repository".  This allows you to split up a repository into multiple branches, one for development and one for releasing.  Users can then request various assets to be promoted to the release branch when ready.  This promotion process, and the linked build and deploy processes, are defined using a BPMN2 process as well and include approval and build tasks.  Check the documentation for more details.

  • Social features, like user profiles (including gravatar pictures) and various event feeds, such as the most recent assets you worked on or recent changes by other users.

  • Contributors perspective is a new out-of-the-box report (using the new Dashbuilder technology) that gives high-level insight into who is changing what in your repositories.
  • Pluggable workbench: you can now extend the workbench with your own views, menus, etc. using workbench plugins. Available features include creating perspectives via a programmatic or drag-and-drop interface, and creating new screens, editors, splash screens and dynamic menus.
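To make the Camel endpoint item above concrete: the endpoint properties (hostname, port, username, and so on) ultimately compose into a Camel endpoint URI of the form `scheme://user@host:port/path?options`. The builder below is a hypothetical illustration of that composition, not the actual jBPM service task code:

```java
// Illustrative sketch: composing FTP endpoint properties into a Camel-style
// endpoint URI. The ftpUri helper is hypothetical; only the resulting URI
// format follows Camel's scheme://user@host:port/path?options convention.
public class CamelEndpointUri {

    static String ftpUri(String user, String host, int port, String dir, String options) {
        String uri = "ftp://" + user + "@" + host + ":" + port + "/" + dir;
        return options.isEmpty() ? uri : uri + "?" + options;
    }

    public static void main(String[] args) {
        // prints: ftp://jbpm@example.com:21/inbox?passiveMode=true
        System.out.println(ftpUri("jbpm", "example.com", 21, "inbox", "passiveMode=true"));
    }
}
```

The service task lets you supply the common properties individually, but as the release notes say, you can also hand it a complete URI string like the one above for any other endpoint type.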

by Kris Verlaenen at March 09, 2015 02:39 PM

Sandy Kemsley: Software AG Analyst Day: The Enterprise Gets Digital

After the DST Advance conference in Phoenix two weeks ago, I headed north for a few days vacation at the Grand Canyon. Yes, there was snow, but it was lovely: Back at work, I spent a day last week in...

[Content summary only, click through for full article and links]

by sandy at March 09, 2015 12:42 PM

March 06, 2015

Drools & JBPM: Drools 6.2.0.Final Released

We are happy to announce the latest and greatest Drools 6.2.0.Final release.

This release in particular had a greater focus on improved usability and features that make the project easier to use (and adopt). Lots of improvements to the workbench UI, support for social activities and plugin management, as well as a brand new Execution Server for rules are among the new features.

Improved Wizards

Execution Server Management UI

Social activities

Contributors dashboard

Perspective editors

Here are a few links of interest:

We would like to take this opportunity to thank all the community members for their contributions to this release, and also JetBrains and Syncro Soft for the open source licenses to their products that greatly help our developers!

Happy drooling!


by Edson Tirelli at March 06, 2015 03:54 PM

March 05, 2015

Thomas Allweyer: FireStart Can Do Both – End-to-End Business Modeling and Process Execution

Usually, separate systems are used for business-level process modeling and for process execution. While some BPMS vendors also offer value chain diagrams and the like, their capabilities for business-level process documentation and analysis usually remain far behind those of dedicated process modeling tools. The FireStart BPM Suite from Prologics is a positive exception. The platform enables collaborative modeling in a user-friendly graphical modeling environment with the familiar look and feel of office products. Models are stored in a central repository. Publishing models to a process portal and generating process handbooks are supported, as are version management and audit-proof storage of the models. The process portal, which can be customized per role, has a modern interface built with HTML5. The powerful search and further functions are invoked via overlay menus that appear on demand, familiar from tools such as Google Maps.

In addition to processes, process maps, organizational charts, data models, IT landscapes, risks, and more can be modeled and easily linked to the activities in the process models, enabling fully integrated enterprise modeling. Particularly in the context of BPMN models this is not a given; even prominent modeling platforms often show weaknesses here, as this study demonstrates. The highlight: when displaying process models, you can switch at any time between BPMN and EPC notation and continue modeling in either one. In the EPC view, assigned organizational units, IT systems, and the like are displayed as separate objects connected to the respective activities by arrows. In the BPMN view, small icons in the activity symbols indicate which other object types are linked. This ability to switch between BPMN and EPC should particularly increase acceptance in business departments, many of which are used to the EPC notation.

International companies will appreciate the integrated translation function, which can translate the models into a large number of languages without further effort. Even if automated translation is rarely perfect, it greatly eases understanding of the models across national subsidiaries.

Special process views are available for process analysis. For instance, the timeline of a process can be displayed as a Gantt chart. If you change the durations of individual process steps, the effect on the overall cycle time becomes immediately visible. Since FireStart, unlike pure modeling tools, can also execute processes, such a cycle time analysis is possible not only on the basis of estimated values but also based on real data from executed process instances. In a matrix view, process costs can be analyzed, and notes on weak points and improvement suggestions can be attached to individual activities.
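In the simplest case of a strictly sequential process, the cycle time analysis described above amounts to summing the step durations, so changing one step's duration shifts the total one-for-one. A minimal illustrative sketch (not FireStart code; real Gantt analysis also handles parallel branches):

```java
import java.util.*;

// Illustrative sketch of sequential cycle time analysis: the end-to-end
// cycle time is the sum of the step durations (in days here), so a what-if
// change to one step is reflected directly in the total.
public class CycleTime {

    static int totalCycleTime(Map<String, Integer> stepDurations) {
        return stepDurations.values().stream().mapToInt(Integer::intValue).sum();
    }

    public static void main(String[] args) {
        Map<String, Integer> steps = new LinkedHashMap<>();
        steps.put("Check application", 2);
        steps.put("Approve", 1);
        steps.put("Notify customer", 1);
        System.out.println(totalCycleTime(steps)); // prints 4
        steps.put("Approve", 3);                   // what-if: approval takes longer
        System.out.println(totalCycleTime(steps)); // prints 6
    }
}
```

Feeding the same calculation with measured durations from executed process instances, instead of estimates, is what distinguishes an executing suite from a pure modeling tool here.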

The integrated modeling of processes, organizational charts, data, and so on is not limited to a business-level perspective; it is also used in the transition to process execution. For instance, the business data objects are enriched with technical details so that they can hold concrete data during process execution. Concrete users are assigned to the organizational units from the org chart, and modeled IT systems are backed with interface and invocation information. The execution-related configuration of the models is made easy for the modeler in many places: if you drag a data object onto a user task, for example, its data fields become directly available in that task's form.

The integration of business-level process modeling in the same tool also pays off during process execution. The portal used by the people working on a process is the same one used to publish the process models, so the process documentation can be consulted at any time while carrying out a process. In addition, running process instances can be tracked in the process model. The portal has a responsive design, so tasks can also be handled comfortably on tablets and smartphones, including gesture control. Integration with Microsoft SharePoint and Outlook is offered as well: employees can receive their tasks in their familiar inbox and complete the associated forms entirely within Outlook, without having to switch to the separate portal. In general, FireStart plays to its strengths in its integration with Microsoft products, but connectors to SAP and other systems are also available, and of course various standards such as web services are supported as well.

In the recent BPM study by Fraunhofer IESE, FireStart finished among the leaders, scoring above average in most of the categories examined. That it came out ahead of all other BPM systems particularly in the "process modeling" category is no surprise, given the functional breadth of its modeling component.

by Thomas Allweyer at March 05, 2015 08:39 AM

March 02, 2015

Invitation to “BPM meets the Innovation Helix” Workshop

„Quo Vadis, BPM?“ – this was the title of the keynote speech held by Dr. Bernhard Krusche at our recent Process Management Conference, and most of the conference participants agreed that the challenges of the digital transformation of organizations will also challenge BPM.

Dr. Krusche’s idea of combining tools from successful innovation processes with structured BPM started a discussion on what this fusion of classical BPM and new innovation methodologies could look like.

Thus, we decided to set up a workshop to explore the “Innovation Helix”, which was invented by Dr. Krusche and Prof. Sonja Zillner, and match it with the BPM life cycle.

To learn more about this innovation workshop, please check the event details…

by Mirko Kloppenburg at March 02, 2015 09:45 PM

February 27, 2015

Thomas Allweyer: IT Strategy Survey Launched

Last year, a study by Scheer Management came to the sobering conclusion that many companies fail to actually implement their corporate strategies: at the operational level, hardly anything of what had been devised as strategy at the executive level actually arrived. Now the Saarbrücken-based consultancy has launched a new survey, this time on the topic of IT strategies. It examines which elements IT strategies contain in practice and how they are communicated and implemented.

The survey is aimed at all management levels and industries. Answering the questions takes about 15 minutes. You can take part via this link.

by Thomas Allweyer at February 27, 2015 07:50 AM

February 26, 2015

Bruce Silver: BPM at IBM InterConnect

It would be unfair to say there was absolutely nothing on BPM at IBM’s InterConnect conference, which took place this week in Las Vegas… but it would not be far from the truth. InterConnect is the supposed successor to IBM’s annual Impact middleware event where “Smarter Process” – IBM’s term for BPM and decision management – has always played a large role. I say “supposed” because the new mega-event, triple the size of Impact and split between 2 hotels a mile apart, was such a logistical debacle that I seriously doubt they will try it this way again.

InterConnect is officially about Cloud, Mobile, DevOps, and Security. Middleware is sort of there but well below the fold. The overarching theme this year was “hybrid cloud” – new apps combining services in public and private clouds, even behind the firewall – based on IBM’s new strategic platform-as-a-service called Bluemix. Even though Bluemix is new in the past year, they never really explained what it is. Here is what it says on the website:

“IBM Bluemix is the cloud platform that helps developers rapidly build, manage and run web and mobile applications. Based on the open source architecture of Cloud Foundry, Bluemix provides the flexibility to integrate development frameworks, languages and services that suit your needs. Develop applications using Web IDE and Eclipse – while storing your code directly on Bluemix or GitHub. Bluemix is based on Cloud Foundry, an open source project, and features additional runtimes and services from the open source community. This makes Bluemix a great place to build and run applications that leverage technology and innovation from the open source developer community. Bluemix not only offers developers a broad range of IBM, third-party and open source APIs and services, but it integrates with many of the developer tools you already use today. By abstracting lower-level infrastructure components, Bluemix enables you to spend more of your time and talent writing the code that will differentiate your app and drive user adoption and engagement. Build apps and services for free in the first 30 days. Enjoy the free tier even after the trial ends, and pay only for what you use. No credit card is required to get started.”

If this is truly the new strategic direction, it’s clearly not what we normally think of as IBM. The whole event affected that open source/hacker/developer-centric tone, and honestly, it was kind of interesting and refreshing. The problem for me was that BPM does not seem to play in this brave new world.

The main tent demos all reprised the old “systems of engagement” theme, in which the app uses some combination of mobile, social, and analytics technology to lure unsuspecting mall dwellers into buying something. Only now it’s on Bluemix!

In the new IBM, it’s all about customer-facing apps on phones, not cross-functional business processes. It’s about writing code, not model-driven development. This revolution, they tell us, will be hacker-led, not business-empowering. All those old BPM values and principles, apparently, are yesterday’s news.

But it seems to me that BPM – the technology, if not the IBM product – could have a valuable role to play here. These new engagement apps, for example, depend heavily on events, decisions, and analytics, all technologies central to IBM’s Smarter Process portfolio, but not really integrated with the BPM product. IBM Decision Server Insights, based on ODM, for example, introduces “rich time modeling, reasoning and analytics to detect and respond to intricate patterns and trends; innovative global analytics to extract valuable insights over populations of business entities in real-time; and generalized, business-friendly modeling over all aspects of the decision model design.” Yes, this is exactly in tune with the new direction, but it is not integrated with BPM. If you ask why (and I did), the answer is always “our customers are not asking for it.”

And how does BPM fit into Bluemix? It doesn’t (yet), but Bluemix does include a Workflow service:

“Workflow for Bluemix makes it easy for you to create workflows that orchestrate and coordinate the REST-based services that you use in your apps. The JavaScript based Workflow language lets you define interactions between any services. By off-loading all the service interactions to the Workflow service, your application becomes easier to understand, maintain and evolve. Your workflows are run and managed in a robust and scalable way, regardless of whether your workflow and services run for milliseconds or days.”

Hmmm… To me this sounds like an updated version of the Windows Workflow Foundation, a set of programmer components that Microsoft put into the .NET Framework several years back. Embedding workflow in the OS!  An obvious win, right?  It might be workflow automation but it’s not BPM as we know it.

So what should a renovated IBM BPM on Bluemix, consistent with the new strategy, look like? If they asked me (which they have not), it would include the following:

  • Model-driven, instant playback, business-empowered process design… all those old Lombardi values that rescued IBM BPM in the first place!
  • Event-aware continuous query engine, like Decision Server Insights, but able both to trigger BPMN events and to receive and process events generated by the process engine and BAM.  And business-friendly modeling tools to go with it.
  • An enhanced Coach Designer that makes performer-facing tasks look just as engaging, powerful, and mobile-enabled as the customer-facing apps IBM is showing today in the main tent.
  • Complete unification of case management with structured BPM.  None of the current “basic” case management baloney.

Before closing, I have to say that I did see one BPM thing at InterConnect that was new and interesting, an executable version of Blueworks Live.  Blueworks Live has long had a simple automated workflow capability, but it was totally disconnected from the BPMN modeling piece.  The new version actually executes the BPMN model, using activity inputs and outputs – properties currently provided by the tool – to autogenerate task forms and to serve as process variables.  The BPMN activities today are simple human tasks, but I believe some kind of service invocation scenarios are planned.  You can step through the process in “test mode,” just like the Playback feature in IBM BPM. It is really pretty cool.   I’m not sure when this new version will be available or what it costs, but I suspect it could take some business away from BPM Standard (the one without Process Server underneath).  It’s definitely not a toy.


The post BPM at IBM InterConnect appeared first on Business Process Watch.

by bruce at February 26, 2015 11:11 PM

February 25, 2015

Sandy Kemsley: Capital Raising Through Crowdfunding

Nicholas Doyle of DST gave a presentation on crowdfunding: an interesting topic to cover at a conference attended primarily by old-school financial services companies, who are the ones most likely to...

[Content summary only, click through for full article and links]

by sandy at February 25, 2015 10:10 PM

Sandy Kemsley: AXA And The Digital Enterprise

Day 2 at DST ADVANCE 2015, and I’m attending a panel of three people from AXA on how their journey to becoming a digital insurance business. They define digital business as new ways of engaging...

[Content summary only, click through for full article and links]

by sandy at February 25, 2015 07:16 PM

February 24, 2015

Sandy Kemsley: Innovations In AWD User Experience

To finish off the first morning at DST ADVANCE 2015, I attended the session on customer and work experience, which was presented as a case study of background investigations on a security-sensitive...

[Content summary only, click through for full article and links]

by sandy at February 24, 2015 06:43 PM

Sandy Kemsley: AWD 2015 Product Strategy

Roy Brackett and Mike Lovell from DST’s BPS (Business Process Solutions) product management gave us a review of what happened in 2014 and an update on their 2015 product strategy, following on...

[Content summary only, click through for full article and links]

by sandy at February 24, 2015 05:46 PM

Sandy Kemsley: Kicking Off #DSTAdvance15 – DST Update From @JCV816

Conference season always brings some decisions and conflicts, and this year’s first one (for me) came down to a decision between DST‘s ADVANCE in Phoenix, and IBM InterConnect in Las...

[Content summary only, click through for full article and links]

by sandy at February 24, 2015 04:47 PM

Thomas Allweyer: New Edition of Basiswissen Geschäftsprozessmanagement – with BPMN 2.0

The book serves as preparation for the OMG’s “Certified Expert in Business Process Management” examination. When the first edition was published five years ago, version 2.0 of the BPMN process modeling notation was still a work in progress, which is why the certificate was based on BPMN 1.2 at the time. The certification program has since been updated, which also made a new edition of the book necessary. The most important change is therefore the coverage of BPMN 2.0.

The remaining content has hardly changed compared to the first edition; only minor additions were made, e.g. on BPM systems and executable process models. If you use the book as a reference or as a compact introduction to the OMG’s view of BPM, you can continue to use the first edition. If you are preparing for the certification, you should get the current version. A review of the first edition can be found here.

Weilkiens, T.; Weiss, C.; Grass, A.; Duggen, K.:
Basiswissen Geschäftsprozessmanagement: Aus- und Weiterbildung zum OMG Certified Expert in Business Process Management 2 (OCEB 2) – Fundamental Level. 2nd edition.
dpunkt, Heidelberg 2015
The book on Amazon

by Thomas Allweyer at February 24, 2015 09:21 AM

February 20, 2015

Thomas Allweyer: BPM News from Switzerland

A recently published special of the Handelszeitung examines the state of process management in Switzerland. Given the challenges posed by the sharply appreciated Swiss franc, efficient processes are likely to become even more important for many companies. A study by the Zurich University of Applied Sciences (ZHAW) finds that awareness of BPM has grown considerably in recent years, with process automation also gaining in importance. And small and medium-sized enterprises by no means need to hide behind large corporations when it comes to introducing process management. In an interview, Karlheinz Baumann, COO of the watch manufacturer IWC, names the transparency of business processes and the increased speed of adaptation in transformation projects as the most important benefits. A detailed article looks at the situation of energy utilities, where numerous positive examples can be found, particularly in modeling customer-facing processes. Nevertheless, many of the country’s 680 electricity providers still have some catching up to do.

In his contribution, Markus Fischer of Axon Ivy emphasizes the role that intelligent business process management systems (iBPMS) play in realizing new, digitized business models. This requires integrating business rules and services from a wide variety of data sources into complex or repetitive processes, and in real time. Big data analytics, cloud applications, and social media can also be incorporated. He illustrates this with the example of an online shop that integrated new payment methods very flexibly: a real-time credit check was integrated via the BPMS, significantly reducing the risk associated with payment on invoice.

Further articles cover, among other things, training and certification in process management, as well as a study on process maturity in companies in the German-speaking region.

The Handelszeitung’s BPM special is available at this link. On March 5, the Swiss BPM Forum will take place in Zurich, this year under the motto “Business Process Innovations – Die Treiber der Digitalen (R)Evolution”.

by Thomas Allweyer at February 20, 2015 11:11 AM

February 19, 2015

Drools & JBPM: Submit your work to the 9th International Web Rule Symposium (RuleML 2015)

Dear CEP Colleagues,

I would like to encourage you to submit your research in the field of rules, rule-based reasoning, and its applications to RuleML 2015.

It is an excellent opportunity for a high-impact conference publication; e.g., RuleML is among the top 100 venues for impact factor in CiteSeerX.

Also note that there are several additional collocated events, e.g. the Recommender Systems for the Web of Data Challenge, the Doctoral Consortium, the 9th International Rules Challenge with competitive prizes for the best rule base, the Reasoning Web Summer School, RR 2015, and the Workshop on Formal Ontologies meet Industry, as well as the 25th CADE 2015.

If you are doing your PhD in this field, I would like to point you to the Reasoning Web Summer School and the joint RuleML/RR Doctoral Consortium, where you can submit your PhD paper. Accepted RuleML PhD papers and demo papers will be published in the Challenge proceedings, which are listed in DBLP and fully indexed, e.g. in Scopus.
The Workshop on Formal Ontologies meet Industry (FOMI 2015) might also be relevant for you if you are working on ontologies.

Further interesting things will happen at RuleML 2015, such as ISO Common Logic, OMG API4KB, and OASIS LegalRuleML face-to-face meetings. We will also have a Berlin Semantic Web Meetup during RuleML on August 4th. Details will follow soon on the Meetup website.

Thank you and hope to see you in Berlin, Germany in August,

(General Chair RuleML 2015)

Prof. Dr. Adrian Paschke
AG Corporate Semantic Web
Freie Universitaet Berlin

by Mark Proctor at February 19, 2015 12:49 PM

February 17, 2015 Community-Driven Product Management

We are currently discussing if and how we should support the new Decision Model and Notation (DMN) standard by the OMG.

It’s basically about business rules, and of course our customers often ask us for business rules support. We typically recommend combining Camunda with a rule engine such as JBoss Drools. That works very well; there are numerous examples, blueprints, and tutorials available, as well as project experience. However, in roughly 95% of the real-world projects we have seen, you don’t really need the features that Drools or the other leading rule engines provide. It’s mostly just about exposing business rules …
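To make the “mostly just exposing business rules” point concrete: in many such projects the rules amount to a simple decision table evaluated top to bottom with a first-hit policy. The sketch below is a minimal, hypothetical Python illustration of that idea (the `evaluate` function and the discount rules are made up for this example, not Camunda or Drools API):

```python
# Minimal decision-table sketch: each rule pairs input conditions with an output.
# Hypothetical example, not an actual rule-engine API.

def evaluate(rules, facts):
    """Return the output of the first rule whose conditions all match the facts."""
    for conditions, output in rules:
        if all(facts.get(key) == value for key, value in conditions.items()):
            return output
    return None  # no rule matched

# Rules are evaluated top to bottom ("first hit" policy), so more specific
# rules must come before more general ones.
discount_rules = [
    ({"customer_type": "gold", "order_size": "large"}, 0.15),
    ({"customer_type": "gold"}, 0.10),
    ({"customer_type": "silver"}, 0.05),
]

print(evaluate(discount_rules, {"customer_type": "gold", "order_size": "large"}))  # 0.15
print(evaluate(discount_rules, {"customer_type": "silver", "order_size": "small"}))  # 0.05
```

Full rule engines add conflict resolution, rule chaining, and authoring tools on top of this; the post’s point is that many projects need little more than such a table.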

by Jakob Freund at February 17, 2015 02:14 PM

Thomas Allweyer: smartfacts: A Cross-Tool Platform for Models

Particularly in large companies, it is likely the rule rather than the exception that several different modeling tools are in use. The resulting model worlds are isolated from each other: it is hardly possible to find out which models exist in the company, let alone to find, say, all models in which the business object “customer order” is used. Here the product “smartfacts” from the Nuremberg modeling specialist MID promises a remedy. The platform provides a unified view of models of different origins.

The features on offer include cross-model search, model versioning, and the ability to link arbitrary models with each other. No matter which tool the various models were created with, they are displayed uniformly in the browser. Every diagram can be zoomed in and out and printed directly from the platform; for tablets and smartphones, the display is adapted accordingly. The symbols in a diagram can be clicked to display the attributes of the respective object. On import, smartfacts thus loads not just a graphic; it also takes over the structure of the model.

smartfacts also serves as a collaboration platform: models can be commented on, discussions held, and decisions made. A permission concept controls which users have access to which models.

Currently, models can be imported from ARIS, Visio, Enterprise Architect, and MID’s own modeling tool innovator. In addition, BPMN models can be uploaded that are available in the standard BPMN interchange format and can therefore come from any tool offering such an export. From the explicitly supported tools, not only process models can be transferred, but also arbitrary other model types, such as data models or EPCs. To transfer models from Visio, Enterprise Architect, or innovator to smartfacts, a plugin must be installed in the respective tool. As an example, the Visio plugin was tested and worked without problems. Importing a BPMN model in the standard format also worked without further ado. An export from ARIS Architect (version 9.7), however, could not be imported in the test [Update: This appears to be because the ARIS files are encrypted; see the comment by Mr. Puschaddel].

smartfacts is a genuine innovation and can be a useful aid for companies that use several modeling tools and do not intend to change this. The platform has the advantage that existing models need not be discarded or manually migrated to a new tool, but its collaboration features and versioning mechanisms do compete with comparable features of conventional modeling suites, so one must carefully consider which tasks to perform in which tool. Despite taking model structures into account on import, smartfacts cannot integrate the various models as seamlessly as is possible when they are created directly in a single tool.

Moreover, adding version numbers, descriptions, and links to models after the fact means additional effort, and it takes some discipline to keep changes consistent between the individual tools and smartfacts. If these aspects are neglected, the new platform quickly becomes a huge, hard-to-survey collection of heterogeneous models. These can be searched, but what do you do with the models you find if the available information does not make clear, say, which context they come from and whether they are currently valid?

To tap the potential of smartfacts, one must therefore think quite carefully about model governance beforehand. Where a multitude of different tools is in use, the processes for creating, reviewing, and publishing models have usually not been handled uniformly. The resulting heterogeneity of the model landscape cannot be brought under control merely by introducing an additional software platform. But if clean processes are established around modeling, smartfacts can certainly be a very useful aid.

A 30-day trial can be activated on the smartfacts website.

by Thomas Allweyer at February 17, 2015 09:24 AM

February 14, 2015 BPMN Online Training coming up – get your free pass

I just looked it up: During the last seven years, we coached more than 500 individuals in BPMN classroom trainings, and delivered more than 300 BPMN onsite trainings to organizations all over the world. I would say we probably know our business here.

But people kept asking us for an online version, allowing them to learn BPMN where and when they prefer. So we started working on this, and I expect that we can deliver the first chapters within the next few months. The training will be based on our handbook Real-Life BPMN, but probably with a stronger focus on process automation. …

by Jakob Freund at February 14, 2015 05:56 PM

February 09, 2015

Drools & JBPM: The Relationship of Decision Model and Notation (DMN) to SBVR and BPMN (Full Article)

"Publications by James Taylor and Neil Raden[2], Barbara von Halle and Larry Goldberg[1], Ron Ross[7], and others have popularized "Decision Modeling."  The very short summary is that this is about modeling business decision logic for and by business users.
A recent Decision Modeling Information Day conducted by the Object Management Group (OMG)[4] showed considerable interest among customers, consultants, and software vendors.  The OMG followed up by releasing a Request for Proposals (RFP) for a Decision Model and Notation (DMN) specification.[5]  According to the RFP,
"Decision Models are developed to define how businesses make decisions, usually as a part of a business process model (covered by the OMG BPMN standard in Business Process Management Solutions).  Such models are both business (for example, using business vocabularies per OMG SBVR) and IT (for example, mapping to rule engines per OMG PRR in Business Rule Management Systems)."
This quote says a little about how DMN may relate to SBVR[6] and BPMN[3], but there are many more open questions than answers.  How do SBVR rules relate to decisions?  Is there just one or are there multiple decisions per SBVR rule?  Is there more to say about how SBVR and DMN relate to BPMN?
This article attempts to "position" DMN against the SBVR and BPMN specifications.  Of course, DMN doesn't exist yet, so the concepts presented here are more the authors' ideas about how these three specifications should relate to each other than reality.  We present these ideas in the hope that they will positively influence the discussions that lead up to the DMN specification."

by Mark Proctor at February 09, 2015 11:32 PM

Thomas Allweyer: Managers Preach Process Orientation but Live Function Orientation

A new study commissioned by the Gesellschaft für Organisation (gfo) has produced interesting findings on the prevalence of process organization. Many of the 165 companies represented in the study do align their organization with their processes, but only at the lower management levels. At the upper levels, by contrast, a strong function orientation still prevails, particularly in large companies. Top management often orders stronger process orientation and other efficiency measures for the levels below it while exempting itself. This is all the more regrettable since the same study once again confirms the most important success factor of process management: top management must support process orientation and lead by example.

At about a third of the respondents, processes are not considered at all when designing the organization, and only one percent have a pure process organization. Overall, small companies fare better than larger ones. About half have implemented essential aspects of consistent process management, including process owners, KPI systems, and control loops for continuous improvement; the other half still largely forgo these. Still, two thirds use modeling tools, and more than a quarter use BPMN as their notation.

The study, presented in the current issue 1/2015 of the journal Führung + Organisation (zfo), also names major barriers to process organization. First comes the “dominance of function-oriented subcultures”, followed by unsuitable incentive and career systems, insufficient adaptation of resources and decision-making authority, and political resistance. After the top management commitment already mentioned, the most important success factors named are: motivation of those involved, comprehensive communication of the processes to employees, and agreed, understandable process descriptions.

The complete study will be available from the gfo after publication.

by Thomas Allweyer at February 09, 2015 11:34 AM

February 07, 2015 Happy Birthday to the BPMN 2.0 Model Interchange Working Group

Two years ago today, the BPMN Model Interchange Working Group (MIWG) of the OMG met for the very first time, and I’ll take a look at what has been achieved so far. The mission of this working group, led by Denis Gagne, is to support, facilitate, and promote the interchange of BPMN models among different tools. To do that, we created a test suite of currently eight BPMN 2.0 reference models that can be used to test the import and export capabilities of BPMN tools. So far, test results for 23 tools have been submitted by vendors or …
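For readers who have never looked inside the interchange format: a BPMN 2.0 file is XML in the OMG model namespace, and even the Python standard library is enough to inspect one. The snippet below parses a tiny made-up process (an illustrative example, not one of the MIWG reference models) and lists its flow elements:

```python
# Parse a minimal BPMN 2.0 XML snippet with the standard library to show the
# kind of elements the interchange format carries.
import xml.etree.ElementTree as ET

# Namespace defined by the OMG BPMN 2.0 specification.
BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"

# A made-up two-step process for illustration.
bpmn_xml = f"""
<definitions xmlns="{BPMN_NS}" id="defs1" targetNamespace="http://example.com/bpmn">
  <process id="order_process" isExecutable="false">
    <startEvent id="start"/>
    <task id="check_order" name="Check order"/>
    <endEvent id="end"/>
    <sequenceFlow id="f1" sourceRef="start" targetRef="check_order"/>
    <sequenceFlow id="f2" sourceRef="check_order" targetRef="end"/>
  </process>
</definitions>"""

root = ET.fromstring(bpmn_xml)
process = root.find(f"{{{BPMN_NS}}}process")
# List the local element names inside the process, in document order.
elements = [child.tag.split("}")[1] for child in process]
print(elements)  # ['startEvent', 'task', 'endEvent', 'sequenceFlow', 'sequenceFlow']
```

The MIWG tests go far beyond this, of course: they check that tools preserve semantics, references, and diagram interchange (BPMN DI) on round trips, not merely that the XML parses.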

by Falko Menge at February 07, 2015 01:29 AM

February 03, 2015 Camunda in the UK: Kick-off on March 20 in London

This year we will increase our presence in the UK. We think the market there has suffered from traditional BPM products long enough.

Our friends at 6point6 are proving to be fantastic partners for this enterprise. You can meet them and us at our first

Camunda BPM Community Day in the UK
Date: Friday 20th March
Time: 9am to 11.30am (alternatively: lunch bag session 12pm – 1.30pm)
Place: Royal Institution, London
Admission: Free

Agenda and Registration

And even before that, we will be presenting at QCon (March 4-6) and I will be attending the Gartner BPM Summit (March 18-19).

So let me know in case you’d like to …

by Jakob Freund at February 03, 2015 02:42 PM

February 02, 2015

Thomas Allweyer: BPM Tools and Compliance

The latest BPM tool study by Fraunhofer IAO deals with compliance in business processes. It was preceded by a market overview and two studies on social BPM and business process monitoring. 14 of the 28 vendors represented in the market overview participated in the compliance study; the topic seems to interest vendors more than social BPM and process monitoring, in which only ten and five vendors, respectively, had taken part. The study’s original focus was broadened from the narrow notion of compliance to the entire complex of “governance, risk & compliance” (GRC).

Compliance here covers adherence to laws and external guidelines. Governance refers to the direction and control of a company. Risk management is the systematic handling of risks, for which an internal control system (ICS) is frequently established. Within GRC, the components strategy, processes, technology, and people must interact. In the context of BPM tools, the management of processes is the primary concern. GRC aspects can play a role in all phases of the process management lifecycle: during to-be process modeling, for example, risks are modeled, compliance requirements documented, and the ICS defined; during process execution, controls are performed and risks tracked; process monitoring is about evaluation, archiving, and traceability.

Diverse Forms of Tool Support

Tool support for GRC tasks can be correspondingly diverse, and the functionality depends on each tool’s category. With modeling tools, one can document risks and controls, assign compliance requirements to processes, and check adherence to modeling guidelines; some also offer approval workflows for the created models. Process analysis tools allow checking, for example, what effects GRC-related changes have on various processes. BPM systems that support process execution make it possible, among other things, to perform internal controls automatically, track risks, and detect violations of requirements. Most of the tools examined focus on process modeling and analysis; many also offer workflow control of management processes, such as review and approval of process documentation. A smaller share each offers KPI-based process monitoring or process execution.

All 14 participating vendors state that they support governance, risk management, compliance, and ICS. Mostly the corresponding features are fully integrated into the respective tool; in some cases, they are offered as independently usable components. Business rules management, offered by nine tools, can also be relevant to supporting GRC requirements. In the area of governance, most vendors focus on process governance, i.e. the rules for dealing with processes and process models; some also consider the embedding in corporate governance, e.g. by documenting how corporate goals are supported. The tools are quite similar here: practically all have version management, a process portal, and support for management processes, and many also enforce modeling guidelines and provide best-practice collections. There are bigger differences in the standards supported, with CoBIT, COSO, ITIL, and ISO 9000/9001 named most often.

Modeling Risks in Processes

In the area of risk management, most tools allow, on the one hand, describing the risk management process or the execution of control measures as process models. On the other hand, risks can be modeled and assigned to process steps. The tools differ in the possible representations, e.g. in tables, directly in the process model, or as a hierarchical tree. For risk tracking, almost all allow capturing risk magnitude and probability of occurrence. Finally, various kinds of evaluations are offered.

For building an internal control system (ICS), most tools allow defining roles and controls and assigning the controls to the risks in the processes. Tools with an execution environment can additionally support the execution of internal controls, evaluate the controls performed, and archive all related information in an audit-proof manner.

Different Understandings of the Individual GRC Topics

The tools examined contribute to compliance by allowing compliance requirements and the associated source documents to be stored in an audit-proof manner and linked with each other and with further elements. Usually, the compliance status can also be displayed and output as reports. The standards supported by the tools again vary widely; three tools each support Basel II/Basel III and SOX (Sarbanes-Oxley).

Overall, the study shows many commonalities in the functionality offered. Nevertheless, the way a particular aspect is implemented can differ quite markedly. In the survey, the vendors were also asked to define the various GRC terms, and their answers make clear that they often have quite different understandings of certain topics, which is then reflected in the respective tools.

Monika Kochanowski, Falko Kötter, Thomas Renner:
Business Process Management Tools 2014 – Compliance in Geschäftsprozessen.
Fraunhofer Verlag 2014.
Further information and ordering at the IAO

by Thomas Allweyer at February 02, 2015 09:48 AM

January 22, 2015 Leading BPM. – Impressions of Conference 2014

Bringing experts together, exchanging knowledge, and gaining new ideas from other business areas is the ongoing spirit of the conferences. In November 2014, the third conference was held under the motto “Leading BPM”, and we welcomed over 80 experts from diverse industries and business areas as well as the social sector to the Lufthansa Training & Conference Center Seeheim near Frankfurt.

Dr. Bernhard Krusche from the association “NEXT SOCIETY” started the conference with his thought-provoking keynote “Quo Vadis, BPM?”. He encouraged the audience to think outside the box and to be open-minded towards future trends like the digital age and its implications for personal life, business, and BPM. As a result of the discussion, we will pick up this topic in 2015 to further explore how BPM can prepare for the next society.

After the keynote, the topics of day one focused on BPM training, activities to strengthen acceptance of BPM systems, and change management within BPM. Each topic was divided into three parts: best practices distilled from our workshops, theoretical insights provided by students from the University of Bamberg, and practical examples from Diehl Controls, Lufthansa Technik, and the City of Hamburg.

The second day opened with the announcement of the winner of the BPM2thePeople Award 2014. As there were two extraordinary examples of implementing process management in the social and educational sector, the prize was shared by two organizations: the kindergarten “Am See” in Großbettlingen and the “Behörde für Arbeit, Soziales, Familie und Integration” of the City of Hamburg. Both organizations serve as role models with their innovative and efficient ideas, and both absolutely deserve the award. Read more about the award and its winners on the award’s website.

In addition, the audience was given insights into the Haiti Entrepreneurship Camp, a rather different area of business process management. The project supports local entrepreneurs in Haiti in gaining experience with management tools and provides a mentoring program. We spontaneously decided to support the project with a 1,500 euro donation from the conference proceeds.

The final part of the conference was devoted to the topic “Digital Age BPM”. Dorit Fischer, a digital native and master’s student at the University of Bamberg, presented the results of a series of workshops on how to combine social media, web 2.0, mobile devices, and further trends of the digital age with BPM. In addition to the workshop results, Manuel Büchele and Martin Mannweiler from DZ Bank provided insights into their implementation of Digital Age BPM features in their system.

Matching the “Digital Age BPM” topic, Samsung supported our conference with their newest generation of tablets so that participants could take an active part in the conference. They took pictures and notes, used the chat function to communicate with other experts, and participated in live surveys. New this year was the opportunity to take part in a speed-dating session during the breaks, where BPM experts met other BPM experts face to face to simplify networking and exchange experience.

To wrap up the conference, experts and students gathered in interactive workshops where they discussed the latest research findings and their application in the business world. Building on these workshops, the conference concluded with a panel discussion and a live vote to identify the most important content and issues for our next conference.

We would like to thank all participants for the great discussions in Seeheim and the overwhelming feedback to the conference! Let’s see, when the next conference will take place…

[Photo gallery: check-in, keynote by Dr. Bernhard Krusche, research results, talks by Patricia Jäger (Diehl Controls), Volker Pelikan (Lufthansa Technik) and Alfons Federspiel (the BPM approach of BASFI in the context of change management), hosts Michael Bögle and Mirko Kloppenburg, the BPM2thePeople Award, the Haiti Entrepreneurship Camp presented by Pietro Montemurri, speed dating, Dorit Fischer presenting the Digital Age BPM workshop results, the students of the University of Bamberg, the BPM future workshops, and the closing panel discussion with all speakers]

Pictures by Stefan Bergmann

by Mirko Kloppenburg at January 22, 2015 08:35 PM

January 16, 2015

Keith Swenson: 5 Opportunities in the Process Space for 2015

There are 5 key opportunities to participate in process space (BPM and ACM) in the next few months, and the deadlines are coming up, so don’t delay, and don’t miss out.

1. Book:  BPM EVERYWHERE: Going Beyond Basic BPM

The call for papers for chapters in this book is out, and the deadline is now. The book will cover all aspects of how the Internet of Things and BPM come together. In less than 5 years, the majority of customer interactions will no longer be person-to-person or even over the phone, but through engagement via intelligent agents, and increasingly between the agents themselves. Analytics will drive an ever-growing number of decisions, not as historic reports but rather through real-time support on mobile and wearable devices. Robots once hidden in warehouses and on the factory floor will become the fastest growing sector of the workforce, participating in a vast array of knowledge worker processes. Got some good ideas about this? Put what you know to paper and include it in this promising book, which will be launched in April. Proposals were due Jan 15, 2015, but if you act immediately there might still be availability.

2. Adaptive CM Workshop – Aug 31, 2015

This marks the fourth year of this full-day workshop on Adaptive Case Management and other non-workflow approaches to BPM. This year it will be held in Innsbruck, Austria on Aug 31, 2015, in conjunction with the BPM 2015 conference. Past workshops have been the premier place to publish rigorously peer-reviewed scientific papers on these groundbreaking new technologies. See the call for papers. Submission abstracts are due 22 May 2015, full text on 29 May 2015, and notification to authors on 29 June.

3. ACM Awards

The WfMC will be running another ACM awards program to recognize excellent use of case-management-style approaches to supporting knowledge workers. The awards are designed to show how flexible task-tracking software is increasingly used by knowledge workers with unpredictable work patterns. Winners are recognized on the hall of fame site (see the sample page highlighting a winner) and in a ceremony at the BPM and Case Management Summit in June. Each winning use case is published so that others can learn about the good work you have been doing and follow your lead. This series of books is the premier source of best practices in the case management space. Submit proposals and abstracts now for help and guidance in preparing a high-quality entry; final submissions are due in April 2015.

4. BPM Next – March 30, 2015

The meeting of the gurus in the BPM space. This is where the leaders of the industry come together to discuss evolving new approaches and to understand the leading trends. The engineering-oriented talks are required to include a demo of actual running code, to avoid imaginative but unrealistic fantasies. This year each presentation will start with an “Ignite” segment of exactly 20 slides lasting exactly 5 minutes, to rein in the gurus’ natural tendency toward lengthy and wordy presentations. The program is already set; however, attendee registration is still open. This year it will be held in a new location on the California coast: the quaint old town of Santa Barbara.

5. BPM and Case Management Global Summit – June

The premier independent industry show for the full range of process technologies. Many of last year’s attendees described it as the best, most informative conference on BPM and ACM they had ever seen. It will be held for the second year at the Ritz-Carlton in Washington DC, June 22-24, 2015, and it promises to be bigger and better. The call-for-papers abstract deadline is February 28, 2015, with final acceptance/commitment in March.

If you have been doing work in the process space (implementing customer systems, researching new technologies, developing new products), these are all excellent opportunities to share your expertise and show your leadership. But don’t delay. Remember those New Year’s resolutions, and register now so you don’t miss the deadlines.

Also, don’t forget: sign up to receive, for free, the first chapter of “When Thinking Matters in the Workplace”.

by kswenson at January 16, 2015 06:31 PM

January 13, 2015

Keith Swenson: 2015

The halfway point of the 2010s can best be summed up by a few things that happened to me in the past couple of days:

1. I spent $12 to purchase 12 months of web hosting. That doesn’t even cover the electric bill for the laptop I will use to set the site up. Cloud computing resources are unbelievably cheap.

2. I used Google today to look up the address of the office I work in. It is printed on all my business cards, but it is just easier to copy/paste from Google than to find a business card and type it in.  Information is unbelievably easy to access.

3. To save money in the next couple months, my company has cancelled all travel not mandated by a customer contract. We are all traveling less, and yet working more on teams that are distributed over larger areas. Communications allows unbelievable levels of collaboration.

4. A single professor in Holland gave completion certificates to 1,690 students from all over the world for a course on Process Mining. And the course was free. Over the holidays, I discovered a new JavaScript framework and found a dozen hours of free videos explaining how to use it. Learning can scale to unbelievable levels.

5. I heard a rumor today that President Obama was disbanding the U.S. Marines. In spite of amazing resources for vetting information with multiple sources, wild, unsubstantiated, and crazy stories are repeated further and faster than ever before (by people who otherwise behave perfectly sanely). Critical thinking is unbelievably poorly exercised.

We are in a time of wonders. Some day we will look back on this year and say: "Those were the good old days." Mostly, though, we will think of it as unbelievable.

by kswenson at January 13, 2015 10:49 PM

December 28, 2014 The year we crossed the chasm – Camunda review 2014

If you’re interested in Camunda’s journey as a company, or in tech startup journeys in general, this post may be worth reading.

The beginning: Shift a paradigm, sidestep competition and find a blue ocean

In 2012, we created a software product called Camunda BPM. The core value proposition of Camunda implies a paradigm shift in what Business Process Management technology should deliver:

Most BPM-Suites – Primary target group: companies that are trying to reduce their software development force. Value proposition: "Our product allows you to create process applications in a model-driven approach, so that you need fewer software developers."

Camunda – Primary target group: companies that consider their software development force a strategic …

by Jakob Freund at December 28, 2014 04:24 PM

December 20, 2014 CMMN – The BPMN for Case Management?

Scott Francis recently asked a legitimate question: Will BPM vendors adopt CMMN, or will they rather focus on topics like mobile/social/local/cloud? (Read the complete blog post here)

As the obviously biased CEO of one of the few vendors who have already implemented CMMN, I am actually sceptical. My experience with BPMN over the last 8 years was that most of the established BPM vendors balked at implementing it, arguing that they already had something "better" for the same use case. I suppose it's the same with Case Management now, so I predict they will only make the effort of supporting it when …

by Jakob Freund at December 20, 2014 03:25 PM

December 19, 2014

Drools & JBPM: UI Front-end Developer Job Opening in the Drools and jBPM team

We are looking to hire a front-end developer to work closely with our UXT team to make our web UIs look and work beautifully. The ideal candidate will have an artistic flair and a solid grounding in what makes a good UI. Our existing web tooling can be seen here.

The developer will be working with Java, GWT, Errai, HTML5 and JS. The role is remote.

by Mark Proctor at December 19, 2014 10:21 PM

December 16, 2014

Keith Swenson: When Thinking Matters in the Workplace

In the four and a half years since "Mastering the Unpredictable" introduced the idea of Adaptive Case Management to the world, a growing group of people have struggled to define what it really means to make use of this emerging trend.  This new book, "When Thinking Matters in the Workplace," takes it one step further: it outlines what a manager needs to know to lead a team of innovative knowledge workers, and how to put in place a system that best supports them.

We know that innovation is the key to success, not just in high tech but in all industries. Innovation is what happens when knowledge workers are successful.  Innovation is the result of thinking "how can we do this better?"  Until recent years, innovation was a largely manual activity.  The emergence of BPM technology finally enabled the automation of knowledge worker processes — but then we ran into a problem.   Innovation is not a routine process.  Innovation happens differently every time.  You can't just define the process of innovation and automate it.

The decision of how to support workers in an office should not be made by technologists.  At the level of knowledge workers, the technology defines how the business will operate.  This is a core business decision that must be made by the executives of the organization, not the IT department.  The way that knowledge workers are supported strongly affects the way the entire business operates.

That is why we wrote this book to address the problem at the executive management level.  This is not a book for technologists.  It is a book for managers, to help them understand the choices they must make and how those choices will affect their ability to create new products and services.

Knowledge workers are quite flexible and capable in many ways, so there is no single prescription for how to support them.  However, it is easy to identify technology that is bad for knowledge workers.  It is easy to put in place technology that restricts workers and eliminates innovation.  You won't notice that innovation is gone until it is too late.

This project is the culmination of years of study on how successful teams work together to innovate.  We go into detail on what innovation and knowledge work are, why they are difficult to understand, how knowledge work differs from routine work, what good management is, and how technology needs to support all this.  After reading this book, you will be prepared with good reasons for choosing one approach over another, with solid references to support your position.

For readers of this blog post, for a limited time: sign up for the mailing list and get the first chapter for free.  I will be sending all the chapters out to this mailing list for review over time, so stay on the list if the first chapter is of value to you.

UPDATE: Find it on Amazon

Outline of the Contents

1. Innovation Management Challenge.  What is innovation?  How do innovators work? Including some examples.  Some misconceptions about innovation: it is rarely a sudden epiphany, and it does not require incredible mental powers.  Distinguishing routine work from knowledge work.  Innovators need leaders, and how leaders differ from managers.  How innovators need autonomy and accept unpredictability.

2. Understanding Complexity.  The biggest mistake in supporting innovation is tied to misunderstanding complexity.  Innovation is not simple.  This high-level overview of complexity science gives us an understanding of why reducing a business to its simplest form tends to eliminate the ability to innovate, which is necessary at all levels.  We need to leave behind thinking of an organization as a machine, and understand that organizations are complex systems.  Understanding how complex systems behave prepares us for the choices we have to make in how to support them.

3. Management and Leadership.  Knowledge workers are not like machines.  Experienced managers already know how to lead people who think. We examine the evidence from management science supporting this, and at the same time contrast such wisdom with the mechanistic approaches that IT departments tend to favor.   We show how sound management principles are often discarded when designing systems for supporting workers.

4. Agile Management. Building on this, we give a brief overview of the ways one should lead and support knowledge workers so that they have the freedom to innovate.  Many of these ideas come from other fields, and we are repurposing them for the general work environment.  We examine how Toyota did this for the manufacturing industry.  We examine how Agile approaches have been used in software and high-tech fields.  We also briefly cover Lean and Six Sigma recommendations for leaders, as well as the power of a checklist.  This gives us an idealized vision for how knowledge workers might be enabled to work best together.

5. Business Architecture.  How do you define your business? How does your business succeed?  This is not about IT systems, but the organization itself.  Every executive knows that the shape of the organization is critical to its long-term success.  In this chapter we focus on some new ways of organizing that have only recently become possible because of advances in information technology.  How push organizations differ from pull organizations.  We look at a company that has eliminated management at all levels, and how that works.  We look at flow-to-work organizations, hyper-social organizations, and wirearchies.  There is no single best architecture, but most of us should consider whether some aspects of these radical new approaches should be embraced.

6. Business Technology.  With a solid grounding in where you want to take your organization, we then discuss recently emerged technical options and how they might affect your organization.  What collaboration technology is, how email is both a benefit and a vice, how social networks might be used, enterprise 2.0, and systems of engagement.  We present a range of seven different process technologies, with a focus on two types of case management.  There are then a couple of other technological aspects you need to be concerned with: identity, security, and some challenges with mobile technology.

7. Roadblocks to Innovation.  Here we delve into problems you are likely to run into if you take a naive approach to supporting knowledge workers.  These are things that look good in concept but turn against you when deployed.  Some are promises that technology vendors make, stated in confusing and misleading ways.  Beware the 'snake oil!'  Some approaches are very attractive but don't work.  This chapter prepares you to avoid the misleading approaches.

8. What is Case Management?  The way to support innovation is case management, but once again, not everything that carries this name will necessarily fit the need.  This chapter goes into depth on the features and capabilities you should expect to find, along with a discussion of why these features are important.  If you are considering the purchase of case management technology, you should certainly read this chapter first.

9. Patterns of Innovation.  As you already know, the technology is not the whole story.  This chapter talks about how to successfully use technology in support of innovation.  It touches on culture change, adoption strategies, how to design for change, how to leverage the intelligence of the workers, how to reduce the cost of supporting work, and how to best leverage the transparency and social ties of case management.

10. Fumbling Innovation.  The flip side of the coin: this chapter covers how technology can be used poorly and effectively stop innovation.  These are patterns to look for, and avoid, among the workers.  Don't micromanage, don't ask for too much detail up front, don't prevent changes to plans, don't punish people who try and fail.   In general, how to avoid making a working environment that is unfriendly to innovation.

11. Leading the Innovators.  Tips and techniques to help knowledge workers understand how these approaches help them innovate, and why that is important for the business.  Some hints on how to inspire workers to make the best use of the technology.

12. Lessons from the Field.  Covers some real-life use cases that used case management technology: how they used it, what they did right, and what they did wrong.


At 370 pages, this book is packed with references that will help you back up the approach you take to support knowledge workers.  Whether you want to try out a radical new style of organization, or whether you want to stick with a reliable existing organization and simply wish to eliminate unnecessary paperwork, this book will give you the needed details.  Knowledge is power, and we have tried to bring together everything you might need to make a decision about Case Management for your organization.

If thinking does matter in your organization:

  • You already know that smart people need autonomy in order to innovate and create.
  • Automation, done haphazardly, can put straitjackets on your most creative people.
  • You don’t want to put in place technology that micro-manages your knowledge workers.
  • Robots don’t innovate.
  • This book provides some guidelines to help avoid the worst pitfalls and to take best advantage of this adaptive approach for supporting teams of knowledge workers.

by kswenson at December 16, 2014 10:15 AM

December 15, 2014

Thomas Allweyer: There Is No Perfect BPMS – New Study from Fraunhofer IESE

A total of 18 vendors of BPM suites took part in the new study by Fraunhofer IESE, conducted this year for the second time. Each tool was put through its paces in a full-day workshop. The representatives of each vendor had to demonstrate the implementation of a given process. Over the course of the workshop, numerous previously unknown changes then had to be carried out and further features shown. I was allowed to attend some of the workshops as a guest. This gave a fairly comprehensive picture of each system, apart from some topics that were explicitly excluded from examination, such as case management functionality.

In the overall evaluation, the examined tools are relatively close together. There was neither a truly bad tool, nor could a perfect BPMS be identified as a lone front-runner. In particular, the power of the tools, i.e. the degree of coverage of the examined features, is usually quite high, whereas in ease of use there is often still room for improvement. Bizagi achieved the highest overall rating. SoftProject is the tool with the greatest feature coverage. Axon IVY and IBM were rated "good" in most individual categories.

In the individual categories, such as process execution, process modeling, or process controlling, the differences turned out to be considerably larger than in the overall evaluation. When selecting a suitable BPMS, one must therefore consider carefully which functionalities matter most. The governance capabilities of the examined tools, for example, vary widely. As a first orientation aid, the study therefore contains a decision tree for finding the tools that cover the desired aspects above average. The individual analyses of each of the 18 BPM suites provide a more precise picture.

Adam, S.; Koch, M.; Neffgen, F.; Riegel, N.; Weidenbach, J.:
Business Process Management – Marktanalyse 2014
BPM Suites im Test
Fraunhofer IESE, Kaiserslautern 2014
Download the abridged version and ordering information

by Thomas Allweyer at December 15, 2014 09:50 AM

December 05, 2014

Keith Swenson: Drucker Forum 2014 Update

It is time again for the Global Peter Drucker Forum.  Here are some highlights of talks from John Hagel, Clayton Christensen, Gary Hamel and others.

All of the talks are recorded and available on the web at

  • John Hagel – The Dark Side of Technology.  Gave a quick overview of ‘The Big Shift.’  Increasing pressure in three levels:
    (1) removing barriers to entry/movement giving more competition,
    (2) accelerating pace of change,
    (3) connectivity means disruptions can come from anywhere on Earth.
    Analysis of all public companies shows that the return on assets has declined 75%.  There is a mindset disconnect.  Shift from scalable efficiency to scalable learning.  A CEO needs to ask three basic questions:
    (1) what is our business?
    (2) what should it be?
    (3) what should it not be?
    There are three types of business:
    (1) High volume routine processes,
    (2) product innovation,
    (3) customer relationship.  These three business types are very different; we should not lump them together.  The democratization of the means of production of content means that content has grown tremendously.  The same will happen with 3D printing.  The dark side of technology becomes a catalyst for change.
  • Lawrence Crosby – Drucker School of Management has some strategic initiatives
    (1) A Drucker Index as a measure of organizational effectiveness
    (2) Invent/innovation extend to challenge of managing creative workers
    (3) customerific,
    (4) bringing goals of Drucker into economy.
    How to deal with an uncertain future?  Agility, resilience, foresight, and a solid foundation.
  • Clayton Christensen – The Capitalist’s Dilemma.  What causes managers to invest to grow?  Growth comes from innovation, and the link is investment.  Three types of innovation:
    (1) market creating innovation/disruption (growth)
    (2) sustaining innovation
    (3) efficiency innovation.
    If these are all in balance, the economy does well.  Finance gives us metrics which are ratios (e.g. IRR, P/E, RONA); to improve them, either increase the numerator or decrease the denominator.  Investing in market-creating innovation destroys these measures.  All free cash goes to efficiency innovation.  The show is being run by the people who dictate the metrics.  Gave the example of Japan.  Not a small problem.
  • Gary Hamel – Hacking Management.  How do you know if you have an evolutionary advantage?  Most companies don’t.  How to build a self-renewing organization.  Not one company of 100 makes innovation intrinsic.  Ask a random selection of employees
    (1) are you trained in innovation?  have you made investment in innovative capital?
    (2) how quickly can you take a small amount of experimental capital and produce something new?
    (3) are you measuring innovation?
    Enormous problem: only 13% of employees are engaged in work.  Our organizations are less capable than the people inside them.  Core-incompetency.  Every organization today is still run on principles of hierarchy/bureaucracy.  Management is a mashup of military command structures and disciplines of industrial engineering.  Stuck with management which is the love child of Julius Caesar and Fredrick Winslow Taylor. Organizations fail when the leaders fail to write off their own depreciating intellectual competence. All of us would be pressed to imagine the structure of some of the world’s most amazing companies.  Will take:
    (1) models of managing without managers (mentioned MorningStar)
    (2) motivation, admit that no alternative to move to new management model
    (3) change mindset and residual beliefs away from hierarchy
    (4) migration paths to get there from here
    Where do you see evidence of bureaucratic management drag?  The organizations that win in the next few years are the ones who learn how to evolve their management models.  Fiddling at the margins will not work.  Not just wrap theory around reality, but change reality.  Nobody should be sitting on the sidelines.
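Christensen's point above about ratio metrics can be illustrated with purely hypothetical numbers (a sketch, not data from the talk): since RONA is net income divided by net assets, the same income spread over more assets makes the metric look worse, which is exactly what market-creating investment does in the short term.

```javascript
// RONA = net income / net assets: a ratio, so it can be "improved" either
// by raising the numerator or by shrinking the denominator.
function rona(netIncome, netAssets) {
  return netIncome / netAssets;
}

const baseline = rona(100, 500); // 0.20

// Market-creating innovation: cash becomes new assets long before any
// income shows up, so the denominator grows and the ratio falls.
const marketCreating = rona(100, 500 + 200);

// Efficiency innovation: the same income from fewer assets, so the ratio
// rises -- which is why free cash tends to flow there.
const efficiency = rona(100, 500 - 100); // 0.25

console.log(baseline, marketCreating, efficiency);
```

The numbers are invented; the shape of the argument is what matters: a manager judged on RONA is structurally rewarded for the efficiency path.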

(more to come)

by kswenson at December 05, 2014 06:03 PM

December 02, 2014

Thomas Allweyer: BPMN Tools Fall Short in Enterprise Modeling – Study for Download

Most modeling tools have implemented at least the bulk of the BPMN standard, so they no longer differ much with respect to process modeling. There are, however, very large differences in linking BPMN models to other models, such as organizational charts, data models, or IT landscapes.

Since a company's processes do not stand in isolation, consistent process management requires considering their interplay with other aspects of the enterprise. Approaches to integrated enterprise modeling and to Enterprise Architecture Management (EAM) therefore aim to link the different models. Each model represents a view of an integrated overall model. If business processes are modeled in BPMN, one must decide how to link them to other models in a methodically sensible way. Because the BPMN standard does not regulate this, quite different links between BPMN models, organizational charts, strategy models, and so on have emerged in practice.

For this study, I evaluated integration approaches from the scientific literature on the one hand, and on the other examined a total of 13 modeling tools for their linking and extension options for BPMN models. As it turned out, the topic plays no major role in academia. While quite a number of researchers have developed specific BPMN extensions, the linking with models of other views has rarely been examined. The analysis of the modeling tools was much more fruitful. It included business-oriented modeling platforms (also known as Business Process Analysis or BPA tools), modeling tools from software development, and the modeling components of Business Process Management Systems (BPMS) for process execution.

It turned out that the options for method integration vary widely and often cover only a few aspects. There is a certain core set of aspects that can frequently be linked to BPMN models: in at least six of the 13 tools each, process models can be linked to process landscape maps, data, IT systems, risks, or the organizational structure. However, hardly any system covers even these five aspects in a single tool. Moreover, the same aspects are modeled and linked in methodically quite different ways in different tools. The result is a colorful variety of methods that is hard for users to see through. If, for example, two tools advertise that BPMN models can be linked with organizational structures, data structures, and IT landscapes, one can by no means assume that the sets of methods actually offered are equally powerful.

With some comprehensive modeling platforms, the rather weak integration of BPMN diagrams with other model types is surprising. Connections sometimes have to be made very laboriously via separate mapping diagrams. The extension options of BPMN, e.g. additional artifacts, are also little used. With few exceptions, the documentation of the model integration is meager and patchy. Because of the heterogeneity of the offered methods, the decision for a particular tool is at the same time a decision for a particular methodology.

It would therefore be desirable for both researchers and standardization bodies to engage more with the integration of models from different standards. Tool vendors are called upon to critically examine their method portfolio in conjunction with BPMN and, where necessary, to add frequently needed model types and connections to BPMN models. Before selecting a tool, users should analyze quite precisely which matters they want to model beyond pure BPMN process modeling and which connection options to other model types they accordingly need. They should be aware that by choosing a particular tool they are also committing to the available notations and their connections.

Download the study "BPMN-Prozessmodelle und Unternehmensarchitekturen" (BPMN process models and enterprise architectures).

by Thomas Allweyer at December 02, 2014 07:06 AM

December 01, 2014

Sandy Kemsley: BPM Cyber Monday: Camunda 7.2 Adds Tasklist And CMMN

I caught up with Jakob Freund and Daniel Meyer of camunda last week in advance of their 7.2 release; with 1,700 person-days of work invested in this April-November release cycle, this includes a new...

[Content summary only, click through for full article and links]

by sandy at December 01, 2014 08:24 PM

November 27, 2014

Sandy Kemsley: Activiti BPM Suite – Sweet!

There are definitely changes afoot in the open source BPM market, with both Alfresco’s Activiti and camunda releasing out-of-the-box end-user interfaces and model-driven development tools to augment...

[Content summary only, click through for full article and links]

by sandy at November 27, 2014 04:42 PM

November 21, 2014

Thomas Allweyer: BPMN Certification Course with ARIS

The BPMN certification course I co-developed, which has been running very successfully in Switzerland for some time, is coming to Germany at the beginning of next year. Together with the company aproo, which among other things offers numerous services around ARIS, the course was adapted for use with the ARIS modeling tools from Software AG. For a long time, the Event-driven Process Chain (EPC) dominated among ARIS users as the notation for modeling business processes, but BPMN is gaining importance here as well, as an alternative or additional notation. Besides process modeling, ARIS offers numerous methods for the integrated modeling of different views, such as data, organization, or IT landscapes. The course therefore also explains how BPMN models can be connected with other model types.

The focus of the training, however, is a thorough introduction to BPMN itself. Using many examples and practical exercises, participants learn to understand the notation and apply it themselves. The two-day course is therefore aimed not only at ARIS users, but at anyone who wants to learn process modeling with BPMN. Optionally, a certificate can be obtained after the course. The first course takes place on January 27 and 28 in Frankfurt. Further dates in other regions are in preparation.

Further information and registration

by Thomas Allweyer at November 21, 2014 10:35 AM

Drools & JBPM: Red Hat JBoss BRMS and BPMS Rich Client Framework demonstrating Polyglot Integration with GWT/Errai/UberFire and AngularJS

Last week I published a blog highlighting a presentation I gave showing the rich client platform that has resulted from the work we have done within the BRMS and BPMS platforms, the productised versions of the Drools and jBPM projects. The presentation is all screenshots and videos; you can find the blog and the link to the slideshare here:
"Red Hat JBoss BRMS and BPMS Workbench and Rich Client Technology"

The presentation highlighted the wider scope of our UI efforts, demonstrating what we've done within the BRMS and BPMS platforms and the flexibility and adaptability provided by our UI technology. It provides a great testimony to the power of GWT, Errai and UberFire, the three technologies driving all of this. We can't wait for the GWT 2.7 upgrade :)

As mentioned in the last blog, the UberFire website is just a placeholder and there is no release yet. The plan is first to publish our 0.5 release, but that is more for our BRMS and BPMS platforms. We will then move it to GWT 2.7 and work towards a UF 1.0, which will be suitable for wider consumption.  With 1.0 we will add examples and documentation and work on making things more easily understood and consumable for end users. Of course there is nothing to stop the adventurous trying 0.5; the code is robust and already productized within BRMS and BPMS - we are always on IRC to help: Freenode #uberfire.

That presentation itself built on the earlier videos showing our new Apps framework:
The Drools and jBPM KIE Apps Framework

The above video already demonstrates our polyglot capabilities, building AngularJS components and using them within the UF environment. It also shows off our spiffy new JSFiddle-inspired RAD environment.

I'd now like to share with you the work we've done on the other side of polyglot development - this time using GWT and UF from within AngularJS. It was important that we allow for an AngularJS-first approach that works with the tool chain AngularJS people are familiar with. By AngularJS first, I mean that AngularJS is the outermost container, whereas in the above video UF is already running and is the outer container in which individual AngularJS components can be used.

Before I detail the work we've done, it is best to first cover the concepts of Screens and Perspectives, our two main components that provide our polyglot interoperability - there are others, but this is enough to understand the videos and examples that come next. A Screen is our simplest component: it is a DIV plus optional life cycle callbacks. A Perspective is also a DIV, but it contains a composition of 1..n Screens, with different possible layout managers and persistence of the layout.

Screen:
  • CDI discovery, or programmatically registered.
  • A DIV on a page.
  • Life cycle callbacks:
    • OnStart, OnClose, OnFocus, OnLostFocus, OnMayClose, OnReveal.
  • Decoupling via the Errai bus:
    • Components do not invoke each other; all communication is handled by the bus.
  • Editors extend Screens, are associated with resource types, and provide additional life cycle callbacks:
    • onSave, isDirty.

Perspective:
  • CDI discovery, or programmatically registered.
  • A composition of 1..n Screens, but is itself a DIV.
  • Supports pluggable window management of Screens:
    • North, East, South, West (NESW):
      • Drag-and-drop docking capabilities.
    • Bootstrap Grid Views:
      • Separate design time and runtime.
    • Templates (ErraiUI or AngularJS):
      • Absolute control of Perspective content and layout.
  • Supports persistence of the Perspective layout, should the user re-design it:
    • Only applicable to NESW and Bootstrap Grid views.
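The Screen and Perspective concepts above can be sketched in a few lines of plain JavaScript. To be clear, this is a conceptual illustration only, not the actual UberFire or Errai API; every class and name below is invented. It models a Screen as a DIV plus optional life-cycle callbacks, a Perspective as a DIV composing Screens, and all inter-component communication routed over a bus:

```javascript
// Minimal publish/subscribe bus standing in for the Errai bus.
class Bus {
  constructor() { this.subscribers = {}; }
  subscribe(topic, fn) {
    (this.subscribers[topic] = this.subscribers[topic] || []).push(fn);
  }
  publish(topic, msg) {
    (this.subscribers[topic] || []).forEach(fn => fn(msg));
  }
}

// A Screen: a DIV (stand-in object here) plus optional callbacks.
class Screen {
  constructor(id, callbacks = {}) {
    this.id = id;
    this.callbacks = callbacks;
    this.div = { tag: 'div', children: [] }; // stand-in for a DOM node
  }
  fire(name, ...args) {
    if (this.callbacks[name]) this.callbacks[name](...args);
  }
}

// A Perspective: itself a DIV, composing 1..n Screens.
class Perspective {
  constructor(id) {
    this.id = id;
    this.div = { tag: 'div', children: [] };
    this.screens = [];
  }
  add(screen) {                       // the screen's DIV nests in ours
    this.screens.push(screen);
    this.div.children.push(screen.div);
    screen.fire('onStart');           // life-cycle callback on mount
  }
  close() { this.screens.forEach(s => s.fire('onClose')); }
}

// Usage: two decoupled screens communicating only over the bus.
const bus = new Bus();
const list = new Screen('task-list');
const detail = new Screen('task-detail', {
  onStart: () => bus.subscribe('task-selected', t => (detail.current = t)),
});
const perspective = new Perspective('tasks');
perspective.add(list);
perspective.add(detail);
bus.publish('task-selected', 'task-42');
console.log(detail.current); // 'task-42'
```

Because the screens never reference each other, either one can be replaced or moved to another perspective without touching the other, which is the point of routing everything through the bus.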

A picture is worth a thousand words, so here is a screenshot of the Perspective Builder in action. Here it uses the Bootstrap Grid View layout manager. Within each grid cell is a Screen. The Perspective is saved and then available from within the application. If the NESW layout manager is used, there is no separate design time; all dragging is done in place, and persistence happens in the background after each change. Although it's not shown in the screenshot below, we also support both list (drop list) and tab stacks for Screens.

Now back to what an AngularJS-first approach means. Seven different points were identified as necessary to demonstrate that this is possible.
  1. UF Screens and Perspectives should be available seamlessly as AngularJS Directives.
  2. Bower packaging for a pre-compiled UFJS. UFJS is the pre-compiled, client-only version of UF.
  3. UFJS can work standalone, from file:// for example. UFJS can optionally work with a UF war backend, allowing persistence of perspectives and other optional places where UFJS might need to save state, as well as access to our full range of provided services, like identity management.
  4. Support live refresh during development.
  5. Nested Controllers.
  6. Persistence and routing.
  7. Work with tools such as Yeoman, Grunt and Karma.
Eder has produced a number of examples that you can run yourself. These demonstrate that all of the points have been solved. You can find the code here, along with the README to get you started. We did not provide videos for point 7, as I believe the videos for points 1 to 6 show that this would not be a problem.
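To make point 1 above concrete, here is a hypothetical sketch in plain JavaScript of what "screens as directives" amounts to. The registry, identifiers, and functions below are invented for illustration and are not the real UFJS or AngularJS API: the idea is simply that a directive's link function looks a screen up by identifier and mounts its DIV into the host element AngularJS hands it.

```javascript
// Invented screen registry: maps identifiers to screen factories.
const screenRegistry = {
  screens: {},
  register(id, factory) { this.screens[id] = factory; },
  lookup(id) { return this.screens[id](); },
};

// A screen is modelled, as before, as a DIV plus life-cycle callbacks.
screenRegistry.register('hello-screen', () => ({
  div: { tag: 'div', text: 'Hello from a UF-style screen' },
  onReveal() { this.revealed = true; },
}));

// Stand-in for what a directive's link function would do with the host
// element: resolve the screen, mount its DIV, and fire the reveal callback.
function linkScreen(screenId, hostElement) {
  const screen = screenRegistry.lookup(screenId);
  hostElement.children.push(screen.div);
  if (screen.onReveal) screen.onReveal();
  return screen;
}

// "Host element" as a plain object standing in for a DOM node.
const host = { tag: 'uf-screen', children: [] };
const mounted = linkScreen('hello-screen', host);
console.log(host.children[0].text); // 'Hello from a UF-style screen'
```

In the real integration the outer container is AngularJS itself; the sketch only shows the direction of the dependency, the AngularJS side pulling UF screens in by identifier rather than UF hosting AngularJS components.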

Eder has also created several short videos running the examples, one for each of the use cases, and put them into a YouTube playlist. He has added text and callouts to make it clear what's going on:
AngularJS + UF PlayList
  1. Overview explaining what each video demonstrates (33s).
  2. AngularJS App + UFJS, client-only distribution using Bower (2m30s).
    • Install and play with UFJS through Bower
    • Create a Native AngularJS App
    • Integrate this app with UFJS
      • Show UF Screen Directives
      • Show UF Perspective Directives
  3. AngularJS App + UFJS client and UF Server.
    • 1 of 2 (3m58s).
      • Download UF War
      • Install and run on EAP
      • Download and run our Angular demo on Apache
      • Show AngularJS Routes + UF Integration
    • 2 of 2 (4m06s).
      • Use UF to create Dynamic Screens and Perspectives
      • Encapsulate an AngularJS template in a UF Screen
      • Show an AngularJS App (inside a UF screen) nested in a parent controller.
        • Demonstrates multiple levels of controller nesting.
  4. KIE UF Workbench RAD environment with AngularJS component.
  5. Uberfire Editor working seamlessly as an Eclipse editor.
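
The multi-level controller nesting shown in video 3 rests on AngularJS's prototypal scope inheritance: a child scope reads parent properties through the prototype chain until it shadows them. A minimal plain-JS sketch of that mechanism (not UberFire or AngularJS code, just the underlying idea):

```javascript
// Child scopes chain to their parent prototypally, as AngularJS scopes do.
function Scope(parent) {
  return Object.create(parent || Object.prototype);
}

var root = Scope(null);
root.user = 'hanna';

var child = Scope(root);       // e.g. a controller inside a UF Screen
var grandchild = Scope(child); // e.g. an app nested inside that Screen

console.log(grandchild.user);  // 'hanna' — read through the chain
grandchild.user = 'eder';      // shadows the property on the grandchild only
console.log(root.user);        // still 'hanna'
```

This is why an AngularJS app embedded inside a UF Screen can still see bindings from its enclosing controllers without any extra wiring.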
For completeness, the original videos showing the JSFiddle-inspired RAD environment, which demonstrates a UF-first polyglot environment, have been added to the playlist; see point 4 above.

Finally, just to show off, and because we can, we added a bonus video demonstrating a UF editor component running seamlessly in Eclipse. This demonstrates the power of our component model, which has been designed to allow our components to work standalone in any environment. We use Errai to intercept all the RPC calls and bridge them to Eclipse. Because the virtual file system our editors use is, like our other services, decoupled and abstracted, we can adapt it to Eclipse file I/O. For the end user the result is a seamless editor that appears native. This allows the development of components that can work on the web and in Eclipse, or even IntelliJ. We'll work on making this example public at a later date.
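
The design that makes this possible can be sketched in a few lines: the editor depends only on an abstract read/write interface, and each host supplies an adapter. The interface names below are illustrative, not UberFire's actual VFS API; an in-memory backend stands in for both the Errai RPC adapter (web) and the Eclipse file I/O adapter:

```javascript
// The editor knows nothing about where files live — only the VFS interface.
function makeEditor(vfs) {
  return {
    open: function (path) { return vfs.read(path); },
    save: function (path, text) { vfs.write(path, text); }
  };
}

// In the browser this adapter would bridge calls over Errai RPC to the
// server; in Eclipse the same calls would be bridged to Eclipse file I/O.
// Here an in-memory map stands in for either backend.
function memoryVfs() {
  var files = {};
  return {
    read: function (path) { return files[path]; },
    write: function (path, text) { files[path] = text; }
  };
}

var editor = makeEditor(memoryVfs());
editor.save('/rules/discount.drl', 'rule "x" when then end');
console.log(editor.open('/rules/discount.drl'));
```

Swapping the adapter is all it takes to move the same editor between the web, Eclipse, or IntelliJ, which is exactly what the bonus video shows.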

Here are some screenshots taken from the videos.


Finally, to all those who said it couldn't be done!

by Mark Proctor at November 21, 2014 01:30 AM