Planet BPM

April 14, 2015

BPM-Guide.de: Free: Camunda BPM Online Training

My formidable Co-Founder Bernd Rücker created a self-paced training course for Camunda BPM. It consists of 4.5 hours of video plus a couple of hands-on exercises with sample solutions.

You can take this course if you want to get your feet wet with Camunda, and it also provides some valuable insights into best practices from our consulting experience, e.g. for creating UIs in different technologies, writing unit tests or handling transactions.

And it’s free! You just have to sign up for the Camunda BPM Network, and off you go.

Get the Camunda BPM Online Training

by Jakob Freund at April 14, 2015 01:07 PM

Sandy Kemsley: London Calling To The Faraway Towns…For EACBPM

I missed the IRM Business Process Management Europe conference in London last June, but will be there this year from June 15-18 with a workshop, plus a breakout session and a panel session. It’s...

[Content summary only, click through for full article and links]

by sandy at April 14, 2015 11:41 AM

April 13, 2015

Keith Swenson: bpmNEXT – Day 2

Here are my notes from the second day of bpmNEXT on March 31, 2015.  Note: I spoke on day 3 and was too busy to take notes that day, so these conclude my notes of the event.

 

Michael Grohs, Sapiens DECISION – How to manage Decision Logic

Decision-aware business models are simpler and easier to maintain, with less ambiguity than natural language descriptions.  Rule content can be managed by users because it is clearly separate from the program.  Communities make their own vocabularies.  Decision management can produce rules that run in several different rules engines.

Showed the decision design tool.  Typically a whole team works on it, playing different roles.  There is a defined process for changing rules, and every change is tracked.  (The product uses JavaScript “windows” inside a browser, and the demo seemed to have trouble managing the windowing.)

A decision starts with an octagon, followed by squares with the top corners chopped off, each representing a rule family, linked together with arrows.  Each rule family declares what values (facts) it generates.  In a typical flat rule set the inferential information is lost, and once lost it is hard to maintain.  Opening a rule family, it looks like a table, essentially a decision table.  Each row is OR’d with the earlier rows.  Not sure whether it is the first row that matches or the last row that matches that wins.

Example was a rules base, and then a specialized rule base for a particular state: Florida.  You can see a side-by-side window, with the differences highlighted.  There are logs of all changes.

Q: What about knowledge workers?  Can they use it, and can they have their own rules?

A: Right now focusing on the automation, the 80/20 part.  Everything is about faster and cheaper.  In the future we may think about more elaborate rules.  Agility and closed loop with knowledge worker is the next step.

Gero Decker, Signavio – Business Decision Management

BPM addresses 50% of the questions; the rest is making decisions.  There is a new standard, Decision Model and Notation (DMN 1.0).  Signavio Decision Manager concentrates on the modeling and governance.

Drew up a quick BPMN diagram.  Used a “business rule” task.  Open that up and see a decision tree.  Open one node in the tree and it looks like a decision table.  Created a quick example decision table.  A decision node in the tree can have “sub-decisions,” which are more decision tables.  The product does some decision checking.  There is a rule testing capability as well.

DMN is pretty powerful, it covers rules as well as predictive analytics – not supported yet by Signavio.

Export to DROOLS code, which is a pretty flat view of all the rules.  The hierarchy is not apparent, but is coded into the rules.  There is a declare statement at the top for run-time binding to the execution environment.

As for execution of the rules in the DMN standard, there are differing hit policies: first hit, multiple hits, some sort of weighting, last hit.  He demoed “exclusive,” which means that you cannot have overlapping rules; in the exclusive case the tool automatically shows you when rules overlap.
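To make the hit policy idea concrete, here is a minimal hand-rolled sketch (my own illustration, not Signavio’s tooling or any DMN engine’s API) of a decision table evaluated under a “first hit” policy, plus a “unique/exclusive” check that flags overlapping rules:

```java
import java.util.*;
import java.util.function.Predicate;

// A rule is a set of input conditions plus an output value.
record Rule(Predicate<Map<String, Object>> condition, String output) {}

class DecisionTable {
    private final List<Rule> rules;

    DecisionTable(List<Rule> rules) { this.rules = rules; }

    // "First hit": return the output of the first rule whose condition matches.
    Optional<String> evaluateFirstHit(Map<String, Object> input) {
        return rules.stream()
                .filter(r -> r.condition().test(input))
                .map(Rule::output)
                .findFirst();
    }

    // "Unique/exclusive": at most one rule may match; more than one is a modeling error.
    Optional<String> evaluateUnique(Map<String, Object> input) {
        List<Rule> hits = rules.stream().filter(r -> r.condition().test(input)).toList();
        if (hits.size() > 1) {
            throw new IllegalStateException("Overlapping rules for input " + input);
        }
        return hits.isEmpty() ? Optional.empty() : Optional.of(hits.get(0).output());
    }
}

class HitPolicyDemo {
    public static void main(String[] args) {
        DecisionTable table = new DecisionTable(List.of(
                new Rule(in -> (int) in.get("amount") > 1000, "manual review"),
                new Rule(in -> (int) in.get("amount") <= 1000, "auto approve")));
        Map<String, Object> input = Map.of("amount", 1500);
        System.out.println(table.evaluateFirstHit(input)); // Optional[manual review]
    }
}
```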

John Reynolds – Kofax – Digital World

Used to be that BPM ignored the physical world, and BPEL is the best example of that.   Now we need to engage customers in the real world.  One rule: don’t force users to gather information that is already out there — use a robot instead.  Some information is there in paper form.  Claims that a lot of people still print their PDF files.  Scanning is what Kofax does, and Kapow won an award last year for creating BPM processes.  Now, SignDoc for signing applications.

For the demo, he got out a utility bill and a driver’s license.  Processes of the past too often assumed that all the information was already there.  He holds the smartphone over the document and captures it from the video stream, so it processes and cleans up the image specifically for optical character recognition.  Processed on the phone.  The image has the coffee stain removed and is converted to black and white.  This cleanup is a kind of “compression,” which is important for mobile and storage.

The documents are scanned in using some scanning libraries that they make available to put into custom apps.  The user does not have to re-enter anything … just take pictures of the documents.

These documents “teach” the transformation servers.  There is no coding, but instead teaching.  Characters are recognized, and then the fields on the document are recognized as well.  There is a manual correction step that feeds back to improve the recognition.

Mike Marin, IBM – Mobile Case Management and Capture

Mobile is no longer optional.  The use case is an insurance company that has unhappy customers and decides to implement a mobile app to improve the customer experience.  Will show content, capture, and case — all on the mobile device.

Again, with the phone, he took a picture of the document.  After taking the picture, there are some options for cleaning up the document and submitting it to the process / case.  Can review the contents.

Robert Shapiro – Process Discovery

Take the event logs, and mine them in order to work out a BPMN process model that will have the same statistical behaviors as the event log.  Example demo will be on stat orders. Looking at analytics, we see that we are not meeting the objective KPI.  First figure out all the paths, and analyze all the variants, and find a critical path.  We can see that one step is causing a lot of the delays.  Propose two different strategies: one to reduce time, another to reduce costs.  After optimization, we meet the delay time reduction criteria, and we see that the second case has better cost benefit values.

Started the demo by opening an event log.  This creates a BPMN serialization.  He has used the idea of “strict BPMN” to enforce BPMN semantics on the model.  It found a top model and two sub-models.  It created events and gateways which never were in the log — they had to be figured out.  He showed a series of different models that had been mined.  One had found 3 parallel tasks.  The models also look at the data, and find correlations between different data items and paths.  This can be used to mine the branch conditions.  It can discover a 1-hour timer event as well.

It can even detect manipulations of data, if data values are captured in the event log.
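As a rough illustration of the first mining step he described (figure out all the paths and analyze the variants), here is a minimal sketch (not Robert’s tool) that groups a flat event log by case id and counts how often each trace variant occurs:

```java
import java.util.*;
import java.util.stream.Collectors;

// One row of an event log: which case, which activity, when.
record Event(String caseId, String activity, long timestamp) {}

class VariantMiner {
    // Group events by case, order each case by time, and count identical activity sequences.
    static Map<List<String>, Long> traceVariants(List<Event> log) {
        Map<String, List<Event>> byCase = log.stream()
                .collect(Collectors.groupingBy(Event::caseId));
        return byCase.values().stream()
                .map(events -> events.stream()
                        .sorted(Comparator.comparingLong(Event::timestamp))
                        .map(Event::activity)
                        .toList())
                .collect(Collectors.groupingBy(trace -> trace, Collectors.counting()));
    }

    public static void main(String[] args) {
        List<Event> log = List.of(
                new Event("c1", "Receive order", 1), new Event("c1", "Check stock", 2),
                new Event("c1", "Ship", 3),
                new Event("c2", "Receive order", 1), new Event("c2", "Ship", 2));
        traceVariants(log).forEach((trace, count) ->
                System.out.println(trace + " occurred " + count + " time(s)"));
    }
}
```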

Q: (Bruce) Impressive to figure this all out from the logs.  What happens if the data causes a branch 90% of the time, but not 100%?

A: It never requires 100%.  There are statistical assessments of the conditions.  Not 100% precise.

Q: Is simulation the key difference from Disco and the others?

A: You need to have a complete, executable model if you want to make changes and improve the model.

Tim Stephenson – Omny Link – Toward Zero Code

Looked at a bunch of self-coding approaches but found that things didn’t work too well.  Going to focus today on decisions.  WordPress claims that it runs a quarter of the web.  “Firm Gains” is a business for selling your business.  First step is to build a form.  Then a decision table.

The demo started by logging into WordPress.  Edited what looked like a blog post, but it was actually a form.  Standard list of fields without any programming.  Went to another page, inserted a square-bracket-style WordPress tag to include the form, and it appeared.  Very easy, very simple workflow processes.

Scott Francis – BP3 – Sleep at Night Again

How to automate static analysis for BPM.   Your design team is not always experienced.  Once they select technology, they implement, and many times the result is over-engineered and hard to maintain, and sometimes has to be thrown out.  In reality, the cruft does not get added all at once; instead it is incremental.  Neches is a tool to analyze the code, find problems early in the iterations, and keep them from building up.  It does a complexity measurement on the application.

Neches is a SaaS tool.  You can sign up for an account.  Users upload their application and drill into it.  There can be many versions, and you can see how the measurements have changed over time.  You can drill down into the individual metrics.  Example: the length of JavaScript server scripts triggers a warning if it exceeds a threshold.  Particular rules can be excluded (if you don’t agree with them), or particular flagged issues can also be excluded.
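To illustrate the kind of rule he showed (my own sketch, not Neches), here is a minimal check that flags server-side scripts whose line count exceeds a threshold, with a way to exclude a rule you don’t agree with:

```java
import java.util.*;

class ScriptLengthRule {
    static final String RULE_ID = "script-too-long";

    // Flag any script longer than maxLines, unless the rule has been excluded.
    static List<String> check(Map<String, String> scriptsByName, int maxLines, Set<String> excludedRules) {
        List<String> findings = new ArrayList<>();
        if (excludedRules.contains(RULE_ID)) {
            return findings;
        }
        for (var entry : scriptsByName.entrySet()) {
            long lines = entry.getValue().lines().count();
            if (lines > maxLines) {
                findings.add(entry.getKey() + ": " + lines + " lines exceeds threshold of " + maxLines);
            }
        }
        return findings;
    }

    public static void main(String[] args) {
        Map<String, String> scripts = Map.of("validateOrder.js", "var a = 1;\nvar b = 2;\nprocess(a, b);");
        System.out.println(check(scripts, 2, Set.of())); // [validateOrder.js: 3 lines exceeds threshold of 2]
    }
}
```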

Q: How complicated is it to create new rules?

A: Not too hard.  Today this is not exposed, but internally we find it easy to do, and believe once this is exposed people will find it easy enough.

Linus Chow – Oracle – Rapid Process Excellence

Showed a web console for starting and interacting with BPM applications.  Mobile interface as well.


And that is it.  On day 3 I had a presentation and was too busy to focus on taking good notes; the presenter’s laptop is also sequestered for the entire session beforehand.  Overall bpmNEXT remains a place for very forward-looking discussion of new directions, a place that helps me stay on top of things.  The new venue — Santa Barbara — is likely to remain the choice for next year.  I am looking forward to it already.


by kswenson at April 13, 2015 10:25 AM

April 10, 2015

BPM-Guide.de: From Push to Pull – External Tasks in BPMN processes

A process engine typically calls services actively (e.g. via Java, REST or SOAP) from within a Service Task. But what if this is not possible because we cannot reach the service? Then we use a pattern we call the “External Task” – which I briefly want to describe today.
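The gist: instead of the engine pushing a call to the service, an external worker inside the service’s network pulls open tasks from the engine, does the work, and reports completion. A minimal sketch of such a worker, assuming a hypothetical fetch-and-complete API on the engine (placeholder names, not the actual Camunda interface):

```java
import java.util.List;

// Hypothetical client for an engine that exposes external tasks; the method names
// are placeholders for illustration, not a real engine API.
interface EngineClient {
    List<ExternalTask> fetchAndLock(String topic, int maxTasks);
    void complete(String taskId, Object result);
}

record ExternalTask(String id, Object payload) {}

class ExternalTaskWorker {
    private final EngineClient engine;

    ExternalTaskWorker(EngineClient engine) { this.engine = engine; }

    // Poll the engine from inside the network where the target service is reachable,
    // execute the work locally, and report the result back (pull instead of push).
    void pollOnce(String topic) {
        for (ExternalTask task : engine.fetchAndLock(topic, 10)) {
            Object result = callLocalService(task.payload());
            engine.complete(task.id(), result);
        }
    }

    private Object callLocalService(Object payload) {
        // Here the worker would invoke the service it can actually reach, e.g. via local REST or Java.
        return "done";
    }
}
```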

Picture on the right taken from http://www.from-push-to-pull.com/projects/what-is-pull-marketing/ – thanks!

Context and problem

A couple of recent trends increased the need for this pattern, namely:

Cloud: When running process/orchestration engines in the cloud you might not be able to reach the target service via network connections – and VPNs or tunneling are always cumbersome. It is …

by Bernd Rücker at April 10, 2015 10:10 AM

April 09, 2015

BPM-Guide.de: Orchestration using BPMN and Microservices – Good or bad Practice?

Martin Fowler recommends in his famous Microservices Article: “Smart endpoints and dumb pipes”. He states:

The microservice community favours an alternative approach: smart endpoints and dumb pipes. Applications built from microservices aim to be as decoupled and as cohesive as possible – they own their own domain logic and act more as filters in the classical Unix sense – receiving a request, applying logic as appropriate and producing a response. These are choreographed using simple RESTish protocols rather than complex protocols such as WS-Choreography or BPEL or orchestration by a central tool.

I do not agree! I think even – …

by Bernd Rücker at April 09, 2015 11:35 AM

April 02, 2015

BPM-Guide.de: bpmNEXT – the BPM industry event that *really* matters

Picture taken by Benjamin Notheis from SAP, this year’s winner of the best-in-show-award

Clay Richardson from Forrester Research put it in a nutshell: “bpmNEXT means ‘Show me yours, I’ll show you mine'”.

And show we did: All BPM software vendors that *really* matter were there, presenting the latest and greatest they have to offer – or will offer soon. This was not about sales or marketing, but just about showing off the things we’re proud of, and showing them off to peers who understand and appreciate the passion behind it.

But bpmNEXT is even more, it is the global gathering of a …

by Jakob Freund at April 02, 2015 01:25 PM

Thomas Allweyer: Users of process modeling tools are largely satisfied

On average, the users surveyed in a new study by BPM&O rate their process modeling tool with a grade of 2.6 (German school scale, where 1 is best). So they are at least largely satisfied. Interestingly, satisfaction declines the longer a tool has been in use. The study’s authors attribute this to requirements and conditions changing over time, so that the originally chosen tool no longer fits quite as well.

With a total of 64 participants the study is not representative. Nevertheless, it offers an interesting overview of the experiences and opinions of users. Most of the participants are modeling experts from BPM staff units or process analysts. The most widely used notation is BPMN; it was named twice as often as the EPC. Interestingly, value chain diagrams, which are used for overview representations and process maps, were employed comparatively rarely.

The tools are usually provided by the internal IT department. SaaS offerings are used in only 15% of the cases so far. Most important to users is ease of use. A good portal for publishing the process models and features for involving the business departments are also highly important. For the future, the modelers want more powerful reporting capabilities and improved process portals from the tool vendors.

The study can be downloaded at www.bpm-toolmarktmonitor.de (registration required). A vendor survey conducted last year can also be found there. In addition, you can take part in the user survey yourself, which is being continued on an ongoing basis.

by Thomas Allweyer at April 02, 2015 10:02 AM

April 01, 2015

Sandy Kemsley: bpmNEXT 2015 Day 3 Demos: Camunda, Fujitsu and Best In Show

Last demo block of the conference, and we’re focused on case management and unstructured processes. Camunda, CMMN and BPMN Combined Jakob Freund presented on OMG’s (relatively) new...

[Content summary only, click through for full article and links]

by sandy at April 01, 2015 07:02 PM

Sandy Kemsley: bpmNEXT 2015 Day 3 Demos: IBM (again), Safira, Cryo

It’s the last (half) day of bpmNEXT 2015, and we have five presentations this morning followed by the Best in Show award. Unfortunately, I have to leave at lunchtime to catch a flight, so you...

[Content summary only, click through for full article and links]

by sandy at April 01, 2015 05:31 PM

Sandy Kemsley: bpmNEXT 2015 Day 2 Demos: Omny.link, BP-3, Oracle

We’re finishing up this full day of demos with a mixed bag of BPM application development topics, from integration and customization that aims to have no code, to embracing and measuring code...

[Content summary only, click through for full article and links]

by sandy at April 01, 2015 12:04 AM

March 31, 2015

Sandy Kemsley: bpmNEXT 2015 Day 2 Demos: Kofax, IBM, Process Analytica

Our first afternoon demo session included two mobile presentations and one on analytics, hitting a couple of the hot buttons of today’s BPM. Kofax: Integrating Mobile Capture and Mobile...

[Content summary only, click through for full article and links]

by sandy at March 31, 2015 10:07 PM

Sandy Kemsley: bpmNEXT 2015 Day 2 Demos: Sapiens Decision, Signavio

We finished the morning demo sessions with two on the theme of decision modeling and management. Sapiens: How to Manage Business Logic Michael Grohs highlighted the OMG release of the Decision Model...

[Content summary only, click through for full article and links]

by sandy at March 31, 2015 07:24 PM

Sandy Kemsley: bpmNEXT 2015 Day 2 Demos: Trisotech, Comindware, Bonitasoft

The first group of demos on bpmNEXT day 2 had a focus on the links between architecture and process: from architectural modeling, to executable architecture, to loosely-coupled development...

[Content summary only, click through for full article and links]

by sandy at March 31, 2015 05:33 PM

Keith Swenson: bpmNEXT – Day 1

My notes from first day of bpmNEXT 2015, March 30.

Bruce Silver – Conference introduction

Today we focus somewhere between BPM and Enterprise Architecture.  15 years ago we thought it was huge that we had one system to integrate human and back-end systems, and we have come a long way.  Now, there is still too much balkanization of the technology.

Main Themes of the Conference:

  1. Breaking the barrier between BPM and Enterprise Architecture.  Anatoly and a colleague from Comindware are going to talk about the 3 gaps.  Denis Gagne will talk about the semantic graph to break down barriers.
  2. Bridge gap between process modeling and decision modeling.  Called “business rules” back then, as if this was an alternative to BPM.  Sapiens has started something called the “Decision Model” because this is too important to leave to the existing approach.  Signavio will also show business decision modeling.
  3. Bridge the gap between BPM and Case Management.  Camunda is offering a unified BPMN/CMMN execution.  Safira and Cryo will present on how BPM needs to be loosened up.  Kofax and IBM will present on mobile case management and capture.  How do we do case management on our smart phones?  Including signature capture.  IBM has put a lot of emphasis on design, so we might see some of that.
  4. Expanding into new things like the Internet of Things. Presentation from SAP and W4 will focus on this.
  5. Expanding into expert systems and machine learning.  BP3 will present on the automated analysis of BPM code.  Fujitsu (Keith) will present on reconciling independent experts.  IBM will talk about Watson not just winning Jeopardy, but how it can be used in the cloud with pre-trained services.  Living Systems (Whitestein) will show measurable intelligence in the process platform.
  6. Expanding into process mining, and Robert will speak about optimization of resources from this.
  7. Reaffirming core values of business empowerment.  Omny.link puts BPM in WordPress for non-programmers.  Oracle will talk about BPM in the public cloud.
  8. Reaffirm embracing continual change, a presentation by bonitasoft on building “living applications”.

Nathaniel Palmer – What will BPM look like in 2020?

Today, BPM looks like well-defined, fixed routing of packages: channels, switches, but no awareness of what the other packages are doing.  Where does it need to be: like an Amazon warehouse with Kiva robots.  Needs to be data-driven, goal-oriented, adaptive, intelligent automation.

Three things:  Robots, Rules, and Relationship.

An illustration of the change from 2005 to 2013 – the smartphone example at the announcement of the new pope.

60% of people switching banks in the past year did so because of insufficient mobile banking capabilities.  Mobile support is the most important thing.  But don’t just transport the laptop UI to the mobile device.  Gave an example of an Oldsmobile radio ad that was moved to TV, showing a static picture with the radio ad playing behind it.  The new medium affords new forms of content.   Showed an automated teller as the “state of automation today,” which was obviously not mobile.

What can you do if you have mobile?  Kindle Fire has a “MayDay” button — you press it and get an instant conversation with a support person.  This instant connection enables “relationship”.  He showed the Echo from Amazon, because Echo can help walk you through an Amazon purchase.  Also showed My Jibo, which was popularized through a Kickstarter campaign.  Not to automate tasks, but to interface with tasks.

Another thing is wearables, including wearable workflow.  Tasks might change to not be a single discrete unit of work; remove the distinction between the task and the things that support the task.   The three-tier architecture is common today.  We need to move toward a four-tier architecture: client tier (mobile), delivery tier, aggregation tier, and services tier.  JSON and REST, and tasks need to be discoverable.

Process mining and optimization.  Data-driven, goal-oriented, adaptive, intelligent automation.


Clay Richardson – Reinventing BPM for the age of the customer

Nathaniel’s talk focused on customer experience.  10 years ago much of process was focused on back end systems, and we have changed.  Today it is how to engage customers with mobile.

60% of all business leaders prioritize revenue growth and customer experience.

Four periods of history:

  • (1900) age of manufacturing,
  • (1960) age of distribution,
  • (1990) age of information, and finally
  • (2010) the age of the customer.

Told a story about a promotion combining Jaguar and Thomas Pink.  The packaging was excellent, but the reception was completely bad.  The bad impression is a perfect example of a process failure: the dealer had not been informed, they were not prepared, not engaging.  He really wanted a memorable, rich, engaging experience.

The big challenge today is getting across from the old to the new.  42% of business people rate better mobile support as a critical or high priority.  Examples of new mobile apps to order pizza from Pizza Hut and Dominos.  Pizza Hut simply ported their web site to the phone, and it took about 20 minutes to make the first order.  Dominos, on the other hand, made something that works very well: easy to order, buttons for what you ordered last time, and a tracker to tell you where your order is.  Another example is buying US savings bonds.  Clay helped to redesign this, and found that the changes required broke many of the assumptions in the back-end systems.

BPM people don’t have a lot of credibility for improving customer experience.  Need a new title “Digital Customer Experience Arcthitect”.    First, digitize customer end-to-end experience, Another is “Digital operational excellence architect” for drive rapid customer centric innovation and to support prototyping.

What has to change in BPM?  He produced a customer-centric BPM tech radar.  Two key items on this chart: 1) low-code platforms, 2) customer journey mapping.   Simple cloud orchestration, how to quickly program devices, how to connect devices.

The customer-facing cadence is faster; the real driver is a need for speed.  It used to take months to get things done; now, when touching customers, we need to work faster.  This is driving the “low code” approach: develop in weeks, release weekly, the method is test-and-learn, and adoption is now intuitive.

Gave an example of a customer journey from Philips medical devices to sell a life alert bracelet.  There is an opportunity to redesign the delivery, because the older patient is often anxious and the purchasing customer needs to be informed.  Another issue was billing, since that was being split three ways; make it easier to do this.


Panel on BPM Business

  • Miguel Valdes Faura – Bonita Soft
  • Scott Francis – BP3 Global
  • Denis Gagne – Trisotech

Miguel – Open Source is the key to building a successful ecosystem.  Akira Awata has translated the entire platform into Japanese and there is a big uptick in usage.  Before that, downloads in Japan were limited.  Open Source BPM Japan.   Now reselling subscriptions to businesses like Bridgestone and Sony.  The Bonita BPM Essentials book was developed on the open source version, and people can download and access the examples.  Banking regulation is changing in Switzerland, and some are using processes in Bonita to match these new rules.  There are large benefits to the open source model.

Scott Francis – How to move from Lombardi to independent.  Started trying to be the best Lombardi partner, and then IBM partner.  People worry about time, money, and focus, and it is focus that is the easiest to lose track of.  Learned to find our own customers – IBM did not refer anything.  Service providers get a lot of pressure to pick up other products, for example pick up more IBM products, but it might be better to focus on one product and do the job really well.

Denis Gagne – Two hobbies: building an ecosystem of BPM & standards work.  Still amazed at how much “BPM-101” needs to be taught.  There is a need for us as a general community to educate better.  The BPM Incubator has more members outside of the US than inside; 190 countries.  We all benefit if the BPM community is better informed.

Q: (Neil) Convergent vs. Divergent Standards.  Why do standards sometimes work, and sometimes not?

A: It is easier to have agreement when you are only touching one set of customers.  Bonita has 1000 customers, but they use only about 30% of BPMN.

Q: (Bruce) People are building apps.  Is the problem that the BPM platforms don’t provide something suitable for those apps?

A: (Miguel) This is an important question.  How to make sustainable apps?  We have been doing a poor job in the BPM industry of helping people to make customized UIs.  There is a portal, and there is a level of customization, but you are constrained to the box.  You can’t say, put a button in the right corner of the screen.  How to change?

  1. A low-code approach to avoid the need for developers, or
  2. instead make things that support developers, to make them more powerful.

(Scott) A lot of the people doing mobile apps, have no concept of process.  Once the data is shipped back to the server, they don’t care where it goes. Opportunity to fill the gap between mobile and back end.


Neil Ward-Dutton – Schroedinger’s BPM

Is this the end of BPM?  Are we seeing the end of “business transformation”.  Where are we going next?

Is it dead?  The term BPM is disappearing from conversations.  People don’t want to talk about it.  Instead they use smart process, case management, anything else.  BPM technology platforms are growing at 3% (Clay thinks 8%).  Maintenance revenue is dominating license revenue.  However, there are a lot of inquiries, particularly from non-traditional sectors.  Actually we are probably in the very middle of the adoption curve.

A BPMS is fundamentally unlike most enterprise technologies.  A really weird and horrible chimera.   Hard to map onto the ways that people normally work.  The innovators think they can use BPM to reinvent the way they work.  But the mainstream reject it, having tried it and wasted time and money.  Just another attempt to get us locked into an enterprise platform.  Culture change is too expensive.

Someone created a “Customer Project Manager” role to help with premium in-home customer services.  Didn’t call it BPM.  This was about agility.  Another example was a large bank whose IT-led enterprise-wide transformation failed big time.

They are embracing cloud aggressively.  They are using agile ways of working.  Low-cost propositions.  The lightweight approaches are about spending less up front.   Why are there all these people out there building these apps, but not really engaging with the back end?  The culture change is not coordinated: it is too scary.

Low-code is what we used to call 4GL.

New agile enterprise has no “target operating model.”  They don’t know what it will be.  This is not the way we did transformation ten years ago.  First instrument, then provide agility of services.

Why would you do “simulation” when you could put the real solution in the hands of real users and observe how it works?

The customer journey slide was very interesting.  Knowing the customer is not enough: build up from knowing to surfacing, then acting, and finally shaping.   That all needs to be done across marketing, sales, operations, and service.

Advice: don’t fixate on SPA’s, don’t obsess over traditional competitors, don’t fixate on throwing more in the box, do find ways to enhance BYOP particularly with auditability, do look at the implications of digital strategies, do enable clients to take the portfolio management approaches to business processes, do partner, buy, build.


Remy Glaisner – Myria Research – Chief Robotics Officers and BPM

RIOS – Robotics and Intelligent Operational Systems: automation, robotics, and mobile technologies.  There will soon be many people calling themselves “Chief Robotics Officers.”   This is a completely open, nascent field — no leaders yet.  The inflection point is expected in 2017-2018.  For manufacturing they are already there, but agriculture is a ways off.

Client acquisition is based largely on how fast you can deliver.  Automation, including robotic automation, is quite important.

By 2025 over 60% of manufacturers over $1B will have a Chief Robotics Officer (CRO) on staff.


Benjamin Notheis, Harsh Jegadeesan – SAP

Internet of Things.  There are others: Internet of People, Internet of Places, and Internet of Content.    All four of these together.  Wil van der Aalst talks about the Internet of Events.  IoE means massive data (bigger than big data).  Event stream processing senses patterns in real time.  Once a pattern is identified, one can respond with rules and processes.

Presented a use case about a person who manages pipelines in L.A.  Events notify that there is a problem.   The options to replace a pump are given, with different prices and different qualities of pump.  The demo is hard to describe here — so see the video.  At one point he assigned a task to someone just by typing “@manny escalate issue”; the user was found and the task assigned.  Very dynamic!   Had a visual depiction of incidents displayed as tiles, where the size of the tile represents the number in that category.

The coolest part of the demo was when he showed the user interface display on a watch display.  One could see the task, see the data values and options, make an audio annotation of the task, and mark the task as completed.  All from the watch.

An Eclipse-based modeler shows extended BPMN.   Models can be imported to http://bpmn.io.  This is compiled to JavaScript for running in the SAP cloud service.   He referred to the JavaScript event loop.

Q: does this use NetWeaver and/or work with it? A:  Basically, not much.  It is a new process engine implemented over the last 6 months or so.



Francois Bonnet, W4

Francois gave a great presentation and demo around a use case of monitoring the elderly and responding to falls.  Showed a “faintness sensor” based on a Raspberry Pi.  When it tilted for more than a few seconds, it started a BPMN process.  A heart rate event might cause this process to escalate through various steps, such as calling the person, calling a neighbor, and sending in a response team.  If the sensor got back upright, the process was cancelled.  If the fall happened too many times in a particular period, it started a different process.

It was pretty interesting that the event modeling was done effectively in BPMN; however, the aggregate event (falling too many times in a period) was not modeled directly in BPMN.
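As a rough sketch of the idea (not W4’s actual code), a sensor reading that persists past a threshold could start a process instance by calling a hypothetical start endpoint on the process engine:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

class FaintnessSensor {
    // Hypothetical endpoint; the real engine's start API will differ.
    private static final String START_URL = "http://process-engine.example/start/fall-response";
    private final HttpClient http = HttpClient.newHttpClient();

    // Called when the tilt sensor has been tipped over longer than the threshold.
    void onSustainedTilt(String residentId) throws Exception {
        String body = "{\"residentId\":\"" + residentId + "\"}";
        HttpRequest request = HttpRequest.newBuilder(URI.create(START_URL))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        HttpResponse<String> response = http.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Started fall-response process: " + response.statusCode());
    }
}
```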


Dan Neason, Living Systems

Covered the Whitestein system.  All processes have a reflection capability so you can ask a running model what it is capable of doing.  Interesting demo, but hard to capture here.


Jim Sinur – Swarming and Goal Directed Collaborative Processes

BPM is not a sexy term any more.  What else do we go to?  There is the notion of Hybrid Processes.  Could go with that, but as Neil pointed out, growth is not that high.

The idea we should follow is that of transforming the digital organization.  How do processes help organizations become digital organizations.

Got some of this from Keith’s presentation last year, where he showed a video of starlings flocking (murmurating).  The idea is that birds guided by simple rules can act collectively in an emergent way.  But we need to think about flocks with starlings, ducks, geese, sparrows, etc.   We will have swarms of things, but they consist of robots, information systems, and everything else.

Processes should help organizations cope with the “big change” coming their way.  We force customers to go through a phone menu that matches an organization designed on industrial-age ideas.  Why force the customer through this?  Tomorrow there will be an “Uber” in every industry.

Gave an example of an insurance company that wanted general reps to be able to handle all products.  They used AI systems to help.  Tried to get rid of specialization, but they failed because the rules technology was not available.

In production in Norway is a company helping with dementia patients.  They gave patients a wristband with GPS in it.  If the patient approaches or crosses a boundary, caregivers are notified and go get them.

“Going digital” is the goal.  There are a couple of ways to get there; “do it, try it, fix it” is one approach.  Today the process is often in control.  But in the future the goals are in control of the work and the process.

Can you imagine a bunch of swarming agents deciding what to do next?  Agents have: a level of humanity, a level of collaboration, a level of intelligence, and a vector of goal-driven freedom.  Hybrid resources, hybrid process styles (cases, flows, forms), hybrid speed, hybrid goals, etc.

Example of a bike store that has a kiosk that analyzes the customer to determine mood, personality type, and body type.  She keys in information about the kinds of riding she would like, and it suggests a bike.  Imagine that there were many of these intelligent agents swarming to help sell this bike.

Another example is using a swarm to find a suitable house by sending in a photo of the kind of house you want.  It could search for similar homes, and a bank might do this in order to also offer a mortgage.  There are issues with autonomous cars and robots: legal issues.  Who do you sue when something goes wrong?

Not just UI, not just mobile, but how you treat customers and how you meet their needs is the important thing.


That is it for the first day.  Then it was off to winetasting on the roof-top patio.



by kswenson at March 31, 2015 11:20 AM

March 30, 2015

Sandy Kemsley: bpmNEXT 2015 Day 1 Demos: SAP, W4 and Whitestein

The demo program kicked off in the afternoon, with time for three of them sandwiched between two afternoon keynotes. Demos are strictly limited to 30 minutes, with a 5-minute, 20-slide,...

[Content summary only, click through for full article and links]

by sandy at March 30, 2015 11:52 PM

Sandy Kemsley: bpmNEXT 2015 Day 1: More Business of BPM

Talking with people at the first break of the first day, I feel so lucky to be part of a community with so many people who are friends, and with whom you can have both enlightening and amusing...

[Content summary only, click through for full article and links]

by sandy at March 30, 2015 07:15 PM

Sandy Kemsley: bpmNEXT 2015 Day 1: The Business of BPM

I can’t believe it’s already the third year of bpmNEXT, my favorite BPM conference, organized by Nathaniel Palmer and Bruce Silver. It’s a place to meet up with other BPM industry...

[Content summary only, click through for full article and links]

by sandy at March 30, 2015 05:44 PM

March 27, 2015

Sandy Kemsley: Going Beyond Process Modeling, Part 1

I recently wrote two white papers for Bizagi on going beyond process modeling to process execution: Bizagi is known for their free downloadable process modeler, but also have a full-featured BPMS for...

[Content summary only, click through for full article and links]

by sandy at March 27, 2015 02:55 PM

March 26, 2015

BPM-Guide.de: New Camunda Usergroup in Australia

Camunda is spreading, including in Australia. The first user group is already forming, and they will meet for the second time next week.

If you would like to swing by and meet some other Camunda users, here is what you need to know:

Date: Tuesday, March 31
Time: 5pm Melbourne time
Place: Tuscan Bar – 79 Bourke Street, Melbourne

This time you can also meet Bernd Frey, one of our senior consultants who is currently down under and engaged in a fascinating Camunda project.

Many thanks to Phillip Spartalis, who is organizing this. He has agreed to share his email address here in case …

by Jakob Freund at March 26, 2015 01:29 AM

March 23, 2015

Thomas Allweyer: Praxisforum on 20 years of process management

More than 20 years have passed since the publication of the seminal book “Reengineering the Corporation” by Hammer and Champy. The Praxisforum BPM & ERP therefore dedicates a full-day event to the development of process management over these twenty years and to the state reached today. In addition to the historical retrospective, the program includes numerous practice talks, among others from MAN, the Landschaftsverband Rheinland, BASF, Globus and Bayer. Topics covered include process excellence, process maps, data management, process automation and ERP implementation.
The conference takes place on June 16 near Koblenz. The full program and a registration form can be found here.

by Thomas Allweyer at March 23, 2015 09:21 AM

March 21, 2015

BPM-Guide.de: Review: Camunda Community Day in London

Yesterday we had our first Camunda Community Day in the UK. Thanks to our friends at 6point6 who organized this, we could meet in the famous Royal Institution. This was definitely the most decent location we have had for a community meeting so far!

It was a great half day of presentations, discussions and networking. Most of the attendees already knew existing BPM products, and when I described the Zero-Code BPM Myth they immediately knew what I was talking about. I also gave a little BPMN crash-course, and I did not use a single slide, but just live-modeled everything I explained …

by Jakob Freund at March 21, 2015 09:56 AM

March 18, 2015

Bruce Silver: Process-Driven Applications: A New Approach to Executable BPMN

One of the singular successes of BPM technology is a common language – BPMN – used both for process modeling and executable design.  At least in theory….   In reality, the BPMN created by the business analyst to represent the business requirements for implementation often bears little resemblance to the BPMN created by the BPMS developer, which must cope with real-world details of application integration.  That not only weakens the business-IT collaboration so central to BPM’s promise of business agility, but it leads to BPMN that must be revised whenever any backend system is updated or changed.  It doesn’t have to be that way, according to an interesting new book by Volker Stiehl of SAP, called Process-Driven Applications with BPMN (www.springer.com/978-3-319-07217-3).

Process-driven applications are executable BPMN processes with these characteristics:

  1. Strategic to the business, not situational apps.  They must be worth designing for the long term.
  2. Containing a mix of human and automated activities, not human-only or straight-through processing.
  3. Spanning functional and system boundaries, integrating with multiple systems of record.
  4. Performed (with local variations) in multiple areas of the company.
  5. Subject to change over time, either in business functionality or in underlying technical infrastructure, or both.

Stiehl identifies the following design objectives of process-driven applications:

  • Process-driven applications should be loosely coupled with the called back-end systems. They should be as independent as possible from the system landscape. After all, the composite does not know which systems it will actually run against.
  • Process-driven applications, because of their independence, should have their own lifecycles, which ideally are separate from the lifecycles of the systems involved. It is also desirable that the versions of a composite and the versions of the called back-end systems are independent of one another. This protects a composite from version changes in the involved applications.
  • Process-driven applications should work only with the data that they need to meet their business requirements. The aim is to keep the number of attributes of a business object within a composite to a minimum.
  • Process-driven applications should work with a canonical data type system, which enables a loose coupling with their environment at the data type level. They intentionally abstain from reusing data types and interfaces that may already exist in the back-end systems.
  • Process-driven applications should be non-invasive. They should not require any kind of adaptation or modification in the connected systems in order to use the functionality of a process-driven application.  Services in the systems to be integrated should be used exactly as they are.


Let’s look at a very simple example, an Order Booking process.  Here is the process model created by the business analyst in conjunction with the business.  Upon receipt of an order from the customer, an order entry clerk enters it into a form, from which the price is calculated.  Then an automated task charges the credit card.  If the charge does not succeed, a customer service rep contacts the customer to resolve the problem.  Once the charge succeeds, the process books the order in the ERP system, another automated task, and ends by returning a confirmation message to the customer.  If the charge fails and cannot be resolved, the process ends by sending a failure notice to the customer.


In the conventional BPMS scenario, here is the developer’s view.  It looks the same except that the simple service tasks have been replaced by subprocesses, and the service providers – the credit card processing and ERP booking services – are shown as black box pools with the request and response messages visible as message flows.  There are 2 reasons the service tasks were changed to subprocesses: One is to accommodate technical exception handling.  What happens if the service returns a fault, or times out?  Some system administrator has to intervene, fix the problem, and retry the action.  The BA isn’t going to put that in the BPMN, but it needs to be in the solution somewhere.  The second reason is to allow for asynchronous calls to the services, with separate send and receive steps.  You also notice that Book order is interacting with more than one ERP system.  Don’t you wish there was one ERP system that handled everything the customer could buy?  Well sometimes there is not, so the process must determine which one to use for each instance.  Actually an order could have some items booked in system A and other items booked in system B.  The business stakeholders, possibly even the business analyst, may be unaware of these technical details, but the developer must be fully aware.


Here is the child level of Charge credit card.  It is invoked asynchronously, submitting a charge request and then waiting for a response.  If the service times out, an administrator must fix the issue and retry the charge.  The service returns either a confirmation if the charge succeeds or an error message if it fails.  Here we modeled this as two different messages; in other circumstances we might have modeled these as two different values of a single message.   If you remember your Method and Style, the child level has two end states, Charge ok and Charge failed, that match up with the gateway in the parent level.


And here is the child level of Book order.  A decision task needs to parse the order and, for each order item, determine whether it is handled by system A or system B.  Then there are separate booking subprocesses to submit the booking request and receive the confirmation for each order item in each system.  Finally an automated task consolidates all the item confirmations into an overall order confirmation.

So you already can see some of the problems with this approach.  The developer’s BPMN is no longer recognizable by the business, possibly even by the BA.  This reduces one of BPMN’s most important potential benefits, a common process description shared by business and IT.  Second, the integration details are inside the process model.  Whenever there are changes to the interface of either the credit card service, ERP system A, or system B, the process model must be changed as well.  If this process is repeated in various divisions of the company, using different ERP and credit systems, those process models will all be different.  And third, this tight binding of process activities to a SOA-defined interface to specific application systems means the process is manipulating heavyweight business objects that specify many details of no interest to the process.

All three of these problems illustrate what you could call the SOA fallacy in BPM.  In theory, SOA is supposed to maximize reuse of business functions performed on backend systems.  In practice, SOA has succeeded in enabling more consistent communications between processes and these systems, but the reuse as imagined by SOA architects has been difficult to achieve.  The actual reuse by business processes is frequently defeated by variation and change in the specific systems that perform the services.  So, instead the PDA approach seeks to maximize the actual reuse of business-defined functionality provided by services, not across different processes but across variations of the same process, caused by variation and change in the enterprise system landscape.  This is a radical difference in philosophy.

In his book, Volker Stiehl calls this new approach Process-Driven Architecture.  This architecture layers the process design and removes all integration details from the business process model, representing the Process-Driven Application, or PDA. The services specified in the PDA process make no reference to the actual interfaces and endpoints of specific backend systems.  Instead each service in the PDA process defines and references a fixed service contract interface, specifying just the elements needed to perform the required business function, regardless of the actual interface of the backend systems required.  This service contract interface is essentially defined by the business process – by the business, not the SOA architect or integration developer.

The data elements and types used in that interface are based on a canonical data model, not the elements and types specific to a backend system.  Remember, the object is not reuse of SOA endpoints and service interfaces across business processes, but reuse of this particular service contract interface across the system differences found in various divisions of the enterprise and across changes in these systems over time.  Ideally the PDA process, from a business perspective, is universal across the enterprise and stable over time.

Translation from this stable service contract interface, based on canonical data, to the occasionally changing interfaces and data of real backend systems is the responsibility of the Service Contract implementation layer.  What makes this nice is that BPMN can be used in this layer as well.  Each integration service call from the PDA process is represented in the architecture by a Service Contract Implementation (SCI) process defined in BPMN.  This process effectively binds the system-agnostic call by the PDA process to a specific system or systems used to implement the service.  It performs the data mapping required, issues the requests and waits for responses, and handles technical exceptions.  The PDA process doesn’t deal with any of this.  Moreover, the SCI process is non-invasive, meaning it should not require any change to existing backend systems or existing SOA services.  Everything required to link the PDA to these real systems and services must be designed into the SCI process.

The beauty of this architecture is that, unlike the conventional approach, the PDA process model is the same for the business analyst and the integration developer.  All of the variation and change inherent in the enterprise system landscape is encapsulated in the SCI process; the PDA process doesn’t change.  Effectively the executable process solution becomes truly business-driven.


Here is a diagrammatic representation of the architecture. The steps in a PDA process, modeled in BPMN, represent various user interfaces and service calls. When the service call is implemented by a backend system, a business partner, or an external process, its interface – shown here as the Service Contract Interface – is defined by the PDA, not by the external system or process. For each call to the Service Contract Interface, a Service Contract Implementation process is defined, also in BPMN, to communicate with the backend system, trading partner, or external process, insulating the PDA process from all those details. The Service Contract Interface, based on a canonical data model, defines the interface between the PDA process and the SCI process. This neatly separates the work of the process designer, creating the executable PDA process, from that of the integration designer working in the Service Contract Implementation layer.  Since the PDA process and the SCI processes are both based on BPMN, the simplest thing is to use the same BPM Suite process engine for both, with communication between them using standard BPMN message flows.


Here is what it looks like with our simple order booking process reconfigured using Process-Driven architecture. The details of Charge credit card are no longer modeled in a child level diagram of the business process, but instead are modeled as a separate SCI process.  The charge credit card activity in the PDA process is truly a reusable business service.  It defines the service contract interface using only the business data required: the cardholder name, card number and expiration date, charge amount, return status, and confirmation number. It doesn’t know anything about how or where the credit card service is performed, whether it is performed by a machine or a person, the format of the data inputs and outputs, or the communications to the service provider. All of those integration details could change and the PDA process would not need to change.  The SCI process maps the canonical request to the input parameters of the actual service provider, issues the request, receives the response, maps that back to the canonical response format, and replies to the PDA service task.
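As a sketch of what that service contract interface could look like in code (hypothetical names, derived only from the fields listed above), the PDA process would see nothing but canonical types like these:

```java
import java.time.YearMonth;

// Canonical request defined by the business process, not by any credit card provider.
record ChargeCreditCardRequest(
        String cardholderName,
        String cardNumber,
        YearMonth expiration,
        long chargeAmountCents) {}

// Canonical response: just the outcome the process needs to route on.
record ChargeCreditCardResponse(
        boolean chargeOk,           // the return status
        String confirmationNumber,  // set when the charge succeeded
        String errorMessage) {}     // set when the charge failed
```

How the SCI process maps these canonical fields onto the actual provider’s parameters is invisible to the PDA process, which is exactly the point.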


If the card service is temporarily unavailable or fails to return a response within a reasonable time period, a system administrator may be required to resolve the problem and resubmit the charge. The business user is not involved in this, and it should not be part of the PDA process. This too is part of the SCI process.  However, if the service returns a business exception, such as invalid credentials or the charge is declined, this must be handled by the business process, so this detail is part of the PDA. And in fact, it should be part of the business analyst’s model worked out in conjunction with the business.

This 2-layer architecture, consisting of a PDA process layer and a Service Contract Implementation layer, succeeds in isolating the business process model from the details of application integration. But there are some problems with it…

  • First, the BPMN engine running the PDA and SCI process must be able to connect directly to all of the backend systems, trading partners, and external processes involved. In many large-scale processes, in particular core processes, this is difficult if not impossible to achieve.
  • Second, a single SCI process may involve multiple backend systems and must be revised whenever any of them changes.
  • And third, things like flexible enterprise-scale communications, guaranteed message delivery, data mapping and message aggregation are handled more easily, reliably, and faster in an enterprise service bus than in a BPMN process. So we’d like to leverage that if possible.

The solution then is to split the SCI layer in two, creating a 3-layer architecture.  The SCI process is divided into a stateful integration process and one or more stateless ESB processes.  Stateless here means short-running and able to run as a single unit of work or transaction. A stateless process cannot include human tasks, waiting for a message, or a join, or anything else that takes time and requires maintaining the state of the instance. ESBs are designed to execute these very well. A single ESB process can send a message (or possibly N messages all at once) but does not wait for a response. A separate ESB process is instantiated with each response.

The stateful integration process can be long-running, meaning it can contain human tasks, it can wait for a message, or wait for parallel paths to join.  The stateful integration process can process a correlation id, linking an instance of the stateful process to the right process and activity instance in PDA. The stateless ESB processes cannot do this. More on that in a minute.


To illustrate this let’s look at the activity Book Order, which books the order in the ERP system and generates a confirmation for the customer. Recall that this is what it looked like in our conventional BPMN. We have two ERP systems, and the process needs to look up which system applies to each order item before issuing the booking request. Here you can see some of the defects we have just discussed: The process model must map to the details of system A and system B; the system administrator handling stuck booking requests must be a BPMS user; etc.


Here is what it looks like in the 3-layer PDA architecture. The PDA process is almost exactly as before. The subprocess Book order is simply an asynchronous send followed by a wait for the response, a simple long-running service call. The request message – the order – and the confirmation response message are defined by the PDA, that is, by the business, without regard to the parameters required by the ERP systems. Book order is a reusable business service in the sense that it can be used with any booking system, now or in the future.

The messy integration work is left to the SCI layer, here divided into a stateful integration process and 2 stateless ESB processes, one for sending and the other for processing the response. Here is what they do… Upon receipt of the order message from the PDA process, the SCI process first parses the order and looks up the ERP system associated with each order item. Really it just needs a count of the receivers of the ERP booking request message, so the process knows how many responses to wait for. This could be a service task or a business rule task depending on the implementation. Let’s say this receiver list is simply put in the header of the order message, which is then sent, using a send task, to the stateless ESB send process.

The ESB has the job of dealing with the details of the individual systems. First it splits the order message into separate variables for each system, that is, for each instance of the multi-instance Book in ERP system. For each system, this activity first looks up the interface of the request message, then maps the canonical order data to the system request parameters, providing any additional details required by the system interface, and then sends the ERP booking request to the system. A basic principle of the PDA approach is that the call to the external system or service is non-invasive, i.e. it must not be modified in any way in order to be integrated with the process. The integration process must accommodate its interface as-is.
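A minimal sketch of that lookup-and-split step (hypothetical types and lookup rule, not the book’s code): determine the receiving system for each order item, group the items into one booking request per system, and take the number of groups as the count of responses the stateful process must wait for:

```java
import java.util.*;
import java.util.stream.Collectors;

record OrderItem(String itemId, int quantity) {}

class BookingSplitter {
    // Hypothetical lookup: which ERP system books this item.
    static String lookupSystem(OrderItem item) {
        return item.itemId().startsWith("A") ? "ERP-A" : "ERP-B";
    }

    // Group canonical order items into one booking request per receiving system.
    static Map<String, List<OrderItem>> splitBySystem(List<OrderItem> order) {
        return order.stream().collect(Collectors.groupingBy(BookingSplitter::lookupSystem));
    }

    public static void main(String[] args) {
        List<OrderItem> order = List.of(new OrderItem("A-100", 2), new OrderItem("B-200", 1));
        Map<String, List<OrderItem>> requests = splitBySystem(order);
        int receiverCount = requests.size(); // the stateful process waits for this many responses
        System.out.println(receiverCount + " receivers: " + requests.keySet());
    }
}
```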

I have shown the ESB process using BPMN but typically ESBs have their own modeling language and tooling. That’s fine. Since it’s a stateless process by definition, the BPMN is not asking the ESB to do anything that cannot be done in its native tooling.

The ERP system sends back its response, which triggers a second stateless process, ESB Receive. We’ve marked this as a multi-participant pool, meaning N instances of it will be created for a single order. The ESB does not know the count. Each ERP booking response simply triggers a new instance. Now here is something interesting: correlation. We need to correlate the booking response to a particular booking request. In a stateful process you can save a request id and use it to match up with the response. But the ESB processes are stateless. The Send process can’t communicate its request id to the Receive process. So the Receive process must parse the order content to uniquely determine the order instance. The receive process must also look up the service interface of the ERP system sending the response and then map the response back to the format expected by the stateful SCI process, the same for all of the called systems.
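Since the stateless Receive process cannot carry a request id, both it and the stateful integration process have to derive the same key from the order content itself. A minimal sketch of that shared logic (hypothetical fields):

```java
// Both the stateful SCI process and the stateless ESB Receive process apply this same
// function to the message content, so a response finds the right process instance
// without any shared in-memory state between the two stateless ESB processes.
class OrderCorrelation {
    // Hypothetical rule: customer number plus order number uniquely identifies the instance.
    static String correlationKey(String customerNumber, String orderNumber) {
        return customerNumber + "/" + orderNumber;
    }
}
```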

Now back in the stateful integration process, the subprocess Receive booking response receives the message. Because it is stateful, this process can correlate message exchanges, so the message event is triggered only by a receiver response for this particular process instance. This booking service normally completes immediately, so if no response is received in one minute, something is wrong. Here we’ll say a system administrator resolves the problem and manually books the order in the external system. Even though this human intervention is required, it is outside the scope of the business user’s concerns, and not part of the PDA process. This multi-instance subprocess waits for a message from each receiver. Recall that we derived the count in the first step of this process. So it is quite general. It works for any number of receivers, as long as a receiver can be determined for each order item. This process doesn’t even need to know the technical details of the receiver, its endpoint, interface, or communications methods. All of that is delegated to the ESB. The stateful integration process does need to define a way to extract a unique instance id out of the original order message content, as this logic will be used by the ESB Receive process to provide correlation.

Once the ERP booking response is received, it is used to update a cross-reference table. What is that? This is a table that provides a uniform means of confirmation regardless of the physical system used to book each item. Each of those systems will provide a confirmation string in its own format. The Xref table links each system-specific confirmation string with the confirmation string for the order as a whole, in the format defined by the PDA.
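As a rough illustration only, the cross-reference table boils down to a two-level mapping like the one below; in practice it would be a database table owned by the stateful integration layer, and all names here are invented for the sketch.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ConfirmationXref {

    // Key: the PDA-level confirmation id for the order as a whole.
    // Value: the confirmation string returned by each physical ERP system.
    private final Map<String, Map<String, String>> xref =
            new ConcurrentHashMap<String, Map<String, String>>();

    // Records the system-specific confirmation for one booking response.
    public void recordSystemConfirmation(String pdaConfirmationId,
                                         String erpSystem,
                                         String systemConfirmation) {
        xref.computeIfAbsent(pdaConfirmationId,
                key -> new ConcurrentHashMap<String, String>())
            .put(erpSystem, systemConfirmation);
    }

    // Uniform lookup regardless of which physical system booked each item.
    public Map<String, String> systemConfirmations(String pdaConfirmationId) {
        return xref.get(pdaConfirmationId);
    }
}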

One final detail before we leave this diagram… the message flows. The message flows linking PDA process to the stateful SCI process, as well as those linking the stateful SCI process to the ESB process, are standard message flows as implemented by the BPMS for process-to-process communication within the product and for reading and writing message queues. The message flows between the ESB processes and the backend systems are more flexible. The transport and message format are probably determined by the external system, whether that is a web service call, file transfer, EDI or whatever the ESB can handle. This communications complexity is completely removed from the BPMS, which is the strength of the ESB approach.

There is a bit more to it, but if you are interested, I suggest you get the book.

The post Process-Driven Applications: A New Approach to Executable BPMN appeared first on Business Process Watch.

by bruce at March 18, 2015 09:20 PM

Keith Swenson: ‘Fail fast, fail often’ is essential advice for innovators

Yes, it is a negative statement, but in uttering it, you desensitize the team to a harmful fear of failure.

I am responding today to an article in The Globe and Mail titled “‘Fail fast, fail often’ may be the stupidest business mantra of all time.”  The article criticizes this saying on two points.  First, business people have a hard time saying it, and don’t come across as credible.  Second, the statement focuses on the negative which is … negative.   The author proposed an improved statement: “Succeed fast, adjust or move on.”

Recasting it like this shows that the author does not understand the point of making the statement in the first place.  Psychologists have demonstrated that people naturally have a bias against loss: in carefully designed tests, people treat a $100 loss as equivalent to a $200 gain.  That is, people are naturally, even irrationally, loss averse.  People also tend to form groups that punish failure as a way to prevent even small failures.

Saying “succeed fast” does not really give the option for failure.  Fear of failure has a powerful inhibiting effect which needs to be countered, particularly in an organization that strives to be innovative.

While success is the goal, there is one thing worse than failure, and that is doing nothing.  If you do nothing, you always lose.  On average, people will come up with good ideas, but not always.  If you fear failure, if you have a culture that punishes failure, then members will not try.  They will wait until they are sure they have a success, and only then act.  Many opportunities will be lost, because even when the risk of failure is only a fraction of the benefit of success, that risk prevents action.

When a leader says “fail fast, fail often” they make it clear that failure is a word that we can talk about.  Failure is no longer taboo.  It may be hard for them to say it; nobody said that leadership was easy.  They want success, and they don’t like talking about failure, but saying it makes clear that the culture would rather see you try and sometimes fail than not try at all.

Some say that you can only learn from your mistakes.  But you can’t learn if you don’t make any mistakes.  If people are too fearful to try, you won’t have a learning organization.

Another Silicon Valley statement is: “Don’t ask for permission, ask for forgiveness.”  This also focuses on the negative, but it is essential to the spirit of innovation to make it clear that success is not required 100% of the time, and that action is valued over inaction.

While the ‘fail fast, fail often’ statement is negative, it inoculates the group against a crippling fear of failure.  Far from the stupidest mantra of all time, it shows depth of wisdom and skill of leadership.  The writer of this article clearly does not understand the dynamics of an innovative organization.

(See “When Thinking Matters in the Workplace” chapter 4: “Agile Management” on Amazon)


by kswenson at March 18, 2015 03:33 PM

March 16, 2015

Sandy Kemsley: Effektif BPM Goes Open Source

On a call with Tom Baeyens last week, he told me about their decision to turn the engine and APIs of Effektif BPM into an open source project: not a huge surprise since he was a driver behind two...

[Content summary only, click through for full article and links]

by sandy at March 16, 2015 11:05 AM

March 13, 2015

Drools & JBPM: Reactive Incremental Graph Reasoning with Drools

Today Mario got a first working version of incremental reactive graphs in Drools. This means people no longer need to flatten their models to a purely relational representation to get reactivity. It provides a hybrid language and engine for both relational and graph-based reasoning. To differentiate between relational joins and reference traversal, a new XPath-like notation was introduced that can be used inside patterns. Like XPath, it supports collection iteration.

Here is a simple example, that finds all men in the working memory:
Man( $toy: /wife/children[age > 10]/toys )

For each man it navigates the wife reference and then the children reference; the children reference is a list. For each child in the list that is over ten, it navigates to its toys list. With the XPath notation, if the leaf property is a collection it is iterated, and the variable binds to each iteration value. If there are two children over the age of 10, who have 3 toys each, the pattern would execute 6 times.

As each reference is traversed, a hook is injected to support incremental reactivity. If a child is added or removed, or if an age changes, the incremental changes are propagated. The incremental nature means these hooks are added and removed as needed, which keeps it efficient and light.
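For readers who want to try the snippet, the pattern presupposes a plain Java model shaped roughly like the classes below. The class and property names are inferred from the expression and are purely illustrative; the unit tests linked below show any additional wiring the engine may need for full reactivity.

import java.util.ArrayList;
import java.util.List;

// Plain POJOs matching the navigation /wife/children[age > 10]/toys.
public class Man {
    private Woman wife;
    public Woman getWife() { return wife; }
    public void setWife(Woman wife) { this.wife = wife; }
}

class Woman {
    private final List<Child> children = new ArrayList<Child>();
    public List<Child> getChildren() { return children; }
}

class Child {
    private int age;
    private final List<Toy> toys = new ArrayList<Toy>();
    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
    public List<Toy> getToys() { return toys; }
}

class Toy {
    private String name;
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}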

You can follow some of the unit tests here:
https://github.com/mariofusco/drools/blob/xpath/drools-compiler/src/test/java/org/drools/compiler/xpath/XpathTest.java

It's still very early pre-alpha, but I think this is exciting stuff.

by Mark Proctor (noreply@blogger.com) at March 13, 2015 11:34 PM

March 12, 2015

Thomas Allweyer: Continuous process management is often still in short supply

Cover of the study "Reifegrad des Geschäftsprozessmanagements 2015"

Anyone who regularly deals with process management will hardly be surprised by this result: while more and more companies have introduced measures to improve their processes, they pay far less attention to controlling and further developing their process management. 216 participants from the German-speaking region, who have been involved with process management for an average of nine years, took part in the recently published study "Reifegrad des Geschäftsprozessmanagements 2015". The survey is based on a maturity model developed by iProcess that comprises five maturity levels. On average, the companies reached maturity level two. So there is still considerable development potential, because in many places the focus is still primarily on modeling and analyzing processes, not on continuously reviewing and evolving the process management itself. Companies may well achieve "quick wins" through concrete process improvements, but the far greater potential of a consistently closed process management loop remains untapped.

Interestingly, no clear correlation between company size and process management maturity was found. Small and medium-sized enterprises (SMEs) can easily keep up with much larger organizations: flat hierarchies and closer customer contact of everyone involved make it easier for SMEs to manage their processes. There were, however, clear differences between industries. Maturity is particularly high in real estate and retail; the transport sector and banks are also quite well positioned. The study's authors attribute this to the fact that these industries have very personnel- and knowledge-intensive processes and face strong competitive pressure. It also turned out that companies with many distributed branches have a higher process maturity, since process standardization is especially important for them.


Minonne, C.; Koch, A.; Ginsburg, V.:
Reifegrad des Geschäftsprozessmanagements 2015. Eine empirische Untersuchung.
iProcess AG (Ltd.) Luzern 2015
Reading sample and ordering information

by Thomas Allweyer at March 12, 2015 01:01 PM

March 11, 2015

Sandy Kemsley: KofaxTransform 2015 In Pictures

As I prepared to depart Las Vegas, I flicked through some of my photos from the past couple of days and decided to share. First, the great work of the ImageThink team of graphic recorders:   ...

[Content summary only, click through for full article and links]

by sandy at March 11, 2015 07:30 PM

March 10, 2015

Sandy Kemsley: Analytics For Kofax TotalAgility With @Altosoft

Last session here at Kofax Transform, and as much I’d like to be sitting around the pool, I also like to squeeze every bit out of these events, and support the speakers who get this most...

[Content summary only, click through for full article and links]

by sandy at March 10, 2015 11:15 PM

Sandy Kemsley: Smarter Processes With Kapow Integration

I’m in a Kofax Transform breakout session on Kapow Integration together with KTA; I missed documenting the first part of the session when my Bluetooth keyboard stopped talking to my Android...

[Content summary only, click through for full article and links]

by sandy at March 10, 2015 09:43 PM

Sandy Kemsley: Process Intelligence at KofaxTransform

It’s after lunch on the second (last) day of Kofax Transform, and the bar for keeping my attention in a session has gone up somewhat. To that end, I’m in a session with Scott Opitz and...

[Content summary only, click through for full article and links]

by sandy at March 10, 2015 08:47 PM

Sandy Kemsley: Kofax Claims Agility SPA

Continuing with breakout sessions at Kofax Transform is a presentation on the Claims Agility smart process application that Kofax is creating for US healthcare claims processing, based on the KTA...

[Content summary only, click through for full article and links]

by sandy at March 10, 2015 06:45 PM

Sandy Kemsley: TotalAgility Product Update At KofaxTransform

In a breakout session at Kofax Transform, Dermot McCauley gave us an update on the TotalAgility product vision and strategy. He described five vital communities impacted by their product innovation:...

[Content summary only, click through for full article and links]

by sandy at March 10, 2015 06:17 PM

Sandy Kemsley: KofaxTransform 2015: Day 2 Customer Keynotes

I had a chance to hear Tom Knapp from Waterstone Mortgage speak yesterday at the analyst briefing here at Kofax Transform, and we have him to kick off this morning’s keynote. They started their...

[Content summary only, click through for full article and links]

by sandy at March 10, 2015 04:21 PM

Drools & JBPM: UF Dashbuilder - Activity monitoring in jBPM

syndicated from http://dashbuilder.blogspot.com.es/2015/03/uf-dashbuilder-in-jbpm-for-activity.html

Last week, the jBPM team announced the 6.2.0.Final release (announcement here). In this release (like in previous ones) you can author processes, rules, data models, forms and all the assets of a BPM project. You can also create or clone existing projects from remote GIT repositories and group such repositories into different organizational units. Everything can be done from the jBPM authoring console (aka KIE Workbench), a unified UI built using the Uberfire framework & GWT.

   In this latest release, they have also added a new perspective to monitor the activity of the source GIT repositories and organizational units managed by the tooling (see screenshot below). The perspective itself is just a dashboard displaying several indicators about the commit activity. From the dashboard controls it is possible to:

  • Show the overall activity on our repositories
  • Select a single organizational unit or repository
  • List the top contributors
  • Show only the activity for a specific time frame

  In this video you can see the dashboard in action (do not forget to select HD).

Contributors Perspective

  Organizational units can be managed from the menu Authoring>Administration>Organizational Units. Every time an organizational unit is added or removed the dashboard is updated.

Administration - Organizational Units 

   Likewise, from the Authoring>Administration>Repositories view we can create, clone or delete repositories. The dashboard will always feed from the list of repositories available.

Administration - Repositories



   As shown, activity monitoring in jBPM can be applied not only to the process business domain but also to the authoring lifecycle, in order to get a detailed view of the ongoing development activities.

How it's made


The following diagram shows the overall design of the dashboard architecture. Components in grey are platform components, blue ones are specific to the contributors dashboard.

Contributors dashboard architecture

  These are the steps the backend components take to build the contributors data set:

  • The ContributorsManager asks the platform services for the set of available org. units & repos. 
  • Once it has such information, it builds a data set containing the commit activity.
  • The contributors dataset is registered into the Dashbuilder's DataSetManager.

   All the steps above are executed at application start-up time. Once running, the ContributorsManager also receives notifications from the platform services about any changes to the org. units & repositories registered, so that the contributors data set is synced up accordingly.
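Purely as a conceptual outline (these are hypothetical interfaces, not the actual Uberfire or Dashbuilder APIs), the build-and-register flow described above could be sketched in Java like this:

import java.util.ArrayList;
import java.util.List;

// Hypothetical interfaces sketching the responsibilities described above.
interface RepositoryService {
    List<String> listRepositories(String organizationalUnit);
}

interface CommitHistoryService {
    List<String[]> commits(String repository); // e.g. [author, date, message]
}

interface DataSetRegistry {
    void register(String dataSetName, List<String[]> rows);
}

public class ContributorsManagerSketch {

    private final RepositoryService repositories;
    private final CommitHistoryService history;
    private final DataSetRegistry registry;

    public ContributorsManagerSketch(RepositoryService repositories,
                                     CommitHistoryService history,
                                     DataSetRegistry registry) {
        this.repositories = repositories;
        this.history = history;
        this.registry = registry;
    }

    // Called at start-up and again whenever the platform reports that
    // org. units or repositories have changed, to keep the data set in sync.
    public void refresh(String organizationalUnit) {
        List<String[]> rows = new ArrayList<String[]>();
        for (String repo : repositories.listRepositories(organizationalUnit)) {
            rows.addAll(history.commits(repo));
        }
        registry.register("gitContributors", rows);
    }
}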


   From the UI perspective, jBPM's contributors dashboard is an example of a hard-coded dashboard built using the Dashbuilder Displayer API, which was introduced in this blog entry. The ContributorsDashboard component is just a GWT composite widget containing several Displayer instances feeding from the contributors data set.

   (The source code of the contributors perspective can be found here)

    This has been a good example of how to leverage the Dashbuilder technology to build activity monitoring dashboards. In the future, we plan to apply the technology in other areas within jBPM, for instance an improved version of the jBPM process dashboard. We will keep you posted!

by Mark Proctor (noreply@blogger.com) at March 10, 2015 03:24 PM

March 09, 2015

Drools & JBPM: Zooming and Panning between Multiple Huge Interconnected Decision Tables

Michael has started the work on revamping our web-based decision tables. We've been experimenting with HTML5 canvas with great results, using the excellent Lienzo tool. First we needed to ensure we could scale to really large decision tables, with thousands of rows. Secondly we wanted to be able to pan and zoom between related or interconnected decision tables. We'll be working towards Decision Model and Notation support, which allows networked diagrams of decision tables.

You can watch the video here, don't forget to select HD:
https://www.youtube.com/watch?v=WgZTdfLis0Q

Notice in the video that while you can manually pan and zoom, there are also links between tables. When you select a link, it animates the pan and zoom to the linked location. From 25s to 47s it shows that we can have a really large number of rows and keep excellent performance, while 55s shows the pan speed with these large tables. Initially the example starts with 50% of cells populated; at 1m we change that to 100% populated and demonstrate that we still have excellent performance.




by Mark Proctor (noreply@blogger.com) at March 09, 2015 11:55 PM

Sandy Kemsley: Kofax Altosoft For Operational Intelligence

Wayne Chambliss and Rich Rabin of Kofax Altosoft gave a presentation at Kofax Transform, most of which was a demo, on becoming an operational intelligence guru. This is my first real look at the...

[Content summary only, click through for full article and links]

by sandy at March 09, 2015 09:48 PM

Sandy Kemsley: Tablets And Digital Signatures At AIA Life

Just to maximize confusion, we have a second AIA at the Kofax Transform conference: this morning, Aia referred to the customer communications management company recently acquired by Kofax; this...

[Content summary only, click through for full article and links]

by sandy at March 09, 2015 08:44 PM

Sandy Kemsley: Kofax Analyst Briefing And Portfolio Update

Following the Kofax Transform day 1 keynotes, we had a separate session for financial and industry analysts to be briefed on the products and financials. After a brief introduction from Reynolds...

[Content summary only, click through for full article and links]

by sandy at March 09, 2015 08:03 PM

BPinPM.net: Insights from German flagship conference Wirtschaftsinformatik 2015

Last week, the BPinPM.net team visited the Wirtschaftsinformatik 2015 conference in Osnabrück. The topic of this year's conference was Smart Enterprise Engineering.

Over three days, several research and business tracks gave visitors insights into emerging trends in information systems. Furthermore, different companies (e.g., Thyssen Krupp or SAP) presented their next steps for achieving digitalized business.

For example, Thyssen Krupp wants to use Big Data and digitalization to revolutionize their elevator business. In the future, elevators will travel not only vertically but also horizontally. This video was shown in the keynote. We found it very impressive, and it fits very well with our scheduled innovation workshop “BPM meets the Innovation Helix“, so we want to share it with you:

https://www.youtube.com/watch?v=KUa8M0H9J5

Besides the business tracks, the German research community presented their recent work. The presented papers dealt with Business Process Management, Information Systems Usage, and Social Media and Collective Intelligence.

We are proud that one of our team members also presented her work at the conference. Janina Kettenbohrer talked about the impact of employees’ attitude toward their job on the acceptance of business process standardization. She and her two colleagues, Dr. Andreas Eckhardt and Prof. Dr. Daniel Beimborn, developed a theoretical model which explains how job-related attributes (e.g., autonomy or skill variety), work-role fit, co-worker relations, and the wider process environment influence employees’ perception of meaningfulness of work and, consequently, acceptance of process standardization. If you are interested in Janina’s latest work, you can find her paper here:

http://www.wi2015.uni-osnabrueck.de/Files//WI2015-D-14-00270.pdf

If you’re interested in testing the model in your organization and in finding out how to successfully implement process standards, please contact Janina.

by Mirko Kloppenburg at March 09, 2015 08:02 PM

Sandy Kemsley: Kicking off KofaxTransform 2015: Day 1 Keynotes

I’m in Vegas for a couple of days for the Kofax Transform conference. Kofax has built their business beyond their original scanning and capture capabilities (although many customers still use...

[Content summary only, click through for full article and links]

by sandy at March 09, 2015 04:38 PM

Drools & JBPM: jBPM 6.2.0.Final released

The bits for the jBPM 6.2 release are now available for you to download and try out!

Version 6.2 comes with a few new features and a lot of bug fixes!  New features include, among others, EJB support, (improved) OSGi and Camel endpoint support, a new asset management feature (introducing a development and a release branch and promoting assets between them), social profiles and feeds, and the ability to extend the workbench with your own plugins!

More details below, but if you want to jump right in:

Downloads
Documentation
Release Notes

Ready to give it a try but not sure how to start?  Take a look at the jbpm-installer chapter.

jBPM 6.2 is released alongside Drools (for business rules) and Optaplanner (for planning and constraint solving); check out the new features in the Drools release blog, including a brand new rules execution server, and the Optaplanner release blog as well.

A big thank you to everyone who contributed to this release!

Some highlights from the release notes.

Core services

  • EJB: the jBPM execution server (which is, for example, embedded in our web-based workbench) now also comes with an EJB interface.  A refactoring of the underlying jbpm-services now makes the execution services accessible using pure Java, CDI, EJB and Spring (a minimal plain-Java embedding sketch follows this list). Remote interfaces using REST and JMS are still available as well, of course!  A lot more details are described in Maciej's blog here.
  • Deployments (defining which versions of which projects are currently active in the execution server) are now stored in the database by default.  This greatly simplifies the architecture in a clustered environment in case you are only using the runtime side of our web tooling (for example by having dedicated execution servers in production).
  • Our asynchronous job executor has improved support for requeuing failed jobs and for recurring jobs (e.g. daily tasks).
  • OSGi: Full core engine functionality is now available on top of OSGi.  A significant number of additional jars (including for example the human task service, the runtime managers, full persistence, etc.) were "OSGi-fied". Specific extensions and tests showing it in action are available for Apache Karaf and Aries Blueprint (in the droolsjbpm-integration repository).
  • Camel endpoint URIs: A new out-of-the-box service task has been implemented for using Apache Camel to connect a process to the outside world using some of the numerous Camel endpoint URIs. The service task allows you, for example, to specify how to pass data to an FTP endpoint by configuring properties such as hostname, port, username, payload, etc. for some common endpoints like (S)FTP, File, JMS, XSLT, etc., but you can use virtually any of the available endpoints by defining the URI yourself (http://camel.apache.org/uris.html).
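For readers who have only used the remote interfaces, here is a minimal sketch of embedding the engine from plain Java using the RuntimeManager API available since jBPM 6.0 (the new EJB/CDI/Spring services layer wraps the same execution services); the BPMN resource name and process id are placeholders.

import org.kie.api.io.ResourceType;
import org.kie.api.runtime.KieSession;
import org.kie.api.runtime.manager.RuntimeEngine;
import org.kie.api.runtime.manager.RuntimeEnvironment;
import org.kie.api.runtime.manager.RuntimeEnvironmentBuilder;
import org.kie.api.runtime.manager.RuntimeManager;
import org.kie.api.runtime.manager.RuntimeManagerFactory;
import org.kie.internal.io.ResourceFactory;
import org.kie.internal.runtime.manager.context.EmptyContext;

public class PlainJavaExample {
    public static void main(String[] args) {
        // Build an in-memory runtime environment from a BPMN2 resource on the
        // classpath (file name and process id are placeholders).
        RuntimeEnvironment environment = RuntimeEnvironmentBuilder.Factory.get()
                .newDefaultInMemoryBuilder()
                .addAsset(ResourceFactory.newClassPathResource("order.bpmn2"), ResourceType.BPMN2)
                .get();

        RuntimeManager manager =
                RuntimeManagerFactory.Factory.get().newSingletonRuntimeManager(environment);
        RuntimeEngine engine = manager.getRuntimeEngine(EmptyContext.get());
        KieSession ksession = engine.getKieSession();

        ksession.startProcess("com.sample.order");

        manager.disposeRuntimeEngine(engine);
        manager.close();
    }
}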

Workbench
  • Form Modeler comes with improved support for adding custom logic to your forms using JavaScript on changes, and support for configurable ComboBox and RadioGroup fields, and simple List types.
  • Asset management: It is now possible to make a repository a "managed repository".  This allows you to split up a repository into multiple branches, one for doing development and one for releasing.  Users can then request various assets to be promoted to the release branch when ready.  This promotion process, and the linked build and deploy processes, are defined using a BPMN2 process as well and include approval and build tasks.  Check the documentation for more details.

  • Social features, like user profiles (including gravatar pictures), and various event feeds like the most recent assets you worked on or recent changes by other users.


  • Contributors perspective is a new out-of-the-box report (using the new dashbuilder technology) that gives high-level insight into who is changing what in your repositories.
  • Pluggable workbench: you can now extend the workbench with your own views, menus, etc. using workbench plugins. Available features include creating perspectives via a programmatic or a drag-and-drop interface, and creating new screens, editors, splash screens and dynamic menus.

by Kris Verlaenen (noreply@blogger.com) at March 09, 2015 02:39 PM

Sandy Kemsley: Software AG Analyst Day: The Enterprise Gets Digital

After the DST Advance conference in Phoenix two weeks ago, I headed north for a few days vacation at the Grand Canyon. Yes, there was snow, but it was lovely: Back at work, I spent a day last week in...

[Content summary only, click through for full article and links]

by sandy at March 09, 2015 12:42 PM

March 06, 2015

Drools & JBPM: Drools 6.2.0.Final Released

We are happy to announce the latest and greatest Drools 6.2.0.Final release.

This release in particular had a greater focus on improved usability and features that make the project easier to use (and adopt). Lots of improvements on the workbench UI, support for social activities and plugin management, as well as a brand new Execution Server for rules are among the new features.

Improved Wizards

Execution Server Management UI

Social activities


Contributors dashboard


Perspective editors


Here are a few links of interest:

We would like to use the opportunity to thank all the community members for their contributions to this release and also JetBrains and Syncro Soft for the open source licenses to their products that greatly help our developers!

Happy drooling!


   



by Edson Tirelli (noreply@blogger.com) at March 06, 2015 03:54 PM

March 05, 2015

Thomas Allweyer: FireStart can do both: end-to-end business modeling and process execution

FireStart Outlook integration

Usually, different systems are used for business-level process modeling and for process execution. Some BPMS vendors do offer value chain diagrams and the like, but their capabilities for business-oriented process documentation and analysis usually lag far behind dedicated process modeling tools. The FireStart BPM Suite from Prologics is a positive exception. The platform enables collaborative modeling in a user-friendly graphical modeling environment with the familiar look and feel of Office products. The models are stored in a central repository. Publishing models to a process portal and generating process handbooks are supported, as are version management and audit-proof storage of the models. The role-based, customizable process portal has a modern HTML5 interface. The powerful search and other functions are invoked via overlay menus that appear on demand, as known from Google Maps, for example.

Besides processes, you can also model process landscapes, organization charts, data models, IT landscapes, and risks, and easily link them to the activities in the process models. This enables fully integrated enterprise modeling. Particularly in connection with BPMN models this is not a given; even prominent modeling platforms often show weaknesses here, as this study shows. The highlight: when displaying process models, you can switch between BPMN and EPC notation at any time and continue modeling in the other notation. In the EPC view, assigned organizational units, IT systems, and the like are shown as separate objects connected to the respective activities by arrows. In the BPMN view, small icons in the activity symbols indicate which other object types are linked. This ability to switch between BPMN and EPC should especially increase acceptance in business departments, which in many places are used to the EPC notation.

FireStart cycle time analysis in a Gantt chart

International companies will appreciate the integrated translation function, which can translate the models into a large number of languages without further effort. Even if automated translation may not always be perfect, it greatly eases understanding of the models in different country subsidiaries.

Special process views are available for process analysis. For example, the timeline of a process can be displayed as a Gantt chart. If you change the durations of individual process steps, the effect on the overall cycle time becomes immediately visible. Since FireStart, unlike pure modeling tools, can also execute processes, such a cycle time analysis is possible not only on the basis of estimated values but also on the basis of real data from executed process instances. In a matrix view, process costs can be analyzed, and notes on weak points and improvement suggestions can be attached to individual activities.

The integrated modeling of processes, organization charts, data, and so on is not limited to a business-level view. It is also used in the transition to process execution. Business data objects, for example, are enriched with technical details so that they can hold concrete data during process execution. Concrete users are assigned to the organizational units from the organization chart, and modeled IT systems are backed with interface and invocation information. The execution-related configuration of the models is made easier for the modeler in many places. If, for instance, you drag a data object onto a user task, the corresponding data fields become directly available in the form of that task.

FireStart process cost analysis

The integration of business-level process modeling in the same tool also pays off during process execution. The portal used by the people working on a process is the same one used to publish the process models, so you can look up the processes at any time while executing them. In addition, running process instances can be tracked in the process model. The portal is responsive, so tasks can also be handled comfortably on tablets and smartphones, including gesture control. Integration with Microsoft SharePoint and Outlook is also offered, so employees can receive their tasks in their familiar mail inbox and complete the associated forms entirely in Outlook, without having to switch to the separate portal. In general, FireStart plays to its strengths in integration with Microsoft products, but connectors to SAP and other systems are also available, and of course various standards such as web services are supported as well.

In the latest BPM study by Fraunhofer IESE, FireStart finished in the top group. The system scored above average in most of the examined categories. That it came out ahead of all other BPM systems particularly in the "process modeling" category is not surprising, given the range of functions of the modeling component.

by Thomas Allweyer at March 05, 2015 08:39 AM

March 02, 2015

BPinPM.net: Invitation to “BPM meets the Innovation Helix” Workshop

„Quo Vadis, BPM?“ – This was the title of the keynote speech given by Dr. Bernhard Krusche at our recent BPinPM.net Process Management Conference, and most of the conference participants agreed that the challenges of the digital transformation of organizations will also challenge BPM.

Dr. Krusche’s idea to combine tools of successful innovation processes with structured BPM started a discussion on what this fusion of classical BPM and new innovation methodologies could look like.

Thus, we decided to set up a workshop to explore the “Innovation Helix” which was invented by Dr. Krusche and Prof. Sonja Zillner and match it with the BPM Life Cycle.

To learn more about this innovation workshop, please check the event details…

by Mirko Kloppenburg at March 02, 2015 09:45 PM

February 27, 2015

Thomas Allweyer: IT strategy study launched

Last year, a study by Scheer Management came to the sobering conclusion that many companies do not manage to actually implement their corporate strategies. At the operational level, usually hardly anything of what had been worked out as strategy at the executive level arrived. Now the Saarbrücken-based consultancy has launched a new survey, this time on IT strategies. It examines which elements IT strategies contain in practice and how they are communicated and implemented.

The study addresses all management levels and industries. Answering the questions takes about 15 minutes. Participation is possible via this link.

by Thomas Allweyer at February 27, 2015 07:50 AM

February 26, 2015

Bruce Silver: BPM at IBM InterConnect

It would be unfair to say there was absolutely nothing on BPM at IBM’s InterConnect conference, which took place this week in Las Vegas… but it would not be far from the truth. InterConnect is the supposed successor to IBM’s annual Impact middleware event where “Smarter Process” – IBM’s term for BPM and decision management – has always played a large role. I say “supposed” because the new mega-event, triple the size of Impact and split between 2 hotels a mile apart, was such a logistical debacle that I seriously doubt they will try it this way again.

InterConnect is officially about Cloud, Mobile, DevOps, and Security. Middleware is sort of there but well below the fold. The overarching theme this year was “hybrid cloud” – new apps combining services in public and private clouds, even behind the firewall – based on IBM’s new strategic platform-as-a-service called Bluemix. Even though Bluemix is new in the past year, they never really explained what it is. Here is what it says on the website:

“IBM Bluemix is the cloud platform that helps developers rapidly build, manage and run web and mobile applications. Based on the open source architecture of Cloud Foundry, Bluemix provides the flexibility to integrate development frameworks, languages and services that suit your needs. Develop applications using Web IDE and Eclipse – while storing your code directly on Bluemix or GitHub. Bluemix is based on Cloud Foundry, an open source project, and features additional runtimes and services from the open source community. This makes Bluemix a great place to build and run applications that leverage technology and innovation from the open source developer community. Bluemix not only offers developers a broad range of IBM, third-party and open source APIs and services, but it integrates with many of the developer tools you already use today. By abstracting lower-level infrastructure components, Bluemix enables you to spend more of your time and talent writing the code that will differentiate your app and drive user adoption and engagement. Build apps and services for free in the first 30 days. Enjoy the free tier even after the trial ends, and pay only for what you use. No credit card is required to get started.”

If this is truly the new strategic direction, it’s clearly not what we normally think of as IBM. The whole event affected that open source/hacker/developer-centric tone, and honestly, it was kind of interesting and refreshing. The problem for me was that BPM does not seem to play in this brave new world.

The main tent demos all reprised the old “systems of engagement” theme, in which the app uses some combination of mobile, social, and analytics technology to lure unsuspecting mall dwellers into buying something. Only now it’s on Bluemix!

In the new IBM, it’s all about customer-facing apps on phones, not cross-functional business processes. It’s about writing code, not model-driven development. This revolution, they tell us, will be hacker-led, not business-empowering. All those old BPM values and principles, apparently, are yesterday’s news.

But it seems to me that BPM – the technology, if not the IBM product – could have a valuable role to play here. These new engagement apps, for example, depend heavily on events, decisions, and analytics, all technologies central to IBM’s Smarter Process portfolio, but not really integrated with the BPM product. IBM Decision Server Insights, based on ODM, for example, introduces “rich time modeling, reasoning and analytics to detect and respond to intricate patterns and trends; innovative global analytics to extract valuable insights over populations of business entities in real-time; and generalized, business-friendly modeling over all aspects of the decision model design.” Yes, this is exactly in tune with the new direction, but it is not integrated with BPM. If you ask why (and I did), the answer is always “our customers are not asking for it.”

And how does BPM fit into Bluemix? It doesn’t (yet), but Bluemix does include a Workflow service:

“Workflow for Bluemix makes it easy for you to create workflows that orchestrate and coordinate the REST-based services that you use in your apps. The JavaScript based Workflow language lets you define interactions between any services. By off-loading all the service interactions to the Workflow service, your application becomes easier to understand, maintain and evolve. Your workflows are run and managed in a robust and scalable way, regardless of whether your workflow and services run for milliseconds or days.”

Hmmm… To me this sounds like an updated version of the Windows Workflow Foundation, a set of programmer components that Microsoft put into the .Net Framework several years back. Embedding workflow in the OS!  An obvious win, right?  It might be workflow automation but it’s not BPM as we know it.

So what should a renovated IBM BPM on Bluemix, consistent with the new strategy, look like? If they asked me (which they have not), it would include the following:

  • Model-driven, instant playback, business-empowered process design… all those old Lombardi values that rescued IBM BPM in the first place!
  • Event-aware continuous query engine, like Decision Server Insights, but able both to trigger BPMN events and to receive and process events generated by the process engine and BAM.  And business-friendly modeling tools to go with it.
  • An enhanced Coach Designer that makes performer-facing tasks look just as engaging, powerful, and mobile-enabled as the customer-facing apps IBM is showing today in the main tent.
  • Complete unification of case management with structured BPM.  None of the current “basic” case management baloney.

Before closing, I have to say that I did see one BPM thing at InterConnect that was new and interesting, an executable version of Blueworks Live.  Blueworks Live has long had a simple automated workflow capability, but it was totally disconnected from the BPMN modeling piece.  The new version actually executes the BPMN model, using activity inputs and outputs – properties currently provided by the tool – to autogenerate task forms and to serve as process variables.  The BPMN activities today are simple human tasks, but I believe some kind of service invocation scenarios are planned.  You can step through the process in “test mode,” just like the Playback feature in IBM BPM. It is really pretty cool.   I’m not sure when this new version will be available or what it costs, but I suspect it could take some business away from BPM Standard (the one without Process Server underneath).  It’s definitely not a toy.

 

The post BPM at IBM InterConnect appeared first on Business Process Watch.

by bruce at February 26, 2015 11:11 PM

February 25, 2015

Sandy Kemsley: Capital Raising Through Crowdfunding

Nicholas Doyle of DST gave a presentation on crowdfunding: an interesting topic to cover at a conference attended primarily by old-school financial services companies, who are the ones most likely to...

[Content summary only, click through for full article and links]

by sandy at February 25, 2015 10:10 PM

Sandy Kemsley: AXA And The Digital Enterprise

Day 2 at DST ADVANCE 2015, and I’m attending a panel of three people from AXA on their journey to becoming a digital insurance business. They define digital business as new ways of engaging...

[Content summary only, click through for full article and links]

by sandy at February 25, 2015 07:16 PM

February 24, 2015

Sandy Kemsley: Innovations In AWD User Experience

To finish off the first morning at DST ADVANCE 2015, I attended the session on customer and work experience, which was presented as a case study of background investigations on a security-sensitive...

[Content summary only, click through for full article and links]

by sandy at February 24, 2015 06:43 PM

Sandy Kemsley: AWD 2015 Product Strategy

Roy Brackett and Mike Lovell from DST’s BPS (Business Process Solutions) product management gave us a review of what happened in 2014 and an update on their 2015 product strategy, following on...

[Content summary only, click through for full article and links]

by sandy at February 24, 2015 05:46 PM

Sandy Kemsley: Kicking Off #DSTAdvance15 – DST Update From @JCV816

Conference season always brings some decisions and conflicts, and this year’s first one (for me) came down to a decision between DST‘s ADVANCE in Phoenix, and IBM InterConnect in Las...

[Content summary only, click through for full article and links]

by sandy at February 24, 2015 04:47 PM

Thomas Allweyer: New edition of Basiswissen Geschäftsprozessmanagement – with BPMN 2.0

Cover of Basiswissen Geschäftsprozessmanagement, 2nd edition

The book serves as preparation for the OMG's "Certified Expert in Business Process Management" exam. When the first edition appeared five years ago, version 2.0 of the BPMN process modeling notation was still in progress, which is why the certificate was then still based on BPMN 1.2. The certification program has since been updated, which also made a new edition of the book necessary. The most important change is therefore the coverage of BPMN version 2.0.

The remaining content has hardly changed compared to the first edition. Only minor additions were made, e.g. on BPMS and executable process models. If you use the book as a reference or as a compact introduction to the OMG's view of BPM, you can keep using the first edition. If you are preparing for the certification, you should go for the current edition. A review of the first edition can be found here.


Weilkiens, T.; Weiss, C.; Grass, A.; Duggen K.:
Basiswissen Geschäftsprozessmanagement: Aus- und Weiterbildung zum OMG Certified Expert in Business Process Management 2 (OCEB 2) – Fundamental Level. 2nd edition
dpunkt, Heidelberg 2015
The book on amazon

by Thomas Allweyer at February 24, 2015 09:21 AM

February 20, 2015

Thomas Allweyer: BPM news from Switzerland

A recently published special issue of the Handelszeitung looks at the state of process management in Switzerland. Given the challenges posed by the strongly appreciated Swiss franc, efficient processes are likely to become even more important for many companies. A study by the Zurich University of Applied Sciences (ZHAW) notes that awareness of BPM has increased considerably in recent years, with process automation also gaining in importance. And: small and medium-sized enterprises by no means need to hide behind large corporations when it comes to introducing process management. In an interview, Karlheinz Baumann, COO of the watchmaker IWC, names the transparency of business processes and the increased speed of adaptation in transformation processes as the most important added value. A detailed article deals with the situation of energy providers, where numerous positive examples can be found, particularly in the mapping of customer-related processes. Nevertheless, among the country's 680 electricity providers there is still a need to catch up in many places.

In his article, Markus Fischer of Axon Ivy emphasizes the role that intelligent business process management systems (iBPMS) play in realizing new, digitalized business models. To that end, business rules and services from a wide variety of data sources must be integrated into complex or repetitive processes, in real time. Big data analyses, cloud applications, and social media, for example, can also be included. He illustrates this with the example of an online shop in which new payment methods were integrated very flexibly. A real-time credit check was integrated via the BPMS, which significantly reduced the risk involved in payment on invoice.

Further articles deal, among other things, with training and certification in the field of process management, as well as with a study on process maturity in companies in the German-speaking region.

The BPM special of the Handelszeitung is available at this link. On March 5, the Swiss BPM Forum takes place in Zurich. This year it is held under the motto "Business Process Innovations – Die Treiber der Digitalen (R)Evolution".

by Thomas Allweyer at February 20, 2015 11:11 AM

February 19, 2015

Drools & JBPM: Submit your work to the 9th International Web Rule Symposium (RuleML 2015)

Professor at Freie Universitaet Berlin
Dear CEP Colleagues,

I would like to encourage you to submit your research in the field of rules, rule-based reasoning and its applications to RuleML 2015 (http://2015.ruleml.org).

It is an excellent opportunity for a high impact conference publication (http://en.wikipedia.org/wiki/RuleML_Symposium; e.g., RuleML is in the top 100 venues for impact factor in CiteSeerX http://citeseerx.ist.psu.edu/stats/venues).

Also note, that there are several additional collocated events, e.g., the Recommender Systems for the Web of Data Challenge (http://2015.ruleml.org/recsysrules-2015.html), the Doctoral Consortium (http://2015.ruleml.org/DoctoralConsortium.html), the 9th International Rules Challenge with competitive prizes for the best rule base, the Reasoning Web Summer School, RR 2015 and the Workshop on Formal Ontologies meet Industry (http://www.csw.inf.fu-berlin.de/fomi2015/), as well as the 25th CADE 2015.

If you are doing your PhD in this field I would like to point you to the Reasoning Web Summer School (http://www.csw.inf.fu-berlin.de/rw2015/) and the joint RuleML/RR Doctoral Consortium ((http://2015.ruleml.org/DoctoralConsortium.html), where you can submit your PhD paper. Accepted RuleML PhD papers and demo papers will be published in the Challenge proceedings which are listed in DBLP (http://dblp1.uni-trier.de/db/conf/ruleml/) and fully indexed e.g. in Scopus.
Also the workshop on Formal Ontologies meet Industry (FOMI 2015) might be relevant for you if you are working on Ontologies (http://www.csw.inf.fu-berlin.de/fomi2015/).

And, further interesting things will happen at RuleML 2015, such as an ISO Common Logic, OMG API4KB and OASIS LegalRuleML face-to-face meeting. We will also have a Berlin Semantic Web Meetup during RuleML on August 4th. Details will follow soon on the Meetup website: http://www.meetup.com/The-Berlin-Semantic-Web-Meetup-Group/.

Thank you and hope to see you in Berlin, Germany in August,

Adrian
(General Chair RuleML 2015)

Prof. Dr. Adrian Paschke
AG Corporate Semantic Web
Freie Universitaet Berlin
Germany
http://www.mi.fu-berlin.de/en/inf/groups/ag-csw/


by Mark Proctor (noreply@blogger.com) at February 19, 2015 12:49 PM

February 17, 2015

BPM-Guide.de: Community-Driven Product Management

We are currently discussing if and how we should support the new Decision Model and Notation (DMN) Standard by OMG.

It’s basically about business rules, and of course our customers often ask us for business rules support. We typically recommend combining Camunda with a rule engine such as JBoss Drools. That works very well; there are numerous examples, blueprints and tutorials available, as well as project experience. However, in 95% of the real-world projects we have seen, you don’t really need the features that Drools or the other leading rule engines provide. It’s mostly just about exposing business rules …
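As a quick illustration of that combination (a rough sketch, not part of the post: the Order fact class, the variable names, and the assumption of a classpath kmodule with a default session are all mine), a Camunda service task delegate could hand a process variable to a Drools session like this:

import org.camunda.bpm.engine.delegate.DelegateExecution;
import org.camunda.bpm.engine.delegate.JavaDelegate;
import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

public class DiscountRulesDelegate implements JavaDelegate {

    // Hypothetical fact class the rules work on.
    public static class Order {
        private double discount;
        public double getDiscount() { return discount; }
        public void setDiscount(double discount) { this.discount = discount; }
    }

    private final KieContainer kieContainer =
            KieServices.Factory.get().getKieClasspathContainer();

    @Override
    public void execute(DelegateExecution execution) {
        // Assumes a kmodule.xml with a default session on the classpath.
        KieSession session = kieContainer.newKieSession();
        try {
            Order order = (Order) execution.getVariable("order");
            session.insert(order);
            session.fireAllRules();
            // The rules update the fact; hand the result back to the process.
            execution.setVariable("discount", order.getDiscount());
        } finally {
            session.dispose();
        }
    }
}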

by Jakob Freund at February 17, 2015 02:14 PM

Thomas Allweyer: smartfacts: a cross-tool platform for models

smartfacts screenshot

Especially in large companies, it is probably the rule rather than the exception that several different modeling tools are in use. The resulting model worlds are isolated from one another. It is hardly possible to find out which models exist in the company, let alone, for example, to find all models in which the business object "customer order" is used. The product "smartfacts" from the Nuremberg modeling specialist MID promises a remedy. The platform provides a unified view of models of different origins.

The features offered include cross-model search, model versioning, and the ability to link arbitrary models with each other. No matter which tool the various models were created with, they are displayed in a uniform way in the browser. You can zoom any diagram in and out and print it directly from the platform. For tablets and smartphones the display adapts accordingly. The symbols in a diagram can be clicked to show the attributes of the respective object. So smartfacts does not just load a graphic on import; it also takes over the structure of the model.

smartfacts also serves as a collaboration platform. You can, for example, comment on models, hold discussions, and make decisions. A permission concept controls which users have access to which models.

Currently, models can be imported from ARIS, Visio, Enterprise Architect, and MID's own modeling tool innovator. In addition, BPMN models can be uploaded that are available in the BPMN standard interchange format and can therefore come from any tool that offers such an export. From the explicitly supported tools, on the other hand, not only process models can be imported but also arbitrary other model types, such as data models or EPCs. To transfer models from Visio, Enterprise Architect, or innovator to smartfacts, a plugin must be installed in the respective tool. As an example, the Visio plugin was tested and worked without problems. Importing a BPMN model in the standard format also worked without difficulty. An export from ARIS Architect (version 9.7), however, could not be imported in the test [Update: this seems to be because the ARIS files are encrypted, see the comment by Mr. Puschaddel].

smartfacts is a real innovation and can be a useful aid for companies that have several modeling tools in use and do not want to change that. The platform has the advantage that existing models do not have to be discarded or manually transferred to a new tool, but features such as collaboration or versioning do compete with comparable features of conventional modeling suites, and you have to think carefully about which tasks you want to do in which tool. Despite taking over the model structures on import, smartfacts cannot integrate the various models as seamlessly as is possible when they are created directly in a single tool.

In addition, adding version numbers, descriptions, and links between models after the fact means extra effort, and it takes some discipline to keep changes in the individual tools and in smartfacts consistent. If these aspects are neglected, the new platform quickly turns into a huge, hard-to-survey collection of models of all kinds. They can be searched, but what do you do with the models you find if the available information does not make clear, for example, which context they come from and whether they are currently valid?

To tap the potential of smartfacts, you therefore have to think quite carefully about model governance beforehand. If a multitude of different tools is in use in a company, the processes for creating, reviewing, and publishing models have usually not been handled uniformly. The resulting heterogeneity of the model landscape will not be brought under control simply by introducing an additional software platform. But if clean processes are established around modeling, smartfacts can certainly be a very useful aid.

A 30-day trial can be activated on the smartfacts website.

by Thomas Allweyer at February 17, 2015 09:24 AM

February 14, 2015

BPM-Guide.de: BPMN Online Training coming up – get your free pass

I just looked it up: During the last seven years, we coached more than 500 individuals in BPMN classroom trainings, and delivered more than 300 BPMN onsite trainings to organizations all over the world. I would say we probably know our business here.

But people kept asking us for an online version, allowing them to learn BPMN where and when they prefer. So we started working on this, and I expect that we can deliver the first chapters within the next months. The training will be based on our handbook Real-Life BPMN, but probably with a stronger focus on process automation. …

by Jakob Freund at February 14, 2015 05:56 PM

February 09, 2015

Drools & JBPM: The Relationship of Decision Model and Notation (DMN) to SBVR and BPMN

http://www.brcommunity.com/b597.php (Full Article)

Overview
"Publications by James Taylor and Neil Raden[2], Barbara von Halle and Larry Goldberg[1], Ron Ross[7], and others have popularized "Decision Modeling."  The very short summary is that this is about modeling business decision logic for and by business users.
A recent Decision Modeling Information Day conducted by the Object Management Group (OMG)[4] showed considerable interest among customers, consultants, and software vendors.  The OMG followed up by releasing a Request for Proposals (RFP) for a Decision Model and Notation (DMN) specification.[5]  According to the RFP,
"Decision Models are developed to define how businesses make decisions, usually as a part of a business process model (covered by the OMG BPMN standard in Business Process Management Solutions).  Such models are both business (for example, using business vocabularies per OMG SBVR) and IT (for example, mapping to rule engines per OMG PRR in Business Rule Management Systems)."
This quote says a little about how DMN may relate to SBVR[6] and BPMN[3], but there are many more open questions than answers.  How do SBVR rules relate to decisions?  Is there just one or are there multiple decisions per SBVR rule?  Is there more to say about how SBVR and DMN relate to BPMN?
This article attempts to "position" DMN against the SBVR and BPMN specifications.  Of course, DMN doesn't exist yet, so the concepts presented here are more the authors' ideas about how these three specifications should relate to each other than reality.  We present these ideas in the hope that they will positively influence the discussions that lead up to the DMN specification."

by Mark Proctor (noreply@blogger.com) at February 09, 2015 11:32 PM

Thomas Allweyer: Managers preach process orientation but live functional orientation

A new study commissioned by the Gesellschaft für Organisation (gfo) produced interesting results on the adoption of process-oriented organization. Many of the 165 companies represented in the study do align their organization with their processes, but only at the lower management levels. At the upper levels, a strong functional orientation still prevails, particularly in large companies. Top management often mandates stronger process orientation and other efficiency measures for the levels below, but exempts itself. This is all the more regrettable since the same study once again confirms the most important success factor of process management: top management must support process orientation and exemplify it.

For about a third of the respondents, processes are not considered at all when designing the organization, and only one percent have a pure process organization. Overall, small companies fare better than larger ones. About half have implemented essential aspects of consistent process management, including process owners, KPI systems, and control loops for continuous improvement; the other half still largely does without these. At least two thirds use modeling tools, and more than a quarter use BPMN as their notation.

The study, presented in the current issue 1/2015 of the journal Führung + Organisation (zfo), also names major barriers to process organization. First comes the "dominance of function-oriented subcultures", followed by unsuitable incentive and career systems, insufficient alignment of resources and decision-making authority, and political resistance. After the already mentioned top management commitment, the most important success factors named are: motivation of those involved, comprehensive communication of the processes to employees, and agreed, understandable process descriptions.

The complete study will be available from the gfo after publication.

by Thomas Allweyer at February 09, 2015 11:34 AM