Refreshing Approaches to Workflow Automation

Workflow automation was born of necessity and has evolved an increasingly sophisticated set of tools to manage the growing complexity of the automation itself.

The same theme keeps emerging across the broader spectrum of enterprise and research IT. For instance, we spoke recently about the need to profile software and algorithms when modern GPU systems generate billions of events per iteration. This is a similar challenge, and fortunately not all traditional or physical business processes fall into that scale bucket. Many are far less data intensive, but they can have such a critical impact on time to market, or on the human effort wasted in delivering complex systems, that having the right tools to gain the right insight matters.

To understand a little more about one development in this space, The Next Platform spoke with Jakob Freund, co-founder and CEO of Camunda. Camunda have been going for about ten years, having originally started out as a consulting shop.

Camunda now list customers such as Goldman Sachs, AT&T, and NASA JPL, which is using the company's kit for its Mars 2020 mission. The business model is similar to that of Confluent and Elastic: with a couple hundred listed customers, including Universal Music and others, Camunda total about 80 staff and have yet to take a dime of venture capital. Like many who are building "commercial open source" businesses, Camunda have built a large community of followers and developers who help with enhancements and product development.

While automating processes for their consulting customers with a smörgåsbord of open source technologies, the team found itself buried deep inside the technology of Business Process Model and Notation (BPMN) and had, almost by accident, built a software development group. Freund and his colleague Bernd Rücker wrote a book about their real world experiences with BPMN at the end of 2012. By that point the team had grown to roughly 15 to 20 staffers, and in 2013 they decided to pivot the entire operation into a software company.

The company's software stack is primarily focused on taking visual workflows, or models, and abstracting those diagrams into serialized data that can be used downstream for interactive analysis. Think of Visio or PowerPoint diagrams of workflows, but connected directly to databases and analytics, so that the result is a highly interactive flowchart. That should give you a picture of what they are trying to achieve.
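To get a concrete feel for what "abstracting a diagram into data" means, it helps to remember that a BPMN model is stored as XML, where every box in the flowchart becomes an element. The snippet below is not Camunda's code; it is a minimal sketch using a heavily simplified, hypothetical stand-in for a BPMN 2.0 file (namespaces, sequence flows, and diagram coordinates omitted), parsed with Python's standard library to show how the shapes in a drawing become plain records that analytics can later join against execution data.

```python
import xml.etree.ElementTree as ET

# Heavily simplified, hypothetical stand-in for a BPMN 2.0 process definition;
# a real file carries namespaces, sequence flows, and diagram coordinates.
bpmn_xml = """
<process id="hiring">
  <startEvent id="applicationReceived" name="Application received"/>
  <userTask id="screenCV" name="Screen CV"/>
  <userTask id="scheduleInterview" name="Schedule interview"/>
  <endEvent id="offerMade" name="Offer made"/>
</process>
"""

# Turn the drawing's shapes into records keyed by activity ID, ready to be
# joined against execution data (durations, counts) coming from the engine.
root = ET.fromstring(bpmn_xml)
activities = {elem.get("id"): elem.get("name") for elem in root}
print(activities)
# {'applicationReceived': 'Application received', 'screenCV': 'Screen CV', ...}
```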

"Optimize" is an extension to the existing Camunda stack, which consists of a modeler, a tasklist, and other more traditional BPMN process tools. The Optimize tool kit sits between the traditional onsite relational database infrastructure and an Elasticsearch system, querying workflow data to identify "hot spots". It is essentially the same technique that C and Fortran programmers have used for decades to profile their code, applied here to rather less data intensive tasks. This is the area that Camunda wants to explore further.


Although the system currently scales to 30,000 events per second, most business and process control stacks aren't anywhere near that data intensive. One hiring and staffing process that came up in our discussion showed a clear two week bottleneck. What makes Optimize interesting is that it effectively takes execution data and the XML representation of a drawing (a traditional workflow) and applies result sets against the model to look for hot spots, in a sense reverse engineering the business process with analytics. It then marks up the diagram to show where the hot spots in the process lie, giving the management chain insight into what is actually going on in their complex systems.
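Conceptually, the analysis can be sketched quite simply. The example below is not Optimize's implementation; it is a minimal illustration, assuming a hypothetical list of activity-instance records (activity name plus duration) exported from a workflow engine, that aggregates durations per activity and flags the slowest steps as hot spots. In the product, such aggregates are projected back onto the diagram itself, which is what turns a static flowchart into a heat map of the process.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical activity-instance records as a workflow engine might export them:
# each record names a BPMN activity and how long that instance took, in hours.
events = [
    {"activity": "Screen CV", "duration_hours": 4},
    {"activity": "Schedule interview", "duration_hours": 72},
    {"activity": "Schedule interview", "duration_hours": 340},  # ~2 week outlier
    {"activity": "Make offer", "duration_hours": 8},
]

def find_hot_spots(records, threshold_hours=48):
    """Aggregate durations per activity and flag averages above a threshold."""
    durations = defaultdict(list)
    for record in records:
        durations[record["activity"]].append(record["duration_hours"])
    return {
        activity: mean(values)
        for activity, values in durations.items()
        if mean(values) > threshold_hours
    }

print(find_hot_spots(events))
# -> {'Schedule interview': 206}
```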

Return on Investment

When we asked the critical question of why teams implement the Camunda stack, and how it affects their bottom line, Freund was eager to point out that it isn't as simple as counting dollars saved. The total BPMN market, they claim, will be worth $17 billion by 2023, growing from $4.7 billion in 2017. Customers are clearly saving money, Freund explained, but in many cases the direct benefit is hard to quantify in real world dollars. They do see advantages in their development cycle, citing faster workflow implementation and better time to market, even if those benefits are hard to capture precisely. This seems to be the typical business case: automate, save manpower, and reap the financial savings downstream from the R&D cycles.

One use case they did describe was MyJAR, a short term loan company that could only carry out its business with tightly integrated and automated systems. Freund explained that time to market can be just as critical in many other applications, because the business often has no "line of sight" into its complex processes. In that situation, every little helps.

Artificial Intelligence and Workflows?

Freund explained that the roadmap includes more machine learning, and that customers are starting to "automate" the automated workflow analysis, something Camunda terms "Assisted Analytics". Most of these efforts are coming from the community right now. For example, Zymergen are due to present at the next user community meeting, explaining how they use an integrated design-build-test-analyze-learn (DBTAL) cycle to improve the performance of their engineered microbes. Universal Music will also discuss how they use workflow automation to manage thousands of pieces of digital content. The business and use cases for workflow automation are clearly wide and varied.

Local, not SaaS

The product is an "install it yourself" deal. Business processes tend to be closely guarded, and leaking robotic automation outside the walled garden could be deleterious to any organization. The organizations who install the Camunda stack are typically large, and fortunate enough to have considerable SRE and DevOps resources onsite already running the rest of their business. Computing capacity isn't critical: there are no GPU accelerators or low latency networks at play here, although the company does focus on performance within the software stack itself. Freund also mentioned that the Daily Telegraph, for example, installs the stack directly on Amazon Web Services rather than on its own tin; as with many of these up and coming analytics systems, a degree of flexibility is required by the community. The reliance on Elasticsearch on the backend also requires in-house expertise to manage scale beyond a single instance, but again, many organizations are having to develop those skills anyway.

Microservices Orchestration

Towards the end of the discussion, the conversation turned to the company's open source Zeebe offering, which attempts to orchestrate workers and microservices against existing visual workflows. The bold design specification includes:

  • Batching of I/O operations
  • Linear read/write data access patterns
  • Compact, cache-optimized data structures
  • Lock-free algorithms and actor concurrency (green threads model)
  • Broker is garbage-free in the hot/data path

They pitch Zeebe as being capable of processing 160K to 200K events per second on modern SSDs and gigabit Ethernet. Zeebe is under active development and is available as a containerized install from GitHub.
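At its heart, the orchestration model Zeebe describes is the familiar job worker loop: the broker drives a BPMN process and hands out service tasks, while stateless workers poll for jobs, do the work, and report completion. The sketch below is not the Zeebe client API; it is a minimal, self-contained illustration of that pattern, assuming a hypothetical in-memory broker with poll_jobs() and complete() methods standing in for the real thing.

```python
import time

class InMemoryBroker:
    """Hypothetical stand-in for a workflow broker that hands out service tasks."""
    def __init__(self, jobs):
        self._queue = list(jobs)
        self.completed = []

    def poll_jobs(self, job_type, max_jobs=10):
        """Return up to max_jobs pending jobs of the requested type."""
        taken = [j for j in self._queue if j["type"] == job_type][:max_jobs]
        for job in taken:
            self._queue.remove(job)
        return taken

    def complete(self, job_key, result):
        """Record a finished job, as a worker would report back to the broker."""
        self.completed.append((job_key, result))

def run_worker(broker, job_type, handler):
    """Poll-work-complete loop a microservice worker runs against the broker."""
    while True:
        jobs = broker.poll_jobs(job_type)
        if not jobs:
            break  # a real worker keeps polling; here we stop when the queue drains
        for job in jobs:
            broker.complete(job["key"], handler(job["payload"]))
        time.sleep(0.01)

broker = InMemoryBroker([{"key": 1, "type": "charge-card", "payload": {"amount": 42}}])
run_worker(broker, "charge-card", lambda payload: {"charged": payload["amount"]})
print(broker.completed)  # [(1, {'charged': 42})]
```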

From a distance, Zeebe looks a whole lot like just another way of integrating microservices; Apache Kafka has been used to orchestrate similar systems. Freund's claim is that Kafka lacks visibility into the business process, and that is where he plans to take his development work and his experience with BPMN. The Next Platform has discussed Kafka in detail in previous articles, so you can compare and contrast.

So there you have it: a traditional open source business model from a team who pivoted from BPMN consulting services to building new software that helps orchestrate containers and microservices for the business. The core software is free to download, with the "Optimize" piece being part of the Enterprise bundle, available on a standard 30 day trial so you can try it out yourself and see if it improves your business processes.
