CamundaLocal Toronto 2024

When conferences end up in my backyard, I usually find a way to attend. Today, I’m at CamundaLocal Toronto, a one-day conference and workshop for Camunda customers, prospects and partners. Toronto is the hot spot for Canadian finance, and we can see the major bank towers from our perch at the top of the Bisha Hotel. I’ve worked for a long time as a process automation consultant in this region, and there is an overwhelming number of investment firms, banks, insurers and other related financial organizations within a short distance. Lots of representation from the big banks and other financials in attendance today.

Our host for the day is Lisa-Marie Fernandes, Camunda Strategic Account Executive, and after she introduced the day and gave some background on Camunda, we heard from Sathya Sethuraman, Field CTO, on the problems with siloed automation versus Camunda’s vision of universal process orchestration. Every technology company and CIO has been talking about the necessity of business transformation for years, and Sathya pointed out that process automation is a fundamental part of any sort of digital transformation. The problem arises with siloed automation: local automation within departments or even sub-departments, with no automated interaction between parts of the process. Although people in that department are convinced of the value of their local automation, customers deal with a much broader end-to-end process in order to do business with an organization. With siloed automation, that customer journey is fragmented and difficult, with the customer having to fill in the gaps of the process themselves through multiple phone calls, forms and emails, while re-entering their information and explaining their issues at each step. Contrast that with an organization that has end-to-end orchestration using a product such as Camunda to bind together these departmental processes into a full customer journey. The customer no longer has to deal with multiple processes (while trying to figure out what those processes are from the outside), and people inside the organization no longer have to make heroic efforts just to meet the customer’s needs.

Sathya pointed out the risks of broken (or missing) end-to-end automation, including the increased complexity caused by multiple isolated processes, and the disconnect between business and IT due to the lack of a common language for process automation. From an organization’s standpoint, there are issues with the efficiency and quality of internal processes, but the bigger issue is customer satisfaction: if a competitor has a fully-automated end-to-end process that makes things easier for the customer, then many customers will take that path of least resistance when deciding who gets their business. This, of course, is not new. However, with the changes that we’ve seen in the past four years due to the pandemic, work from home and shifting supply chains, more streamlined and automated processes have become a true competitive differentiator. He had an interesting slide showing a heatmap of the many process orchestration opportunities just in the consumer banking value chain: probably 50-60 distinct processes across customer management, retail, lending, cards, payments, risk, finance and accounting, and corporate management where many financial organizations have room for improvement.

Daniel Meyer, Camunda CTO, then gave us a product roadmap update. He kicked off with a view of the common reality of most large organizations, echoing Sathya’s points about islands of automation without end-to-end process orchestration. Many companies focus on improving the local automation — making their core banking systems better, for example — without considering how this is making the customer journey worse because there’s no integration or linkage between these disparate systems. Opening a bank account as a new customer? There are likely different systems and processes for know-your-client checks, credit checks, customer onboarding and account opening; when one isn’t connected to the others, the customer experiences delays and develops an increasingly negative view of the organization as they struggle to get the new account open. The missing link is end-to-end process orchestration, and this is the focus that Camunda has defined for their product over the past few years.

The challenges of end-to-end orchestration include endpoint diversity — it’s more difficult to integrate a heterogeneous set of endpoints since they have different interfaces and may be at different levels — and process complexity that goes beyond a simple sequence of steps. He showed where some of the customer use cases live on the endpoint diversity versus process complexity graph, and the sweet spot for Camunda in the top right quadrant of that graph, where both the processes and the system integrations are complex. Of course, those are also the mission-critical processes that are controlled by IT, even though business people may be involved in the design and requirements. In this context, their vision is Camunda as the universal process orchestrator: providing the capability to handle complex process flows as well as being able to integrate human work, AI, business rules, microservices, RPA, IoT and APIs of all types. That being said, they still have quite a developer-centric bias.

Daniel spent a bit of time on some of the terminology and usage: process orchestration versus business process management (essentially the same when you’re talking about the automation side of BPM, and Camunda also uses the term “workflow engine” for the core of process automation), and who within an organization creates BPMN diagrams. This was a pretty technical audience, and there was a bit of discussion on the value of BPMN once you get down to a level of detail that a non-technical process owner isn’t going to be looking at. I believe it’s still incredibly valuable as a graphical process-oriented development language even for those models that are not directly viewed by the business people: it provides a level of functionality as well as development guardrails (e.g., model validation) that can accelerate development and increase reusability while making it more accessible to less experienced developers.
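
As a small illustration of that model validation guardrail, here’s a minimal sketch using Camunda’s BPMN model API (camunda-bpmn-model); the file name is a hypothetical example of mine, not something shown today.

```java
// Minimal sketch of programmatic model validation with Camunda's BPMN model API.
// The file "order-process.bpmn" is a hypothetical example.
import org.camunda.bpm.model.bpmn.Bpmn;
import org.camunda.bpm.model.bpmn.BpmnModelInstance;
import org.camunda.bpm.model.xml.ModelValidationException;

import java.io.File;

public class ModelCheck {

  public static void main(String[] args) {
    BpmnModelInstance model = Bpmn.readModelFromFile(new File("order-process.bpmn"));
    try {
      // Validates the model against the BPMN meta-model before deployment.
      Bpmn.validateModel(model);
      System.out.println("Model is structurally valid");
    } catch (ModelValidationException e) {
      System.out.println("Model has problems: " + e.getMessage());
    }
  }
}
```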

Camunda, of course, is only one piece of the entire process improvement cycle, albeit a critical core piece of the automation. Most customers are also using some combination of process mining, more advanced business modeling, and business intelligence; these essential activities of discovery, design and analysis/improvement fit around Camunda’s process automation and low-code integration offerings. Some automation technologies, including AI, RPA and an event bus, are not part of the core Camunda platform but are easily integrated, allowing best-of-breed components to be added. And although Camunda has recently added some front-end UI capabilities, these are pretty rudimentary forms, and most companies will be integrating Camunda into their existing application/UI development environment.

Daniel took us through the investment focus that Camunda has for product development in the areas of developer productivity, AI/ML, collaboration, business intelligence, low-code and universal connectivity. He also highlighted the availability of connectors in the Camunda Marketplace that can be pulled in and used in any model for additional connectivity beyond what is included in the product out of the box.

Camunda has a regular cycle of releases every six months, and he gave us a quick overview of what’s coming up in 8.5 in April and 8.6 in October, as well as future plans for 8.7 and beyond. This is all for systems based on the Zeebe architecture (V8+); it’s not clear from this presentation what, if anything, they are doing for customers still on V7 aside from encouraging and assisting with migration to V8. As with many vendors that completely replatform, migrations to the new platform are likely much slower than they anticipated, with the added complication here that they are leaving behind their open source legacy with the shift to the modern Zeebe engine.

After lunch, there was a short customer presentation by the insurer Norfolk & Dedham, who are using Camunda 7 and Spring Boot as an embedded engine invoked from their own UI portal for managing claims. They customized the Camunda Modeler to integrate their own applications and data sources, in order to accelerate development. In addition to the usual benefits from process automation and integration, the data generated by process instances is hugely valuable to them when looking at how to translate operational activities into more strategic actions as they plan future improvements to their business operations. Now that their claims applications are in advanced testing stages, they are expanding the same technical stack to their underwriting applications, and are seeing a significant benefit from the reusability of components that they created during the claims project. They aren’t yet planning a V8 migration, although they are very cognizant that this is likely in their future.
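
For anyone who hasn’t seen the embedded engine pattern, here’s a rough sketch of what it could look like. This is my own illustration with a hypothetical process key and variables, not code from their presentation; the camunda-bpm-spring-boot-starter dependency embeds the engine in the Spring Boot application and deploys BPMN files from the classpath.

```java
// Minimal sketch of an embedded Camunda 7 engine in a Spring Boot app,
// started from a custom portal's REST call. The process key "claim-handling"
// and the variables are hypothetical.
import org.camunda.bpm.engine.RuntimeService;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

import java.util.Map;

@RestController
public class ClaimController {

  private final RuntimeService runtimeService;

  public ClaimController(RuntimeService runtimeService) {
    this.runtimeService = runtimeService;
  }

  @PostMapping("/claims")
  public String startClaim(@RequestBody Map<String, Object> claimData) {
    // Starts an instance of the BPMN process with id "claim-handling",
    // using the claim number as the business key for later lookup.
    return runtimeService
        .startProcessInstanceByKey("claim-handling",
            (String) claimData.get("claimNumber"), claimData)
        .getId();
  }
}
```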

Following the customer presentation, Gustavo Mendoza, a Camunda senior sales engineer, gave us a technical demo of the V8 product stack to expand on some of the points that Daniel Meyer went through in his presentation earlier. We saw demos of the process and decision modelers, including the collaboration features, the UI forms builder for task handling, then the execution environment to see the user interaction with a running process as well as the Operate monitoring portal and integration with Slack.

The remainder of the afternoon was split into a hands-on live coding workshop for customers, and a partner workshop. A full and worthwhile day if you’re in the Camunda ecosystem.

Camunda has a couple of other CamundaLocal events coming up soon in Chicago and San Francisco, plus their main European conference in Berlin in May and their North American conference later in the year (September in New York, I think).

Have a Question? Ask Ada!

Today is Ada Lovelace Day — marking the contributions of the world’s first “programmer” — and the perfect day for Bizagi to launch their Ask Ada generative AI that helps knowledge workers find answers to questions about their organization’s data. Check out the short video clip on the AI product page to see how it looks to a user; basically, this is conversational analytics out of the box without having to predefine the analytics.

I had a sneak peek a few days ago with the always-informative Rachel Brennan, Bizagi’s VP of Product and Solutions Marketing, and she pointed out some of the important governance and privacy safeguards that they have put in place:

  • Ada uses Azure Private OpenAI GPT Service, rather than the public service
  • Ada is trained on Bizagi’s data structure and does not share any private data
  • Ada filters information to present only what is authorized and relevant to the user’s role and context

This focus on governance and privacy is something that a lot of companies are struggling with, but Bizagi seems to be moving in the right direction.

Many companies are choosing to focus on genAI for “co-pilot” developer tasks, including creating process models, or for replacing human steps in processes; having Ada as a trusted advisor for knowledge workers is a different angle on how AI can be used in the context of business processes. I’m imagining many other types of “user assist” tasks where AI can be applied, such as summarizing a long-running customer transaction so that the worker doesn’t have to read through every piece of content associated with that customer.

Ask Ada will be released this month, and is free to Bizagi customers until June 30, 2024. Looking forward to seeing how they expand these capabilities in the months to come.

CamundaCon 2023 Day 2: GPT Meets BPM for AI-Driven Workflow

Bennet Krause of Holisticon, an IT consultancy, presented some of the integrations that they’ve created between Camunda and GPT, which could be applied to other Large Language Models (LLMs). Camunda provides an OpenAI connector, but there are many other LLMs that may provide better functionality depending on the situation. Holisticon has created an open source GPT connector, which Bennet demonstrated in a scenario for understanding an inbound customer email and constructing an outbound response after the issue has been resolved by a customer service representative.

They have a number of foundational connectors — extract structured data from unstructured data, make decisions or classifications, compose text from instructions and templates, and natural language translation — as well as what he calls agentic connectors, which are automated agents interacting with the outside world.

The addition of the agentic connectors allowed some paths in his customer service example to become completely automated, replacing the customer service representative with an automated agent. These connectors include a database connector to query SQL databases, an OpenAPI connector to interact with REST services, a Q&A retrieval connector to answer questions based on documentation, a process connector to dynamically model and execute processes, and a plan-and-execute connector.
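
Since I haven’t used the Holisticon connectors myself, here’s a generic sketch of the underlying pattern rather than their actual API: an LLM-backed classification step exposed to a Camunda 8 process as a job worker via spring-zeebe, with a hypothetical LlmClient standing in for whatever GPT SDK is used.

```java
// Generic sketch (not Holisticon's connector API) of an LLM-backed step behind
// a Camunda 8 service task. Service tasks with task type "classify-email" are
// routed to this worker; LlmClient is a hypothetical placeholder.
import io.camunda.zeebe.spring.client.annotation.JobWorker;
import io.camunda.zeebe.spring.client.annotation.Variable;
import org.springframework.stereotype.Component;

import java.util.Map;

@Component
public class EmailClassificationWorker {

  private final LlmClient llm; // hypothetical LLM client abstraction

  public EmailClassificationWorker(LlmClient llm) {
    this.llm = llm;
  }

  @JobWorker(type = "classify-email")
  public Map<String, Object> classifyEmail(@Variable String emailBody) {
    String category = llm.complete(
        "Classify this customer email as QUESTION, COMPLAINT or CANCELLATION:\n"
            + emailBody);
    // The returned map is merged into the process instance's variables.
    return Map.of("emailCategory", category.trim());
  }

  /** Hypothetical interface standing in for a real LLM SDK. */
  public interface LlmClient {
    String complete(String prompt);
  }
}
```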

He warned of some of the potential issues with replacing human decisions and actions with AI, including bias in the LLMs, then finished with their plans for new and experimental connectors. In spite of the challenges, LLMs can help to automate or assist many BPM tasks and you can expect to see much more interaction between AI and BPM in the future.

This is the last session I’ll be at on-site for this edition of CamundaCon: we have the afternoon break now, then I need to head for the airport shortly after. I’ll catch up on the last couple of sessions that I missed when the on-demand comes out next week, and will post a link to the slides and presentations in case you want to (re)view any of the sessions.

CamundaCon 2023 Day 2: Process and Decision Standards

Falko Menge and Marco Lopes from Camunda gave a presentation on Camunda’s involvement in the development of OMG’s core process and decision standards, BPMN and DMN. Camunda (and Falko in particular) has been involved in OMG standards for a long time, and embraces these two standards in their products. Sadly, at least to me, they gave up support for the case management standard, CMMN, due to lackluster market adoption; other vendors such as Flowable support all three of the standards in their products and have viable use cases for CMMN.

Falko and Marco gave a shout-out to universities and the BPM academic research conference that I attended recently as promoters of both the concepts of standards and future research into the standards. Camunda has not only participated in the standards efforts, but the co-founders also wrote the book Real-Life BPMN as they discovered the ways that BPMN can best be used.

They gave a good history of the development of the BPMN standard and also of Camunda’s implementation of it, from the early days of the Eclipse-based BPMN modeler to the modern web-based modelers. Camunda became involved in the BPMN Model Interchange Working Group (MIWG) to be able to exchange models between different modeling platforms, because they recognized that a lot of organizations do much broader business modeling in tools aimed at business analysts, then want to transfer the models to a process execution platform like Camunda. Different vendors choose to participate in the BPMN MIWG tests, and the results are published so that the level of interoperability is understood.

DMN is another critical standard, allowing modelers to create standardized decision models; it also supports the Friendly-Enough Expression Language (FEEL) for scripting within the models. The DMN Technology Compatibility Kit (TCK) is a set of decision models and expected results that provides test results similar to those of the BPMN MIWG tests: information about each vendor’s product test coverage is published so that their implementation of DMN can be assessed by potential customers.
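
If you haven’t worked with DMN outside of a modeler, here’s a minimal sketch of evaluating a decision table with Camunda’s standalone DMN engine (camunda-engine-dmn). The decision key, inputs and discount.dmn file are hypothetical examples of mine, and the input entries in such a table would typically be FEEL expressions such as >= 1000 or "gold".

```java
// Minimal sketch: parse a (hypothetical) DMN decision table from the classpath
// and evaluate it with some input variables.
import org.camunda.bpm.dmn.engine.DmnDecision;
import org.camunda.bpm.dmn.engine.DmnDecisionTableResult;
import org.camunda.bpm.dmn.engine.DmnEngine;
import org.camunda.bpm.dmn.engine.DmnEngineConfiguration;
import org.camunda.bpm.engine.variable.VariableMap;
import org.camunda.bpm.engine.variable.Variables;

import java.io.InputStream;

public class DiscountDecisionExample {

  public static void main(String[] args) {
    DmnEngine dmnEngine = DmnEngineConfiguration
        .createDefaultDmnEngineConfiguration()
        .buildEngine();

    // Parse the decision with key "discount" from a DMN file on the classpath.
    InputStream dmnFile =
        DiscountDecisionExample.class.getResourceAsStream("/discount.dmn");
    DmnDecision decision = dmnEngine.parseDecision("discount", dmnFile);

    VariableMap inputs = Variables.createVariables()
        .putValue("orderTotal", 1200)
        .putValue("customerTier", "gold");

    DmnDecisionTableResult result = dmnEngine.evaluateDecisionTable(decision, inputs);
    System.out.println("Discount: " + result.getSingleEntry());
  }
}
```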

Although standards are sometimes decried as being too difficult for business people to understand and use (they’re really not), they create an environment where common executable models of processes and decisions can be created and exchanged across many different vendor platforms. Although there are many other parts of a technology stack that can create vendor lock-in, process and decision models don’t need to be part of that. Also, someone working at a company that uses BPMN and DMN modeling tools can easily move to a different organization that uses different tools without having to relearn a proprietary modeling language. From a model richness standpoint, many vendors and researchers working together towards a common goal can create a much better and more extensive standard (as long as they’re not squabbling over details).

They went on to discuss some of the upcoming new standards: BPM+ Harmonization Model and Notation (BHMN), Shared Data Model and Notation (SDMN), and Knowledge Package Model and Notation (KPMN), all of which are in some way involved in integrating BPMN and DMN due to the high degree of overlap between these standards in many organizations. Since these standards aren’t yet close to release, they’re not planned for a near-future version of Camunda, but they’ll be added to the product management roadmap as the standards evolve and the customer requirements for them become clear.

CamundaCon 2023 Day 2: Healthcare Workflow to Improve Patient Outcomes

Steven Gregory of Cardinal Health™ Sonexus™ Access and Patient Support, a healthcare technology provider, presented on some of the current US healthcare trends — including value-based care and telemedicine — and the technology trends that are changing healthcare, from IoT wearable devices to AI for clinical decisioning. Healthcare is a very process-driven industry, but many of the processes are manual, or embedded within forms, or within legacy systems: scheduling, admin/discharge, insurance, and health records management. As with many other industries, these “hidden” workflows are critical to patient outcomes but it’s not possible to see how the flows work at any level, much less end-to-end.

There’s some amount of history of clinical workflow automation; I worked with Siemens Medical Systems (now Cerner) on their implementation of TIBCO’s workflow more than 10 years ago, and even wrote a paper on the uses of BPM in healthcare back in 2014. What Steven is talking about is a much more modern version of that, using Camunda and a microservice architecture to automate processes and link legacy systems.

They implemented a number of patient journey workflows effectively: appointment creation, rescheduling and cancellation; benefits verification and authorization; digital enrollment; and some patient-facing chatbot flows. Many of these are simply automations of the existing manual processes, but there’s a lot of benefit to be gained as long as you recognize that this isn’t the final version of the flow, but a milestone on the journey to process improvement.

He discussed a really interesting use case of cell and gene therapy: although they haven’t rolled this out yet, it’s a complex interaction of systems integration, data tracking across systems, and unique manufacturing processes, all while providing personalized care to patients. He feels that Camunda is key for orchestrating complex processes like this. In the Q&A, he also spoke about the difference in ramp-up time for their developers, and how much faster it is to learn Camunda and individual microservices than a legacy system.

Great examples of moving beyond straightforward process orchestration for improving critical processes.

CamundaCon 2023 Day 2 Technical Keynote

The second day of CamundaCon started with a keynote by Camunda co-founder and chief technologist Bernd Ruecker and CTO Daniel Meyer. They started with the situation that plagues many organizations: point-to-point integrations between heterogeneous legacy systems and a lot of manual work, resulting in inefficiencies and fragile system architecture. News flash: your customers don’t care about your aging IT infrastructure, they just want to be served in a way that works for them.

You can swap all of this out with a “big bang” approach that changes everything at once, but that’s usually pretty painful and doesn’t work that well. Instead, they advocate starting with a gradual modernization that looks more like the following.

First, model your process and track the flow as it moves through different systems and steps. This allows you to understand how things work without making any changes, and identify the opportunities for change. You can actually run the modeled processes, with someone manually moving them through the steps as the work completes on other systems, and tracking the work as it passes through the model.

Next, start orchestrating the work by taking the flow that you have, identifying the first best point to integrate, and doing the integration to the system at that step. Once that’s working, continue integrating and automating until all the steps are done and the legacy systems are integrated into this simple flow.

Then, start improving the process by adding more logic, rearranging the steps, and integrating/automating other systems that may be manually integrated.
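
As a hedged sketch of what that progression can look like in Camunda 7 terms (the process, task names and delegate class are hypothetical, not from the keynote): most steps start out as user tasks that people complete manually while the engine tracks the flow, and the “first best point to integrate” becomes a service task.

```java
// Hedged sketch of "model and track first, then integrate one step at a time"
// using Camunda 7's BPMN fluent builder API (camunda-bpmn-model). All names are
// hypothetical. Two steps remain manual user tasks; "Check credit" is the first
// step converted to an automated service task.
import org.camunda.bpm.model.bpmn.Bpmn;
import org.camunda.bpm.model.bpmn.BpmnModelInstance;

import java.io.File;

public class AccountOpeningModel {

  public static void main(String[] args) {
    BpmnModelInstance model = Bpmn.createExecutableProcess("account-opening")
        .startEvent()
        .userTask("kyc").name("KYC check (still manual)")
        .serviceTask("credit").name("Check credit")
            .camundaClass("com.example.CreditCheckDelegate") // first integrated step
        .userTask("open").name("Open account (still manual)")
        .endEvent()
        .done();

    // Write the model out so it can be deployed to the engine.
    Bpmn.writeModelToFile(new File("account-opening.bpmn"), model);
  }
}
```

Nothing else in the flow has to move when that one step is integrated, which is what makes the incremental approach so much less risky than a big bang.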

That’s a great approach for a first project, or when you’re just focused on automating a couple of processes, but you also need to consider the broader transformation goals and how to drive that transformation across your entire organization. There are a number of different components of this: establishing a link between value chains, orchestrations and down through to business and technical capabilities; driving reuse within your organization using the newly-launched Camunda Marketplace; and providing self-service deployment of Camunda to remove any barriers to getting started.

An important part of your modernization journey will be the use of connectors, which allow you to expose integrations with a wide variety of system types directly in a process model without the modeler needing to understand the technical intricacies of the system being called. This, and the use of microservices to provide additional plug-in functionality, makes it easier for developers and technical analysts to build and update process-centric applications quickly. Underpinning that is how you structure development teams within your organization (autonomy versus centralization) and support them with a CoE, smoothing the path to successful implementations.

In short, the easier you make it for teams to build new applications that fit into corporate standards and meet business goals, the less likely you are to have business teams be forced to go out and try to solve the problem themselves when they really need a more technical approach, or just suffer with a manual approach. You’ll be able to categorize your use cases to understand when a business-driven low-code solution will work, and what you need the technical developers to focus on.

Camunda now includes a much friendlier out-of-the-box user interface, rich(er) forms support and testing directly in the process modeler; this allows more of the “yellow” areas in their use case categorization to be implemented by less-technical developers and analysts. They are also looking at how AI can be used for generating simple process models or providing help to a person who is building a model, as well as the more common use of predictive decisioning. They’ve even had a developer in the community create BpmnGPT to demonstrate how an AI helper can assist with model development.

They wrapped up with a summary of the journey from your first project to scaling adoption to a much broader transformation framework. Definitely some good goals for those on any process automation journey.

CamundaCon 2023 Day 1 Wrapup

Our first day at CamundaCon in New York wrapped up with a conversation between Camunda CTO Daniel Meyer and Ernst and Young’s Deepak Tiwari, moderated by Amy Johnston. The focus was on implementing best-of-breed automation in financial services — definitely a topic that I can get behind, as you might know from my Rolling Your Own Digital Automation Platform presentation at bpmNEXT in 2019, which ended up with me delivering a similar keynote later that same year at CamundaCon 2019 in Berlin. Although there is a technical challenge with selecting the right products to mix into your hyperautomation platform, the bigger issue can be managing multiple vendors, especially when something goes wrong. You also need to create and enforce best practices for how to use the different products together through a CoE, so that you don’t end up with someone using a process model as a decision tree or other design atrocities.

They covered a variety of topics: how financial institutions are now considering themselves to be software companies in their own right (which can be okay, or can go horribly, horribly wrong); the impact of low-code platforms; legacy home-grown platforms versus best-of-breed solutions based on commercial components; using a process-first approach; IT modernization via cloud replatforming, which then triggers application modernization; generative AI (because you can’t have a tech conversation these days without mentioning it at least once); and the interaction of personal devices with financial services. Much of what they discussed was not specific to financial services, and some of the audience questions were specific to Camunda directions.

We finished with a brief fireside chat (sans fire) between Amy and Jakob Freund, touching on CoE, process-first orientation, AI for solution acceleration, and how Camunda has changed and grown as a company.

Having spent the day perched on a hard wooden chair, balancing my folding keyboard and tablet on my lap, I’m ready to go off the record. Back with day 2 coverage tomorrow.

CamundaCon 2023 Day 1: Using Camunda for Electronic Document Handling at CERN

As we reached the end of the first day of CamundaCon 2023 in NYC, Dimitri Kekelidze of Conseil Européen pour la Recherche Nucléaire (CERN, or what we in English would call the European Council for Nuclear Research) presented on their use of Camunda for handling electronic documents. Although an “administrative” sort of workflow, it’s pretty critical: they started in the early 1990s to move from paper to electronic forms and documents, and those are used for everything from access requests to specific buildings to all types of HR forms to training requests to purchasing.

The volume is pretty significant: there have been 2 million leave requests alone since the system was started in 1992, and this is not just about filling out a form online; it’s the entire process of handling the request that is kicked off by the form. In short, these are document-driven processes where the document is an electronic form filled out on their employee portal.

CERN started their electronic document handling in 1992 (long before Camunda) with a home-built workflow solution, then moved to Oracle Workflow in 1998, then to ActiveVOS (a name I haven’t heard in years!) in 2006, Activiti in 2013, and Camunda in 2021. Making the move from Activiti to Camunda meant that they could migrate quite a bit of the BPMN and code, although he discussed some specific functions that required a bit of work to migrate. Since then, they’ve migrated 65 processes, and have 9 still to migrate; this has necessitated an architecture that supports both Activiti and Camunda, depending on the type of workflow.
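
To give a sense of why the code migration is relatively mechanical: Camunda 7 and Activiti share a common ancestry, so a typical Java delegate mostly needs its imports changed. A hypothetical example (not one of CERN’s actual delegates):

```java
// Sketch of an Activiti-to-Camunda 7 delegate migration: the business logic is
// usually unchanged, only the engine API packages differ. The class itself is a
// hypothetical example.

// Before (Activiti):
// import org.activiti.engine.delegate.DelegateExecution;
// import org.activiti.engine.delegate.JavaDelegate;

// After (Camunda 7):
import org.camunda.bpm.engine.delegate.DelegateExecution;
import org.camunda.bpm.engine.delegate.JavaDelegate;

public class MarkDocumentSigned implements JavaDelegate {

  @Override
  public void execute(DelegateExecution execution) {
    // Same variable handling API in both engines.
    execution.setVariable("signed", true);
  }
}
```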

Because of the business criticality, volume and complexity of these processes, there is a significant amount of testing prior to migration. They had a number of bugs/glitches during the migration: some because they had some legacy code that was no longer required (such as marking workflow instances busy/free in an external database), and some due to overly complex BPMN diagrams that could be redrawn in a simpler fashion to remove manual escalations and error handling.

In the upcoming months, they plan to complete the migration of the remaining 9 processes so that they can decommission Activiti. They will also be upgrading to Camunda 8, and adding some custom plugins to Cockpit for monitoring and managing the flows.

CamundaCon 2023 Day 1: Provinzial and CoE

We’ve heard a few things already today on the value of centres of excellence (CoE): reaching a strategic, scaled level of process orchestration maturity is going to require a CoE. André Wickenhöfer and Björn Brinkmann of Provinzial, a large German insurance company, were joined by Leon Strauch of Camunda to discuss CoEs in general and how they are used at Provinzial.

CoEs can be centralized or more distributed — often they start out centralized but end up decentralized and federated within large organizations — but have some common activities and areas of expertise.

Provinzial started developing their CoE in conjunction with some successful implementation projects over the last 10 years; they spent quite a bit of time discussing the complexity of their projects, and how having some shared knowledge, methodologies and tools via their CoE has made it possible to do projects of this scope so successfully. The CoE supports the hyperautomation tech stack that is then used by the projects, allowing for common areas of discovery, design, automation and improvement. One of their primary conclusions was that this type of scaling is only possible through the use of a CoE and federated delivery teams.

Although the presentation was supposed to be about their CoE, we heard a lot of great information on the automation projects that they’ve been developing, and how each project can push new tools and methods back to the CoE for more widespread acceptance — worth watching the replay when it’s available to better understand what they learned along this journey.