
Slides from OOP 2014 Online: Next-Generation BPM – How to create intelligent Business Processes thanks to Big Data

Just a short blog post with my slides from OOP 2014: Next-Generation BPM – How to create intelligent Business Processes thanks to Big Data.

Content

Business processes are often executed without access to relevant data because integrating large volumes of data from different sources into the BPM engine is technically challenging. Companies miss a huge opportunity here! This session shows how to achieve intelligent business processes that improve performance and outcomes by integrating big data – just with open source tooling.

Target Audience: Architects, Developers, Project Leaders, Managers, Decision Makers
Prerequisites: Basic knowledge of a programming language, databases, and BPM concepts
Level: Introductory

You will learn:
1) How to create intelligent business processes
2) Why to combine BPM and Big Data
3) How to combine BPM and Big Data

Extended abstract:
BPM is established, tools are stable, and many companies use it successfully. However, today’s business processes are based on “dumb” data from relational databases or web services, and humans make decisions based on this information. Companies also use business intelligence and other tools to analyze their data. Yet business processes are executed without access to this important information, because integrating large volumes of data from many different sources into the BPM engine is technically challenging. Additionally, poor data quality due to duplication, incompleteness and inconsistency prevents humans from making good decisions. That is the status quo. Companies miss a huge opportunity here!
This session explains how to achieve intelligent business processes that use big data to improve performance and outcomes. A live demo – based on open source frameworks such as Apache Hadoop – shows how big data can be integrated into business processes easily. In the end, the audience will understand why big data needs BPM to improve data quality, and why BPM needs big data to achieve intelligent business processes.
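The talk does not tie the demo to one specific product stack, but the core idea can be illustrated with a small sketch. Assuming an Activiti-/Camunda-style BPM engine and a churn score that a Hadoop batch job has precomputed and exposed through Hive, a Java service task could pull that score into the running process instance. All class, table and variable names below (ChurnScoreDelegate, customer_scores, customerId, churnScore) are hypothetical and chosen only for illustration:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

import org.activiti.engine.delegate.DelegateExecution;
import org.activiti.engine.delegate.JavaDelegate;

// Hypothetical service task that enriches a running process instance with a
// churn score precomputed by a Hadoop batch job and exposed through Hive.
public class ChurnScoreDelegate implements JavaDelegate {

    // HiveServer2 JDBC endpoint of the Hadoop cluster (host, port and credentials are placeholders)
    private static final String HIVE_URL = "jdbc:hive2://hadoop-master:10000/default";

    static {
        try {
            // Register the HiveServer2 JDBC driver (hive-jdbc must be on the classpath)
            Class.forName("org.apache.hive.jdbc.HiveDriver");
        } catch (ClassNotFoundException e) {
            throw new IllegalStateException("Hive JDBC driver not found", e);
        }
    }

    @Override
    public void execute(DelegateExecution execution) {
        // Process variable set earlier in the process, e.g. via a start form
        String customerId = (String) execution.getVariable("customerId");

        try (Connection con = DriverManager.getConnection(HIVE_URL, "bpm", "");
             PreparedStatement stmt = con.prepareStatement(
                     "SELECT churn_score FROM customer_scores WHERE customer_id = ?")) {
            stmt.setString(1, customerId);
            try (ResultSet rs = stmt.executeQuery()) {
                // Fall back to a neutral score if the batch job has not scored this customer yet
                double churnScore = rs.next() ? rs.getDouble(1) : 0.5;
                // Expose the score as a process variable so a BPMN gateway can route on it
                execution.setVariable("churnScore", churnScore);
            }
        } catch (SQLException e) {
            // Surface the failure to the engine for retry or incident handling
            throw new RuntimeException("Hive lookup failed for customer " + customerId, e);
        }
    }
}
```

In the BPMN 2.0 model, a service task would reference such a delegate, and an exclusive gateway could then route customers with a high churn score to a retention sub-process. That kind of data-driven routing is what the abstract refers to as an intelligent business process.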

Slides

The slides are embedded from SlideShare (www.slideshare.net).
