Categories: BPM

Slides from OOP 2014 Online: Next-Generation BPM – How to create intelligent Business Processes thanks to Big Data

Just a short blog post with my slides from OOP 2014: Next-Generation BPM – How to create intelligent Business Processes thanks to Big Data.

Content

Business processes are often executed without access to relevant data because of the technical challenges of integrating large volumes of data from different sources into the BPM engine. Companies miss a huge opportunity here! This session shows how to achieve intelligent business processes that improve performance and outcomes by integrating big data – using only open source tooling.

Target Audience: Architects, Developers, Project Leaders, Managers, Decision Makers
Prerequisites: Basic knowledge of a programming language, databases, and BPM concepts
Level: Introductory

You will learn:
1) How to create intelligent business processes
2) Why to combine BPM and big data
3) How to combine BPM and big data

Extended abstract:
BPM is established, the tools are stable, and many companies use it successfully. However, today’s business processes are based on “dumb” data from relational databases or web services, and humans make decisions based on this information. Companies also use business intelligence and other tools to analyze their data. Yet business processes are executed without access to this important information because of the technical challenges of integrating large volumes of data from many different sources into the BPM engine. In addition, bad data quality caused by duplication, incompleteness and inconsistency prevents humans from making good decisions. That is the status quo. Companies miss a huge opportunity here!
This session explains how to achieve intelligent business processes that use big data to improve performance and outcomes. A live demo – based on open source frameworks such as Apache Hadoop – shows how big data can easily be integrated into business processes. In the end, the audience will understand why big data needs BPM to improve data quality, and why BPM needs big data to achieve intelligent business processes.
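To make the idea more tangible, here is a minimal, hypothetical sketch of what such an integration can look like: a Java service task in the style of a Camunda/Activiti JavaDelegate that fetches a score precomputed on Hadoop from Hive via JDBC and stores it as a process variable, so that subsequent gateways and user tasks can decide based on big data. The class name, table, and connection details are illustrative assumptions and not the code from the live demo.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import org.camunda.bpm.engine.delegate.DelegateExecution;
import org.camunda.bpm.engine.delegate.JavaDelegate;

/**
 * Hypothetical service task: enriches a running process instance with an
 * aggregated big data result (e.g. a churn score precomputed in Hadoop/Hive).
 * Requires the Hive JDBC driver on the classpath; endpoint and table are assumptions.
 */
public class ChurnScoreDelegate implements JavaDelegate {

    private static final String HIVE_URL = "jdbc:hive2://hadoop-host:10000/default";

    @Override
    public void execute(DelegateExecution execution) throws Exception {
        String customerId = (String) execution.getVariable("customerId");

        try (Connection con = DriverManager.getConnection(HIVE_URL, "hive", "");
             PreparedStatement stmt = con.prepareStatement(
                     "SELECT churn_score FROM customer_scores WHERE customer_id = ?")) {
            stmt.setString(1, customerId);
            try (ResultSet rs = stmt.executeQuery()) {
                double churnScore = rs.next() ? rs.getDouble(1) : 0.0;
                // Expose the big data result to subsequent gateways and user tasks.
                execution.setVariable("churnScore", churnScore);
            }
        }
    }
}
```

In a BPMN process, a service task would reference this class, and a following gateway could, for example, route customers with a high churn score to a retention offer instead of the standard path.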

Slides

The slide deck is embedded from www.slideshare.net.

Kai Waehner builds cloud-native event streaming infrastructures for real-time data processing and analytics.
