Open Source Project Flogo – Overview, Architecture and Live Demo

In October 2016, the open source IoT integration framework Flogo was published as a first developer preview. This blog post is intended to give a first overview of Flogo. You can either browse through the slide deck or watch the videos.

In short, Flogo is an ultra-lightweight integration framework powered by the Go programming language. It is open source under the permissive BSD license and easily extensible for your own use cases. Flogo is used to develop IoT edge apps or cloud-native / serverless microservices. It is therefore complementary to other integration solutions and IoT cloud platforms.

Some key characteristics:

  • Ultra-light footprint (powered by Golang) for edge devices: a zero-dependency model, very low disk and memory footprint, and very fast startup time
  • Runs on a variety of platforms (edge device, edge gateway, on premise, cloud, container)
  • Connectivity to IoT technologies (MQTT, CoAP, REST, …)
  • Highly optimized for unreliable IoT environments
  • Intended for developers, integration specialists, and citizen integrators, who can either write source code or leverage the Web UI for visual coding, testing, and debugging
  • Includes innovative features such as a web-native step-back debugger to interactively design and debug a process, simulate sensor events, and change data or configuration without restarting the complete process
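
To make the programming model a bit more concrete, here is a minimal sketch of what a Flogo app descriptor can look like: a JSON file that wires a trigger (here MQTT, one of the IoT protocols listed above) to a flow of activities, which the Flogo tooling then compiles into a single static Go binary. The specific reference paths, settings keys, and schema fields below are illustrative assumptions, not the exact developer-preview schema; consult the current Flogo documentation for the authoritative format.

```json
{
  "name": "sensor-bridge",
  "type": "flogo:app",
  "version": "0.0.1",
  "description": "Forwards temperature readings from an MQTT broker into a flow",
  "triggers": [
    {
      "id": "mqtt_trigger",
      "ref": "github.com/TIBCOSoftware/flogo-contrib/trigger/mqtt",
      "settings": {
        "broker": "tcp://localhost:1883",
        "topic": "sensors/temperature"
      },
      "handlers": [
        { "actionId": "handle_reading" }
      ]
    }
  ],
  "actions": [
    {
      "id": "handle_reading",
      "ref": "github.com/TIBCOSoftware/flogo-contrib/action/flow",
      "data": {
        "flow": {
          "tasks": [
            {
              "id": "log_reading",
              "activityRef": "github.com/TIBCOSoftware/flogo-contrib/activity/log",
              "settings": { "message": "received sensor reading" }
            }
          ]
        }
      }
    }
  ]
}
```

Such a descriptor would typically be scaffolded and built into a self-contained binary with the Flogo CLI, which is what enables the zero-dependency footprint and fast startup described above.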

Overview, Architecture and Use Cases

The following slide deck shows an overview, architecture and use cases for Flogo:

[Embedded slide deck: Open Source Project Flogo – Overview, Architecture and Use Cases]

More Information

You can also watch the following 45min video where I walk you through these slides and also show some live demos and source code:

Flogo Live Demo and Source Code

If you just want to see the live demo, watch the following 15min video:


Any feedback or questions are highly appreciated. Please use the Community Q&A to ask whatever you want to know.

Kai Waehner
