The oil & gas and mining industries require edge computing for low-latency and zero-trust use cases. Most IT architectures are hybrid: big data analytics runs in the cloud, while safety-critical data processing happens in disconnected, often air-gapped environments. This blog post shares a panel discussion that explores the challenges, use cases, and hardware, software, and network technologies that reduce cost and enable innovation. A key focus is the open-source framework Apache Kafka, the de facto standard for processing data in motion at the edge and in the cloud.
This blog post explores low-latency data processing and edge computing with Apache Kafka, 5G telco networks, and cloud-native AWS Wavelength infrastructure. Learn about use cases and architectures across industries that combine mission-critical and analytics workloads, and see a concrete hybrid implementation for energy production and distribution, sketched in the example below.
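To give a feel for what a low-latency edge deployment can look like, here is a minimal sketch of a Kafka producer tuned for latency rather than throughput. The broker address, topic name, and tuning values are illustrative assumptions, not taken from the post itself:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class LowLatencyEdgeProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical broker running on edge infrastructure (e.g., an AWS Wavelength zone)
        props.put("bootstrap.servers", "edge-broker:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        // Send each record immediately instead of waiting to batch
        props.put("linger.ms", "0");
        // Leader-only acknowledgment trades some durability for lower latency
        props.put("acks", "1");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Hypothetical sensor reading from an energy-production site
            producer.send(new ProducerRecord<>("sensor-readings", "turbine-42", "voltage=231.7"));
        }
    }
}
```

The key trade-off here is `linger.ms=0` and `acks=1`: batching and full-quorum acknowledgment improve throughput and durability but add round trips, which low-latency edge use cases typically cannot afford.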
Apache Kafka has become the de facto standard for event streaming. Various vendors have added Kafka and related tooling to their offerings or provide a Kafka cloud service. This blog post uses the car analogy – from the motor engine to the self-driving car – to explore the different Kafka offerings available on the market. The goal is not a feature-by-feature comparison; instead, the intention is to explain the different deployment models, product strategies, and trade-offs of the available options.
Hybrid cloud architectures are the new black for most companies. A cloud-first strategy is obvious for many, but legacy infrastructure has to be maintained, integrated, and (maybe) replaced over time. Event streaming with the Apache Kafka ecosystem is a perfect technology for building hybrid replication in real time at scale, as the configuration sketch below illustrates.
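As one way to realize such hybrid replication, here is a minimal MirrorMaker 2 configuration sketch that continuously mirrors topics from an on-premises edge cluster into a cloud cluster. The cluster aliases, bootstrap addresses, and topic pattern are assumptions for illustration:

```properties
# Two clusters participate in replication: an on-premises edge cluster and a cloud cluster.
clusters = edge, cloud
edge.bootstrap.servers = kafka-edge:9092
cloud.bootstrap.servers = kafka-cloud:9092

# Replicate all topics one way, from the edge cluster to the cloud cluster.
edge->cloud.enabled = true
edge->cloud.topics = .*

# No replication back from the cloud to the edge.
cloud->edge.enabled = false
```

Started with Kafka's `connect-mirror-maker.sh` script, a worker with this configuration keeps the cloud cluster in sync with the edge cluster in near real time, so analytics workloads in the cloud can consume the same events that mission-critical applications process on premises.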