Kinetica in Motion
The Kinetica Blog
Manan Goel

“ETL is dead” – Five Big Statements from Kafka Summit


This week, over 500 technologists – from enterprises such as BNY Mellon, Goldman Sachs, Google, ING, Target and many more – came together at Kafka Summit to learn and share tips on working with, and getting value from, high-speed data. At the event, it was evident that interest in instantaneous insights is growing and that real-time apps are driving demand for streaming platforms. Topics covered included low-latency ingest, processing, and egress of continuous, fast-moving data.

Here are five statements heard from the leaders of the streaming world:

It’s not just a messaging queue anymore: Kafka is evolving into a full streaming platform

Jay Kreps, the CEO and Co-founder of Confluent and original developer of Kafka, kicked off the summit with his keynote titled, “The Rise of the Streaming Platform.” Kafka is seeing strong traction and is in use at over a third of the Fortune 500 companies, in sectors such as travel, retail, and banking. Organizations are turning to Kafka for three main use cases: messaging queues, Hadoop made fast, and fast ETL and scalable data integration. However, as Jay pointed out, with capabilities such as enterprise scale, true storage with replication and failover, and real-time stream processing, Kafka is evolving from a messaging solution into a streaming platform that addresses use cases such as publish and subscribe, stream storage and processing, and real-time business apps.

Streaming is moving to the cloud

Neha Narkhede, the CTO and Co-founder of Confluent and original author of Kafka, unveiled the new Confluent Cloud, a managed service offering for Apache Kafka in AWS that lets teams quickly build data pipelines and ingest, process, and egress streaming data. Kafka can stream fast-moving data in and out of Kinetica’s GPU database, while Kinetica can manage, enrich, analyze, and visualize that data to build real-time data and analytics applications such as just-in-time inventory, recommendation engines, and financial risk and compliance.

“ETL is dead…. long live streams”

“ETL is ready to retire as soon as streams get out of puberty,” said Fred Scheepers, the Chief Enterprise Architect at ING. ING has been using Kafka to build a streaming platform that ingests customer interaction data, finds relevant data points, and applies machine learning models to deliver relevant, personal, real-time, and actionable customer experiences. He demoed a feature of ING’s mobile app that automatically enables debit card use as customers travel internationally. As to when streams will completely replace ETL, Fred observed that Kafka still needs to catch up in areas such as metadata, governance, and lineage to see broader adoption.

Streaming is on the rise

The results from a survey of the Kafka community, covering 350 organizations in 47 countries, point to Kafka use cases broadening beyond the traditional publish/subscribe pattern. Kafka is being used for building data pipelines, microservices, IoT, real-time monitoring, data integration, messaging, and log aggregation. For more details, read Luanne’s blog or download the full report.

Growing impact from Kafka Connect and Streams API

These additional Kafka features are driving adoption and popularity, with the Connect API providing access to new data sources and the Streams API providing stream processing. Organizations use the Connect API to attach applications, databases, Hadoop, websites, sensors, and devices as sources or sinks for their Kafka cluster. The Streams API is powering use cases for business applications, ETL, analytics, recommendation engines, and IoT.
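To make the Streams idea concrete, here is a minimal conceptual sketch in plain Python (no Kafka dependency; the topic names and sample records are illustrative, and real Kafka topics are partitioned, replicated logs rather than in-memory lists). It mimics the canonical Streams word-count pattern: consume records from a source topic, split and aggregate them, and emit updated counts to a sink topic.

```python
from collections import Counter, defaultdict

# Simulated topics: in real Kafka, these would be durable, partitioned logs.
topics = defaultdict(list)
topics["text-input"] = [
    "streaming data is fast",
    "fast data needs streaming",
]

def word_count_stream(source_topic, sink_topic):
    """Consume records from source_topic, split each into words,
    maintain running counts, and emit (word, count) updates to sink_topic."""
    counts = Counter()
    for record in topics[source_topic]:
        for word in record.split():
            counts[word] += 1
            topics[sink_topic].append((word, counts[word]))

word_count_stream("text-input", "word-counts")
```

After running, `topics["word-counts"]` holds a changelog-style stream of updated counts, e.g. ending with `("streaming", 2)`. The actual Streams API expresses the same pipeline declaratively in Java (`StreamsBuilder`, `groupBy`, `count`) and handles partitioning and fault tolerance for you.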

It’s an exciting time for streaming data. At Kinetica, we’re starting to see the full potential of what happens when streaming data is fed into a database that can both absorb it and provide insights on data moving at speed. Through our partnership with Confluent, Kinetica’s Kafka connector provides bi-directional read and write connectivity that can power a wide variety of real-time applications, such as IoT, recommendation engines, and ad targeting.

Listen also…

You might be interested in Manan’s interview with Bloor Research for Kafka Summit
