In a compelling keynote at the Current 2025 event in London, Confluent’s CEO Jay Kreps shared his vision for the future of data systems, highlighting how real-time data streaming is becoming foundational in the AI-driven enterprise era. He emphasized three core themes: unifying batch and stream processing, the emergence of data-intensive applications, and the evolving architecture required to support real-time AI and agent-based systems.
Kreps began with a personal note on two of his favorite topics: data streaming and AI. He argued that in today’s world, businesses are increasingly run by software rather than merely supported by it. The rise of autonomous software systems handling customer interactions, supply chains, and core operations underscores the growing importance of real-time feedback loops within organizations.
Kreps traced the evolution from siloed, UI-centric applications to the modern interconnected architecture dominated by analytics, warehousing, and now, streaming systems. The key transformation, he noted, is the continuous nature of software operations and decision-making, pushing enterprises to rethink infrastructure from being storage-centric to stream-oriented.
The real challenge, he explained, lies in building, testing, and iterating on data-intensive applications that must work with live, evolving datasets. Traditional batch systems, while useful for testing, lack real-time adaptability. On the other hand, REST-based systems, while scalable, aren’t optimized for iterative development. Streaming offers the best of both worlds—real-time responsiveness and data richness.
Kreps introduced the concept of “table-stream duality,” showing how streaming data and stateful tables are interchangeable and complementary. Using practical analogies like real-time census tracking and commit logs in databases, he illustrated how combining Kafka with Delta/Iceberg tables and Flink enables a unified platform for streaming and analytics.
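The duality Kreps described can be sketched in a few lines of plain Python: a table is what you get by replaying a changelog stream of updates, and a stream is what you get by observing a table's changes over time. This is a minimal illustration of the idea, not Confluent's implementation; the function names and the census-style data are assumptions for the example.

```python
def apply_changelog(events):
    """Replay a stream of (key, value) updates to materialize the current table."""
    table = {}
    for key, value in events:
        if value is None:
            table.pop(key, None)   # a None value acts as a delete (tombstone)
        else:
            table[key] = value     # later events overwrite earlier ones per key
    return table

def to_changelog(table):
    """The reverse view: a table is equivalent to the latest event per key."""
    return list(table.items())

# A stream of population updates, echoing the real-time census analogy:
events = [("london", 8_900_000), ("paris", 2_100_000), ("london", 9_000_000)]
table = apply_changelog(events)
# table now holds only the latest value per key:
# {"london": 9_000_000, "paris": 2_100_000}
```

In Kafka terms, `events` plays the role of a compacted topic and `table` the materialized state a stream processor such as Flink maintains; replaying the log always reconstructs the same table, which is what makes the two representations interchangeable.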
The unification of these technologies through Confluent’s Tableflow creates a seamless loop: developers can iterate on data and models, integrate AI, and deploy to production without re-architecting. The result is a robust foundation for building the next generation of intelligent, real-time applications.
“This isn’t just about improving data pipelines. It’s about transforming how we build, operate, and innovate with software,” Kreps concluded.
Earlier in the day, Confluent’s Chief Product Officer, Shaun Clowes, used his keynote to emphasize the urgent need to unify operational and analytical data systems to meet the demands of modern, real-time, data-intensive applications, especially in the AI era. “Traditional pipelines are reaching a breaking point as businesses require immediate, accurate insights to drive automation and intelligent decision-making. From retail to healthcare and logistics, streaming data is becoming the backbone of responsive, AI-powered operations,” he said.