ChannelLife UK - Industry insider news for technology resellers

Confluent unveils Tableflow for real-time analytics with Delta Lake

Thu, 30th Oct 2025

Confluent has announced the general availability of integrations with Delta Lake and Databricks Unity Catalog for its Tableflow offering, alongside early access on Microsoft OneLake.

These developments provide new capabilities for enterprises seeking to connect operational, analytical, and artificial intelligence systems by transforming streaming data into governed, analytics-ready tables across hybrid and multicloud environments.

Integrations now generally available

The general availability of Delta Lake and Databricks Unity Catalog integrations within Tableflow allows Apache Kafka topics to be brought directly into Delta Lake or Apache Iceberg tables. This includes automated quality controls, catalogue synchronisation, and security features suitable for enterprise environments.

Tableflow, which is aimed at reducing the reliance on brittle ETL processes and manual integrations, expands Confluent's reach into multicloud deployments. The new features are designed to unify real-time and analytical data management under a single governance model for organisations operating at scale.

"Customers want to do more with their real-time data, but the friction between streaming and analytics has always slowed them down," said Shaun Clowes, Chief Product Officer at Confluent. "With Tableflow, we're closing that gap and making it easy to connect Kafka directly to governed lakehouses. That means high-quality data ready for analytics and AI the moment it's created."

Enterprise scale features

The release introduces several enterprise-focused capabilities. Customers are able to convert Kafka topics directly into Delta Lake tables stored in cloud object storage such as Amazon S3 or Azure Data Lake Storage. For flexible, cross-format analytics, both Delta Lake and Iceberg formats can be enabled per topic.

Integration with Unity Catalog allows for the automatic synchronisation of metadata, schemas, and access policies, providing organisations with centralised governance and consistent data management. A Dead Letter Queue feature has been added, enabling malformed records to be isolated and managed without halting the data flow, a design intended to improve transparency and recovery.
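The Dead Letter Queue pattern described above can be sketched as follows. This is a conceptual illustration of the general technique, not Confluent's implementation; the record format and the required `id` field are assumptions made for the example.

```python
import json

def process_stream(raw_records):
    """Parse each raw record; valid rows go to the table, bad ones to the DLQ."""
    table_rows, dead_letters = [], []
    for raw in raw_records:
        try:
            row = json.loads(raw)           # parsing may fail on malformed input
            if "id" not in row:             # minimal schema check (assumed field)
                raise ValueError("missing required field 'id'")
            table_rows.append(row)
        except (json.JSONDecodeError, ValueError) as err:
            # Isolate the bad record alongside its error for inspection or replay,
            # instead of stopping the whole pipeline.
            dead_letters.append({"raw": raw, "error": str(err)})
    return table_rows, dead_letters

rows, dlq = process_stream(['{"id": 1}', "not-json", '{"id": 2}', '{}'])
```

Here the two malformed inputs end up in `dlq` with their error messages, while the two well-formed records continue downstream, which is the transparency-and-recovery behaviour the feature aims at.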

Tableflow's upsert functionality keeps datasets current and consistent by automatically updating or inserting records as data changes, eliminating the manual maintenance otherwise needed to keep Delta Lake and Iceberg tables deduplicated and analytics-ready.
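Upsert semantics can be illustrated with a minimal sketch: for each key, the latest record wins, so the materialised table stays deduplicated without manual compaction. The dictionary-backed table and the `id` key below are assumptions for illustration only.

```python
def upsert(table, records, key="id"):
    """Apply a batch of change records to a keyed table, in arrival order."""
    for rec in records:
        table[rec[key]] = rec  # insert new keys, overwrite existing ones
    return table

table = {}
upsert(table, [
    {"id": 1, "status": "new"},
    {"id": 2, "status": "new"},
    {"id": 1, "status": "shipped"},  # later record for id=1 replaces the earlier one
])
```

After the batch, the table holds exactly one row per key, with `id=1` reflecting the most recent change, which is what keeps downstream analytics queries free of stale duplicates.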

Security measures have also been enhanced. The Bring Your Own Key feature allows customers to use their own encryption keys with Tableflow, giving them greater control over data at rest, a requirement in regulated industries such as financial services, healthcare, and the public sector.

These features build on existing capabilities, including schema evolution, compaction, automated table maintenance, and integrations with Apache Iceberg, AWS Glue, and Snowflake Open Catalog.

David Kinney, Principal Solutions Architect at Attune, outlined the impact on their operations:

"At Attune, delivering real-time insights from smart building Internet of Things (IoT) data is central to our mission. With just a few clicks, Confluent Tableflow lets us materialize key Kafka topics into trusted, analytics-ready tables, giving us accurate visibility into customer engagement and device behavior. These high-quality datasets now power analytics, machine learning (ML) models, and generative AI applications, all built on a reliable data foundation. Tableflow has simplified our data architecture while opening new possibilities for how we leverage data."

Expansion to Microsoft OneLake

Tableflow is now available for early access on Azure, integrated with Microsoft OneLake. This expansion enables organisations to materialise Kafka topics as open tables in Microsoft OneLake, allowing queries from Microsoft Fabric or third-party tools without requiring manual ETL work or schema management. The aim is to accelerate time-to-insight and reduce operational complexity for customers working in Azure-native analytics environments.

The service automates schema mapping, type conversion, and table maintenance, and also supports streamlined integration with Azure's analytics and AI services. This is managed via the Confluent Cloud user interface, command-line interface, or Terraform, offering choice for deployment workflows.
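The schema-mapping and type-conversion step the service automates can be sketched conceptually. The mapping table below is an assumption for illustration; the actual type names Tableflow uses are managed by the service itself.

```python
# Hypothetical source-to-table type mapping, for illustration only.
SOURCE_TO_TABLE_TYPE = {
    "string": "STRING",
    "int": "INT",
    "long": "BIGINT",
    "double": "DOUBLE",
    "boolean": "BOOLEAN",
}

def map_schema(source_fields):
    """Translate a {field: source_type} schema into {field: table_type}."""
    return {name: SOURCE_TO_TABLE_TYPE[t] for name, t in source_fields.items()}

# Example: an IoT-style topic schema mapped to table column types.
cols = map_schema({"device_id": "string", "reading": "double", "ts": "long"})
```

Automating this translation per topic is what removes the manual schema management the article refers to: when a topic's schema evolves, the corresponding table columns can be derived rather than hand-maintained.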

Dipti Borkar, Vice President and GM of Microsoft OneLake and ISV Ecosystem, described the release as a key step for real-time analytics:

"Access to real-time data is critical for customers to make fast and accurate decisions. With Confluent's Tableflow now available on Microsoft Azure, customers can stream Kafka events to OneLake as Apache Iceberg or Delta Lake tables and query them instantly via Microsoft Fabric and popular 3rd party engines using OneLake Table APIs, cutting complexity and speeding up decisions."

The release of these integrations is positioned as an important element in Confluent's ongoing partnership with both Microsoft and Databricks. The company aims to provide organisations with a unified, governed analytics platform for real-time data, catering for regulatory requirements and enabling streamlined data operations.
