Techitup Middle East

Confluent, Databricks Partner to Usher in New Age of Real-Time AI

Confluent, Inc. and Databricks have announced a major expansion of their partnership that brings together Confluent’s complete Data Streaming Platform and Databricks’ Data Intelligence Platform to give enterprises real-time data for AI-driven decision-making. New integrations between Confluent’s Tableflow and Databricks’ Unity Catalog will govern data seamlessly across operational and analytical systems, allowing businesses to build AI applications more efficiently.

“Real-time data is the fuel for AI,” said Jay Kreps, Co-founder and CEO, Confluent. “But too often, enterprises are held back by disconnected systems that fail to deliver the data they need, in the format they need, at the moment they need it. Together with Databricks, we’re ensuring businesses can harness the power of real-time data to build sophisticated AI-driven applications for their most critical use cases.”

“For companies to maximize returns on their AI investments, they need their data, AI, analytics and governance all in one place,” said Ali Ghodsi, Co-founder and CEO, Databricks. “As we help more organizations build data intelligence, trusted enterprise data sits at the center. We are excited that Confluent has embraced Unity Catalog and Delta Lake as its open governance and storage solutions of choice, and we look forward to working together to deliver long-term value for our customers.”

Real-time Data, Ready for AI

To bridge the data divide, Confluent and Databricks are announcing new integrations that ensure real-time interoperability and empower teams across the business to collaborate successfully. A bidirectional integration between Confluent’s Tableflow, with Delta Lake support, and Databricks’ Unity Catalog (a unified and open governance solution for data and AI) will provide consistent, real-time data across operational and analytical systems that is discoverable, secure, and trustworthy.

Delta Lake, an open-format storage layer pioneered by Databricks, was originally developed for streaming use cases with fast writes. It has become the most widely adopted lakehouse format, proven at massive scale, processing over 10 exabytes of data daily. Now, Tableflow with Delta Lake makes operational data immediately available to Delta Lake’s rich ecosystem. Confluent and Databricks customers will be able to bring any engine or AI tool, such as Apache Spark, Trino, Polars, DuckDB, and Daft, to their data in Unity Catalog.
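Tableflow’s internals are not described in this announcement, but the idea it automates, continuously materializing a keyed stream of events into a table that analytical engines can query, can be illustrated with a minimal, purely conceptual Python sketch. All names and the event shape below are illustrative assumptions, not Confluent’s API.

```python
# Conceptual sketch only: folding a keyed stream of change events into a
# table snapshot, the kind of topic-to-table materialization a service
# like Tableflow automates. Event shape and names are illustrative.

def materialize(events):
    """Fold an ordered stream of upsert/delete events into a table snapshot.

    Each event is a dict: {"key": ..., "op": "upsert" | "delete", "value": ...}
    Later events for the same key supersede earlier ones.
    """
    table = {}
    for event in events:
        if event["op"] == "upsert":
            table[event["key"]] = event["value"]
        elif event["op"] == "delete":
            table.pop(event["key"], None)
    return table

# A toy "orders" topic: the snapshot reflects only the latest state per key.
stream = [
    {"key": "order-1", "op": "upsert", "value": {"status": "created"}},
    {"key": "order-2", "op": "upsert", "value": {"status": "created"}},
    {"key": "order-1", "op": "upsert", "value": {"status": "shipped"}},
    {"key": "order-2", "op": "delete", "value": None},
]

snapshot = materialize(stream)
# snapshot now holds one row per surviving key, e.g. order-1 is "shipped"
```

In the real integration this materialization happens continuously and the resulting Delta Lake tables are registered in Unity Catalog, where any of the engines named above can read them.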

Further, custom integrations between Tableflow and Databricks’ Unity Catalog will ensure that metadata, a critical enabler for AI applications, is automatically applied to data exchanged between the platforms. This makes operational data discoverable and actionable for data scientists and analysts working in Databricks, while ensuring analytical data is equally accessible and useful for application developers and streaming engineers in Confluent. Additionally, Confluent’s Stream Governance suite will provide upstream governance and metadata to enhance fine-grained governance, end-to-end stream lineage, and automated data quality monitoring in Unity Catalog.
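The announcement does not specify how schema metadata is mapped between the platforms. As a rough illustration of the kind of translation such an integration would automate, here is a toy sketch that maps an Avro-like stream schema (the format commonly used with Confluent’s Schema Registry) to table-column metadata; the type table and field names are assumptions for illustration only.

```python
# Illustrative only: translating an Avro-like record schema into
# table-column metadata, the kind of mapping a Tableflow / Unity Catalog
# integration would automate. The type table below is an assumption.

AVRO_TO_SQL = {
    "string": "STRING",
    "int": "INT",
    "long": "BIGINT",
    "double": "DOUBLE",
    "boolean": "BOOLEAN",
}

def schema_to_columns(avro_schema):
    """Map an Avro record schema to a list of (name, sql_type) columns."""
    return [
        (field["name"], AVRO_TO_SQL.get(field["type"], "STRING"))
        for field in avro_schema["fields"]
    ]

# A hypothetical "Order" record schema attached to a stream.
orders_schema = {
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "amount", "type": "double"},
        {"name": "fulfilled", "type": "boolean"},
    ],
}

columns = schema_to_columns(orders_schema)
```

Propagating this kind of metadata automatically is what makes the same data discoverable to analysts in Databricks and to streaming engineers in Confluent without manual re-declaration.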

With these new capabilities, operational data from Confluent becomes a first-class citizen in Databricks, and Databricks data is easily accessible by any processor in the enterprise. The streaming data topics that AI applications consume and the tables that data analysts use will now have consistent views of the same real-time data, enabling faster, smarter AI-driven decision-making across the organization. This seamless integration between enterprise applications, analytics, and governance is critical for AI innovation at scale.
