14 May 2024

DeltaStream Joins the Connect with Confluent Partner Program

We’re excited to share that DeltaStream has joined the Connect with Confluent technology partner program.

Why this partnership matters

Confluent is a leader in streaming data technology, used widely across the industry. This collaboration enables organizations to process and organize their Confluent Cloud data streams easily and efficiently from within DeltaStream. It breaks down silos and opens up powerful insights into your streaming data, the way it should be.

Build real-time streaming applications with DeltaStream

DeltaStream is a fully managed stream processing platform that enables users to deploy streaming applications in minutes using simple SQL statements. By integrating with Confluent Cloud and other streaming storage systems, DeltaStream users can easily process and organize their streaming data in Confluent or wherever else their data may live. Because DeltaStream is powered by Apache Flink, users get Flink's processing capabilities without the operational overhead that comes with running it.
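
As a sketch of what those SQL statements can look like, the example below declares a stream over an existing topic and continuously filters it into a new one. The stream, column, and property names here are illustrative assumptions, not taken from an actual deployment:

```sql
-- Declare a stream over an existing topic (names and properties are hypothetical).
CREATE STREAM pageviews (
  viewtime BIGINT,
  userid   VARCHAR,
  pageid   VARCHAR
) WITH ('topic' = 'pageviews', 'value.format' = 'json');

-- A continuous query that filters the stream into a new one, backed by its own topic.
CREATE STREAM pageviews_home AS
  SELECT viewtime, userid
  FROM pageviews
  WHERE pageid = 'home';
```

Once deployed, a query like this runs continuously as a managed Flink job, with no cluster for the user to size or operate.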

Unified view over multiple streaming stores

DeltaStream enables you to have a single view into all your streaming data across all your streaming stores. Whether you are using one Kafka cluster, multiple Kafka clusters, or multiple platforms like Kinesis and Confluent, DeltaStream provides a unified view of your streaming data, and you can write queries against these streams regardless of where they are stored.

Break down silos with secure sharing

With namespacing, storage abstraction, and role-based access control, DeltaStream breaks down silos for your streaming data and enables you to share it securely across multiple teams in your organization. With all your Confluent data connected to DeltaStream, data governance becomes easy and manageable.

How to configure the Confluent connector

While we have always supported integration with Kafka and continue to do so, we have now simplified the process for integrating with Confluent Cloud by adding a specific “Confluent” Store type. To configure access to Confluent Cloud within DeltaStream, users can simply choose “Confluent” as the Store type while defining their Store. Once the Store is defined, users will be able to share, process, and govern their Confluent Cloud and other streaming data within DeltaStream.
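
For illustration, defining such a Store in SQL might look like the sketch below. The parameter names and placeholder values are assumptions for illustration, not the authoritative syntax; the tutorial covers the exact steps:

```sql
-- Hypothetical sketch: parameter names, endpoint, and credentials are placeholders.
CREATE STORE confluent_cloud WITH (
  'type' = CONFLUENT,
  'uris' = 'pkc-12345.us-east-1.aws.confluent.cloud:9092',
  'kafka.sasl.username' = '<api-key>',
  'kafka.sasl.password' = '<api-secret>'
);
```

With the Store in place, its topics can be queried like any other DeltaStream relation.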

To learn how to create a Confluent Cloud Store, either follow this tutorial or watch the video below.

Getting Started

To get started with DeltaStream, schedule a demo with us or sign up for a free trial. You can also learn more about the latest features and use cases on our blogs page.

21 Dec 2023

DeltaStream: A Year of Innovation and Growth

2023 has been a remarkable year for DeltaStream. Filled with innovation, growth, and resilience, it has been nothing short of transformative: a year of significant advancements in our mission to make stream processing accessible and powerful for everyone.

We wanted to take a moment to highlight the key achievements from this year that have propelled us forward:

Expanding Expertise

Previously focused solely on engineering, DeltaStream welcomed a dedicated Go-To-Market team in 2023, leading to increased content creation through engaging blog posts, webinars, and video tutorials. This expansion reflects our commitment to broader engagement and community building through education.

Building a Robust Platform

2023 saw the release of many exciting capabilities by DeltaStream. We launched a public SaaS offering, where both control and data planes reside within DeltaStream’s VPC. While this solution serves many, we recognize the need for stricter data management. In response, we built our private SaaS offering, Bring Your Own Cloud (BYOC). With this option, DeltaStream runs the data plane within the user’s own VPC, ensuring data sovereignty. Announced in August 2023, BYOC has received enthusiastic feedback from customers and prospects.

Deepening Ecosystem Integration

Through user feedback, we identified Snowflake and Databricks as popular destinations for streaming data. We responded by establishing seamless integrations within DeltaStream, allowing users to build streaming pipelines and materialize results in either platform.

Open Source Contributions

Recognizing the power of collaboration, we open-sourced our Snowflake connector for Apache Flink. This connector facilitates native integration between other data sources and Snowflake. Open sourcing this connector aligns with our vision of providing a unified view over all data and making stream processing possible for any product use case.

Streamlining Change Data Capture

DeltaStream now supports Change Data Capture (CDC) for PostgreSQL, enabling real-time data integration and consistent updates across systems.

Free Trial Availability

We understand the importance of hands-on experience. This year, we launched a free trial that allows users to explore DeltaStream’s features and functionalities firsthand. Additionally, we launched a click-through experience so users can experience our best-in-class UI without registration.

Achieving SOC 2 Compliance

Demonstrating our commitment to security, DeltaStream achieved SOC 2 Type II compliance in late Q2 2023. This certification was important to us as a way to show our commitment to protecting customer data, ensuring operational excellence, and maintaining a secure environment.

Engaging the Community

We actively participated in various conferences and events, including Current 2023, Flink Forward, and numerous Data Infrastructure gatherings. We sponsored Current 2023, hosted networking events, presented a session on securing streaming data, and showcased DeltaStream to the broader community.

Looking Ahead

2023 has been a year of tremendous growth and innovation for DeltaStream. We have achieved significant milestones and established ourselves as a leading provider of stream processing solutions. As we look forward to 2024 and beyond, we are committed to continuous development and community engagement. We believe that stream processing has the power to revolutionize data-driven decision-making, and we are excited to be at the forefront of this transformation.

Thank you to our customers, community, and partners for your unwavering support. Together, we are making stream processing a reality for organizations of all sizes.

07 Nov 2023

Open Sourcing our Apache Flink + Snowflake Connector

At DeltaStream, our mission is to bring a serverless and unified view of all streams to make stream processing possible for any product use case. By using Apache Flink as our underlying processing engine, we have been able to leverage its rich connector ecosystem to connect to many different data systems, breaking down the barriers of siloed data. As we mentioned in our Building Upon Apache Flink for Better Stream Processing article, for DeltaStream, using Apache Flink is about more than adopting robust software with a good track record: it has allowed us to iterate faster on improvements and issues that arise from solving the latest and greatest data engineering challenges. However, one connector that was missing until today was the Snowflake connector.

Today, in our efforts to make solving data challenges possible, we are open sourcing our Apache Flink sink connector built for writing data to Snowflake. This connector has already provided DeltaStream with native integration between other sources of data and Snowflake. This also aligns well with our vision of providing a unified view over all data, and we want to open this project up for public use and contribution so that others in the Flink community can benefit from this connector as well.

The open source repository will be open for any contributions, suggestions, or discussions. In this article, we touch on some of the highlights around this new Flink connector.

Utilizing the Snowflake Sink

The Flink connector is built on the latest Flink Sink<InputT> and SinkWriter<InputT> interfaces, which are used to construct the Snowflake sink and to write data to a configurable Snowflake table, respectively:

Diagram 1: Each SnowflakeSinkWriter inserts rows into the Snowflake table using its own dedicated ingest channel

The Snowflake sink connector can be configured with a parallelism greater than 1, where each task preserves the order in which it receives data from its upstream operator. For example, the following shows how data can be written with a parallelism of 3:

  dataStream.sinkTo(snowflakeSink).setParallelism(3); // sinkTo() takes the SnowflakeSink<InputT>, not the writer

Diagram 1 shows the flow of data between TaskManager(s) and the destination Snowflake table. The diagram is heavily simplified to focus on the concrete SnowflakeSinkWriter<InputT>, and it shows that each sink task connects to its Snowflake table using a dedicated SnowflakeStreamingIngestChannel from Snowpipe Streaming APIs.

The SnowflakeSink<InputT> also ships with a generic SnowflakeRowSerializationSchema<T> interface that allows each implementation of the sink to provide its own concrete serialization of records into a Snowflake row, represented as a Map<String, Object>, based on a given use case.

Write Records At Least Once

The first version of the Snowflake sink can write data into Snowflake tables with the delivery guarantee of NONE or AT_LEAST_ONCE, using AT_LEAST_ONCE by default. Supporting EXACTLY_ONCE semantics is a goal for a future version of this connector.

The sink writes data into its destination table after buffering records for a fixed time interval. This buffering time interval is also bounded by Flink’s checkpointing interval, which is configured as part of the StreamExecutionEnvironment. In other words, if Flink’s checkpointing interval and buffering time are configured to different values, then records are flushed at the shorter of the two intervals:

  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  // Checkpoint every 100 ms; records are flushed at least this often.
  env.enableCheckpointing(100L);
  // Buffer records for up to 1 second before writing to Snowflake.
  SnowflakeSink<Row> sf_sink = SnowflakeSink.<Row>builder()
      .bufferTimeMillis(1000L)
      .build(jobId);
  env.fromSequence(1, 10).map(new SfRowMapFunction()).sinkTo(sf_sink);
  env.execute();

In this example, the checkpointing interval is set to 100 milliseconds and the buffering interval is configured as 1 second, which tells the Flink job to flush records at least every 100 milliseconds, i.e., on every checkpoint.

Read more about Snowpipe Streaming best practices in the Snowflake documentation.

The Flink Community, to Infinity and Beyond

We are very excited about the opportunity to contribute our Snowflake connector to the Flink community. We’re hoping that this connector will add more value to the rich connector ecosystem of Flink that’s powering many data application use cases.

If you want to check out the connector for yourself, head over to the GitHub repository, or if you want to learn more about DeltaStream’s integration with Snowflake, read our Snowflake integration blog.

01 Aug 2023

Private SaaS General Availability and Free Trial Opportunity for DeltaStream

Throughout the first half of 2023, DeltaStream has made strides exiting stealth and emerging in the real-time streaming market, with the general availability of Private SaaS, also known as “Bring Your Own Cloud” (BYOC), and a free trial version of our platform. With the rising volume of real-time data use, streaming data customers can increase their data processing efficiency to better address mission-critical business challenges.

The general availability of DeltaStream with Private SaaS offers a significant advantage to our customers to offload operations while ensuring security of their data. This move will provide our customers with the opportunity to scale their own infrastructure while maintaining data security requirements. DeltaStream will operate the data plane within a customer’s VPC and will manage all aspects of the operations.

“It has been a great year so far for DeltaStream,” Hojjat Jafarpour, founder of DeltaStream said, “we’ve been engaging with the data streaming community and showing how DeltaStream can improve their real-time data experience with our free trial.” As for BYOC, Jafarpour had this to say, “We believe a Private SaaS option is vital for our customers moving forward, security and efficiency is a top priority for companies handling data and it is a top priority for us.”

Key H1 Announcements:

  • Launch of New User Interface
    • In Q2 DeltaStream unveiled an improved UI for the DeltaStream platform. Working with a decorated UI designer, the updated interface prioritizes human-centric design and usability.
  • SOC 2 Compliance
    • DeltaStream achieved SOC 2 Type II compliance late Q2 2023. This third-party industry validation demonstrates DeltaStream’s commitment to security.
  • Attendance at RTA Summit, Databricks Summit
    • Throughout 2023, DeltaStream has attended industry shows and conferences, including sponsoring the RTA Summit – meeting data enthusiasts, learning from speakers, and showing off the DeltaStream platform.
  • Hosted First Webinar
    • In June, DeltaStream successfully hosted its first webinar, discussing Streaming Analytics and Streaming Databases. Hojjat Jafarpour, founder of DeltaStream and creator of KSQL, hosted the event to an engaged audience.
  • Launch of New Website
    • Following its first marketing hire in early Q1, DeltaStream unveiled a new website with an ongoing series of product content and an updated look and feel.

20 Jun 2023

DeltaStream announces SOC 2 Type II Compliance

DeltaStream is excited to announce today that we have achieved SOC 2 Type II compliance in accordance with the American Institute of Certified Public Accountants (AICPA) standards for SOC for Service Organizations. Achieving this standard with an unqualified opinion serves as third-party industry validation that DeltaStream Inc provides enterprise-level security for customers’ data secured in our system.

Achieving SOC 2 compliance shows our commitment to protecting customer data, ensuring operational excellence, and maintaining a secure environment. To current and future customers – we are committed to managing data with the highest standard of security and compliance.

See our security page for more.

07 Mar 2023

Introducing DeltaStream 2023

I created DeltaStream because I saw the complexity, lack of features, and cost of existing stream processing solutions, and knew there was a better way.

For the last 2 years we’ve been working diligently to bring DeltaStream to life.

DeltaStream is a unified, serverless stream processing platform, based on Apache Flink, to manage, secure, and process all your event streams. We built DeltaStream to provide a comprehensive stream processing platform that is easy to use, easy to operate, and scales automatically.

Given the current economic environment, the buyer increasingly is demanding solutions that solve key business problems and provide a tangible ROI without runaway costs. DeltaStream does this. I see the opportunities for DeltaStream to enable businesses to quickly derive insights from real-time data, while alleviating the engineering toil of managing infrastructure.

Our Platform

Unified: Addressing the fragmented stream processing space.

We built a platform that:

  • Works across multiple data streaming stores (e.g., Apache Kafka and Amazon Kinesis).
  • Works across multiple Kafka clusters or Kinesis data streams.
  • Is complete – it enables both stateless and stateful stream processing, from joins to materialized views.

Serverless and Scale: Automating infrastructure and scale

We made a platform that is easy to use, scale, and maintain:

  • Zero operations – Users can get started in minutes and deploy apps on day 1.
  • Multi-tenancy – With query isolation, scaling existing use cases or onboarding new ones is no longer a concern.
  • Scale on demand – Our serverless architecture automatically scales up/down as needed.

Secure: Access and Sharing

We developed a platform with a security story for “Data in Motion” that is on par with “Data at Rest”:

  • Access Control – Our RBAC allows granular control over event streams.
  • Namespacing – Create logical boundaries to organize and manage your real-time data.
  • Sharing and Governance – Leveraging RBAC + namespacing to securely share event streams in real-time, not only within the organization but also with 3rd-party users.

Our team continues to grow and further expand our platform. We’re excited by what we can accomplish in 2023 and beyond. If you’re going to be at RTA Summit in April, I hope you’ll come see me speak or find time to meet.

Please reach out to us for a demo of DeltaStream; I am excited to share it with you.

11 Jul 2022

Introducing DeltaStream and announcing our $10M seed funding and private beta availability.

Imagine you are a financial institution receiving a stream of credit card transactions that your customers are performing anytime, anywhere. You need to process the transactions, detect whether any of them is fraudulent, and if so block the fraudulent transaction. Timeliness of such processing is essential: you cannot rely on nightly or even hourly jobs to perform the processing and detect the fraud. By the time a periodic batch processing job detects a fraudulent transaction, the transaction has been approved and your institution has suffered the loss. However, by employing stream processing, you can detect and respond to such fraud scenarios with sub-second latency and prevent substantial financial loss for your institution. This is just one example of how accessing fresh, low-latency data can provide a huge competitive advantage to enterprises. You can see similar use cases in areas like banking, Internet of Things, retail, IT, gaming, health care, manufacturing, and many more. Customers and users demand low-latency services, and organizations that provide such services gain a significant competitive advantage.

Stream processing, then and now

When I joined Confluent and started the ksqlDB (formerly known as KSQL) project, streaming storage platforms such as Apache Kafka were mainly used by tech-savvy Silicon Valley companies. Most data was at rest, and people were reluctant to deal with the complexity of introducing streaming into their architecture. Fast forward six years to 2022, and event streaming systems such as Apache Kafka are one of the main components of modern data infrastructure. Many enterprises have adopted, or are in the process of adopting, event streaming platforms as the central nervous system of their data infrastructure. Furthermore, the availability of such systems as managed services in the cloud has made adoption even more compelling. Confluent Cloud, AWS Kinesis, Azure Event Hubs, and GCP Pub/Sub are just a few of the streaming storage services available in the cloud.

Adoption of these streaming storage services has also resulted in many applications being built on top of them. Such applications enable processing and reacting to events with sub-second latency, which in turn results in enormous financial gains for enterprises. An online retail business can increase its revenue by analyzing customer behavior in real-time and recommending the right products while the user is shopping online. Such analysis cannot be done in batch mode, since by the time the result is available, the customer has left the online store. Recommendation applications, along with many others such as streaming pipelines, anomaly detection, customer 360, clickstream analysis, inventory logistics, and log aggregation, are just a few of the applications built on top of platforms like Apache Kafka. However, building real-time streaming applications has been a challenging endeavor, requiring teams of developers highly skilled in distributed systems and data management. Delivery guarantees, fault tolerance, elasticity, and security are just a few of the many challenges that put real-time streaming applications out of reach for many organizations. Furthermore, even if organizations overcome these challenges and build real-time applications, operating them 24/7 in a reliable, secure, and scalable way is a huge burden on their teams.

Enter DeltaStream

DeltaStream is a serverless stream processing platform to manage, secure, and process all your streams in the cloud. We built DeltaStream to take away the complexity of building and operating scalable, secure, and reliable real-time streaming applications and make it as easy, fast, and accessible as possible. To achieve this goal, we are bringing the tried-and-true benefits of relational data management to the streaming world. Relational databases have successfully been used to manage and process data at rest for the last few decades and have played a crucial role in democratizing access to data in organizations. In addition to processing capabilities, these systems also provide capabilities to organize and secure data in a familiar way. With DeltaStream, we bring not just familiar processing capabilities to the streaming world, but also similar ways to manage and secure streaming data, a unique differentiating capability of DeltaStream.

The following are some of the principles DeltaStream is built upon that make it a unique service:

  • DeltaStream is serverless: this means as a developer, data engineer or anyone who interacts with real-time streaming data, you don’t have to provision, scale or maintain servers or clusters to build and run your real-time applications. No need to decide how big of a deployment or how many tasks to allocate to your applications. DeltaStream takes care of all those complexities so you can focus on building your core products that bring value to you and your customers instead of worrying about managing and operating distributed stream processing infrastructure. Indeed, there is no notion of cluster or deployment in DeltaStream and you can assume unlimited resources available for your applications while you are building them. You only pay for what you use and DeltaStream seamlessly scales up or down your applications as needed and recovers them from any failures.
  • Embrace the SQL database model: for the past few decades, SQL databases have proven to be a great way to manage and process data. The simplicity and ubiquity of SQL have made it easy to query and access the data. However, many real-time streaming systems either do not utilize these capabilities or only use them partially, expressing processing logic with SQL while ignoring other capabilities such as managing and securing access to the data. DeltaStream brings all the capabilities you know and are used to in SQL databases for data at rest to the streaming world.
    • SQL: DeltaStream enables users to easily build real-time applications and pipelines with a familiar SQL dialect. Everything from simple stateless processing, such as filtering and projections, to complex stateful processing, such as joins and aggregations, can be done in DeltaStream with a few lines of SQL. DeltaStream seamlessly provides the desired delivery guarantees (exactly-once or at-least-once), along with automatic checkpointing and savepointing for elasticity and fault tolerance.
    • Organizing your data in motion: Similar to SQL databases, DeltaStream enables you to organize your streaming data in databases and schemas. A database is a logical grouping of schemas and a schema is a logical grouping of objects such as streams, tables and views. This is the basis of namespacing in DeltaStream and enables our users to organize their data much more effectively compared to a flat namespace.
    • Securing your data in motion: Securing your data is one of the foundational features in DeltaStream. In addition to common security practices such as data security, authentication, and authorization, DeltaStream provides the familiar Role-Based Access Control (RBAC) model from SQL databases, which enables users to control who can access data and what operations they can perform on it. Users can define roles and grant or revoke privileges the same way they do in other SQL databases. The combination of DeltaStream’s namespacing and security capabilities provides a powerful tool for users to secure their data in motion.
  • Separation of Compute and Storage: DeltaStream’s architecture separates compute and storage, resulting in well-known benefits such as elasticity, cost efficiency, and high availability. Additionally, DeltaStream’s model of providing the compute layer on top of users’ streaming storage systems, such as Apache Kafka or AWS Kinesis, eliminates the need for data duplication and doesn’t add unnecessary latency to real-time applications and pipelines. DeltaStream is also agnostic to the underlying storage service: it can read from and write to data-in-motion storage services such as Apache Kafka or AWS Kinesis and data-at-rest storage services such as AWS S3 or data lakes. Such flexibility gives DeltaStream the capability to provide an abstraction layer on top of many storage services, where users can read data from one or more services, perform the desired computation, and write the results to one or more storage services seamlessly.
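
To make the namespacing and RBAC points above concrete, here is a short SQL-style sketch; the database, schema, stream, and role names are all made up for illustration:

```sql
-- Organize streaming data into a database and schema (names are hypothetical).
CREATE DATABASE payments;
CREATE SCHEMA payments.prod;

-- Grant a role read access to a single stream, just as in a SQL database.
CREATE ROLE analyst;
GRANT USAGE ON DATABASE payments TO ROLE analyst;
GRANT SELECT ON payments.prod.transactions TO ROLE analyst;
```

Revoking access works the same way with REVOKE, mirroring the grant/revoke model of conventional SQL databases.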

As a cloud service, DeltaStream provides a REST API with GraphQL. There are three ways of using it today.

  • Web App: The DeltaStream web app provides a browser-based application through which users can interact with the service.
  • Command Line Interface (CLI): Users can also use the DeltaStream CLI application to interact with the service through their terminal. The CLI provides all the functionality that the Web App provides.
  • Direct access to the REST API: Users can directly access the service through the provided REST API. This enables users to integrate DeltaStream into their applications or their CI/CD pipelines.

Currently, DeltaStream is available on AWS, and we plan to offer it on GCP and Azure soon.

What’s Next?

Today, we are excited to announce that we have raised a $10M seed round led by New Enterprise Associates (NEA). This funding will allow us to speed ahead in delivering DeltaStream. We are also announcing the availability of DeltaStream’s Private Beta. If you are on AWS and use an event streaming service such as Confluent Cloud, Redpanda, AWS MSK, or AWS Kinesis, please consider joining our Private Beta program to try DeltaStream and help shape our product roadmap. The Private Beta is free, and we only ask for your feedback on using DeltaStream. To join our Private Beta program, please fill out the form here and tell us about your use cases.

We are also expanding our team and hiring for different roles. If you are passionate about building streaming systems, drop us a line and let’s chat.

