About us:

DeltaStream is a leader in real-time data processing and streaming analytics, empowering businesses to harness the full potential of their data. Built by the creator of ksqlDB and powered by Apache Flink, our platform helps organizations turn real-time data into actionable insights and better decisions. Backed by world-class investors NEA, Galaxy, and Sanabil, DeltaStream closed its $15M Series A financing in September 2024, bringing its total raised capital to $25M.

About the Role:

As a Data Platform Engineer, you will design, develop, and build the core stream processing platform, collaborating with cross-functional teams, including product managers and other engineering teams, to deliver end-to-end solutions.

Key Responsibilities:

  • Participate in the design, development, and deployment of scalable and reliable data processing pipelines.
  • Implement robust developer and testing infrastructure to streamline development workflows and ensure high-quality code.
  • Stay current with the latest technologies and industry trends, evaluating and integrating new tools and methodologies as appropriate.
  • Work closely with development, operations, and other teams to ensure alignment and collaboration.
  • Demonstrate strong debugging, documentation, and communication skills.
  • Communicate effectively, both verbally and in writing, to technical and non-technical audiences.

Required Skills and Experience:

  • 4+ years of experience in large-scale software development, with a specific focus on data processing.
  • Strong proficiency with large-scale data processing technologies such as Apache Flink, Apache Spark, Kafka, and Kinesis.
  • Proficiency in Java and Python.
  • Comfortable dealing with distributed system complexity.
  • Experience in relational data models and databases.
  • Experience with SQL queries and optimization.
  • Experience with GitHub tooling (actions, workflows, repositories).
  • Familiarity with CI/CD pipelines and automation tools.
  • Problem-solving and troubleshooting skills.
  • Strong communication and collaboration abilities.

Bonus Points:

  • Experience building or designing database systems.
  • Contributions to open-source projects (especially related to Flink, Kafka, or Spark).
  • Proficiency with containerization and orchestration technologies (Docker, Kubernetes).
  • Proficiency with cloud platforms (AWS, GCP, or Azure).
  • Proficiency with Golang.
  • Understanding of communication protocols (REST, gRPC) and how to use them when building microservices.
  • Proficiency with ANTLR or other compiler tools.
  • Knowledge of security best practices and compliance standards.

By joining our team, you'll have the opportunity to work on cutting-edge technologies and make a significant impact on our organization's reliability and success.

At this time we are only accepting applicants from within the United States.

Please submit your resume to [email protected] with "Data Platform Engineer" in the subject line.