We are expanding our global data platform infrastructure team and we're hiring top-tier engineering talent in Israel. This is a critical role focused on building scalable, secure, and intelligent platform capabilities that empower data engineers, analysts, ML engineers, and data scientists across the company.
You'll be part of a high-impact international team driving the foundation of our enterprise data platform: designing automation, governance, and observability frameworks that power secure, compliant, and efficient data operations at scale.
Location: Raanana, Israel (hybrid, 3 days/week onsite; the office is right next to the train station)
What You'll Do
Design and build infrastructure for real-time, secure, and governed data operations, empowering data practitioners across the company
Develop capabilities to support data ingestion, modeling, analytics, and MLOps using modern tooling and scalable cloud-native architecture
Implement security best practices, including automatic key rotation, RBAC, data classification, and encryption at rest/in transit
Own and automate infrastructure components using Infrastructure as Code (e.g., Terraform)
Build and maintain operations for tools like Airflow and dbt to enable seamless orchestration and transformation workflows
Collaborate across data engineering, analytics, data science, and DevOps teams to enhance platform capabilities and adoption
Support modern metadata standards and technologies like Model Context Protocol (MCP) and Apache Iceberg
Monitor and improve platform performance, reliability, and security posture
Requirements
8+ years of experience in platform infrastructure, DevOps for data, or data platform engineering roles
Strong Python development skills
Good SQL knowledge and experience with query optimization
Hands-on experience with Airflow, Kafka, and dbt operationalization
Strong background in cloud data platforms on AWS and GCP
Hands-on experience with Kubernetes for containerized environments
Familiarity with Infrastructure as Code practices using tools like Terraform
Solid understanding of RBAC, data security, and enterprise compliance
Experience with data lake/lakehouse table formats like Apache Iceberg
Experience with modern data warehouses or lakehouses such as Snowflake or Databricks
Knowledge of data lineage, metadata frameworks, and data observability practices
This position is open to all candidates.