Senior Data Platform Engineer

6 hours ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We seek an experienced and highly skilled Senior Engineer to join our Data Platform Team. This is an exciting opportunity to work on cutting-edge technologies, drive innovation, and play a key role in designing and implementing the company's data infrastructure and architecture.
What you'll do:

Database Infrastructure Development - Design, develop, and maintain scalable and reliable self-serve database infrastructure, including PostgreSQL, MongoDB, SingleStore, and AWS Aurora.


Big Data Infrastructure Development - Design, develop, and maintain scalable and reliable self-serve big data infrastructure, including cloud-agnostic data lakehouse, distributed databases, and platform-powered data pipelines.


End-to-End Ownership - Take responsibility for the entire lifecycle of data infrastructure, from DevOps (Terraform, Kubernetes) to application-level components (infrastructure libraries, core components in data pipelines).


Architecture Leadership - Collaborate with the Architecture group to define and maintain the company's core architectural vision.


Partner with development teams, providing technical guidance, best practices, and support from the design phase to production deployment.


Innovation and Exploration - Work on diverse and impactful projects across different layers of the tech stack, exploring new technologies and approaches to improve reliability, efficiency, and scalability.
Requirements:
Extensive Experience in software engineering and data infrastructure.
Extensive Expertise in the administration of OLTP and OLAP databases.
Strong knowledge of big data frameworks, including Apache Spark, Athena, Trino, and Iceberg.
Hands-on experience with DevOps tools such as Terraform, Kubernetes, and Cloud infrastructure.
Proficiency in building and managing data pipelines and core infrastructure components.
Strong problem-solving, communication, and leadership skills, with the ability to mentor and collaborate across teams.
This position is open to all candidates.
 
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking an experienced Solutions Data Engineer who possesses both technical depth and strong interpersonal skills to partner with internal and external teams to develop scalable, flexible, and cutting-edge solutions. Solutions Engineers collaborate with operations and business development to help craft solutions that meet customer business problems.
A Solutions Engineer works to balance various aspects of the project, from safety to design. Additionally, a Solutions Engineer researches advanced technology regarding best practices in the field and seeks to find cost-effective solutions.
Job Description:
We're looking for a Solutions Engineer with deep experience in Big Data technologies, real-time data pipelines, and scalable infrastructure: someone who's been delivering critical systems under pressure, and knows what it takes to bring complex data architectures to life. This isn't just about checking boxes on tech stacks; it's about solving real-world data problems, collaborating with smart people, and building robust, future-proof solutions.
In this role, you'll partner closely with engineering, product, and customers to design and deliver high-impact systems that move, transform, and serve data at scale. You'll help customers architect pipelines that are not only performant and cost-efficient but also easy to operate and evolve.
We want someone who's comfortable switching hats between low-level debugging, high-level architecture, and communicating clearly with stakeholders of all technical levels.
Key Responsibilities:
Build distributed data pipelines using technologies like Kafka, Spark (batch & streaming), Python, Trino, Airflow, and S3-compatible data lakes, designed for scale, modularity, and seamless integration across real-time and batch workloads.
Design, deploy, and troubleshoot hybrid cloud/on-prem environments using Terraform, Docker, Kubernetes, and CI/CD automation tools.
Implement event-driven and serverless workflows with precise control over latency, throughput, and fault tolerance trade-offs.
Create technical guides, architecture docs, and demo pipelines to support onboarding, evangelize best practices, and accelerate adoption across engineering, product, and customer-facing teams.
Integrate data validation, observability tools, and governance directly into the pipeline lifecycle.
Own end-to-end platform lifecycle: ingestion → transformation → storage (Parquet/ORC on S3) → compute layer (Trino/Spark).
Benchmark and tune storage backends (S3/NFS/SMB) and compute layers for throughput, latency, and scalability using production datasets.
Work cross-functionally with R&D to push performance limits across interactive, streaming, and ML-ready analytics workloads.
Operate and debug object store-backed data lake infrastructure, enabling schema-on-read access, high-throughput ingestion, advanced searching strategies, and performance tuning for large-scale workloads.
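The platform lifecycle named in the responsibilities above (ingestion → transformation → storage → compute) can be sketched framework-free: in production each stage would map to Kafka, Spark, and Trino, but the staging pattern is the same. This is a minimal illustration, not the company's implementation; all names and values are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Pipeline:
    # Each stage is a pure function: records in, records out.
    stages: list = field(default_factory=list)

    def stage(self, fn):
        """Register a stage; used as a decorator below."""
        self.stages.append(fn)
        return fn

    def run(self, records):
        """Push records through every stage in order."""
        for fn in self.stages:
            records = fn(records)
        return records

pipeline = Pipeline()

@pipeline.stage
def ingest(raw):
    # Ingestion: normalize raw events (stand-in for a Kafka consumer).
    return [dict(r) for r in raw]

@pipeline.stage
def transform(events):
    # Transformation: drop invalid rows, derive a field (stand-in for Spark).
    return [{**e, "amount_usd": e["amount"] * e["fx"]} for e in events if e["amount"] > 0]

result = pipeline.run([{"amount": 10, "fx": 3.5}, {"amount": -1, "fx": 3.5}])
```

The real systems differ mainly in where the stage boundaries live (topics, jobs, tables) rather than in this record-flow shape.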
Requirements:
2–4 years in software / solutions / infrastructure engineering, with 2–4 years focused on building / maintaining large-scale data pipelines / storage & database solutions.
Proficiency in Trino, Spark (Structured Streaming & batch) and solid working knowledge of Apache Kafka.
Coding background in Python (must-have); familiarity with Bash and scripting tools is a plus.
Deep understanding of data storage architectures including SQL, NoSQL, and HDFS.
Solid grasp of DevOps practices, including containerization (Docker), orchestration (Kubernetes), and infrastructure provisioning (Terraform).
Experience with distributed systems, stream processing, and event-driven architecture.
Hands-on familiarity with benchmarking and performance profiling for storage systems, databases, and analytics engines.
Excellent communication skills: you'll be expected to explain your thinking clearly, guide customer conversations, and collaborate across engineering and product teams.
This position is open to all candidates.
 
7 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
As a Senior Backend Engineer, you will design, build, and scale the core backend systems that power our security platform, directly shaping the safety of millions of Web3 users. You'll work closely with a team of world-class engineers to deliver highly reliable, performant, and secure infrastructure.



Your Chain of Impact:

Lead feature development end-to-end, from requirements to production.
Build systems that protect millions of users from fraud, scams, and hacks.
Contribute across the full software development lifecycle in an agile environment.
Drive technical innovation by staying ahead of emerging technologies.
Uphold the highest standards of code quality, scalability, and performance.
Take true ownership of our product, tech stack, and development processes.
Requirements:
8–10 years of backend-focused engineering experience, with a proven track record of building and scaling complex systems.
Success across 2–3 fast-paced startup environments, demonstrating adaptability and impact.
Strong proficiency in Python for backend development.
Deep understanding of data systems, including SQL/NoSQL databases and data warehouses.
Hands-on experience with cloud platforms (AWS/GCP), Kubernetes, and CI/CD pipelines.
Curiosity and eagerness to learn; background in cybersecurity, blockchain, or data analysis is a strong plus.
Experience designing and building scalable data pipelines and real-time processing systems.
Skilled with ETL/ELT tools (Apache Airflow/dbt/Dagster) and streaming platforms (Kafka/Kinesis/RabbitMQ, etc.).
Familiar with real-time analytics frameworks such as Flink, Spark Streaming, or similar.
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Backend Engineer to join our Data Engineering team and develop backend components and data-oriented infrastructure for large-scale data processing.

In this role, you will develop infrastructure that enables the generation and evolution of data entities and their reliable distribution across internal systems.

You will work closely with Data Science and Backend core teams to build scalable, cost-efficient, production-grade solutions in a data-heavy environment.



What you will do

Design and implement infrastructure that supports high-volume data processing
Continuously optimize performance, cost, and reliability at scale
Collaborate with Data Science and Backend core teams to define data contracts, interfaces, and integration patterns
Maintain high engineering standards: testing, observability, clean code, CI/CD fundamentals
Requirements:
3–5 years of experience in backend development
Strong hands-on skills with Python


Advantages / Nice to Have

Experience with modern data platforms: Snowflake / Delta Lake / Iceberg
Experience with distributed data processing (Databricks, PySpark, or similar)
Experience with cloud platforms (Azure)
Experience with leadership: team management, project ownership, or task management
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Provide the direction of our data architecture. Determine the right tools for the right jobs. We collaborate on the requirements and then you call the shots on what gets built.
Manage end-to-end execution of high-performance, large-scale data-driven projects, including design, implementation, and ongoing maintenance.
Optimize and monitor the team-related cloud costs.
Design and construct monitoring tools to ensure the efficiency and reliability of data processes.
Implement CI/CD for Data Workflows
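The orchestration and CI/CD responsibilities above reduce, at their core, to declaring task dependencies as a DAG and executing tasks in an order that respects them — exactly what Airflow's scheduler does. A minimal stdlib sketch, with hypothetical task names:

```python
from graphlib import TopologicalSorter

# Airflow-style dependency declaration: task -> set of upstream tasks.
# Task names are illustrative, not from the actual workflow.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
    "notify_monitoring": {"load"},
}

# A scheduler runs tasks in a topological order so that every task's
# upstream dependencies complete first.
order = list(TopologicalSorter(dag).static_order())
```

In a real deployment the CI/CD pipeline would lint and test this DAG definition before promoting it, which is why keeping workflows as declarative code pays off.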
Requirements:
5+ years of experience in data engineering and big data at large scales. - Must
Extensive experience with modern data stack - Must:
Snowflake, Delta Lake, Iceberg, BigQuery, Redshift
Kafka, RabbitMQ, or similar for real-time data processing.
Pyspark, Databricks
Strong software development background with Python/OOP and hands-on experience in building large-scale data pipelines. - Must
Hands-on experience with Docker and Kubernetes. - Must
Expertise in ETL development, data modeling, and data warehousing best practices.
Knowledge of monitoring & observability (Datadog, Prometheus, ELK, etc.)
Experience with infrastructure as code, deployment automation, and CI/CD.
Practices using tools such as Helm, ArgoCD, Terraform, GitHub Actions, and Jenkins.
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for engineers with proven experience working with large-scale datasets and distributed systems, who are passionate about data quality, performance, and system reliability.

Responsibilities:

Design and implement scalable, fault-tolerant ETL pipelines using Apache Spark for high-throughput, real-time and batch data processing.
Develop and manage CI/CD pipelines, testing strategies, and data quality frameworks to ensure robust data workflows.
Collaborate with data scientists, analysts, and product teams to build data models, maintain data lineage, and surface insights that drive business value.
Evaluate and integrate new technologies to enhance performance, scalability, and cost-efficiency in our data ecosystem.
Own and evolve critical components of our data infrastructure, with a deep understanding of both technical architecture and business context.
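A data quality framework like the one mentioned in the responsibilities above is, at bottom, a set of rules evaluated against rows with machine-readable pass/fail results that CI can gate on. A minimal sketch under that assumption; rule and column names are illustrative:

```python
def check_not_null(rows, column):
    """Fail if any row has a null in `column` (a minimal data-quality rule)."""
    bad = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"rule": f"not_null({column})", "passed": not bad, "failing_rows": bad}

def run_checks(rows, checks):
    """Evaluate every rule; a pipeline would block promotion unless all pass."""
    results = [check(rows) for check in checks]
    return results, all(r["passed"] for r in results)

rows = [
    {"user_id": 1, "ts": "2025-01-01"},
    {"user_id": None, "ts": "2025-01-02"},  # violates the rule below
]
results, ok = run_checks(rows, [lambda r: check_not_null(r, "user_id")])
```

Production frameworks add scheduling, lineage, and alerting on top, but the rule-as-function core stays the same.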
Requirements:
6+ years of hands-on experience as a Data Engineer or Backend Engineer, with a strong focus on data-intensive systems.
Mandatory: Proven, production-grade experience with Apache Spark at scale (tens of terabytes daily or more).
3+ years of experience with Python.
Experience with cloud-native architectures, especially AWS (e.g., S3, EMR, Athena, Glue).
Expertise in designing and maintaining ETL pipelines using orchestration tools like Airflow (or similar).
Strong analytical skills and proficiency in SQL.
Experience working in distributed environments and optimizing performance for large-scale data workloads.
Ability to lead initiatives and work independently in a fast-paced environment.
This position is open to all candidates.
 
08/12/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a Data Engineering Lead to own and scale data platform infrastructure.

Build our new data Lakehouse to support various product and business needs
Support cutting-edge AI Agents and cybersecurity use cases
Be part of the Data and AI Algorithms group
Collaborate closely with dev teams, AI/ML engineers, security researchers and business teams
Ensure the availability, reliability and quality of our data infrastructure
Help define best practices for data modeling and orchestration at scale
Requirements:
6+ years of hands-on experience in building, modeling, and managing data warehouses at scale - Must
Production experience with big-data distributed systems such as Apache Spark, Ray or similar - Must
Hands-on with modern data lakes and open table formats (Delta Lake, Apache Iceberg) - Must
Experience in batch and streaming processing pipelines - Must
Strong coding skills in Python. Strong CI/CD and infrastructure-as-code capabilities.
Experience with cloud-native data services (e.g., AWS EMR, Athena, Azure Data Explorer etc.).
Familiarity with orchestration tools like Airflow, Kubeflow, Dagster or similar
Excellent communication skills, ownership mindset, and problem-solving capabilities
This position is open to all candidates.
 
17/12/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
XM Cyber is the leader in hybrid-cloud security posture management, using the attacker’s perspective to find and remediate critical attack paths across on-premises and multi-cloud networks. XM Cyber is looking for a talented Senior Big Data Developer. Join the VRM team to build high-performance vulnerability management capabilities that translate large-scale security data and research findings into prioritized, actionable remediation paths. You’ll work closely with researchers, product and operations teams to drive measurable reductions in customer risk. This role provides the opportunity to lead end-to-end design & implementation of advanced server-side features using a wide range of technologies: Apache Spark, Apache Airflow, Scala, k8s, Python, Redis, Kafka, and Docker. If you are up for the challenge and you have the ‘XM factor’, come and join us!
Requirements:
* 5+ years of experience in software development with proven ability to take full responsibility and lead advanced software projects that require team collaboration.
* Capable of facing a wide range of cutting edge technologies and challenging development tasks, designing new features from scratch and diving into existing infrastructure.
* 2+ years of experience with Spark in Scala/Python - Must.
* Experience in server-side development with APIs, Microservices Architecture, databases, caches, queues. - Must
* Experience in delivering fully tested production-level code using CI/CD pipeline and maintaining large-scale production systems. - Must
* Highly motivated leader with a can-do approach and strong interpersonal skills that thrives in a fast-paced startup environment.
* Relevant Cyber Security experience - Advantage
* Experience in cloud development (AWS / Azure / GCP)
* Experience with Kubernetes operators, Spark, and Airflow - Big Advantage
* Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
This position is open to all candidates.
 
11/12/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
XM Cyber is the leader in hybrid-cloud security posture management, using the attacker’s perspective to find and remediate critical attack paths across on-premises and multi-cloud networks. XM Cyber is looking for a talented Senior Big Data Developer. Join a core team of experts responsible for developing innovative cyber-attack techniques for Cloud-based environments (AWS, Azure, GCP, Kubernetes) that integrate into XM Cyber’s fully automated attack simulation. This role provides the opportunity to lead end-to-end design & implementation of advanced server-side features using a wide range of technologies: Apache Spark, Apache Airflow, Scala, k8s, Node.js (JS/Typescript) with MongoDB, Redis, Kafka, Docker, and Flink (Big Data Stream Processing). If you are up for the challenge and you have the ‘XM factor’, come and join us!
Requirements:
* 5+ years of experience in software development with proven ability to take full responsibility and lead advanced software projects that require team collaboration.
* Capable of facing a wide range of cutting edge technologies and challenging development tasks, designing new features from scratch and diving into existing infrastructure.
* 2+ years of experience with Spark in Scala/Python - Must.
* Experience in server-side development with APIs, Microservices Architecture (Docker), databases, caches, queues.
* Experience in delivering fully tested production-level code using CI/CD pipeline and maintaining large-scale production systems.
* Highly motivated leader with a can-do approach and strong interpersonal skills that thrives in a fast-paced startup environment.
* Relevant Cyber Security experience - Advantage
* Experience in cloud development (AWS / Azure / GCP) - Advantage
* Experience with Kubernetes operators, Spark, and Airflow - Big Advantage
* Experience with Node.js (JS/Typescript) - Advantage
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
This role provides the opportunity to lead end-to-end design & implementation of advanced server-side features using a wide range of technologies: Apache Spark, Apache Airflow, Scala, k8s, Node.js (JS/Typescript) with MongoDB, Redis, Kafka, Docker, and Flink (Big Data Stream Processing).

If you are up for the challenge and you have the ‘XM factor’, come and join us!
Requirements:
5+ years of experience in software development with proven ability to take full responsibility and lead advanced software projects that require team collaboration.
Capable of facing a wide range of cutting edge technologies and challenging development tasks, designing new features from scratch and diving into existing infrastructure.
2+ years of experience with Spark in Scala/Python - Must.
Experience in server-side development with APIs, Microservices Architecture (Docker), databases, caches, queues.
Experience in delivering fully tested production-level code using CI/CD pipeline and maintaining large-scale production systems.
Highly motivated leader with a can-do approach and strong interpersonal skills that thrives in a fast-paced startup environment.
Relevant Cyber Security experience - Advantage
Experience in cloud development (AWS / Azure / GCP) - Advantage
Experience with Kubernetes operators, Spark, and Airflow - Big Advantage
Experience with Node.js (JS/Typescript) - Advantage
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a hands-on individual contributor Data Engineer to design, build, and operate large-scale data products. You'll own mission-critical pipelines and services, balancing pre-computation with on-demand execution to deliver complex, business-critical insights with the right cost, latency, and reliability.
RESPONSIBILITIES:
Design and run Spark data pipelines, orchestrated with Airflow, governed with Unity Catalog.
Build scalable batch and on-demand data products, aiming for the sweet spot between pre-compute and on-demand for complex logic - owning SLAs/SLOs, cost, and performance.
Implement robust data quality, lineage, and observability across pipelines.
Contribute to the architecture and scaling of our Export Center for off-platform report generation and delivery.
Partner with Product, Analytics, and Backend to turn requirements into resilient data systems.
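The pre-compute vs. on-demand balance described above can be reduced to a cache-aside pattern: frequent queries hit a pre-computed table, while rare or complex ones run on demand and are memoized. A sketch under that assumption; metric names and values are illustrative:

```python
# Pre-computed results, e.g. materialized nightly by a batch Spark job.
precomputed = {("daily_revenue", "2025-12-01"): 1234.5}  # illustrative values

def serve(metric, day, compute_fn):
    """Serve from the pre-computed store when possible, else compute on demand."""
    key = (metric, day)
    if key in precomputed:
        return precomputed[key], "precomputed"   # fast path: low latency, fixed cost
    value = compute_fn(metric, day)              # slow path: full query engine
    precomputed[key] = value                     # memoize to amortize future calls
    return value, "on_demand"

hit = serve("daily_revenue", "2025-12-01", lambda m, d: 0.0)
miss = serve("daily_revenue", "2025-12-02", lambda m, d: 99.0)
```

The engineering judgment the role calls for is deciding which metrics are cheap enough to pre-compute for everyone versus expensive enough to leave on the slow path.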
Requirements:
BSc degree in Computer Science or equivalent
5+ years of professional Backend/Data-Engineering experience
2+ years of Data-Engineering experience
Production experience with Apache Spark, Airflow, Databricks, and Unity Catalog.
Strong SQL and one of Python/Scala; solid data modeling and performance tuning chops.
Proven track record building large-scale (multi-team, multi-tenant) data pipelines and services.
Pragmatic approach to cost/latency trade-offs, caching, and storage formats.
Experience shipping reporting/exports pipelines and integrating with downstream delivery channels.
IC mindset: you lead through design, code, and collaboration (no direct reports).
OTHER REQUIREMENTS:
Delta Lake, query optimization, and workload management experience.
Observability stacks (e.g., metrics, logging, data quality frameworks).
GCS or other major cloud provider experience.
Terraform IaC experience.
This position is open to all candidates.
 