Data Platform Engineer

Exclusive listing (posted 1 day ago)
Location: Tel Aviv-Yafo
Salary: 30,000-35,000
Description: Hiring a Data Platform Engineer to join a leading company based in Tel Aviv, as part of its growing data and technology team.

What You'll Do:
Design, build, and maintain robust ETL/ELT pipelines that ingest data from diverse systems (ERP, eCommerce platforms, APIs, files, etc.) - see the sketch after this list
Develop and optimize data lakes and data warehouse environments (Azure Synapse, Snowflake, or similar)
Collaborate with data analysts and business teams to deliver high-quality datasets and modeling support
Implement and automate data quality checks, validation routines, and cleansing procedures
Monitor and troubleshoot pipeline performance, data delays, or failures
Contribute to the standardization of schemas, documentation, and pipeline patterns
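To ground the pipeline work described above, here is a minimal sketch of a daily file-to-warehouse ingest expressed as an Apache Airflow DAG - Airflow is one of the orchestration frameworks named in the requirements below. The DAG name, landing path, and task split are invented for the example, not the company's actual design.

```python
# A minimal Airflow DAG sketching a daily extract -> validate -> load flow.
# All names (orders_ingest, the landing path, ...) are illustrative only.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def orders_ingest():
    @task
    def extract() -> str:
        # In a real pipeline this would pull from the ERP / eCommerce API
        # or object storage; here we just return a hypothetical landing path.
        return "/tmp/orders.csv"

    @task
    def validate(path: str) -> str:
        # Cheap data-quality gate: fail fast if the extract is empty.
        with open(path) as f:
            rows = sum(1 for _ in f)
        if rows <= 1:  # header only
            raise ValueError(f"{path} contains no data rows")
        return path

    @task
    def load(path: str) -> None:
        # Real code would COPY/PUT the file into Synapse, Snowflake, etc.
        print(f"loading {path} into the warehouse")

    load(validate(extract()))


orders_ingest()
```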
Requirements:
5+ years of experience in data engineering or data platform development 
Proficiency with SQL, PL/SQL, and working with structured and semi-structured data (CSV, Parquet, JSON, APIs) 
Experience building and maintaining pipelines using ETL tools or frameworks (e.g., ADF, SSIS, dbt, Airflow) 
Familiarity with cloud data warehouse platforms such as Azure Synapse, Snowflake, Redshift, or BigQuery 
Strong communication skills and the ability to work cross-functionally 
Python or scripting knowledge - bonus
Experience in Azure or AWS, and familiarity with streaming tools - bonus
Background in retail or omni-channel data environments - bonus
This position is open to all candidates.
 
Job ID: 8486668

Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Senior Data Engineer to join our Data team - someone who's passionate about building reliable, scalable data infrastructure and thrives on solving complex technical challenges.
In this role, you'll own the design and development of end-to-end data pipelines that power analytics and data-driven decision-making.
Youll collaborate closely with both business and technical stakeholders to ensure data flows smoothly, accurately, and efficiently across the company.
What You Will Do:
Design, implement, and maintain large-scale ETL and ELT pipelines using modern data frameworks and cloud technologies.
Work with Redshift data warehouses to design efficient schemas and optimize performance (see the sketch after this list).
Build and manage data ingestion processes from multiple sources - APIs, SaaS platforms, internal systems, and databases.
Collaborate with stakeholders to deliver clean, well-modeled, and high-quality data.
Build and evolve a modern, efficient, and scalable data warehouse architecture.
Ensure observability, monitoring, and testing across all data processes.
Apply best practices in CI/CD, version control (Git), and data quality validation.
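As a hedged illustration of the Redshift schema work called out above: a star-schema fact table created over the Postgres wire protocol (which Redshift speaks) using psycopg2, followed by a trivial post-load quality check. The cluster host, credentials, and table layout are placeholders, not this company's model.

```python
# Sketch: create a star-schema fact table on Redshift and run a basic check.
# Host, credentials, and table names are hypothetical.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS fact_orders (
    order_id     BIGINT,
    customer_id  BIGINT,
    order_date   DATE,
    amount_usd   DECIMAL(12, 2)
)
DISTKEY (customer_id)  -- co-locate rows that join on customer
SORTKEY (order_date);  -- prune scans on date-range filters
"""

conn = psycopg2.connect(
    host="example-cluster.redshift.amazonaws.com",  # placeholder
    port=5439,
    dbname="analytics",
    user="etl_user",
    password="...",
)
with conn, conn.cursor() as cur:
    cur.execute(DDL)
    # Post-load quality gate: the fact table must not contain NULL keys.
    cur.execute("SELECT COUNT(*) FROM fact_orders WHERE customer_id IS NULL")
    null_keys = cur.fetchone()[0]
    assert null_keys == 0, f"{null_keys} rows with NULL customer_id"
```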
Requirements:
5+ years of experience as a Data Engineer or ETL Developer, building large-scale data pipelines in a cloud environment (AWS, GCP, or Azure).
Strong SQL expertise, including query optimization and data modeling.
Hands-on experience with ETL/ELT tools such as Matillion, Rivery, SSIS, Talend, or similar.
Solid understanding of data warehouse concepts and dimensional modeling.
Excellent analytical and problem-solving skills.
A collaborative mindset and the ability to work cross-functionally with internal teams.
A self-starter and agile learner who thrives in a fast-paced, dynamic environment.
AI/data-related development capabilities: experience building or integrating AI-driven data solutions is a plus.
Nice to Have:
Experience with Redshift and Matillion - big advantage.
Experience with BI tools such as Qlik or Power BI - big advantage.
Familiarity with CI/CD pipelines.
This position is open to all candidates.
 
Job ID: 8435478

Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're seeking our first Data Engineer to join the Revenue Operations team. This is a high-impact role where you'll build the foundations of our data infrastructure - connecting the dots between systems, designing and maintaining our data warehouse, and creating reliable pipelines that bring together all revenue-related data. You'll work directly with the Director of Revenue Operations and partner closely with Sales, Finance, and Customer Success.
This is a chance to shape the role from the ground up and create a scalable data backbone that powers smarter decisions across the company.
Role Overview:
As the Data Engineer, you will own the design, implementation, and evolution of our data infrastructure. You'll connect core business systems (CRM, finance platforms, billing systems) into a central warehouse, ensure data quality, and make insights accessible to leadership and revenue teams. Your success will be measured by the accuracy, reliability, and usability of the data foundation you build.
Key Responsibilities:
Data Infrastructure & Warehousing:
Design, build, and maintain a scalable data warehouse for revenue-related data.
Build ETL/ELT pipelines that integrate data from HubSpot, Netsuite, billing platforms, ACP, and other business tools.
Develop a clear data schema and documentation that can scale as we grow.
Cross-Functional Collaboration:
Work closely with Sales, Finance, and Customer Success to understand their reporting and forecasting needs.
Translate business requirements into data models that support dashboards, forecasting, and customer health metrics.
Act as the go-to partner for data-related questions across revenue teams.
Scalability & Optimization:
Continuously monitor and optimize pipeline performance and warehouse scalability.
Ensure the infrastructure can handle increased data volume and complexity as the company grows.
Establish and enforce best practices for data quality, accuracy, and security.
Evaluate and implement new tools, frameworks, or architectures that improve automation, speed, and reliability.
Build reusable data models and modular pipelines to shorten development time and reduce maintenance.
Requirements:
4-6 years of experience as a Data Engineer or in a similar role (preferably in SaaS, Fintech, or fast-growing B2B companies).
Strong expertise in SQL and data modeling; comfort working with large datasets.
Hands-on experience building and maintaining ETL/ELT pipelines (using tools such as Fivetran, dbt, Airflow, or similar).
Experience designing and managing cloud-based data warehouses (Snowflake, BigQuery, Redshift, or similar).
Familiarity with CRM (HubSpot), ERP/finance systems (Netsuite), and billing platforms.
Strong understanding of revenue operations metrics (ARR, MRR, churn, LTV, CAC, etc.) - a toy worked example follows this list.
Ability to translate messy business requirements into clean, reliable data structures.
Solid communication skills - able to explain technical concepts to non-technical stakeholders.
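For readers less familiar with the metrics named in the requirements above, here is a toy worked illustration of MRR, ARR, and logo churn. The subscription data is invented, and real definitions vary by company (churn is often measured on revenue rather than customer count).

```python
# Toy illustration of MRR, ARR, and churn; data and numbers are invented.
subscriptions = [
    {"customer": "a", "monthly_fee": 500, "active": True},
    {"customer": "b", "monthly_fee": 300, "active": True},
    {"customer": "c", "monthly_fee": 200, "active": False},  # churned
]

mrr = sum(s["monthly_fee"] for s in subscriptions if s["active"])  # 800
arr = mrr * 12                                                     # 9,600
churned = sum(1 for s in subscriptions if not s["active"])
churn_rate = churned / len(subscriptions)                          # ~33%

print(f"MRR={mrr} ARR={arr} churn={churn_rate:.0%}")
```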
What Sets You Apart:
You've been the first data hire before and know how to build from scratch (not a must).
Strong business acumen with a focus on revenue operations.
A builder mindset: you like solving messy data problems and making systems talk.
Comfortable working across teams and translating business needs into data solutions.
This position is open to all candidates.
 
Job ID: 8481826

Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
This role has been designed as Hybrid with an expectation that you will work on average 2 days per week from an HPE office.
Job Description:
We are looking for a highly skilled Senior Data Engineer with strong architectural expertise to design and evolve our next-generation data platform. You will define the technical vision, build scalable and reliable data systems, and guide the long-term architecture that powers analytics, operational decision-making, and data-driven products across the organization.
This role is both strategic and hands-on. You will evaluate modern data technologies, define engineering best practices, and lead the implementation of robust, high-performance data solutions, including the design, build, and lifecycle management of data pipelines that support batch, streaming, and near-real-time workloads.
What You'll Do
Architecture & Strategy
Own the architecture of our data platform, ensuring scalability, performance, reliability, and security.
Define standards and best practices for data modeling, transformation, orchestration, governance, and lifecycle management.
Evaluate and integrate modern data technologies and frameworks that align with our long-term platform strategy.
Collaborate with engineering and product leadership to shape the technical roadmap.
Engineering & Delivery
Design, build, and manage scalable, resilient data pipelines for batch, streaming, and event-driven workloads (a Snowflake sketch follows this subsection).
Develop clean, high-quality data models and schemas to support analytics, BI, operational systems, and ML workflows.
Implement data quality, lineage, observability, and automated testing frameworks.
Build ingestion patterns for APIs, event streams, files, and third-party data sources.
Optimize compute, storage, and transformation layers for performance and cost efficiency.
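A minimal sketch of the kind of Snowflake transform this role owns, using the official snowflake-connector-python client. Account, credentials, and table names are placeholders, and the QUALIFY-based dedup is one common pattern, not necessarily this team's standard.

```python
# Sketch: run an idempotent batch transform on Snowflake via the official
# Python connector. Connection details and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",  # hypothetical account identifier
    user="etl_user",
    password="...",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # Rebuild a clean model from the raw layer, keeping the latest row
    # per event_id (a common dedup pattern for replayed ingests).
    cur.execute("""
        CREATE OR REPLACE TABLE clean_events AS
        SELECT event_id, user_id, TO_TIMESTAMP_NTZ(ts) AS event_ts
        FROM raw_events
        QUALIFY ROW_NUMBER() OVER (PARTITION BY event_id ORDER BY ts DESC) = 1
    """)
finally:
    conn.close()
```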
Leadership & Collaboration
Serve as a senior technical leader and mentor within the data engineering team.
Lead architecture reviews, design discussions, and cross-team engineering initiatives.
Work closely with analysts, data scientists, software engineers, and product owners to define and deliver data solutions.
Communicate architectural decisions and trade-offs to technical and non-technical stakeholders.
Requirements:
6-10+ years of experience in Data Engineering, with demonstrated architectural ownership.
Expert-level experience with Snowflake (mandatory), including performance optimization, data modeling, security, and ecosystem components.
Expert proficiency in SQL and strong Python skills for pipeline development and automation.
Experience with modern orchestration tools (Airflow, Dagster, Prefect, or equivalent).
Strong understanding of ELT/ETL patterns, distributed processing, and data lifecycle management.
Familiarity with streaming/event technologies (Kafka, Kinesis, Pub/Sub, etc.).
Experience implementing data quality, observability, and lineage solutions.
Solid understanding of cloud infrastructure (AWS, GCP, or Azure).
Strong background in DataOps practices: CI/CD, testing, version control, automation.
Proven leadership in driving architectural direction and mentoring engineering teams.
Nice to Have
Experience with data governance or metadata management tools.
Hands-on experience with DBT, including modeling, testing, documentation, and advanced features.
Exposure to machine learning pipelines, feature stores, or MLOps.
Experience with Terraform, CloudFormation, or other IaC tools.
Background designing systems for high scale, security, or regulated environments.
This position is open to all candidates.
 
Job ID: 8461496

Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
About the Role:
We are seeking an experienced Senior Data Engineer to join our dynamic data team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure, ensuring the availability, reliability, and quality of our data. This role requires strong technical expertise, problem-solving skills, and the ability to collaborate across teams to deliver data-driven solutions.
Key Responsibilities:
Design, implement, and maintain robust, scalable, and high-performance data pipelines and ETL processes (see the sketch after this list).
Develop and optimize data models, schemas, and storage solutions to support analytics and machine learning initiatives.
Collaborate with software engineers and product managers to understand data requirements and deliver high-quality solutions.
Ensure data quality, integrity, and governance across multiple sources and systems.
Monitor and troubleshoot data workflows, resolving performance and reliability issues.
Evaluate and implement new data technologies and frameworks to improve the data platform.
Document processes, best practices, and data architecture.
Mentor junior data engineers and contribute to team knowledge sharing.
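To ground the pipeline responsibilities above (and the must-have Spark skills in the requirements that follow), a minimal PySpark batch job that rolls raw events up into daily counts. The S3 paths and column names are assumptions for the sketch.

```python
# Minimal PySpark batch job: raw JSON events -> daily rollup in Parquet.
# Paths and column names are invented for the example.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_events_rollup").getOrCreate()

events = spark.read.json("s3://example-bucket/raw/events/")  # placeholder
daily = (
    events
    .withColumn("day", F.to_date("timestamp"))
    .groupBy("day", "event_type")
    .agg(
        F.count("*").alias("events"),
        F.countDistinct("user_id").alias("users"),
    )
)
daily.write.mode("overwrite").partitionBy("day").parquet(
    "s3://example-bucket/curated/daily_events/"  # placeholder
)
spark.stop()
```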
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering, ETL development, or a similar role.
Strong proficiency in SQL and experience with relational and NoSQL databases.
Experience with data pipeline frameworks and tools such as Apache Spark, Airflow, and Kafka - a must.
Familiarity with cloud platforms (AWS, GCP, or Azure) and their data services.
Solid programming skills in Python, Java, or Scala.
Strong problem-solving, analytical, and communication skills.
Knowledge of data governance, security, and compliance standards.
Experience with data warehousing, big data technologies, and data modeling best practices; familiarity with engines such as ClickHouse, SingleStore, or StarRocks.
This position is open to all candidates.
 
Job ID: 8437853

Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We're seeking an outstanding and passionate Data Platform Engineer to join our growing R&D team.
You will work in an energetic startup environment following Agile concepts and methodologies. Joining the company at this unique and exciting stage in our growth journey creates an exceptional opportunity to take part in shaping our data infrastructure at the forefront of Fintech and AI.
What you'll do:
Design, build, and maintain scalable data pipelines and ETL processes for our financial data platform
Develop and optimize data infrastructure to support real-time analytics and reporting
Implement data governance, security, and privacy controls to ensure data quality and compliance
Create and maintain documentation for data platforms and processes
Collaborate with data scientists and analysts to deliver actionable insights to our customers
Troubleshoot and resolve data infrastructure issues efficiently
Monitor system performance and implement optimizations
Stay current with emerging technologies and implement innovative solutions
Tech stack: AWS Serverless, Python, Airflow, Airbyte, Temporal, PostgreSQL, Snowflake, Kubernetes, Terraform, Docker.
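Given the serverless stack just listed, here is a hedged sketch of one common ingestion trigger on AWS: a Lambda handler that forwards new-object S3 events to an SQS queue for the pipeline (e.g., an Airflow sensor or a Temporal workflow) to pick up. The queue URL and routing are assumptions, not this company's actual design.

```python
# Sketch: S3 put-event -> Lambda -> SQS handoff for downstream ingestion.
# The queue URL is a placeholder.
import json

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.eu-west-1.amazonaws.com/123456789012/ingest-queue"


def handler(event, context):
    # S3 notifications deliver one record per new object.
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Hand the object reference to whatever consumes the queue.
        sqs.send_message(
            QueueUrl=QUEUE_URL,
            MessageBody=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"status": "ok", "records": len(records)}
```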
Requirements:
3+ years experience in data engineering or platform engineering roles
Strong programming skills in Python and SQL
Experience with orchestration platforms like Airflow/Dagster/Temporal
Experience with MPPs like Snowflake/Redshift/Databricks
Hands-on experience with cloud platforms (AWS) and their data services
Understanding of data modeling, data warehousing, and data lake concepts
Ability to optimize data infrastructure for performance and reliability
Experience working with containerization (Docker) in Kubernetes environments
Familiarity with CI/CD concepts
Fluent in English, both written and verbal
And it would be great if you have (optional):
Experience with big data processing frameworks (Apache Spark, Hadoop)
Experience with stream processing technologies (Flink, Kafka, Kinesis)
Knowledge of infrastructure as code (Terraform)
Experience building analytics platforms
Experience building clickstream pipelines
Familiarity with machine learning workflows and MLOps
Experience working in a startup environment or fintech industry.
This position is open to all candidates.
 
Job ID: 8445576

Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an experienced and visionary Data Platform Engineer to help design, build and scale our BI platform from the ground up.
In this role, you will be responsible for building the foundations of our data analytics platform - enabling scalable data pipelines and robust data modeling to support real-time and batch analytics, ML models and business insights that serve both business intelligence and product needs.
You will be part of the R&D team, collaborating closely with engineers, analysts, and product managers to deliver a modern data architecture that supports internal dashboards and future-facing operational analytics.
If you enjoy architecting from scratch, turning raw data into powerful insights, and owning the full data lifecycle - this role is for you!
Responsibilities
Take full ownership of the design and implementation of a scalable and efficient BI data infrastructure, ensuring high performance, reliability and security.
Lead the design and architecture of the data platform - from integration to transformation, modeling, storage, and access.
Build and maintain ETL/ELT pipelines, batch and real-time, to support analytics, reporting, and product integrations (a streaming sketch follows this list).
Establish and enforce best practices for data quality, lineage, observability, and governance to ensure accuracy and consistency.
Integrate modern tools and frameworks such as Airflow, dbt, Databricks, Power BI, and streaming platforms.
Collaborate cross-functionally with product, engineering, and analytics teams to translate business needs into data infrastructure.
Promote a data-driven culture - be an advocate for data-driven decision-making across the company by empowering stakeholders with reliable and self-service data access.
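A minimal sketch of the real-time leg described above: reading a Kafka topic with Spark Structured Streaming and landing the payloads as Parquet. Broker, topic, and checkpoint locations are placeholders, and the job assumes the Spark-Kafka connector package is on the classpath.

```python
# Sketch: Kafka -> Spark Structured Streaming -> Parquet sink.
# Broker, topic, and paths are placeholders; requires the
# spark-sql-kafka connector package at runtime.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_stream").getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "product_events")
    .load()
)

# Kafka rows expose key/value as binary; cast before parsing downstream.
decoded = stream.select(F.col("value").cast("string").alias("payload"))

query = (
    decoded.writeStream.format("parquet")
    .option("path", "s3://example-bucket/stream/events/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
    .start()
)
query.awaitTermination()
```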
Requirements:
5+ years of hands-on experience in data engineering and in building data products for analytics and business intelligence.
Strong hands-on experience with ETL orchestration tools (Apache Airflow), and data lakehouses (e.g., Snowflake/BigQuery/Databricks)
Vast knowledge in both batch processing and streaming processing (e.g., Kafka, Spark Streaming).
Proficiency in Python, SQL, and cloud data engineering environments (AWS, Azure, or GCP).
Familiarity with data visualization tools (Power BI, Looker, or similar).
BSc in Computer Science or a related field from a leading university.
Nice to have
Experience working in early-stage projects, building data systems from scratch.
Background in building operational analytics pipelines, in which analytical data feeds real-time product business logic.
Hands-on experience with ML model training pipelines.
Experience in cost optimization in modern cloud environments.
Knowledge of data governance principles, compliance, and security best practices.
This position is open to all candidates.
 
Job ID: 8482840

Confidential company
Job Type: Full Time
We use cutting-edge innovations in financial technology to bring leading data and features that allow individuals to be qualified instantly, making purchases at the point-of-sale fast, fair and easy for consumers from all walks of life.
As part of our Data Engineering team, you will not only build scalable data platforms but also directly enable portfolio growth by supporting new funding capabilities, loan sales and securitization, and improving cost efficiency through automated and trusted data flows that evolve our accounting processes.
Responsibilities
Design and build data solutions that support our company's core business goals, from enabling capital market transactions (loan sales and securitization) to providing reliable insights for reducing the cost of capital.
Develop advanced data pipelines and analytics to support finance, accounting, and product growth initiatives.
Create ELT processes and SQL queries to bring data into the data warehouse and other data stores (an upsert sketch follows this list).
Develop data-driven finance products that accelerate funding capabilities and automate accounting reconciliations.
Own and evolve data lake pipelines, maintenance, schema management, and improvements.
Create new features from scratch, enhance existing features, and optimize existing functionality.
Collaborate with stakeholders across Finance, Product, Backend Engineering, and Data Science to align technical work with business outcomes.
Implement new tools and modern development approaches that improve both scalability and business agility.
Ensure adherence to coding best practices and development of reusable code.
Constantly monitor the data platform and make recommendations to enhance architecture, performance, and cost efficiency.
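One common way to keep the ELT loads described above idempotent is to stage a batch and MERGE it into the target so reruns do not duplicate rows. A sketch under assumed table names; both Snowflake and Redshift support this MERGE form, and the helper runs on any DB-API cursor.

```python
# Sketch: idempotent staged upsert into the warehouse. Table and column
# names are invented; the statement is standard Snowflake/Redshift MERGE.
MERGE_LOANS = """
MERGE INTO analytics.loans AS tgt
USING staging.loans_batch AS src
    ON tgt.loan_id = src.loan_id
WHEN MATCHED THEN UPDATE SET
    status = src.status,
    balance_usd = src.balance_usd,
    updated_at = src.updated_at
WHEN NOT MATCHED THEN
    INSERT (loan_id, status, balance_usd, updated_at)
    VALUES (src.loan_id, src.status, src.balance_usd, src.updated_at)
"""


def run_merge(cursor) -> None:
    """Execute the upsert on any DB-API cursor (Snowflake, Redshift, ...)."""
    cursor.execute(MERGE_LOANS)
```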
Requirements:
4+ years of experience as a Data Engineer.
4+ years of Python and SQL experience.
4+ years of direct experience with SQL (Redshift/Snowflake), data modeling, data warehousing, and building ELT/ETL pipelines (DBT & Airflow preferred).
3+ years of experience in scalable data architecture, fault-tolerant ETL, and data quality monitoring in the cloud.
Hands-on experience with cloud environments (AWS preferred) and big data technologies (EMR, EC2, S3, Snowflake, Spark Streaming, Kafka, DBT).
Strong troubleshooting and debugging skills in large-scale systems.
Deep understanding of distributed data processing and tools such as Kafka, Spark, and Airflow.
Experience with design patterns, coding best practices, and data modeling.
Proficiency with Git and modern source control.
Basic Linux/Unix system administration skills.
Nice to Have
Familiarity with fintech business processes (funding, securitization, loan servicing, accounting) - a huge advantage.
BS/MS in Computer Science or related field.
Experience with NoSQL or large-scale DBs.
DevOps experience in AWS.
Microservices experience.
2+ years of experience in Spark and the broader Data Engineering ecosystem.
This position is open to all candidates.
 
Job ID: 8481603

Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an experienced BI Data Engineer to join our Data team within the Information Systems group.
In this role, you will be responsible for building and maintaining scalable, high-quality data pipelines, models, and infrastructure that support business operations across the entire company, with a primary focus on GTM domains.
You will take ownership of core data architecture components, ensuring data consistency, reliability, and accessibility across all analytical and operational use cases.
Your work will include designing data models, orchestrating transformations, developing internal data applications, and ensuring that business processes are accurately represented in the data.
This role requires a combination of deep technical expertise and strong understanding of business operations.
You will collaborate closely with analysts, domain experts, and engineering teams to translate complex business processes into robust, scalable data solutions. If you are passionate about data architecture, building end-to-end data systems, and solving complex engineering challenges that directly impact the business, we'd love to meet you!
Key Responsibilities:
Design, develop, and maintain end-to-end data pipelines, ensuring scalability, reliability, and performance.
Build, optimize, and evolve core data models and semantic layers that serve as the organization's single source of truth.
Implement robust ETL/ELT workflows using Snowflake, dbt, Rivery, and Python.
Develop internal data applications and automation tools to support advanced analytics and operational needs.
Ensure high data quality through monitoring, validation frameworks, and governance best practices (a minimal validation sketch follows this list).
Improve and standardize data modeling practices, naming conventions, and architectural guidelines.
Continuously evaluate and adopt new technologies, features, and tooling across the data engineering stack.
Collaborate with cross-functional stakeholders to deeply understand business processes and translate them into scalable technical solutions.
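A tiny sketch of the validation idea mentioned above: declarative row-level checks that return readable failures. Check names and sample rows are invented; in production this would more likely live in dbt tests or a dedicated framework.

```python
# Sketch: declarative row-level data-quality checks. The checks and the
# sample rows are invented for the example.
from typing import Callable, Iterable

Check = tuple[str, Callable[[dict], bool]]

CHECKS: list[Check] = [
    ("id is present", lambda r: r.get("id") is not None),
    ("amount is non-negative", lambda r: r.get("amount", 0) >= 0),
]


def validate(rows: Iterable[dict], checks: list[Check]) -> list[str]:
    """Return human-readable failures; an empty list means the batch is clean."""
    failures = []
    for i, row in enumerate(rows):
        for name, predicate in checks:
            if not predicate(row):
                failures.append(f"row {i}: failed '{name}': {row}")
    return failures


if __name__ == "__main__":
    sample = [{"id": 1, "amount": 10.0}, {"id": None, "amount": -5}]
    print("\n".join(validate(sample, CHECKS)) or "clean")
```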
Requirements:
5+ years of experience in BI data engineering, data engineering, or a similar data development role.
Bachelor's degree in Industrial Engineering, Statistics, Mathematics, Economics, Computer Science, or a related field - required.
Strong SQL expertise and extensive hands-on experience with ETL/ELT development - required.
Proficiency with Snowflake, dbt, Python, and modern data engineering workflows - essential.
Experience building and maintaining production-grade data pipelines using orchestration tools (e.g., Rivery, Airflow, Prefect) - an advantage.
Experience with cloud platforms, CI/CD, or DevOps practices for data - an advantage.
Skills and Attributes:
Strong understanding of business processes and the ability to design data solutions that accurately represent real-world workflows.
Strong analytical and problem-solving skills, with attention to engineering quality and performance.
Ability to manage and prioritize tasks in a fast-paced environment.
Excellent communication skills in Hebrew and English.
Ownership mindset, curiosity, and a passion for building high-quality data systems.
This position is open to all candidates.
 
Job ID: 8441718

Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Engineer to join our Platform group in the Data Infrastructure team.
You'll work hands-on to design and deliver data pipelines, distributed storage, and streaming services that keep our data platform performant and reliable. As a senior individual contributor, you will lead complex projects within the team, raise the bar on engineering best practices, and mentor mid-level engineers while collaborating closely with product, DevOps, and analytics stakeholders.
About the Platform group
The Platform Group accelerates our productivity by providing developers with tools, frameworks, and infrastructure services. We design, build, and maintain critical production systems, ensuring our platform can scale reliably. We also introduce new engineering capabilities to enhance our development process. As part of this group, you'll help shape the technical foundation that supports our entire engineering team.
What you'll do:
Code & ship production-grade services, pipelines and data models that meet performance, reliability and security goals
Lead design and delivery of team-level projects from RFC through rollout and operational hand-off
Improve system observability, testing and incident response processes for the data stack
Partner with Staff Engineers and Tech Leads on architecture reviews and platform-wide standards
Mentor junior and mid-level engineers, fostering a culture of quality, ownership and continuous improvement
Stay current with evolving data-engineering tools and bring pragmatic innovations into the team.
Requirements:
5+ years of hands-on experience in backend or data engineering, including 2+ years at a senior level delivering production systems
Strong coding skills in Python, Kotlin, Java or Scala with emphasis on clean, testable, production-ready code
Proven track record designing, building and operating distributed data pipelines and storage (batch or streaming)
Deep experience with relational databases (PostgreSQL preferred) and working knowledge of at least one NoSQL or columnar/analytical store (e.g. SingleStore, ClickHouse, Redshift, BigQuery)
Solid hands-on experience with event-streaming platforms such as Apache Kafka (a consumer sketch follows this list)
Familiarity with data-orchestration frameworks such as Airflow
Comfortable with modern CI/CD, observability and infrastructure-as-code practices in a cloud environment (AWS, GCP or Azure)
Ability to break down complex problems, communicate trade-offs clearly, and collaborate effectively with engineers and product partners
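To ground the Kafka requirement above, a minimal consumer sketch using the kafka-python client. Topic, broker, and group id are placeholders, and the processing step is left as a comment.

```python
# Sketch: consume a JSON event stream with kafka-python. Topic, broker,
# and group id are placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",  # hypothetical topic
    bootstrap_servers=["broker:9092"],
    group_id="data-platform-demo",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Downstream: validate, enrich, and land into PostgreSQL or the lake.
    print(f"partition={message.partition} offset={message.offset} event={event}")
```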
Bonus Skills
Experience building data governance or security/compliance-aware data platforms
Familiarity with Kubernetes, Docker, and infrastructure-as-code tools
Experience with data quality frameworks, lineage, or metadata tooling.
This position is open to all candidates.
 
Job ID: 8437264

Posted: 23/12/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Platform Engineer to design, build, and scale our next-generation data platform - the backbone powering our AI-driven insights.
This role sits at the intersection of data engineering, infrastructure, and MLOps, owning the architecture and reliability of our data ecosystem end-to-end.
You'll work closely with data scientists, R&D teams, and analysts to create a robust platform that supports varying use cases, complex ingestion, and AI-powered analytics.
Responsibilities:
Architect and evolve a scalable, cloud-native data platform that supports batch, streaming, analytics, and AI/LLM workloads across R&D.
Help define and implement standards for how data is modeled, stored, governed, and accessed
Design and build data lakes and data warehouses
Develop and maintain complex, reliable, and observable data pipelines
Implement data quality, validation, and monitoring frameworks
Collaborate with ML and data science teams to connect AI/LLM workloads to production data pipelines, enabling RAG, embeddings, and feature engineering flows (an embedding sketch follows this list).
Manage and optimize relational and non-relational datastores (Postgres, Elasticsearch, vector DBs, graph DBs).
Build internal tools and self-service capabilities that enable teams to easily ingest, transform, and consume data.
Contribute to data observability, governance, documentation, and platform visibility
Drive strong engineering practices
Evaluate and integrate emerging technologies that enhance scalability, reliability, and AI integration in the platform.
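A hedged sketch of the RAG/embedding flow referenced above: encode documents with a sentence-transformers model and store the vectors in Postgres via the pgvector extension. The model choice, DSN, and table are assumptions; a dedicated vector database could stand in for Postgres here.

```python
# Sketch: document embeddings into Postgres/pgvector. Model, DSN, and
# table are assumptions; pgvector must be installed on the server.
import psycopg2
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # 384-dim, assumed available
docs = ["refund policy text ...", "shipping FAQ text ..."]
vectors = model.encode(docs)

conn = psycopg2.connect("dbname=analytics user=etl")  # hypothetical DSN
with conn, conn.cursor() as cur:
    cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
    cur.execute(
        "CREATE TABLE IF NOT EXISTS doc_embeddings ("
        "id SERIAL PRIMARY KEY, body TEXT, embedding vector(384));"
    )
    for body, vec in zip(docs, vectors):
        literal = "[" + ",".join(str(float(x)) for x in vec) + "]"
        cur.execute(
            "INSERT INTO doc_embeddings (body, embedding) VALUES (%s, %s)",
            (body, literal),  # pgvector accepts '[x,y,...]' text literals
        )
```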
Requirements:
7+ years experience building/operating data platforms
Strong Python programming skills
Proven experience with cloud data lakes and warehouses (Databricks, Snowflake, or equivalent).
Data orchestration experience (Airflow)
Solid understanding of AWS services
Proficiency with relational databases and search/analytics stores
Experience designing complex data pipelines, managing data quality, lineage, and observability in production.
Familiarity with CI/CD, GitOps, and IaC
Excellent understanding of distributed systems, data partitioning, and schema evolution.
Strong communication skills, ability to document and present technical designs clearly.
Advantages:
Experience with vector databases and graph databases
Experience integrating AI/LLM workloads into data pipelines (feature stores, retrieval pipelines, embeddings).
Familiarity with event streaming and CDC patterns.
Experience with data catalog, lineage, or governance tools
Knowledge of monitoring and alerting stacks
Hands-on experience with multi-source data product architectures.
This position is open to all candidates.
 
Job ID: 8470086