Data Engineer

Date: 11/01/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
We're seeking an experienced and skilled Data and AI Infra Tech Lead to join our Data Infrastructure team and drive the company's data capabilities at scale.
As the company is fast growing, the mission of the data and AI infrastructure team is to ensure the company can manage data at scale efficiently and seamlessly through robust and reliable data infrastructure.
Requirements:
7+ years of experience in data infra or backend engineering.
Strong knowledge of data services architecture and MLOps.
Experience with cloud-based data infrastructure, such as AWS, GCP, or Azure.
Deep experience with SQL and NoSQL databases.
Experience with Data Warehouse technologies such as Snowflake and Databricks.
Proficiency in backend programming languages like Python, NodeJS, or an equivalent.
Proven leadership experience, including mentoring engineers and driving technical initiatives.
Strong communication, collaboration, and stakeholder management skills.
This position is open to all candidates.
 
Job ID: 8495880
Date: 11/01/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
As a Senior Data Engineer, you will play a key role in shaping and driving our analytics data pipelines and solutions to empower business insights and decisions. Collaborating with a variety of stakeholders, you will design, develop, and optimize scalable, high-performance data analytics infrastructures using modern tools and technologies. Your work will ensure data is accurate, timely, and actionable for critical decision-making.

Key Responsibilities:

Lead the design, development, and maintenance of robust data pipelines and ETL processes, handling diverse structured and unstructured data sources.
Collaborate with data analysts, data scientists, product engineers and product managers to deliver impactful data solutions.
Architect and maintain the infrastructure for ingesting, processing, and managing data in the analytics data warehouse.
Develop and optimize analytics-oriented data models to support business decision-making.
Champion data quality, consistency, and governance across the analytics layer.
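The pipeline responsibilities above reduce, at their simplest, to an extract-transform-load loop. A minimal sketch, with hypothetical field names and an in-memory stand-in for the warehouse (not this company's actual pipeline):

```python
import json

def extract(raw_lines):
    """Parse raw JSON event lines, skipping malformed records (diverse sources)."""
    rows = []
    for line in raw_lines:
        try:
            rows.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # bad input is dropped, not fatal
    return rows

def transform(rows):
    """Normalize to the analytics model: keep the key, derive a revenue metric."""
    return [{"user_id": r["user_id"], "revenue": round(r["price"] * r["qty"], 2)}
            for r in rows]

def load(rows, warehouse):
    """Append to an in-memory 'warehouse' table standing in for a real target."""
    warehouse.setdefault("fact_sales", []).extend(rows)
    return len(rows)

raw = ['{"user_id": 1, "price": 9.99, "qty": 2}', 'not-json']
wh = {}
loaded = load(transform(extract(raw)), wh)
```

A real pipeline would swap the in-memory dict for a warehouse writer and run under an orchestrator, but the extract/transform/load separation stays the same.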
Requirements:
5+ years of experience as a Data Engineer or in a similar role.
Expertise in SQL and proficiency in Python for data engineering tasks.
Proven experience designing and implementing analytics-focused data models and warehouses.
Hands-on experience with data pipelines and ETL/ELT frameworks (e.g., Airflow, Luigi, AWS Glue, dbt).
Strong experience with cloud data services (e.g., AWS, GCP, Azure).
A deep passion for data and a strong analytical mindset with attention to detail.
This position is open to all candidates.
 
Job ID: 8495847
Date: 08/01/2026
Location: Herzliya
Job Type: Full Time
As a Data Engineer, you will own the architecture and optimization of large-scale ETL processes that transform raw heavy-duty vehicle telemetry into production-grade intelligence. You will operate at the intersection of Big Data and AI, building scalable pipelines, enforcing data quality standards, and managing cost-efficiency for a system processing billions of time-series records. You will be a technical owner, collaborating directly with Data Scientists to ensure our fleet intelligence models run reliably in production.
What You'll Do
Architect and build robust ETLs and scalable data pipelines on Databricks and AWS.
Optimize high-throughput ingestion workflows for billions of time-series records, ensuring low latency and data integrity.
Engineer data validation frameworks and automated monitoring to proactively detect anomalies before they impact models.
Drive cost-efficiency by tuning Spark jobs and managing compute resources in a high-volume environment.
Transform raw IoT/telemetry signals into structured, enriched Feature Stores ready for Machine Learning production.
Define best practices for data engineering, CI/CD for data, and lakehouse architecture across the organization.
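The "detect anomalies before they impact models" item above amounts to enforcing invariants on the incoming telemetry. A minimal sketch, assuming two illustrative checks (plausible speed range, maximum timestamp gap) with made-up thresholds:

```python
def validate_telemetry(records, max_gap_s=60, speed_range=(0.0, 160.0)):
    """Flag anomalies in time-ordered vehicle telemetry before it reaches models.

    Checks two simple invariants: a plausible speed range and a maximum gap
    between consecutive timestamps. Thresholds here are illustrative only.
    """
    anomalies = []
    prev_ts = None
    for i, rec in enumerate(records):
        lo, hi = speed_range
        if not (lo <= rec["speed_kph"] <= hi):
            anomalies.append((i, "speed_out_of_range"))
        if prev_ts is not None and rec["ts"] - prev_ts > max_gap_s:
            anomalies.append((i, "timestamp_gap"))
        prev_ts = rec["ts"]
    return anomalies

records = [
    {"ts": 0, "speed_kph": 80.0},
    {"ts": 30, "speed_kph": 300.0},   # implausible speed
    {"ts": 200, "speed_kph": 75.0},   # 170 s gap since previous record
]
issues = validate_telemetry(records)
```

At billions of rows these checks would run as Spark jobs over partitions rather than a Python loop, but the invariants are the same.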
Requirements:
Production Experience: 3+ years in Data Engineering with strong proficiency in Python, SQL, and PySpark.
Big Data Architecture: Proven track record working with distributed processing frameworks (Spark, Delta Lake) and cloud infrastructure (AWS preferred).
Scale: Experience handling high-volume datasets (TB scale or billions of rows); familiarity with time-series or IoT data is a strong advantage.
Engineering Rigor: Deep understanding of data structures, orchestration (Databricks Workflows), and software engineering best practices (Git, CI/CD).
Problem Solving: Ability to diagnose complex performance bottlenecks in distributed systems and implement cost-effective solutions.
Ownership: A self-starter mindset with the ability to take a vague requirement and deliver a deployed, production-ready pipeline.
This position is open to all candidates.
 
Job ID: 8494087
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Backend Data Engineer
As part of the role you will have the opportunity to:
Build data infrastructure features end to end: scalable data processing, database interaction, and integration with CI/CD.
Take part in expanding our core data platform solutions, build new pipelines from scratch that ingest and process data at scale.
Work across a rich stack of technologies from Apache Kafka, Apache Storm, NoSQL, and relational databases.
Analyze and optimize performance, scalability, and stability of our product environments.
Work closely with the data-science team to implement production grade pipelines based on AI research.
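The streaming-vs-batch distinction named in the requirements often comes down to windowed aggregation. A toy sketch of tumbling-window counts (a streaming topology in Storm or Spark keeps this state incrementally; here the whole "stream" fits in memory, as a batch job would see it):

```python
from collections import defaultdict

def window_counts(events, window_s=60):
    """Aggregate (timestamp, payload) events into fixed tumbling windows."""
    counts = defaultdict(int)
    for ts, _payload in events:
        # Bucket each event by the start of its window.
        counts[ts // window_s * window_s] += 1
    return dict(counts)

events = [(5, "a"), (59, "b"), (61, "c"), (130, "d")]
per_window = window_counts(events)
```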
Requirements:
4+ years of experience as a Data Engineer with backend development
Proficiency in Java and Spring - Must
Hands-on experience developing and maintaining distributed data processing pipelines such as Apache Storm, Kafka, Spark, or Airflow
Familiarity with design principles such as Data Modelling, Distributed Processing, Streaming vs. Batch processing
Proven experience in leading design and system architecture of complex features
Experienced in database optimization tasks such as: sharding, rollup, optimal indexes etc.
Familiarity with cloud platforms
Willing to work in a fast, high growth start-up environment and be able to switch between devops/programming/debugging tasks
Self-management skills and ability to work well both independently and as part of a team, sense of ownership and of urgency
Good communication skills in English.
This position is open to all candidates.
 
Job ID: 8490318
Location: Petah Tikva
Job Type: Full Time
We are looking for a Senior Data Engineer to build and operate a multi-tenant analytics platform on AWS + Kubernetes (EKS), delivering streaming and batch pipelines via GitOps as a Platform-as-a-Service (PaaS).
Responsibilities:
Ingestion pipelines: Build and operate Flink / Spark streaming and batch jobs ingesting from Kafka, S3, APIs, and RDBMS into OpenSearch and other data stores.
Platform delivery: Provide reusable, multi-tenant pipelines as a self-service PaaS.
Workflow orchestration: Manage pipeline runs using Argo Workflows.
GitOps delivery: Deploy and operate pipelines via ArgoCD across environments.
IaC & AWS: Provision infrastructure with Terraform and secure access using IAM / IRSA.
Reliability: Own monitoring, stability, and troubleshooting of production pipelines.
Collaboration: Work with product, analytics, and infra on schemas and data contracts.
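"Reusable, multi-tenant pipelines as a self-service PaaS" usually means expanding one pipeline template per tenant. A minimal sketch with hypothetical field names; a real platform would emit Argo Workflow manifests rather than plain dicts:

```python
def render_pipeline_specs(template, tenants):
    """Expand one reusable pipeline template into per-tenant specs."""
    specs = []
    for t in tenants:
        spec = dict(template)  # copy shared settings (engine, resources, ...)
        spec["source_topic"] = template["source_topic"].format(tenant=t)
        spec["target_index"] = template["target_index"].format(tenant=t)
        specs.append(spec)
    return specs

template = {
    "source_topic": "events.{tenant}",       # Kafka topic, hypothetical naming
    "target_index": "analytics-{tenant}",    # OpenSearch index, hypothetical
    "engine": "flink",
}
specs = render_pipeline_specs(template, ["acme", "globex"])
```

In a GitOps setup, the rendered specs would be committed and synced by ArgoCD rather than applied directly.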
Requirements:
Software skills: Senior-level, hands-on data engineering experience building and operating production systems with ownership of reliability and scale.
Processing: Strong experience with Flink and Spark (streaming + batch).
Data sources & sinks: Experience integrating with Kafka, S3, REST APIs, and RDBMS, and publishing to OpenSearch / Elasticsearch, data warehouses, or NoSQL databases.
Big Data: Familiarity with big-data systems; Iceberg / PyIceberg a plus.
Cloud & DevOps: Hands-on experience with EKS, RBAC, ArgoCD, and Terraform for infrastructure and delivery workflows.
Datastores: Hands-on experience with OpenSearch / Elasticsearch including indexing strategies, templates/mappings, and operational troubleshooting.
AI tools: Experience with AI-assisted development tools (e.g., Cursor, GitHub Copilot, or similar).
This position is open to all candidates.
 
Job ID: 8490223
Date: 06/01/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Engineer.
As a Senior Data Engineer you will help us design and build a flexible and scalable system that allows our business to move fast and innovate. You will be expected to show ownership and responsibility for the code you write, but it doesn't stop there:
you are encouraged to think big and help out in other areas as well.
Key focuses:
Designing and writing code that is critical for business growth
Mastering scalability and enterprise-grade SaaS product implementation
Sense of ownership - leading design for new products and initiatives as well as integrating with currently implemented best-practices
Review your peer's design and code
Work closely with product managers, peer engineers, and business stakeholders
Requirements:
5+ years of experience as a hands-on software engineer (Python, TypeScript, Node.js)
Hands-on experience managing major cloud vendors' infrastructure (AWS, GCP, Azure)
Hands-on experience designing and implementing data pipelines, distributed systems, and RESTful APIs
Proficiency with SQL, modeling and working with relational and non-relational databases, and pushing them past their limits
Experience working with CI/CD systems, Docker and orchestration tools such as Kubernetes
Enjoy communicating and collaborating, sharing your ideas and being open to honest feedback
The ability to lead new features from design to implementation, taking into consideration topics such as performance, scalability, and impact on the greater system
This position is open to all candidates.
 
Job ID: 8490187
Date: 06/01/2026
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We are seeking a highly motivated and experienced BI & Data Engineer to join our fast-growing Data team, reporting to our Development Team Leader. You will support the team on all data, pipelines, and reports, helping turn raw data into business insights, and will manage and design BI solutions, including ETL processes, data modeling, and reporting. Our BI & Data Engineer will also enjoy our future data stack, including Airflow, dbt, Kafka streaming, AWS/Azure, Python, advanced ETL tools, and more.
Responsibilities:
Gathering requirements from internal customers and designing and planning BI solutions.
Develop and maintain ETL/ELT pipelines using Airflow for orchestration and DBT for transformation
Design and optimize data models, ensuring performance, scalability, and cost efficiency.
Collaborate with BI developers, analysts, AI agents, and product teams to deliver reliable datasets for reporting and advanced analytics.
Development in various BI and big data tools according to R&D methodologies and best practices
Maintain and manage production platforms.
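The Airflow-for-orchestration, dbt-for-transformation split above is, at its core, running named tasks in dependency order. A toy scheduler sketch (Airflow resolves the same DAG ordering before running its operators; task names here are hypothetical):

```python
def run_in_order(tasks, deps):
    """Execute named tasks respecting dependencies, as an orchestrator would.

    `tasks` maps name -> callable; `deps` maps name -> set of upstream names.
    """
    done, order = set(), []
    while len(done) < len(tasks):
        ready = [t for t in tasks if t not in done and deps.get(t, set()) <= done]
        if not ready:
            raise ValueError("cycle detected")
        for t in sorted(ready):
            tasks[t]()          # run the task body
            done.add(t)
            order.append(t)
    return order

log = []
tasks = {name: (lambda n=name: log.append(n)) for name in ["extract", "transform", "report"]}
order = run_in_order(tasks, {"transform": {"extract"}, "report": {"transform"}})
```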
Requirements:
5+ years experience working as a BI Developer or as a Data Engineer
Highly skilled with SQL and building ETL workflows - Mandatory
2+ years of experience in Python - Mandatory.
Experience developing in ETL tools like SSIS or Informatica - Mandatory
2+ years of experience with Airflow & dbt - Mandatory
Experience developing data integration processes, DWH, and data models.
Experience with columnar DB and working with Pipelines and streaming data (SingleStore/Snowflake) - advantage.
Experience working with BI reporting tools (Power BI, Tableau, SSRS, or other)
Experience with cloud-based products (AWS, Azure) - advantage.
Enhance operational efficiency and product innovation using AI (Copilot/Cursor)
Preferred Qualifications:
Familiarity with CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes)
Experience with Git (or other source control)
Familiarity with AI Agents and Models to improve reliability and Data Integrity
Experience in Kafka is an advantage.
This position is open to all candidates.
 
Job ID: 8490189
Location: Ramat Gan
Job Type: Full Time
We're looking for a Data Engineer to join our core Data group at headquarters. In this role, you'll design and build central data platform infrastructure, driving the company's data strategy forward.
You'll collaborate closely with data professionals and R&D teams, taking ownership of key systems and having a direct impact on the success of our data ecosystem.
What You'll Do:
Be a member of the team responsible for the design and creation of the organizations data platform and data pipelines.
Lead GenAI developments in data related areas
In charge of development and maintenance of data infrastructure hosting data processes consolidating varied data types from multiple origins
Development of cloud infrastructure for data engineering and BI use
Build core MLOps infrastructure and processes for ML and AI models
Requirements:
3+ years of experience as a Data Engineer or Backend Engineer, with proven experience building data pipelines and ETL processes from scratch.
Strong Python programming skills.
Hands-on experience with databases (SQL and NoSQL), preferably Google BigQuery.
Experience working in a cloud environment (GCP preferred) and familiarity with data streaming tools.
Proficiency with Kafka.
Knowledge of data analysis concepts and tools (e.g., Pandas).
B.Sc. in Computer Science or related field (Master's degree is a plus).
Experience with Airflow and dbt - advantage.
Experience developing AI agents - advantage.
MLOps background - advantage.
Who You Are:
A strong communicator who can work collaboratively with cross-functional teams.
Curious, proactive, and passionate about learning and adopting new technologies.
Detail-oriented with strong analytical and problem-solving skills.
Comfortable in a dynamic, fast-paced environment.
A team player who can also take ownership and work independently.
This position is open to all candidates.
 
Job ID: 8487774
Location: Ramat Gan
Job Type: Full Time
We are looking for an experienced, proactive, and technically skilled Senior BI Engineer to take ownership of data products and business knowledge across our multi-brand ecosystem. You will be instrumental in designing the data architecture that drives global innovation and growth across the entire company.
What You Will Own and Deliver:
As a Senior BI Engineer, you will operate as a strategic partner, deeply leveraging our value of being obsessed with Data.
Domain Data Ownership: Become the go-to expert for high-impact domains like B2C Marketing and Supply, deeply understanding their metrics, business logic, and data needs to build comprehensive, accurate, and consolidated data models across all Travelier brands.
Strategic Architecture & Modeling: Design the DWH to be AI-first, accessible by agents. Design, develop, and maintain advanced, scalable data models and ETL pipelines within our Google Cloud Platform (GCP) environment, focusing on performance, reliability, and cost optimization for petabyte-scale data.
Modern Data Stack Implementation: Drive the adoption and best practices of our modern data stack, specifically leveraging dbt for data transformation, testing, and documentation, and Airflow for robust pipeline orchestration.
Executive-Level Product Development: Partner directly with executive management and business leaders across different global brands to translate complex, high-impact business questions into insightful, interactive BI products (semantic layers, self-service data models).
Technical Leadership & Mentorship: Act as a technical leader, mentoring junior team members on data modeling, SQL optimization, and data governance.
Data Governance & Quality: Enforce data governance standards, ensuring data quality, accuracy, and reliability across all owned domains in a consolidated environment.
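The dimensional-modeling work described above centers on splitting raw rows into dimension and fact tables with surrogate keys. A minimal star-schema sketch; column names are illustrative, not the company's actual model:

```python
def build_star(raw_orders):
    """Split raw order rows into a customer dimension and an order fact table.

    Surrogate keys are assigned per distinct customer on first sight, so
    repeat customers share one dimension row.
    """
    dim_customer, facts = {}, []
    for row in raw_orders:
        # setdefault assigns the next surrogate key only for new customers.
        key = dim_customer.setdefault(row["customer"], len(dim_customer) + 1)
        facts.append({"customer_sk": key, "amount": row["amount"]})
    return dim_customer, facts

dim, facts = build_star([
    {"customer": "alice", "amount": 10},
    {"customer": "bob", "amount": 7},
    {"customer": "alice", "amount": 3},
])
```

In a warehouse this is the job of a dbt model plus a key-generation strategy, but the shape of the output is the same.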
Requirements:
5+ years of progressive, proven experience as a BI Engineer, Data Engineer, or similar role, with at least 2 years in a senior capacity driving technical strategy and standards.
Expert-level SQL skills, including performance tuning, complex window functions, and advanced dimensional data modeling.
Mandatory Experience with Modern Cloud Data Warehouses such as Google BigQuery (preferred), Snowflake, or Redshift.
Demonstrated experience in a modern data stack environment, specifically using:
dbt: for version-controlled data transformation.
Airflow (or similar orchestration tool like Cloud Composer): for scheduling and monitoring complex ETL/ELT pipelines.
Proven ability to own and drive data knowledge in at least one key business domain, such as B2C Marketing (Customer Acquisition, LTV, ROAS).
A "Figure It Out" mentality - a self-starter who uses critical thinking, acts quickly, and is not afraid to tackle complex, ambiguous problems.
Advantages:
Hands-on experience with Python for data scripting, data quality checks, and API integrations.
Familiarity with CI/CD processes for data pipelines.
Experience in the e-commerce, marketplace, or travel tech industries.
This position is open to all candidates.
 
Job ID: 8487760
Location: Petah Tikva
Job Type: Full Time
We're looking for a highly skilled and motivated Data Engineer to join the Resolve (formerly DevOcean) team.
In this role, you'll be responsible for designing, building, and optimizing the data infrastructure that powers our SaaS platform.
You'll play a key role in shaping a cost-efficient and scalable data architecture while building robust data pipelines that serve analytics, search, and reporting needs across the organization.
You'll work closely with our backend, product, and analytics teams to ensure our data layer remains fast, reliable, and future-proof. This is an opportunity to influence the evolution of our data strategy and help scale a cybersecurity platform that processes millions of findings across complex customer environments.
Roles and Responsibilities:
Design, implement, and maintain data pipelines to support ingestion, transformation, and analytics workloads.
Collaborate with engineers to optimize MongoDB data models and identify opportunities for offloading workloads to analytical stores (ClickHouse, DuckDB, etc.).
Build scalable ETL/ELT workflows to consolidate and enrich data from multiple sources.
Develop data services and APIs that enable efficient querying and aggregation across large multi-tenant datasets.
Partner with backend and product teams to define data retention, indexing, and partitioning strategies to reduce cost and improve performance.
Ensure data quality, consistency, and observability through validation, monitoring, and automated testing.
Contribute to architectural discussions and help define the long-term data platform vision.
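The retention and partitioning strategies mentioned above are often the biggest cost lever: partitioning by event date keeps hot queries cheap, and expired partitions can be dropped wholesale instead of deleting row by row. A minimal sketch with hypothetical record fields:

```python
from datetime import date, timedelta

def partition_and_expire(records, today, retention_days=90):
    """Assign records to daily partitions and list partitions past retention."""
    partitions = {}
    for rec in records:
        partitions.setdefault(rec["event_date"], []).append(rec)
    cutoff = today - timedelta(days=retention_days)
    # Whole partitions older than the cutoff are candidates for dropping.
    expired = sorted(p for p in partitions if p < cutoff)
    return partitions, expired

records = [
    {"event_date": date(2025, 1, 1), "finding": "cve-x"},
    {"event_date": date(2025, 6, 1), "finding": "cve-y"},
]
parts, expired = partition_and_expire(records, today=date(2025, 6, 10))
```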
Requirements:
8+ years of experience as a Data Engineer or Backend Engineer working in a SaaS or data-intensive environment.
Strong proficiency in Python and experience with data processing frameworks (e.g., Pandas, PySpark, Airflow, or equivalent).
Deep understanding of data modeling and query optimization in NoSQL and SQL databases (MongoDB, PostgreSQL, etc.).
Hands-on experience building ETL/ELT pipelines and integrating multiple data sources.
Familiarity with open table format (OTF) technologies and analytical databases such as ClickHouse and DuckDB, and their role in cost-efficient analytics.
Experience working in cloud environments (AWS preferred) and using native data services (e.g., Lambda, S3, Glue, Athena).
Strong understanding of data performance, storage optimization, and scalability best practices.
Excellent problem-solving skills and a proactive approach to performance and cost optimization.
Strong collaboration and communication abilities within cross-functional teams.
Passion for continuous learning and exploring modern data architectures.
Nice to Have:
Experience with streaming or CDC pipelines (e.g., Kafka, Debezium).
Familiarity with cloud security best practices and data governance.
Exposure to multi-tenant SaaS architectures and large-scale telemetry data.
This position is open to all candidates.
 
Job ID: 8486352
Location: Ra'anana
Job Type: Full Time
We are looking for a skilled Data Center Engineer to join our Lab team. The ideal candidate will be responsible for maintaining, monitoring, and optimizing our data center environments to ensure high availability, resiliency, and performance. This role involves hands-on technical work, troubleshooting hardware and network issues, and collaborating with cross-functional teams to support mission-critical systems. You will report to the Lab Manager.
Key Responsibilities
Data Center Operations
Perform installation, configuration, and maintenance of servers, network equipment, storage systems, and cabling.
Monitor facility systems, including power, cooling, fire suppression, and physical security.
Conduct routine inspections to ensure environmental and operational stability.
Manage data center capacity planning (space, power, cooling).
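Capacity planning for power is largely arithmetic. A minimal sketch, assuming the common convention of loading a circuit to at most 80% of its breaker rating; the figures here are illustrative, not a site standard:

```python
def rack_headroom(devices_w, breaker_a, volts=230, derate=0.8):
    """Estimate remaining power headroom (in watts) on a single rack feed.

    usable = breaker rating x voltage x derating factor; headroom is what
    remains after the installed devices' nameplate draw.
    """
    usable_w = breaker_a * volts * derate
    used_w = sum(devices_w)
    return usable_w - used_w

# Three devices on a 16 A / 230 V feed.
headroom = rack_headroom([450, 450, 700], breaker_a=16, volts=230)
```

The same calculation, repeated per rack and per feed (and mirrored for cooling load), is what DCIM tools automate.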
Hardware & Infrastructure Support
Troubleshoot and resolve hardware failures for servers, routers, switches, and other equipment.
Coordinate with vendors for RMA processes and hardware lifecycle management.
Perform firmware upgrades and hardware refresh activities.
Network & Systems Support
Support deployment and maintenance of network infrastructure (LAN/WAN, routing, switching).
Assist with installation and configuration of OS and virtualization technologies (Linux, VMware, Hyper-V, etc.).
Work with IT/security teams to ensure compliance and secure configuration of data center assets.
Monitoring & Incident Response
Monitor systems and respond to alerts to ensure uptime and SLA adherence.
Maintain accurate documentation of incidents, configurations, and procedures.
Process & Documentation
Follow and improve standard operating procedures.
Maintain asset inventory, rack diagrams, and documentation.
Support audits and compliance requirements.
Requirements:
Must-Have
2-5+ years of experience in data center operations or infrastructure engineering.
Strong understanding of server hardware, cabling standards, and rack/stack procedures.
Experience with network fundamentals (TCP/IP, VLANs, routing & switching).
Ability to lift and move equipment (up to ~25 kg / 50 lbs if required).
Knowledge of monitoring tools and ticketing systems.
Excellent troubleshooting and problem-solving skills.
Nice-to-Have
Experience with virtualization platforms (VMware, KVM, Hyper-V).
Knowledge of Linux/Windows system administration.
Familiarity with Data Center Infrastructure Management (DCIM) tools.
Experience in cloud or hybrid environments.
Certifications: CompTIA Server+, Network+, CCNA, or equivalent.
Soft Skills
Strong communication and documentation abilities.
Ability to work independently and collaboratively.
High attention to detail and operational discipline.
Ability to perform in high-pressure, mission-critical environments.
This position is open to all candidates.
 
Job ID: 8485765
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced Data Engineer to join our DataWarehouse team in TLV.
In this role, you will play a pivotal role in the Data Platform organization, leading the design, development, and maintenance of our data warehouse. In your day-to-day, you'll work on data models and Backend BI solutions that empower stakeholders across the company and contribute to informed decision-making processes, all while leveraging your extensive experience in business intelligence.
This is an excellent opportunity to be part of establishing our company's state-of-the-art data stack, implementing cutting-edge technologies in a cloud environment.
In this role youll
Lead the design and development of scalable and efficient data warehouse and BI solutions that align with organizational goals and requirements
Utilize advanced data modeling techniques to create robust data structures supporting reporting and analytics needs
Implement ETL/ELT processes to assist in the extraction, transformation, and loading of data from various sources into the semantic layer
Develop processes to enforce schema evaluation, cover anomaly detection, and monitor data completeness and freshness
Identify and address performance bottlenecks within our data warehouse, optimize queries and processes, and enhance data retrieval efficiency
Implement best practices for data warehouse and database performance tuning
Conduct thorough testing of data applications and implement robust validation processes
Collaborate with Data Infra Engineers, Developers, ML Platform Engineers, Data Scientists, Analysts, and Product Managers.
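The data freshness monitoring mentioned above boils down to comparing each table's last successful load against an SLA. A minimal sketch; the table names and thresholds are hypothetical:

```python
def check_freshness(tables, now, sla_hours=24):
    """Return names of tables whose last successful load breaches the SLA.

    `tables` maps table name -> last-loaded timestamp (epoch seconds).
    """
    stale = []
    for name, last_loaded in tables.items():
        age_h = (now - last_loaded) / 3600
        if age_h > sla_hours:
            stale.append(name)
    return sorted(stale)

now = 1_000_000_000
tables = {
    "fact_orders": now - 3_600,    # loaded 1 hour ago: fresh
    "dim_users": now - 200_000,    # loaded ~55 hours ago: stale
}
stale = check_freshness(tables, now)
```

In practice the timestamps would come from warehouse load metadata (or dbt source freshness checks) and the result would feed an alerting channel.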
Requirements:
3+ years of experience as a BI Engineer or Data Engineer
Proficiency in data modeling, ELT development, and DWH methodologies
SQL expertise and experience working with Snowflake or similar technologies
Prior experience working with DBT
Experience with Python and software development, an advantage
Excellent communication and collaboration skills
Ability to work in an office environment a minimum of 3 days a week
Enthusiasm about learning and adapting to the exciting world of AI - a commitment to exploring this field is a fundamental part of our culture.
This position is open to all candidates.
 
Job ID: 8483027
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced and passionate Staff Data Engineer to join our Data Platform group in TLV as a Tech Lead. As the group's Tech Lead, you'll shape and implement the technical vision and architecture while staying hands-on across three specialized teams: Data Engineering Infra, Machine Learning Platform, and Data Warehouse Engineering, forming the backbone of our company's data ecosystem.
The groups mission is to build a state-of-the-art Data Platform that drives our company toward becoming the most precise and efficient insurance company on the planet. By embracing Data Mesh principles, we create tools that empower teams to own their data while leveraging a robust, self-serve data infrastructure. This approach enables Data Scientists, Analysts, Backend Engineers, and other stakeholders to seamlessly access, analyze, and innovate with reliable, well-modeled, and queryable data, at scale.
In this role youll :
Technically lead the group by shaping the architecture, guiding design decisions, and ensuring the technical excellence of the Data Platforms three teams
Design and implement data solutions that address both applicative needs and data analysis requirements, creating scalable and efficient access to actionable insights
Drive initiatives in Data Engineering Infra, including building robust ingestion layers, managing streaming ETLs, and guaranteeing data quality, compliance, and platform performance
Develop and maintain the Data Warehouse, integrating data from various sources for optimized querying, analysis, and persistence, supporting informed decision-making
Leverage data modeling and transformations to structure, cleanse, and integrate data, enabling efficient retrieval and strategic insights
Build and enhance the Machine Learning Platform, delivering infrastructure and tools that streamline the work of Data Scientists, enabling them to focus on developing models while benefiting from automation for production deployment, maintenance, and improvements. Support cutting-edge use cases like feature stores, real-time models, point-in-time (PIT) data retrieval, and telematics-based solutions
Collaborate closely with other Staff Engineers across our company to align on cross-organizational initiatives and technical strategies
Work seamlessly with Data Engineers, Data Scientists, Analysts, Backend Engineers, and Product Managers to deliver impactful solutions
Share knowledge, mentor team members, and champion engineering standards and technical excellence across the organization.
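The point-in-time (PIT) retrieval mentioned above is the lookup a feature store performs to avoid training-serving leakage: for an event at time T, serve the latest feature value known at or before T, never a later one. A minimal sketch over a sorted (timestamp, value) history:

```python
import bisect

def pit_lookup(history, event_ts):
    """Point-in-time retrieval: latest feature value at or before event_ts.

    `history` is a list of (ts, value) pairs sorted by ts. Returns None if
    no value existed yet, which prevents future data leaking into features.
    """
    timestamps = [ts for ts, _ in history]
    i = bisect.bisect_right(timestamps, event_ts)
    return history[i - 1][1] if i else None

# Hypothetical feature history for one entity.
history = [(100, 0.2), (200, 0.5), (300, 0.9)]
value = pit_lookup(history, 250)
```

Production feature stores implement the same semantics as an as-of join across many entities at once; the per-key logic is exactly this bisect.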
Requirements:
8+ years of experience in data-related roles such as Data Engineer, Data Infrastructure Engineer, BI Engineer, or Machine Learning Platform Engineer, with significant experience in at least two of these areas
A B.Sc. in Computer Science or a related technical field (or equivalent experience)
Extensive expertise in designing and implementing Data Lakes and Data Warehouses, including strong skills in data modeling and building scalable storage solutions
Proven experience in building large-scale data infrastructures, including both batch processing and streaming pipelines
A deep understanding of Machine Learning infrastructure, including tools and frameworks that enable Data Scientists to efficiently develop, deploy, and maintain models in production, an advantage
Proficiency in Python, Pulumi/Terraform, Apache Spark, AWS, Kubernetes (K8s), and Kafka for building scalable, reliable, and high-performing data solutions
Strong knowledge of databases, including SQL (schema design, query optimization) and NoSQL, with a solid understanding of their use cases
Ability to work in an office environment a minimum of 3 days a week
Enthusiasm about learning and adapting to the exciting world of AI - a commitment to exploring this field is a fundamental part of our culture.
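The "schema design, query optimization" requirement above is easy to demonstrate concretely: the same query can run as a full scan or an index lookup depending on schema choices. A small sketch using SQLite for portability (table and column names are illustrative):

```python
import sqlite3

# Same query, with and without a covering index on the filter column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT, kind TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i % 100, f"2025-01-{i % 28 + 1:02d}", "click") for i in range(1000)],
)

query = "SELECT ts, kind FROM events WHERE user_id = 42"

# EXPLAIN QUERY PLAN rows end with a human-readable "detail" column.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
conn.execute("CREATE INDEX idx_events_user ON events (user_id, ts, kind)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before[0][-1])  # a full table scan of events
print(after[0][-1])   # a search using the covering index
```

The index covers every column the query touches, so SQLite answers it without reading the base table at all; the same reasoning drives clustering and sort-key choices in warehouse engines.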
This position is open to all candidates.
 
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an experienced and visionary Data Platform Engineer to help design, build and scale our BI platform from the ground up.
In this role, you will be responsible for building the foundations of our data analytics platform - enabling scalable data pipelines and robust data modeling to support real-time and batch analytics, ML models and business insights that serve both business intelligence and product needs.
You will be part of the R&D team, collaborating closely with engineers, analysts, and product managers to deliver a modern data architecture that supports internal dashboards and future-facing operational analytics.
If you enjoy architecting from scratch, turning raw data into powerful insights, and owning the full data lifecycle - this role is for you!
Responsibilities
Take full ownership of the design and implementation of a scalable and efficient BI data infrastructure, ensuring high performance, reliability and security.
Lead the design and architecture of the data platform - from integration to transformation, modeling, storage, and access.
Build and maintain ETL/ELT pipelines, batch and real-time, to support analytics, reporting, and product integrations.
Establish and enforce best practices for data quality, lineage, observability, and governance to ensure accuracy and consistency.
Integrate modern tools and frameworks such as Airflow, dbt, Databricks, Power BI, and streaming platforms.
Collaborate cross-functionally with product, engineering, and analytics teams to translate business needs into data infrastructure.
Promote a data-driven culture: advocate for data-driven decision-making across the company by empowering stakeholders with reliable, self-service data access.
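The pipeline and data-quality responsibilities above follow a common shape: extract, validate and quarantine bad records rather than silently dropping them, then load only rows that pass a quality gate. A minimal batch sketch with stdlib tools only (field names are hypothetical):

```python
import csv
import io
import sqlite3

# Raw input with one bad record (missing amount).
RAW = "order_id,amount\n1,10.5\n2,\n3,7.0\n"

def extract(raw):
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    clean, rejected = [], []
    for row in rows:
        try:
            clean.append({"order_id": int(row["order_id"]),
                          "amount": float(row["amount"])})
        except (ValueError, TypeError):
            rejected.append(row)  # quarantined for inspection, not dropped
    return clean, rejected

def load(conn, rows):
    conn.execute("CREATE TABLE IF NOT EXISTS orders "
                 "(order_id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (:order_id, :amount)", rows)

conn = sqlite3.connect(":memory:")
clean, rejected = transform(extract(RAW))
assert len(rejected) / (len(clean) + len(rejected)) < 0.5  # quality gate
load(conn, clean)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # (2, 17.5)
```

Orchestrators such as Airflow schedule and retry steps like these; tools such as dbt express the transform step declaratively in SQL, but the gate-and-quarantine pattern is the same.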
Requirements:
5+ years of hands-on experience in data engineering and in building data products for analytics and business intelligence.
Strong hands-on experience with ETL orchestration tools (e.g., Apache Airflow) and data lakehouses (e.g., Snowflake/BigQuery/Databricks).
Vast knowledge in both batch processing and streaming processing (e.g., Kafka, Spark Streaming).
Proficiency in Python, SQL, and cloud data engineering environments (AWS, Azure, or GCP).
Familiarity with data visualization tools (Power BI, Looker, or similar).
BSc in Computer Science or a related field from a leading university
Nice to have
Experience working in early-stage projects, building data systems from scratch.
Background in building operational analytics pipelines, in which analytical data feeds real-time product business logic.
Hands-on experience with ML model training pipelines.
Experience in cost optimization in modern cloud environments.
Knowledge of data governance principles, compliance, and security best practices.
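The stream-processing requirement above usually boils down to windowed aggregation: assigning each event to a fixed time bucket and aggregating per bucket. Engines like Kafka Streams or Spark Structured Streaming do this at scale and with fault tolerance; the underlying idea fits in a few lines (event names are made up for illustration):

```python
from collections import Counter
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)

def window_start(ts, window=WINDOW):
    """Floor a timestamp to the start of its tumbling window."""
    epoch = datetime(1970, 1, 1)
    return epoch + ((ts - epoch) // window) * window

counts = Counter()
events = [
    (datetime(2025, 1, 1, 12, 1), "click"),
    (datetime(2025, 1, 1, 12, 4), "click"),
    (datetime(2025, 1, 1, 12, 7), "view"),
]
for ts, kind in events:  # in a real system, consumed from a Kafka topic
    counts[(window_start(ts), kind)] += 1

print(counts[(datetime(2025, 1, 1, 12, 0), "click")])  # 2
```

What the real engines add on top is exactly the hard part: late-arriving events, watermarks, and state that survives restarts.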
This position is open to all candidates.
 
01/01/2026
Location: Petah Tikva
Job Type: Full Time
We are seeking a Senior Backend & Data Engineer to join our SaaS Data Platform team.
This role offers a unique opportunity to design and build large-scale, high-performance data platforms and backend services that power our cloud-based products.
You will own features end to end, from architecture and design through development and production deployment, while working closely with Data Science, Machine Learning, DevOps, and Product teams.
Key Responsibilities:
Design, develop, and maintain scalable, secure backend services and data platforms on AWS
Build and operate batch and streaming ETL/ELT pipelines using Spark, Glue, Athena, Iceberg, Lambda, and EKS
Develop backend components and data processing workflows in a cloud-native environment
Optimize performance, reliability, and observability of data pipelines and backend services
Collaborate with ML, backend, DevOps, and product teams to deliver data-driven solutions
Lead best practices in code quality, architecture, and technical excellence
Ensure security, compliance, and auditability using AWS best practices (IAM, encryption, auditing).
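A recurring pattern behind the pipeline responsibilities above is the idempotent upsert: a load step that can be safely re-run after retries or re-deliveries without creating duplicates. Table formats like Iceberg expose this as MERGE; the sketch below shows the same idea with SQLite's ON CONFLICT clause for portability (table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE devices (device_id TEXT PRIMARY KEY, last_seen TEXT)")

def upsert(rows):
    """Insert new keys, update existing ones; safe to re-run."""
    conn.executemany(
        """INSERT INTO devices (device_id, last_seen) VALUES (?, ?)
           ON CONFLICT(device_id) DO UPDATE SET last_seen = excluded.last_seen""",
        rows,
    )

upsert([("a", "2025-01-01"), ("b", "2025-01-02")])
upsert([("a", "2025-02-01")])  # re-delivery / late update: no duplicate row
print(conn.execute("SELECT * FROM devices ORDER BY device_id").fetchall())
```

Making every load step idempotent like this is what lets an orchestrator retry failed tasks blindly, which in turn is what makes pipelines operable at scale.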
Requirements:
8+ years of experience in Data Engineering and/or Backend Development in AWS-based, cloud-native environments
Strong hands-on experience writing Spark jobs (PySpark) and running workloads on EMR and/or Glue
Proven ability to design and implement scalable backend services and data pipelines
Deep understanding of data modeling, data quality, pipeline optimization, and distributed systems
Experience with Infrastructure as Code and automated deployment of data infrastructure
Strong debugging, testing, and performance-tuning skills in agile environments
High level of ownership, curiosity, and problem-solving mindset.
Nice to Have:
AWS certifications (Solutions Architect, Data Engineer)
Experience with ML pipelines or AI-driven analytics
Familiarity with data governance, self-service data platforms, or data mesh architectures
Experience with PostgreSQL, DynamoDB, MongoDB
Experience building or consuming high-scale APIs
Background in multi-threaded or distributed system development
Domain experience in cybersecurity, law enforcement, or other regulated industries.
This position is open to all candidates.
 