Data Engineer Job Listings

7 hours ago
Confidential company
Location: Merkaz
Job Type: Full Time
We are looking for a Data Engineer.
Key Responsibilities
Collect and arrange data from various sources
Automate workflows and improve performance of existing data processes
Implement data quality checks, monitoring, and alerting to ensure accuracy and reliability
Optimize performance and costs of our data platform
Ensure compliance with data governance, security, and privacy standards.
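
As an illustration of the data-quality checks and alerting mentioned above, a minimal PySpark sketch might look like this; the dataset path, column, and threshold are hypothetical.

# Illustrative only: a null-rate check on a hypothetical "orders" dataset.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/orders/")  # hypothetical path

total = orders.count()
null_customer_ids = orders.filter(F.col("customer_id").isNull()).count()
null_rate = null_customer_ids / total if total else 0.0

# Alert (here just a log line) when the null rate crosses a threshold.
THRESHOLD = 0.01
if null_rate > THRESHOLD:
    print(f"ALERT: customer_id null rate {null_rate:.2%} exceeds {THRESHOLD:.0%}")
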
Requirements:
4+ years of industry experience as a data engineer
Hands-on experience with Spark
Data-driven, with an understanding of business processes and logic
Experienced in coding with Python or Scala
Experience working with different ETL / ELT tools
Advantage
Knowledge of working with Databricks platform and services
Strong SQL skills
Familiar with AWS environment and services
Familiar with CI/CD processes
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
About the Role:
We are seeking an experienced Senior Data Engineer to join our dynamic data team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure, ensuring the availability, reliability, and quality of our data. This role requires strong technical expertise, problem-solving skills, and the ability to collaborate across teams to deliver data-driven solutions.
Key Responsibilities:

Design, implement, and maintain robust, scalable, and high-performance data pipelines and ETL processes.
Develop and optimize data models, schemas, and storage solutions to support analytics and machine learning initiatives.
Collaborate with software engineers and product managers to understand data requirements and deliver high-quality solutions.
Ensure data quality, integrity, and governance across multiple sources and systems.
Monitor and troubleshoot data workflows, resolving performance and reliability issues.
Evaluate and implement new data technologies and frameworks to improve the data platform.
Document processes, best practices, and data architecture.
Mentor junior data engineers and contribute to team knowledge sharing.
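
To illustrate the kind of pipeline work described above, a minimal Airflow DAG sketch chaining extract, transform, and load steps might look like this; the DAG id and task logic are hypothetical placeholders.

# Illustrative sketch of a daily ETL DAG (Airflow); task logic is hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw records from a source system")

def transform():
    print("clean and reshape the extracted records")

def load():
    print("write the transformed records to the warehouse")

with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
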
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering, ETL development, or a similar role.
Strong proficiency in SQL and experience with relational and NoSQL databases.
Experience with data pipeline frameworks and tools such as Apache Spark, Airflow, and Kafka (must).
Familiarity with cloud platforms (AWS, GCP, or Azure) and their data services.
Solid programming skills in Python, Java, or Scala.
Strong problem-solving, analytical, and communication skills.
Knowledge of data governance, security, and compliance standards.
Experience with data warehousing, big data technologies such as ClickHouse, SingleStore, or StarRocks, and data modeling best practices.
This position is open to all candidates.
 
23/10/2025
Confidential company
Location: Netanya
Job Type: Full Time
We are a fast-growing global medical device company, developing and manufacturing innovative drug delivery and infusion solutions across the continuum of care - from the hospital to the home. We are looking for an excellent Data Engineer to join the winning team!
Job Description
We are looking to hire a highly skilled and experienced professional to fill the role of Data Engineer. In this role, you will design, build, and maintain the data infrastructure that powers analytics, product innovation, and business decision-making across the organization. You will work closely with data scientists, data analysts, software engineers, product managers, and other teams to ensure reliable, secure, and scalable access to data. We are seeking a detail-oriented professional with strong problem-solving skills, a passion for working with complex real-world data, and the drive to contribute to meaningful innovations in healthcare. We are looking for someone who is self-driven, brings a can-do spirit, and thrives in a collaborative, fast-paced environment where teamwork and clear communication are essential.
Job Responsibilities:
Design, develop, and maintain scalable ETL/ELT pipelines for ingesting, processing, and storing data from multiple sources.
Build and optimize data warehouses, data lakes, and other storage solutions to support analytics and machine learning use cases.
Collaborate with data scientists and analysts to ensure datasets are clean, structured, and accessible for advanced modeling and reporting.
Implement data quality monitoring, validation frameworks, and automated workflows to ensure reliability and accuracy.
Integrate data from disparate systems into unified views that support business intelligence and operational efficiency.
Ensure compliance with healthcare data regulations and implement security measures to safeguard company data and systems.
Research and adopt best practices in data engineering, cloud-native architectures, and modern data tooling.
Support the deployment of data-driven applications into production environments.
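
As an illustration of the ingestion and integration work described above, a minimal sketch might combine two sources into a single staging table; the frames, table name, and SQLite target are hypothetical stand-ins for real sources and a real warehouse.

# Illustrative ingestion sketch: combine two hypothetical sources into one staging table.
# In practice the sources would be files, APIs, or application databases.
import sqlite3
import pandas as pd

# Stand-ins for two upstream sources.
telemetry = pd.DataFrame(
    {"patient_id": [1, 2, 2], "reading": [7.1, 6.4, 6.9]}
)
patients = pd.DataFrame(
    {"patient_id": [1, 2], "site": ["TLV", "NYC"]}
)

# Light transformation: join the sources into a single unified view.
combined = telemetry.merge(patients, on="patient_id", how="left")

# Load into a staging table (SQLite only to keep the sketch self-contained).
with sqlite3.connect("staging.db") as conn:
    combined.to_sql("stg_device_telemetry", conn, if_exists="replace", index=False)
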
Requirements:
Job Requirements:
Must Have:
* 2-4 years of hands-on experience in data engineering or a related field.
* Proficiency in Python, SQL, and data pipeline frameworks (e.g., Airflow, dbt, Azure Data Factory).
* Experience with data warehouse technologies (e.g., Snowflake, BigQuery, Azure Synapse).
* Familiarity with cloud platforms (preferably Azure).
* Knowledge of data modeling, database design, and performance optimization.
* Experience building APIs or integrating data across different applications.
* Self-learning ability and a can-do attitude.
Nice to Have / Advantage:
* Knowledge and hands-on experience with Elasticsearch for data indexing, search, and analytics.
* Familiarity with distributed data processing (e.g., Spark).
* Familiarity with event-driven architectures and streaming technologies (e.g., Kafka / RabbitMQ).
* Experience with NoSQL databases and engines (e.g., MongoDB, Redis).
* Experience with containerization and orchestration (e.g., Docker, Kubernetes).
* Background in healthcare or regulated environments (e.g., HIPAA, MDR, FDA).
Education Requirements:
* Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field - an advantage.
Language skills: Fluent English, written and verbal.
This position is open to all candidates.
 
05/11/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are the leader in hybrid-cloud security posture management, using the attacker's perspective to find and remediate critical attack paths across on-premises and multi-cloud networks. We are looking for a talented Senior Data Engineer to join a core team of experts responsible for developing innovative cyber-attack techniques for cloud-based environments (AWS, Azure, GCP, Kubernetes) that integrate into our fully automated attack simulation.
About the Role:
We are seeking an experienced Senior Data Engineer to join our dynamic data team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure, ensuring the availability, reliability, and quality of our data. This role requires strong technical expertise, problem-solving skills, and the ability to collaborate across teams to deliver data-driven solutions.
Key Responsibilities:
* Design, implement, and maintain robust, scalable, and high-performance data pipelines and ETL processes.
* Develop and optimize data models, schemas, and storage solutions to support analytics and machine learning initiatives.
* Collaborate with software engineers and product managers to understand data requirements and deliver high-quality solutions.
* Ensure data quality, integrity, and governance across multiple sources and systems.
* Monitor and troubleshoot data workflows, resolving performance and reliability issues.
* Evaluate and implement new data technologies and frameworks to improve the data platform.
* Document processes, best practices, and data architecture.
* Mentor junior data engineers and contribute to team knowledge sharing.
Requirements:
Required Qualifications:
* Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
* 5+ years of experience in data engineering, ETL development, or a similar role.
* Strong proficiency in SQL and experience with relational and NoSQL databases.
* Experience with data pipeline frameworks and tools such as Apache Spark, Airflow, and Kafka (must).
* Familiarity with cloud platforms (AWS, GCP, or Azure) and their data services.
* Solid programming skills in Python, Java, or Scala.
* Strong problem-solving, analytical, and communication skills.
* Knowledge of data governance, security, and compliance standards.
* Experience with data warehousing, big data technologies such as ClickHouse, SingleStore, or StarRocks, and data modeling best practices.
Preferred Qualifications (Advantage):
* Familiarity with Machine Learning workflows and MLOps practices.
* Experience with data lakehouse architectures and technologies such as Apache Iceberg.
* Experience working with data ecosystems in open-source/on-premises environments.
Why Join Us:
* Work with cutting-edge technologies and large-scale data systems.
* Collaborate with a talented and innovative team.
* Opportunities for professional growth and skill development.
* Make a direct impact on data-driven decision-making across the organization.
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an experienced and visionary Data Platform Engineer to help design, build and scale our BI platform from the ground up.

In this role, you will be responsible for building the foundations of our data analytics platform, enabling scalable data pipelines and robust data modeling to support real-time and batch analytics, ML models, and business insights that serve both business intelligence and product needs.

You will be part of the R&D team, collaborating closely with engineers, analysts, and product managers to deliver a modern data architecture that supports internal dashboards and future-facing operational analytics.

If you enjoy architecting from scratch, turning raw data into powerful insights, and owning the full data lifecycle, this role is for you!

Responsibilities
Take full ownership of the design and implementation of a scalable and efficient BI data infrastructure, ensuring high performance, reliability and security.

Lead the design and architecture of the data platform from integration to transformation, modeling, storage, and access.

Build and maintain ETL/ELT pipelines, batch and real-time, to support analytics, reporting, and product integrations.

Establish and enforce best practices for data quality, lineage, observability, and governance to ensure accuracy and consistency.

Integrate modern tools and frameworks such as Airflow, dbt, Databricks, Power BI, and streaming platforms.

Collaborate cross-functionally with product, engineering, and analytics teams to translate business needs into data infrastructure.

Promote a data-driven culture: be an advocate for data-driven decision-making across the company by empowering stakeholders with reliable, self-service data access.
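
To illustrate the real-time leg of such pipelines, a minimal Spark Structured Streaming sketch reading from Kafka and landing events as Parquet might look like this; the topic, brokers, and paths are hypothetical, and the Kafka connector package is assumed to be available.

# Illustrative sketch of a real-time leg: read events from Kafka and land them as Parquet.
# Topic, servers, and paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "product_events")
    .load()
    .select(F.col("key").cast("string"), F.col("value").cast("string"), "timestamp")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/bronze/product_events")
    .option("checkpointLocation", "/data/checkpoints/product_events")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
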
Requirements:
5+ years of hands-on experience in data engineering and in building data products for analytics and business intelligence.

Proven track record of designing and implementing large-scale data platforms or ETL architectures from the ground up.

Strong hands-on experience with ETL tools and data warehouse/lakehouse products (Airflow, Airbyte, dbt, Databricks)

Experience supporting both batch pipelines and real-time streaming architectures (e.g., Kafka, Spark Streaming).

Proficiency in Python, SQL, and cloud data engineering environments (AWS, Azure, or GCP).

Familiarity with data visualization tools like Power BI, Looker, or similar.

BSc in Computer Science or a related field from a leading university
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Engineer to join our Platform group in the Data Infrastructure team.
You'll work hands-on to design and deliver data pipelines, distributed storage, and streaming services that keep our data platform performant and reliable. As a senior individual contributor, you will lead complex projects within the team, raise the bar on engineering best practices, and mentor mid-level engineers while collaborating closely with product, DevOps, and analytics stakeholders.
About the Platform group
The Platform Group accelerates our productivity by providing developers with tools, frameworks, and infrastructure services. We design, build, and maintain critical production systems, ensuring our platform can scale reliably. We also introduce new engineering capabilities to enhance our development process. As part of this group, you'll help shape the technical foundation that supports our entire engineering team.
Code & ship production-grade services, pipelines and data models that meet performance, reliability and security goals
Lead design and delivery of team-level projects from RFC through rollout and operational hand-off
Improve system observability, testing and incident response processes for the data stack
Partner with Staff Engineers and Tech Leads on architecture reviews and platform-wide standards
Mentor junior and mid-level engineers, fostering a culture of quality, ownership and continuous improvement
Stay current with evolving data-engineering tools and bring pragmatic innovations into the team.
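
As an illustration of clean, testable pipeline code, a minimal sketch of a pure transformation with a pytest-style unit test might look like this; the function and data are hypothetical.

# Illustrative sketch: a small, pure transformation plus a unit test (pytest style).
def deduplicate_latest(rows: list[dict]) -> list[dict]:
    """Keep only the most recent row per id, assuming each row has 'id' and 'updated_at'."""
    latest: dict = {}
    for row in rows:
        current = latest.get(row["id"])
        if current is None or row["updated_at"] > current["updated_at"]:
            latest[row["id"]] = row
    return list(latest.values())


def test_deduplicate_latest_keeps_newest_row():
    rows = [
        {"id": 1, "updated_at": "2025-01-01", "value": "old"},
        {"id": 1, "updated_at": "2025-02-01", "value": "new"},
        {"id": 2, "updated_at": "2025-01-15", "value": "only"},
    ]
    result = {r["id"]: r["value"] for r in deduplicate_latest(rows)}
    assert result == {1: "new", 2: "only"}
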
Requirements:
5+ years of hands-on experience in backend or data engineering, including 2+ years at a senior level delivering production systems
Strong coding skills in Python, Kotlin, Java or Scala with emphasis on clean, testable, production-ready code
Proven track record designing, building and operating distributed data pipelines and storage (batch or streaming)
Deep experience with relational databases (PostgreSQL preferred) and working knowledge of at least one NoSQL or columnar/analytical store (e.g. SingleStore, ClickHouse, Redshift, BigQuery)
Solid hands-on experience with event-streaming platforms such as Apache Kafka
Familiarity with data-orchestration frameworks such as Airflow
Comfortable with modern CI/CD, observability and infrastructure-as-code practices in a cloud environment (AWS, GCP or Azure)
Ability to break down complex problems, communicate trade-offs clearly, and collaborate effectively with engineers and product partners
Bonus Skills
Experience building data governance or security/compliance-aware data platforms
Familiarity with Kubernetes, Docker, and infrastructure-as-code tools
Experience with data quality frameworks, lineage, or metadata tooling.
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Data Engineer to join our team and help shape a modern, scalable data platform. You'll work with cutting-edge AWS technologies, Spark, and Iceberg to build pipelines that keep our data reliable, discoverable, and ready for analytics.
What's the Job?
Design and maintain scalable data pipelines on AWS (EMR, S3, Glue, Iceberg).
Transform raw, semi-structured data into analytics-ready datasets using Spark.
Automate schema management, validation, and quality checks.
Optimize performance and costs with smart partitioning, tuning, and monitoring.
Research and evaluate new technologies, proposing solutions that improve scalability and efficiency.
Plan and execute complex data projects with foresight and attention to long-term maintainability.
Collaborate with engineers, analysts, and stakeholders to deliver trusted data for reporting and dashboards.
Contribute to CI/CD practices, testing, and automation.
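
To illustrate the Spark-to-Iceberg flow described above, a minimal PySpark sketch might look like this; the input path, catalog, table, and columns are hypothetical, and an Iceberg catalog is assumed to be configured on the Spark session.

# Illustrative sketch: transform semi-structured JSON with Spark and write an Iceberg table.
# Paths, catalog, table, and columns are hypothetical; an Iceberg catalog is assumed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-to-iceberg").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/events/")  # semi-structured input

clean = (
    raw.withColumn("event_ts", F.to_timestamp("event_time"))
    .dropDuplicates(["event_id"])
)

# Write an Iceberg table partitioned by day for efficient pruning.
(
    clean.writeTo("glue_catalog.analytics.events")
    .partitionedBy(F.days("event_ts"))
    .createOrReplace()
)
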
Requirements:
Strong coding skills in Python (PySpark, pandas, boto3).
Experience with big data frameworks (Spark) and schema evolution.
Knowledge of lakehouse technologies (especially Apache Iceberg).
Familiarity with AWS services: EMR, S3, Glue, Athena.
Experience with orchestration tools like Airflow.
Solid understanding of CI/CD and version control (GitHub Actions).
Ability to research, evaluate, and plan ahead for new solutions and complex projects.
Nice to have:
Experience with MongoDB or other NoSQL databases.
Experience with stream processing (e.g., Kafka, Kinesis, Spark Structured Streaming).
Ability to create visualized dashboards and work with Looker (Enterprise).
Infrastructure-as-code (Terraform).
Strong debugging and troubleshooting skills for distributed systems.
This position is open to all candidates.
 
Confidential company
Location: Petah Tikva
Job Type: Full Time
What you will do:
The AI, Data & Research unit is at the forefront of our company's innovation, building data-driven, ML-powered, and intelligent security solutions. We are looking for an experienced and passionate Data Engineering Team Leader to lead a team of 5 seasoned data engineers. You will play a critical role in designing and delivering scalable, reliable data infrastructure and ETLs, enabling our machine learning, analytics, and product teams to deliver maximum impact.
Lead and grow a high-performing team of experienced data engineers.
Define and implement the team's technical roadmap, ensuring alignment with the AI, Data & Research unit's strategic objectives and product roadmap.
Own the development and operation of robust, scalable, and efficient data pipelines and services.
Oversee the quality, performance, and maintainability of the team's codebase and delivered solutions.
Manage delivery plans, track progress, resolve blockers, and ensure on-time delivery.
Drive best practices in software engineering, data modeling, data quality, code quality, security, observability, and operational excellence.
Translate business and research requirements into scalable technical data solutions.
Foster a culture of ownership, collaboration, continuous learning, improvement, and innovation within the team.
Actively mentor team members, guide career development, and conduct regular performance reviews.
Requirements:
Bachelor's degree in Computer Science, Software Engineering, or a related field.
3+ years of experience as a data engineering team leader in a product-based company.
5+ years of hands-on experience with SQL and Python/PySpark (or Java/Scala), and with data modeling techniques and concepts such as facts, dimensions, partitions, etc.
Experience designing and implementing data ingestion, ETL processes, and third-party tools, preferably with experience in data lakehouse and data warehouse architectures.
Proven experience with CI/CD processes and agile development methodologies.
Strong interpersonal and communication skills.
Demonstrated ability to manage and develop a team with varying levels of experience.
Skilled in identifying risks, tracking progress, and resolving technical challenges proactively.
Self-motivated and goal-oriented with a high work ethic.
Ability to balance long-term architectural vision with fast-paced delivery in an early-stage development or startup environment.
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced Data Engineer to join our Data Warehouse team in TLV.

In this role, you will play a pivotal role in the Data Platform organization, leading the design, development, and maintenance of our data warehouse. In your day-to-day, you'll work on data models and backend BI solutions that empower stakeholders across the company and contribute to informed decision-making processes, all while leveraging your extensive experience in business intelligence.

This is an excellent opportunity to be part of establishing a state-of-the-art data stack, implementing cutting-edge technologies in a cloud environment.

In this role you'll:
Lead the design and development of scalable and efficient data warehouse and BI solutions that align with organizational goals and requirements

Utilize advanced data modeling techniques to create robust data structures supporting reporting and analytics needs

Implement ETL/ELT processes to assist in the extraction, transformation, and loading of data from various sources into the semantic layer

Develop processes to enforce schema evaluation, cover anomaly detection, and monitor data completeness and freshness

Identify and address performance bottlenecks within our data warehouse, optimize queries and processes, and enhance data retrieval efficiency

Implement best practices for data warehouse and database performance tuning

Conduct thorough testing of data applications and implement robust validation processes

Collaborate with Data Infra Engineers, Developers, ML Platform Engineers, Data Scientists, Analysts, and Product Managers
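
As an illustration of the completeness and freshness monitoring mentioned above, a minimal sketch might compare the newest load timestamp against an SLA; it uses SQLite only to stay self-contained, and the table and SLA are hypothetical (in practice the same query would run against the warehouse, e.g. Snowflake).

# Illustrative freshness check: alert if the newest row in a table is older than an SLA.
import sqlite3
from datetime import datetime, timedelta, timezone

SLA = timedelta(hours=6)

with sqlite3.connect("warehouse.db") as conn:
    # Hypothetical fact table with a load timestamp per row.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fct_orders (order_id INTEGER, loaded_at TEXT)"
    )
    conn.execute(
        "INSERT INTO fct_orders VALUES (1, ?)",
        (datetime.now(timezone.utc).isoformat(),),
    )
    (latest,) = conn.execute("SELECT MAX(loaded_at) FROM fct_orders").fetchone()

lag = datetime.now(timezone.utc) - datetime.fromisoformat(latest)
if lag > SLA:
    print(f"ALERT: fct_orders is stale by {lag}")
else:
    print(f"fct_orders freshness OK (lag {lag})")
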
Requirements:
3+ years of experience as a BI Engineer or Data Engineer

Proficiency in data modeling, ELT development, and DWH methodologies

SQL expertise and experience working with Snowflake or similar technologies

Prior experience working with dbt

Experience with Python and software development, an advantage

Excellent communication and collaboration skills

Ability to work in an office environment a minimum of 3 days a week
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Data Engineer to join our growing team!
This is a great opportunity to be part of one of the fastest-growing infrastructure companies in history, an organization that is in the center of the hurricane being created by the revolution in artificial intelligence.
"our company's data management vision is the future of the market."- Forbes
we are the data platform company for the AI era. We are building the enterprise software infrastructure to capture, catalog, refine, enrich, and protect massive datasets and make them available for real-time data analysis and AI training and inference. Designed from the ground up to make AI simple to deploy and manage, our company takes the cost and complexity out of deploying enterprise and AI infrastructure across data center, edge, and cloud.
Our success has been built through intense innovation, a customer-first mentality and a team of fearless company ronauts who leverage their skills & experiences to make real market impact. This is an opportunity to be a key contributor at a pivotal time in our companys growth and at a pivotal point in computing history.
In this role, you will be responsible for:
Designing, building, and maintaining scalable data pipeline architectures
Developing ETL processes to integrate data from multiple sources
Creating and optimizing data models for efficient storage and retrieval
Implementing data quality controls and monitoring systems
Collaborating with data scientists and analysts to deliver data solutions
Building and maintaining data warehouses and data lakes
Performing in-depth data analysis and providing insights to stakeholders
Taking full ownership of data quality, documentation, and governance processes
Building and maintaining comprehensive reports and dashboards
Ensuring data security and regulatory compliance.
Requirements:
Bachelor's degree in Computer Science, Engineering, or related field
3+ years experience in data engineering
Strong proficiency in SQL and Python
Experience with ETL tools and data warehousing solutions
Knowledge of big data technologies (Hadoop, Spark, etc.)
Experience with cloud platforms (AWS, Azure, or GCP)
Understanding of data modeling and database design principles
Familiarity with data visualization tools - Tableau, Sisense
Strong problem-solving and analytical skills
Excellent communication and collaboration abilities
Experience with version control systems (Git).
This position is open to all candidates.
 
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced and passionate Staff Data Engineer to join our Data Platform group in TLV as a Tech Lead. As the Group's Tech Lead, you'll shape and implement the technical vision and architecture while staying hands-on across three specialized teams: Data Engineering Infra, Machine Learning Platform, and Data Warehouse Engineering, forming the backbone of Lemonade's data ecosystem.

The group's mission is to build a state-of-the-art Data Platform that drives Lemonade toward becoming the most precise and efficient insurance company on the planet. By embracing Data Mesh principles, we create tools that empower teams to own their data while leveraging a robust, self-serve data infrastructure. This approach enables Data Scientists, Analysts, Backend Engineers, and other stakeholders to seamlessly access, analyze, and innovate with reliable, well-modeled, and queryable data, at scale.

In this role you'll:
Technically lead the group by shaping the architecture, guiding design decisions, and ensuring the technical excellence of the Data Platforms three teams

Design and implement data solutions that address both applicative needs and data analysis requirements, creating scalable and efficient access to actionable insights

Drive initiatives in Data Engineering Infra, including building robust ingestion layers, managing streaming ETLs, and guaranteeing data quality, compliance, and platform performance

Develop and maintain the Data Warehouse, integrating data from various sources for optimized querying, analysis, and persistence, supporting informed decision-making

Leverage data modeling and transformations to structure, cleanse, and integrate data, enabling efficient retrieval and strategic insights

Build and enhance the Machine Learning Platform, delivering infrastructure and tools that streamline the work of Data Scientists, enabling them to focus on developing models while benefiting from automation for production deployment, maintenance, and improvements. Support cutting-edge use cases like feature stores, real-time models, point-in-time (PIT) data retrieval, and telematics-based solutions

Collaborate closely with other Staff Engineers across Lemonade to align on cross-organizational initiatives and technical strategies

Work seamlessly with Data Engineers, Data Scientists, Analysts, Backend Engineers, and Product Managers to deliver impactful solutions

Share knowledge, mentor team members, and champion engineering standards and technical excellence across the organization
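
To illustrate the point-in-time (PIT) data retrieval mentioned above, a minimal pandas sketch might attach, for each label timestamp, the latest feature value known at or before that time, avoiding leakage from the future; the frames and columns are hypothetical.

# Illustrative point-in-time (PIT) retrieval sketch using pandas merge_asof.
import pandas as pd

features = pd.DataFrame(
    {
        "user_id": [1, 1, 2],
        "ts": pd.to_datetime(["2025-01-01", "2025-01-10", "2025-01-05"]),
        "claims_count": [0, 1, 2],
    }
).sort_values("ts")

labels = pd.DataFrame(
    {
        "user_id": [1, 2],
        "ts": pd.to_datetime(["2025-01-07", "2025-01-20"]),
        "churned": [0, 1],
    }
).sort_values("ts")

# For each label row, take the most recent feature row at or before its timestamp.
training_set = pd.merge_asof(
    labels, features, on="ts", by="user_id", direction="backward"
)
print(training_set)
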
Requirements:
8+ years of experience in data-related roles such as Data Engineer, Data Infrastructure Engineer, BI Engineer, or Machine Learning Platform Engineer, with significant experience in at least two of these areas

A B.Sc. in Computer Science or a related technical field (or equivalent experience)

Extensive expertise in designing and implementing Data Lakes and Data Warehouses, including strong skills in data modeling and building scalable storage solutions

Proven experience in building large-scale data infrastructures, including both batch processing and streaming pipelines

A deep understanding of Machine Learning infrastructure, including tools and frameworks that enable Data Scientists to efficiently develop, deploy, and maintain models in production, an advantage

Proficiency in Python, Pulumi/Terraform, Apache Spark, AWS, Kubernetes (K8s), and Kafka for building scalable, reliable, and high-performing data solutions

Strong knowledge of databases, including SQL (schema design, query optimization) and NoSQL, with a solid understanding of their use cases

Ability to work in an office environment a minimum of 3 days a week

Enthusiasm about learning and adapting to the exciting world of AI; a commitment to exploring this field is a fundamental part of our culture
This position is open to all candidates.
 