Jobs » Software » Senior Data Engineer

11/06/2025
This position has been marked by the employer as no longer active.
Company name confidential
Location: Tel Aviv-Yafo
Job Type: Full Time
Similar positions that may interest you
29/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We seek a Director of Data to join us and lead our data group.
As our Director of Data, you will be a key member of our R&D leadership team. You will be responsible for developing and executing a data strategy that aligns with our business goals, overseeing data management, analytics, and validation, and ensuring data integrity at every stage of product development and production.
A day in the life and how you'll make an impact:
Define and execute a strategic data roadmap aligned with business objectives, fostering a data-driven culture and leading a high-performing team of data engineers, scientists, and analysts.
Establish robust data validation frameworks, ensuring product integrity and accuracy through all stages, from data acquisition to end-user delivery (see the sketch after this list).
Build and optimize scalable data infrastructure and pipelines to support our data needs and ensure data security, compliance, and accessibility.
Collaborate with product and engineering teams to create and launch data-driven products, ensuring they are built on reliable data and designed to meet customer needs.
Guide the team in generating actionable insights to drive business decisions and product innovation in areas such as personalization, marketing, and customer success.
Implement data governance policies and maintain compliance with industry regulations and best practices.
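To make the validation-framework item above concrete, here is a minimal row-level batch check in pandas; this is an illustration only, not the company's framework, and the column names and thresholds are hypothetical.

```python
# Illustrative only: a minimal row-level validation pass with pandas.
# Column names and thresholds are hypothetical, not taken from the posting.
import pandas as pd

def validate_batch(df: pd.DataFrame) -> list[str]:
    """Return human-readable validation failures for one batch."""
    errors = []
    if df["record_id"].isna().any():
        errors.append("record_id contains nulls")
    if df["record_id"].duplicated().any():
        errors.append("record_id contains duplicates")
    if not df["measurement"].between(0, 1_000).all():
        errors.append("measurement outside expected range [0, 1000]")
    return errors

batch = pd.DataFrame({"record_id": [1, 2, 2], "measurement": [10.5, 2000.0, 3.2]})
print(validate_batch(batch))  # -> duplicate ids and an out-of-range measurement
```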
Requirements:
10+ years of experience in data-related roles, with at least 5 years in a leadership position (ideally within a tech or AI-driven startup environment).
M.Sc. or PhD in Data Science/Computer Science/Engineering/Statistics, or a related field.
Extensive experience with cloud platforms (AWS, GCP, or Azure) and modern data warehouses (Snowflake, BigQuery, or Redshift).
Proficiency in data technologies such as SQL, Python, R, and Looker, and with big data tools (e.g., Hadoop, Spark).
Proven experience in leveraging data for product development, business intelligence, and operational optimization.
Strong track record of building and managing cross-functional data teams and influencing across all levels of an organization.
Excellent communication skills, with the ability to convey complex data insights in an accessible manner to non-technical stakeholders.
This position is open to all candidates.
 
Job #8234801
Posted 2 days ago
Location: Tel Aviv-Yafo
Job Type: Full Time
We're seeking talented data engineers to join our rapidly growing team, which includes senior software and data engineers. Together, we drive our data platform from acquisition and processing to enrichment, delivering valuable business insights. Join us in designing and maintaining robust data pipelines, making an impact in our collaborative and innovative workplace.

Responsibilities
Design, implement, and optimize scalable data pipelines for efficient processing and analysis (see the sketch after this list).
Build and maintain robust data acquisition systems to collect, process, and store data from diverse sources.
Take part in developing agentic capabilities.
Mentor, support, and guide junior team members, sharing expertise and fostering their professional development.
Collaborate with DevOps, Data Science, and Product teams to understand needs and deliver tailored data solutions.
Monitor data pipelines and production environments proactively to detect and resolve issues promptly.
Apply and be responsible for best practices in data security, integrity, and performance across all systems.
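To make the pipeline item above concrete, here is a minimal orchestration sketch assuming Apache Airflow 2.4+ (Airflow appears in the requirements below); the DAG id, schedule, and task bodies are hypothetical placeholders, not this team's actual pipeline.

```python
# A minimal daily extract-transform-load DAG, assuming Apache Airflow 2.4+.
# Names, schedule, and task bodies are hypothetical placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw records from a source system")

def transform():
    print("clean and enrich the extracted records")

def load():
    print("write the enriched records to the warehouse")

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # the `schedule` argument replaced `schedule_interval` in Airflow 2.4
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```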
Requirements:
6+ years of experience in data or backend engineering, with strong proficiency in Python for data tasks.
Proven track record in designing, developing, and deploying complex data applications.
Hands-on experience with orchestration and processing tools such as Apache Airflow and Apache Spark.
Deep experience with public cloud platforms, and expertise in cloud-based data storage and processing.
Experience working with Docker and Kubernetes.
Hands-on experience with CI tools such as GitHub Actions.
Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent practical experience.
Ability to perform under pressure and make strategic prioritization decisions in fast-paced environments.
Excellent communication skills and a strong team player, capable of working cross-functionally.
This position is open to all candidates.
 
Job #8278570
14/07/2025
Location: Tel Aviv-Yafo and Netanya
Job Type: Full Time
At our company, we're reinventing DevOps and MLOps to help the world's greatest companies innovate -- and we want you along for the ride. This is a special place with a unique combination of brilliance, spirit, and just all-around great people. Here, if you're willing to do more, your career can take off. And since software plays a central role in everyone's lives, you'll be part of an important mission. Thousands of customers, including the majority of the Fortune 100, trust our company to manage, accelerate, and secure their software delivery from code to production - a concept we call liquid software. Wouldn't it be amazing if you could join us on our journey?
About the Team
We are seeking a highly skilled Senior Data Engineer to join our company's ML Data Group and help drive the development and optimization of our cutting-edge data infrastructure. As a key member of the company's ML Platform team, you will play an instrumental role in building and evolving our feature store data pipeline, enabling machine learning teams to efficiently access and work with high-quality, real-time data at scale.
In this dynamic, fast-paced environment, you will collaborate with other data professionals to create robust, scalable data solutions. You will be responsible for architecting, designing, and implementing data pipelines that ensure reliable data ingestion, transformation, and storage, ultimately supporting the production of high-performance ML models.
We are looking for data-driven problem-solvers who thrive in ambiguous, fast-moving environments and are passionate about building data systems that empower teams to innovate and scale. We value independent thinkers with a strong sense of ownership, who can take challenges from concept to production while continuously improving our data infrastructure.
As a Data Engineer on our company's ML Platform team, you will...
Design and implement large-scale batch & streaming data pipeline infrastructure (see the sketch after this list)
Build and optimize data workflows for maximum reliability and performance
Develop solutions for real-time data processing and analytics
Implement data consistency checks and quality assurance processes
Design and maintain state management systems for distributed data processing
Take a crucial role in building the group's engineering culture, tools, and methodologies
Define abstractions, methodologies, and coding standards for the entire Data Engineering pipeline.
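For the batch & streaming item above, here is a minimal sketch of one streaming ingestion leg, assuming Spark Structured Streaming reading from Kafka (both are named in the requirements below). The broker, topic, schema, and paths are hypothetical placeholders.

```python
# Illustrative only: Kafka -> parse JSON -> append to Parquet with Spark Structured Streaming.
# Broker address, topic, schema, and output paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("feature-ingest").getOrCreate()

schema = StructType([
    StructField("entity_id", StringType()),
    StructField("feature_value", DoubleType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")   # placeholder brokers
    .option("subscribe", "feature-events")                  # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/features/raw")        # placeholder sink
    .option("checkpointLocation", "/data/chk")   # checkpointing allows restart without data loss
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```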
Requirements:
5+ years of experience as a Software Engineer with focus on data engineering
Expert knowledge in building and maintaining data pipelines at scale
Strong experience with stream/batch processing frameworks (e.g. Apache Spark, Flink)
Profound understanding of message brokers (e.g. Kafka, RabbitMQ)
Experience with data warehousing and lake technologies
Strong Python programming skills and experience building data engineering tools
Experience with designing and maintaining Python SDKs
Proficiency in Java for data processing applications
Understanding of data modeling and optimization techniques
Bonus Points
Experience with ML model deployment and maintenance in production
Knowledge of data governance and compliance requirements
Experience with real-time analytics and processing
Understanding of distributed systems and cloud architectures
Experience with data visualization and lineage tools/frameworks and techniques.
This position is open to all candidates.
 
Job #8257535
Posted 2 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're seeking talented data engineers to join our rapidly growing team, which includes senior software and data engineers. Together, we drive our data platform from acquisition and processing to enrichment, delivering valuable business insights. Join us in designing and maintaining robust data pipelines, making an impact in our collaborative and innovative workplace.

Responsibilities
Design, implement, and optimize scalable data pipelines for efficient processing and analysis.
Build and maintain robust data acquisition systems to collect, process, and store data from diverse sources.
Take part in developing agentic capabilities.
Collaborate with DevOps, Data Science, and Product teams to understand needs and deliver tailored data solutions.
Monitor data pipelines and production environments proactively to detect and resolve issues promptly (see the sketch after this list).
Apply best practices for data security, integrity, and performance across all systems.
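As one illustration of the proactive-monitoring item above (not this team's actual tooling), here is a tiny freshness check in plain Python; the metadata lookup is stubbed and the SLA is hypothetical.

```python
# Illustrative only: alert when a table's last successful load exceeds a freshness SLA.
from datetime import datetime, timedelta, timezone

FRESHNESS_SLA = timedelta(hours=2)  # hypothetical SLA

def last_successful_load(table: str) -> datetime:
    """Stub: in practice this would query pipeline metadata or the warehouse."""
    return datetime(2025, 1, 1, 6, 0, tzinfo=timezone.utc)

def check_freshness(table: str, now: datetime) -> bool:
    lag = now - last_successful_load(table)
    if lag > FRESHNESS_SLA:
        print(f"ALERT: {table} is {lag} behind (SLA {FRESHNESS_SLA})")
        return False
    print(f"OK: {table} refreshed {lag} ago")
    return True

check_freshness("orders_enriched", datetime(2025, 1, 1, 9, 0, tzinfo=timezone.utc))
```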
Requirements:
4+ years of experience in data or backend engineering, with strong proficiency in Python for data tasks.
Proven track record in designing, developing, and deploying complex data applications.
Hands-on experience with orchestration and processing tools such as Apache Airflow and Apache Spark.
Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent practical experience.
Ability to perform under pressure and make strategic prioritization decisions in fast-paced environments.
Excellent communication skills and a strong team player, capable of working cross-functionally.
This position is open to all candidates.
 
Job #8278589
Posted 6 days ago
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Scientist (Applied AI).
As a Senior Data Scientist on our Applied AI team, you will join our Tel Aviv office and play a hands-on, end-to-end role in delivering innovative capabilities that help mayors and other city leaders understand their communities and improve the lives of millions worldwide. Reporting to the Applied AI Team Lead, you will collaborate with product, engineering, and fellow data-science teammates to turn cutting-edge research into production-ready solutions quickly, reliably, and with maximum real-world impact. You'll work with a rich mix of data sources (including social media, news stories, survey results, resident feedback, and more) to create models and AI-powered features that scale.
Day to Day:
Design, build, and deploy AI and machine-learning solutions, from data exploration through modeling, evaluation, and integration into customer-facing products and internal tools.
Optimize models for quality and scalability through feature engineering, hyper-parameter tuning, runtime profiling, and thoughtful architectural choices (see the sketch after this list).
Build and maintain data pipelines using tools such as Airflow, Spark, and Databricks to ensure clean, reliable inputs for downstream models.
Collaborate closely with product managers, engineers, and designers to refine problem statements, iterate rapidly, and ship impactful features on schedule.
Champion technical excellence by conducting code reviews, sharing best practices, and mentoring teammates across data science and engineering.
Stay current with the latest developments in AI, including LLMs, RAG systems, and AI agents, and proactively propose ways to incorporate new techniques into our workflows.
Work an in-person or hybrid schedule, spending at least three days per week in our Tel Aviv office.
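To ground the model-optimization item above, here is a minimal tuning sketch with scikit-learn (one of the frameworks named in the requirements below); the dataset is synthetic and the parameter grid is hypothetical.

```python
# Illustrative only: cross-validated hyper-parameter search on a synthetic dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=2_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},  # hypothetical grid
    scoring="f1",   # choose the metric that matches the product question
    cv=5,
    n_jobs=-1,
)
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("held-out F1:", f1_score(y_test, search.best_estimator_.predict(X_test)))
```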
Requirements:
5+ years of hands-on experience developing and deploying machine-learning or data-science solutions with Python and SQL.
Proven, end-to-end experience building AI- and machine learning-based solutions from prototype to production deployment.
Demonstrated success shipping data-intensive services to production on cloud infrastructure (AWS preferred) using data tools such as PostgreSQL, Databricks, Spark, or Airflow.
Deep understanding of machine-learning fundamentals and practical expertise with frameworks such as TensorFlow, PyTorch, or scikit-learn.
Expertise in machine learning metrics and quality control.
Solid understanding of software-engineering best practices (including design patterns, data structures, and version control).
Excellent interpersonal and communication skills, with the ability to explain complex technical concepts to non-technical stakeholders and collaborate across teams.
It's even better if you have:
Experience with Agile development in fast-paced, delivery-driven environments.
Familiarity with CI/CD practices, containers, Kubernetes, and serverless or microservice architectures.
Experience with geospatial analysis, government data, survey research, or civic-tech applications.
A track record of contributing to open-source projects.
A college or graduate degree in a relevant field.
This position is open to all candidates.
 
Job #8273710
Confidential company
Location: Tel Aviv-Yafo
Job Type: More than one
We're seeking an outstanding and passionate Data Platform Engineer to join our growing R&D team.
You will work in an energetic startup environment following Agile concepts and methodologies. Joining the company at this unique and exciting stage in our growth journey creates an exceptional opportunity to take part in shaping Finaloop's data infrastructure at the forefront of Fintech and AI.
What you'll do:
Design, build, and maintain scalable data pipelines and ETL processes for our financial data platform.
Develop and optimize data infrastructure to support real-time analytics and reporting.
Implement data governance, security, and privacy controls to ensure data quality and compliance.
Create and maintain documentation for data platforms and processes.
Collaborate with data scientists and analysts to deliver actionable insights to our customers.
Troubleshoot and resolve data infrastructure issues efficiently.
Monitor system performance and implement optimizations.
Stay current with emerging technologies and implement innovative solutions.
Tech stack: AWS Serverless, Python, Airflow, Airbyte, Temporal, PostgreSQL, Snowflake, Kubernetes, Terraform, Docker.
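As one illustration of how parts of the stack above can fit together (not taken from the posting), here is a minimal sketch of loading staged files into Snowflake with the official Python connector; connection parameters, stage, and table names are hypothetical placeholders.

```python
# Illustrative only: COPY files from a Snowflake table stage into the table itself.
# Account, credentials, warehouse, and object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",     # placeholder
    user="etl_user",          # placeholder
    password="...",           # in practice, pull from a secrets manager
    warehouse="ETL_WH",
    database="FINANCE",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # COPY INTO pulls files already uploaded to the table stage into the target table.
    cur.execute(
        "COPY INTO TRANSACTIONS FROM @%TRANSACTIONS "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    print(cur.fetchall())     # per-file load results
finally:
    conn.close()
```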
Requirements:
3+ years of experience in data engineering or platform engineering roles
Strong programming skills in Python and SQL
Experience with orchestration platforms like Airflow/Dagster/Temporal
Experience with MPPs like Snowflake/Redshift/Databricks
Hands-on experience with cloud platforms (AWS) and their data services
Understanding of data modeling, data warehousing, and data lake concepts
Ability to optimize data infrastructure for performance and reliability
Experience working with containerization (Docker) in Kubernetes environments.
Familiarity with CI/CD concepts
Fluent in English, both written and verbal
And it would be great if you have (optional):
Experience with big data processing frameworks (Apache Spark, Hadoop)
Experience with stream processing technologies (Flink, Kafka, Kinesis)
Knowledge of infrastructure as code (Terraform)
Experience building analytics platforms
Experience building clickstream pipelines
Familiarity with machine learning workflows and MLOps
Experience working in a startup environment or fintech industry
This position is open to all candidates.
 
Job #8232260
01/07/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are making the future of Mobility come to life starting today.
At our company we support the world's largest vehicle fleet operators and transportation providers to optimize existing operations and seamlessly launch new, dynamic business models - driving efficient operations and maximizing utilization.
At the heart of our platform lies the data infrastructure, driving advanced machine learning models and optimization algorithms. As the owner of data pipelines, you'll tackle diverse challenges spanning optimization, prediction, modeling, inference, transportation, and mapping.
As a Senior Data Engineer, you will play a key role in owning and scaling the backend data infrastructure that powers our platform, supporting real-time optimization, advanced analytics, and machine learning applications.
What You'll Do
Design, implement, and maintain robust, scalable data pipelines for batch and real-time processing using Spark and other modern tools.
Own the backend data infrastructure, including ingestion, transformation, validation, and orchestration of large-scale datasets.
Leverage Google Cloud Platform (GCP) services to architect and operate scalable, secure, and cost-effective data solutions across the pipeline lifecycle.
Develop and optimize ETL/ELT workflows across multiple environments to support internal applications, analytics, and machine learning workflows.
Build and maintain data marts and data models with a focus on performance, data quality, and long-term maintainability (see the sketch after this list).
Collaborate with cross-functional teams including development teams, product managers, and external stakeholders to understand and translate data requirements into scalable solutions.
Help drive architectural decisions around distributed data processing, pipeline reliability, and scalability.
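For the data-mart item above, here is a minimal sketch of materializing one mart table with the BigQuery client library (GCP is named above); the project, dataset, and column names are hypothetical.

```python
# Illustrative only: rebuild a small daily mart table from a raw trips table.
# Project, dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")   # placeholder project

sql = """
    SELECT fleet_id,
           DATE(trip_start)  AS trip_date,
           COUNT(*)          AS trips,
           AVG(duration_min) AS avg_duration_min
    FROM `my-project.raw.trips`
    GROUP BY fleet_id, trip_date
"""

job_config = bigquery.QueryJobConfig(
    destination="my-project.marts.daily_fleet_utilization",
    write_disposition="WRITE_TRUNCATE",   # rebuild the mart on each run
)
client.query(sql, job_config=job_config).result()  # wait for the job to finish
```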
Requirements:
4+ years in backend data engineering or infrastructure-focused software development.
Proficient in Python, with experience building production-grade data services.
Solid understanding of SQL.
Proven track record designing and operating scalable, low-latency data pipelines (batch and streaming).
Experience building and maintaining data platforms, including lakes, pipelines, and developer tooling.
Familiar with orchestration tools like Airflow, and modern CI/CD practices.
Comfortable working in cloud-native environments (AWS, GCP), including containerization (e.g., Docker, Kubernetes).
Bonus: Experience working with GCP
Bonus: Experience with data quality monitoring and alerting
Bonus: Strong hands-on experience with Spark for distributed data processing at scale.
Degree in Computer Science, Engineering, or related field.
This position is open to all candidates.
 
Job #8238970
13/07/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a brilliant, quick-learning Data Engineer for our data engineering team - an independent, logical thinker who understands the importance of data structuring for macro-business decisions.
The position combines high technical skills with a business orientation. It involves working closely with the analysts and the R&D team and directly affecting the company's cross-department decisions. Our Data Engineer should be able to speak in technical and practical terms and, more importantly, lead from one to the other while dealing with challenges and creating them to make our team even better.
Roles and Responsibilities:
Creating and structuring end-to-end data pipelines and ETLs in light of business requirements: from the source to the analyst's hands, giving them the ideal conditions to make smart, data-driven business decisions.
Cracking top industry data challenges while initiating and building creative technical solutions: in-house Device Graph, server-to-server integrations with multiple systems, privacy challenges, online-to-offline, and more (see the sketch after this list).
Deep understanding of the business needs, technical requirements, and the company's roadmap, translating it into custom-made data solutions and scalable products.
Craft code following best practices to ensure efficiency while integrating CI/CD principles.
Writing multi-step scalable processes over more than 50 data sources (Marketing, Operations, CS, Product, CRM, and more), tying them together into a valuable and useful source of insights for the analysts.
Understanding data challenges and weaknesses, and managing high standards, monitoring, and reliability processes.
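As one illustration of a server-to-server push of the kind mentioned above (not the company's actual integration), here is a minimal sketch using Google Cloud Pub/Sub, one of the tools listed in the requirements below; the project, topic, and payload are hypothetical.

```python
# Illustrative only: publish a CRM event to a Pub/Sub topic with an attribute for routing.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "s2s-events")  # placeholder names

event = {"source": "crm", "type": "lead_updated", "lead_id": "12345"}
future = publisher.publish(
    topic_path,
    data=json.dumps(event).encode("utf-8"),
    source_system="crm",        # message attribute used by downstream consumers
)
print("published message", future.result())  # blocks until the server acknowledges
```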
Requirements:
B.A / B.Sc. degree in a highly quantitative field
3+ years of hands-on experience as a Data Engineer querying data warehouses (SQL), and structuring data processes using quantitative techniques
Fast learner with high attention to detail, and proven ability and passion to multitask on several projects at a time
Strong communication skills and a proven ability to collaborate effectively with different stakeholders on various projects and business/technical goals
Proven experience in Python and Infrastructure in the data context
High analytical skills and the ability to deep dive into details
Google Cloud Data tools (BigQuery, Cloud Composer/Airflow, Pub/Sub, Cloud Functions) or parallel tools on AWS
Experience in designing and building scalable data systems for various data applications - an advantage
Experience in analyzing data and deriving actionable insights - an advantage
Experience working for a data-driven company in a large-scale environment - an advantage.
This position is open to all candidates.
 
Job #8255373
Location: Tel Aviv-Yafo
Job Type: Full Time
Now we're looking for an experienced Software Engineer to join our Data team. In this key role, you will develop our data platform, working on cloud-based microservices data pipelines. As we are building the company data platform, it has a direct impact on our customers and also enables many of the other groups in the organization. You will also be responsible for building microservices that need to work at big-data scale (about 250K records/sec) at low latency as part of a dynamic team.
Key Responsibilities:
End-to-end development of the company's massive data infrastructures and services.
Researching new technologies and adapting them for use in the company's product.
Working closely with the product, DevOps, and security teams.
Requirements:
6+ years of experience with massive large-scale data systems platforms (Storm, Spark, Kafka, SQS...) and design principles (Data Modeling, Streaming vs. Batch processing, Distributed Messaging...) (see the sketch after this list)
Expertise in one or more of the following languages: Java, Scala, Go
Hands-on experience with design and development production of large-scale distributed systems with an emphasis on performance
Familiarity with NoSQL and relational DBs. We're using technologies such as Elasticsearch, MySQL, ClickHouse, and Redis
Deep understanding of Object-Oriented Programming and software engineering principles
Experience with microservices, k8s - Advantage
Familiarity with the AWS platform - Advantage
Motivated fast independent learner and great at problem-solving
A team player with excellent collaboration and communication skills.
B.Sc. in Computer Science from a known university.
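For the stream-processing requirement above, here is a conceptual consume-process-commit sketch. The role lists Java, Scala, and Go; Python with kafka-python is used here only for brevity, and the topic, brokers, and consumer group are hypothetical.

```python
# Conceptual sketch: at-least-once consumption, committing offsets only after processing.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "raw-events",                              # placeholder topic
    bootstrap_servers=["localhost:9092"],      # placeholder brokers
    group_id="indexer",                        # placeholder consumer group
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    enable_auto_commit=False,                  # commit manually after successful processing
)

for message in consumer:
    event = message.value
    # ... enrich the event and write it to the search/OLAP store here ...
    consumer.commit()  # at-least-once: offsets advance only after the write succeeds
```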
This position is open to all candidates.
 
Job #8276872
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Data Engineer with a passion for analytics to join our growing data team! This role is ideal for someone who enjoys working across the entire data pipeline, from data ingestion and transformation all the way to creating analytics-ready datasets.
You'll get hands-on experience with modern tools, collaborate across functions, and help deliver data-driven insights that shape key decisions.
You'll be part of a supportive team, where mentorship, impact, and learning go hand in hand.
Responsibilities
What You'll Do:
Design, develop, and maintain end-to-end data pipelines: extract raw data from sources such as MongoDB, MySQL, Neo4j, and Kafka; transform and load it into our Snowflake data warehouse (see the sketch after this list).
Contribute to data modeling and data quality efforts to ensure reliable, analytics-ready datasets.
Collaborate with analytics, engineering, and business teams to understand data needs and translate requirements into actionable data solutions.
Enable data-driven decisions by building dashboards and reports using tools like dbt and AWS QuickSight.
Learn and grow in both the technical and business-facing sides of data.
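For the extract-and-load item above, here is a minimal sketch of one pipeline step, assuming pymongo and a recent snowflake-connector-python with the pandas extras; the URIs, collection, and table names are hypothetical placeholders.

```python
# Illustrative only: flatten MongoDB documents into a DataFrame and append them to Snowflake.
import pandas as pd
import snowflake.connector
from pymongo import MongoClient
from snowflake.connector.pandas_tools import write_pandas

# Extract: pull one day's documents from a MongoDB collection (placeholder URI and filter).
mongo = MongoClient("mongodb://localhost:27017")
docs = list(mongo["app"]["events"].find({"day": "2025-01-01"}, {"_id": 0}))
df = pd.json_normalize(docs)   # flatten nested documents into columns

# Load: append the frame to a raw table in Snowflake (placeholder credentials).
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",  # use a secrets manager in practice
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)
try:
    write_pandas(conn, df, table_name="EVENTS_RAW", auto_create_table=True)
finally:
    conn.close()
```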
Requirements:
1-3 years of experience in a data-related role (data engineering, analytics engineering, BI), or strong projects/coursework if you're just starting out.
Strong experience with SQL and Python for building, manipulating, and analyzing data
Comfortable with modern data tooling such as Snowflake, dbt, Airflow, or similar
Enthusiastic about working collaboratively with teammates and stakeholders to deliver business value from data
Strong communicator and continuous learner, ready to tackle new challenges in a fast-paced environment
Hands-on experience with cloud platforms such as AWS, GCP, or Azure, and familiarity with services like AWS Glue, Google BigQuery, or Azure Data Factory.
Hands-on experience with ETL/ELT processes, data ingestion, data transformation, data modeling, and monitoring.
Nice to Have:
Experience with AWS or other cloud platforms.
Familiarity with streaming data (Kafka), Infrastructure as Code (Terraform), or Git-based workflows
Knowledge of SaaS analytics, especially for product or customer behavior.
Understanding of PII, data privacy, or compliance standards.
This position is open to all candidates.
 
Job #8228707