Data Engineer
26/06/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
We're building the future workforce of our Technology organization, working closely with domain owners and top management to define what per-role excellence means, leveraging data from development tools to help infer proficiency levels, providing tailored development plans and training and learning materials, and harnessing the individuals themselves for their professional and personal development.
Responsibilities:
Design, implement, and maintain the complete data flow, from extraction to visualization, using external tools.
Work within a product team where solutions are collaboratively proposed; translate requirements into technical designs and implement them as part of large-scale data engineering solutions.
Utilize various machine learning models to explore data, working with cutting-edge Big Data tools and techniques.
Collaborate with various stakeholders to plan, design, develop, test, and maintain extraordinary features.
Collaborate with a team of skilled developers to develop high-traffic, cloud-based applications.
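For orientation only, here is a minimal sketch of the kind of extraction-to-visualization flow described above, written in PySpark. The session setup, paths, column names, and aggregation logic are illustrative assumptions, not details taken from the posting.

```python
# Minimal PySpark ETL sketch: extract raw events, clean them, and write an
# aggregated table for downstream visualization. All names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("proficiency-etl").getOrCreate()

# Extract: raw events exported from development tools (hypothetical path).
raw = spark.read.parquet("s3://example-bucket/dev-tool-events/")

# Transform: keep valid rows and aggregate per developer and skill area.
clean = raw.filter(F.col("event_type").isNotNull())
profile = (
    clean.groupBy("developer_id", "skill_area")
         .agg(F.count("*").alias("events"),
              F.max("event_ts").alias("last_seen"))
)

# Load: write a table that an external visualization tool can read.
profile.write.mode("overwrite").parquet("s3://example-bucket/role-proficiency/")
```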
Requirements:
At least 5 years of experience in Big-Data technologies as a data engineer, including ETL/ELT processes and data exploration.
At least 3 years of experience in building end-to-end data pipelines using Spark, Databricks or similar tools is required.
At least 3 years of experience in Python programming and SQL queries. Experience as a backend Developer - advantage.
Team player committed to the success of the team.
Strong verbal and written communication skills with the ability to clearly explain technical concepts.
Excellent debugging, investigating and problem-solving abilities.
Curious learner who loves sharing knowledge and best practices, and can work both independently and in a team.
Ability to thrive in a fast-paced, ambiguous, and changing environment.
This position is open to all candidates.
 
Job ID: 8138977
25/06/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
As a Data Engineer, you will be instrumental in building and maintaining the data infrastructure that powers our analytics and decision-making processes. Working closely with the broader data team, R&D, and various stakeholders, you will design, implement, and optimize data pipelines and storage solutions, ensuring efficient and reliable data flow across the organization.

Responsibilities:
Design, develop, and maintain scalable data pipelines using tools such as Airflow and DBT.
Manage and optimize our data warehouse in Snowflake, ensuring data integrity and performance.
Collaborate with analytics and business teams to understand data requirements and deliver appropriate solutions.
Implement and maintain data integration processes between various systems and platforms.
Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal disruption.
Stay updated with the latest industry trends and technologies to continually improve our data infrastructure.
Requirements:
3+ years of experience in data engineering or a related field.
Proficiency in SQL and experience with modern lakehouse modeling.
Hands-on experience with data pipeline orchestration tools like Apache Airflow.
Experience with DBT for data transformation and modeling.
Familiarity with data visualization tools such as Tableau.
Strong programming skills in languages such as Python or Java.
Hands-on experience with AWS data solutions (or another major cloud vendor).
Excellent problem-solving skills and attention to detail.
Strong communication skills and the ability to work collaboratively in a team environment.
Relevant academic degree in Computer Science, Engineering, or a related field (or equivalent work experience).

Preferred Qualifications:
Experience in the travel or insurance industries.
Familiarity with Mixpanel or similar analytics platforms.
Knowledge of data security and privacy best practices.
This position is open to all candidates.
 
Job ID: 8230814
Location: Herzliya
Job Type: Full Time
We are looking for a talented and experienced big data engineer to join our cyber defence group to take over a cutting-edge data solution.

The successful candidate should have hands-on experience with big data processing technologies and architectures.

The candidate will join our growing team of analysts, developers, data scientists, and architects who design and develop innovative solutions to the most critical needs of our customers.



Responsibilities

Design, architect, implement, and support our data pipelines and processing flows.
Collaborate with the analytics department to analyze and understand data sources.
Provide insights and guidance on database technology and data modelling best practices.
Ensure data integrity, managing data and analysis flows with attention to detail and a high level of responsibility.
Implement algorithms together with our data scientists.
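For flavour, here is a minimal sketch of the kind of data-integrity check implied above, using pymongo against a hypothetical collection; the connection string, database, collection, and field names are all assumptions.

```python
# Minimal data-integrity check sketch against a MongoDB collection.
# Connection string, database, collection, and field names are hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
events = client["security"]["events"]

total = events.count_documents({})
missing_ts = events.count_documents({"timestamp": {"$exists": False}})

# Fail the flow if more than 1% of events lack a timestamp (threshold is illustrative).
if total and missing_ts / total > 0.01:
    raise ValueError(f"{missing_ts}/{total} events are missing a timestamp")
print(f"integrity check passed: {total} events, {missing_ts} missing timestamps")
```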
Requirements:
BSc/BA in Computer Science or similar.
At least 6 years of proven experience as a Big Data Engineer.
At least 3 years of experience with Python.
Experience with both SQL and NoSQL databases, including Elasticsearch, Splunk, and MongoDB.
Experience with processing of large data sets.
Experience with Linux environment.
Experience with software design and development in a test-driven environment.
This position is open to all candidates.
 
Job ID: 8229958
24/06/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Senior Data Engineer to help scale our data platform and deliver reliable, high-quality data services to both internal teams and external customers. If you thrive on solving complex data challenges, collaborating with diverse stakeholders, and building scalable systems that last, we'd love to meet you.
What You'll Own:
Design and implement scalable ETL pipelines using Apache Spark and related technologies.
Build robust data services to support multiple internal teams, including product and analytics.
Architect end-to-end data solutions and translate them into actionable engineering plans.
Maintain clean, reliable data interfaces for microservices and systems requiring accurate, timely data.
Collaborate closely with product teams to understand data needs and co-create solutions.
Ensure observability, data quality, and pipeline reliability through monitoring and automated validation.
Participate in code reviews, architecture discussions, and mentor less experienced engineers.
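As a small illustration of the automated-validation point above, here is a sketch of a post-load quality gate in PySpark; the paths, columns, and thresholds are assumptions rather than details from the posting.

```python
# Minimal post-load validation sketch: fail the pipeline run if the output
# table is empty or a key column has too many nulls. Names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-check").getOrCreate()
df = spark.read.parquet("s3://example-bucket/daily-output/")

row_count = df.count()
null_keys = df.filter(F.col("customer_id").isNull()).count()

if row_count == 0:
    raise RuntimeError("validation failed: output table is empty")
if null_keys / row_count > 0.001:
    raise RuntimeError(f"validation failed: {null_keys} rows have a null customer_id")
print(f"validation passed: {row_count} rows, {null_keys} null keys")
```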
Requirements:
6+ years of experience building and maintaining production-grade ETL pipelines.
Hands-on experience with orchestration tools such as Databricks, Airflow, dbt, or similar.
Proven ability to design systems that support diverse data consumers with varying SLAs.
Deep understanding of data modeling, distributed systems, and cloud infrastructure.
Strong background in Apache Spark (PySpark or Scala).
Familiarity with microservices architectures and clean API/data contracts.
Excellent communication and collaboration skills: you're proactive, approachable, and solution-oriented.
Ability to think in systems: conceptualize high-level architecture and break it into components.
Nice to Have:
Knowledge of data governance, lineage, and observability best practices.
Experience with real-time streaming technologies (e.g., Kafka, Flink).
Exposure to DevOps practices for data systems, including CI/CD, monitoring, and infrastructure-as-code.
Previous experience developing customer-facing data products or analytics tools.
This position is open to all candidates.
 
Job ID: 8229100
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Data Engineer with a passion for analytics to join our growing data team! This role is ideal for someone who enjoys working across the entire data pipeline, from data ingestion and transformation all the way to creating analytics-ready datasets.
You'll get hands-on experience with modern tools, collaborate across functions, and help deliver data-driven insights that shape key decisions.
You'll be part of a supportive team, where mentorship, impact, and learning go hand in hand.
Responsibilities
What You'll Do:
Design, develop and maintain end-to-end data pipelines: extract raw data from sources such as MongoDB, MySQL, Neo4j, and Kafka; transform and load it into our Snowflake data warehouse.
Contribute to data modeling and data quality efforts to ensure reliable, analytics-ready datasets.
Collaborate with analytics, engineering, and business teams to understand data needs and translate requirements into actionable data solutions.
Enable data-driven decisions by building dashboards and reports using tools like dbt and AWS QuickSight.
Learn and grow in both the technical and business-facing sides of data.
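For illustration, here is a minimal sketch of one extract-and-load step in the spirit of the pipeline described above, pulling documents from MongoDB and inserting them into a Snowflake staging table. All connection parameters, database names, and table names are hypothetical.

```python
# Minimal extract-and-load sketch: MongoDB source -> Snowflake staging table.
# Connection parameters, database, and table names are hypothetical.
from pymongo import MongoClient
import snowflake.connector

mongo = MongoClient("mongodb://localhost:27017")
orders = mongo["shop"]["orders"]

conn = snowflake.connector.connect(
    account="example_account", user="loader", password="***",
    warehouse="LOAD_WH", database="RAW", schema="SHOP",
)
cur = conn.cursor()
cur.execute(
    "CREATE TABLE IF NOT EXISTS ORDERS_STAGE (ID STRING, STATUS STRING, AMOUNT FLOAT)"
)

# Pull a bounded batch of documents and insert them as rows.
rows = [(str(d["_id"]), d.get("status"), d.get("amount"))
        for d in orders.find({}, limit=10_000)]
cur.executemany(
    "INSERT INTO ORDERS_STAGE (ID, STATUS, AMOUNT) VALUES (%s, %s, %s)", rows
)
conn.commit()
```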
Requirements:
1-3 years of experience in a data-related role (data engineering, analytics engineering, BI), or strong projects/coursework if you're just starting out.
Strong experience with SQL and Python for building, manipulating, and analyzing data.
Comfortable with modern data tooling such as Snowflake, dbt, Airflow, or similar.
Enthusiastic about working collaboratively with teammates and stakeholders to deliver business value from data.
Strong communicator and continuous learner, ready to tackle new challenges in a fast-paced environment.
Hands-on experience with cloud platforms such as AWS, GCP, or Azure, and familiarity with services like AWS Glue, Google BigQuery, or Azure Data Factory.
Hands-on experience with ETL/ELT processes, data ingestion, data transformation, data modeling, and monitoring.
Nice to Have:
Experience with AWS or other cloud platforms.
Familiarity with streaming data (Kafka), Infrastructure as Code (Terraform), or Git-based workflows.
Knowledge of SaaS analytics, especially for product or customer behavior.
Understanding of PII, data privacy, or compliance standards.
This position is open to all candidates.
 
Job ID: 8228707
24/06/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced Backend Developer to join our growing data platform team.

As a Backend Developer, you'll work on a massive data processing pipeline, ingesting over a billion daily events from multiple sources. You'll also create the next-generation pipeline and help us scale from a billion events a day to tens of billions of events.

Responsibilities:
Own projects from initial discussions to release, including data exploration, architecture design, benchmarking new technologies, and product feedback.
Work with massive amounts of data from different sources using state-of-the-art technology to make big data accessible in real-time.
Develop and deploy real-time and batch data processing infrastructures.
Manage the development of distributed data pipelines and complex software designs to support high data rates (millions of daily active users) using cloud-based tools.
Work closely with company stakeholders on data-related issues.
Develop unit, integration, end-to-end (e2e), and load tests.
Requirements:
4+ years of experience as a Software Engineer, including design & development.
Proven experience with Java or Go.
Experience in the design and development of scalable big data solutions.
Experience working in a cloud-based environment.
Passionate about technologies, frameworks, and best practices.
Ability to work in a fast-paced environment.

Advantages:
Experience with Spring / Kubernetes.
Experience with Terraform / Helm / Argo.
This position is open to all candidates.
 
Job ID: 8228524
23/06/2025
Location: Haifa
Job Type: Full Time and Hybrid work
Required Data Team Lead.
Requirements:
1. Technical Expertise
Cloud Data Platforms: Expertise in Google Cloud Platform (GCP) services, especially in data-related services like BigQuery, Cloud SQL, Dataproc, Pub/Sub, and Dataflow.
Data Engineering: Strong knowledge of data pipelines, ETL/ELT processes, and data integration using GCP tools.
Database Management: Experience with both relational and non-relational databases, including MySQL, PostgreSQL, and NoSQL solutions like Firestore or Bigtable.
Programming Skills: Proficiency in Python, SQL, and possibly other languages like Java or Scala, with a focus on data manipulation and processing.
APIs and Integrations: Experience working with Google APIs and third-party APIs for data extraction and integration.
Machine Learning: Familiarity with Google Cloud AI tools, such as Vertex AI, and how to integrate machine learning workflows into the data architecture.
Data Governance: Knowledge of data security, privacy, governance, and regulatory requirements (GDPR, HIPAA, etc.).
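As a small, hedged illustration of the BigQuery side of the stack listed above, here is a minimal query sketch with the official Python client; the project, dataset, table, and query are made up.

```python
# Minimal BigQuery sketch: run a query and iterate over the result rows.
# Project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

query = """
    SELECT user_id, COUNT(*) AS sessions
    FROM `example-project.analytics.events`
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY user_id
    ORDER BY sessions DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(row.user_id, row.sessions)
```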
2. Leadership and Management
Team Management: Ability to lead a team of data engineers, analysts, and architects, providing guidance, mentorship, and performance management.
Project Management: Strong skills in managing data projects, ensuring timely delivery while meeting client requirements.
Collaboration: Experience working cross-functionally with stakeholders such as solution architects, developers, and business teams to translate business needs into data solutions.
Resource Allocation: Efficiently managing cloud resources to optimize cost and performance.
3. Client-Facing Skills
Consulting: Experience in providing consulting services to clients, including understanding their data needs, providing solutions, and guiding them through the data modernization process.
Pre-sales Support: Supporting sales teams in scoping out client projects, preparing presentations, and explaining technical aspects of data solutions during pre-sales.
Client Relationship Management: Building and maintaining strong relationships with clients to ensure satisfaction and long-term partnership.
4. Certifications and Education
Google Cloud Certifications: Professional Data Engineer (preferred); Professional Cloud Architect (preferred).
Other Certifications: Optional certifications in data engineering or machine learning from organizations like Cloudera, AWS, or Microsoft Azure can be advantageous.
Educational Background: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
5. Soft Skills
Problem-Solving: Ability to troubleshoot data-related issues and implement scalable solutions.
Adaptability: Ability to keep up with the rapidly changing cloud and data landscape.
Communication: Excellent verbal and written communication skills to explain complex data systems to non-technical stakeholders.
This position is open to all candidates.
 
Job ID: 8227525
23/06/2025
Location: Haifa
Job Type: Full Time
We seek a highly skilled and experienced Solution Architect with a deep understanding of Data and operations to join our team. In this role, you will be responsible for designing and implementing data-centric solutions that enable our customers to achieve their business objectives, with a strong focus on building efficient and reliable ETL pipelines. You will work closely with clients to understand their needs, translate them into technical requirements, and architect scalable, reliable, and secure data pipelines.
Key Responsibilities:
Collaborate with clients to gather requirements and understand their data challenges.
Design and implement end-to-end DataOps solutions, including data ingestion, processing, storage, and analysis.
Leverage cloud technologies and best practices to architect scalable and cost-effective data architectures.
Ensure data quality, integrity, and security throughout the data lifecycle.
Automate data pipelines and workflows to improve efficiency and reduce manual effort.
Stay up-to-date with the latest DataOps trends and technologies.
Provide technical guidance and mentorship to other team members.
Requirements:
Proven experience as a Solution Architect or in a similar role, with a strong focus on DataOps.
Deep understanding of data management principles, data modeling, and ETL processes.
Expertise in cloud technologies (AWS, Azure, GCP) and data platforms (Hadoop, Spark, Snowflake, Looker).
Strong knowledge of programming languages (Python, Java, Scala) and scripting languages (Bash, PowerShell).
Experience with DevOps tools and practices (CI/CD, containerization, orchestration).
Excellent communication and collaboration skills, with the ability to work effectively with both technical and non-technical stakeholders.
Strong problem-solving and analytical skills.
Bachelor's degree in Computer Science, Engineering, or a related field.
This position is open to all candidates.
 
Job ID: 8227516
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Senior Data Engineer
What your day will look like:
Design, plan, and build all aspects of the platform's data, machine learning (ML) pipelines, and infrastructure.
Build and optimize an AWS-based Data Lake using best practices in cloud architecture, data partitioning, metadata management, and security to support enterprise-scale data operations.
Collaborate with engineers, data analysts, data scientists, and other stakeholders to understand data needs.
Solve challenging data integration problems, utilizing optimal ETL/ELT patterns, frameworks, query techniques, and sourcing from structured and unstructured data sources.
Lead end-to-end data projects from infrastructure design to production monitoring.
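For illustration, here is a minimal sketch of landing partitioned data in an S3-based data lake with PySpark, in the spirit of the description above; the bucket, columns, and partitioning choice are assumptions.

```python
# Minimal data-lake write sketch: land curated data in S3, partitioned by date,
# so downstream queries can prune partitions. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lake-writer").getOrCreate()

events = spark.read.json("s3://example-bucket/raw/events/")
curated = (
    events.withColumn("event_date", F.to_date("event_ts"))
          .dropDuplicates(["event_id"])
)

(curated.write
        .mode("append")
        .partitionBy("event_date")
        .parquet("s3://example-bucket/curated/events/"))
```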
Requirements:
Have 5+ years of hands-on experience in designing and maintaining big data pipelines across on-premises or hybrid cloud environments, with proficiency in both SQL and NoSQL databases within a SaaS framework.
Proficient in one or more programming languages: Python, Scala, Java, or Go.
Experienced with software engineering best practices and automation, including testing, code reviews, design documentation, and CI/CD.
Experienced in building and designing ML/AI-driven production infrastructures and pipelines.
Experienced in developing data pipelines and maintaining data lakes on AWS - big advantage.
Familiar with technologies such as Kafka, Snowflake, MongoDB, Airflow, Docker, Kubernetes (K8S), and Terraform - advantage.
Bachelor's degree in Computer Science or equivalent experience.
Strong communication skills, fluent in English, both written and verbal.
A great team player with a can-do approach.
This position is open to all candidates.
 
Job ID: 8225677