Data Engineer

25/12/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
We are building next-generation GenAI security intelligence and SaaS Security Posture Management (SSPM) solutions that protect enterprises worldwide. If you enjoy turning complex security data into actionable insights and delivering end-to-end systems, this role is for you.

About the role
You will own, build, and maintain our Pythonic data pipeline and enrichment system on top of PostgreSQL and BigQuery. This system powers security analytics, detections, and intelligence. A core part of your job will be to design and implement new components, improve reliability and performance, and ensure data quality and observability.



Key Responsibilities
Own, build, and maintain production data pipelines and enrichment services using Python, PostgreSQL, and BigQuery.
Architect data systems end to end, including design, deployment, monitoring, and iterative improvement.
Analyze complex security datasets and SaaS telemetry to uncover risks, patterns, and opportunities.
Research emerging threat vectors and contribute to automated intelligence feeds and published reports.
Work across security domains such as SSPM, Shadow Integrations, DLP, and GenAI Protection.
Requirements:
4+ years in data-focused roles (engineering, analytics, science)
Strong SQL and Python skills
Experience with cloud platforms (GCP, AWS, Azure) and modern data warehouses (BigQuery, Databricks)
Proven ability to build data infrastructure from scratch
Ability to turn complex data into actionable insights
Fast learner with systematic problem-solving skills
Comfortable with technical research in unfamiliar domains
Independent and determined, with strong collaboration skills
BSc in Computer Science, Mathematics, Statistics, or related field
Excellent communication skills for technical and non-technical audiences
This position is open to all candidates.
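To make the enrichment work in this role concrete, here is a minimal, self-contained sketch of the kind of pipeline step described above. The feed, field names (`app_id`, `vendor`, `risk_score`), and quality flags are illustrative assumptions, not details from the actual system:

```python
# A minimal sketch of a data-enrichment step. In production this join would
# run against PostgreSQL/BigQuery tables rather than an in-memory dict.

RISK_FEED = {  # stands in for an enrichment table (hypothetical contents)
    "app-123": {"vendor": "AcmeCorp", "risk_score": 0.82},
    "app-456": {"vendor": "Globex", "risk_score": 0.10},
}

def enrich_events(events):
    """Join raw SaaS telemetry events with the risk feed and flag unknowns."""
    enriched = []
    for event in events:
        meta = RISK_FEED.get(event["app_id"])
        row = dict(event)
        if meta is None:
            # Unknown apps are kept but flagged, so data-quality dashboards
            # can surface gaps in the feed (an observability concern the
            # listing calls out).
            row["vendor"], row["risk_score"] = None, None
            row["quality_flag"] = "unknown_app"
        else:
            row.update(meta)
            row["quality_flag"] = "ok"
        enriched.append(row)
    return enriched
```

The flag-rather-than-drop choice is what keeps data quality measurable: dropped rows are invisible, flagged rows can be counted and alerted on.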
 
Job ID: 8473453
Location: Merkaz
Job Type: Full Time
We are looking for a Data Engineer.
What you'll do:
Design, build, and optimize large-scale data pipelines and workflows for both batch and real-time processing.
Architect and maintain Airflow-based orchestration frameworks to manage complex data dependencies and data movement.
Develop high-quality, maintainable data transformation and integration processes across diverse data sources and domains.
Lead the design and implementation of scalable, cloud-based data infrastructure ensuring reliability, performance, and cost efficiency.
Drive data modeling and data architecture practices to ensure consistency, reusability, and quality across systems.
Collaborate closely with Product, R&D, BizDev, and Data Science teams to define data requirements, integrations, and delivery models.
Own the technical roadmap for key data initiatives, from design to production deployment.
Requirements:
6+ years of experience as a Data Engineer working on large-scale, production-grade systems.
Proven experience architecting and implementing data pipelines and workflows in Airflow - must be hands-on and design-level proficient.
Strong experience with real-time or streaming data processing (Kafka, Event Hubs, Kinesis, or similar).
Advanced proficiency in Python for data processing and automation.
Strong SQL skills and deep understanding of data modeling, ETL/ELT frameworks, and DWH methodologies.
Experience with cloud-based data ecosystems (Azure, AWS, or GCP) and related services (e.g., Snowflake, BigQuery, Redshift).
Experience with Docker, Kubernetes, and modern CI/CD practices.
Excellent communication and collaboration skills with experience working across multiple stakeholders and business units.
A proactive, ownership-driven approach with the ability to lead complex projects end-to-end.
This position is open to all candidates.
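The Airflow orchestration responsibilities above come down to managing task dependencies as a DAG. As a rough illustration with hypothetical task names, the ordering guarantee Airflow provides can be sketched with Python's standard library alone:

```python
# Toy illustration of DAG-style dependency ordering, the core idea behind
# Airflow orchestration. Task names are invented for the sketch.
from graphlib import TopologicalSorter

# task -> set of upstream tasks that must finish first
PIPELINE = {
    "extract_orders": set(),
    "extract_users": set(),
    "transform_joined": {"extract_orders", "extract_users"},
    "load_warehouse": {"transform_joined"},
}

def run_order(dag):
    """Return an execution order that respects every dependency."""
    return list(TopologicalSorter(dag).static_order())
```

An actual Airflow DAG adds scheduling, retries, and backfills on top of this ordering, but the dependency graph is the same mental model.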
 
Job ID: 8473165
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Data Engineer to join our team and help shape a modern, scalable data platform. You'll work with cutting-edge AWS technologies, Spark, and Iceberg to build pipelines that keep our data reliable, discoverable, and ready for analytics.

What's the Job?
Design and maintain scalable data pipelines on AWS (EMR, S3, Glue, Iceberg).
Transform raw, semi-structured data into analytics-ready datasets using Spark.
Automate schema management, validation, and quality checks.
Optimize performance and costs with smart partitioning, tuning, and monitoring.
Research and evaluate new technologies, proposing solutions that improve scalability and efficiency.
Plan and execute complex data projects with foresight and attention to long-term maintainability.
Collaborate with engineers, analysts, and stakeholders to deliver trusted data for reporting and dashboards.
Contribute to CI/CD practices, testing, and automation.
Requirements:
Strong coding skills in Python (PySpark, pandas, boto3).
Experience with big data frameworks (Spark) and schema evolution.
Knowledge of lakehouse technologies (especially Apache Iceberg).
Familiarity with AWS services: EMR, S3, Glue, Athena.
Experience with orchestration tools like Airflow.
Solid understanding of CI/CD and version control (GitHub Actions).
Ability to research, evaluate, and plan ahead for new solutions and complex projects.

Nice to have:
Experience with MongoDB or other NoSQL databases.
Experience with stream processing (e.g., Kafka, Kinesis, Spark Structured Streaming).
Ability to create visualized dashboards and work with Looker (Enterprise).
Infrastructure-as-code (Terraform).
Strong debugging and troubleshooting skills for distributed systems.
This position is open to all candidates.
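As a small illustration of the "smart partitioning" mentioned above, date-based partition prefixes are one common lakehouse layout. The bucket, table, and path scheme below are assumptions for the sketch, not this team's actual conventions (Iceberg in particular can hide partitioning behind transforms):

```python
# Sketch of date-based partition-key derivation for an object-store layout.
from datetime import datetime, timezone

def partition_path(bucket, table, event_ts):
    """Build an S3-style year/month/day partition prefix for an event.

    Partition pruning on such prefixes keeps scans cheap: a query for one
    day touches one directory instead of the whole table.
    """
    dt = datetime.fromtimestamp(event_ts, tz=timezone.utc)
    return (f"s3://{bucket}/{table}/"
            f"year={dt.year}/month={dt.month:02d}/day={dt.day:02d}/")
```

The zero-padded month and day keep prefixes lexicographically sortable, which matters for range listing.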
 
Job ID: 8471922
Location: Tel Aviv-Yafo
Job Type: Full Time
The Performance Marketing Analytics team is seeking a highly skilled Senior Data Platform Engineer to establish, operate, and maintain our dedicated Performance Marketing Data Mart within the Snowflake Cloud Data Platform. This is a critical, high-autonomy role responsible for the end-to-end data lifecycle, ensuring data quality, operational excellence, and governance within the new environment. This role will directly enable the Performance Marketing team's vision for data-driven marketing and increased ownership of our analytical infrastructure.
Responsibilities
Snowflake Environment Management
Administer the Snowflake account (roles, permissions, cost monitoring, performance tuning).
Implement best practices for security, PII handling, and data governance.
Act as the subject matter expert for Snowflake within the team.
DevOps & Model Engineering
Establish and manage the development and production environments.
Maintain CI/CD pipeline using GitLab to automate the build, test, and deployment process.
Implement standard engineering practices such as code testing and commit reviews to prevent tech debt.
Data Operations & Reliability
Monitor pipeline executions to ensure timely, accurate, and reliable data.
Set up alerting, incident management, and SLAs for marketing data operations.
Troubleshoot and resolve platform incidents quickly to minimize business disruption.
Tooling & Integration
Support the integration of BI, monitoring, and orchestration tools.
Evaluate and implement observability and logging solutions for platform reliability.
Governance & Compliance
Ensure alignment with Entain data governance and compliance policies.
Document operational procedures, platform configurations, and security controls.
Act as the team liaison with procurement, infrastructure, and security teams for platform-related topics.
Collaboration & Enablement
Work closely with BI, analysts and data engineers, ensuring the platform supports their evolving needs.
Provide guidance on best practices for query optimization, cost efficiency, and secure data access.
Requirements:
At least 4 years of experience in data engineering, DevOps, or data platform operations roles.
Expert proficiency in Snowflake: 2+ years of deep, hands-on experience with Snowflake setup, administration, security, warehouse management, performance tuning, and cost management.
Programming: Expertise in SQL, and proficiency in Python for data transformation and operational scripting.
Experience implementing CI/CD pipelines (preferably GitLab) for data/analytics workloads
Hands-on experience with modern data environments (cloud warehouses, dbt, orchestration tools)
Ability to work effectively in a fast-paced and dynamic environment
Bachelor's degree in a relevant field.
This position is open to all candidates.
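Cost monitoring, one of the administration duties listed above, reduces to a simple aggregation. This sketch assumes hypothetical metering rows and budgets; in Snowflake itself this would typically be a query over the account usage views rather than application code:

```python
# Sketch of per-warehouse credit-budget monitoring. Row shape and budget
# figures are invented for illustration.

def flag_over_budget(usage_rows, budgets):
    """Sum credits per warehouse and return those exceeding their budget."""
    totals = {}
    for row in usage_rows:
        totals[row["warehouse"]] = totals.get(row["warehouse"], 0.0) + row["credits"]
    # Warehouses without an explicit budget are treated as unlimited here;
    # a stricter policy might instead flag them for review.
    return {wh: total for wh, total in totals.items()
            if total > budgets.get(wh, float("inf"))}
```

The output of a check like this would feed the alerting and SLA machinery the listing describes.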
 
Job ID: 8471910
24/12/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an exceptional Big Data Engineer to join our R&D team. This is a unique opportunity for a recent graduate or early-career engineer to enter the world of Big Data. You will work alongside the best engineers and scientists in the industry to develop systems that process and analyze data from around the digital world.
So what will you be doing all day?
Learn and Build: Assist in the design and implementation of high-scale systems using a variety of technologies, mentored by senior engineers.
Collaborate: Work in a data research team alongside data engineers, data scientists, and data analysts to tackle data challenges.
Optimize: Help improve the existing infrastructure of code and data pipelines and learn how to identify and eliminate bottlenecks.
Innovate: Experiment with various technologies in the domain of Machine Learning and big data processing.
Monitor: Assist in maintaining monitoring infrastructure to ensure smooth data ingestion and calculation.
Requirements:
Education: BSc degree in Computer Science, Engineering, or a related technical field (Recent graduates are welcome).
Programming: Proficiency in one or more of the following languages: Python, Java, or Scala.
Fundamentals: Strong grasp of Computer Science fundamentals, including Data Structures, Design Patterns, and Object-Oriented Programming.
Soft Skills: Excellent communication skills with the ability to engage in dialogue within data teams.
Mindset: Passionate about data, holds a strong sense of ownership, and comfortable in a fast-paced environment.
Advantages (Not required, but great to have)
Previous internship or work experience in software or data engineering.
Basic understanding or familiarity with CI/CD practices and Git.
Familiarity with containerization technologies like Docker and Kubernetes.
Familiarity with Cloud providers (AWS / Azure / GCP) or Big Data frameworks (Spark, Airflow, Kafka).
This position is open to all candidates.
 
Job ID: 8471696
24/12/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Big Data Engineer to develop and integrate systems that retrieve, process, and analyze data from around the digital world, generating customer-facing data. This role will report to our Team Manager, R&D.
Why is this role so important?
We are a data-focused company, and data is the heart of our business.
As a big data engineer, you will work at the very core of the company, designing and implementing complex, high-scale systems to retrieve and analyze data from millions of digital users.
Your role as a big data engineer will give you the opportunity to use the most cutting-edge technologies and best practices to solve complex technical problems while demonstrating technical leadership.
So, what will you be doing all day?
Your role as part of the R&D team means your daily responsibilities may include:
Design and implement complex high scale systems using a large variety of technologies.
You will work in a data research team alongside other data engineers, data scientists and data analysts. Together you will tackle complex data challenges and bring new solutions and algorithms to production.
Contribute and improve the existing infrastructure of code and data pipelines, constantly exploring new technologies and eliminating bottlenecks.
You will experiment with various technologies in the domain of Machine Learning and big data processing.
You will work on a monitoring infrastructure for our data pipelines to ensure smooth and reliable data ingestion and calculation.
Requirements:
Passionate about data.
Holds a BSc degree in Computer Science, Engineering, or a related technical field of study.
Has at least 4 years of software or data engineering development experience in one or more of the following programming languages: Python, Java, or Scala.
Has strong programming skills and knowledge of Data Structures, Design Patterns and Object Oriented Programming.
Has good understanding and experience of CI/CD practices and Git.
Excellent communication skills, with the ability to maintain an ongoing dialogue between and within data teams.
Can easily prioritize tasks and work independently and with others.
Conveys a strong sense of ownership over the products of the team.
Is comfortable working in a fast-paced dynamic environment.
Advantage:
Has experience with containerization technologies like Docker and Kubernetes.
Experience in designing and productization of complex big data pipelines.
Familiar with a cloud provider (AWS / Azure / GCP).
This position is open to all candidates.
 
Job ID: 8471314
24/12/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an Analytics Engineer to join our team and play a key role in shaping how data drives our product and business decisions.
This role is perfect for someone who enjoys working at the intersection of data, product, and strategy. While Product Analysts focus on turning data into insights, you'll focus on building the strong data foundations that make those insights possible. You won't just run queries; you will design the data architecture and own the "source of truth" tables that power our strategic decision-making.
You'll work closely with our Growth and Solutions teams, helping them move faster and smarter by making sure the data behind Generative AI, Data-as-a-Service (DaaS), and advanced product models is clear, reliable, and easy to use. Your work will have a direct impact on how we build, scale, and innovate our products.
What You'll Do
Define the Source of Truth: Take raw, complex data and transform it into clean, well-structured tables that Product Analysts and Business Leads can use for high-stakes decision-making.
Translate Strategy into Logic: Work with Product, Growth, and Solutions leads to turn abstract concepts (like "Activation," "Retention," or "Feature Adoption") into precise SQL definitions and automated datasets.
Enable High-Tech Initiatives: Partner with our AI and DaaS specialists to ensure they have the structured data foundations they need to build models and external data products.
Optimize for Usability: Ensure our data is not just "there," but easy to use. You will design the data logic that powers our most important product dashboards and growth funnels.
Maintain Data Integrity: Act as the guardian of our metrics. You will ensure that the numbers used across our product and business reports are consistent, reliable, and logical.
Requirements:
Expert SQL Mastery: You are a SQL power-user. You enjoy solving complex logic puzzles using code and care deeply about query efficiency and data accuracy.
The "Bridge" Mindset: You can sit in a meeting with a Product Manager to understand a business need, and then translate that into a technical data structure that serves that need.
Logical Architecture: You have a natural talent for organizing information. You know how to build a table that is intuitive and easy for other analysts to query.
Product & Business Acumen: You understand SaaS metrics (ARR, funnels, activation, etc.) and how data logic impacts product growth and strategy.
Experience with Analytics Tools: Proficiency in BI tools (Looker, Tableau, etc.) and a strong understanding of how data flows from technical logs to the end-user interface.
Degree: B.Sc. in Industrial Engineering, Information Systems, Economics, Computer Science or a related quantitative field.
Experience: 1+ years of prior experience in a relevant analytics/technical role.
This position is open to all candidates.
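To illustrate the "translate strategy into logic" responsibility above, here is one hypothetical way an "Activation" definition could be encoded. The threshold, event fields, and action names are invented for the sketch; in practice the agreed definition would live in SQL models that the whole team queries:

```python
# Sketch: turning an abstract "Activation" concept into a precise,
# testable definition. All names and the threshold are assumptions.

def activation_rate(events, activating_actions, min_actions=3):
    """Share of users who performed at least min_actions activating actions.

    events: iterable of {"user": ..., "action": ...} records.
    """
    counts = {}
    for e in events:
        if e["action"] in activating_actions:
            counts[e["user"]] = counts.get(e["user"], 0) + 1
    users = {e["user"] for e in events}
    activated = sum(1 for u in users if counts.get(u, 0) >= min_actions)
    return activated / len(users) if users else 0.0
```

Once a definition like this is pinned down, the Analytics Engineer's job is to make the corresponding table the single source of truth, so every dashboard reports the same number.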
 
Job ID: 8471303
19/12/2025
Job Type: Full Time
Welcome to Chargeflow
Chargeflow is at the forefront of fintech + AI innovation, backed by leading venture capital firms. Our mission is to build a fraud-free global commerce ecosystem by leveraging the newest technology, freeing online businesses to focus on their core ideas and growth. We are building the future, and we need you to help shape it.
Who We're Looking For - The Dream Maker
We're in search of an experienced and skilled Senior Data Engineer to join our growing data team. As part of our data team, you'll be at the forefront of crafting a groundbreaking solution that leverages cutting-edge technology to combat fraud. The ideal candidate will have a strong background in designing and implementing large-scale data solutions, with the potential to grow into a leadership role. This position requires a deep understanding of modern data architectures, cloud technologies, and the ability to drive technical initiatives that align with business objectives. Our ultimate goal is to equip our clients with resilient safeguards against chargebacks, empowering them to safeguard their revenue and optimize their profitability. Join us on this thrilling mission to redefine the battle against fraud.
Your Arena
* Design, develop, and maintain scalable, robust data pipelines and ETL processes
* Architect and implement complex data models across various storage solutions
* Collaborate with R&D teams, data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality solutions
* Ensure data quality, consistency, security, and compliance across all data systems
* Play a key role in defining and implementing data strategies that drive business value
* Contribute to the continuous improvement of our data architecture and processes
* Champion and implement data engineering best practices across the R&D organization, serving as a technical expert and go-to resource for data-related questions and challenges
* Participate in and sometimes lead code reviews to maintain high coding standards
* Troubleshoot and resolve complex data-related issues in production environments
* Evaluate and recommend new technologies and methodologies to improve our data infrastructure
Requirements:
What It Takes - Must-Haves:
* 5+ years of experience in data engineering, with specific, strong proficiency in Python & software engineering principles - Must
* Extensive experience with GraphDB - MUST
* Extensive experience with AWS, GCP, Azure and cloud-native architectures - Must
* Deep knowledge of both relational (e.g., PostgreSQL) and NoSQL databases - Must
* Designing and implementing data warehouses and data lakes - Must
* Strong understanding of data modeling techniques - Must
* Expertise in data manipulation libraries (e.g., Pandas) and big data processing frameworks - Must
* Experience with data validation tools such as Pydantic & Great Expectations - Must
* Proficiency in writing and maintaining unit tests (e.g., Pytest) and integration tests - Must
Nice-to-Haves:
* Apache Iceberg - Experience building, managing and maintaining Iceberg lakehouse architecture with S3 storage and AWS Glue catalog - Strong Advantage
* Apache Spark - Proficiency in optimizing Spark jobs, understanding partitioning strategies, and leveraging core framework capabilities for large-scale data processing - Strong Advantage
* Modern data stack tools - DBT, DuckDB, Dagster or any other Data orchestration tool (e.g., Apache Airflow, Prefect) - Advantage
* Designing and developing backend systems, including- RESTful API design and implementation, microservices architecture, event-driven systems, RabbitMQ, Apache Kafka - Advantage
* Containerization technologies- Docker, Kubernetes, and IaC (e.g., Terraform) - Advantage
* Stream processing technologies (e.g., Apache Kafka, Apache Flink) - Advantage
* Understanding of compliance requirements (e.g., GDPR, CCPA) - Advantage
This position is open to all candidates.
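The data-validation requirement above (Pydantic, Great Expectations) amounts to enforcing a schema plus field-level constraints at ingestion. As a dependency-free sketch of that idea using only the standard library (the field names are illustrative, not Chargeflow's actual schema):

```python
# Sketch of schema-plus-constraints record validation, the pattern that
# Pydantic and Great Expectations automate at scale.
from dataclasses import dataclass

@dataclass
class Chargeback:
    """A toy record schema with invented fields for illustration."""
    order_id: str
    amount: float
    currency: str

    def __post_init__(self):
        # Field-level checks: reject records that can't be valid.
        if self.amount <= 0:
            raise ValueError("amount must be positive")
        if len(self.currency) != 3:
            raise ValueError("currency must be a 3-letter code")
```

Rejecting bad records at the boundary, rather than letting them propagate into the warehouse, is the core of the data-quality responsibility this listing describes.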
 
Job ID: 8397445