
Say hello to your next job

For the first time in Israel:
AI-based recommendations that improve
your chances of finding a job

Data Engineer




Jobs on the Hot Board
 
Exclusive position
3 days ago
Hiring at GotFriends
Location: Multiple locations
Job Type: Full Time
The company develops an AI/NLP-based platform that scans online data sources to detect violations of the law committed by large companies and corporations against private citizens.

The company is based in Tel Aviv (near a train line), operates a hybrid work model, and employs hundreds of people globally.

About the role: you will join a team of 4, reporting to a team lead; the team is responsible for building the platform from scratch. The work includes full ownership of the design, build, and scale of a cloud-native data platform for the company's AI and LLM solutions; building and maintaining data lakes and data warehouses; developing complex batch and streaming data pipelines; connecting AI/LLM workloads to production; and managing and optimizing databases: Postgres, Elasticsearch, vector DBs, and graph DBs.
Requirements:
- 7 years of experience as a Data Engineer
- Experience developing in Python
- Experience developing on AWS
- Experience with data lake platforms: Databricks, Snowflake
The position is intended for both women and men.
 
Show more...
Apply
Update your resume before sending
8525804
Hiring at Nishapro
Location: More than one
Job Type: Full Time and Hybrid work
For a fast-growing company in the renewable energy sector, a Data Architect is required to lead the organization's data infrastructure.
What you'll do:

Design and own a modern cloud-based data platform (Azure or equivalent).
Connect everything from SCADA/EMS/BMS to market APIs, ERP (Priority), and finance systems.
Define scalable models, ensure data quality and security, and deliver trusted datasets.
Empower teams with analytics and Power BI dashboards.
Lay the groundwork for AI/ML: predictive maintenance, Storage optimization, forecasting.
Shape the long-term data strategy and grow the team.
Requirements:
What we're looking for:

5+ years in data architecture/engineering.
Hands-on with cloud platforms, SQL, and Python.
Strong ETL/ELT experience with large-scale datasets (time-series/IoT a plus).
Bonus: energy/industrial data, Priority ERP, energy trading.
This position is open to all candidates.
 
Show more...
Apply
Update your resume before sending
8413026
4 days ago
Hiring at Data Core
Location: Multiple locations
Job Type: Full Time
Job description:
We are looking for an exceptional technology leader to head the Data Engineering team at one of Israel's largest retail companies. The role is intended for an experienced manager with a passion for data, proven leadership ability, and deep experience building complex data architectures. This is an opportunity to lead the organization's data strategy, manage large-scale projects, and work with the most advanced cloud technologies at massive scale.

- Full-time position
- Office location: Herzliya

Role description:
- Management and leadership: direct management of the Data Engineers team (at least two years of experience), task prioritization, professional mentoring, and ownership of the technology roadmap.
- Planning and architecture: end-to-end project design, building data lakes and complex ETL/ELT processes.
- Quality control: performing code reviews, ensuring system performance, and maintaining high data quality.
- Professional authority: providing technological solutions to complex problems and instilling working methodologies (Agile, CI/CD).
Requirements:
- Management experience: proven experience managing a development/data team (required).
- GCP experience: at least one year of deep hands-on experience with Google Cloud Platform data infrastructure such as BigQuery, Dataflow, and Cloud Composer/Airflow (required).
- ETL expertise: extensive experience designing and developing complex ETL/ELT processes over a data lake and data warehouse.
- Technical knowledge: full command of Python and very strong SQL.
- Methodologies: deep understanding of and experience with standard development methodologies, Git, and cloud environments.

Advantages:
- Experience at a large retail company or with high data volumes
- Familiarity with streaming tools such as Kafka and Pub/Sub
- Experience writing infrastructure as code (Terraform)
- Bachelor's degree in computer science, software engineering, or a related field.
The position is intended for both women and men.
 
Show more...
Apply
Update your resume before sending
8529760
29/01/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a Data Engineer to join our dynamic data team. In this role, you will design, build, and maintain robust data systems and infrastructure that support data collection, processing, and analysis. Your expertise will be crucial in developing scalable data pipelines, ensuring data quality, and collaborating with cross-functional teams to deliver actionable insights.

Key Responsibilities:

Design, develop, and maintain scalable ETL processes for data transformation and integration.
Build and manage data pipelines to support analytics and operational needs.
Ensure data accuracy, integrity, and consistency across various sources and systems.
Collaborate with data scientists and analysts to support AI model deployment and data-driven decision-making.
Optimize data storage solutions, including data lakehouses and databases, to enhance performance and scalability.
Monitor and troubleshoot data workflows to maintain system reliability.
Stay updated with emerging technologies and best practices in data engineering.
Requirements:
3+ years of experience in data engineering or a related role within a production environment.
Proficiency in Python and SQL
Experience with both relational (e.g., PostgreSQL) and NoSQL databases (e.g., MongoDB, Elasticsearch).
Familiarity with AWS big data tools and frameworks such as Glue, EMR, and Kinesis.
Experience with containerization tools like Docker and Kubernetes.
Strong understanding of data warehousing concepts and data modeling.
Excellent problem-solving skills and attention to detail.
Strong communication skills, with the ability to work collaboratively in a team environment.
This position is open to all candidates.
 
Show more...
Apply
Update your resume before sending
8524239
29/01/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
Our data engineering team is looking for an experienced professional with expertise in SQL, Python, and strong data modeling skills. In this role, you will be at the heart of our data ecosystem, designing and maintaining cross-engineering initiatives and projects, as well as developing high-quality data pipelines and models that drive decision-making across the organization.

You will play a key role in ensuring data quality, building scalable systems, and supporting cross-functional teams with clean, accurate, and actionable data.



What you will do:

Design, develop, and optimize data services and solutions required to support various company products (like FeatureStore or synthetic data management). Work closely with data analysts, data scientists, engineers, and cross-functional teams to understand data requirements and deliver high-quality solutions.
Design and integrate LLM- and agent-based capabilities into data platforms and services, enabling smarter data operations and AI-driven data products.
Design, develop, and optimize scalable data pipelines to ensure data is clean, accurate, and ready for analysis.
Build and maintain robust data models that support clinical, business intelligence, and operational needs.
Implement and enforce data quality standards, monitoring, and best practices across systems and pipelines.
Manage and optimize large-scale data storage and processing systems to ensure reliability and performance.
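One of the responsibilities above is enforcing data-quality standards across systems and pipelines. A minimal sketch of what a row-level quality check can look like (plain Python; the field names and rules here are invented for illustration, not taken from this company's stack):

```python
# Toy data-quality gate: split incoming rows into valid rows and errors.
# "required" fields are an invented rule for illustration.

def check_rows(rows, required=("id", "timestamp")):
    """Return (valid_rows, errors); a row fails if any required field is missing or None."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        missing = [f for f in required if row.get(f) is None]
        if missing:
            errors.append((i, missing))  # record row index and what was missing
        else:
            valid.append(row)
    return valid, errors

rows = [{"id": 1, "timestamp": "2026-01-29"}, {"id": None, "timestamp": "2026-01-29"}]
valid, errors = check_rows(rows)
print(len(valid), errors)  # 1 [(1, ['id'])]
```

In a real pipeline a gate like this would feed monitoring and quarantine failed rows rather than silently dropping them.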
Requirements:
5+ years of experience as a Data Engineer / Backend Engineer (with strong emphasis on data processing)
Python Proficiency: Proven ability to build services, solutions, data pipelines, automations, and integration tools using Python.
SQL Expertise: Deep experience in crafting complex queries, optimizing performance, and working with large datasets.
Strong knowledge of data modeling principles and best practices for relational and dimensional data structures.
A passion for maintaining data quality and ensuring trust in business-critical data.
This position is open to all candidates.
 
Show more...
Apply
Update your resume before sending
8523783
29/01/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
About us:
We are international Multi-Cloud experts, using the power of the cloud for smart digital transformation. With 5 sites across 4 continents, 450+ experts, 1,000+ customers, and 30+ years of proven experience, our mission is to deliver the best Multi-Cloud service to our customers, accelerate their business, and help them grow. To help our customers stay on top of their game, our teams are constantly developing new strategies and tools that improve cloud performance, spending, visibility, control, and automation. Our cloud experts make any digital transformation a quick, smart, and easy process.
What You'll Do:
Design, build, and maintain data pipelines and infrastructure
Develop and implement data quality checks and monitoring processes
Work with engineers to integrate data into our systems and applications
Collaborate with scientists and analysts to understand their data needs.
Requirements:
3 years of experience as a Data Engineer or in a related role
Experience with big data technologies such as Hadoop, Spark, or Elasticsearch.
Proven experience in designing, building, and maintaining data pipelines and infrastructure
Service in Unit 8200 or another technology unit - an advantage.
This position is open to all candidates.
 
Show more...
Apply
Update your resume before sending
8523478
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a Data/Backend Engineer with strong hands-on experience in Python-based ETL pipelines and cloud-native development on AWS.
You will work directly with the customer's technical leadership, participating in architecture discussions, solution design, and ongoing implementation for a long-term project.
Core Responsibilities:
Develop and maintain Python ETL pipelines, data ingestion flows, and processing jobs.
Build backend components and microservices deployed on AWS Kubernetes (EKS).
Integrate and optimize data models, APIs, and batch/real-time processes.
Collaborate with technical stakeholders to understand requirements and translate them into scalable solutions.
Ensure reliability, observability, and performance of pipelines and services.
Work in a hybrid environment, coordinating closely with onsite teams and customer technical representatives.
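The responsibilities above center on Python ETL pipelines. A minimal sketch of the extract/transform/load pattern the posting refers to, using plain Python and in-memory stand-ins (the role's actual AWS services, schemas, and data sources are not specified here, so all names below are invented):

```python
# Toy ETL: extract raw dicts, normalize them, load survivors into a sink.

def extract(rows):
    """Pretend source: yield raw records (stand-in for an ingestion flow)."""
    yield from rows

def transform(record):
    """Normalize field names and types; drop records without an id."""
    if record.get("id") is None:
        return None
    return {"id": int(record["id"]), "name": str(record.get("name", "")).strip()}

def load(records, sink):
    """Append cleaned records to an in-memory sink (stand-in for S3 or a DB)."""
    for r in records:
        if r is not None:
            sink.append(r)
    return len(sink)

raw = [{"id": "1", "name": " Ada "}, {"name": "no-id"}, {"id": 2, "name": "Bob"}]
sink = []
count = load((transform(r) for r in extract(raw)), sink)
print(count, sink)  # 2 [{'id': 1, 'name': 'Ada'}, {'id': 2, 'name': 'Bob'}]
```

The generator chaining keeps the pipeline streaming-friendly: nothing is materialized until `load` consumes it.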
Requirements:
Required Skills & Experience:
3-5+ years of hands-on backend or data engineering experience.
Strong proficiency in Python, specifically for ETL/data pipelines.
Experience with AWS services (Lambda, S3, EC2, IAM, VPC, etc.).
Hands-on work with Kubernetes (preferably AWS EKS).
Understanding of CI/CD processes and Git-based workflows.
Ability to work independently with customer teams and present technical solutions clearly.
Nice to Have:
Experience with Neo4j or other graph databases.
Experience with Frontend (React) development.
Knowledge of streaming platforms, data modeling, or distributed systems.
Experience working in consulting, retainer, or hourly engagement models.
Personal Qualities:
Strong communication skills, especially in customer-facing environments.
Self-driven, responsible, and comfortable handling end-to-end technical tasks.
Team player with the ability to work onsite when needed.
This position is open to all candidates.
 
Show more...
Apply
Update your resume before sending
8521670
27/01/2026
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We're seeking a Mid-to-Senior Data Engineering Manager to join our Cloud Identity & Perimeter team, a critical component of CrowdStrike's security infrastructure. Our team develops and maintains complex data pipelines that process billions of records daily, analyzing identity-related security patterns, effective permissions, internet exposure, and attack paths. We're at the forefront of securing enterprise identities and delivering actionable security insights at scale.

What You'll Do:

Lead and mentor a team of data engineers building high-performance data processing pipelines.

Define technical strategy and architecture for complex data transformations using Apache Spark.

Oversee development of real-time data streaming solutions using Kafka and scalable ETL processes.

Drive cross-functional collaboration with product, security research, and engineering teams to deliver high-impact security features.

Establish best practices for data modeling, query optimization, and storage patterns for large-scale distributed systems.

Manage team performance, including hiring, career development, and performance reviews.

Balance technical execution with strategic planning and resource allocation.

Champion operational excellence through code reviews, system monitoring, and continuous improvement initiatives.
Requirements:
What You'll Need:

5+ years of experience in data engineering with 2+ years in a technical leadership or management role.

Strong technical background in Go and/or Java with hands-on experience in big data technologies.

Proven experience managing and scaling engineering teams.

Deep expertise with distributed databases (Cassandra, Elasticsearch) and production-grade data pipeline architecture.

Track record of delivering complex data infrastructure projects on time.

Excellent communication skills - ability to translate technical concepts to non-technical stakeholders.

Strong leadership abilities including mentoring, conflict resolution, and team building.

BS/MS in Computer Science or related field, or equivalent experience.

Experience with cloud platforms (AWS, GCP, Azure).


Bonus Points:

Experience with identity and access management concepts or security analytics platforms.

Knowledge of security analytics and threat detection.

Contributions to open-source projects or technical community leadership.

Background in cybersecurity or security analytics.

Experience managing distributed or remote teams.

Prior experience at a high-growth technology company.
This position is open to all candidates.
 
Show more...
Apply
Update your resume before sending
8520053
25/01/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a talented Data Engineer to join our Data team and spread our power. The Data team develops and maintains the infrastructure for internal data and product analytics. We are looking for data enthusiasts: independent, logical thinkers with a can-do approach and a passion for problem-solving. We run a SaaS-based stack using BigQuery, Snowflake, and dbt.
WHAT YOU'LL DO
Design, build, and maintain data pipelines, datasets and catalogs for fast-growing products and business groups.
Develop self-service data analytics solutions and infrastructure.
Support ad hoc needs and requests of internal stakeholders.
Collaborate with analysts, engineers, and internal customers from Product, Finance, Revenue, and Marketing.
Requirements:
Bachelor's or Master's degree in a relevant technical field, or equivalent hands-on experience in software development or DevOps.
3+ years of experience working as a Data Engineer, including end-to-end designing, orchestrating, and building cloud-based data pipelines (e.g., Airflow, Prefect, Dagster).
3+ years of experience with dimensional data modeling and data warehouse implementation, specifically MPP databases like BigQuery, Snowflake, and Redshift.
Strong knowledge of Python and Python-based data analysis tools such as Jupyter Notebooks and pandas.
Strong SQL writing skills. Ability to write highly performant queries.
Strong track record of executing projects independently in dynamic environments.
Fast understanding of data and business needs and ability to translate them into data models.
Team player with excellent communication skills.
Containerization (Docker): Essential for reproducible environments.
Knowledge of software engineering best practices: CI/CD concepts, code reviews, and unit testing.
ADVANTAGE
Production-level experience with dbt, including project design, transformation, testing, and documentation.
Infrastructure-as-Code (Terraform): Managing cloud resources (S3 buckets, IAM roles) via code.
CI/CD pipelines (GitHub Actions/Jenkins): Automating the testing and deployment of data models.
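The requirements above emphasize dimensional data modeling. At its simplest, that means fact tables resolved against dimension tables; here is a toy star-schema aggregation in plain Python (invented tables, not this posting's actual BigQuery/Snowflake warehouse):

```python
# Toy star schema: a fact table of orders keyed to a customer dimension,
# aggregated to revenue by region.

dim_customer = {
    1: {"customer": "Acme", "region": "EU"},
    2: {"customer": "Globex", "region": "US"},
}
fact_orders = [
    {"customer_id": 1, "amount": 100},
    {"customer_id": 2, "amount": 250},
    {"customer_id": 1, "amount": 50},
]

# Resolve the dimension key per fact row, then sum by the region attribute.
revenue_by_region = {}
for row in fact_orders:
    region = dim_customer[row["customer_id"]]["region"]
    revenue_by_region[region] = revenue_by_region.get(region, 0) + row["amount"]

print(revenue_by_region)  # {'EU': 150, 'US': 250}
```

In a warehouse this is a `JOIN` plus `GROUP BY` over MPP storage; the shape of the model (facts narrow and keyed, dimensions descriptive) is the part that carries over.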
This position is open to all candidates.
 
Show more...
Apply
Update your resume before sending
8515929
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We are looking for a Principal Data Engineer to join our Engineering team. This is a hybrid role based in Israel, reporting to the Senior Manager, Software Development Engineering. You will play a key role in designing, building, and maintaining the data pipelines that fuel business insights for the world's largest cloud security platform. You will bring vision and passion to a team of experts enabling organizations worldwide to harness speed and agility through a cloud-first strategy.
What you'll do (Role Expectations):
Design, develop, and implement scalable, resilient data pipelines using Apache Spark, Databricks, and Delta Lake
Lead projects end-to-end through implementation, testing, deployment, and monitoring while collaborating with engineering teams across the organization
Research and investigate new SaaS platforms and APIs to uncover opportunities for new security detections
Optimize data infrastructure to ensure smooth operation, performance efficiency, and data integrity throughout the data lifecycle
Act as a force multiplier by mentoring senior engineers and setting high technical standards across the team.
Requirements:
Who You Are (Success Profile):
You thrive in ambiguity. You're comfortable building the path as you walk it. You thrive in a dynamic environment, seeing ambiguity not as a hindrance, but as the raw material to build something meaningful.
You act like an owner. Your passion for the mission fuels your bias for action. You operate with integrity because you genuinely care about the outcome. True ownership involves leveraging dynamic range: the ability to navigate seamlessly between high-level strategy and hands-on execution.
You are a problem-solver. You love running towards the challenges because you are laser-focused on finding the solution, knowing that solving the hard problems delivers the biggest impact.
You are a high-trust collaborator. You are ambitious for the team, not just yourself. You embrace our challenge culture by giving and receiving ongoing feedback, knowing that candor delivered with clarity and respect is the truest form of teamwork and the fastest way to earn trust.
You are a learner. You have a true growth mindset and are obsessed with your own development, actively seeking feedback to become a better partner and a stronger teammate. You love what you do and you do it with purpose.
What We're Looking for (Minimum Qualifications):
7+ years of experience in data engineering with a proven track record of designing and building large-scale data systems
Demonstrated mastery of Apache Spark (PySpark preferred) and a deep architectural understanding of distributed data processing frameworks
Expert-level proficiency in SQL, data modeling, and the principles of modern data warehousing and data lake architecture
What Will Make You Stand Out (Preferred Qualifications):
Direct experience working in cybersecurity, threat intelligence, or a related security-focused domain
Proven experience building production-grade data solutions specifically using Databricks
A security-first mindset and a strong curiosity for understanding how systems can be monitored and secured.
This position is open to all candidates.
 
Show more...
Apply
Update your resume before sending
8515641
Location: Tel Aviv-Yafo
Job Type: Full Time
We are Crafting Tomorrow's Retail Experience
We started with a simple question: how can vision AI make shopping better for everyone? Today, our team of passionate experts is transforming how people shop and how retailers operate, one store at a time.
Our goal is to solve real retail challenges by developing advanced computer vision AI that prevents loss, enables grab-and-go shopping, and turns store data into valuable insights, all while delivering clear value from day one.
With multiple global deployments, deep partnerships with leading retailers, and a product suite that's proving its impact every day, our company isn't just imagining the future of retail, we're building it.
We are looking for a Data Engineer to join our Customer Performance team and help shape the foundation of our company's data platform.
You will own the data layer, evolve our BI semantic layer on a modern BI platform, and enable teams across the company to access and trust their data. Working closely with analysts and stakeholders in Operations, Product, R&D, and Customer Success, you'll turn complex data into insights that drive performance and customer value.
A day in the life
Build and expand our companys analytics data stack
Design and maintain curated data models that power self-service analytics and dashboards
Develop and own the BI semantic layer on a modern BI platform
Collaborate with analysts to define core metrics, KPIs, and shared business logic
Partner with Operations, R&D, Product, and Customer Success teams to translate business questions into scalable data solutions
Ensure data quality, observability, and documentation across datasets and pipelines
Support complex investigations and ad-hoc analyses that drive customer and operational excellence.
Requirements:
6+ years of experience as a Data Engineer, Analytics Engineer, or similar hands-on data role
Strong command of SQL and proficiency in Python for data modeling and transformation
Experience with modern data tools such as dbt, BigQuery, and Airflow (or similar)
Proven ability to design clean, scalable, and analytics-ready data models
Familiarity with BI modeling and metric standardization concepts
Experience partnering with analysts and stakeholders to deliver practical data solutions
A pragmatic, problem-solving mindset and ownership of data quality and reliability
Excellent communication skills and ability to connect technical work with business impact
Nice to have
Experience implementing or managing BI tools such as Tableau, Looker, or Hex
Understanding of retail, computer vision, or hardware data environments
Exposure to time-series data, anomaly detection, or performance monitoring
Interest in shaping a growing data organization and influencing its direction.
This position is open to all candidates.
 
Show more...
Apply
Update your resume before sending
8514427
22/01/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
A hands-on Integration Developer for the Technology and Digital team, to develop and maintain API interfaces (REST/SOAP) and data processes between core systems, external systems, and BI, including work with SQL, CI/CD, and healthcare-industry standards.
Tel Aviv.
CVs by email. The position is intended for both women and men.
 
Show more...
Apply
Update your resume before sending
8513265
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking an experienced Solutions Data Engineer who possesses both technical depth and strong interpersonal skills to partner with internal and external teams to develop scalable, flexible, cutting-edge solutions. Solutions Engineers collaborate with operations and business development to help craft solutions to customer business problems.
A Solutions Engineer balances the various aspects of a project, from safety to design, researches advanced technology and best practices in the field, and seeks cost-effective solutions.
Job Description:
We're looking for a Solutions Engineer with deep experience in Big Data technologies, real-time data pipelines, and scalable infrastructure; someone who has been delivering critical systems under pressure and knows what it takes to bring complex data architectures to life. This isn't just about checking boxes on tech stacks: it's about solving real-world data problems, collaborating with smart people, and building robust, future-proof solutions.
In this role, you'll partner closely with engineering, product, and customers to design and deliver high-impact systems that move, transform, and serve data at scale. You'll help customers architect pipelines that are not only performant and cost-efficient but also easy to operate and evolve.
We want someone who is comfortable switching hats between low-level debugging, high-level architecture, and communicating clearly with stakeholders of all technical levels.
Key Responsibilities:
Build distributed data pipelines using technologies like Kafka, Spark (batch and streaming), Python, Trino, Airflow, and S3-compatible data lakes, designed for scale, modularity, and seamless integration across real-time and batch workloads.
Design, deploy, and troubleshoot hybrid cloud/on-prem environments using Terraform, Docker, Kubernetes, and CI/CD automation tools.
Implement event-driven and serverless workflows with precise control over latency, throughput, and fault tolerance trade-offs.
Create technical guides, architecture docs, and demo pipelines to support onboarding, evangelize best practices, and accelerate adoption across engineering, product, and customer-facing teams.
Integrate data validation, observability tools, and governance directly into the pipeline lifecycle.
Own end-to-end platform lifecycle: ingestion → transformation → storage (Parquet/ORC on S3) → compute layer (Trino/Spark).
Benchmark and tune storage backends (S3/NFS/SMB) and compute layers for throughput, latency, and scalability using production datasets.
Work cross-functionally with R&D to push performance limits across interactive, streaming, and ML-ready analytics workloads.
Operate and debug object store-backed data lake infrastructure, enabling schema-on-read access, high-throughput ingestion, advanced searching strategies, and performance tuning for large-scale workloads.
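The ingestion → transformation → storage flow described above can be mimicked with an in-memory sketch. No real Kafka, Spark, or object store is involved; the "topic" and "sink" below are invented stand-ins purely to show the event-driven shape:

```python
# Toy event-driven pipeline: produce events to an in-memory "topic",
# then consume them, transform each event, and write results to a sink.

from collections import deque

topic = deque()   # stand-in for a Kafka topic/partition
sink = []         # stand-in for object storage (e.g. Parquet on S3)

def produce(event):
    topic.append(event)

def consume_all(transform):
    """Drain the topic, applying a transform to each event before storage."""
    while topic:
        event = topic.popleft()
        sink.append(transform(event))

produce({"user": "a", "bytes": 100})
produce({"user": "b", "bytes": 300})
consume_all(lambda e: {**e, "kb": e["bytes"] / 1024})
print(len(sink))  # 2
```

Real systems add the parts that make this hard: partitioning, consumer offsets, fault tolerance, and backpressure, which is where the latency/throughput trade-offs mentioned above live.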
Requirements:
2-4 years in software / solution or infrastructure engineering, with 2-4 years focused on building / maintaining large-scale data pipelines / storage & database solutions.
Proficiency in Trino, Spark (Structured Streaming & batch) and solid working knowledge of Apache Kafka.
Coding background in Python (must-have); familiarity with Bash and scripting tools is a plus.
Deep understanding of data storage architectures including SQL, NoSQL, and HDFS.
Solid grasp of DevOps practices, including containerization (Docker), orchestration (Kubernetes), and infrastructure provisioning (Terraform).
Experience with distributed systems, stream processing, and event-driven architecture.
Hands-on familiarity with benchmarking and performance profiling for storage systems, databases, and analytics engines.
Excellent communication skills: you'll be expected to explain your thinking clearly, guide customer conversations, and collaborate across engineering and product teams.
This position is open to all candidates.
 
Show more...
Apply
Update your resume before sending
8512434
21/01/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
Your Mission: as a Senior Data Engineer, your mission is to build the scalable, reliable data foundation that empowers us to make data-driven decisions. You will serve as a bridge between complex business needs and technical implementation, translating raw data into high-value assets. You will own the entire data lifecycle, from ingestion to insight, ensuring that our analytics infrastructure scales as fast as our business.

Key Responsibilities:
Strategic Data Modeling: Translate complex business requirements into efficient, scalable data models and schemas. You will design the logic that turns raw events into actionable business intelligence.
Pipeline Architecture: Design, implement, and maintain resilient data pipelines that serve multiple business domains. You will ensure data flows reliably, securely, and with low latency across our ecosystem.
End-to-End Ownership: Own the data development lifecycle completely, from architectural design and testing to deployment, maintenance, and observability.
Cross-Functional Partnership: Partner closely with Data Analysts, Data Scientists, and Software Engineers to deliver end-to-end data solutions.
Requirements:
What You Bring:
Your Mindset:
Data as a Product: You treat data pipelines and tables with the same rigor as production APIs; reliability, versioning, and uptime matter to you.
Business Acumen: You don't just move data; you understand the business questions behind the query and design solutions that provide answers.
Builder's Spirit: You work independently to balance functional needs with non-functional requirements (scale, cost, performance).
Your Experience & Qualifications:
Must Haves:
6+ years of experience as a Data Engineer, BI Developer, or similar role.
Modern Data Stack: Strong hands-on experience with DBT, Snowflake, Databricks, and orchestration tools like Airflow.
SQL & Modeling: Strong proficiency in SQL and deep understanding of data warehousing concepts (Star schema, Snowflake schema).
Data Modeling: Proven experience in data modeling and business logic design for complex domains, building models that are efficient and maintainable.
Modern Workflow: Proven experience leveraging AI assistants to accelerate data engineering tasks.
Bachelor's degree in Computer Science, Industrial Engineering, Mathematics, or an equivalent analytical discipline.
Preferred / Bonus:
Cloud Data Warehouses: Experience with BigQuery or Redshift.
Coding Skills: Proficiency in Python for data processing and automation.
Big Data Tech: Familiarity with Spark, Kubernetes, Docker.
BI Integration: Experience serving data to BI tools such as Looker, Tableau, or Superset.
This position is open to all candidates.
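The star and snowflake schemas named in the requirements above can be sketched briefly. This is a hypothetical illustration (table names and data are invented), using SQLite in place of a warehouse such as Snowflake or Databricks:

```python
# Minimal star-schema sketch: one dimension table, one fact table,
# and the join-and-aggregate query pattern that the schema enables.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per product, holding descriptive attributes.
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT)")
# Fact table: one row per sale event, keyed to the dimension.
cur.execute("CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY, product_id INTEGER, amount REAL)")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(10, 1, 5.0), (11, 1, 7.5), (12, 2, 3.0)])

# A typical star-schema query: join fact to dimension, group by an attribute.
rows = cur.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.category
""").fetchall()
print(rows)  # [('Hardware', 15.5)]
```

A snowflake schema would further normalize `dim_product`, e.g. splitting `category` into its own table referenced by key.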
 
Job ID: 8511741
21/01/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
Are you a talented and experienced Data Engineer? If so, we want you to be part of our dynamic Data Engineering team, part of R&D, contributing to our vision and making a difference in the eCommerce landscape. Join us on this journey as we seek the best and brightest minds to drive our mission forward.

Responsibilities:
Developing, implementing, and supporting robust, scalable solutions that improve business analysis capabilities.
Designing, implementing, and maintaining data pipelines from multiple sources.
Translating business priorities into data models in partnership with business analysts and product analysts.
Collaborating across the business with stakeholders such as data developers, systems analysts, data scientists, and software engineers.
Owning the entire data development process, including business knowledge, methodology, quality assurance, and maintenance.
Working independently while considering all functional and non-functional aspects, and delivering high-quality, robust infrastructure to the organization.
Requirements:
What you need:
Bachelor's degree in Computer Science, Industrial Engineering, Mathematics, or an equivalent analytical discipline.
4 years of experience working as a BI Developer / Data Engineer or in a similar role.
Advanced proficiency and deep understanding of SQL.
Skills in data modeling, business logic processes, as well as experience with DWH design.
An enthusiastic, fast-learning, motivated team player who loves data.

Advantage:
Experience working with DBT (big advantage).
Knowledge in BI tools such as Looker, Tableau or Superset.
Experience with Python.
Experience working with DWH, such as BigQuery/Snowflake/Redshift.
Experience working with Spark, Kubernetes, Docker.
This position is open to all candidates.
 
Job ID: 8511686
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Data Engineer to join our growing team!
This is a great opportunity to be part of one of the fastest-growing infrastructure companies in history, an organization that is in the center of the hurricane being created by the revolution in artificial intelligence.
In this role, you will be responsible for:
Designing, building, and maintaining scalable data pipeline architectures
Developing ETL processes to integrate data from multiple sources
Creating and optimizing data models for efficient storage and retrieval
Implementing data quality controls and monitoring systems
Collaborating with data scientists and analysts to deliver data solutions
Building and maintaining data warehouses and data lakes
Performing in-depth data analysis and providing insights to stakeholders
Taking full ownership of data quality, documentation, and governance processes
Building and maintaining comprehensive reports and dashboards
Ensuring data security and regulatory compliance.
Requirements:
Bachelor's degree in Computer Science, Engineering, or related field
3+ years experience in data engineering
Strong proficiency in SQL and Python
Experience with ETL tools and data warehousing solutions
Knowledge of big data technologies (Hadoop, Spark, etc.)
Experience with cloud platforms (AWS, Azure, or GCP)
Understanding of data modeling and database design principles
Familiarity with data visualization tools such as Tableau or Sisense
Strong problem-solving and analytical skills
Excellent communication and collaboration abilities
Experience with version control systems (Git).
This position is open to all candidates.
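The ETL responsibility listed above (integrating data from multiple sources) follows a standard extract-transform-load shape. A minimal sketch under invented assumptions (the CSV source, the `purchases` table, and the drop-negative-amounts rule are all illustrative), using SQLite as the load target:

```python
# Toy ETL pipeline: extract from a CSV source, transform (type casting
# and filtering), and load into a warehouse-style table.
import csv
import io
import sqlite3

# Extract: parse rows from the source (an in-memory CSV stands in for a real feed).
raw = io.StringIO("user_id,amount\n1, 10.0\n2,-3.5\n1, 4.25\n")
records = list(csv.DictReader(raw))

# Transform: cast types and drop refunds (negative amounts) as a data-quality rule.
cleaned = [(int(r["user_id"]), float(r["amount"]))
           for r in records if float(r["amount"]) > 0]

# Load: write the cleaned rows into the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE purchases (user_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO purchases VALUES (?, ?)", cleaned)

total = conn.execute("SELECT SUM(amount) FROM purchases").fetchone()[0]
print(total)  # 14.25
```

In production the extract step would read from the actual sources and the load step would target the warehouse; the three-stage structure stays the same.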
 
Job ID: 8511545
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Senior Data Engineer
About us:
A pioneering health-tech startup on a mission to revolutionize weight loss and well-being. Our innovative metabolic measurement device provides users with a comprehensive understanding of their metabolism, empowering them with personalized, data-driven insights to make informed lifestyle choices.
Data is at the core of everything we do. We collect and analyze vast amounts of user data from our device and app to provide personalized recommendations, enhance our product, and drive advancements in metabolic health research. As we continue to scale, our data infrastructure is crucial to our success and our ability to empower our users.
About the Role:
As a Senior Data Engineer, you'll be more than just a coder: you'll be the architect of our data ecosystem. We're looking for someone who can design scalable, future-proof data pipelines and connect the dots between DevOps, backend engineers, data scientists, and analysts.
You'll lead the design, build, and optimization of our data infrastructure, from real-time ingestion to supporting machine learning operations. Every choice you make will be data-driven and cost-conscious, ensuring efficiency and impact across the company.
Beyond engineering, you'll be a strategic partner and problem-solver, sometimes diving into advanced analysis or data science tasks. Your work will directly shape how we deliver innovative solutions and support our growth at scale.
Responsibilities:
Design and Build Data Pipelines: Architect, build, and maintain our end-to-end data pipeline infrastructure to ensure it is scalable, reliable, and efficient.
Optimize Data Infrastructure: Manage and improve the performance and cost-effectiveness of our data systems, with a specific focus on optimizing pipelines and usage within our Snowflake data warehouse. This includes implementing FinOps best practices to monitor, analyze, and control our data-related cloud costs.
Enable Machine Learning Operations (MLOps): Develop the foundational infrastructure to streamline the deployment, management, and monitoring of our machine learning models.
Support Data Quality: Optimize ETL processes to handle large volumes of data while ensuring data quality and integrity across all our data sources.
Collaborate and Support: Work closely with data analysts and data scientists to support complex analysis, build robust data models, and contribute to the development of data governance policies.
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field.
Experience: 5+ years of hands-on experience as a Data Engineer or in a similar role.
Data Expertise: Strong understanding of data warehousing concepts, including a deep familiarity with Snowflake.
Technical Skills:
Proficiency in Python and SQL.
Hands-on experience with workflow orchestration tools like Airflow.
Experience with real-time data streaming technologies like Kafka.
Familiarity with container orchestration using Kubernetes (K8s) and dependency management with Poetry.
Cloud Infrastructure: Proven experience with AWS cloud services (e.g., EC2, S3, RDS).
This position is open to all candidates.
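The real-time streaming requirement above (Kafka-style ingestion) typically boils down to consuming events and aggregating them into time windows. Kafka itself isn't runnable here, so this sketch simulates the consume-and-window pattern; the event shape (timestamped heart-rate readings) and the 60-second window are illustrative assumptions:

```python
# Tumbling-window aggregation over a simulated event stream: each event
# is assigned to a fixed-size time bucket and averaged per bucket.
from collections import defaultdict

# Simulated stream of (timestamp_seconds, heart_rate) events.
events = [(0, 72), (10, 75), (65, 80), (70, 78), (130, 90)]

WINDOW = 60  # tumbling 60-second windows

sums, counts = defaultdict(float), defaultdict(int)
for ts, value in events:          # in production: for msg in kafka_consumer
    bucket = ts // WINDOW         # integer division maps a timestamp to its window
    sums[bucket] += value
    counts[bucket] += 1

averages = {b: sums[b] / counts[b] for b in sorted(sums)}
print(averages)  # {0: 73.5, 1: 79.0, 2: 90.0}
```

With a real consumer the loop body is unchanged; only the event source differs, which is why the windowing logic is worth testing in isolation.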
 
Job ID: 8510072
20/01/2026
Location: Tel Aviv-Yafo and Netanya
Job Type: Full Time
Security is one of the main pillars of our company's long-term strategy. We are pushing the boundaries of security analysis for both binaries and code, shifting left and delivering powerful capabilities to developers and DevOps teams. We are looking for an experienced Senior Database Engineer with deep PostgreSQL expertise and strong application-side awareness to join our DevOps organization and support multiple R&D teams at scale.
As a Senior Database Engineer, you will:
Own PostgreSQL performance, scalability, and reliability across large-scale security and scanning workloads
Deep dive into database performance issues, including query optimization, execution plans, indexing strategies, locking, bloat, vacuuming, and connection management
Work closely with application teams to optimize database access patterns from services and APIs
Review application code and SQL usage with a focus on correctness, performance, and scalability
Design, evolve, and optimize schemas and migrations for long-lived, high-volume datasets
Support and fine-tune PostgreSQL in both cloud-managed environments (such as RDS) and self-managed deployments
Establish best practices, standards, and guidelines for PostgreSQL usage across teams
Mentor engineers and raise the overall database maturity of the organization
Act as a trusted expert for troubleshooting complex production database incidents.
Requirements:
5+ years of hands-on experience working with PostgreSQL in production high-scale environments
Proven expertise in PostgreSQL performance tuning, optimization, and internals
Excellent SQL skills, including complex and performance-critical queries
Strong understanding of how applications interact with databases, including connection pools, ORMs, transaction patterns, and failure modes
Experience working with large datasets (up to a few TB in volume) and high concurrency (5,000+ connections)
Familiarity with Go and/or Python at a level sufficient for code review and performance analysis (an advantage)
Experience with both cloud-managed PostgreSQL and self-managed deployments
Solid understanding of schema design, migrations, and backward-compatible data evolution
Ability to work cross-team as part of a DevOps organization providing shared infrastructure and expertise
Strong communication skills with the ability to mentor, influence standards, and guide teams under pressure.
This position is open to all candidates.
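The indexing-strategy and execution-plan work described above can be demonstrated in miniature. PostgreSQL isn't available in this sketch, so SQLite stands in (via `EXPLAIN QUERY PLAN`, the analogue of Postgres's `EXPLAIN`); the `scans` table and data are invented for illustration:

```python
# The same query switches from a full table scan to an index lookup
# once a suitable index exists, which is the core of index tuning.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scans (id INTEGER PRIMARY KEY, repo TEXT, severity TEXT)")
conn.executemany("INSERT INTO scans (repo, severity) VALUES (?, ?)",
                 [("libfoo", "high"), ("libbar", "low")] * 100)

query = "SELECT COUNT(*) FROM scans WHERE repo = 'libfoo'"

# Plan before indexing: the engine must scan every row.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

conn.execute("CREATE INDEX idx_scans_repo ON scans (repo)")

# Plan after indexing: the engine searches the index instead.
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before[0][-1])  # typically a full scan, e.g. 'SCAN scans'
print(after[0][-1])   # an index search using idx_scans_repo
```

On PostgreSQL the workflow is the same idea at larger scale: `EXPLAIN (ANALYZE, BUFFERS)` before and after, checking that a sequential scan became an index scan and that the cost actually dropped.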
 
Job ID: 8509945