13/02/2026
Job Type: Full Time
Welcome to Chargeflow
Chargeflow is at the forefront of fintech + AI innovation, backed by leading venture capital firms. Our mission is to build a fraud-free global commerce ecosystem by leveraging the newest technology, freeing online businesses to focus on their core ideas and growth. We are building the future, and we need you to help shape it.
Who We're Looking For - The Dream Maker
We are seeking an experienced Senior Data Platform Engineer to design and scale the robust, cost-efficient infrastructure powering our groundbreaking fraud prevention solution. In this role, you will architect distributed systems and cloud-native technologies to safeguard our clients' revenue while driving technical initiatives that align with business objectives and operational efficiency. Our ultimate goal is to equip our clients with resilient safeguards against chargebacks, empowering them to protect their revenue and optimize their profitability. Join us on this thrilling mission to redefine the battle against fraud.
Your Arena
Infrastructure & FinOps: Design scalable, robust backend services while owning cloud cost management to ensure maximum resource efficiency.
High-Performance Engineering: Architect distributed systems and real-time pipelines capable of processing millions of daily transactions.
Operational Excellence: Champion Infrastructure-as-Code (IaC), security, and observability best practices across the R&D organization.
Leadership: Lead technical initiatives, mentor engineers, and drive cross-functional collaboration to solve complex infrastructure challenges.
Requirements:
What It Takes - Must Haves:
Experience: 5+ years of experience in data platform engineering, backend engineering, or infrastructure engineering.
Language Proficiency: Strong proficiency in Python and software engineering principles.
Cloud Native: Extensive experience with AWS, GCP, or Azure and cloud-native architectures.
Databases: Deep knowledge of both relational (e.g., PostgreSQL) and NoSQL databases, including performance optimization, cost tuning, and scaling strategies.
Architecture: Strong experience designing and implementing RESTful APIs, microservices architecture, and event-driven systems.
Containerization & IaC: Experience with containerization technologies (Docker, Kubernetes) and Infrastructure-as-Code (e.g., Terraform, CloudFormation).
System Design: Strong understanding of distributed systems principles, concurrency, and scalability patterns.
Nice-to-Haves
Strong Advantage: Apache Iceberg (Lakehouse/S3/Glue), Apache Spark (optimization), message queues (Kafka/Kinesis), graph databases (experience with schema design, cluster setup, and ongoing management of engines like Amazon Neptune or Neo4j).
Tech Stack: Orchestration (Temporal/Dagster/Airflow), Modern Data Stack (dbt/DuckDB), Streaming (Flink/Kafka Streams), Observability (Datadog/Grafana).
Skills: FinOps (Cost Explorer/Spot instances), CI/CD & DevOps, Data Governance (GDPR), Pydantic, and mentorship/leadership experience.
Our Story
Chargeflow is a leading force in fintech innovation, tackling the pervasive issue of chargeback fraud that undermines online businesses. Born from a deep passion for technology and a commitment to excel in eCommerce and fintech, we've developed an AI-driven solution aimed at combating the frustrations of credit card disputes. Our diverse expertise in fintech, eCommerce, and technology positions us as a beacon for merchants facing unjust chargebacks, supported by a unique success-based approach.
Backed by $49M led by Viola Growth, OpenView, Sequoia Capital and other top tier global investors, Chargeflow has embarked on a product-led growth journey. Today, we represent a tight-knit community of passionate individuals and entrepreneurs, united in our mission to revolutionize eCommerce and fight against chargeback fraud, marking us as pioneers in protecting online business revenues.
This position is open to all candidates.
 
Job ID: 8476565
12/02/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
Graduates of Information Systems or Industrial Engineering and Management: your opportunity starts here!
We are excited to announce a new Bootcamp, in collaboration with Oracle, that prepares you to work as a cloud data administrator at the economy's leading companies!
What does the program include?
Focused training with the most up-to-date syllabus in the field, personal guidance all along the way, and, most importantly, help finding stable employment once the training is complete.
Who is it for?
Graduates with a bachelor's degree in Industrial Engineering and Management with an Information Systems specialization, or a bachelor's degree in Information Systems,
or graduates of Information Systems Management.
Positions are available nationwide.
Training details:
Duration: a month and a half, 5 days a week
Hours: 9:00-17:00
Location: Tel Aviv
For details, send your CV by email. This position is intended for women and men alike.
 
Job ID: 8543120
11/02/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a talented person who is passionate about data and what makes it valuable.
This person will help us improve and evolve our telemetric data across all stages - collection, modeling, analysis, and serving.
We are looking for someone who can understand the entire path of how telemetry data is collected, processed, analyzed, improved, described, and consumed inside the organization.
Key Responsibilities
Contribute to transforming the company's telemetry data and taking it to the next level.
Assist in maintaining and improving diagnostics and monitoring data deployed across various customer hardware and software environments.
Work with large-scale data to generate actionable insights and integrate diverse internal data sources.
Help in serving, exposing, and documenting the collected data for internal teams across the organization.
Requirements:
2+ years of experience with SQL/NoSQL databases.
Experience with data platforms such as Databricks, Datadog, Snowflake - must.
DevOps skills using one of the public cloud providers - must.
Experience with data visualization tools such as Power BI - must.
Proven record of building, planning, and maintaining high-performance data pipelines.
Understanding of coding/scripting (mostly PySpark, pandas, ML libraries) - advantage.
This position is open to all candidates.
 
Job ID: 8541880
Job Type: Full Time
Required Senior Data Engineer (Customer)
Responsibilities
Design and build data solutions that support our Credit Card and Servicing business goals.
Develop advanced data pipelines to support the infrastructure, architecture and the product growth initiatives.
Create ETL/ELT processes and SQL queries to bring data to the data warehouse and other data sources.
Own and evolve data lake pipelines, maintenance, schema management, and improvements.
Collaborate with stakeholders across Product, Backend Engineering, and Data Science to align technical work with business outcomes.
Implement new tools and modern development approaches that improve both scalability and business agility.
Ensure adherence to coding best practices and development of reusable code.
Constantly monitor the data platform and make recommendations to enhance architecture, performance, and cost efficiency.
Requirements:
4+ years of experience as a Data Engineer.
4+ years of Python and SQL experience.
4+ years of experience in data modeling and building scalable ELT/ETL pipelines across leading Data Warehouses (Snowflake - Preferred, Redshift, BigQuery).
3+ years of experience designing and managing automated data pipelines using Apache Airflow.
3+ years of experience developing scalable, production-grade data models with DBT.
Hands-on experience with cloud environments (AWS preferred) and big data technologies.
Strong troubleshooting and debugging skills in large-scale systems.
Proven experience packaging applications with Docker and utilizing Argo Workflows to automate, execute, and monitor containerized task sequences.
Experience with design patterns, coding best practices.
Proficiency with Git and modern source control.
Basic Linux/Unix system administration skills.
Nice to Have:
BS/MS in Computer Science or related field.
Experience with NoSQL or large-scale DBs.
Experience with microservices architecture.
Familiarity with Airbyte or other modern ETL platforms.
Experience with Apache Spark or Apache Kafka and the broader Data Engineering ecosystem.
This position is open to all candidates.
 
Job ID: 8541632
Job Type: Full Time
Required Data Infrastructure Engineer
Binyamina & Tel Aviv
About the Role:
We use cutting-edge innovations in financial technology to bring leading data and features that allow individuals to be qualified instantly, making purchases at the point-of-sale fast, fair and easy for consumers from all walks of life.
As part of our Data Engineering team, you will not only build scalable data platforms but also directly enable portfolio growth by supporting new funding capabilities, loan sales and securitization, and improving cost efficiency through automated and trusted data flows that evolve our accounting processes.
Responsibilities:
Design and build data solutions that support our core business goals, from enabling capital market transactions (loan sales and securitization) to providing reliable insights for reducing the cost of capital.
Develop advanced data pipelines and analytics to support finance, accounting, and product growth initiatives.
Create ELT processes and SQL queries to bring data to the data warehouse and other data sources.
Develop data-driven finance products that accelerate funding capabilities and automate accounting reconciliations.
Own and evolve data lake pipelines, maintenance, schema management, and improvements.
Create new features from scratch, enhance existing features, and optimize existing functionality.
Collaborate with stakeholders across Finance, Product, Backend Engineering, and Data Science to align technical work with business outcomes.
Implement new tools and modern development approaches that improve both scalability and business agility.
Ensure adherence to coding best practices and development of reusable code.
Constantly monitor the data platform and make recommendations to enhance architecture, performance, and cost efficiency.
Requirements:
4+ years of experience as a Data Engineer.
4+ years of Python and SQL experience.
4+ years of direct experience with SQL (Redshift/Snowflake), data modeling, data warehousing, and building ELT/ETL pipelines (DBT & Airflow preferred).
3+ years of experience in scalable data architecture, fault-tolerant ETL, and data quality monitoring in the cloud.
Hands-on experience with cloud environments (AWS preferred) and big data technologies (EMR, EC2, S3, Snowflake, Spark Streaming, Kafka, DBT).
Strong troubleshooting and debugging skills in large-scale systems.
Deep understanding of distributed data processing and tools such as Kafka, Spark, and Airflow.
Experience with design patterns, coding best practices, and data modeling.
Proficiency with Git and modern source control.
Basic Linux/Unix system administration skills.
Nice to Have:
Familiarity with fintech business processes (funding, securitization, loan servicing, accounting) - huge advantage.
BS/MS in Computer Science or related field.
Experience with NoSQL or large-scale DBs.
DevOps experience in AWS.
Microservices experience.
2+ years of experience in Spark and the broader Data Engineering ecosystem.
This position is open to all candidates.
 
Job ID: 8541607
11/02/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Data Engineering Team Leader with deep expertise in building and managing data pipelines and streaming architecture. This role is ideal for an experienced and proactive leader with strong technical skills in distributed systems and data platforms. You will drive the architecture, design, and development of scalable data ingestion and processing solutions. This is an exciting opportunity to join a growing product in an enterprise environment with significant impact and room for professional growth.
This job is located in Tel Aviv (hybrid).
About Us
At our company, we're creating the industry's leading SASE platform, merging advanced security with seamless connectivity. Our mission is to empower businesses to thrive in a cloud-first world, and data is at the heart of this transformation.
Key Responsibilities
Inspire and mentor a top-tier data engineering team to deliver mission-critical solutions
Architect and optimize data ingestion, enrichment, and storage for massive scale and reliability
Collaborate with cross-functional teams to ensure seamless integration and data availability
Define best practices and enforce engineering excellence across the data domain.
Requirements:
4+ years of hands-on experience in data engineering, with strong knowledge of streaming technologies (Kafka/MSK, Flink) and distributed systems on AWS
2+ years of leadership experience in data engineering or related fields.
Strong development skills in Java and deep understanding of data modeling, ETL, and real-time analytics
Experience in developing and maintaining a multi-tenant SaaS solution on top of AWS
Experience with React - advantage
A natural leader with strong communication skills and a can-do, hands-on approach.
BSc in computer science/software engineering (or equivalent).
Fluent English (written & spoken).
This position is open to all candidates.
 
Job ID: 8541236
11/02/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
Join our company's AI research group, a cross-functional team of ML engineers, researchers and security experts building the next generation of AI-powered security capabilities. Our mission is to leverage large language models to understand code, configuration, and human language at scale, and to turn this understanding into security AI capabilities that will drive our company's future security solutions.
We foster a hands-on, research-driven culture where you'll work with large-scale data, modern ML infrastructure, and a global product footprint that impacts over 100,000 organizations worldwide.
Your Impact & Responsibilities
As a Data Engineer - AI Technologies, you will be responsible for building and operating the data foundation that enables our LLM and ML research: from ingestion and augmentation, through labeling and quality control, to efficient data delivery for training and evaluation.
You will:
Own data pipelines for LLM training and evaluation
Design, build and maintain scalable pipelines to ingest, transform and serve large-scale text, log, code and semi-structured data from multiple products and internal systems.
Drive data augmentation and synthetic data generation
Implement and operate pipelines for data augmentation (e.g., prompt-based generation, paraphrasing, negative sampling, multi-positive pairs) in close collaboration with ML Research Engineers.
Build tagging, labeling and annotation workflows
Support human-in-the-loop labeling, active learning loops and semi-automated tagging. Work with domain experts to implement tools, schemas and processes for consistent, high-quality annotations.
Ensure data quality, observability and governance
Define and monitor data quality checks (coverage, drift, anomalies, duplicates, PII), manage dataset versions, and maintain clear documentation and lineage for training and evaluation datasets.
Optimize training data flows for efficiency and cost
Design storage layouts and access patterns that reduce training time and cost (e.g., sharding, caching, streaming). Work with ML engineers to make sure the right data arrives at the right place, in the right format.
Build and maintain data infrastructure for LLM workloads
Work with cloud and platform teams to develop robust, production-grade infrastructure: data lakes / warehouses, feature stores, vector stores, and high-throughput data services used by training jobs and offline evaluation.
Collaborate closely with ML Research Engineers and security experts
Translate modeling and security requirements into concrete data tasks: dataset design, splits, sampling strategies, and evaluation data construction for specific security use.
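The data-quality responsibilities above (duplicate detection, PII screening, coverage) can be illustrated with a minimal sketch. This is not code from the posting: the function, patterns, and sample records are all hypothetical, and a production PII scanner would be far more thorough than two regexes.

```python
import hashlib
import re

# Illustrative PII patterns only; real scanners cover many more identifier types.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.\w+"),
    "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
}

def quality_report(records: list[str]) -> dict:
    """Return simple coverage / duplicate / PII counts for a text dataset."""
    # Exact-duplicate detection via content hashing.
    hashes = {hashlib.sha256(t.encode()).hexdigest() for t in records}
    report = {
        "rows": len(records),
        "empty": sum(1 for t in records if not t.strip()),
        "exact_duplicates": len(records) - len(hashes),
    }
    for name, pattern in PII_PATTERNS.items():
        report[f"pii_{name}_rows"] = sum(1 for t in records if pattern.search(t))
    return report

sample = ["hello world", "hello world", "contact: a@b.com", ""]
print(quality_report(sample))
# {'rows': 4, 'empty': 1, 'exact_duplicates': 1, 'pii_email_rows': 1, 'pii_ipv4_rows': 0}
```

In practice such checks would run as a pipeline step, with the report versioned alongside the dataset so drift across dataset releases is visible.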
Requirements:
What You Bring
3+ years of hands-on experience as a Data Engineer or ML/Data Engineer, ideally in a product or platform team.
Strong programming skills in Python and experience with at least one additional language commonly used for data / backend (e.g., SQL, Scala, or Java).
Solid experience building ETL / ELT pipelines and batch/stream processing using tools such as Spark, Beam, Flink, Kafka, Airflow, Argo, or similar.
Experience working with cloud data platforms (e.g., AWS, GCP, Azure) and modern data storage technologies (object stores, data warehouses, data lakes).
Good understanding of data modeling, schema design, partitioning strategies and performance optimization for large datasets.
Familiarity with ML / LLM workflows: train/validation/test splits, dataset versioning, and the basics of model training and evaluation (you don't need to be the primary model researcher, but you understand what the models need from the data).
Strong software engineering practices: version control, code review, testing, CI/CD, and documentation.
Ability to work independently and in collaboration with ML engineers, researchers and security experts, and to translate high-level requirements into concrete data engineering tasks.
Nice to Have
This position is intended for women and men alike.
 
Job ID: 8541065
10/02/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a seasoned R&D Group Manager to lead a group of talented engineers in designing and building scalable data pipelines and cloud-native applications. This role is perfect for an experienced and proactive person with outstanding leadership skills. The ideal candidate is a team player who has a passion for data processing and applications on a large scale. This is an incredible opportunity to join a growing product in an enterprise company with ample opportunities for professional growth.
This job is located in Tel Aviv (hybrid).
About Us
At our company, we're building the industry's leading SASE platform, merging advanced security with seamless connectivity. Our mission is to empower businesses to thrive in a cloud-first world, and data is at the heart of this transformation.
Why You'll Love This Role
Lead multiple teams designing and scaling real-time data pipelines and cloud-native applications
Work with cutting-edge technologies to build resilient, high-performance systems
Shape architecture and strategy in a fast-paced environment where your decisions impact thousands of users globally
Key Responsibilities
Inspire and mentor engineering teams to deliver mission-critical solutions for data infrastructure and backend services
Architect and oversee robust, scalable data pipelines and real-time processing systems
Drive design and implementation of microservices and serverless applications using AWS and Azure ADX
Collaborate with product managers, data scientists, and stakeholders to turn business needs into technical excellence
Foster a culture of innovation, ownership, and continuous improvement
Manage hiring, performance reviews, and career development for team members.
Requirements:
8+ years in software development, with at least 3 years leading multiple teams
Proven experience building large-scale data pipelines and distributed systems
Strong hands-on experience with AWS services and stream processing (Kafka)
Proficiency in Python and Java, and deep understanding of ETL and data models
Experience in developing and maintaining a multi-tenant SaaS solution on top of AWS
Exceptional leadership, communication, and collaboration skills
Bachelor's or Master's degree in Computer Science, Engineering, or related field
A team player with excellent communication and leadership skills
Fluent English (written & spoken).
This position is open to all candidates.
 
Job ID: 8540408
10/02/2026
Location: Be'er Sheva
Job Type: Full Time
Want to take part in Israel's defense activity, at the forefront of technology?
We are looking for someone who can handle complex challenges, has exceptional communication skills, is passionate about product and UX, and brings a get-things-done attitude. Sound like you? Join us!
Our company's defense-sector division is looking for a Senior Data Engineer.
The role includes building systems that stream data in real time to map the presence of UAVs and drones.
Full-time position, in Be'er Sheva.
Is this for you?
You are welcome to send your CV by email.
Requirements:
4+ years of experience as a Data Engineer - must.
6+ years of experience with Java/Spring or Python in production environments - must.
2+ years of experience developing near-real-time systems handling around 10 million messages per day - must.
Extensive experience with data modeling - must.
Experience writing complex SQL queries - must.
Proven ability to define and build data pipelines for production systems - must.
Experience with message brokers such as Kafka or RabbitMQ - must.
Experience with NoSQL databases in production environments, such as Elastic, Cassandra, DynamoDB, or Aerospike - must.
Experience with spatial data models - advantage.
Experience with data models serving multiple producers and consumers - advantage.
Experience owning analytical data stores - advantage. This position is intended for women and men alike.
 
Job ID: 8539274
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an experienced and creative data scientist, with a software development touch, to join our Data Science family. A data scientist who understands and loves data, and lots of it.
What you will do...
Solve Applied Product Challenges: Deep-dive into the product ecosystem to identify and solve high-impact problems. You will translate complex customer needs into scalable ML features that directly improve the user experience.
Lead Exposure Intelligence R&D: Develop features that unify asset data (CAASM) with threat intelligence. This includes building models for entity resolution (deduplicating assets across fragmented sources) and automated risk assessment.
Advanced NLP & Knowledge Extraction: Use NLP and LLMs to parse unstructured security data, such as CVEs, threat intel feeds, and security advisories, to automate the mapping of vulnerabilities to specific business contexts.
Predictive Prioritization: Design and optimize algorithms that go beyond static CVSS scores. You will incorporate exploitability (EPSS), reachability, and business criticality to help clients focus on the 1% of exposures that matter most.
End-to-End Ownership: Work closely with Product Managers and Data analysts and Engineers to ensure your models aren't just accurate in a notebook, but are robust, explainable, and deliver clear value within the product UI.
Graph-Based Attack Surface Mapping: Identify hidden patterns and relationships between assets, users, and vulnerabilities to visualize the potential "blast radius" of a security gap.
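The entity-resolution task mentioned above (deduplicating assets across fragmented sources) can be sketched, in a deliberately simplified form, as grouping records by a normalized blocking key. All field names and sample values below are hypothetical, not from the posting.

```python
from collections import defaultdict

# Hypothetical asset records from fragmented inventory sources.
assets = [
    {"source": "edr",  "hostname": "WEB-01.corp.local", "ip": "10.0.0.5"},
    {"source": "cmdb", "hostname": "web-01",            "ip": "10.0.0.5"},
    {"source": "scan", "hostname": "db-02.corp.local",  "ip": "10.0.0.9"},
]

def blocking_key(asset: dict) -> str:
    """Normalize the hostname into a crude matching key.

    Real entity resolution combines several weighted signals (MAC address,
    cloud instance ID, fuzzy name similarity); this sketch only lowercases
    and strips the domain suffix.
    """
    return asset["hostname"].lower().split(".")[0]

def resolve(records: list[dict]) -> dict[str, list[dict]]:
    """Cluster records that share a blocking key into one candidate entity."""
    clusters = defaultdict(list)
    for rec in records:
        clusters[blocking_key(rec)].append(rec)
    return dict(clusters)

clusters = resolve(assets)
print({k: len(v) for k, v in clusters.items()})
# {'web-01': 2, 'db-02': 1}
```

Blocking keys like this are usually only the first stage; a scoring or ML model then decides which records within a block actually refer to the same asset.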
Requirements:
Academic Background: M.Sc./PhD in Computer Science, Statistics, Engineering, or a related field (or equivalent high-level professional experience).
Industry Experience: 6+ years in Data Science, with a heavy focus on NLP and solving complex, real-world problems using ML/Deep Learning.
Technical Mastery: Hands-on experience with PyTorch, Hugging Face, scikit-learn, and SQL. Familiarity with processing large-scale datasets (PySpark, or similar) is highly valued.
Domain Awareness: Proven ability to apply statistical modeling to cybersecurity, risk management, or complex system analysis. Experience with Vulnerability Management or Graph Theory is a significant plus.
Product-Driven, Collaborative Mindset: You are obsessed with understanding the "why" behind the data. You enjoy learning the nuances of the product and the cybersecurity domain to ensure your DS solutions are highly applicable.
Curiosity & Grit: A passion for diving into "messy" data and finding the signal within the noise of the modern attack surface.
This position is open to all candidates.
 
Job ID: 8538489
Location: Tel Aviv-Yafo
Job Type: Full Time
We seek an experienced and highly skilled Senior Engineer to join our Data Platform Team. This is an exciting opportunity to work on cutting-edge technologies, drive innovation, and play a key role in designing and implementing the company's data infrastructure and architecture.
What you'll do
Database Infrastructure Development - Design, develop, and maintain scalable and reliable self-serve database infrastructure, including PostgreSQL, MongoDB, SingleStore, and AWS Aurora.
Big Data Infrastructure Development - Design, develop, and maintain scalable and reliable self-serve big data infrastructure, including cloud-agnostic data lakehouse, distributed databases, and platform-powered data pipelines.
End-to-End Ownership - Take responsibility for the entire lifecycle of data infrastructure, from DevOps (Terraform, Kubernetes) to application-level components (infrastructure libraries, core components in data pipelines).
Architecture Leadership: Collaborate with the Architecture group to define and maintain the company's core architectural vision.
Partner with development teams, providing technical guidance, best practices, and support from the design phase to production deployment.
Innovation and Exploration - Work on diverse and impactful projects across different layers of the tech stack, exploring new technologies and approaches to improve reliability, efficiency, and scalability.
Requirements:
Extensive Experience in software engineering and data infrastructure.
Extensive Expertise in the administration of OLTP and OLAP databases.
Strong knowledge of big data frameworks, including Apache Spark, Athena, Trino, and Iceberg.
Hands-on experience with DevOps tools such as Terraform, Kubernetes, and Cloud infrastructure.
Proficiency in building and managing data pipelines and core infrastructure components.
Strong problem-solving, communication, and leadership skills, with the ability to mentor and collaborate across teams.
If you're a passionate and experienced senior engineer looking to make a significant impact, we'd love to hear from you!
This position is open to all candidates.
 
Job ID: 8538484
08/02/2026
Location: Ramat Gan
Job Type: Full Time
We're seeking a strategic and collaborative Data Science Group Leader to lead a combined group of data scientists and developers, focused on researching, building, and delivering beta and experimental capabilities that explore what's next in analytics, machine learning, and Generative AI. In this role, you will ensure alignment with our organization's vision and product direction.
As a manager, you will cultivate a culture of innovation and accountability, fostering a team that serves as both a creative proving ground for ideas and a reliable pipeline for transitioning validated capabilities into our core products.
You will lead two disciplines through team leaders and senior contributors:
Data Scientists, focused on advanced analytics, ML, and GenAI
Full-Stack Software Engineers responsible for rapidly turning ideas and models into usable, customer-facing lab applications
This role sits at the intersection of AI leadership, hands-on technical guidance, and product innovation, and is designed for a leader who can bridge research and engineering disciplines rather than replace strong specialists.
Requirements:
5+ years in a managerial capacity, successfully leading team leads and individual contributors.
7+ years in applied AI, data science, or machine learning roles, with a proven track record of delivering innovative solutions in predictive analytics and machine learning, and experience designing and implementing modern Generative AI systems.
Advanced degree in Data Science, Statistics, Computer Science, or equivalent experience
A leader who excels at both building and leading teams, maintaining a strong connection to technical details while effectively navigating ambiguity and fast-paced environments.
Ability to translate business and product needs into practical and impactful AI lab initiatives.
Deep fluency in applied ML and Generative AI, with the ability to credibly engage in software architecture and system design discussions, guiding tradeoffs without needing to own every line of code
Proven experience delivering predictive analytics and ML solutions
Proven experience designing and delivering modern GenAI systems
Proven ability to leverage AI-assisted development practices, including AI coding assistants and agents, to accelerate research, prototyping, and software delivery
Advantages:
Experience in insurance, modeling, or regulated industries
Familiarity with modern AI platforms and ecosystems
External visibility through conference speaking, publications, or community contributions
This position is open to all candidates.
 
Job ID: 8536601
08/02/2026
Location: Ra'anana
Job Type: Full Time
We are seeking a hands-on Data Engineering Team Leader to oversee our real-time data domain and lead the evolution of our data platform as it transitions from a heavy build-out to stability, quality, and scale.

This critical role is centered on maintaining, hardening, and extending our event processing infrastructure, with a primary focus on adding targeted real-time processing capabilities; our decisions are deeply data-driven. Your work will directly support our analytics and enable informed business decision-making based on production data. Specifically, you will own the core data component that captures clickstream events from the website, the foundational functionality for all our analytics.
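As a rough illustration of the clickstream-capture responsibility described here, a minimal event model with basic validation might look like the following. The schema, field names, and allowed actions are invented for this sketch and are not the company's actual production model:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical clickstream event schema -- illustrative only.
@dataclass
class ClickEvent:
    user_id: str
    page: str
    action: str  # e.g. "view", "click", "scroll"
    ts: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def validate(event: ClickEvent) -> bool:
    """Basic hardening: reject malformed events before they enter the pipeline."""
    return (
        bool(event.user_id)
        and event.page.startswith("/")
        and event.action in {"view", "click", "scroll"}
    )

events = [
    ClickEvent("u1", "/jobs", "view"),
    ClickEvent("", "/jobs", "click"),      # missing user_id -> rejected
    ClickEvent("u2", "/search", "scroll"),
]
valid = [e for e in events if validate(e)]
print(len(valid))  # 2
```

In a real pipeline the validated events would be shipped to a streaming sink rather than collected in a list; this only sketches the "capture and harden" step.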

Looking forward, this role will help lead the team into its next phase: improving data quality and developer experience, investing in modern table formats (Iceberg), empowering analysts and data scientists, and continuously reducing cost and operational friction.

This is a player-coach role, combining hands-on technical leadership with ownership, prioritization, and mentorship.

What You'll Do
Own the end-to-end real-time and event data domain, including ingestion, processing, and downstream consumption.

Ensure stability, correctness, and observability of existing streaming and near-real-time pipelines.
Lead the design and implementation of select real-time processing components to support analytics and decision-making use cases.
Define clear ownership, SLAs, and best practices for event data usage across the company.
Lead the team's investment in modern data infrastructure.
Drive architectural decisions with a long-term view on maintainability, cost, and scalability.
Team Leadership & Execution

Lead, mentor, and support a team of data engineers; set technical standards and review designs and code.
Act as a hands-on contributor in critical areas of the system.
Own planning and prioritization, balancing new investments with platform stability and technical debt.
Promote a culture of quality, documentation, and operational excellence.
Partner with Data Analytics and Data Science teams to improve data accessibility, trust, and self-service capabilities.
Improve observability across the data platform (data quality, freshness, lineage, failures).
Explore and adopt AI-assisted development tools to improve engineering velocity and code quality.
Stay current with emerging technologies in data engineering and analytics, and evaluate their relevance to the platform.
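The observability work mentioned above (freshness in particular) can be sketched as a toy check, with table names and SLAs invented for illustration and no assumptions about the real stack:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical freshness check: flag tables whose latest update
# is older than that table's SLA. Names and SLAs are made up.
def stale_tables(last_updated: dict, slas: dict, now: datetime) -> list:
    return sorted(t for t, ts in last_updated.items() if now - ts > slas[t])

now = datetime(2026, 2, 8, 12, 0, tzinfo=timezone.utc)
last_updated = {
    "clickstream_raw": now - timedelta(minutes=10),
    "sessions_agg": now - timedelta(hours=5),
}
slas = {
    "clickstream_raw": timedelta(minutes=30),
    "sessions_agg": timedelta(hours=2),
}
print(stale_tables(last_updated, slas, now))  # ['sessions_agg']
```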
Requirements:
7+ years of experience in Data Engineering, including ownership of production-grade data platforms.
Proven experience in a technical leadership or lead IC role.
Strong hands-on experience with Scala and Python.
Solid experience with AWS, including S3, EC2, EMR, Kinesis, and Firehose, as well as Spark.
Strong understanding of event-driven and real-time architectures, even in maintenance-heavy environments.
Strong experience with Airflow for orchestration of batch and near-real-time data pipelines, including production operations and troubleshooting.
Deep knowledge of data warehousing and analytics workflows, including SQL.
Ability to work across teams (DevOps, IT, Dev, Product, and Analytics) and translate business needs into technical solutions.
This position is open to all candidates.
 
Job ID: 8536548
08/02/2026
Location: Ra'anana
Job Type: Full Time
A Senior Data Engineer should have strong communication and collaboration abilities, as they will work closely with other members of the data and analytics team, as well as other stakeholders, to identify and prioritize data engineering projects and to ensure that the data infrastructure is aligned with overall business goals and objectives.

What You'll Do
Work closely with data scientists/analytics and other stakeholders to identify and prioritize data engineering projects and to ensure that the data infrastructure is aligned with business goals and objectives
Design, build and maintain optimal data pipeline architecture for extraction, transformation, and loading of data from a wide variety of data sources, including external APIs, data streams, and data stores.
Continuously monitor and optimize the performance and reliability of the data infrastructure, and identify and implement solutions to improve scalability, efficiency, and security
Stay up-to-date with the latest trends and developments in the field of data engineering, and leverage this knowledge to identify opportunities for improvement and innovation within the organization
Solve challenging problems in a fast-paced and evolving environment while maintaining uncompromising quality.
Implement data privacy and security requirements to ensure solutions comply with security standards and frameworks.
Enhance the team's dev-ops capabilities.
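The extract-transform-load flow described above can be sketched minimally with only the standard library; the source records, table name, and fields below are invented stand-ins (the stub `extract` plays the role of an external API or stream):

```python
import sqlite3

# Hypothetical ETL: extract records, transform them, load into SQLite.
def extract():
    # Stub standing in for an external API or data stream.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "3.0"}]

def transform(rows):
    # Normalize types before loading.
    return [(r["id"], float(r["amount"])) for r in rows]

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS payments (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 13.5
```

Production pipelines would add orchestration (e.g. Airflow), retries, and schema checks; this only shows the shape of the three stages.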
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field
2+ years of proven experience developing large-scale software using an object-oriented or functional language.
5+ years of professional experience in data engineering, focusing on building and maintaining data pipelines and data warehouses
Strong experience with Spark, Scala, and Python, including the ability to write high-performance, maintainable code
Experience with AWS services, including EC2, S3, Athena, Kinesis/Firehose, Lambda, and EMR
Familiarity with data warehousing concepts and technologies, such as columnar storage, data lakes, and SQL
Experience with data pipeline orchestration and scheduling using tools such as Airflow
Strong problem-solving skills and the ability to work independently as well as part of a team
High-level English - a must.
A team player with excellent collaboration skills.
This position is open to all candidates.
 
Job ID: 8536545
08/02/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
As a Senior Applied Data Scientist, you will drive the architectural and algorithmic strategy for our core evaluation engine. You will collaborate closely with our engineering and product teams, and take ownership of complex technical challenges, designing solutions that are contextually efficient, robust, and scalable. You will play a key role in defining our algorithmic roadmap, ensuring our logic is optimized to handle the complexities of modern LLM applications.
Requirements:
Experience communicating complex technical concepts to non-technical stakeholders (Product, Sales, Leadership) and influencing roadmap decisions.
Proven ability to take vague, high-level requirements and break them down into actionable technical roadmaps.
M.Sc. in Computer Science, Mathematics, Physics, or a related quantitative field (Ph.D. is a plus).
At least 6 years of industry experience in Machine Learning or Data Science.
Strong proficiency in Python and the data science stack (e.g., pandas, scikit-learn, PyTorch/TensorFlow).
Experience with NLP, Large Language Models (LLMs), and agentic frameworks - a big plus.
Ability to write high-quality, production-grade code and work within a software engineering workflow (CI/CD, code reviews, testing).
Fast learner with a passion for solving hard algorithmic problems.
Results-oriented mindset, can-do attitude, and a focus on delivering impact and value.
This position is open to all candidates.
 
Job ID: 8536277