Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Data Warehouse Tech Lead to drive the technical vision and execution of the data infrastructure that powers decision-making across the company.
You'll lead both the technology and the business coordination for our data warehouse - architecting scalable solutions while working closely with stakeholders and data providers to ensure our platform serves the entire organization's needs. This role combines deep technical leadership with strategic business partnership as we build our next-generation data stack.
We believe three things matter for every role: drive to push through challenges, efficiency that keeps standards high while moving fast, and adaptability that lets you pivot with data and AI insights. These aren't buzzwords; they're how we actually work.
Our AI-first approach isn't just a tagline either. We're building the future of insurance with AI at the center, and we need people who are genuinely excited to learn and grow alongside these tools.
In this role you'll:
Lead technical architecture - design and develop scalable data warehouse solutions that support multiple products and serve the entire organization's analytics needs
Manage the technical roadmap - set strategy and guide execution for the Data Warehouse team, ensuring our platform evolves with business requirements
Drive business process coordination - translate business needs into technical requirements while establishing clear data contracts with R&D, Analytics, and external data providers
Establish and implement best practices - set technical standards for data warehouse architecture, performance tuning, and development methodologies that guide the entire team's approach to building scalable data solutions
Create and maintain sustainable data pipelines - build resilient systems capable of handling unstructured data and managing an evolving schema registry across diverse data sources
Implement advanced data modeling - create robust data structures using methodologies like dimensional modeling, and optimize ETL/ELT processes for our semantic layer
Establish data quality standards - build processes for schema evaluation, anomaly detection, and monitoring data completeness and freshness across all sources
Lead cross-team collaboration - work directly with Data Engineers, ML Platform Engineers, Data Scientists, Analysts, and Product Managers to align technical solutions with business goals
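One bullet above calls for monitoring data completeness and freshness; that kind of check can be sketched in plain Python (a minimal illustration — the lag and completeness thresholds are assumptions, not the team's actual SLAs):

```python
from datetime import datetime, timedelta, timezone

def check_source(last_loaded_at, rows_loaded, rows_expected,
                 max_lag=timedelta(hours=2), min_completeness=0.99):
    """Return (is_fresh, is_complete) for a source's latest warehouse load."""
    lag = datetime.now(timezone.utc) - last_loaded_at
    completeness = rows_loaded / rows_expected if rows_expected else 0.0
    return lag <= max_lag, completeness >= min_completeness
```

In practice a check like this would run per source on a schedule and alert the team when either flag comes back False.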
Requirements:
7+ years as a BI Engineer or Data Engineer, with 2+ in a technical leadership or architect role
Proven experience managing complex data warehouses that serve multiple products and entire organizations
Strong expertise in data modeling, ELT development, and data warehouse methodologies
Advanced SQL skills and hands-on experience with Snowflake or similar cloud-native data warehouse platforms
Extensive experience with dbt for data transformation and modeling
Python and software development experience (a strong plus)
Excellent communication skills - you can mentor technical team members and explain complex data concepts to business stakeholders
Ready to work in an office environment most days of the week
Enthusiasm about learning and adapting to the exciting world of AI - a commitment to exploring this field is a fundamental part of our culture.
This position is open to all candidates.
 
Job ID: 8594850
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced and passionate Data Group Tech Lead, Staff Engineer to join our Data Platform group. As the group's Tech Lead, you'll shape and implement the technical vision and architecture while staying hands-on across three specialized teams: Data Engineering Infra, Machine Learning Platform, and Data Warehouse Engineering, forming the backbone of our data ecosystem.
The group's mission is to build a state-of-the-art Data Platform that drives us toward becoming the most precise and efficient insurance company on the planet. By embracing Data Mesh principles, we create tools that empower teams to own their data while leveraging a robust, self-serve data infrastructure. This approach enables Data Scientists, Analysts, Backend Engineers, and other stakeholders to seamlessly access, analyze, and innovate with reliable, well-modeled, and queryable data at scale.
We believe three things matter for every role: drive to push through challenges, efficiency that keeps standards high while moving fast, and adaptability that lets you pivot with data and AI insights. These aren't buzzwords; they're how we actually work.
Our AI-first approach isn't just a tagline either. We're building the future of insurance with AI at the center, and we need people who are genuinely excited to learn and grow alongside these tools.
In this role you'll:
Technically lead the group by shaping the architecture, guiding design decisions, and ensuring the technical excellence of the Data Platforms three teams
Design and implement data solutions that address both applicative needs and data analysis requirements, creating scalable and efficient access to actionable insights
Drive initiatives in Data Engineering Infra, including building robust ingestion layers, managing streaming ETLs, and guaranteeing data quality, compliance, and platform performance
Develop and maintain the Data Warehouse, integrating data from various sources for optimized querying, analysis, and persistence, supporting informed decision-making
Leverage data modeling and transformations to structure, cleanse, and integrate data, enabling efficient retrieval and strategic insights
Build and enhance the Machine Learning Platform, delivering infrastructure and tools that streamline the work of Data Scientists, enabling them to focus on developing models while benefiting from automation for production deployment, maintenance, and improvements. Support cutting-edge use cases like feature stores, real-time models, point-in-time (PIT) data retrieval, and telematics-based solutions
Collaborate closely with other Staff Engineers across the company to align on cross-organizational initiatives and technical strategies
Work seamlessly with Data Engineers, Data Scientists, Analysts, Backend Engineers, and Product Managers to deliver impactful solutions
Share knowledge, mentor team members, and champion engineering standards and technical excellence across the organization.
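Point-in-time (PIT) data retrieval, mentioned above among the ML platform use cases, means serving the value that was current at a given timestamp and never a later one, which prevents training-time lookahead leakage. A minimal sketch under that definition, with hypothetical data:

```python
import bisect

def value_as_of(history, ts):
    """history: list of (timestamp, value) pairs sorted by timestamp.
    Return the value in effect at time ts, or None before any update."""
    times = [t for t, _ in history]
    i = bisect.bisect_right(times, ts) - 1  # last update at or before ts
    return history[i][1] if i >= 0 else None
```

A production feature store applies the same "at or before ts" rule per entity key, usually via time-travel queries rather than in-memory lists.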
Requirements:
8+ years of experience in data-related roles such as Data Engineer, Data Infrastructure Engineer, BI Engineer, or Machine Learning Platform Engineer, with significant experience in at least two of these areas
A B.Sc. in Computer Science or a related technical field (or equivalent experience)
Extensive expertise in designing and implementing Data Lakes and Data Warehouses, including strong skills in data modeling and building scalable storage solutions
Proven experience in building large-scale data infrastructures, including both batch processing and streaming pipelines
A deep understanding of Machine Learning infrastructure, including tools and frameworks that enable Data Scientists to efficiently develop, deploy, and maintain models in production, an advantage
Proficiency in Python, Pulumi/Terraform, Apache Spark, AWS, Kubernetes (K8s), and Kafka for building scalable, reliable, and high-performing data solutions
Strong knowledge of databases, including SQL (schema design, query optimization) and NoSQL
This position is open to both women and men.
 
Job ID: 8594845
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced Data Engineer to join our Data Warehouse team in TLV.
In this role, you will play a pivotal part in the Data Platform organization, leading the design, development, and maintenance of our data warehouse. In your day-to-day, you'll work on data models and backend BI solutions that empower stakeholders across the company and contribute to informed decision-making processes, all while leveraging your extensive experience in business intelligence.
This is an excellent opportunity to be part of establishing our state-of-the-art data stack, implementing cutting-edge technologies in a cloud environment.
We believe three things matter for every role: drive to push through challenges, efficiency that keeps standards high while moving fast, and adaptability that lets you pivot with data and AI insights. These aren't buzzwords; they're how we actually work.
Our AI-first approach isn't just a tagline either. We're building the future of insurance with AI at the center, and we need people who are genuinely excited to learn and grow alongside these tools.
In this role you'll:
Lead the design and development of scalable and efficient data warehouse and BI solutions that align with organizational goals and requirements
Utilize advanced data modeling techniques to create robust data structures supporting reporting and analytics needs
Implement ETL/ELT processes to assist in the extraction, transformation, and loading of data from various sources into the semantic layer
Develop processes to enforce schema evaluation, cover anomaly detection, and monitor data completeness and freshness
Identify and address performance bottlenecks within our data warehouse, optimize queries and processes, and enhance data retrieval efficiency
Implement best practices for data warehouse and database performance tuning
Conduct thorough testing of data applications and implement robust validation processes
Collaborate with Data Infra Engineers, Developers, ML Platform Engineers, Data Scientists, Analysts, and Product Managers
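The anomaly-detection work above can start very simply, for example by flagging a daily row count that deviates too far from its recent history. A stdlib sketch (the 3-sigma threshold is an illustrative assumption):

```python
import statistics

def is_anomalous(recent_counts, today_count, z_threshold=3.0):
    """Flag today's load if it sits more than z_threshold standard
    deviations away from the mean of recent daily row counts."""
    mean = statistics.mean(recent_counts)
    stdev = statistics.stdev(recent_counts)
    if stdev == 0:
        return today_count != mean
    return abs(today_count - mean) / stdev > z_threshold
```

Real warehouse monitors extend the same idea to freshness timestamps, null rates, and distribution drift per column.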
Requirements:
3+ years of experience as a BI Engineer or Data Engineer
Proficiency in data modeling, ELT development, and DWH methodologies
SQL expertise and experience working with Snowflake or similar technologies
Prior experience working with DBT
Experience with Python and software development, an advantage
Excellent communication and collaboration skills
Ready to work in an office environment most days of the week
Enthusiasm about learning and adapting to the exciting world of AI - a commitment to exploring this field is a fundamental part of our culture.
This position is open to all candidates.
 
Job ID: 8594839
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Senior Business Analyst to help build the technical foundation that powers our cross-sell growth and revenue measurement across all DTC products.
You'll join our Analytics team in Tel Aviv to create the data infrastructure that turns customer journeys into measurable expansion opportunities. As we scale across insurance lines, this role bridges the gap between raw data and the automated systems that drive our AI-powered revenue growth.
We believe three things matter for every role: drive to push through challenges, efficiency that keeps standards high while moving fast, and adaptability that lets you pivot with data and AI insights. These aren't buzzwords; they're how we actually work.
Our AI-first approach isn't just a tagline either. We're building the future of insurance with AI at the center, and we need people who are genuinely excited to learn and grow alongside these tools.
In this role you'll:
Facilitate cross-sell initiatives by building data frameworks (identity resolution, behavioral tagging) that help Company Leads and Product teams identify and measure opportunities across car, pet, and renters products
Automate revenue measurement systems by taking existing MMM, incrementality, and attribution models and building the ETL logic layer that makes them work together autonomously
Drive MarTech and revenue automation through DBT models that ensure seamless data flows between our warehouse and platforms like Google Ads and AppsFlyer
Contribute to partnership systems by supporting the technical rebuild of our partnership data infrastructure for accurate tracking and attribution of third-party leads
Collaborate with Data Platform teams to ensure revenue requirements are reflected in our data stack, acting as the technical bridge between raw data lakes and business analytics
Enable revenue squads by building shared DBT models and centralized tracking systems that Car, Pet, and Renters teams need to hit their DTC goals
Build scalable frameworks that turn analytical concepts into production-grade, automated pipelines within our data warehouse.
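Identity resolution, named in the bullets above, is at its core connected components over identifier links: if an email and a device ID ever co-occur, they belong to the same customer. A minimal union-find sketch (the identifiers are hypothetical):

```python
def resolve_identities(linked_pairs):
    """Group identifiers (emails, device IDs, user IDs) that are linked,
    directly or transitively, into one entity per customer."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for a, b in linked_pairs:
        parent[find(a)] = find(b)  # union the two components

    groups = {}
    for x in parent:
        groups.setdefault(find(x), set()).add(x)
    return list(groups.values())
```

In a warehouse this usually runs as a recursive SQL or graph job over an identifier-edge table rather than in-process Python, but the grouping logic is the same.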
Requirements:
Adaptability, drive, and an efficiency mindset - we believe these matter most in human-AI collaboration
5+ years as a Business or Marketing analyst in a technical data role, ideally in fast-paced B2C or FinTech environments
Expert-level SQL skills for writing complex, performant queries and managing large-scale datasets
Strong hands-on experience with ETL processes and data modeling - experience with DBT and Snowflake is a plus
Strong understanding of the advertising ecosystem and how data flows through Google Ads, Meta ads, AppsFlyer, and tag management systems
Ability to take existing analytical concepts and turn them into automated, production-grade pipelines
Solid understanding of revenue measurement concepts like attribution, incrementality, and MMM in a DTC context
Ability to work effectively with Engineering, Product, and Company leadership, while serving as a technical partner to the revenue analytics team, scaling impact by building the shared infrastructure and automated workflows needed to execute their work
Bachelor's degree in a quantitative field like Mathematics, Statistics, Computer Science, or similar
Ready to work in an office environment most days of the week
Enthusiasm about learning and adapting to the exciting world of AI - a commitment to exploring this field is a fundamental part of our culture.
This position is open to all candidates.
 
Job ID: 8594815
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Product Data Analytics Lead
What you're applying for:
We're transforming insurance with AI, and we need a Product Data Analytics Lead who's ready to shape how millions of people experience insurance through data-driven insights!
You'll lead our Growth Product Analytics team - the people who turn user behavior into actionable product opportunities - managing a team of analysts while working closely with Product Managers and other business stakeholders.
As part of our broader Analytics organization, you'll focus specifically on product analytics while collaborating with other Data and Analytics teams across the company.
We believe three things matter for every role: drive to push through challenges, efficiency that keeps standards high while moving fast, and adaptability that lets you pivot with data and AI insights. These aren't buzzwords; they're how we actually work.
Our AI-first approach isn't just a tagline either. We're building the future of insurance with AI at the center, and we need people who are genuinely excited to learn and grow alongside these tools.
In this role you'll:
Lead the product analytics team while fostering a culture of analytical excellence
Gain a deep understanding of our users' behavior and translate complex behavioral patterns into clear, actionable recommendations
Prioritize your team's focus and mentor them to deliver insights with high impact on product decisions
Partner with Product, Engineering, and business teams to prioritize work that moves the needle
Stay hands-on with strategic analyses while orchestrating your team's delivery on high-impact projects
Guide your team to uncover opportunities that others miss - the insights that become our competitive advantage
Help raise the analytics bar across the company while contributing to our AI-driven initiatives
Requirements:
What you'll need
7-10+ years in product analytics, with deep expertise in user behavior analysis, funnel optimization, A/B testing, and KPI development
2-3+ years successfully leading analysts, including senior team members, with a people-first leadership approach
Advanced SQL skills and hands-on experience with modern analytics and AI tools
Excellent stakeholder management-you can communicate complex insights to anyone, from engineers to executives
Genuine enthusiasm for AI and its potential to transform how we understand and serve customers
Ready to work in an office environment most days of the week.
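A/B testing expertise, listed in the requirements above, ultimately comes down to questions like "is this conversion lift statistically real?". A minimal two-proportion z-test sketch using only the standard library (the sample numbers in the test are illustrative):

```python
import math

def ab_z_score(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-score comparing variant B's conversion rate to A's."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    return (p_b - p_a) / se
```

A |z| above roughly 1.96 corresponds to significance at the 5% level for a two-sided test; production experimentation platforms add sequential-testing corrections on top of this.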
This position is open to all candidates.
 
Job ID: 8594805
27/03/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a talented AI research team lead for the Data Science group, to help us develop advanced AI solutions that will empower impactful projects across the company. The Data Science group leads the AI research and innovation efforts of the company and is focused on pushing AI boundaries to enhance the product, optimize processes, and deliver personalized experiences.
Lead a team of AI researchers
Manage and conduct advanced hands-on research projects that will impact 250M+ users
Leverage the latest advancements in LLM customization, multimodal models, knowledge representation, and conversational AI
Build horizontal AI capabilities to be used across the company
Collaborate with other internal departments to drive impactful data-driven projects.
Requirements:
5+ years of experience in data science, with experience in managing full-cycle projects from initial concept to production deployment
M.Sc. / Ph.D. in Computer Science, Mathematics, Physics, Statistics, or a related field
Comprehensive understanding of machine learning and deep learning principles and techniques, including hands-on experience
Experience in leading a data science / research team.
Specialized expertise in LLMs, Transformer architectures, and computer vision (multimodality is an advantage)
Ability to write production-ready code
Publications and talks at leading conferences - an advantage.
This position is open to all candidates.
 
Job ID: 8565143
Location: Ramat Gan
Job Type: Full Time
You will be part of our company's REM department. Our Road Experience Management (REM) is an end-to-end mapping and localization engine. The process leverages advanced algorithms, massive parallelization, and big data technologies, creating a highly complex system that demands both a deep understanding of the map-creation workflow and strong technical expertise. We're looking for a Senior Data Engineer to lead the architecture and development of large-scale, production-grade data pipelines supporting ML inference systems.
What will your job look like:
Architect and own end-to-end data pipelines for large-scale model inference
Design high-throughput, scalable data streaming to the cloud
Integrate data conversion into data collection and inference pipelines
Drive performance, scalability, and reliability across distributed systems
Partner with ML, platform, and infrastructure teams
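High-throughput streaming to the cloud, as in the bullets above, typically begins by batching records so each network call amortizes its overhead. A minimal sketch of that batching step (the batch size is an arbitrary assumption):

```python
def batched(records, max_batch=500):
    """Yield lists of up to max_batch records from any iterable,
    flushing the partial tail batch at the end."""
    batch = []
    for record in records:
        batch.append(record)
        if len(batch) >= max_batch:
            yield batch
            batch = []
    if batch:
        yield batch
```

Real pipelines layer compression, retries, and backpressure on top, but the batch boundary is where throughput tuning usually starts.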
Requirements:
All you need is:
5+ years of experience as a data engineer in production environments
Strong Python expertise
Hands-on experience with Spark, Polars, pandas, DuckDB, and AWS
Proven experience designing distributed data architectures
Strong understanding of data performance, I/O, and scalability
Experience working with ML or inference pipelines
We change the way we drive, from preventing accidents to semi- and fully autonomous vehicles. If you are an excellent, bright, hands-on person with a passion to make a difference, come lead the revolution!
This position is open to all candidates.
 
Job ID: 8579321
27/03/2026
Location: Yokne`am
Job Type: Full Time
Join our company's outstanding team and contribute to our legacy of innovation in the Senior Quality Data Analyst Engineer role! At our company, we are dedicated to pushing the boundaries of pioneering technology. The Senior Data Analyst Engineer will lead advanced data analysis and optimization for networking products in mass production. The successful candidate will leverage technical expertise and manufacturing experience to deliver actionable insights from complex datasets.
What you'll be doing:
Develop automated dashboards, KPIs, and reports to monitor production health, yield, test results, quality trends, and customer-satisfaction indicators
Identify and drive root-cause analysis of anomalies or degradations in key metrics (yield loss, reliability issues, field returns, customer complaints)
Collaborate with cross-functional teams in different time zones (such as manufacturing, test engineering, R&D, quality, and sales)
Requirements:
What we need to see:
Bachelor's or master's degree in data science, electrical engineering, industrial engineering, computer science, or a related field, or equivalent experience
8+ years of experience in a data analytics role, preferably in a high-volume electronics hardware environment
5+ years of proven experience in data collection and statistical analysis for high-volume product datasets
Proficiency in big data platforms and statistical analysis tools; skilled in data visualization (e.g., Power BI)
Ways to stand out from the crowd:
Proven experience with AI-driven tools, LLMs, and machine learning
Strong integration capabilities
Ability to operate independently and influence cross-functional teams without direct authority
Results-oriented, organized, and detail-driven
This position is open to all candidates.
 
Job ID: 8594210
26/03/2026
Location: Yokne`am
Job Type: Full Time
We are looking for an expert Data Engineer to build and evolve the data backbone for our R&D telemetry and performance analytics ecosystem. Responsibilities include processing raw, large quantities of data from live systems at the cluster level: hardware, communication units, software, and efficiency indicators. You'll be part of a fast-paced R&D organization, where system behavior, schemas, and requirements evolve constantly. Your mission is to develop flexible, reliable, and scalable data-handling pipelines that can adapt to rapid change and deliver clean, trusted data for engineers and researchers.
What you'll be doing:
Build flexible data ingestion and transformation frameworks that can easily handle evolving schemas and changing data contracts
Develop and maintain ETL/ELT workflows for refining, enriching, and classifying raw data into analytics-ready form
Collaborate with R&D, hardware, DevOps, ML engineers, data scientists, and performance analysts to ensure accurate data collection from embedded systems, firmware, and performance tools
Automate schema detection, versioning, and validation to ensure smooth evolution of data structures over time
Maintain data quality and reliability standards, including tagging, metadata management, and lineage tracking
Enable self-service analytics by providing curated datasets, APIs, and Databricks notebooks
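Automated schema detection, described above, can start as a plain diff between the field sets of two payload versions; registries like Delta Lake or a Kafka schema registry formalize the same comparison. A stdlib sketch (field names are hypothetical):

```python
def schema_diff(old, new):
    """Compare two {field: type} schemas and report what evolved."""
    added = {f: t for f, t in new.items() if f not in old}
    removed = {f: t for f, t in old.items() if f not in new}
    changed = {f: (old[f], new[f])
               for f in old.keys() & new.keys() if old[f] != new[f]}
    return added, removed, changed
```

A compatibility policy then decides which diffs are safe to auto-apply (typically additions) and which must block the pipeline (removals, type changes).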
Requirements:
What we need to see:
B.Sc. or M.Sc. in Computer Science, Computer Engineering, or a related field
5+ years of experience in data engineering, ideally in telemetry, streaming, or performance analytics domains
Proven experience with Databricks and Apache Spark (PySpark or Scala)
Understanding of streaming processes and their applications (e.g., Apache Kafka for ingestion, schema registry, event processing)
Proficiency in Python and SQL for data transformation and automation
Demonstrated knowledge of schema evolution, data versioning, and data validation frameworks (e.g., Delta Lake, Great Expectations, Iceberg, or similar)
Experience working with cloud platforms (AWS, GCP, or Azure); AWS preferred
Familiarity with data orchestration tools (Airflow, Prefect, or Dagster)
Experience handling time-series, telemetry, or real-time data from distributed systems
This position is open to all candidates.
 
Job ID: 8593679
26/03/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Data Operations Team Lead to lead and scale our data operations function. This role sits at the intersection of data, algorithms, and operations, and is critical to the success of our computer vision and AI models.
You will lead a distributed team responsible for data collection, annotation, quality control, and delivery, while working closely with Algorithm teams to understand model needs, prioritize requests, and ensure reliable, high-quality data pipelines.
This is a hands-on leadership role - you will manage people and processes, but also actively design, build, and improve data workflows.
About The Role
Lead and manage the Data Operations team, including annotation teams in Israel and a large remote team in India
Serve as the primary interface between Data Operations and Algorithm teams: understand model requirements, prioritize tasks, and plan data delivery
Own end-to-end data workflows, from model improvement needs through data definition, annotation guidelines, execution, and delivery
Ensure high data quality through close monitoring, validation, troubleshooting, and root-cause analysis
Design and maintain clear annotation guidelines, documentation, and training materials
Closely manage remote annotation teams, including weekly syncs, hands-on oversight, and deadline management
Own and operate the annotation pipeline using industry tools (e.g., CVAT)
Monitor progress, track performance, and continuously improve efficiency and quality
Own annotation budgets, monthly reporting, and validation of hours and outputs
Evaluate, benchmark, and implement new tools and processes in the data and annotation domain
Work hands-on with scripts, AI tools, and monitoring systems as needed to support data quality and operations.
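Quality control over annotations, a theme throughout the list above, often starts with a raw agreement rate between two annotators labeling the same items. A deliberately simple sketch (real pipelines usually add chance-corrected measures such as Cohen's kappa):

```python
def agreement_rate(labels_a, labels_b):
    """Fraction of items on which two annotators assigned the same label."""
    if len(labels_a) != len(labels_b):
        raise ValueError("annotators must label the same set of items")
    if not labels_a:
        return 1.0
    matches = sum(a == b for a, b in zip(labels_a, labels_b))
    return matches / len(labels_a)
```

Tracking this rate per annotator and per label class is what turns "close monitoring" into actionable retraining and guideline fixes.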
Requirements:
Proven experience leading teams, preferably including remote or global teams
Strong background in data operations, data quality, or data-centric workflows
Experience working closely with Algorithm / ML / Computer Vision teams
Strong prioritization and execution skills in a fast-paced environment
Hands-on technical mindset, including basic scripting and tool usage
Ability to create clear documentation, guidelines, and training materials
Excellent communication skills and ability to manage multiple stakeholders
High ownership mentality and attention to detail
Nice to Have
Experience managing teams in India
Experience with data annotation tools such as CVAT
Familiarity with Elastic / Kibana or similar monitoring tools
Experience in AI / Computer Vision environments
Experience evaluating and implementing new data tools.
This position is open to all candidates.
 
Job ID: 8562063
Location: Tel Aviv-Yafo
Job Type: Full Time and Temporary
We are looking for an Analytics Engineer to join Orca's next-generation big data platform. In this role, you will play a significant part in implementing algorithms and AI-driven logic into our product, helping notify customers about ocean dangers and making maritime voyages safer worldwide.

You will play a central role in building the semantic tables and business logic layer that power advanced product capabilities, real-time notifications, and data-driven decision-making across our platform. Working closely with the Data Science team, you will define high-impact features for their models and integrate algorithms directly into the semantic layer.

This is a temporary position (Maternity leave replacement)

What You'll Do

Design and implement business logic and semantic tables that serve as the foundation for advanced platform features.
Build robust, scalable services that generate real-time notifications for hundreds of ships and fleet managers worldwide.
Deliver high-quality data products that serve operational systems, algorithms, and human-facing applications.
Quickly learn Orca's maritime domain and turn complex data into reliable semantic tables that power product features.
Collaborate closely with data scientists, product teams, and engineers to embed algorithms and AI capabilities into production systems.
Be a pivotal contributor in a highly professional, impact-driven data team.
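Business logic in the semantic layer, as described above, can be as concrete as a rule deciding which hazards warrant a notification for a given vessel. A hypothetical sketch (the field names, clearance threshold, and shallow-water rule are inventions for illustration, not Orca's actual logic):

```python
def shallow_water_alerts(vessel, hazards, min_clearance_m=2.0):
    """Return ids of hazards where water depth leaves the vessel
    less than min_clearance_m of under-keel clearance."""
    return [h["id"] for h in hazards
            if h["depth_m"] - vessel["draft_m"] < min_clearance_m]
```

Encoding such rules as versioned semantic tables, rather than ad-hoc application code, is what lets notifications, dashboards, and models all agree on the same definition of "danger".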
Requirements:
B.Sc. / B.Eng. in Computer Science, Engineering, or an equivalent background.
3+ years of experience as an Analytics Engineer or Data Engineer.
Strong proficiency in Python.
Advanced SQL skills.
Ability to translate complex business logic into scalable, maintainable data models.
Fast learner with the ability to understand complex domains.
Strong team player with a collaborative mindset.

Nice to Have

Experience with cloud-based data solutions.
Hands-on experience with Snowflake.
This position is open to all candidates.
 
Job ID: 8565985
26/03/2026
Job Type: Seniors and Full Time
Welcome to Chargeflow
Chargeflow is building the operating system for chargeback management. We help merchants automatically fight and recover disputed payments, reducing revenue loss and operational overhead. Our platform processes large volumes of payment and dispute data across merchants, payment processors, and financial systems. As Chargeflow scales its platform and customer base, data becomes one of the company’s most strategic assets - powering better products, smarter decisions, and operational efficiency.
Who We're Looking For - The Dream Maker
We are looking for a Head of Data Analytics to build and lead the company's data platform and data strategy. The Head of Data will own Chargeflow's entire data platform, research, and data strategy. This includes building the data infrastructure that powers the company, establishing reliable company-wide metrics, enabling data-driven decision making, and turning Chargeflow's dispute and payments data into a long-term product advantage. This is a foundational leadership role responsible for making data a core capability across the organization. The Head of Data Analytics will work closely with Product, Engineering, Operations, and Leadership to ensure data becomes a central part of how the company builds products and runs the business.
Your Arena
Data Platform and Architecture - Design and lead the development of Chargeflow's modern data infrastructure:
* Building scalable data pipelines across product, payments, disputes, and internal systems
* Developing a centralized data warehouse as the company’s single source of truth
* Ensuring data quality, reliability, and observability
* Establishing data governance and best practices
Company Metrics and Decision Intelligence - Define and standardize the core metrics used across the company:
* Establishing a consistent metrics framework across product, revenue, and operations
* Building dashboards used by leadership and operational teams
* Ensuring reliable and accessible reporting across the organization
* Supporting data-driven decision making at the leadership level
Product Data and Insights - Partner with Product and Engineering to leverage Chargeflow's unique dispute and payment data:
* Identifying patterns and insights across disputes and recovery performance
* Enabling data-driven product development
* Supporting predictive capabilities and automation in dispute management
* Helping build product features powered by data insights
Operational Intelligence - Provide visibility and insights into dispute operations and business performance:
* Building operational analytics dashboards
* Supporting forecasting and capacity planning
* Identifying automation opportunities
* Improving operational performance through data insights
Building the Data Organization - Build and lead the company's data team:
* Hiring and managing data engineers, analytics engineers, and analysts
* Establishing strong data culture and best practices
* Enabling teams across the company to use data effectively
* Ensuring data privacy, security, and governance standards
What Success Looks Like - Within the first 12–18 months:
* Chargeflow operates from a trusted company-wide data platform
* Leadership and teams rely on consistent and reliable metrics
* Product teams leverage dispute data to build data-powered features
* Operational performance becomes clearly measurable and optimized
* A strong Data organization is established and growing
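The "consistent metrics framework" goal above is often implemented as a central metric registry that every dashboard resolves definitions from. A minimal sketch follows; the metric names, owners, and SQL formulas are hypothetical, not Chargeflow's actual definitions:

```python
from dataclasses import dataclass

# Sketch of a single-source-of-truth metrics registry. The metric
# names and SQL definitions are hypothetical stand-ins.
@dataclass(frozen=True)
class Metric:
    name: str
    owner: str
    sql: str  # canonical definition every dashboard reuses

REGISTRY = {
    m.name: m
    for m in [
        Metric("recovery_rate", "data",
               "SELECT SUM(recovered) / SUM(disputed) FROM disputes"),
        Metric("dispute_volume", "data",
               "SELECT COUNT(*) FROM disputes WHERE status = 'open'"),
    ]
}

def get_metric(name: str) -> Metric:
    """Product, revenue, and ops dashboards all resolve metrics here,
    so they compute the same number for the same name."""
    return REGISTRY[name]

print(get_metric("recovery_rate").sql)
```

The point of the design is that no consumer inlines its own copy of a formula: renaming or correcting a metric happens once, in the registry.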
Requirements:
We are looking for a leader who has built and scaled data capabilities in high-growth technology companies. Ideal candidates typically have:
* 8+ years of experience in data, analytics, or data infrastructure leadership
* Experience building and scaling data platforms in high-growth startups
* Strong understanding of modern data stacks and architecture
* Experience part
This position is open to all candidates.
 
8593482
26/03/2026
Location: Eilot
Job Type: Full Time
We're hiring a Data Platform Operations ace!
Looking to take full operational ownership of a complex Google Cloud-based data system? Your place is with us!
The role includes end-to-end responsibility for system stability and availability in production, monitoring data pipelines, integrating new data sources, and managing permissions (IAM).
Location: Eilot region (hybrid option available).
Requirements:
What you should bring:
3+ years of experience in data platform operations
Practical, hands-on-keyboard experience with GCP
Development experience in Python
Deep understanding of data ingestion, ELT, and data quality processes
Familiarity with security, IAM, and data governance principles.
This position is intended for both women and men.
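The data-quality responsibility described above often boils down to a gate that blocks a bad batch before it lands in the warehouse. A minimal sketch, assuming an in-memory list of rows; in practice this would query BigQuery or a similar store, and the column names and threshold are invented:

```python
# Minimal data-quality gate for an ingestion pipeline. The row schema
# and 5% null threshold are illustrative assumptions; a real version
# would run against BigQuery (or similar) rather than a Python list.
def check_batch(rows, required=("source_id", "ts"), max_null_ratio=0.05):
    """Return (ok, report) where report maps column -> null ratio."""
    report = {}
    for col in required:
        nulls = sum(1 for r in rows if r.get(col) is None)
        report[col] = nulls / len(rows) if rows else 1.0
    ok = all(ratio <= max_null_ratio for ratio in report.values())
    return ok, report

rows = [{"source_id": "a", "ts": 1}, {"source_id": None, "ts": 2}]
ok, report = check_batch(rows)
print(ok, report)  # gate fails: half the source_id values are null
```

A failing gate would typically page the on-call operator and quarantine the batch instead of loading it.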
 
8573816
26/03/2026
Location: Herzliya
Job Type: Full Time
Responsibilities
Design, build, and maintain scalable ETL/ELT pipelines to integrate data from diverse sources, optimizing for performance and cost efficiency.
Leverage Databricks and other modern data platforms to manage, transform, and process data.
Collaborate with software teams to understand data needs and ensure data solutions meet business requirements.
Optimize data processing workflows for performance and scalability.
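The ETL/ELT responsibilities above reduce to an extract–transform–load loop with error handling. In the role itself this would be PySpark on Databricks; the plain-Python sketch below keeps the same shape, with an invented source and schema and a dead-letter list for unparsable rows:

```python
# Tiny ETL sketch: extract -> transform -> load. In the role described
# above this logic would live in PySpark on Databricks; the source data
# and schema here are invented for illustration.
def extract():
    # stand-in for reading from a source system
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "bad"}]

def transform(rows):
    """Cast amounts to floats, routing unparsable rows to a dead-letter
    list instead of failing the whole batch."""
    clean, dead = [], []
    for r in rows:
        try:
            clean.append({**r, "amount": float(r["amount"])})
        except ValueError:
            dead.append(r)
    return clean, dead

def load(rows, sink):
    sink.extend(rows)

warehouse = []
clean, dead = transform(extract())
load(clean, warehouse)
print(len(warehouse), len(dead))  # 1 clean row loaded, 1 dead-lettered
```

Keeping bad rows in a dead-letter channel rather than aborting is what makes a pipeline like this both robust and debuggable at scale.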
Requirements:
5+ years of experience in Data Engineering, including cloud-based data solutions.
Proven expertise in implementing large-scale data solutions.
Proficiency in Python, PySpark.
Experience with ETL / ELT processes.
Experience with cloud and technologies such as Databricks (Apache Spark).
Strong analytical and problem-solving skills, with the ability to evaluate and interpret complex data.
Experience leading and designing data solutions end-to-end, integrating with multiple teams, and driving tasks to completion.
Advantages
Familiarity with on-premise or cloud storage systems.
Excellent communication and collaboration skills, with the ability to work effectively in a multidisciplinary team.
This position is open to all candidates.
 
8593222
26/03/2026
Location: Herzliya
Job Type: Full Time
The Security Models Training team builds and operates the large‑scale AI training and adaptation engines that power Security products, turning cutting‑edge research into reliable, production‑ready capabilities.
As Lead Applied Scientist, you will own end‑to‑end model development for security scenarios and set technical strategy across multiple model efforts and teams, including developing new model architectures, continual pre‑training, task‑focused fine‑tuning, reinforcement learning, and objective, benchmark‑driven evaluation.
You will drive training efficiency and reliability on distributed GPU systems, deepen model reasoning and tool‑use capabilities, and embed Responsible AI, privacy, and compliance into every stage of the workflow. The role is hands‑on and impact‑focused, partnering closely with engineering and product to translate innovations into shipped, measurable outcomes, defining quality gates and readiness criteria across teams, and mentoring senior scientists and engineers to scale results across globally distributed teams.

You will combine strong coding, experimentation, and debugging skills with a systems mindset to accelerate iteration cycles, improve throughput and cost‑effectiveness, and help shape the next generation of secure, trustworthy AI for our customers.
Responsibilities:
You'll work as part of an Applied Science team on high-impact, technically ambitious AI projects that directly shape the future of AI in cybersecurity, with ownership for taking advanced research through to production impact.
Technical Leadership & Ownership: set technical direction for major security domain initiatives and align roadmaps across multiple teams; lead security model programs spanning pre‑training, task tuning, reinforcement learning, and evaluation; translate cutting‑edge research into production‑ready capabilities. This role influences portfolio‑level technical tradeoffs, investment prioritization, and long‑term architecture decisions for security models.
Advanced Model Design - Building and customizing deep learning model architectures (e.g., modifying transformer blocks, attention/memory modules, etc.) at the SLM/LLM scale; making principled architectural tradeoffs to improve reliability, robustness, and security‑specific behavior.
Advanced Model Training - Apply deep expertise in pre-training, post-training, and reinforcement learning (RL) for both language and other modalities, including time-series. 
Design & Evaluate Datasets - Build high-quality datasets and benchmarks; define objective evaluation frameworks and quality gates; run ablation studies to measure impact and optimize data and training effectiveness to support confident product decisions.
Develop Data Infrastructure - Create and maintain scalable pipelines for ingestion, preprocessing, filtering, and annotation of large, complex datasets, with attention to privacy, governance, and long‑term reuse across security scenarios. 
Research & Innovation - Collaborate with cross-functional teams to push research and product boundaries, delivering models that make a real-world impact. 
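The benchmark-driven evaluation and quality gates described above might look, in miniature, like this; the benchmark items, accuracy metric, model interface, and 0.9 threshold are all hypothetical stand-ins:

```python
# Miniature quality gate for model evaluation. The benchmark items,
# accuracy metric, and 0.9 threshold are hypothetical stand-ins.
def evaluate(model, benchmark):
    """Fraction of benchmark items the model answers correctly."""
    correct = sum(1 for x, y in benchmark if model(x) == y)
    return correct / len(benchmark)

def passes_gate(model, benchmark, threshold=0.9):
    """A candidate model ships only if it clears the gate."""
    return evaluate(model, benchmark) >= threshold

benchmark = [("2+2", "4"), ("3+3", "6"), ("5+5", "10"), ("1+1", "2")]
baseline = lambda prompt: str(eval(prompt))  # toy "model" for the sketch
print(passes_gate(baseline, benchmark))  # True: 4/4 correct
```

Real gates would compare candidate models against a baseline on held-out security benchmarks and run ablations per training change, but the ship/no-ship decision has this shape.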
Requirements:
M.Sc. / Ph.D. in Computer Science, Information Systems, Electrical or Computer Engineering or Data Science (Ph.D. strongly preferred). Candidates with M.Sc. / Ph.D. in related fields with proven industry experience or a strong publication record in the areas of LLM, Information Retrieval, Machine Learning, Natural Language Processing, Time Series Forecasting and Deep Learning are considered as well.  
Proven hands-on experience of at least 8 years (including post-grad work) in building and deploying Machine Learning products. Key areas of expertise include Natural Language Processing and Large Language Models, along with an understanding of concepts su
This position is intended for both women and men.
 
8567239