AllJobs VIP

29/03/2026
Job Type: Full Time
We are looking for a Data Engineer to join our growing BI team. This role goes beyond building pipelines. You will help shape our data platform as a shared product - supporting analytics, reporting, and decision-making across key company data domains such as Product, Sales, HR, and others. Your work will directly influence how stakeholders interact with data today and how the platform evolves in the years ahead.

What You'll Be Doing

Architect & Own: Lead the design and development of scalable data warehouse and BI solutions. You will make early-stage architectural decisions and own their long-term impact.
Infrastructure as a Product: Build core data infrastructure and developer experiences that others rely on, ensuring high availability and system reliability.
End-to-End ELT/ETL: Solve complex integration problems by sourcing data from structured and unstructured sources using Rivery, Python, and optimal ETL patterns.
Data Quality & Governance: Implement frameworks for schema evolution, anomaly detection, and data freshness. You will determine security models based on privacy requirements and evolve governance processes.
Strategic Collaboration: Partner with Engineers, Product Managers, and Data Analysts to conceptualize data needs and represent key insights in a meaningful way.
Optimization: Assist in owning production processes, optimizing complex code through advanced algorithmic concepts to manage operational cost-benefit tradeoffs.
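The data-quality duties above (data freshness, anomaly detection) can be sketched in plain Python. This is only an illustrative sketch: the table names, SLAs, and the 50% tolerance are assumptions for the example, not details from the posting.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical per-table freshness SLAs (not from the posting).
FRESHNESS_SLA = {
    "sales_orders": timedelta(hours=1),
    "hr_headcount": timedelta(days=1),
}

def stale_tables(last_loaded: dict, now: datetime) -> list:
    """Return table names whose last successful load breaches its SLA."""
    return sorted(
        name for name, sla in FRESHNESS_SLA.items()
        if now - last_loaded[name] > sla
    )

def row_count_anomaly(history: list, today: int, tolerance: float = 0.5) -> bool:
    """Naive anomaly check: today's row count deviates >tolerance from the trailing average."""
    avg = sum(history) / len(history)
    return abs(today - avg) > tolerance * avg
```

A real framework would persist load timestamps and alert routing; this only shows the shape of the checks.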
Requirements:
Experience: 5+ years of experience in Data Engineering, Infrastructure, or Platform Engineering (ideally in organizations operating at a meaningful scale).
Technical Mastery: 5+ years of hands-on experience with Python and SQL. Deep proficiency in data modeling (Star/Snowflake schema) and DWH methodologies.
Cloud & Tools: Proven experience with Snowflake and AWS. Familiarity with Rivery or similar orchestration tools (like DBT) is a major advantage.
Production-First Mindset: Track record of leading data initiatives end-to-end from design and building to shipping and operating production flows.
Analytical Rigor: Ability to triage issues, resolve data quality problems, and design systems that handle system complexity with ease.
Education: Bachelor's degree in Computer Science, Computer Engineering, or a relevant technical field.
This position is open to all candidates.
 
Job ID: 8595997
29/03/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Data & AI Project Manager to lead the delivery of AI and Generative AI-based solutions for our clients. In this role, you will own projects end-to-end - from initial scoping and requirements definition through development, deployment, and continuous improvement.
This role is well-suited for a technically oriented project or product manager who enjoys working at the intersection of data, AI, engineering, and business, and who thrives in fast-paced, complex environments.
Responsibilities:
Lead the full lifecycle of AI and data projects, from ideation and design through implementation, deployment, and ongoing optimization.
Translate business and user needs into clear system requirements, technical specifications, and structured delivery plans.
Define project scope, priorities, milestones, and deliverables, ensuring execution on time, within scope, and within budget.
Collaborate closely with development, data, and infrastructure teams to support efficient delivery and resolve technical dependencies.
Drive alignment between technical solutions and business objectives, ensuring delivered solutions generate tangible client value.
Manage multiple stakeholders on both client and internal sides, maintaining clear communication and expectations.
Support the design of intuitive, user-centered AI solutions and workflows.
Identify risks, manage trade-offs, and proactively address challenges throughout the project lifecycle.
Requirements:
Bachelor's degree (B.Sc.) in Computer Science, Engineering, Information Systems, Data Science, or a related field.
3+ years of experience in project or product management, with hands-on involvement in data, AI, or technology-driven initiatives.
Strong understanding of data architectures, data pipelines, analytics frameworks, and AI-driven systems.
Proven experience delivering AI or Generative AI solutions into production environments.
Ability to translate complex technical concepts into clear, structured communication for non-technical stakeholders.
Experience managing complex, multi-stakeholder projects with strong organizational and execution skills.
Analytical mindset with the ability to make data-driven decisions and assess trade-offs.
Excellent verbal and written communication skills in English.
High level of ownership, independence, and ability to perform in dynamic, fast-paced environments.
Nice to Have:
Background in AI, machine learning, or advanced analytics projects.
Experience working in consulting or client-facing delivery roles.
Familiarity with cloud platforms and modern AI tooling.
This position is open to all candidates.
 
Job ID: 8595880
29/03/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a skilled and motivated Data Engineer with expertise in Elasticsearch, cloud technologies, and Kafka. As a data engineer, you will be responsible for designing, building and maintaining scalable and efficient data pipelines that will support our organization's data processing needs.
The role will entail:
Design and develop data platforms based on Elasticsearch, Databricks, and Kafka
Build and maintain data pipelines that are efficient, reliable and scalable
Collaborate with cross-functional teams to identify data requirements and design solutions that meet those requirements
Write efficient and optimized code that can handle large volumes of data
Implement data quality checks to ensure accuracy and completeness of the data
Troubleshoot and resolve data pipeline issues in a timely manner
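The pipeline duties above (transform, quality-check, route) can be illustrated with a stdlib-only sketch. In practice the input would come from a Kafka consumer and valid documents would go to an Elasticsearch index; here in-memory lists stand in for both, and the field names are hypothetical.

```python
import json

def transform(raw: str) -> dict:
    """Parse a raw JSON event and normalize it to a hypothetical target schema."""
    event = json.loads(raw)
    return {
        "id": str(event["id"]),
        "ts": event.get("timestamp"),
        "value": float(event.get("value", 0.0)),
    }

def is_complete(doc: dict) -> bool:
    """Quality gate: reject documents missing required fields."""
    return doc["ts"] is not None and doc["id"] != ""

def run_pipeline(raw_events: list) -> tuple:
    """Return (indexable docs, rejected docs) - stand-ins for the index and a dead-letter queue."""
    good, bad = [], []
    for raw in raw_events:
        doc = transform(raw)
        (good if is_complete(doc) else bad).append(doc)
    return good, bad
```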
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field
3+ years of experience in data engineering
Expertise in Elasticsearch, cloud technologies (such as AWS, Azure, or GCP), Kafka and Databricks
Proficiency in programming languages such as Python, Java, or Scala
Experience with distributed systems, data warehousing and ETL processes
Experience with container environments such as AKS/EKS/OpenShift is a plus
High security clearance is a plus
The position is open for all genders as well as people with disabilities.
This position is open to all candidates.
 
Job ID: 8595875
29/03/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an experienced and hands-on Data Engineer to lead the migration of enterprise data platforms to Google Cloud Platform (GCP).
In this role, you will design, build and maintain scalable ETL/ELT pipelines, develop advanced data models in BigQuery and contribute to the creation of a high-performance, reliable and cost-efficient data architecture.
You will work closely with analysts, data scientists and engineers and have real impact on how data is consumed across the organization.
What You Will Do:
Lead the migration of data from on-premise core systems to Google Cloud Platform (GCP).
Design and develop processed data layers (Silver and Gold) and data marts in BigQuery, including complex business logic.
Build, orchestrate and maintain data pipelines using Cloud Composer / Apache Airflow.
Develop robust data transformations, including cleansing, enrichment and data quality improvements.
Write efficient and optimized SQL queries in BigQuery with strong focus on performance and cost.
Create and maintain clear and up-to-date technical documentation for data architecture and processes.
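The Silver-to-Gold layering described above would normally be written as SQL in BigQuery; the following pure-Python sketch only mirrors the shape of such a transformation (cleanse, enrich, aggregate). The column names and the FX-rate enrichment are illustrative assumptions, not details from the posting.

```python
from collections import defaultdict

def to_gold(silver_rows: list, fx_rates: dict) -> dict:
    """Aggregate cleansed 'Silver' rows into a 'Gold' revenue-by-country mart (toy example)."""
    totals = defaultdict(float)
    for row in silver_rows:
        # Cleansing: drop rows missing required fields.
        if row.get("amount") is None or row.get("country") is None:
            continue
        # Enrichment: convert to a common currency via a hypothetical FX table.
        rate = fx_rates.get(row.get("currency", "USD"), 1.0)
        totals[row["country"]] += row["amount"] * rate
    return dict(totals)
```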
Requirements:
3+ years of hands-on experience as a Data Engineer.
Strong experience working with Google Cloud Platform (GCP) - mandatory.
Proven experience with BigQuery, including data modeling, complex SQL and performance optimization - mandatory.
Strong Python skills for ETL/ELT and data transformations.
Experience with orchestration and workflow management tools such as Cloud Composer, Apache Airflow or similar.
Experience working with Cloud Storage (GCS) and additional GCP data services such as Cloud SQL, Data Lakes and storage solutions.
Nice to Have:
Experience with GCP streaming technologies such as Cloud Pub/Sub and Dataflow.
Familiarity with Git and CI/CD processes.
Previous experience migrating data from legacy systems such as Mainframe or Oracle to the cloud.
Personal Skills:
Ability to work independently and lead projects end-to-end.
Proactive mindset with strong technical curiosity and continuous learning attitude.
Strong collaboration skills and ability to work with cross-functional teams.
This position is open to all candidates.
 
Job ID: 8595873
Location: Ra'anana
Job Type: Full Time
We are looking for a skilled Data Center Engineer to join our Lab team. The ideal candidate will be responsible for maintaining, monitoring, and optimizing our data center environments to ensure high availability, resiliency, and performance. This role involves hands-on technical work, troubleshooting hardware and network issues, and collaborating with cross-functional teams to support mission-critical systems. You will report to the Lab Manager.
Key Responsibilities
Data Center Operations
Perform installation, configuration, and maintenance of servers, network equipment, storage systems, and cabling.
Monitor facility systems, including power, cooling, fire suppression, and physical security.
Conduct routine inspections to ensure environmental and operational stability.
Manage data center capacity planning (space, power, cooling).
Hardware & Infrastructure Support
Troubleshoot and resolve hardware failures for servers, routers, switches, and other equipment.
Coordinate with vendors for RMA processes and hardware lifecycle management.
Perform firmware upgrades and hardware refresh activities.
Network & Systems Support
Support deployment and maintenance of network infrastructure (LAN/WAN, routing, switching).
Assist with installation and configuration of OS and virtualization technologies (Linux, VMware, Hyper-V, etc.).
Work with IT/security teams to ensure compliance and secure configuration of data center assets.
Monitoring & Incident Response
Monitor systems and respond to alerts to ensure uptime and SLA adherence.
Maintain accurate documentation of incidents, configurations, and procedures.
Process & Documentation
Follow and improve standard operating procedures.
Maintain asset inventory, rack diagrams, and documentation.
Support audits and compliance requirements.
Requirements:
Must-Have
2-5+ years of experience in data center operations or infrastructure engineering.
Strong understanding of server hardware, cabling standards, and rack/stack procedures.
Experience with network fundamentals (TCP/IP, VLANs, routing & switching).
Ability to lift and move equipment (up to ~25 kg / 50 lbs if required).
Knowledge of monitoring tools and ticketing systems.
Excellent troubleshooting and problem-solving skills.
Nice-to-Have
Experience with virtualization platforms (VMware, KVM, Hyper-V).
Knowledge of Linux/Windows system administration.
Familiarity with Data Center Infrastructure Management (DCIM) tools.
Experience in cloud or hybrid environments.
Certifications: CompTIA Server+, Network+, CCNA, or equivalent.
Soft Skills
Strong communication and documentation abilities.
Ability to work independently and collaboratively.
High attention to detail and operational discipline.
Ability to perform in high-pressure, mission-critical environments.
This position is open to all candidates.
 
Job ID: 8595763
Location: Herzliya
Job Type: Full Time
We are searching for an innovative and experienced Data Engineer who will join us and be part of our reference and alternative data team in our data group.
As a Data Engineer, you will:
Be a part of a cross functional team of data and backend engineers.
Be responsible for ingesting financial data and providing it over numerous APIs in close collaboration with algorithmic teams and other partners.
Lead the architecture, planning, design and development of mission-critical and diverse data pipelines over both public and on-prem cloud solutions.
Requirements:
At least 5 years of experience working as a Data Engineer
At least 5 years of experience working in Python development, with emphasis on data analysis tools such as NumPy, pandas, Polars, and Jupyter notebooks.
Hands-on experience working with AWS data processing tools and concepts.
Proven understanding in designing, developing and optimizing complex solutions.
Proven experience with the following technologies: Neo4j, MongoDB, Redis, Snowflake
Experience with Docker, Linux, CI/CD tools and concepts, Kubernetes.
Experience with data pipelining tools such as Airflow, Kubeflow or similar.
BSc / MSc degree in Computer Science/ Engineering / Mathematics or Statistics.
Advantages:
Hands-on experience with the Databricks platform.
Experience working on large scale and complex on-premises systems.
Hands-on experience in lower-level programming languages such as C++ or Rust
Familiarity with Capital markets and basic economics knowledge.
This position is open to all candidates.
 
Job ID: 8595621
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Senior BI Data Engineer to join our BI team and take end-to-end ownership of high-impact analytics foundations. This role sits at the core of how our company measures success, makes decisions, and scales - turning raw data into trusted, business-critical insights used across Product, GTM, and Finance.
You'll design and evolve data models, pipelines, and the BI layer, work closely with Data Science and business stakeholders, and help raise the bar for analytics engineering across the company.
Hands-on experience using GenAI to improve analytics engineering workflows, automate development processes, and increase delivery speed is a must for this role.
This role is based in Tel Aviv. We work in a hybrid model, with 3 days a week in the office.
This might be for you if:
You enjoy owning data foundations end to end - from raw data to semantic layers
You like turning ambiguous business questions into clear, governed metrics
You care about data quality, performance, and trust at scale
You enjoy mentoring, setting standards, and leading by example
You actively leverage AI tools to improve development speed and analytical accuracy.
Requirements:
5+ years of experience in BI / Data Engineering roles with ownership of scalable data platforms
Deep experience with modern data stacks (Snowflake or Databricks, dbt)
Advanced SQL and Python skills, including data quality, CI/CD, and observability
Strong understanding of dimensional modeling, data warehousing, and semantic layers
Experience with orchestration tools (Airflow) and large-scale data processing
Proven experience using GenAI tools as part of your day-to-day development workflow
A strong builder mindset, business orientation, and ability to lead cross-functional initiatives
Nice to have:
Experience with streaming technologies (Kafka, Spark).
This position is open to all candidates.
 
Job ID: 8595429
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Software Engineer (Data Platforms) to join the Users & Integrations team within our company's Intelligence Group. This role is built for an experienced engineer who thrives on solving complex backend challenges and scaling data pipelines.
In this role, you will take ownership of crucial user data integrations and architect the sophisticated matching logic that powers our platform, from data ingestion and transformation to delivery. You will work extensively with large-scale data pipelines, translate complex algorithms into high-performance production code, and tackle massive scalability challenges to enhance the data experience for our company's customers.
Where does this role fit in our vision?
Every role at our company is designed with a clear purpose. Data is everything; it's at the heart of everything we do. The Intelligence Group is responsible for shaping the experience of hundreds of thousands of users who rely on our data daily.
The Users Team is the engine behind our company's data connectivity, handling massive-scale user data integrations and engineering complex entity-matching logic. By translating millions of data signals and advanced algorithms into high-performance pipelines, we ensure users receive highly accurate, tailored data - optimizing their overall experience while driving the core KPIs of our Intelligence Group.
What will you be responsible for?
Designing, building, and maintaining robust, scalable ETL/ELT data pipelines and integration solutions within our company's Databricks-based environment.
Implementing and optimizing algorithms for data processing and entity resolution with a strong emphasis on delivering high-quality, high-throughput data.
Deploying data infrastructure leveraging technologies like Spark, Kafka, and Airflow to tackle complex data challenges and enhance business operations.
Designing innovative data solutions that support millions of data points, at high performance and massive scale.
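At its simplest, the entity resolution mentioned above groups records that share a normalized key; production systems add blocking, fuzzy similarity, and ML-based matching on top. A minimal sketch, with a hypothetical normalization rule:

```python
from collections import defaultdict

def normalize(name: str) -> str:
    """Canonical form used for matching (hypothetical rule: lowercase, strip punctuation)."""
    return "".join(ch for ch in name.lower() if ch.isalnum() or ch == " ").strip()

def resolve_entities(records: list) -> dict:
    """Group record ids that refer to the same entity by exact normalized-name match."""
    groups = defaultdict(list)
    for rec in records:
        groups[normalize(rec["name"])].append(rec["id"])
    return dict(groups)
```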
Requirements:
What we look for:
3+ years of software engineering experience building scalable backend systems
Experience scaling big data pipelines, complex data integrations, and robust data infrastructure.
Expertise in big data technologies, including Spark (or Databricks), Kafka (or other real-time streaming tools), and workflow orchestrators like Airflow.
Experience using GenAI tools for software development (such as Cursor, Claude Code, Codex, etc).
A strong builder mindset, with experience turning ideas into working solutions
Algorithmic experience, including developing and optimizing machine learning models and implementing advanced data algorithms.
Experience working with cloud ecosystems, preferably AWS (S3, Glue, EMR, Redshift, Athena) or comparable cloud environments (Azure/GCP).
Expertise in extracting, ingesting, and transforming large datasets efficiently.
A passion for sharing knowledge, fostering a supportive engineering culture, and engaging in collaborative problem-solving with your peers.
Bonus Points:
Hands-on experience working with Vector Databases and embedding techniques, with a focus on search, recommendations, and personalization.
This position is open to all candidates.
 
Job ID: 8595416
29/03/2026
Location: Petah Tikva
Job Type: Full Time
We are seeking a Director of Analytics to lead analytics-driven growth initiatives across our business. This role is highly strategic and hands-on, combining customer-facing leadership, business optimization, and advanced analytics.
The Director of Analytics will partner closely with Customer Success Managers (CSMs), Sales Leaders, the CRO and the leadership team to drive revenue growth, customer acquisition, as well as performance optimization across pre-sales and post-sales.
This role requires an exceptional communicator with strong business acumen, deep analytical capabilities, and proven experience leading high-performing analytics teams in a complex data environment.
Responsibilities:
Lead analytics-driven growth initiatives focused on customer acquisition, expansion, and monetization.
Partner closely with the CRO, Sales leadership, and Customer Success Managers to identify growth levers, support customers' analytics requests, and optimize revenue and conversion-rate performance.
Manage and develop a high-performing analytics team, setting clear expectations for analytical rigor, problem definition, and insight storytelling
Coach and mentor analysts with a strong focus on business impact and customer-facing insight delivery
Act as a customer-facing analytics leader, participating in QBRs, executive reviews, and strategic customer conversations
Own the customer and sales data domain, including metric definitions, data consistency, and integrity
Proactively identify, troubleshoot, and resolve data quality and data reliability issues in partnership with BI, R&D, and Product
Work closely with Product and R&D teams to influence roadmap decisions through data and customer insights
Define KPIs, dashboards, and measurement frameworks that align analytics output with business outcomes
Independently initiate and lead cross-functional analytics projects from concept through delivery and impact measurement
Support pre-sales processes with analytics, including customer insights, benchmarking, ROI and value modeling, and data-backed sales narratives.
Support post-sales processes with analytics, including customer monitoring, proposition suggestion and optimization analysis.
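One building block behind the funnel and KPI work described above is stage-to-stage conversion; a tiny sketch (the stage counts below are made-up example data):

```python
def funnel_conversion(stage_counts: list) -> list:
    """Stage-to-stage conversion rates for an ordered funnel.
    Returns one rate per transition; a stage with zero entrants converts at 0."""
    return [
        (curr / prev) if prev else 0.0
        for prev, curr in zip(stage_counts, stage_counts[1:])
    ]
```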
Requirements:
8+ years of experience in analytics roles
3+ years of proven experience in leading analytics teams, with a focus on mentoring, developing, and enhancing team capabilities
Experience in B2B companies, preferably from the eCommerce, SaaS, or marketplace industries
Strong experience working with Sales, Customer Success, and CRO organizations
Demonstrated customer-facing experience (QBRs, exec presentations, strategic discussions)
Excellent English (verbal and written), able to communicate complex insights clearly
Strong business acumen with clear understanding of revenue drivers and the ability to influence senior stakeholders
Advanced analytical skills (funnels, cohorts, segmentation, modeling, product analytics)
Strong SQL experience with a solid understanding of databases
Familiarity with BI and visualization platforms experience (Qlik Sense, Tableau, Power BI, or similar)
Deep understanding of customer and revenue data models and KPIs
Proven experience solving data quality and data integrity challenges
Ability to work closely with Product, R&D, and Tech teams
Independent, proactive, and comfortable leading ambiguity in a fast-paced and dynamic environment, without heavy direction
Experience supporting pre-sales analytics or ROI/value modeling
Background in product analytics or experimentation - an advantage.
This position is open to all candidates.
 
Job ID: 8595385
29/03/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Data Engineer to help build and scale our analytics data infrastructure. In this role, you will work closely with analysts and business stakeholders to design reliable data models and support the development of a centralized semantic layer used across the company.

You will play a key role in improving the structure, reliability, and usability of our data stack. This includes building and maintaining dbt models, supporting data pipelines, and ensuring analysts have access to clean, well-documented, and consistent data.

This role is ideal for someone who enjoys working at the intersection of data engineering and analytics - translating business needs into scalable data models and enabling teams to move faster with trusted data.

Responsibilities

Design and implement data models that support analytics across key business domains such as GTM, CX, and Finance
Build and maintain transformation workflows using dbt
Work closely with analysts to translate business questions into scalable and reusable data models
Help define and implement a structured semantic layer that enables consistent metrics across the company
Improve the reliability and clarity of the analytics data stack by centralizing logic into well-designed data models
Support the ingestion and transformation of data from various sources using tools such as Fivetran and Airbyte
Contribute to improving data quality, monitoring, and documentation practices
Help establish best practices for analytics modeling and data usage across teams
Actively leverage AI tools (e.g. Cursor, LLM-based assistants) to improve development speed, data modeling, and data workflows
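The core promise of the semantic layer mentioned above is that each metric is defined once and reused everywhere (in practice via dbt metrics or a similar tool, not hand-rolled Python). A toy sketch with hypothetical metric names and formulas:

```python
# Each metric is defined exactly once; metric names and formulas are hypothetical.
METRICS = {
    "win_rate": lambda rows: (
        sum(1 for r in rows if r["status"] == "won") / len(rows) if rows else 0.0
    ),
    "total_arr": lambda rows: sum(r.get("arr", 0.0) for r in rows),
}

def compute_metric(name: str, rows: list) -> float:
    """Every dashboard calls this, so 'win_rate' means the same thing everywhere."""
    return METRICS[name](rows)
```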
Requirements:
2-4 years of experience in BI/data engineering, analytics engineering, or a similar role.
Strong SQL skills and experience working with modern data warehouses.
Experience building and maintaining data models for analytics.
Familiarity with modern data stack tools such as dbt, Snowflake/BigQuery, Fivetran/Rivery, or similar.
Experience collaborating with analysts or BI teams.
Familiarity with Python for data-related tasks (scripting, automation, or tooling).
Hands-on experience using AI tools (e.g. Cursor, LLMs) as part of day-to-day development workflows.
Strong problem-solving skills and the ability to work in evolving data environments.
Clear communicator who can work effectively with both technical and non-technical stakeholders.
This position is open to all candidates.
 
Job ID: 8595374
29/03/2026
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We are looking for a Product Data Director to lead our global product data strategy and governance. In this role, you will own the vision, implementation, and continuous improvement of how data is captured, structured, and leveraged across our product ecosystem. You'll partner closely with Product, Engineering, BI, and Operations teams to ensure data excellence, enable data-driven decision-making, and unlock insights that enhance our products, customer experience, and business growth.
Responsibilities:
Develop and execute a comprehensive product data strategy aligned with our business goals.
Define and own data standards, taxonomy, and governance frameworks across all product lines.
Partner with Product, BI, and Engineering to design scalable data models and infrastructure supporting product analytics, reporting, and performance tracking.
Lead a cross-functional data excellence initiative, ensuring high-quality, accurate, and consistent data.
Work with product managers to embed data-driven decision-making in product planning, experimentation, and optimization processes.
Oversee the implementation of data instrumentation and tracking for new product launches.
Collaborate with the Analytics and Data Engineering teams to ensure data availability and reliability across systems.
Requirements:
8+ years of experience in data strategy, analytics, or product data management, ideally within fintech, SaaS, or payments.
Proven track record of building and scaling product data frameworks and governance models.
Strong understanding of data architecture, pipelines, and data quality management.
Experience working with data visualization tools (e.g., Tableau, Power BI, Looker) and SQL-based analysis.
Excellent stakeholder management and communication skills, with the ability to influence senior leaders.
Bachelor's or Master's degree in Data Science, Computer Science, Information Systems, or related field.
This position is open to all candidates.
 
Job ID: 8595035
29/03/2026
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We are seeking an experienced and visionary Data Science Team Lead to join our dynamic technology organization. The successful candidate will lead a team of talented data scientists, driving innovation and delivering business value through advanced machine learning techniques and Generative AI solutions. This role requires a strategic thinker with hands-on expertise in both traditional and cutting-edge data science methodologies, exceptional leadership skills, and a passion for continuous learning and development.
Responsibilities:
Lead, mentor, and develop a team of data scientists both methodologically and technically to deliver high-impact projects aligned with our business objectives.
Design, implement, and optimize classic machine learning models to solve complex business problems.
Apply gradient boosting techniques (e.g., XGBoost, LightGBM, CatBoost) to enhance predictive accuracy and model robustness.
Drive the exploration and integration of Generative AI applications, including Large Language Models (LLMs), to create innovative solutions for our products and services.
Collaborate with cross-functional teams (engineering, product, business) to translate business requirements into actionable data science projects.
Establish best practices for model development, validation, deployment, and monitoring in production environments.
Promote a data-driven culture, encouraging experimentation, sharing of knowledge, and adoption of state-of-the-art technologies.
Communicate project progress, insights, and results to stakeholders at all levels of the organization.
Requirements:
Bachelor's or Master's degree in Computer Science, Mathematics, Statistics, Data Science, or a related field; a PhD is an advantage.
5+ years of experience in data science roles, with at least 2 years in a team lead position.
Proven expertise in classic machine learning algorithms and techniques, including regression, classification, clustering, and feature engineering.
Extensive hands-on experience with gradient boosting frameworks such as XGBoost, LightGBM, and CatBoost.
Demonstrated success in designing, deploying, and scaling Generative AI applications (e.g., LLMs) in real-world scenarios.
Strong programming skills in Python and proficiency with data science libraries (scikit-learn, pandas, NumPy, TensorFlow, PyTorch).
Experience with cloud platforms (AWS, Azure) and MLOps tools for model deployment and monitoring in production.
Knowledge and experience in Databricks and Spark.
Excellent leadership, communication, and stakeholder management skills.
Ability to thrive in a fast-paced, collaborative, and innovative environment.
Experience handling big data at the scale of billions of observations.
This position is open to all candidates.
 
Job ID: 8595026
29/03/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Product Manager to join our fast-growing AI & Data Science group. Reporting to our AI Product Director, you will support the team building products for the next era of AI and agentic e-commerce, where AI agents discover, negotiate, and purchase on behalf of users. You will lead the development of agent-ready experiences, ensuring interoperability with industry protocols while driving measurable conversion, fraud, and cost outcomes for merchants. The ideal candidate will have a strong background in product management, a deep understanding of AI technologies, a passion for data-driven insights, and the ability to translate complex technical concepts into market-leading products.
Responsibilities:
Define and execute the product roadmap for AI & Data Science solutions, aligning with company goals and market needs.
Drive the development and strategic vision for agentic commerce, agent-aware checkout, and payment flows. Ensure protocol interoperability and compliance with leading industry standards such as AP2, the Visa Trusted Agent Protocol, and Mastercard Agent Pay. Incorporate robust risk and trust-signal mechanisms such as agent verification, velocity controls, anomaly detection, and human-in-the-loop reviews, all while upholding Responsible AI guardrails.
Collaborate with cross-functional teams, including data science, engineering and commercial teams to bring products from conception to launch.
Conduct market research and competitive analysis to identify trends, opportunities, and challenges in the AI & Agentic Commerce space.
Work closely with data scientists and engineers to define product requirements, prioritize features, and ensure technical feasibility.
Develop and implement go-to-market strategies, ensuring products meet user needs and achieve commercial success.
Monitor product performance, gather feedback from users and stakeholders, and iterate quickly to enhance product offerings.
Stay abreast of advancements in AI & Agentic technologies to continually innovate and improve our product portfolio.
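One of the risk mechanisms named in the responsibilities above, velocity controls, amounts to rate-limiting an agent's actions over a sliding time window. A minimal sketch under assumed thresholds (the class name and limits are hypothetical, not part of any protocol cited in the posting):

```python
from collections import deque


class VelocityControl:
    """Reject an agent's requests once it exceeds max_events
    within a sliding window of window_seconds.

    Illustrative sliding-window sketch; real systems would also
    persist state, key on merchant/agent pairs, and feed denials
    into anomaly detection or human-in-the-loop review.
    """

    def __init__(self, max_events=5, window_seconds=60):
        self.max_events = max_events
        self.window = window_seconds
        self.events = {}  # agent_id -> deque of event timestamps

    def allow(self, agent_id, now):
        q = self.events.setdefault(agent_id, deque())
        # Drop timestamps that have fallen outside the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_events:
            return False  # over the limit: block this action
        q.append(now)
        return True
```

Usage: `allow("agent-123", time.time())` per attempted purchase; a `False` result would route the action to a review queue rather than the payment rails.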
Requirements:
Bachelor's degree or higher in Computer Science, Engineering, Business, or related field.
Minimum of 5 years of experience in product management, preferably in the fintech sector managing payment products.
Strong understanding of AI and machine learning technologies, including LLMs and agents, and their application in solving business problems.
Proven track record of developing and launching successful products.
Excellent communication and interpersonal skills, with the ability to work effectively with technical and non-technical teams.
Strong analytical and problem-solving skills, with a data-driven approach to decision making.
Ability to thrive in a fast-paced, dynamic environment, managing multiple projects and priorities.
Nice to Have:
Familiarity with Google AP2, Visa Trusted Agent Protocol, Mastercard Agent Pay, or adjacent standards.
Familiarity with Databricks, feature stores, vector DBs, MLflow/MLOps, BI, real-time events and big data.
Prior experience with payment stacks: knowledge of tokenization, card-present vs. card-not-present, 3DS2/SCA, network rules, APMs, and settlement/reconciliation basics.
Prior experience with fraud stacks, dispute resolution, and model governance in regulated environments.
This position is open to all candidates.
 
Job ID: 8595019
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Data Warehouse Tech Lead to drive the technical vision and execution of the data infrastructure that powers decision-making across the company.
You'll lead both the technology and the business coordination for our data warehouse - architecting scalable solutions while working closely with stakeholders and data providers to ensure our platform serves the entire organization's needs. This role combines deep technical leadership with strategic business partnership as we build our next-generation data stack.
We believe three things matter for every role: drive to push through challenges, efficiency that keeps standards high while moving fast, and adaptability that lets you pivot with data and AI insights. These aren't buzzwords; they're how we actually work.
Our AI-first approach isn't just a tagline either. We're building the future of insurance with AI at the center, and we need people who are genuinely excited to learn and grow alongside these tools.
In this role you'll:
Lead technical architecture - design and develop scalable data warehouse solutions that support multiple products and serve the entire organization's analytics needs
Manage the technical roadmap - set strategy and guide execution for the Data Warehouse team, ensuring our platform evolves with business requirements
Drive business process coordination - translate business needs into technical requirements while establishing clear data contracts with R&D, Analytics, and external data providers
Establish and implement best practices - set technical standards for data warehouse architecture, performance tuning, and development methodologies that guide the entire team's approach to building scalable data solutions
Create and maintain sustainable data pipelines - build resilient systems capable of handling unstructured data and managing an evolving schema registry across diverse data sources
Implement advanced data modeling - create robust data structures using methodologies like dimensional modeling, and optimize ETL/ELT processes for our semantic layer
Establish data quality standards - build processes for schema evaluation, anomaly detection, and monitoring data completeness and freshness across all sources
Lead cross-team collaboration - work directly with Data Engineers, ML Platform Engineers, Data Scientists, Analysts, and Product Managers to align technical solutions with business goals
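Monitoring data freshness, one of the quality standards listed above, can be as simple as comparing each source's newest load timestamp against an SLA. A hedged sketch (the function name and the 6-hour SLA are illustrative assumptions, not from the posting):

```python
from datetime import datetime, timedelta, timezone


def check_freshness(last_loaded_at, max_lag_hours=6, now=None):
    """Flag a source as stale when its newest record is older than the SLA.

    last_loaded_at: timezone-aware timestamp of the most recent load.
    Returns the lag in hours plus a boolean verdict, suitable for
    feeding a monitoring dashboard or alert.
    """
    now = now or datetime.now(timezone.utc)
    lag = now - last_loaded_at
    return {
        "lag_hours": lag.total_seconds() / 3600,
        "fresh": lag <= timedelta(hours=max_lag_hours),
    }
```

Run per source table on a schedule, this is the kind of check tools like dbt expose declaratively (source freshness tests); the Python version just makes the underlying comparison explicit.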
Requirements:
7+ years as a BI Engineer or Data Engineer, with 2+ in a technical leadership or architect role
Proven experience managing complex data warehouses that serve multiple products and entire organizations
Strong expertise in data modeling, ELT development, and data warehouse methodologies
Advanced SQL skills and hands-on experience with Snowflake or similar cloud-native data warehouse platforms
Extensive experience with dbt for data transformation and modeling
Python and software development experience (a strong plus)
Excellent communication skills - you can mentor technical team members and explain complex data concepts to business stakeholders
Ready to work in an office environment most days of the week
Enthusiasm for learning and adapting to the exciting world of AI; a commitment to exploring this field is a fundamental part of our culture.
This position is open to all candidates.
 
Job ID: 8594850