Data Engineer
Posted: 3 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are a Data and innovation team operating under the CTO group. We lead cutting-edge initiatives in data technologies and strategic innovation projects.

Our mission is to explore and implement new technologies and enrich the company's internal data assets through smart collection, integration, and automation.

We move fast, work across multiple domains, and maintain a culture that values curiosity, ownership, and impact.

This is an on-site position.



Responsibilities

Take end-to-end ownership of data pipelines: from extraction (web scraping, APIs), through transformation and orchestration, to delivering accessible and valuable datasets.
Integrate new and external data sources into the company's internal platforms.
Solve real-time issues and optimize pipeline performance through smart automation.
Collaborate with cross-functional teams to improve access to high-quality, structured data.
Work on multiple projects simultaneously in a dynamic and agile environment.
Lead and contribute to early-stage innovation projects directly impacting business strategy.
Requirements:
2+ years of hands-on Python development (ETL, scripting, automation).
Strong knowledge of SQL and ability to work independently with relational databases.
Experience building and maintaining ETL workflows and orchestrating data processes.
Familiarity with scraping tools/frameworks (e.g., requests, Selenium, BeautifulSoup).
Ability to manage multiple tasks and projects independently and efficiently.
A genuine love for technology, a curiosity to explore new tools, and an eagerness to learn.
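The end-to-end pipeline ownership described in the responsibilities above can be sketched in miniature. This is a stdlib-only sketch under stated assumptions: the sample records, field names, and SQLite sink are hypothetical stand-ins for a real scraper/API source and warehouse target.

```python
import sqlite3

# Hypothetical sample records standing in for the output of a scraper or API.
RAW = [
    {"title": "Data Engineer", "city": "Tel Aviv-Yafo", "salary": "45000"},
    {"title": "data engineer ", "city": "Tel Aviv-Yafo", "salary": None},
]

def extract():
    # A real pipeline would call an API or scrape pages here.
    return RAW

def transform(rows):
    # Normalize text fields and drop records missing required values.
    clean = []
    for r in rows:
        if r["salary"] is None:
            continue
        clean.append({"title": r["title"].strip().title(),
                      "city": r["city"],
                      "salary": int(r["salary"])})
    return clean

def load(rows, conn):
    # Deliver the cleaned dataset to an accessible store (SQLite here).
    conn.execute(
        "CREATE TABLE IF NOT EXISTS jobs (title TEXT, city TEXT, salary INTEGER)")
    conn.executemany("INSERT INTO jobs VALUES (:title, :city, :salary)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT title, salary FROM jobs").fetchall())
# → [('Data Engineer', 45000)]
```

In production the same extract/transform/load stages would be fed by real sources and scheduled by an orchestrator; the shape of the code stays the same.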
This position is open to all candidates.
 
Job ID: 8249956
Posted: 19/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a talented and passionate Data Engineer to join our growing Data team. In this pivotal role, you will be instrumental in designing, building, and optimizing the critical data infrastructure that underpins our innovative creative intelligence platform. You will tackle complex data challenges, ensuring our systems are robust, scalable, and capable of delivering high-quality data to power our advanced AI models, customer-facing analytics, and internal business intelligence. This is an opportunity to make a significant impact on our product, contribute to a data-driven culture, and help solve fascinating problems at the intersection of data, AI, and marketing technology.
Key Responsibilities
Architect & Develop Data Pipelines: Design, implement, and maintain sophisticated, end-to-end data pipelines for ingesting, processing, validating, and transforming large-scale, diverse datasets.
Manage Data Orchestration: Implement and manage robust workflow orchestration for complex, multi-step data processes, ensuring reliability and visibility.
Advanced Data Transformation & Modeling: Develop and optimize complex data transformations using advanced SQL and other data manipulation techniques. Contribute to the design and implementation of effective data models for analytical and operational use.
Ensure Data Quality & Platform Reliability: Establish and improve processes for data quality assurance, monitoring, alerting, and performance optimization across the data platform. Proactively identify and resolve data integrity and pipeline issues.
Cross-Functional Collaboration: Partner closely with AI engineers, product managers, developers, customer success and other stakeholders to understand data needs, integrate data solutions, and deliver features that provide exceptional value.
Drive Data Platform Excellence: Contribute to the evolution of our data architecture, champion best practices in data engineering (e.g., DataOps principles), and evaluate emerging technologies to enhance platform capabilities, stability, and cost-effectiveness.
Foster a Culture of Learning & Impact: Actively share knowledge, contribute to team growth, and maintain a strong focus on how data engineering efforts translate into tangible product and business outcomes.
Requirements:
3+ years of experience as a Data Engineer, building and managing complex data pipelines and data-intensive applications.
Solid understanding and application of software engineering principles and best practices. Proficiency in a relevant programming language (e.g., Python, Scala, Java) is highly desirable.
Deep expertise in writing, optimizing, and troubleshooting complex SQL queries for data transformation, aggregation, and analysis in relational and analytical database environments.
Hands-on experience with distributed data processing systems, cloud-based data platforms, data warehousing concepts, and workflow management tools.
Strong ability to diagnose complex technical issues, identify root causes, and develop effective, scalable solutions.
A genuine enthusiasm for tackling new data challenges, exploring innovative technologies, and continually expanding your skillset.
A keen interest in understanding how data powers product features and drives business value, with a focus on delivering results.
Excellent ability to communicate technical ideas clearly and work effectively within a multi-disciplinary team environment.
Advantages:
Familiarity with the marketing/advertising technology domain and associated datasets.
Experience with data related to creative assets, particularly video or image analysis.
Understanding of MLOps principles or experience supporting machine learning workflows.
This position is open to all candidates.
 
Job ID: 8223467
Posted: 08/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
If you share our love of sports and tech and have the passion and drive to better the sports-tech and data industries, join the team. We are looking for a Data & AI Architect.
Responsibilities:
Build the foundations of modern data architecture, supporting real-time, high-scale (Big Data) sports data pipelines and ML/AI use cases, including Generative AI.
Map the company's data needs and lead the selection and implementation of key technologies across the stack: data lakes (e.g., Iceberg), databases, ETL/ELT tools, orchestrators, data quality and observability frameworks, and statistical/ML tools.
Design and build a cloud-native, cost-efficient, and scalable data infrastructure from scratch, capable of supporting rapid growth, high concurrency, and low-latency SLAs (e.g., 1-second delivery).
Lead design reviews and provide architectural guidance for all data solutions, including data engineering, analytics, and ML/data science workflows.
Set high standards for data quality, integrity, and observability. Design and implement processes and tools to monitor and proactively address issues like missing events, data delays, or integrity failures.
Collaborate cross-functionally with other architects, R&D, product, and innovation teams to ensure alignment between infrastructure, product goals, and real-world constraints.
Mentor engineers and promote best practices around data modeling, storage, streaming, and observability.
Stay up-to-date with industry trends, evaluate emerging data technologies, and lead POCs to assess new tools and frameworks, especially in the domains of Big Data architecture, ML infrastructure, and Generative AI platforms.
Requirements:
At least 10 years of experience in a data engineering role, including 2+ years as a data & AI architect with ownership over company-wide architecture decisions.
Proven experience designing and implementing large-scale, Big Data infrastructure from scratch in a cloud-native environment (GCP preferred).
Excellent proficiency in data modeling, including conceptual, logical, and physical modeling for both analytical and real-time use cases.
Strong hands-on experience with:
Data lake and/or warehouse technologies (e.g., Iceberg, Delta Lake, BigQuery, ClickHouse); Apache Iceberg experience is required
ETL/ELT frameworks and orchestrators (e.g., Airflow, dbt, Dagster)
Real-time streaming technologies (e.g., Kafka, Pub/Sub)
Data observability and quality monitoring solutions
Excellent proficiency in SQL, and in either Python or JavaScript.
Experience designing efficient data extraction and ingestion processes from multiple sources and handling large-scale, high-volume datasets.
Demonstrated ability to build and maintain infrastructure optimized for performance, uptime, and cost, with awareness of AI/ML infrastructure requirements.
Experience working with ML pipelines and AI-enabled data workflows, including support for Generative AI initiatives (e.g., content generation, vector search, model training pipelines) or strong motivation to learn and lead in this space.
Excellent communication skills in English, with the ability to clearly document and explain architectural decisions to technical and non-technical audiences.
Fast learner with strong multitasking abilities; capable of managing several cross-functional initiatives simultaneously.
Willingness to work on-site in Ashkelon once a week.
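Several bullets above concern orchestrators (Airflow, Dagster). Their core idea, running tasks in dependency order, can be shown with a toy stdlib sketch; the task names and the `deps` graph are invented for illustration:

```python
from graphlib import TopologicalSorter

# A toy illustration of what orchestrators like Airflow do: execute tasks
# in dependency order. Task names and the deps graph are made up.
log = []

def make_task(name):
    def task():
        log.append(name)  # a real task would extract, transform, or load data
    return task

tasks = {n: make_task(n) for n in ("extract", "transform", "validate", "load")}
# Each key depends on the set of tasks that must finish before it starts.
deps = {"transform": {"extract"}, "validate": {"transform"}, "load": {"validate"}}

for name in TopologicalSorter(deps).static_order():
    tasks[name]()

print(log)
# → ['extract', 'transform', 'validate', 'load']
```

Real orchestrators add scheduling, retries, backfills, and observability on top of this ordering core.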
Advantage:
Experience leading POCs and tool selection processes.
Familiarity with Databricks, LLM pipelines, or vector databases is a strong plus.
This position is open to all candidates.
 
Job ID: 8208147
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Senior Data Engineer
Realize your potential by joining the leading performance-driven advertising company!
As a Senior Data Engineer in our Business Analytics & Business Intelligence group in our Tel Aviv office, you'll play a vital role in all aspects of the data, from ETL processes to optimizing our business data models and infrastructure.
How youll make an impact:
As a Senior Data Engineer, you'll bring value by:
Have end-to-end ownership: design, develop, deploy, measure and maintain our business data infrastructure, ensuring high availability, high scalability and efficient resource utilization.
Lead the BI Engineering domain, including mentoring BI Engineers, establishing best practices, and driving BI engineering strategy.
Automate data workflows to streamline processes, reducing manual efforts and improving the accuracy of our BI infrastructure.
Collaborate with other departments (e.g., IT/Data, R&D, Product, IS) to ensure seamless integration of BI tools with other systems and data models.
Work closely with the BI Development team to develop, maintain, and automate operational and executive-level reports and dashboards that offer actionable insights.
Identify and evaluate new technologies to improve performance, maintainability, and reliability of our data pipelines.
Excellent communication and collaboration skills to work across teams.
Requirements:
To thrive in this role, you'll need:
Minimum 5 years of experience in a Data Engineering role, working with large scale data.
Excellent coding skills in Java and Python
Experience with Data orchestration tools such as Airflow, or similar.
Experience with designing, developing, maintaining scalable and efficient data pipelines & models.
BSc in Computer Science or related field.
Proven ability to work effectively and independently across multiple teams and beyond organizational boundaries.
Deep understanding of strong Computer Science fundamentals: object-oriented design and data structures.
Leadership skills: able to technically lead and mentor other team members on best practices.
Strong self-learning capabilities
Excellent attention to detail and the ability to remain organized
Strong communications skills, verbal and written
Ability to work in a dynamic environment, with a high level of agility to changing circumstances and priorities
Bonus points if you have:
Experience working in online businesses, especially online advertising, preferably ad-tech.
This position is open to all candidates.
 
Job ID: 8205336
Posted: 04/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Senior Data Engineer
What You'll Do:
Shape the Future of Data - Join our mission to build the foundational pipelines and tools that power measurement, insights, and decision-making across our product, analytics, and leadership teams.
Develop the Platform Infrastructure - Build the core infrastructure that powers our data ecosystem including the Kafka events-system, DDL management with Terraform, internal data APIs on top of Databricks, and custom admin tools (e.g. Django-based interfaces).
Build Real-time Analytical Applications - Develop internal web applications to provide real-time visibility into platform behavior, operational metrics, and business KPIs integrating data engineering with user-facing insights.
Solve Meaningful Problems with the Right Tools - Tackle complex data challenges using modern technologies such as Spark, Kafka, Databricks, AWS, Airflow, and Python. Think creatively to make the hard things simple.
Own It End-to-End - Design, build, and scale our high-quality data platform by developing reliable and efficient data pipelines. Take ownership from concept to production and long-term maintenance.
Collaborate Cross-Functionally - Partner closely with backend engineers, data analysts, and data scientists to drive initiatives from both a platform and business perspective. Help translate ideas into robust data solutions.
Optimize for Analytics and Action - Design and deliver datasets in the right shape, location, and format to maximize usability and impact - whether that's through lakehouse tables, real-time streams, or analytics-optimized storage.
You will report to the Data Engineering Team Lead and help shape a culture of technical excellence, ownership, and impact.
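The real-time analytical applications described above amount to folding an event stream into running aggregates. A minimal, stdlib-only sketch with an invented event shape (in production the loop would read from a Kafka consumer rather than a list):

```python
from collections import defaultdict

# A toy stand-in for a real-time KPI consumer: fold a stream of events into
# running per-key aggregates. Event fields are illustrative, not a real schema.
def consume(events):
    kpis = defaultdict(lambda: {"count": 0, "revenue": 0.0})
    for e in events:  # in production: a Kafka/stream consumer loop
        k = kpis[e["country"]]
        k["count"] += 1
        k["revenue"] += e["amount"]
    return dict(kpis)

stream = [
    {"country": "IL", "amount": 10.0},
    {"country": "IL", "amount": 5.0},
    {"country": "US", "amount": 7.5},
]
print(consume(stream))
# → {'IL': {'count': 2, 'revenue': 15.0}, 'US': {'count': 1, 'revenue': 7.5}}
```

A web dashboard would then expose these aggregates, with Spark or Kafka Streams handling the same fold at scale.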
Requirements:
5+ years of hands-on experience as a Data Engineer, building and operating production-grade data systems.
3+ years of experience with Spark, SQL, Python, and orchestration tools like Airflow (or similar).
Degree in Computer Science, Engineering, or a related quantitative field.
Proven track record in designing and implementing high-scale ETL pipelines and real-time or batch data workflows.
Deep understanding of data lakehouse and warehouse architectures, dimensional modeling, and performance optimization.
Strong analytical thinking, debugging, and problem-solving skills in complex environments.
Familiarity with infrastructure as code, CI/CD pipelines, and building data-oriented microservices or APIs.
Enthusiasm for AI-driven developer tools such as Cursor.AI or GitHub Copilot.
Bonus points:
Hands-on experience with our stack: Databricks, Delta Lake, Kafka, Docker, Airflow, Terraform, and AWS.
Experience in building self-serve data platforms and improving developer experience across the organization.
This position is open to all candidates.
 
Job ID: 8204175
Posted: 16/06/2025
Location: Tel Aviv-Yafo
Job Type: Full Time and Temporary
We are looking for a Data Engineer to join our team and play a key role in designing, building, and maintaining scalable, cloud-based data pipelines. You will work with AWS (Redshift, S3, Glue, Managed Airflow, Lambda) to integrate, process, and analyze large datasets, ensuring data reliability and efficiency.
Your work will directly impact business intelligence, analytics, and data-driven decision-making across the company.
What You'll Do:
ETL & Data Processing: Develop and maintain ETL processes, integrating data from various sources (APIs, databases, external platforms) using Python, SQL, and cloud technologies.
Cloud & Big Data Technologies: Implement solutions using PySpark, Databricks, Airflow, and cloud platforms (AWS) to process large-scale datasets efficiently.
Data Modeling: Design and maintain logical and physical data models to support business needs.
Optimization & Scalability: Improve process efficiency and optimize runtime performance to handle large-scale data workloads.
Collaboration: Work closely with BI analysts and business stakeholders to define data requirements and functional specifications.
Monitoring & Troubleshooting: Ensure data integrity and reliability by proactively monitoring pipelines and resolving issues.
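The monitoring bullet above can be made concrete with a small sketch of batch-level quality checks; the column names and threshold are illustrative, not a real schema:

```python
# Minimal batch-level data-quality checks: required-column and row-count
# guards. Column names and thresholds are illustrative, not a real schema.
def check_batch(rows, min_rows=1, required=("event_id", "event_date")):
    issues = []
    if len(rows) < min_rows:
        issues.append(f"row count {len(rows)} below minimum {min_rows}")
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) is None:
                issues.append(f"row {i}: missing {col}")
    return issues

batch = [
    {"event_id": 1, "event_date": "2025-06-01"},
    {"event_id": None, "event_date": "2025-06-01"},
]
print(check_batch(batch))
# → ['row 1: missing event_id']
```

In a real pipeline such checks run as a gate before loading, with failures routed to alerting rather than a returned list.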
Requirements:
Education & Experience:
BSc in Computer Science, Engineering, or equivalent practical experience.
3+ years of experience in data engineering or related roles.
Technical Expertise:
Proficiency in Python for data engineering and automation.
Experience with Big Data technologies such as Spark, Databricks, DBT, and Airflow.
Hands-on experience with AWS services (S3, Redshift, Glue, Managed Airflow, Lambda).
Knowledge of Docker, Terraform, Kubernetes, and infrastructure automation.
Strong understanding of data warehouse (DWH) methodologies and best practices.
Soft Skills:
Strong problem-solving abilities and a proactive approach to learning new technologies.
Excellent communication and collaboration skills, with the ability to work independently and in a team.
Nice to Have (Big Advantage):
Experience with JavaScript, React, and Node.js.
Familiarity with K8s for infrastructure as code.
Experience with Retool for internal tool development.
This position is open to all candidates.
 
Job ID: 8219367
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced and passionate Staff Data Engineer to join our Data Platform group in TLV as a Tech Lead. As the group's Tech Lead, you'll shape and implement the technical vision and architecture while staying hands-on across three specialized teams: Data Engineering Infra, Machine Learning Platform, and Data Warehouse Engineering, forming the backbone of our company's data ecosystem.
The group's mission is to build a state-of-the-art Data Platform that drives our company toward becoming the most precise and efficient insurance company on the planet. By embracing Data Mesh principles, we create tools that empower teams to own their data while leveraging a robust, self-serve data infrastructure. This approach enables Data Scientists, Analysts, Backend Engineers, and other stakeholders to seamlessly access, analyze, and innovate with reliable, well-modeled, and queryable data, at scale.
In this role you'll:
Technically lead the group by shaping the architecture, guiding design decisions, and ensuring the technical excellence of the Data Platforms three teams
Design and implement data solutions that address both applicative needs and data analysis requirements, creating scalable and efficient access to actionable insights
Drive initiatives in Data Engineering Infra, including building robust ingestion layers, managing streaming ETLs, and guaranteeing data quality, compliance, and platform performance
Develop and maintain the Data Warehouse, integrating data from various sources for optimized querying, analysis, and persistence, supporting informed decision-making
Leverage data modeling and transformations to structure, cleanse, and integrate data, enabling efficient retrieval and strategic insights
Build and enhance the Machine Learning Platform, delivering infrastructure and tools that streamline the work of Data Scientists, enabling them to focus on developing models while benefiting from automation for production deployment, maintenance, and improvements. Support cutting-edge use cases like feature stores, real-time models, point-in-time (PIT) data retrieval, and telematics-based solutions
Collaborate closely with other Staff Engineers across our company to align on cross-organizational initiatives and technical strategies
Work seamlessly with Data Engineers, Data Scientists, Analysts, Backend Engineers, and Product Managers to deliver impactful solutions
Share knowledge, mentor team members, and champion engineering standards and technical excellence across the organization.
Requirements:
8+ years of experience in data-related roles such as Data Engineer, Data Infrastructure Engineer, BI Engineer, or Machine Learning Platform Engineer, with significant experience in at least two of these areas
A B.Sc. in Computer Science or a related technical field (or equivalent experience)
Extensive expertise in designing and implementing Data Lakes and Data Warehouses, including strong skills in data modeling and building scalable storage solutions
Proven experience in building large-scale data infrastructures, including both batch processing and streaming pipelines
A deep understanding of Machine Learning infrastructure, including tools and frameworks that enable Data Scientists to efficiently develop, deploy, and maintain models in production, an advantage
Proficiency in Python, Pulumi/Terraform, Apache Spark, AWS, Kubernetes (K8s), and Kafka for building scalable, reliable, and high-performing data solutions
Strong knowledge of databases, including SQL (schema design, query optimization) and NoSQL, with a solid understanding of their use cases
Ability to work in an office environment a minimum of 3 days a week
Enthusiasm about learning and adapting to the exciting world of AI; a commitment to exploring this field is a fundamental part of our culture.
This position is open to all candidates.
 
Job ID: 8206357
Posted: 01/07/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
The leading global financial news & data platform is on the lookout for a Senior Data Engineer to join our growing data team!
In this role, you will design and develop scalable data solutions, optimize data workflows, and support critical business processes. You will work with a variety of databases and big data tools in a cloud environment, focusing on data modeling, governance, and analytics.

What you'll be doing:
Design, develop, and maintain end-to-end ETL pipelines, from gathering business requirements to implementation.
Work with multiple database technologies, especially BigQuery.
Optimize data models (DWH, fact & dimension tables, RI, SCDs) for performance and scalability.
Implement data governance best practices and maintain comprehensive documentation.
Utilize Big Data tools in cloud environments (GCP preferred).
Develop and support complex business workflows and data processes.
Design and implement monitoring systems to ensure data quality throughout the pipeline.
Orchestrate workflows using Apache Airflow.
Collaborate with analysts and stakeholders to ensure high-quality data for business insights.
Support and optimize Tableau infrastructure for data visualization.
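The SCDs mentioned in the data-modeling bullet are slowly changing dimensions: a Type-2 dimension preserves history by closing the current row and inserting a new one. A minimal sqlite3 sketch with an invented `dim_customer` table:

```python
import sqlite3

# A Type-2 slowly changing dimension keeps history by closing the current
# row and inserting a new current one. Table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT,
    valid_from TEXT, valid_to TEXT, is_current INTEGER)""")

def upsert_scd2(conn, customer_id, city, as_of):
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if cur and cur[0] == city:
        return  # attribute unchanged: nothing to record
    if cur:
        # Close out the previous version of the row.
        conn.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1", (as_of, customer_id))
    conn.execute(
        "INSERT INTO dim_customer VALUES (?,?,?,'9999-12-31',1)",
        (customer_id, city, as_of))

upsert_scd2(conn, 1, "Haifa", "2025-01-01")
upsert_scd2(conn, 1, "Tel Aviv-Yafo", "2025-06-01")
print(conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from").fetchall())
# → [('Haifa', 0), ('Tel Aviv-Yafo', 1)]
```

In a warehouse such as BigQuery the same logic is typically expressed as a MERGE statement rather than row-at-a-time upserts.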
Requirements:
4+ years of experience in Data Engineering.
Strong SQL skills and expertise in BigQuery or similar databases.
4+ years of Python experience for data processing and automation.
Proven experience in designing complex business workflows and data processes.
Deep understanding of data modeling principles and best practices.
Hands-on experience with cloud-based big data tools (GCP preferred).
Must have experience with Apache Airflow for orchestrating data workflows.
Strong analytical skills with the ability to translate business needs into technical solutions.
Experience with Tableau infrastructure management is an advantage.
Excellent communication skills and ability to work cross-functionally.
Nice to have:
Familiarity with streaming data frameworks (Kafka, Pub/Sub).
If you are passionate about data, scalability, and building efficient solutions, we'd love to hear from you! Apply now and be part of our data-driven journey.
This position is open to all candidates.
 
Job ID: 8239782
Posted: 2 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
At UVeye, we are on a mission to redefine vehicle safety and reliability on a global scale. Founded in 2016, we have pioneered the world's first fully automated suite of vehicle inspection systems. At the heart of this innovation lies our advanced AI-driven technology, representing the pinnacle of machine learning, GenAI, and computer vision within the automotive sector. With close to $400 million in funding and strategic partnerships with industry giants such as Amazon, General Motors, Volvo, and CarMax, UVeye stands at the forefront of automotive technological advancement. Our growing global team of over 200 employees is committed to creating a workplace that celebrates diversity and encourages teamwork. Our drive for innovation and pursuit of excellence are deeply embedded in our vibrant company culture, ensuring that each individual's efforts are recognized and valued as we unite to build a safer automotive world.
We are looking for an experienced Senior Data Engineer to join our Data team. In this role, you will lead and strengthen our Data Team, drive innovation, and ensure the robustness of our data and analytics platforms.
A day in the life and how you’ll make an impact:
* Design and develop high-performance data pipelines and ETL processes to support diverse business needs.
* Work closely with business intelligence, sales, and other teams to integrate data solutions, ensuring seamless alignment and collaboration across functions.
* Continuously improve our data analytics platforms, optimizing system performance while ensuring a robust and reliable data infrastructure.
* Oversee the entire data lifecycle, from infrastructure setup and data acquisition to detailed analysis and automated reporting, driving business growth through data-driven insights.
* Implement robust data quality checks, monitoring mechanisms, and data governance policies to maintain data integrity and security, troubleshooting and resolving any data-related issues efficiently.
Requirements:
* B.Sc. in computer science/information systems engineering
* 5+ years of experience in data engineering (Preferably from a startup company)
* Familiarity with data engineering tech stack, including ETL tools (Airflow, Spark, Flink, Kafka, Pubsub).
* Strong SQL expertise, working with various databases (relational and NoSQL) such as MySQL, FireStore, Redis, and ElasticSearch.
* Experience with cloud-based data warehouse solutions like BigQuery, Snowflake, and Oracle, and proficiency in working with public clouds (AWS/GCP).
* Coding experience with Python
* Experience with dashboard tools.
* Ability to communicate ideas and analyze results effectively, both verbally and in writing.

Why UVeye: Pioneer Advanced Solutions: Harness cutting-edge technologies in AI, machine learning, and computer vision to revolutionize vehicle inspections. Drive Global Impact: Your innovations will play a crucial role in enhancing automotive safety and reliability, impacting lives and businesses on an international scale. Career Growth Opportunities: Participate in a journey of rapid development, surrounded by groundbreaking advancements and strategic industry partnerships.
This position is open to all candidates.
 
Job ID: 8155581
Posted: 29/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We seek a Director of Data to join us and lead our data group.
As our Director of Data, you will be a key member of our R&D leadership team. You will be responsible for developing and executing a data strategy that aligns with our business goals, overseeing data management, analytics, and validation, and ensuring data integrity at every stage of product development and production.
A day in the life and how you'll make an impact:
Define and execute a strategic data roadmap aligned with business objectives, fostering a data-driven culture and leading a high-performing team of data engineers, scientists, and analysts.
Establish robust data validation frameworks, ensuring product integrity and accuracy through all stages, from data acquisition to end-user delivery.
Build and optimize scalable data infrastructure and pipelines to support our data needs and ensure data security, compliance, and accessibility.
Collaborate with product and engineering teams to create and launch data-driven products, ensuring they are built on reliable data and designed to meet customer needs.
Guide the team in generating actionable insights to drive business decisions and product innovation in areas such as personalization, marketing, and customer success.
Implement data governance policies and maintain compliance with industry regulations and best practices.
Requirements:
10+ years of experience in data-related roles, with at least 5 years in a leadership position (ideally within a tech or AI-driven startup environment).
M.Sc. or PhD in Data Science/Computer Science/Engineering/Statistics, or a related field.
Extensive experience with cloud platforms (AWS, GCP, or Azure) and modern data warehouses (Snowflake, BigQuery, or Redshift).
Proficiency in data technologies, such as SQL, Python, R, Looker and big data tools (e.g., Hadoop, Spark).
Proven experience in leveraging data for product development, business intelligence, and operational optimization.
Strong track record of building and managing cross-functional data teams and influencing across all levels of an organization.
Excellent communication skills, with the ability to convey complex data insights in an accessible manner to non-technical stakeholders.
This position is open to all candidates.
 
Job ID: 8234801
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Staff Algo Data Engineer
Realize your potential by joining the leading performance-driven advertising company!
As a Staff Algo Data Engineer in the Infra group, you'll play a vital role in developing, enhancing, and maintaining highly scalable machine-learning infrastructures and tools.
About Algo platform:
The objective of the algo platform group is to own the existing algo platform (including health, stability, productivity and enablement), to facilitate and take part in new platform experimentation within the algo craft, and to lead the platformization of the parts that should graduate to production scale. This includes supporting ongoing ML projects while ensuring smooth operations and infrastructure reliability, and owning a full set of capabilities: design and planning, implementation, and production care.
The group has deep ties with both the algo craft as well as the infra group. The group reports to the infra department and has a dotted line reporting to the algo craft leadership.
The group serves as the professional authority on ML engineering and MLOps, acts as a focal point in a multidisciplinary team of algorithm researchers, product managers, and engineers, and works with the most senior talent within the algo craft to achieve ML excellence.
How you'll make an impact:
As a Staff Algo Data Engineer, you'll bring value by:
Develop, enhance and maintain highly scalable Machine-Learning infrastructures and tools, including CI/CD, monitoring and alerting and more
Have end-to-end ownership: design, develop, deploy, measure and maintain our machine learning platform, ensuring high availability, high scalability and efficient resource utilization
Identify and evaluate new technologies to improve performance, maintainability, and reliability of our machine learning systems
Work in tandem with the engineering-focused and algorithm-focused teams in order to improve our platform and optimize performance
Optimize machine learning systems to scale and utilize modern compute environments (e.g. distributed clusters, CPU and GPU) and continuously seek potential optimization opportunities.
Build and maintain tools for automation, deployment, monitoring, and operations.
Troubleshoot issues in our development, production and test environments
Directly influence the way billions of people discover the internet
Our tech stack:
Java, Python, TensorFlow, Spark, Kafka, Cassandra, HDFS, vespa.ai, ElasticSearch, AirFlow, BigQuery, Google Cloud Platform, Kubernetes, Docker, git and Jenkins.
Requirements:
To thrive in this role, you'll need:
Experience developing large-scale systems, including filesystems, server architectures, distributed systems, SQL and NoSQL. Experience with Spark and Airflow or other orchestration platforms is a big plus.
Highly skilled in software engineering methods, with 5+ years of experience.
Passion for ML engineering and for creating and improving platforms
Experience with designing and supporting ML pipelines and models in production environment
Excellent coding skills in Java & Python
Experience with TensorFlow a big plus
Possess strong problem solving and critical thinking skills
BSc in Computer Science or related field.
Proven ability to work effectively and independently across multiple teams and beyond organizational boundaries
Deep understanding of strong Computer Science fundamentals: object-oriented design, data structures, systems and applications programming, and multithreaded programming
Strong communication skills to present insights and ideas, and excellent English, required to communicate with our global teams.
Bonus points if you have:
Experience in leading Algorithms projects or teams.
Experience in developing models using deep learning techniques and tools
Experience in developing software within a distributed computation framework.
This position is open to all candidates.
 
Job ID: 8205385