Confidential Company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Staff Algo Data Engineer
Realize your potential by joining the leading performance-driven advertising company!
As a Staff Algo Data Engineer in the Infra group, you'll play a vital role in developing, enhancing, and maintaining highly scalable machine-learning infrastructure and tools.
About Algo platform:
The objective of the algo platform group is to own the existing algo platform (including health, stability, productivity, and enablement), to facilitate and be involved in new platform experimentation within the algo craft, and to lead the platformization of the parts that should graduate to production scale. This includes supporting ongoing ML projects while ensuring smooth operations and infrastructure reliability, and owning a full set of capabilities: design and planning, implementation, and production care.
The group has deep ties with both the algo craft and the infra group. It reports to the infra department, with a dotted line to the algo craft leadership.
The group serves as the professional authority on ML engineering and MLOps, acts as a focal point in a multidisciplinary team of algorithm researchers, product managers, and engineers, and works with the most senior talent within the algo craft to achieve ML excellence.
How you'll make an impact:
As a Staff Algo Data Engineer, you'll bring value by:
Developing, enhancing, and maintaining highly scalable machine-learning infrastructure and tools, including CI/CD, monitoring, and alerting
Taking end-to-end ownership: designing, developing, deploying, measuring, and maintaining our machine-learning platform, ensuring high availability, high scalability, and efficient resource utilization
Identifying and evaluating new technologies to improve the performance, maintainability, and reliability of our machine-learning systems
Working in tandem with the engineering-focused and algorithm-focused teams to improve our platform and optimize performance
Optimizing machine-learning systems to scale across modern compute environments (e.g., distributed clusters, CPU and GPU) and continuously seeking optimization opportunities
Building and maintaining tools for automation, deployment, monitoring, and operations
Troubleshooting issues in our development, test, and production environments
Directly influencing the way billions of people discover the internet
Our tech stack:
Java, Python, TensorFlow, Spark, Kafka, Cassandra, HDFS, vespa.ai, Elasticsearch, Airflow, BigQuery, Google Cloud Platform, Kubernetes, Docker, Git, and Jenkins.
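The CI/CD, monitoring, and alerting responsibility above can be sketched in a few lines. Purely as an illustration (the step name, time budget, and in-memory alert list are hypothetical stand-ins for a real metrics/paging backend):

```python
import functools
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ml-platform")

ALERTS = []  # stand-in for a real alerting backend (e.g. a pager or Slack hook)

def monitored(max_seconds: float):
    """Record a pipeline step's duration; alert on failure or slowness."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.monotonic()
            try:
                result = fn(*args, **kwargs)
            except Exception as exc:
                ALERTS.append(f"{fn.__name__} failed: {exc}")
                raise
            elapsed = time.monotonic() - start
            log.info("%s finished in %.3fs", fn.__name__, elapsed)
            if elapsed > max_seconds:
                ALERTS.append(f"{fn.__name__} exceeded {max_seconds}s budget")
            return result
        return wrapper
    return decorator

@monitored(max_seconds=0.5)
def train_step(batch):
    # hypothetical pipeline step: average a batch of values
    return sum(batch) / len(batch)

print(train_step([1.0, 2.0, 3.0]))  # 2.0
```

In a real platform the decorator would emit metrics to the monitoring stack (e.g. Grafana) instead of appending to a list.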
Requirements:
Experience developing large-scale systems. Experience with filesystems, server architectures, distributed systems, SQL, and NoSQL. Experience with Spark and Airflow (or other orchestration platforms) is a big plus.
Highly skilled in software engineering methods; 5+ years of experience.
Passion for ML engineering and for creating and improving platforms
Experience designing and supporting ML pipelines and models in a production environment
Excellent coding skills in Java & Python
Experience with TensorFlow is a big plus
Strong problem-solving and critical-thinking skills
BSc in Computer Science or a related field.
Proven ability to work effectively and independently across multiple teams and beyond organizational boundaries
Deep understanding of Computer Science fundamentals: object-oriented design, data structures, systems and applications programming, and multithreaded programming
Strong communication skills to present insights and ideas, and excellent English, required to communicate with our global teams.
Bonus points if you have:
Experience in leading Algorithms projects or teams.
Experience in developing models using deep learning techniques and tools
Experience in developing software within a distributed computation framework.
This position is open to all candidates.
 
Job ID: 8272673
Confidential Company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Senior Algo Data Engineer
Realize your potential by joining the leading performance-driven advertising company!
As a Senior Algo Data Engineer in the Infra group, you'll play a vital role in developing, enhancing, and maintaining highly scalable machine-learning infrastructure and tools.
About Algo platform:
The objective of the algo platform group is to own the existing algo platform (including health, stability, productivity, and enablement), to facilitate and be involved in new platform experimentation within the algo craft, and to lead the platformization of the parts that should graduate to production scale. This includes supporting ongoing ML projects while ensuring smooth operations and infrastructure reliability, and owning a full set of capabilities: design and planning, implementation, and production care.
The group has deep ties with both the algo craft and the infra group. It reports to the infra department, with a dotted line to the algo craft leadership.
The group serves as the professional authority on ML engineering and MLOps, acts as a focal point in a multidisciplinary team of algorithm researchers, product managers, and engineers, and works with the most senior talent within the algo craft to achieve ML excellence.
How you'll make an impact:
As a Senior Algo Data Engineer, you'll bring value by:
Developing, enhancing, and maintaining highly scalable machine-learning infrastructure and tools, including CI/CD, monitoring, and alerting
Taking end-to-end ownership: designing, developing, deploying, measuring, and maintaining our machine-learning platform, ensuring high availability, high scalability, and efficient resource utilization
Identifying and evaluating new technologies to improve the performance, maintainability, and reliability of our machine-learning systems
Working in tandem with the engineering-focused and algorithm-focused teams to improve our platform and optimize performance
Optimizing machine-learning systems to scale across modern compute environments (e.g., distributed clusters, CPU and GPU) and continuously seeking optimization opportunities
Building and maintaining tools for automation, deployment, monitoring, and operations
Troubleshooting issues in our development, test, and production environments
Directly influencing the way billions of people discover the internet
Our tech stack:
Java, Python, TensorFlow, Spark, Kafka, Cassandra, HDFS, vespa.ai, Elasticsearch, Airflow, BigQuery, Google Cloud Platform, Kubernetes, Docker, Git, and Jenkins.
Requirements:
To thrive in this role, you'll need:
Experience developing large-scale systems. Experience with filesystems, server architectures, distributed systems, SQL, and NoSQL. Experience with Spark and Airflow (or other orchestration platforms) is a big plus.
Highly skilled in software engineering methods; 5+ years of experience.
Passion for ML engineering and for creating and improving platforms
Experience designing and supporting ML pipelines and models in a production environment
Excellent coding skills in Java & Python
Experience with TensorFlow is a big plus
Strong problem-solving and critical-thinking skills
BSc in Computer Science or a related field.
Proven ability to work effectively and independently across multiple teams and beyond organizational boundaries
Deep understanding of Computer Science fundamentals: object-oriented design, data structures, systems and applications programming, and multithreaded programming
Strong communication skills to present insights and ideas, and excellent English, required to communicate with our global teams.
Bonus points if you have:
Experience in leading Algorithms projects or teams.
Experience in developing models using deep learning techniques and tools
Experience in developing software within a distributed computation framework.
This position is open to all candidates.
 
Job ID: 8274042
Confidential Company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Data Engineer
Realize your potential by joining the leading performance-driven advertising company!
As a Data Engineer in our Business Analytics & Business Intelligence group in the Tel Aviv office, you'll play a vital role in all aspects of the data, from ETL processes to optimizing our business data models and infrastructure.
How you'll make an impact:
As a Data Engineer, you'll bring value by:
Taking end-to-end ownership: designing, developing, deploying, measuring, and maintaining our business data infrastructure, ensuring high availability, high scalability, and efficient resource utilization
Leading the BI engineering domain, including mentoring BI engineers, establishing best practices, and driving BI engineering strategy
Automating data workflows to streamline processes, reducing manual effort and improving the accuracy of our BI infrastructure
Collaborating with other departments (e.g., IT/Data, R&D, Product, IS) to ensure seamless integration of BI tools with other systems and data models
Working closely with the BI development team to develop, maintain, and automate operational and executive-level reports and dashboards that offer actionable insights
Identifying and evaluating new technologies to improve the performance, maintainability, and reliability of our data pipelines
Communicating and collaborating effectively across teams
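The workflow-automation responsibility above boils down to dependency-ordered task execution, which Python's standard library can illustrate. A minimal sketch with a made-up BI workflow (task names are hypothetical; a real deployment would use an orchestrator such as Airflow, which the requirements below mention):

```python
from graphlib import TopologicalSorter

# Hypothetical BI workflow: task name -> set of upstream dependencies,
# mirroring how an Airflow DAG wires extract -> transform -> load -> report.
DAG = {
    "extract_sales": set(),
    "extract_users": set(),
    "transform": {"extract_sales", "extract_users"},
    "load_mart": {"transform"},
    "refresh_dashboard": {"load_mart"},
}

def run_order(dag):
    """Return one valid execution order that respects every dependency."""
    return list(TopologicalSorter(dag).static_order())

order = run_order(DAG)
print(order)
# every task appears after all of its upstream dependencies
assert order.index("transform") > order.index("extract_sales")
```

An orchestrator adds scheduling, retries, and visibility on top of exactly this ordering guarantee.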
Requirements:
To thrive in this role, you'll need:
Minimum 3 years of experience in a data engineering role, working with large-scale data.
Excellent coding skills in Java and Python (must).
Experience with data orchestration tools such as Airflow or similar.
Experience designing, developing, and maintaining scalable and efficient data pipelines and models.
BSc in Computer Science or a related field.
Proven ability to work effectively and independently across multiple teams and beyond organizational boundaries.
Deep understanding of Computer Science fundamentals: object-oriented design and data structures.
Leadership skills: able to technically lead and mentor other team members on best practices.
Strong self-learning capabilities.
Excellent attention to detail and the ability to remain organized.
Strong communication skills, verbal and written.
Ability to work in a dynamic environment, with a high level of agility to changing circumstances and priorities.
Bonus points if you have:
Experience working in online businesses, especially online advertising, preferably ad-tech.
This position is open to all candidates.
 
Job ID: 8274192
14/07/2025
Location: Tel Aviv-Yafo and Netanya
Job Type: Full Time
At our company, we're reinventing DevOps and MLOps to help the world's greatest companies innovate -- and we want you along for the ride. This is a special place with a unique combination of brilliance, spirit, and just all-around great people. Here, if you're willing to do more, your career can take off. And since software plays a central role in everyone's lives, you'll be part of an important mission. Thousands of customers, including the majority of the Fortune 100, trust our company to manage, accelerate, and secure their software delivery from code to production - a concept we call liquid software. Wouldn't it be amazing if you could join us on our journey?
About the Team
We are seeking a highly skilled Senior Data Engineer to join our company's ML Data Group and help drive the development and optimization of our cutting-edge data infrastructure. As a key member of the company's ML Platform team, you will play an instrumental role in building and evolving our feature store data pipeline, enabling machine learning teams to efficiently access and work with high-quality, real-time data at scale.
In this dynamic, fast-paced environment, you will collaborate with other data professionals to create robust, scalable data solutions. You will be responsible for architecting, designing, and implementing data pipelines that ensure reliable data ingestion, transformation, and storage, ultimately supporting the production of high-performance ML models.
We are looking for data-driven problem-solvers who thrive in ambiguous, fast-moving environments and are passionate about building data systems that empower teams to innovate and scale. We value independent thinkers with a strong sense of ownership, who can take challenges from concept to production while continuously improving our data infrastructure.
As a Data Engineer in our company's ML Data Group, you will...
Design and implement large-scale batch and streaming data pipeline infrastructure
Build and optimize data workflows for maximum reliability and performance
Develop solutions for real-time data processing and analytics
Implement data consistency checks and quality assurance processes
Design and maintain state management systems for distributed data processing
Take a crucial role in building the group's engineering culture, tools, and methodologies
Define abstractions, methodologies, and coding standards for the entire Data Engineering pipeline.
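The batch and streaming responsibilities above center on keyed, windowed state. As a toy illustration of a tumbling-window count (single-process and in-memory; a framework like Flink or Spark Structured Streaming manages this state for you at scale):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Assign each (timestamp, key) event to a fixed window and count per key.

    A deliberately tiny stand-in for the keyed-window state that stream
    processors maintain across a distributed cluster.
    """
    state = defaultdict(int)  # (window_start, key) -> count
    for ts, key in events:
        window_start = ts - (ts % window_seconds)
        state[(window_start, key)] += 1
    return dict(state)

# hypothetical event stream: (epoch-second, event type)
events = [(0, "click"), (30, "click"), (61, "click"), (65, "view")]
print(tumbling_window_counts(events))
# {(0, 'click'): 2, (60, 'click'): 1, (60, 'view'): 1}
```

Real pipelines add the hard parts the posting names: checkpointed state, late-event handling, and consistency checks across workers.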
Requirements:
5+ years of experience as a Software Engineer with focus on data engineering
Expert knowledge in building and maintaining data pipelines at scale
Strong experience with stream/batch processing frameworks (e.g. Apache Spark, Flink)
Profound understanding of message brokers (e.g. Kafka, RabbitMQ)
Experience with data warehousing and lake technologies
Strong Python programming skills and experience building data engineering tools
Experience with designing and maintaining Python SDKs
Proficiency in Java for data processing applications
Understanding of data modeling and optimization techniques
Bonus Points
Experience with ML model deployment and maintenance in production
Knowledge of data governance and compliance requirements
Experience with real-time analytics and processing
Understanding of distributed systems and cloud architectures
Experience with data visualization and lineage tools/frameworks and techniques.
This position is open to all candidates.
 
Job ID: 8257535
19/06/2025
Confidential Company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a talented and passionate Data Engineer to join our growing Data team. In this pivotal role, you will be instrumental in designing, building, and optimizing the critical data infrastructure that underpins our innovative creative intelligence platform. You will tackle complex data challenges, ensuring our systems are robust, scalable, and capable of delivering high-quality data to power our advanced AI models, customer-facing analytics, and internal business intelligence. This is an opportunity to make a significant impact on our product, contribute to a data-driven culture, and help solve fascinating problems at the intersection of data, AI, and marketing technology.
Key Responsibilities
Architect & Develop Data Pipelines: Design, implement, and maintain sophisticated, end-to-end data pipelines for ingesting, processing, validating, and transforming large-scale, diverse datasets.
Manage Data Orchestration: Implement and manage robust workflow orchestration for complex, multi-step data processes, ensuring reliability and visibility.
Advanced Data Transformation & Modeling: Develop and optimize complex data transformations using advanced SQL and other data manipulation techniques. Contribute to the design and implementation of effective data models for analytical and operational use.
Ensure Data Quality & Platform Reliability: Establish and improve processes for data quality assurance, monitoring, alerting, and performance optimization across the data platform. Proactively identify and resolve data integrity and pipeline issues.
Cross-Functional Collaboration: Partner closely with AI engineers, product managers, developers, customer success and other stakeholders to understand data needs, integrate data solutions, and deliver features that provide exceptional value.
Drive Data Platform Excellence: Contribute to the evolution of our data architecture, champion best practices in data engineering (e.g., DataOps principles), and evaluate emerging technologies to enhance platform capabilities, stability, and cost-effectiveness.
Foster a Culture of Learning & Impact: Actively share knowledge, contribute to team growth, and maintain a strong focus on how data engineering efforts translate into tangible product and business outcomes.
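The data-quality responsibility above typically starts with per-record validation and quarantine. A minimal sketch, with hypothetical field names:

```python
def validate_record(record, required=("id", "ts", "campaign")):
    """Return a list of data-quality problems for one record (empty = clean)."""
    problems = []
    for field in required:
        if record.get(field) in (None, ""):
            problems.append(f"missing {field}")
    if record.get("ts") is not None and not isinstance(record["ts"], (int, float)):
        problems.append("ts is not numeric")
    return problems

def partition_batch(batch):
    """Split a batch into clean rows and quarantined rows with their errors."""
    clean, quarantined = [], []
    for rec in batch:
        errs = validate_record(rec)
        if errs:
            quarantined.append((rec, errs))
        else:
            clean.append(rec)
    return clean, quarantined

batch = [
    {"id": 1, "ts": 1720000000, "campaign": "a"},
    {"id": 2, "ts": "bad", "campaign": ""},
]
clean, bad = partition_batch(batch)
print(len(clean), len(bad))  # 1 1
```

Production systems wire the quarantine side into monitoring and alerting rather than silently dropping rows.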
Requirements:
3+ years of experience as a Data Engineer, building and managing complex data pipelines and data-intensive applications.
Solid understanding and application of software engineering principles and best practices. Proficiency in a relevant programming language (e.g., Python, Scala, Java) is highly desirable.
Deep expertise in writing, optimizing, and troubleshooting complex SQL queries for data transformation, aggregation, and analysis in relational and analytical database environments.
Hands-on experience with distributed data processing systems, cloud-based data platforms, data warehousing concepts, and workflow management tools.
Strong ability to diagnose complex technical issues, identify root causes, and develop effective, scalable solutions.
A genuine enthusiasm for tackling new data challenges, exploring innovative technologies, and continually expanding your skillset.
A keen interest in understanding how data powers product features and drives business value, with a focus on delivering results.
Excellent ability to communicate technical ideas clearly and work effectively within a multi-disciplinary team environment.
Advantages:
Familiarity with the marketing/advertising technology domain and associated datasets.
Experience with data related to creative assets, particularly video or image analysis.
Understanding of MLOps principles or experience supporting machine learning workflows.
This position is open to all candidates.
 
Job ID: 8223467
01/07/2025
Confidential Company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are making the future of Mobility come to life starting today.
At our company we support the world's largest vehicle fleet operators and transportation providers to optimize existing operations and seamlessly launch new, dynamic business models - driving efficient operations and maximizing utilization.
At the heart of our platform lies the data infrastructure, driving advanced machine learning models and optimization algorithms. As the owner of data pipelines, you'll tackle diverse challenges spanning optimization, prediction, modeling, inference, transportation, and mapping.
As a Senior Data Engineer, you will play a key role in owning and scaling the backend data infrastructure that powers our platform, supporting real-time optimization, advanced analytics, and machine learning applications.
What You'll Do
Design, implement, and maintain robust, scalable data pipelines for batch and real-time processing using Spark and other modern tools.
Own the backend data infrastructure, including ingestion, transformation, validation, and orchestration of large-scale datasets.
Leverage Google Cloud Platform (GCP) services to architect and operate scalable, secure, and cost-effective data solutions across the pipeline lifecycle.
Develop and optimize ETL/ELT workflows across multiple environments to support internal applications, analytics, and machine learning workflows.
Build and maintain data marts and data models with a focus on performance, data quality, and long-term maintainability.
Collaborate with cross-functional teams including development teams, product managers, and external stakeholders to understand and translate data requirements into scalable solutions.
Help drive architectural decisions around distributed data processing, pipeline reliability, and scalability.
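The data-mart responsibility above is essentially an ELT aggregation step. As an illustration using SQLite from the standard library (table and column names are invented; in production the same shape of query would run on the GCP-scale engines mentioned here):

```python
import sqlite3

# Toy ELT step: load raw trip events, then build an aggregated "mart" table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE raw_trips (vehicle_id TEXT, distance_km REAL)")
con.executemany(
    "INSERT INTO raw_trips VALUES (?, ?)",
    [("v1", 12.0), ("v1", 8.0), ("v2", 30.0)],
)
# Transform inside the warehouse: aggregate raw events into a data mart.
con.execute("""
    CREATE TABLE mart_vehicle_daily AS
    SELECT vehicle_id,
           COUNT(*)         AS trips,
           SUM(distance_km) AS total_km
    FROM raw_trips
    GROUP BY vehicle_id
""")
rows = con.execute(
    "SELECT vehicle_id, trips, total_km FROM mart_vehicle_daily ORDER BY vehicle_id"
).fetchall()
print(rows)  # [('v1', 2, 20.0), ('v2', 1, 30.0)]
```

An orchestrator such as Airflow would schedule this step and its upstream ingestion as separate, dependency-ordered tasks.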
Requirements:
4+ years in backend data engineering or infrastructure-focused software development.
Proficient in Python, with experience building production-grade data services.
Solid understanding of SQL
Proven track record designing and operating scalable, low-latency data pipelines (batch and streaming).
Experience building and maintaining data platforms, including lakes, pipelines, and developer tooling.
Familiar with orchestration tools like Airflow, and modern CI/CD practices.
Comfortable working in cloud-native environments (AWS, GCP), including containerization (e.g., Docker, Kubernetes).
Bonus: Experience working with GCP
Bonus: Experience with data quality monitoring and alerting
Bonus: Strong hands-on experience with Spark for distributed data processing at scale.
Degree in Computer Science, Engineering, or related field.
This position is open to all candidates.
 
Job ID: 8238970
13/07/2025
Location: Tel Aviv-Yafo and Netanya
Job Type: Full Time
As a Big Data & GenAI Engineering Lead within our company's Data & AI Department, you will play a pivotal role in building the data and AI backbone that empowers product innovation and intelligent business decisions. You will lead the design and implementation of our company's next-generation lakehouse architecture, real-time data infrastructure, and GenAI-enriched solutions, helping drive automation, insights, and personalization at scale. In this role, you will architect and optimize our modern data platform while also integrating and operationalizing Generative AI models to support go-to-market use cases. This includes embedding LLMs and vector search into core data workflows, establishing secure and scalable RAG pipelines, and partnering cross-functionally to deliver impactful AI applications.
As a Big Data & GenAI Engineering Lead at our company you will...
Design, lead, and evolve our company's petabyte-scale lakehouse and modern data platform to meet performance, scalability, privacy, and extensibility goals.
Architect and implement GenAI-powered data solutions, including retrieval-augmented generation (RAG), semantic search, and LLM orchestration frameworks tailored to business and developer use cases.
Partner with product, engineering, and business stakeholders to identify and develop AI-first use cases, such as intelligent assistants, code insights, anomaly detection, and generative reporting.
Integrate open-source and commercial LLMs securely into data products using frameworks such as LangChain or similar.
Collaborate closely with engineering teams to drive instrumentation, telemetry capture, and high-quality data pipelines that feed both analytics and GenAI applications.
Provide technical leadership and mentorship to a cross-functional team of data and ML engineers, ensuring adherence to best practices in data and AI engineering.
Lead tool evaluation, architectural PoCs, and decisions on foundational AI/ML tooling (e.g., vector databases, feature stores, orchestration platforms).
Foster platform adoption through enablement resources, shared assets, and developer-facing APIs and SDKs for accessing GenAI capabilities.
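The RAG and vector-search items above reduce, at their core, to embedding and similarity ranking. A deliberately toy sketch using bag-of-words counts in place of a real embedding model (document texts are invented):

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding'; a real RAG pipeline calls an embedding model."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Rank documents by similarity to the query (vector-search stand-in)."""
    q = embed(query)
    scored = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return scored[:k]

docs = [
    "kafka topic retention configuration",
    "spark job memory tuning guide",
    "iceberg table compaction strategy",
]
print(retrieve("how to tune spark memory", docs))
```

In the retrieval-augmented generation flow the top-k documents are then passed to the LLM as grounding context; vector databases replace the linear scan with approximate nearest-neighbor search.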
Requirements:
8+ years of experience in data engineering, software engineering, or MLOps, with hands-on leadership in designing modern data platforms and distributed systems.
Proven experience implementing GenAI applications or infrastructure (e.g., building RAG pipelines, vector search, or custom LLM integrations).
Deep understanding of big data technologies (Kafka, Spark, Iceberg, Presto, Airflow) and cloud-native data stacks (e.g., AWS, GCP, or Azure).
Proficiency in Python and experience with GenAI frameworks like LangChain, LlamaIndex, or similar.
Familiarity with modern ML toolchains and model lifecycle management (e.g., MLflow, SageMaker, Vertex AI).
Experience deploying scalable and secure AI solutions with proper attention to privacy, hallucination risk, cost management, and model drift.
Ability to operate in ambiguity, lead complex projects across functions, and translate abstract goals into deliverable solutions.
Excellent communication and collaboration skills, with a passion for pushing boundaries in both data and AI domains.
This position is open to all candidates.
 
Job ID: 8255562
Confidential Company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Data Infra Engineer
Realize your potential by joining the leading performance-driven advertising company!
As a Data Infra Engineer on the IT production team in our TLV office, you'll play a vital role impacting billions of users, working with petabytes of data at extreme scale, and making a difference. In other words, if you like database performance tuning, doing it at scale, building tools that help you do it, and using AI to make it legendary, this is the place for you - send us your resume.
How you'll make an impact: as a Data Infra Engineer, you'll bring value by:
Advising on optimal database design by analyzing application behavior and business requirements
Designing data flow processes
Working with AI at scale to develop high-impact solutions
Proactively monitoring and optimizing database queries, processes, tables, and resources
Initiating and leading cross-group optimization processes
Writing utilities and designing dashboards that help us maintain, monitor, and optimize the production database environment and data pipeline
Developing automation tools for large-cluster maintenance
Tackling challenges such as scale, visibility, and automation
Being involved in both development and infrastructure projects
Our Tech Stack
Vertica, MySQL, HBase, HDFS, Kafka, Spark, Cassandra, BigQuery, Iceberg, Elasticsearch, Grafana.
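The performance-tuning work described above often follows a simple loop: read the query plan, add an index, re-check the plan. A SQLite illustration of that loop (table and index names are hypothetical; the same workflow applies to MySQL or Vertica with their own EXPLAIN tooling):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE clicks (user_id INTEGER, url TEXT)")
con.executemany("INSERT INTO clicks VALUES (?, ?)",
                [(i % 100, f"/page/{i}") for i in range(1000)])

def plan(sql):
    """Return SQLite's EXPLAIN QUERY PLAN details as one string."""
    return " | ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT COUNT(*) FROM clicks WHERE user_id = 7"
before = plan(query)   # full table scan: every row examined
con.execute("CREATE INDEX idx_clicks_user ON clicks(user_id)")
after = plan(query)    # indexed search: only matching rows touched
print(before)
print(after)
```

The before/after plan diff is exactly the evidence a tuning utility or dashboard would surface for a production cluster.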
Requirements:
To thrive in this role, you'll need:
Experience working as a Data Engineer/DBA
Deep understanding of performance-tuning techniques for relational databases, preferably MySQL/Vertica
Good understanding of infrastructure components: storage, networking, and server hardware
Experience with scripting/programming languages such as Bash, Perl, or Python
Ability to programmatically build tools that help the team with its day-to-day tasks
Self-motivated, eager to learn, and loves challenges
Able to work independently
Excellent English
It would be great if you also have:
In-depth Linux system knowledge
Vertica/HBase/Cassandra/Elasticsearch experience
Hands-on experience with NoSQL databases
Working experience with queuing systems, preferably Kafka
Experience in web-facing environments.
This position is open to all candidates.
 
Job ID: 8274189
16/06/2025
Confidential Company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Data Engineer to join our team and help advance our Apps solution. Our product is designed to provide detailed and accurate insights into Apps Analytics, such as traffic estimation, revenue analysis, and app characterization. The role involves constructing and maintaining scalable data pipelines, developing and integrating machine learning models, and ensuring data integrity and efficiency. You will work closely with a diverse team of scientists, engineers, analysts, and collaborate with business and product stakeholders.
Key Responsibilities:
Develop and implement complex, innovative big data ML algorithms for new features, working in collaboration with data scientists and analysts.
Optimize and maintain end-to-end data pipelines using big data technologies to ensure efficiency and performance.
Monitor data pipelines to ensure data integrity and promptly troubleshoot any issues that arise.
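The pipeline-monitoring responsibility above can start as simply as a row-count drop check between runs. A minimal, hypothetical sketch (the 50% threshold is arbitrary and would be tuned per dataset):

```python
def volume_alert(previous_count, current_count, max_drop=0.5):
    """Flag a pipeline run whose row count dropped suspiciously vs the last run."""
    if previous_count == 0:
        return False  # no baseline to compare against
    drop = 1 - current_count / previous_count
    return drop > max_drop

print(volume_alert(1_000_000, 900_000))  # False: normal fluctuation
print(volume_alert(1_000_000, 300_000))  # True: 70% drop, investigate
```

Real monitoring would add freshness (last-arrival time) and schema checks alongside volume, and route alerts through the on-call path.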
Requirements:
Bachelor's degree in Computer Science or equivalent practical experience.
At least 3 years of experience in data engineering or related roles.
Experience with big data machine learning (must).
Proficiency in Python (must); Scala is a plus.
Experience with Big Data technologies including Spark, EMR and Airflow.
Experience with containerization/orchestration platforms such as Docker and Kubernetes.
Familiarity with distributed computing on the cloud (such as AWS or GCP).
Strong problem-solving skills and ability to learn new technologies quickly.
Being goal-driven and efficient.
Excellent communication skills and ability to work independently and in a team.
This position is open to all candidates.
 
Job ID: 8219237
Confidential Company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Data Engineer
Realize your potential by joining the leading performance-driven advertising company!
As a Data Engineer on the IT production team in our TLV office, you'll play a vital role impacting billions of users, working with petabytes of data at extreme scale, and making a difference. In other words, if you like database performance tuning, doing it at scale, building tools that help you do it, and using AI to make it legendary, this is the place for you - send us your resume.
It would be great if you also have:
In-depth Linux system knowledge
Vertica/HBase/Cassandra/Elasticsearch experience
Hands-on experience with NoSQL databases
Working experience with queuing systems, preferably Kafka
Experience in web-facing environments
How you'll make an impact: as a Data Engineer, you'll bring value by:
Advising on optimal database design by analyzing application behavior and business requirements
Designing data flow processes
Working with AI at scale to develop high-impact solutions
Proactively monitoring and optimizing database queries, processes, tables, and resources
Initiating and leading cross-group optimization processes
Writing utilities and designing dashboards that help us maintain, monitor, and optimize the production database environment and data pipeline
Developing automation tools for large-cluster maintenance
Tackling challenges such as scale, visibility, and automation
Being involved in both development and infrastructure projects
Our Tech Stack
Vertica, MySQL, HBase, HDFS, Kafka, Spark, Cassandra, BigQuery, Iceberg, Elasticsearch, Grafana.
Requirements:
To thrive in this role, you'll need:
Experience working as a Data Engineer/DBA
Deep understanding of performance-tuning techniques for relational databases, preferably MySQL/Vertica
Good understanding of infrastructure components: storage, networking, and server hardware
Experience with scripting/programming languages such as Bash, Perl, or Python
Ability to programmatically build tools that help the team with its day-to-day tasks
Self-motivated, eager to learn, and loves challenges
Able to work independently
Excellent English.
This position is open to all candidates.
 
Job ID: 8274201
Confidential Company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Data Performance Engineer
Realize your potential by joining the leading performance-driven advertising company!
As a Data Performance Engineer on the IT production team in our TLV office, you'll play a vital role impacting billions of users, working with petabytes of data at extreme scale, and making a difference. In other words, if you like database performance tuning, doing it at scale, building tools that help you do it, and using AI to make it legendary, this is the place for you - send us your resume.
How you'll make an impact: as a Data Performance Engineer, you'll bring value by:
Advising on optimal database design by analyzing application behavior and business requirements
Designing data flow processes
Working with AI at scale to develop high-impact solutions
Proactively monitoring and optimizing database queries, processes, tables, and resources
Initiating and leading cross-group optimization processes
Writing utilities and designing dashboards that help us maintain, monitor, and optimize the production database environment and data pipeline
Developing automation tools for large-cluster maintenance
Tackling challenges such as scale, visibility, and automation
Being involved in both development and infrastructure projects
Our Tech Stack
Vertica, MySQL, HBase, HDFS, Kafka, Spark, Cassandra, BigQuery, Iceberg, Elasticsearch, Grafana.
Requirements:
To thrive in this role, you'll need:
Experience working as a Data Engineer/DBA
Deep understanding of performance-tuning techniques for relational databases, preferably MySQL/Vertica
Good understanding of infrastructure components: storage, networking, and server hardware
Experience with scripting/programming languages such as Bash, Perl, or Python
Ability to programmatically build tools that help the team with its day-to-day tasks
Self-motivated, eager to learn, and loves challenges
Able to work independently
Excellent English
It would be great if you also have:
In-depth Linux system knowledge
Vertica/HBase/Cassandra/Elasticsearch experience
Hands-on experience with NoSQL databases
Working experience with queuing systems, preferably Kafka
Experience in web-facing environments.
This position is open to all candidates.
 
Job ID: 8274186