Senior Data Infrastructure Engineer

Location: Tel Aviv-Yafo
Job Type: Full Time
As part of the Data Infrastructure group, you'll help build our company's data platform for our growing stack of products, customers, and microservices.
We ingest data from our operational DBs, telematics devices, and more, working with several data types (both structured and unstructured). Our challenge is to provide tools and infrastructure that empower other teams, leveraging data-mesh concepts.
In this role you'll:
Help build our company's data platform, designing and implementing data solutions for all application requirements in a distributed microservices environment
Build data-platform ingestion layers using streaming ETLs and Change Data Capture
Implement pipelines and scheduling infrastructures
Ensure compliance, data-quality monitoring, and data governance on our company's data platform
Implement large-scale batch and streaming pipelines with data processing frameworks
Collaborate with other Data Engineers, Developers, BI Engineers, ML Engineers, Data Scientists, Analysts and Product managers
Share knowledge with other team members and promote engineering standards.
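As a flavor of the Change Data Capture ingestion this role involves, here is a minimal, self-contained sketch of applying a CDC change feed to a target table. The event shape (`op`, `key`, `after`) is an illustrative assumption, not any specific connector's format.

```python
# Minimal sketch of applying Change Data Capture (CDC) events to a target
# table. The event fields below are hypothetical, not a product's schema.

def apply_cdc_events(table: dict, events: list) -> dict:
    """Apply insert/update/delete change events keyed by primary key."""
    for event in events:
        op, key = event["op"], event["key"]
        if op in ("insert", "update"):
            table[key] = event["after"]   # upsert the new row image
        elif op == "delete":
            table.pop(key, None)          # tolerate deletes for unknown keys
    return table

events = [
    {"op": "insert", "key": 1, "after": {"id": 1, "status": "new"}},
    {"op": "update", "key": 1, "after": {"id": 1, "status": "paid"}},
    {"op": "delete", "key": 2},
]
state = apply_cdc_events({2: {"id": 2, "status": "stale"}}, events)
# state == {1: {"id": 1, "status": "paid"}}
```

Real ingestion layers (e.g., Debezium into Kafka) add ordering, schema evolution, and exactly-once concerns on top of this core upsert/delete logic.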
Requirements:
5+ years of prior experience as a data engineer or data infra engineer
B.S. in Computer Science or equivalent field of study
Knowledge of databases (SQL, NoSQL)
Proven success in building large-scale data infrastructures such as Change Data Capture, and leveraging open source solutions such as Airflow & DBT, building large-scale streaming pipelines, and building customer data platforms
Experience with Python, Pulumi/Terraform, Apache Spark, Snowflake, AWS, K8s, Kafka
Ability to work in an office environment a minimum of 3 days a week
Enthusiasm about learning and adapting to the exciting world of AI; a commitment to exploring this field is a fundamental part of our culture.
This position is open to all candidates.
 
Job #8206372
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced and passionate Staff Data Engineer to join our Data Platform group in TLV as a Tech Lead. As the group's Tech Lead, you'll shape and implement the technical vision and architecture while staying hands-on across three specialized teams: Data Engineering Infra, Machine Learning Platform, and Data Warehouse Engineering, forming the backbone of our company's data ecosystem.
The group's mission is to build a state-of-the-art Data Platform that drives our company toward becoming the most precise and efficient insurance company on the planet. By embracing Data Mesh principles, we create tools that empower teams to own their data while leveraging a robust, self-serve data infrastructure. This approach enables Data Scientists, Analysts, Backend Engineers, and other stakeholders to seamlessly access, analyze, and innovate with reliable, well-modeled, and queryable data, at scale.
In this role you'll:
Technically lead the group by shaping the architecture, guiding design decisions, and ensuring the technical excellence of the Data Platforms three teams
Design and implement data solutions that address both applicative needs and data analysis requirements, creating scalable and efficient access to actionable insights
Drive initiatives in Data Engineering Infra, including building robust ingestion layers, managing streaming ETLs, and guaranteeing data quality, compliance, and platform performance
Develop and maintain the Data Warehouse, integrating data from various sources for optimized querying, analysis, and persistence, supporting informed decision-making
Leverage data modeling and transformations to structure, cleanse, and integrate data, enabling efficient retrieval and strategic insights
Build and enhance the Machine Learning Platform, delivering infrastructure and tools that streamline the work of Data Scientists, enabling them to focus on developing models while benefiting from automation for production deployment, maintenance, and improvements. Support cutting-edge use cases like feature stores, real-time models, point-in-time (PIT) data retrieval, and telematics-based solutions
Collaborate closely with other Staff Engineers across our company to align on cross-organizational initiatives and technical strategies
Work seamlessly with Data Engineers, Data Scientists, Analysts, Backend Engineers, and Product Managers to deliver impactful solutions
Share knowledge, mentor team members, and champion engineering standards and technical excellence across the organization.
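The point-in-time (PIT) data retrieval called out above can be sketched in a few lines: for each training event, take the latest feature value known at or before the event's timestamp, which avoids leaking future information. Data shapes here are illustrative assumptions.

```python
# Toy point-in-time (PIT) feature lookup: given a time-sorted history of
# (timestamp, value) updates, return the value as of a query timestamp.
import bisect

def pit_lookup(history, event_ts):
    """history: sorted list of (ts, value); value in effect at event_ts."""
    i = bisect.bisect_right([ts for ts, _ in history], event_ts)
    return history[i - 1][1] if i else None   # None: no value known yet

history = [(10, 0.2), (20, 0.5), (30, 0.9)]   # feature updates over time
value = pit_lookup(history, 25)               # event at t=25 sees the t=20 value
# value == 0.5
```

Feature stores generalize exactly this lookup across many entities and features, typically as an "as-of" join in Spark or SQL.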
Requirements:
8+ years of experience in data-related roles such as Data Engineer, Data Infrastructure Engineer, BI Engineer, or Machine Learning Platform Engineer, with significant experience in at least two of these areas
A B.Sc. in Computer Science or a related technical field (or equivalent experience)
Extensive expertise in designing and implementing Data Lakes and Data Warehouses, including strong skills in data modeling and building scalable storage solutions
Proven experience in building large-scale data infrastructures, including both batch processing and streaming pipelines
A deep understanding of Machine Learning infrastructure, including tools and frameworks that enable Data Scientists to efficiently develop, deploy, and maintain models in production, an advantage
Proficiency in Python, Pulumi/Terraform, Apache Spark, AWS, Kubernetes (K8s), and Kafka for building scalable, reliable, and high-performing data solutions
Strong knowledge of databases, including SQL (schema design, query optimization) and NoSQL, with a solid understanding of their use cases
Ability to work in an office environment a minimum of 3 days a week
Enthusiasm about learning and adapting to the exciting world of AI; a commitment to exploring this field is a fundamental part of our culture.
This position is open to all candidates.
 
Job #8206357
Posted: 4 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
If you share our love of sports and tech, and have the passion and will to better the sports-tech and data industries - join the team. We are looking for a Data & AI Architect.
Responsibilities:
Build the foundations of modern data architecture, supporting real-time, high-scale (Big Data) sports data pipelines and ML/AI use cases, including Generative AI.
Map the company's data needs and lead the selection and implementation of key technologies across the stack: data lakes (e.g., Iceberg), databases, ETL/ELT tools, orchestrators, data quality and observability frameworks, and statistical/ML tools.
Design and build a cloud-native, cost-efficient, and scalable data infrastructure from scratch, capable of supporting rapid growth, high concurrency, and low-latency SLAs (e.g., 1-second delivery).
Lead design reviews and provide architectural guidance for all data solutions, including data engineering, analytics, and ML/data science workflows.
Set high standards for data quality, integrity, and observability. Design and implement processes and tools to monitor and proactively address issues like missing events, data delays, or integrity failures.
Collaborate cross-functionally with other architects, R&D, product, and innovation teams to ensure alignment between infrastructure, product goals, and real-world constraints.
Mentor engineers and promote best practices around data modeling, storage, streaming, and observability.
Stay up-to-date with industry trends, evaluate emerging data technologies, and lead POCs to assess new tools and frameworks, especially in the domains of Big Data architecture, ML infrastructure, and Generative AI platforms.
Requirements:
At least 10 years of experience in a data engineering role, including 2+ years as a data & AI architect with ownership over company-wide architecture decisions.
Proven experience designing and implementing large-scale, Big Data infrastructure from scratch in a cloud-native environment (GCP preferred).
Excellent proficiency in data modeling, including conceptual, logical, and physical modeling for both analytical and real-time use cases.
Strong hands-on experience with:
Data lake and/or warehouse technologies (e.g., Delta Lake, BigQuery, ClickHouse); Apache Iceberg experience is required
ETL/ELT frameworks and orchestrators (e.g., Airflow, dbt, Dagster)
Real-time streaming technologies (e.g., Kafka, Pub/Sub)
Data observability and quality monitoring solutions
Excellent proficiency in SQL, and in either Python or JavaScript.
Experience designing efficient data extraction and ingestion processes from multiple sources and handling large-scale, high-volume datasets.
Demonstrated ability to build and maintain infrastructure optimized for performance, uptime, and cost, with awareness of AI/ML infrastructure requirements.
Experience working with ML pipelines and AI-enabled data workflows, including support for Generative AI initiatives (e.g., content generation, vector search, model training pipelines) or strong motivation to learn and lead in this space.
Excellent communication skills in English, with the ability to clearly document and explain architectural decisions to technical and non-technical audiences.
Fast learner with strong multitasking abilities; capable of managing several cross-functional initiatives simultaneously.
Willingness to work on-site in Ashkelon once a week.
Advantage:
Experience leading POCs and tool selection processes.
Familiarity with Databricks, LLM pipelines, or vector databases is a strong plus.
This position is open to all candidates.
 
Job #8208147
Posted: 04/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Senior Data Engineer
What You'll Do:
Shape the Future of Data - Join our mission to build the foundational pipelines and tools that power measurement, insights, and decision-making across our product, analytics, and leadership teams.
Develop the Platform Infrastructure - Build the core infrastructure that powers our data ecosystem including the Kafka events-system, DDL management with Terraform, internal data APIs on top of Databricks, and custom admin tools (e.g. Django-based interfaces).
Build Real-time Analytical Applications - Develop internal web applications to provide real-time visibility into platform behavior, operational metrics, and business KPIs, integrating data engineering with user-facing insights.
Solve Meaningful Problems with the Right Tools - Tackle complex data challenges using modern technologies such as Spark, Kafka, Databricks, AWS, Airflow, and Python. Think creatively to make the hard things simple.
Own It End-to-End - Design, build, and scale our high-quality data platform by developing reliable and efficient data pipelines. Take ownership from concept to production and long-term maintenance.
Collaborate Cross-Functionally - Partner closely with backend engineers, data analysts, and data scientists to drive initiatives from both a platform and business perspective. Help translate ideas into robust data solutions.
Optimize for Analytics and Action - Design and deliver datasets in the right shape, location, and format to maximize usability and impact - whether that's through lakehouse tables, real-time streams, or analytics-optimized storage.
You will report to the Data Engineering Team Lead and help shape a culture of technical excellence, ownership, and impact.
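As an illustration of the real-time KPI visibility described above, here is a small, self-contained sketch of the aggregation such an internal application might compute over a Kafka-style event stream: counts per metric in tumbling one-minute windows. Event fields and window size are hypothetical.

```python
# Toy tumbling-window aggregation over (timestamp, metric) events, the
# core of many real-time operational dashboards. Fields are illustrative.
from collections import defaultdict

WINDOW_SECONDS = 60

def tumbling_counts(events):
    """Count events per metric within 60-second tumbling windows."""
    counts = defaultdict(int)
    for ts, metric in events:
        window_start = ts - (ts % WINDOW_SECONDS)   # floor to window boundary
        counts[(window_start, metric)] += 1
    return dict(counts)

events = [(0, "page_view"), (30, "page_view"), (61, "page_view"), (70, "signup")]
kpis = tumbling_counts(events)
# kpis == {(0, "page_view"): 2, (60, "page_view"): 1, (60, "signup"): 1}
```

In production the same shape is usually expressed as a Spark Structured Streaming or Kafka Streams windowed aggregation rather than hand-rolled Python.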
Requirements:
5+ years of hands-on experience as a Data Engineer, building and operating production-grade data systems.
3+ years of experience with Spark, SQL, Python, and orchestration tools like Airflow (or similar).
Degree in Computer Science, Engineering, or a related quantitative field.
Proven track record in designing and implementing high-scale ETL pipelines and real-time or batch data workflows.
Deep understanding of data lakehouse and warehouse architectures, dimensional modeling, and performance optimization.
Strong analytical thinking, debugging, and problem-solving skills in complex environments.
Familiarity with infrastructure as code, CI/CD pipelines, and building data-oriented microservices or APIs.
Enthusiasm for AI-driven developer tools such as Cursor.AI or GitHub Copilot.
Bonus points:
Hands-on experience with our stack: Databricks, Delta Lake, Kafka, Docker, Airflow, Terraform, and AWS.
Experience in building self-serve data platforms and improving developer experience across the organization.
This position is open to all candidates.
 
Job #8204175
Posted: 22/05/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Data Engineer
We're hiring an experienced Data Engineer to join our growing team of analytics experts and help lead the build-out of our data integration and pipeline processes, tools, and platform.
The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.
The right candidate must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products, and will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives.
In this role, you will be responsible for:
Create ELT/Streaming processes and SQL queries to bring data to/from the data warehouse and other data sources.
Establish scalable, efficient, automated processes for large-scale data analyses.
Support the development of performance dashboards & data sets that will generate the right insight.
Work with business owners and partners to build data sets that answer their specific business questions.
Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision-making across the organization.
Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.
Own the data lake pipelines, maintenance, improvements and schema.
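To make the ELT responsibilities above concrete, here is a minimal sketch of a common pre-load transform: deduplicating raw source records by primary key, keeping only the latest version of each row before it lands in the warehouse. Field names are illustrative assumptions.

```python
# Toy ELT-style dedup: keep the most recent record per key, a classic
# step before loading change-prone source data into a warehouse table.
# The "id"/"updated_at" field names are hypothetical.

def latest_by_key(records, key="id", version="updated_at"):
    """Return one record per key, the one with the highest version value."""
    latest = {}
    for rec in records:
        k = rec[key]
        if k not in latest or rec[version] > latest[k][version]:
            latest[k] = rec
    return sorted(latest.values(), key=lambda r: r[key])

raw = [
    {"id": 1, "updated_at": "2025-01-01", "plan": "free"},
    {"id": 1, "updated_at": "2025-03-01", "plan": "pro"},
    {"id": 2, "updated_at": "2025-02-01", "plan": "free"},
]
rows = latest_by_key(raw)
# id 1 keeps its 2025-03-01 "pro" version; id 2 passes through unchanged
```

In SQL the same pattern is usually a `ROW_NUMBER() OVER (PARTITION BY id ORDER BY updated_at DESC)` filter, or a dbt incremental model.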
Requirements:
BS or MS degree in Computer Science or a related technical field.
3-4 years of Python / Java development experience.
3-4 years of experience as a Data Engineer or in a similar role (BI developer).
3-4 years of direct experience with SQL (NoSQL is a plus), data modeling, data warehousing, and building ELT/ETL pipelines - MUST
Experience working with cloud environments (AWS preferred) and big data technologies (EMR, EC2, S3) - DBT is an advantage.
Experience working with Airflow - big advantage
Experience working with Kubernetes - advantage
Experience working with at least one of the big data environments: Snowflake, Vertica, Hadoop (Impala/Hive), Redshift, etc. - MUST
Experience working with Spark - advantage
Exceptional troubleshooting and problem-solving abilities.
Excellent verbal/written communication & data presentation skills, including experience communicating to both business and technical teams.
This position is open to all candidates.
 
Job #8188355
Posted: 25/05/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
About the Role
Appdome is building a new Data department, and we're looking for a skilled Data Engineer to help shape our data infrastructure. If you thrive in fast-paced environments, take ownership, and enjoy working on scalable data solutions, this role is for you. You'll have the opportunity to grow, influence key decisions, and collaborate with security experts and product teams.
What You'll Do:
* Design, build, and maintain scalable data pipelines, ETL processes, and data infrastructure.
* Optimize data storage and retrieval for structured and unstructured data.
* Integrate data solutions into Appdome's products in collaboration with software engineers, security experts, and data scientists.
* Apply DevOps best practices (CI/CD, infrastructure as code, observability) for efficient data processing.
* Work with AWS (EC2, Athena, RDS) and ElasticSearch for data indexing and retrieval.
* Optimize and maintain SQL and NoSQL databases.
* Utilize Docker and Kubernetes for containerization and orchestration.
Requirements:
* B.Sc. in Computer Science, Data Engineering, or a related field.
* 3+ years of hands-on experience in large-scale data infrastructures.
* Strong Python programming, with expertise in PySpark and Pandas.
* Deep knowledge of SQL and NoSQL databases, including performance optimization.
* Experience with ElasticSearch and AWS cloud services.
* Solid understanding of DevOps practices, Big Data tools, Git, and Jenkins.
* Familiarity with microservices and event-driven design.
* Strong problem-solving skills and a proactive, independent mindset.
Advantages:
* Experience with LangChain, ClickHouse, DynamoDB, Redis, and Apache Kafka.
* Knowledge of Metabase for data visualization.
* Experience with RESTful APIs and Node.js.
Talent We Are Looking For:
Independent & Self-Driven - Comfortable building from the ground up.
Growth-Oriented - Eager to develop professionally and take on leadership roles.
Innovative - Passionate about solving complex data challenges.
Collaborative - Strong communicator who works well with cross-functional teams.
Adaptable - Thrives in a fast-paced, dynamic environment with a can-do attitude.
About the Company:
Appdome's mission is to protect every mobile app worldwide and its users. We provide mobile brands with the only patented, centralized, data-driven Mobile Cyber Defense Automation platform. Our platform delivers rapid no-code, no-SDK mobile app security, anti-fraud, anti-malware, anti-cheat, and anti-bot implementations, configuration-as-code ease, Threat-Events threat-aware UI/UX control, ThreatScope Mobile XDR, and Certified Secure DevSecOps Certification in one integrated system. With Appdome, mobile developers, cyber, and fraud teams can accelerate delivery, guarantee compliance, and leverage automation to build, test, release, and monitor the full range of cyber, anti-fraud, and other defenses needed in mobile apps from within mobile DevOps and CI/CD pipelines. Leading financial, healthcare, m-commerce, consumer, and B2B brands use Appdome to upgrade mobile DevSecOps and protect Android & iOS apps, mobile customers, and businesses globally. Today, Appdome's customers use our platform to secure over 50,000 mobile apps, with protection for over 1 billion mobile end users projected.
Appdome is an Equal Opportunity Employer. We are committed to diversity, equity, and inclusion in our workplace. We do not discriminate based on race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, veteran status, or any other characteristic protected by law. All qualified applicants will receive consideration for employment without regard to any of these characteristics.
This position is open to all candidates.
 
Job #8118270
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Data Infrastructure Engineer
Tel Aviv
We are using technology to transform transportation around the world. From changing a single person's daily commute to reducing humanity's collective environmental footprint, we've got huge goals.
As a Data Engineer, you will join a talented team of engineers in charge of our core data infrastructure, while working closely with other engineering and data teams. You will lead the development of complex solutions, work on various cutting-edge technologies, and be a critical part of our mission to shape the future of transportation, affecting tens of thousands of riders daily worldwide.
What You'll Do:
Design, build, and maintain scalable data platforms, pipelines, and solutions for data ingestion, processing and orchestration, leveraging Kubernetes, Airflow, Kafka and more.
Develop and manage cloud-based data solutions on AWS utilizing Infrastructure as Code (IaC) frameworks like Terraform or CloudFormation
Manage Snowflake environments, including database architecture, resource provisioning, and access control.
Ensure system reliability and observability with monitoring, logging, and alerting tools like Prometheus and CloudWatch.
Continuously optimize storage, compute, and data processing costs and performance.
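The reliability and observability responsibilities above often boil down to checks like the one sketched below: flag any table whose most recent load is older than an allowed lag, then wire the result into an alerting tool such as Prometheus or CloudWatch. Table names and thresholds here are hypothetical.

```python
# Minimal data-freshness check of the kind that backs pipeline alerting.
# Thresholds, table names, and timestamps are illustrative assumptions.
from datetime import datetime, timedelta, timezone

def freshness_alerts(last_loaded, max_lag=timedelta(hours=2), now=None):
    """Return the tables whose latest load is older than max_lag."""
    now = now or datetime.now(timezone.utc)
    return sorted(t for t, ts in last_loaded.items() if now - ts > max_lag)

now = datetime(2025, 6, 1, 12, 0, tzinfo=timezone.utc)
last_loaded = {
    "rides": datetime(2025, 6, 1, 11, 30, tzinfo=timezone.utc),   # fresh
    "riders": datetime(2025, 6, 1, 8, 0, tzinfo=timezone.utc),    # stale
}
stale = freshness_alerts(last_loaded, now=now)
# stale == ["riders"]
```

In practice the `last_loaded` map would come from warehouse metadata (e.g., Snowflake's load history) and the result would be exported as a metric rather than returned in-process.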
Requirements:
B.Sc. in Computer Science, Engineering, or similar.
Minimum of 4 years of professional experience in data engineering
Minimum of 4 years of professional experience in backend or software development
Independent, responsible, ready to face challenges head-on
Passionate about data engineering and constantly expanding your knowledge
Excellent communication and strong analytical and problem-solving skills
Proficient in Python
Experience with data streaming, Kafka, Airflow, cloud platforms (preferably AWS), K8s, Terraform, or equivalents.
Have a solid background working with data warehousing technologies, such as Snowflake, Databricks, Redshift, BigQuery, etc.
This position is open to all candidates.
 
Job #8200225
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Staff Algo Data Engineer
Realize your potential by joining the leading performance-driven advertising company!
As a Staff Algo Data Engineer in the Infra group, you'll play a vital role in developing, enhancing, and maintaining highly scalable Machine-Learning infrastructures and tools.
About Algo platform:
The objective of the algo platform group is to own the existing algo platform (including health, stability, productivity and enablement), to facilitate and be involved in new platform experimentation within the algo craft and lead the platformization of the parts which should graduate into production scale. This includes support of ongoing ML projects while ensuring smooth operations and infrastructure reliability, owning a full set of capabilities, design and planning, implementation and production care.
The group has deep ties with both the algo craft as well as the infra group. The group reports to the infra department and has a dotted line reporting to the algo craft leadership.
The group serves as the professional authority when it comes to ML engineering and ML ops, serves as a focal point in a multidisciplinary team of algorithm researchers, product managers, and engineers and works with the most senior talent within the algo craft in order to achieve ML excellence.
How you'll make an impact:
As a Staff Algo Data Engineer, you'll bring value by:
Develop, enhance, and maintain highly scalable Machine-Learning infrastructures and tools, including CI/CD, monitoring, and alerting
Take end-to-end ownership: design, develop, deploy, measure, and maintain our machine learning platform, ensuring high availability, high scalability, and efficient resource utilization
Identify and evaluate new technologies to improve performance, maintainability, and reliability of our machine learning systems
Work in tandem with the engineering-focused and algorithm-focused teams in order to improve our platform and optimize performance
Optimize machine learning systems to scale and utilize modern compute environments (e.g. distributed clusters, CPU and GPU) and continuously seek potential optimization opportunities.
Build and maintain tools for automation, deployment, monitoring, and operations.
Troubleshoot issues in our development, production and test environments
Directly influence the way billions of people discover the internet
Our tech stack:
Java, Python, TensorFlow, Spark, Kafka, Cassandra, HDFS, vespa.ai, ElasticSearch, AirFlow, BigQuery, Google Cloud Platform, Kubernetes, Docker, git and Jenkins.
Requirements:
To thrive in this role, youll need:
Experience developing large-scale systems. Experience with filesystems, server architectures, distributed systems, SQL and NoSQL. Experience with Spark and Airflow / other orchestration platforms is a big plus.
Highly skilled in software engineering methods; 5+ years of experience.
Passion for ML engineering and for creating and improving platforms
Experience with designing and supporting ML pipelines and models in production environment
Excellent coding skills in Java & Python
Experience with TensorFlow a big plus
Possess strong problem solving and critical thinking skills
BSc in Computer Science or related field.
Proven ability to work effectively and independently across multiple teams and beyond organizational boundaries
Deep understanding of strong Computer Science fundamentals: object-oriented design, data structures, systems and applications programming, and multi-threaded programming
Strong communication skills to be able to present insights and ideas, and excellent English, required to communicate with our global teams.
Bonus points if you have:
Experience in leading Algorithms projects or teams.
Experience in developing models using deep learning techniques and tools
Experience in developing software within a distributed computation framework.
This position is open to all candidates.
 
Job #8205385
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Data Engineer for the Insight Team to join our Data Group and a new team responsible for developing innovative features based on multiple layers of data. These features will power recommendation systems, insights, and more. This role involves close collaboration with the core teams within the Data Group, working on diverse data pipelines that tackle challenges related to scale and algorithmic optimization, all aimed at enhancing the data experience for our customers.
Where does this role fit in our vision?
Every role at our company is designed with a clear purpose: to integrate collective efforts into our shared success, functioning as pieces of a collective brain. Data is everything; it's at the heart of everything we do. The Data Group is responsible for shaping the experience of hundreds of thousands of users who rely on our data daily.
The Insight Team monitors user behavior across our products, leveraging millions of signals and time-series entities to power a personalized recommendation and ranking system. This enables users to access more unique and tailored data, optimizing their experience while maintaining a strong focus on the key KPIs that drive the success of our Data Group.
What will you be responsible for?
Develop and implement robust, scalable data pipelines and integration solutions within our Databricks-based environment.
Develop models and implement algorithms, with a strong emphasis on delivering high-quality results.
Leverage technologies like Spark, Kafka, and Airflow to tackle complex data challenges and enhance business operations.
Design innovative data solutions that support millions of data points, ensuring high performance and reliability.
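The personalized ranking described above can be illustrated with a tiny recency-weighted scorer: each behavioral signal contributes an exponentially decayed weight, so many recent signals outrank a few old ones. The scoring formula and field names are assumptions for illustration, not the team's actual model.

```python
# Toy recency-weighted ranking over behavioral signals. Each signal decays
# with a configurable half-life (in days). Names are hypothetical.
import math

def rank_items(signals, half_life=7.0):
    """signals: list of (item, age_days); return items by decayed score, best first."""
    scores = {}
    for item, age_days in signals:
        weight = math.exp(-age_days * math.log(2) / half_life)  # 0.5 at half_life
        scores[item] = scores.get(item, 0.0) + weight
    return sorted(scores, key=scores.get, reverse=True)

signals = [("report_a", 0.0), ("report_b", 0.0), ("report_b", 1.0), ("report_a", 30.0)]
ranking = rank_items(signals)
# report_b outranks report_a: two recent signals beat one recent plus one stale one
```

Production systems replace this scalar score with learned models over many features, but the time-decay idea carries over directly to time-series signals.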
Requirements:
3+ years of experience in data engineering, building and optimizing scalable data pipelines.
5+ years of experience as a software developer, preferably in Python.
Algorithmic experience, including developing and optimizing machine learning models and implementing advanced data algorithms.
Experience working with cloud ecosystems, preferably AWS (S3, Glue, EMR, Redshift, Athena) or comparable cloud environments (Azure/GCP).
Expertise in extracting, ingesting, and transforming large datasets efficiently.
Deep knowledge of big data platforms, such as Spark, Databricks, Elasticsearch, and Kafka for real-time data streaming.
(Nice-to-have) Hands-on experience working with Vector Databases and embedding techniques, with a focus on search, recommendations, and personalization.
AI-savvy: comfortable working with AI tools and staying ahead of emerging trends.
This position is open to all candidates.
 
Job #8212720
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Senior Algo Data Engineer
Realize your potential by joining the leading performance-driven advertising company!
As a Senior Algo Data Engineer in the Infra group, you'll play a vital role in developing, enhancing, and maintaining highly scalable Machine-Learning infrastructures and tools.
About Algo platform:
The objective of the algo platform group is to own the existing algo platform (including health, stability, productivity and enablement), to facilitate and be involved in new platform experimentation within the algo craft and lead the platformization of the parts which should graduate into production scale. This includes support of ongoing ML projects while ensuring smooth operations and infrastructure reliability, owning a full set of capabilities, design and planning, implementation and production care.
The group has deep ties with both the algo craft as well as the infra group. The group reports to the infra department and has a dotted line reporting to the algo craft leadership.
The group serves as the professional authority when it comes to ML engineering and ML ops, serves as a focal point in a multidisciplinary team of algorithm researchers, product managers, and engineers and works with the most senior talent within the algo craft in order to achieve ML excellence.
How you'll make an impact:
As a Senior Algo Data Engineer, you'll bring value by:
Develop, enhance, and maintain highly scalable Machine-Learning infrastructures and tools, including CI/CD, monitoring, and alerting
Take end-to-end ownership: design, develop, deploy, measure, and maintain our machine learning platform, ensuring high availability, high scalability, and efficient resource utilization
Identify and evaluate new technologies to improve performance, maintainability, and reliability of our machine learning systems
Work in tandem with the engineering-focused and algorithm-focused teams in order to improve our platform and optimize performance
Optimize machine learning systems to scale and utilize modern compute environments (e.g. distributed clusters, CPU and GPU) and continuously seek potential optimization opportunities.
Build and maintain tools for automation, deployment, monitoring, and operations.
Troubleshoot issues in our development, production and test environments
Directly influence the way billions of people discover the internet
Our tech stack:
Java, Python, TensorFlow, Spark, Kafka, Cassandra, HDFS, vespa.ai, Elasticsearch, Airflow, BigQuery, Google Cloud Platform, Kubernetes, Docker, Git, and Jenkins.
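The monitoring-and-alerting responsibility above usually boils down to evaluating metric streams against thresholds. As a minimal, hedged sketch (all names here are illustrative, not from any actual codebase), a rolling-window threshold check in Python might look like:

```python
from collections import deque

def make_alert_checker(threshold: float, window: int = 5):
    """Return a callable that flags an alert when the rolling mean
    of the last `window` metric samples exceeds `threshold`."""
    samples = deque(maxlen=window)  # keeps only the most recent samples

    def check(value: float) -> bool:
        samples.append(value)
        return sum(samples) / len(samples) > threshold

    return check

# Hypothetical usage: alert when mean p99 latency over 3 samples exceeds 250 ms
check = make_alert_checker(threshold=250.0, window=3)
```

A windowed mean rather than a single-sample check is a common design choice to avoid paging on transient spikes.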
Requirements:
Experience developing large-scale systems, including filesystems, server architectures, distributed systems, SQL, and NoSQL. Experience with Spark and Airflow (or other orchestration platforms) is a big plus.
Highly skilled in software-engineering methods; 5+ years of experience.
Passion for ML engineering and for creating and improving platforms
Experience designing and supporting ML pipelines and models in production environments
Excellent coding skills in Java & Python
Experience with TensorFlow a big plus
Strong problem-solving and critical-thinking skills
BSc in Computer Science or related field.
Proven ability to work effectively and independently across multiple teams and beyond organizational boundaries
Deep understanding of Computer Science fundamentals: object-oriented design, data structures, systems and applications programming, and multithreaded programming
Strong communication skills for presenting insights and ideas, and excellent English for communicating with our global teams.
Bonus points if you have:
Experience in leading Algorithms projects or teams.
Experience in developing models using deep learning techniques and tools
Experience in developing software within a distributed computation framework.
This position is open to all candidates.
 
Job ID: 8205305
Posted 2 days ago
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Engineer to join our Research Engineering team. The Research Engineering team's mission is to enable research. We succeed when our research analysts, data scientists, and engineers have access to the data, tools, knowledge, and support to do their jobs painlessly and efficiently.

We build the foundational infrastructure and key applications that allow internal users to derive actionable insights from vast data. Both enabling and running Big Data processing at the scale of hundreds of TB is a core part of what our team does, but we're not just a Big Data team. We work across the tech stack in support of our mission to enable data-driven research and investigation, from back-end to front-end to data stores.

What you'll be doing:

Design, build, and maintain foundational infrastructure and key applications that allow people to effectively leverage proprietary and public data
Use modern technologies daily, such as: Snowflake, Databricks, Apache Spark, Elasticsearch, AWS, Kafka, and more
Expand and optimize our data warehouse and lakehouse, weighing tradeoffs between performance, cost, and user experience, across continually larger and more complex datasets
Model data and define storage strategy to ensure that data is intuitive for researchers to find, and easy and fast for them to access
Enhance analysts' most-used web application, which gives them in-depth insight into every decision we make
Collaborate directly with colleagues from various disciplines: Analysts, Data Scientists, and other Engineers (all of whom write production code!)
Self-manage project planning, milestones, designs, and estimations
Hold yourself and others to a high standard when working with production systems. We take as much pride in our tests, monitoring, and alerting as we do in our systems themselves.
Debug complex problems across the whole stack
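The "model data and define storage strategy" work above often reduces to a deterministic mapping from record attributes to partition paths, so engines like Spark or Snowflake can prune partitions at query time. A minimal sketch (the dataset name, keys, and Hive-style layout are illustrative assumptions, not a specific system's convention):

```python
from datetime import date

def partition_path(dataset: str, event_date: date, region: str) -> str:
    """Build a Hive-style partition path (key=value segments),
    which most query engines can use for partition pruning."""
    return (
        f"{dataset}/"
        f"event_date={event_date.isoformat()}/"  # ISO dates sort lexicographically
        f"region={region}"
    )

# Hypothetical usage: route a record into its partition
path = partition_path("clicks", date(2024, 5, 17), "eu")
```

Partitioning on a low-cardinality key like date keeps file counts manageable while letting date-range queries skip most of the data.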
Requirements:
5+ years of proven experience with designing, implementing, and maintaining large-scale backend production systems
Experience building and maintaining data-intensive applications
Experience with data processing technologies, e.g. Spark, Flink, EMR, Iceberg
Experience with data warehousing frameworks, e.g. Snowflake, Hive, Glue, Databricks Unity Catalog
Experience working in Python
Experience with AWS or other public clouds
Professional proficiency in English
This position is open to all candidates.
 
Job ID: 8212533