Senior Backend Developer, Data Platform IL

Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Backend Developer.
What You'll Do:
Design and implement end-to-end features and services within a multi-service architecture.
Build and maintain a scalable data platform, incorporating advanced business logic and developing asynchronous, data-oriented, and scalable services.
Continuously improve and enhance the current data platform to ensure reliability, scalability, and performance.
Enhance and extend the functionality of existing systems.
Requirements:
5+ years as a Data Engineer or Backend Developer.
Proficient in Python; advanced skills are a plus.
Experience designing scalable data platforms, preferably in SaaS.
Expertise in building, optimizing, and maintaining large-scale production systems.
Hands-on with ETL/ELT pipelines, focusing on monitoring, supportability, and resource efficiency.
Strong knowledge of orchestration tools (e.g., Apache Airflow, Dagster); a minimal sketch follows this list.
Clear communication of technical processes to non-technical stakeholders.
Proven ability to design scalable, cloud-based systems in SaaS environments.
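For readers who want a concrete picture of the orchestration work described above, here is a minimal, hypothetical Airflow DAG in Python. The DAG id, schedule, task names, and helper functions are illustrative assumptions, not details from this posting.

```python
# Hypothetical sketch: a minimal Airflow DAG wiring an extract -> transform -> load
# flow. All ids, helpers, and data are placeholders for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull a batch of raw records from a source system.
    return [{"id": 1, "value": 42}]


def transform(ti, **context):
    # Placeholder: apply business logic to the extracted batch.
    rows = ti.xcom_pull(task_ids="extract")
    return [{**r, "value_doubled": r["value"] * 2} for r in rows]


def load(ti, **context):
    # Placeholder: persist the transformed batch to the data platform.
    rows = ti.xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows")


with DAG(
    dag_id="example_elt_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```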
This position is open to all candidates.
 
Job ID: 8504305
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We are looking for a Senior Data Engineer to join our Platform group in the Data Infrastructure team.
You'll work hands-on to design and deliver data pipelines, distributed storage, and streaming services that keep our data platform performant and reliable. As a senior individual contributor, you will lead complex projects within the team, raise the bar on engineering best practices, and mentor mid-level engineers, while collaborating closely with product, DevOps, and analytics stakeholders.
About the Platform group
The Platform Group accelerates our productivity by providing developers with tools, frameworks, and infrastructure services. We design, build, and maintain critical production systems, ensuring our platform can scale reliably. We also introduce new engineering capabilities to enhance our development process. As part of this group, you'll help shape the technical foundation that supports our entire engineering team.
Code & ship production-grade services, pipelines and data models that meet performance, reliability and security goals
Lead design and delivery of team-level projects - from RFC through rollout and operational hand-off
Improve system observability, testing and incident response processes for the data stack
Partner with Staff Engineers and Tech Leads on architecture reviews and platform-wide standards
Mentor junior and mid-level engineers, fostering a culture of quality, ownership and continuous improvement
Stay current with evolving data-engineering tools and bring pragmatic innovations into the team.
Requirements:
5+ years of hands-on experience in backend or data engineering, including 2+ years at a senior level delivering production systems
Strong coding skills in Python, Kotlin, Java or Scala with emphasis on clean, testable, production-ready code
Proven track record designing, building and operating distributed data pipelines and storage (batch or streaming)
Deep experience with relational databases (PostgreSQL preferred) and working knowledge of at least one NoSQL or columnar/analytical store (e.g. SingleStore, ClickHouse, Redshift, BigQuery)
Solid hands-on experience with event-streaming platforms such as Apache Kafka (see the sketch after this list)
Familiarity with data-orchestration frameworks such as Airflow
Comfortable with modern CI/CD, observability and infrastructure-as-code practices in a cloud environment (AWS, GCP or Azure)
Ability to break down complex problems, communicate trade-offs clearly, and collaborate effectively with engineers and product partners
Bonus Skills
Experience building data governance or security/compliance-aware data platforms
Familiarity with Kubernetes, Docker, and infrastructure-as-code tools
Experience with data quality frameworks, lineage, or metadata tooling.
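As a concrete illustration of the event-streaming requirement, here is a minimal, hypothetical Kafka consumer in Python using the kafka-python client. The topic, broker address, and group id are placeholders, not details from this posting.

```python
# Hypothetical sketch: consuming an event stream with kafka-python.
# Topic name, broker address, and group id are placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "user-events",                       # placeholder topic
    bootstrap_servers="localhost:9092",  # placeholder broker
    group_id="data-platform-sketch",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Placeholder business logic: route or persist the event downstream.
    print(message.topic, message.partition, message.offset, event.get("type"))
```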
This position is open to all candidates.
 
Job ID: 8499899
Posted 2 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We act as the central nervous system for engineering, enabling platform teams to unify their stack and expose it as a governed layer through golden paths for developers and AI agents.
By combining rich engineering context, workflows, and actions, we help organizations transition from manual processes to autonomous, AI-assisted engineering workflows while maintaining control and accountability.
As a product-led company, we believe in building world-class platforms that fundamentally shape how modern engineering organizations operate.
What you'll do:
Lead the design and development of scalable and efficient data lake solutions that handle high-volume data from a large number of sources, both pre-determined and custom.
Utilize advanced data modeling techniques to create robust data structures supporting reporting and analytics needs.
Implement ETL/ELT processes to extract, transform, and load data from various sources into a data lake that serves our company's users.
Identify and address performance bottlenecks within our data warehouse, optimize queries and processes, and enhance data retrieval efficiency.
Collaborate with cross-functional teams (product, analytics, and R&D) to enhance our company's data solutions.
Who you'll work with:
You'll be joining a collaborative and dynamic team of talented and experienced developers, where creativity and innovation thrive.
You'll closely collaborate with our dedicated Product Managers and Designers, working hand in hand to bring our developer portal product to life.
Additionally, you will have the opportunity to work closely with our customers and engage with our product community. Your insights and interactions with them will play an important role in ensuring we deliver the best product possible.
Together, we'll continue to empower platform engineers and developers worldwide, providing them with the tools they need to create seamless and robust developer portals. Join us in our mission to revolutionize the developer experience!
Requirements:
5+ years of experience in a Data Engineering role
Expertise in building scalable pipelines and ETL/ELT processes, with proven experience with data modeling
Expert-level proficiency in SQL and experience with large-scale datasets
Strong experience with Snowflake
Strong experience with cloud data platforms and storage solutions such as AWS S3 or Redshift
Hands-on experience with ETL/ELT tools and orchestration frameworks such as Apache Airflow and dbt (see the sketch after this list)
Experience with Python and software development
Strong analytical and storytelling capabilities, with a proven ability to translate data into actionable insights for business users
Collaborative mindset with experience working cross-functionally with data engineers and product managers
Excellent communication and documentation skills, including the ability to write clear data definitions, dashboard guides, and metric logic
Advantages:
Experience in NodeJs + Typescript
Experience with streaming data technologies such as Kafka or Kinesis
Familiarity with containerization tools such as Docker and Kubernetes
Knowledge of data governance and data security practices.
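To illustrate the kind of ELT step these requirements describe, here is a small, hypothetical Python sketch that reads a Parquet extract from S3, applies a trivial pandas transformation, and loads the result into Snowflake. The bucket, columns, table, and connection parameters are invented, and the target table is assumed to exist.

```python
# Hypothetical ELT sketch: S3 Parquet -> pandas transform -> Snowflake table.
# All names and credentials are placeholders; real pipelines would add error
# handling, incremental logic, and proper secrets management.
import os

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Extract: read a raw Parquet file from S3 (pandas uses s3fs under the hood).
raw = pd.read_parquet("s3://example-bucket/raw/events/2025-01-01.parquet")

# Transform: a trivial placeholder aggregation.
daily = (
    raw.assign(event_date=pd.to_datetime(raw["event_ts"]).dt.date)
       .groupby(["event_date", "event_type"], as_index=False)
       .size()
       .rename(columns={"size": "event_count"})
)

# Load: append into an existing Snowflake staging table.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="STAGING",
)
success, _, nrows, _ = write_pandas(conn, daily, "DAILY_EVENT_COUNTS")
print(f"loaded={success} rows={nrows}")
conn.close()
```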
This position is open to all candidates.
 
Job ID: 8533929
Posted: 20/01/2026
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a skilled Senior Backend Engineer to join our team and help us build Cognition, our multi-agent AI platform. The ideal candidate will have strong backend engineering expertise and enthusiasm for working with cutting-edge AI technologies.
About the Role:
You'll be responsible for designing and implementing backend services that integrate AI capabilities including large language models (LLMs), retrieval-augmented generation (RAG), and conversational interfaces into our consumer-facing products. This is an opportunity to expand your expertise into AI while leveraging your strong engineering foundation.
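To illustrate the flow described above (the posting's own stack is TypeScript/Node.js; Python is used here purely for brevity), this is a hypothetical sketch of a minimal RAG loop with the OpenAI Python SDK: embed a tiny in-memory corpus, retrieve the closest passage by cosine similarity, and ground a chat completion on it. Model names, the corpus, and the prompt are invented.

```python
# Hypothetical RAG sketch: embed -> retrieve -> grounded completion.
import math

from openai import OpenAI  # assumes OPENAI_API_KEY is set in the environment

client = OpenAI()
corpus = [
    "Our refund policy allows returns within 30 days.",
    "Premium accounts include priority support.",
]

def embed(texts):
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return [item.embedding for item in resp.data]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

doc_vectors = embed(corpus)
question = "How long do customers have to return a product?"
q_vec = embed([question])[0]

# Retrieve the most similar passage and ground the answer on it.
best_doc = max(zip(corpus, doc_vectors), key=lambda pair: cosine(q_vec, pair[1]))[0]
answer = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": f"Answer using only this context: {best_doc}"},
        {"role": "user", "content": question},
    ],
)
print(answer.choices[0].message.content)
```

In production the in-memory corpus and brute-force similarity would typically be replaced by a vector database, as the requirements below also hint.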
What You'll Do:
Design and develop scalable, high-performance backend services and APIs
Implement integrations with AI platforms and services
Work with AI specialists to translate AI capabilities into production-ready features
Ensure high reliability, performance, and security of AI-powered systems
Collaborate with product and engineering teams to gather requirements and deliver solutions.
Requirements:
Proven Backend Expertise: 5+ years of professional experience, with a strong foundation in building and maintaining robust backend systems for large-scale consumer or enterprise products.
Technical Mastery: Advanced proficiency in at least one modern backend language (TypeScript and Node.js), with hands-on experience developing production-grade APIs and microservices. Experience with Lambda functions is a big plus.
Scalable Systems Builder: Demonstrated experience architecting and delivering high-availability services and distributed systems, leveraging patterns for reliability, performance, and security.
Cloud-Native Engineering: Deep familiarity with cloud platforms (AWS, GCP, or Azure), including hands-on experience with serverless technologies (e.g., AWS Lambda), infrastructure-as-code, and automated deployment pipelines.
Data Fluency: Solid experience with both SQL and NoSQL databases; knowledge of vector databases and embedding/search technologies is a big plus.
AI Platform Integration: Experience integrating with modern AI services (OpenAI, Hugging Face, etc.), and enthusiasm for collaborating with ML/AI specialists to bring LLMs, RAG, and conversational AI capabilities into production.
Results-Oriented Delivery: A track record of shipping consumer-facing features at scale, collaborating closely with both product and engineering teams.
This position is open to all candidates.
 
Job ID: 8510188
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
As a Staff Algo Data Engineer in the Infra group, you'll play a vital role in developing, enhancing, and maintaining highly scalable machine-learning infrastructure and tools.
As a Staff Algo Data Engineer, you will:
Develop, enhance and maintain highly scalable Machine-Learning infrastructures and tools, including CI/CD, monitoring and alerting and more
Have end to end ownership: Design, develop, deploy, measure and maintain our machine learning platform, ensuring high availability, high scalability and efficient resource utilization
Identify and evaluate new technologies to improve performance, maintainability, and reliability of our machine learning systems
Work in tandem with the engineering-focused and algorithm-focused teams in order to improve our platform and optimize performance
Optimize machine learning systems to scale and utilize modern compute environments (e.g. distributed clusters, CPU and GPU) and continuously seek potential optimization opportunities.
Build and maintain tools for automation, deployment, monitoring, and operations.
Troubleshoot issues in our development, production and test environments
Directly influence the way billions of people discover the internet
Our tech stack:
Java, Python, TensorFlow, Spark, Kafka, Cassandra, HDFS, vespa.ai, ElasticSearch, AirFlow, BigQuery, Google Cloud Platform, Kubernetes, Docker, git and Jenkins.
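As a small illustration of the kind of batch job this stack supports, here is a hypothetical PySpark sketch that aggregates raw interaction events into per-user features. Paths, column names, and the aggregation itself are invented for illustration.

```python
# Hypothetical sketch: a PySpark batch job producing simple per-user features.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("feature-aggregation-sketch").getOrCreate()

events = spark.read.parquet("gs://example-bucket/raw/events/")  # placeholder path

features = (
    events
    .where(F.col("event_type").isin("click", "view"))
    .groupBy("user_id")
    .agg(
        F.count("*").alias("total_events"),
        F.countDistinct("item_id").alias("distinct_items"),
        F.max("event_ts").alias("last_seen"),
    )
)

features.write.mode("overwrite").parquet("gs://example-bucket/features/user_daily/")
spark.stop()
```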
Requirements:
Experience developing large scale systems. Experience with filesystems, server architectures, distributed systems, SQL and No-SQL. Experience with Spark and Airflow / other orchestration platforms is a big plus.
Highly skilled in software engineering methods, with 5+ years of experience.
Passion for ML engineering and for creating and improving platforms
Experience designing and supporting ML pipelines and models in production environments
Excellent coding skills - in Java & Python
Experience with TensorFlow - a big plus
Possess strong problem solving and critical thinking skills
BSc in Computer Science or related field.
Proven ability to work effectively and independently across multiple teams and beyond organizational boundaries
Deep understanding of Computer Science fundamentals: object-oriented design, data structures, systems and applications programming, and multithreaded programming
Strong communication skills to present insights and ideas, and excellent English for communicating with our global teams.
Bonus points if you have:
Experience in leading Algorithms projects or teams.
Experience in developing models using deep learning techniques and tools
Experience in developing software within a distributed computation framework.
This position is open to all candidates.
 
Job ID: 8498336
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
As a Senior Algo Data Engineer in the Infra group, you'll play a vital role in developing, enhancing, and maintaining highly scalable machine-learning infrastructure and tools.
How you'll make an impact:
As a Senior Algo Data Engineer, you will:
Develop, enhance and maintain highly scalable Machine-Learning infrastructures and tools, including CI/CD, monitoring and alerting and more
Have end to end ownership: Design, develop, deploy, measure and maintain our machine learning platform, ensuring high availability, high scalability and efficient resource utilization
Identify and evaluate new technologies to improve performance, maintainability, and reliability of our machine learning systems
Work in tandem with the engineering-focused and algorithm-focused teams in order to improve our platform and optimize performance
Optimize machine learning systems to scale and utilize modern compute environments (e.g. distributed clusters, CPU and GPU) and continuously seek potential optimization opportunities.
Build and maintain tools for automation, deployment, monitoring, and operations.
Troubleshoot issues in our development, production and test environments
Directly influence the way billions of people discover the internet.
Requirements:
Experience developing large scale systems. Experience with filesystems, server architectures, distributed systems, SQL and No-SQL. Experience with Spark and Airflow / other orchestration platforms is a big plus.
Highly skilled in software engineering methods, with 5+ years of experience.
Passion for ML engineering and for creating and improving platforms
Experience designing and supporting ML pipelines and models in production environments
Excellent coding skills - in Java & Python
Experience with TensorFlow - a big plus
Possess strong problem solving and critical thinking skills
BSc in Computer Science or related field.
Proven ability to work effectively and independently across multiple teams and beyond organizational boundaries
Deep understanding of Computer Science fundamentals: object-oriented design, data structures, systems and applications programming, and multithreaded programming
Strong communication skills to present insights and ideas, and excellent English for communicating with our global teams.
Bonus points if you have:
Experience in leading Algorithms projects or teams.
Experience in developing models using deep learning techniques and tools
Experience in developing software within a distributed computation framework.
This position is open to all candidates.
 
Job ID: 8498376
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a DevOps Engineer (Data Platform Group).
Main responsibilities:
Data Architecture Direction: Provide strategic direction for our data architecture, selecting the appropriate components for various tasks. Collaborate on requirements and make final decisions on system design and implementation.
Project Management: Manage end-to-end execution of high-performance, large-scale data-driven projects, including design, implementation, and ongoing maintenance.
Cost Optimization: Monitor and optimize cloud costs associated with data infrastructure and processes.
Efficiency and Reliability: Design and build monitoring tools to ensure the efficiency, reliability, and performance of data processes and systems (see the sketch after the stack list below).
DevOps Integration: Implement and manage DevOps practices to streamline development and operations, focusing on infrastructure automation, continuous integration/continuous deployment (CI/CD) pipelines, containerization, orchestration, and infrastructure as code. Ensure scalable, reliable, and efficient deployment processes.
Our stack: Azure, GCP, Kubernetes, ArgoCD, Jenkins, Databricks, Snowflake, Airflow, RDBMS, Spark, Kafka, Micro-Services, bash, Python, SQL.
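As an illustration of the monitoring work mentioned above, here is a small, hypothetical Python sketch using the official Kubernetes client to flag unhealthy pods in a data namespace. The namespace name and output are assumptions; a real tool would feed an alerting system rather than print.

```python
# Hypothetical sketch: flag pods that are not Running/Succeeded in a data namespace.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside the cluster
v1 = client.CoreV1Api()

HEALTHY_PHASES = {"Running", "Succeeded"}

pods = v1.list_namespaced_pod(namespace="data-platform")  # placeholder namespace
for pod in pods.items:
    phase = pod.status.phase
    if phase not in HEALTHY_PHASES:
        restarts = sum(cs.restart_count for cs in (pod.status.container_statuses or []))
        print(f"unhealthy pod={pod.metadata.name} phase={phase} restarts={restarts}")
```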
Requirements:
5+ Years of Experience: Demonstrated experience as a DevOps professional, with a strong focus on big data environments, or Data Engineer with strong DevOps skills.
Data Components Management: Experience managing and designing data infrastructure, such as Snowflake, PostgreSQL, Kafka, Aerospike, and object storage.
DevOps Expertise: Proven experience creating, establishing, and managing big data tools, including automation tasks. Extensive knowledge of DevOps concepts and tools, including Docker, Kubernetes, Terraform, ArgoCD, Linux OS, Networking, Load Balancing, Nginx, etc.
Programming Skills: Proficiency in Python and Object-Oriented Programming (OOP), with an emphasis on big data processing (e.g., PySpark). Experience with shell scripting (e.g., Bash) for automation tasks.
Cloud Platforms: Hands-on experience with major cloud providers such as Azure, Google Cloud, or AWS.
Preferred Qualifications:
Performance Optimization: Experience in optimizing performance for big data tools and pipelines - Big Advantage.
Security Expertise: Experience in identifying and addressing security vulnerabilities within the data platform - Big Advantage.
CI/CD Pipelines: Experience designing, implementing, and maintaining Continuous Integration/Continuous Deployment (CI/CD) pipelines - Advantage.
Data Pipelines: Experience in building big data pipelines - Advantage.
This position is open to all candidates.
 
Job ID: 8509784
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Backend Engineer to join our Backend Infrastructure team. In this high-impact role, you will be pivotal in shaping the foundations that enable all of our developers to build and ship software effectively. You will be responsible for designing, building, and owning the scalable backend services that form our core infrastructure, such as authentication, data pipelines, public APIs, and asynchronous job systems. You will also champion our Developer Experience (DevEx) by creating common libraries, building internal tools, and setting engineering best practices. This role requires leading complex technical projects from architecture to deployment, driving improvements in platform scalability and performance, and collaborating closely with other engineering teams, DevOps, and Product to deliver foundational solutions that move the needle.
About the Backend Infrastructure team:
The Backend Infrastructure team is on a mission to build the reliable, scalable, and efficient foundations that power us. We focus on enhancing Developer Experience (DevEx) by providing robust tools, setting engineering best practices, and owning core services like authentication, data pipelines, and CI/CD. We ensure the platform can grow reliably by working in close collaboration with DevOps and all development teams. As a senior member of this team, you'll be pivotal in shaping the technical backbone that supports our entire engineering organization.
Requirements:
7+ years of hands-on experience in backend development
Strong proficiency in Node.js and TypeScript - Must
Proven experience with both relational (e.g., MySQL, PostgreSQL) and non-relational (e.g., MongoDB) databases.
Experience designing, building, and maintaining scalable, high-availability, and robust production-grade systems.
Expertise in developing secure, production-grade APIs and backend services for web applications.
Experience in designing and developing complex systems while collaborating with multiple stakeholders (e.g., product, other engineering teams, DevOps).
A collaborative team player with a passion for empowering other engineers.
Fluent English speaker (verbal and written communication is a must)
B.Sc. in Computer Science, or military equivalent certifications
Advantages:
Experience building internal tools, or platforms for other developers
Familiarity with modern Node.js frameworks like NestJS and monorepo tools like Nx.
Familiarity with CI/CD pipelines, Docker, Kubernetes, and infrastructure-as-code tools.
This position is open to all candidates.
 
Job ID: 8481829
Posted 5 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an experienced Data Engineering Team Leader.

In this role, you will lead and strengthen our Data Team, drive innovation, and ensure the robustness of our data and analytics platforms.

A day in the life and how you'll make an impact:

Drive the technical strategy and roadmap for the data engineering function, ensuring alignment with overall business objectives.
Own the design, development, and evolution of scalable, high-performance data pipelines to enable diverse and growing business needs.
Establish and enforce a strong data governance framework, including comprehensive data quality standards, monitoring, and security protocols, taking full accountability for data integrity and reliability.
Lead the continuous enhancement and optimization of the data analytics platform and infrastructure, focusing on performance, scalability, and cost efficiency.
Champion the complete data lifecycle, from robust infrastructure and data ingestion to detailed analysis and automated reporting, to maximize the strategic value of data and drive business growth.
Requirements:
5+ years of Data Engineering experience (preferably in a startup), with a focus on designing and implementing scalable, analytics-ready data models and cloud data warehouses (e.g., BigQuery, Snowflake).
Minimum 3 years in a leadership role, with a proven history of guiding teams to success.
Expertise in modern data orchestration and transformation frameworks (e.g., Airflow, DBT).
Deep knowledge of databases (schema design, query optimization) and familiarity with NoSQL use cases.
Solid understanding of cloud data services (e.g., AWS, GCP) and streaming platforms (e.g., Kafka, Pub/Sub).
Fluent in Python and SQL, with a backend development focus (services, APIs, CI/CD).
Excellent communication skills, capable of simplifying complex technical concepts.
Experience with, or strong interest in, leveraging AI and automation for efficiency gains.
Passionate about technology, proactively identifying and implementing tools to enhance development velocity and maintain high standards.
Adaptable and resilient in dynamic, fast-paced environments, consistently delivering results with a strong can-do attitude.
B.Sc. in Computer Science / Engineering or equivalent.
This position is open to all candidates.
 
Job ID: 8527969
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a highly skilled Senior Data Engineer with strong architectural expertise to design and evolve our next-generation data platform. You will define the technical vision, build scalable and reliable data systems, and guide the long-term architecture that powers analytics, operational decision-making, and data-driven products across the organization.
This role is both strategic and hands-on. You will evaluate modern data technologies, define engineering best practices, and lead the implementation of robust, high-performance data solutions, including the design, build, and lifecycle management of data pipelines that support batch, streaming, and near-real-time workloads.
🔧 What You'll Do
Architecture & Strategy
Own the architecture of our data platform, ensuring scalability, performance, reliability, and security.
Define standards and best practices for data modeling, transformation, orchestration, governance, and lifecycle management.
Evaluate and integrate modern data technologies and frameworks that align with our long-term platform strategy.
Collaborate with engineering and product leadership to shape the technical roadmap.
Engineering & Delivery
Design, build, and manage scalable, resilient data pipelines for batch, streaming, and event-driven workloads.
Develop clean, high-quality data models and schemas to support analytics, BI, operational systems, and ML workflows.
Implement data quality, lineage, observability, and automated testing frameworks (see the sketch after this list).
Build ingestion patterns for APIs, event streams, files, and third-party data sources.
Optimize compute, storage, and transformation layers for performance and cost efficiency.
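To make the data-quality item above concrete, here is a lightweight, hypothetical sketch of post-pipeline checks in Python with pandas. The file path, column names, and thresholds are placeholders; dedicated frameworks (e.g., dbt tests or Great Expectations) would typically replace hand-rolled checks like these.

```python
# Hypothetical sketch: simple data quality gates run before publishing a table.
import pandas as pd

df = pd.read_parquet("warehouse/orders_daily.parquet")  # placeholder path

checks = {
    "has_rows": len(df) > 0,
    "no_null_order_ids": df["order_id"].notna().all(),
    "unique_order_ids": df["order_id"].is_unique,
    "amounts_non_negative": (df["amount"] >= 0).all(),
    "fresh_data": df["order_date"].max()
        >= pd.Timestamp.today().normalize() - pd.Timedelta(days=1),
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
print("All data quality checks passed")
```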
Leadership & Collaboration
Serve as a senior technical leader and mentor within the data engineering team.
Lead architecture reviews, design discussions, and cross-team engineering initiatives.
Work closely with analysts, data scientists, software engineers, and product owners to define and deliver data solutions.
Communicate architectural decisions and trade-offs to technical and non-technical stakeholders.
Requirements:
6-10+ years of experience in Data Engineering, with demonstrated architectural ownership.
Expert-level experience with Snowflake (mandatory), including performance optimization, data modeling, security, and ecosystem components.
Expert proficiency in SQL and strong Python skills for pipeline development and automation.
Experience with modern orchestration tools (Airflow, Dagster, Prefect, or equivalent).
Strong understanding of ELT/ETL patterns, distributed processing, and data lifecycle management.
Familiarity with streaming/event technologies (Kafka, Kinesis, Pub/Sub, etc.).
Experience implementing data quality, observability, and lineage solutions.
Solid understanding of cloud infrastructure (AWS, GCP, or Azure).
Strong background in DataOps practices: CI/CD, testing, version control, automation.
Proven leadership in driving architectural direction and mentoring engineering teams
Nice to Have:
Experience with data governance or metadata management tools.
Hands-on experience with DBT, including modeling, testing, documentation, and advanced features.
Exposure to machine learning pipelines, feature stores, or MLOps.
Experience with Terraform, CloudFormation, or other IaC tools.
Background designing systems for high scale, security, or regulated environments.
This position is open to all candidates.
 
Job ID: 8528005
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Senior Data Engineer
About us:
A pioneering health-tech startup on a mission to revolutionize weight loss and well-being. Our innovative metabolic measurement device provides users with a comprehensive understanding of their metabolism, empowering them with personalized, data-driven insights to make informed lifestyle choices.
Data is at the core of everything we do. We collect and analyze vast amounts of user data from our device and app to provide personalized recommendations, enhance our product, and drive advancements in metabolic health research. As we continue to scale, our data infrastructure is crucial to our success and our ability to empower our users.
About the Role:
As a Senior Data Engineer, you'll be more than just a coder - you'll be the architect of our data ecosystem. We're looking for someone who can design scalable, future-proof data pipelines and connect the dots between DevOps, backend engineers, data scientists, and analysts.
You'll lead the design, build, and optimization of our data infrastructure, from real-time ingestion to supporting machine learning operations. Every choice you make will be data-driven and cost-conscious, ensuring efficiency and impact across the company.
Beyond engineering, you'll be a strategic partner and problem-solver, sometimes diving into advanced analysis or data science tasks. Your work will directly shape how we deliver innovative solutions and support our growth at scale.
Responsibilities:
Design and Build Data Pipelines: Architect, build, and maintain our end-to-end data pipeline infrastructure to ensure it is scalable, reliable, and efficient.
Optimize Data Infrastructure: Manage and improve the performance and cost-effectiveness of our data systems, with a specific focus on optimizing pipelines and usage within our Snowflake data warehouse. This includes implementing FinOps best practices to monitor, analyze, and control our data-related cloud costs (see the sketch after this list).
Enable Machine Learning Operations (MLOps): Develop the foundational infrastructure to streamline the deployment, management, and monitoring of our machine learning models.
Support Data Quality: Optimize ETL processes to handle large volumes of data while ensuring data quality and integrity across all our data sources.
Collaborate and Support: Work closely with data analysts and data scientists to support complex analysis, build robust data models, and contribute to the development of data governance policies.
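As a concrete example of the FinOps angle mentioned above, here is a hypothetical Python sketch that queries Snowflake's ACCOUNT_USAGE share for warehouse credit consumption over the last 30 days. Connection details are placeholders, and access to the ACCOUNT_USAGE views is assumed.

```python
# Hypothetical FinOps sketch: report 30-day credit usage per Snowflake warehouse.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)

query = """
    SELECT warehouse_name, ROUND(SUM(credits_used), 2) AS credits_30d
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -30, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_30d DESC
"""

cur = conn.cursor()
cur.execute(query)
for warehouse, credits in cur.fetchall():
    print(f"{warehouse}: {credits} credits in the last 30 days")
cur.close()
conn.close()
```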
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field.
Experience: 5+ years of hands-on experience as a Data Engineer or in a similar role.
Data Expertise: Strong understanding of data warehousing concepts, including a deep familiarity with Snowflake.
Technical Skills:
Proficiency in Python and SQL.
Hands-on experience with workflow orchestration tools like Airflow.
Experience with real-time data streaming technologies like Kafka.
Familiarity with container orchestration using Kubernetes (K8s) and dependency management with Poetry.
Cloud Infrastructure: Proven experience with AWS cloud services (e.g., EC2, S3, RDS).
This position is open to all candidates.
 
Job ID: 8510072