Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Data Engineering Tech Lead.
What will you be responsible for?
Lead the design and development of scalable, high-performance data workflows, including both batch pipelines and real-time data products.
Define, implement, and enforce engineering best practices related to code quality, testing, CI/CD pipelines, observability, and documentation (see the testing sketch after this list).
Mentor, support, and grow a team of data engineers, fostering a collaborative and high-performance engineering culture.
Identify opportunities to create new data assets and features that expand product capabilities and value proposition.
Drive architectural decision-making in areas of data modeling, storage solutions, and compute resources within cloud environments such as Databricks and Snowflake.
Collaborate closely with cross-functional stakeholders, including Product, DevOps, and R&D, to ensure effective delivery and platform stability.
Promote and champion a data-driven mindset across the organization, balancing technical rigor with business context and strategic goals.
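For illustration, a minimal sketch of the testing practice referenced above: a pure transformation function kept separate from I/O so it can be unit-tested in CI. The function name, event fields, and test are hypothetical and not taken from the posting.

# Hypothetical example: keep transformations pure so CI can test them before deployment.
from datetime import datetime, timezone


def clean_events(events: list[dict]) -> list[dict]:
    """Drop malformed events and normalize timestamps to UTC ISO-8601."""
    cleaned = []
    for event in events:
        if "user_id" not in event or "ts" not in event:
            continue  # skip malformed records rather than failing the whole batch
        ts = datetime.fromtimestamp(event["ts"], tz=timezone.utc)
        cleaned.append({"user_id": event["user_id"], "ts": ts.isoformat()})
    return cleaned


def test_clean_events_drops_malformed_and_normalizes():
    # Collected and run by pytest in the CI pipeline.
    raw = [{"user_id": 1, "ts": 0}, {"ts": 0}, {"user_id": 2}]
    assert clean_events(raw) == [{"user_id": 1, "ts": "1970-01-01T00:00:00+00:00"}]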
Requirements:
Minimum 5 years of hands-on experience designing, building, and maintaining large-scale data pipelines for both batch processing and streaming use cases.
Deep expertise in Python and SQL, with a focus on writing clean, performant, and maintainable code.
Strong analytical and problem-solving skills, with the ability to break down complex technical challenges and align solutions to business objectives.
Solid background in data modeling, analytics, and designing architectures for scalability, performance, and cost efficiency.
Practical experience working with modern OLAP systems and cloud data platforms, including Databricks, Snowflake, or BigQuery.
Familiarity with AI agent protocols (such as A2A, MCP) and LLM-related technologies (e.g., vector databases, embeddings) is a plus.
AI-savvy, with comfort adopting AI tools and staying current with emerging AI trends and technologies.
This position is open to all candidates.
 
Job ID: 8280795

Posted: 29/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We seek a Director of Data to join us and lead our data group.
As our Director of Data, you will be a key member of our R&D leadership team. You will be responsible for developing and executing a data strategy that aligns with our business goals, overseeing data management, analytics, and validation, and ensuring data integrity at every stage of product development and production.
A day in the life and how you'll make an impact:
Define and execute a strategic data roadmap aligned with business objectives, fostering a data-driven culture and leading a high-performing team of data engineers, scientists, and analysts.
Establish robust data validation frameworks, ensuring product integrity and accuracy through all stages, from data acquisition to end-user delivery.
Build and optimize scalable data infrastructure and pipelines to support our data needs and ensure data security, compliance, and accessibility.
Collaborate with product and engineering teams to create and launch data-driven products, ensuring they are built on reliable data and designed to meet customer needs.
Guide the team in generating actionable insights to drive business decisions and product innovation in areas such as personalization, marketing, and customer success.
Implement data governance policies and maintain compliance with industry regulations and best practices.
Requirements:
10+ years of experience in data-related roles, with at least 5 years in a leadership position (ideally within a tech or AI-driven startup environment).
M.Sc. or PhD in Data Science/Computer Science/Engineering/Statistics, or a related field.
Extensive experience with cloud platforms (AWS, GCP, or Azure) and modern data warehouses (Snowflake, BigQuery, or Redshift).
Proficiency in data technologies, such as SQL, Python, R, Looker and big data tools (e.g., Hadoop, Spark).
Proven experience in leveraging data for product development, business intelligence, and operational optimization.
Strong track record of building and managing cross-functional data teams and influencing across all levels of an organization.
Excellent communication skills, with the ability to convey complex data insights in an accessible manner to non-technical stakeholders.
This position is open to all candidates.
 
Job ID: 8234801

Posted: 13/07/2025
Location: Tel Aviv-Yafo and Netanya
Job Type: Full Time
As a Big Data & GenAI Engineering Lead within our company's Data & AI Department, you will play a pivotal role in building the data and AI backbone that empowers product innovation and intelligent business decisions. You will lead the design and implementation of our company's next-generation lakehouse architecture, real-time data infrastructure, and GenAI-enriched solutions, helping drive automation, insights, and personalization at scale. In this role, you will architect and optimize our modern data platform while also integrating and operationalizing Generative AI models to support go-to-market use cases. This includes embedding LLMs and vector search into core data workflows, establishing secure and scalable RAG pipelines, and partnering cross-functionally to deliver impactful AI applications.
As a Big Data & GenAI Engineering Lead in our company, you will...
Design, lead, and evolve our company's petabyte-scale Lakehouse and modern data platform to meet performance, scalability, privacy, and extensibility goals.
Architect and implement GenAI-powered data solutions, including retrieval-augmented generation (RAG), semantic search, and LLM orchestration frameworks tailored to business and developer use cases (see the sketch after this list).
Partner with product, engineering, and business stakeholders to identify and develop AI-first use cases, such as intelligent assistants, code insights, anomaly detection, and generative reporting.
Integrate open-source and commercial LLMs securely into data products, using frameworks such as LangChain or similar to augment their AI capabilities.
Collaborate closely with engineering teams to drive instrumentation, telemetry capture, and high-quality data pipelines that feed both analytics and GenAI applications.
Provide technical leadership and mentorship to a cross-functional team of data and ML engineers, ensuring adherence to best practices in data and AI engineering.
Lead tool evaluation, architectural PoCs, and decisions on foundational AI/ML tooling (e.g., vector databases, feature stores, orchestration platforms).
Foster platform adoption through enablement resources, shared assets, and developer-facing APIs and SDKs for accessing GenAI capabilities.
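For illustration, a self-contained sketch of the retrieval-augmented generation (RAG) pattern described above. In production the embed function would be a real embedding model and the corpus a vector database; here a bag-of-words vector and an in-memory list keep the example runnable, and all document text is invented.

# Hypothetical RAG sketch: retrieve the most relevant documents, then ground the LLM prompt in them.
import math
from collections import Counter

CORPUS = [
    "Invoices are reconciled nightly by the billing pipeline.",
    "The lakehouse stores raw telemetry partitioned by event date.",
    "Access tokens expire after 24 hours and must be refreshed.",
]

def embed(text: str) -> Counter:
    # Stand-in for an embedding model: a simple bag-of-words vector.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, k: int = 2) -> list[str]:
    # Stand-in for a vector-database similarity search.
    q = embed(question)
    return sorted(CORPUS, key=lambda doc: cosine(q, embed(doc)), reverse=True)[:k]

def build_prompt(question: str) -> str:
    context = "\n".join(retrieve(question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# The string printed here would be sent to an LLM endpoint in a real pipeline.
print(build_prompt("When do access tokens expire?"))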
Requirements:
8+ years of experience in data engineering, software engineering, or MLOps, with hands-on leadership in designing modern data platforms and distributed systems.
Proven experience implementing GenAI applications or infrastructure (e.g., building RAG pipelines, vector search, or custom LLM integrations).
Deep understanding of big data technologies (Kafka, Spark, Iceberg, Presto, Airflow) and cloud-native data stacks (e.g., AWS, GCP, or Azure).
Proficiency in Python and experience with GenAI frameworks like LangChain, LlamaIndex, or similar.
Familiarity with modern ML toolchains and model lifecycle management (e.g., MLflow, SageMaker, Vertex AI).
Experience deploying scalable and secure AI solutions with proper attention to privacy, hallucination risk, cost management, and model drift.
Ability to operate in ambiguity, lead complex projects across functions, and translate abstract goals into deliverable solutions.
Excellent communication and collaboration skills, with a passion for pushing boundaries in both data and AI domains.
This position is open to all candidates.
 
Job ID: 8255562

Posted: 7 days ago
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Scientist (Applied AI).
As a Senior Data Scientist on our Applied AI team, you will join our Tel Aviv office and play a hands-on, end-to-end role in delivering innovative capabilities that help mayors and other city leaders understand their communities and improve the lives of millions worldwide. Reporting to the Applied AI Team Lead, you will collaborate with product, engineering, and fellow data-science teammates to turn cutting-edge research into production-ready solutions: quickly, reliably, and with maximum real-world impact. You'll work with a rich mix of data sources (including social media, news stories, survey results, resident feedback, and more) to create models and AI-powered features that scale.
Day to Day:
Design, build, and deploy AI and machine-learning solutions, from data exploration through modeling, evaluation, and integration into customer-facing products and internal tools.
Optimize models for quality and scalability through feature engineering, hyper-parameter tuning, runtime profiling, and thoughtful architectural choices (see the sketch after this list).
Build and maintain data pipelines using tools such as Airflow, Spark, and Databricks to ensure clean, reliable inputs for downstream models.
Collaborate closely with product managers, engineers, and designers to refine problem statements, iterate rapidly, and ship impactful features on schedule.
Champion technical excellence by conducting code reviews, sharing best practices, and mentoring teammates across data science and engineering.
Stay current with the latest developments in AI, including LLMs, RAG systems, and AI agents, and proactively propose ways to incorporate new techniques into our workflows.
Work an in-person or hybrid schedule, spending at least three days per week in our Tel Aviv office.
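As referenced in the list above, a minimal scikit-learn sketch of hyper-parameter tuning with cross-validation; the synthetic dataset, model, and parameter grid are illustrative only, not the team's actual setup.

# Illustrative only: tune a random forest with cross-validated grid search, then report held-out F1.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    scoring="f1",
    cv=5,
)
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("held-out F1:", f1_score(y_test, search.best_estimator_.predict(X_test)))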
Requirements:
5+ years of hands-on experience developing and deploying machine-learning or data-science solutions with Python and SQL.
Proven, end-to-end experience building AI- and machine learning-based solutions from prototype to production deployment.
Demonstrated success shipping data-intensive services to production on cloud infrastructure (AWS preferred) using data tools such as PostgreSQL, Databricks, Spark, or Airflow.
Deep understanding of machine-learning fundamentals and practical expertise with frameworks such as TensorFlow, PyTorch, or scikit-learn.
Expertise in machine learning metrics and quality control.
Solid understanding of software-engineering best practices (including design patterns, data structures, and version control).
Excellent interpersonal and communication skills, with the ability to explain complex technical concepts to non-technical stakeholders and collaborate across teams.
It's even better if you have:
Experience with Agile development in fast-paced, delivery-driven environments.
Familiarity with CI/CD practices, containers, Kubernetes, and serverless or microservice architectures.
Experience with geospatial analysis, government data, survey research, or civic-tech applications.
A track record of contributing to open-source projects.
A college or graduate degree in a relevant field.
This position is open to all candidates.
 
Job ID: 8273710

Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Data Engineer to design and implement high-scale, data-intensive platforms, research and develop algorithmic solutions, and collaborate on key company initiatives. You will play a critical role within core data teams, which are responsible for managing and optimizing fundamental data assets.
What will you be responsible for?
Solve Complex Business Problems with Scalable Data Solutions
Develop and implement robust, high-scale data pipelines to power core assets.
Leverage cutting-edge technologies to tackle complex data challenges and enhance business operations.
Collaborate with Business Stakeholders to Drive Impact
Work closely with Product, Data Science, and Analytics teams to define priorities and develop solutions that directly enhance core products and user experience.
Build and Maintain a Scalable Data Infrastructure
Design and implement scalable, high-performance data infrastructure to support machine learning, analytics, and real-time data processing.
Continuously monitor and optimize data pipelines to ensure reliability, accuracy, and efficiency.
Requirements:
3+ years of hands-on experience designing and implementing large-scale, server-side data solutions
4+ years of programming experience, preferably in Python and SQL, with a strong understanding of data structures and algorithms
Proven experience in building algorithmic solutions, data mining, and applying analytical methodologies to optimize data processing and insights
Proficiency with orchestration tools such as Airflow, Kubernetes, and Docker Swarm, ensuring seamless workflow automation
Experience working with Data Lakes and Apache Spark for processing large-scale datasets - a strong advantage
Familiarity with AWS services (S3, Glue, EMR, Redshift) - nice to have
Knowledge of tools such as Kafka, Databricks, and Jenkins - a plus
Strong command of a variety of storage engines, including Relational (PostgreSQL, MySQL), Document-based (MongoDB), Time-series / Search (ClickHouse, Elasticsearch), Key-value (Redis)
Comfortable working with AI tools and staying ahead of emerging technologies and trends
This position is open to all candidates.
 
Job ID: 8280797

Posted: 7 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an Applied AI Team Lead.
As the Team Lead for our Applied AI team, you will join our Tel Aviv office. You will guide the day-to-day execution of innovative projects to help mayors and other city leaders understand their communities and improve the lives of millions worldwide. Reporting to the VP of Data Science, you will combine hands-on technical leadership with people management skills, overseeing a team with a rapidly growing portfolio of mission-critical projects and collaborating with product, engineering, and other internal partners. You'll be responsible for leading every aspect of the development process, so that our cutting-edge AI and data science capabilities get to production quickly, reliably, and with maximum impact.
Day to Day:
Lead a 5‑person team of data scientists and engineers, coordinating stand‑ups, code reviews, and 1‑on‑1s while also contributing hands-on to high-priority projects.
Oversee the development and implementation of a diverse project portfolio, focused on both customer-facing product features and internal business use cases.
Scope, execute, and deliver end‑to‑end AI and data science capabilities to meet the requirements of product owners and internal stakeholders.
Play a leading role in the development of AI-focused infrastructure and the implementation of best practices for engineering and data science.
Track project risks, dependencies, and costs, and make proactive adjustments to keep initiatives on time and aligned with requirements.
Stay current with the latest developments in AI technologies and tools, and identify opportunities to integrate the latest innovations into our products and workflows.
Recruit, onboard, and develop team members to help scale our Applied AI program and promote a culture of technical excellence.
Work an in‑person or hybrid schedule, with at least three days per week in our Tel Aviv office.
Requirements:
2+ years as a leader of engineering or data science teams, with a track record of shipping data-intensive tools or services to production.
6+ years of hands-on experience building data-driven solutions with Python and SQL.
Experience deploying services using cloud-based infrastructure and data tools such as PostgreSQL, Databricks, and Spark.
Strong understanding of the key concepts and usage of modern AI tools such as LLMs, RAG systems, and AI agents.
Solid software engineering fundamentals (including design patterns, data structures, and version control) and familiarity with Agile practices.
Proven ability to manage project scope, timelines, and cross‑team dependencies.
Comfort implementing rapid, iterative development processes while also prioritizing quality, reliability, and maintainability.
Excellent command of English, both verbal and written, and comfortable presenting technical topics to non‑technical stakeholders.
It's even better if you have:
Proficiency with CI/CD tools and practices, containers, serverless architectures, and automated testing and monitoring.
Experience deploying models to production using leading machine learning frameworks such as TensorFlow or PyTorch.
Experience with Agile development processes and project management in JIRA.
Expertise in machine learning metrics and monitoring.
Experience with geospatial analysis, government data, survey research, or civic tech applications.
Demonstrated success hiring and developing technical talent.
Contributions to open‑source projects or an advanced degree in a relevant field.
This position is open to all candidates.
 
Job ID: 8273734

Confidential company
Location: Tel Aviv-Yafo
Job Type: More than one
We're seeking an outstanding and passionate Data Platform Engineer to join our growing R&D team.
You will work in an energetic startup environment following Agile concepts and methodologies. Joining the company at this unique and exciting stage in our growth journey creates an exceptional opportunity to take part in shaping Finaloop's data infrastructure at the forefront of Fintech and AI.
What you'll do:
Design, build, and maintain scalable data pipelines and ETL processes for our financial data platform.
Develop and optimize data infrastructure to support real-time analytics and reporting.
Implement data governance, security, and privacy controls to ensure data quality and compliance.
Create and maintain documentation for data platforms and processes.
Collaborate with data scientists and analysts to deliver actionable insights to our customers.
Troubleshoot and resolve data infrastructure issues efficiently.
Monitor system performance and implement optimizations.
Stay current with emerging technologies and implement innovative solutions.
Tech stack: AWS Serverless, Python, Airflow, Airbyte, Temporal, PostgreSQL, Snowflake, Kubernetes, Terraform, Docker.
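For illustration, a minimal daily pipeline skeleton on the stack listed above, assuming a recent Airflow 2.x release; the DAG name, task names, and the two stubbed callables (standing in for, say, an Airbyte sync and a Snowflake load) are hypothetical.

# Hypothetical daily ETL skeleton; the callables are placeholders for real extract/load logic.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    print("pull yesterday's orders from the source system")  # placeholder


def load_to_warehouse():
    print("upsert the cleaned orders into the warehouse")  # placeholder


with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
    extract >> load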
Requirements:
3+ years experience in data engineering or platform engineering roles
Strong programming skills in Python and SQL
Experience with orchestration platforms like Airflow/Dagster/Temporal
Experience with MPPs like Snowflake/Redshift/Databricks
Hands-on experience with cloud platforms (AWS) and their data services
Understanding of data modeling, data warehousing, and data lake concepts
Ability to optimize data infrastructure for performance and reliability
Experience working with containerization (Docker) in Kubernetes environments.
Familiarity with CI/CD concepts
Fluent in English, both written and verbal
And it would be great if you have (optional):
Experience with big data processing frameworks (Apache Spark, Hadoop)
Experience with stream processing technologies (Flink, Kafka, Kinesis)
Knowledge of infrastructure as code (Terraform)
Experience building analytics platforms
Experience building clickstream pipelines
Familiarity with machine learning workflows and MLOps
Experience working in a startup environment or fintech industry
This position is open to all candidates.
 
Job ID: 8232260

Posted: 01/07/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are making the future of Mobility come to life starting today.
At our company, we support the world's largest vehicle fleet operators and transportation providers to optimize existing operations and seamlessly launch new, dynamic business models - driving efficient operations and maximizing utilization.
At the heart of our platform lies the data infrastructure, driving advanced machine learning models and optimization algorithms. As the owner of data pipelines, you'll tackle diverse challenges spanning optimization, prediction, modeling, inference, transportation, and mapping.
As a Senior Data Engineer, you will play a key role in owning and scaling the backend data infrastructure that powers our platform, supporting real-time optimization, advanced analytics, and machine learning applications.
What You'll Do
Design, implement, and maintain robust, scalable data pipelines for batch and real-time processing using Spark and other modern tools.
Own the backend data infrastructure, including ingestion, transformation, validation, and orchestration of large-scale datasets.
Leverage Google Cloud Platform (GCP) services to architect and operate scalable, secure, and cost-effective data solutions across the pipeline lifecycle.
Develop and optimize ETL/ELT workflows across multiple environments to support internal applications, analytics, and machine learning workflows.
Build and maintain data marts and data models with a focus on performance, data quality, and long-term maintainability.
Collaborate with cross-functional teams including development teams, product managers, and external stakeholders to understand and translate data requirements into scalable solutions.
Help drive architectural decisions around distributed data processing, pipeline reliability, and scalability.
Requirements:
4+ years in backend data engineering or infrastructure-focused software development.
Proficient in Python, with experience building production-grade data services.
Solid understanding of SQL.
Proven track record designing and operating scalable, low-latency data pipelines (batch and streaming).
Experience building and maintaining data platforms, including lakes, pipelines, and developer tooling.
Familiar with orchestration tools like Airflow, and modern CI/CD practices.
Comfortable working in cloud-native environments (AWS, GCP), including containerization (e.g., Docker, Kubernetes).
Bonus: Experience working with GCP
Bonus: Experience with data quality monitoring and alerting
Bonus: Strong hands-on experience with Spark for distributed data processing at scale.
Degree in Computer Science, Engineering, or related field.
This position is open to all candidates.
 
Job ID: 8238970

Posted: 14/07/2025
Location: Tel Aviv-Yafo and Netanya
Job Type: Full Time
At our company, we're reinventing DevOps and MLOps to help the world's greatest companies innovate -- and we want you along for the ride. This is a special place with a unique combination of brilliance, spirit and just all-around great people. Here, if you're willing to do more, your career can take off. And since software plays a central role in everyone's lives, you'll be part of an important mission. Thousands of customers, including the majority of the Fortune 100, trust our company to manage, accelerate, and secure their software delivery from code to production - a concept we call liquid software. Wouldn't it be amazing if you could join us in our journey?
About the Team
We are seeking a highly skilled Senior Data Engineer to join our company's ML Data Group and help drive the development and optimization of our cutting-edge data infrastructure. As a key member of the company's ML Platform team, you will play an instrumental role in building and evolving our feature store data pipeline, enabling machine learning teams to efficiently access and work with high-quality, real-time data at scale.
In this dynamic, fast-paced environment, you will collaborate with other data professionals to create robust, scalable data solutions. You will be responsible for architecting, designing, and implementing data pipelines that ensure reliable data ingestion, transformation, and storage, ultimately supporting the production of high-performance ML models.
We are looking for data-driven problem-solvers who thrive in ambiguous, fast-moving environments and are passionate about building data systems that empower teams to innovate and scale. We value independent thinkers with a strong sense of ownership, who can take challenges from concept to production while continuously improving our data infrastructure.
As a Data Engineer in our company's ML Data Group, you will...
Design and implement large-scale batch & streaming data pipeline infrastructure (see the sketch after this list)
Build and optimize data workflows for maximum reliability and performance
Develop solutions for real-time data processing and analytics
Implement data consistency checks and quality assurance processes
Design and maintain state management systems for distributed data processing
Take a crucial role in building the group's engineering culture, tools, and methodologies
Define abstractions, methodologies, and coding standards for the entire Data Engineering pipeline.
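To make the responsibilities above concrete, a minimal PySpark Structured Streaming sketch of streaming ingestion into a feature-store landing zone; the Kafka topic, schema, and paths are hypothetical, and the spark-sql-kafka connector must be available on the cluster.

# Hypothetical sketch: stream feature events from Kafka into a landing zone with checkpointing.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("feature_ingest").getOrCreate()

schema = StructType([
    StructField("entity_id", StringType()),
    StructField("feature_name", StringType()),
    StructField("value", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "kafka:9092")   # placeholder brokers
    .option("subscribe", "feature-events")             # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/features/raw")                      # placeholder path
    .option("checkpointLocation", "/data/features/_checkpoints")
    .outputMode("append")
    .start()
)
query.awaitTermination()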
Requirements:
5+ years of experience as a Software Engineer with focus on data engineering
Expert knowledge in building and maintaining data pipelines at scale
Strong experience with stream/batch processing frameworks (e.g. Apache Spark, Flink)
Profound understanding of message brokers (e.g. Kafka, RabbitMQ)
Experience with data warehousing and lake technologies
Strong Python programming skills and experience building data engineering tools
Experience with designing and maintaining Python SDKs
Proficiency in Java for data processing applications
Understanding of data modeling and optimization techniques
Bonus Points:
Experience with ML model deployment and maintenance in production
Knowledge of data governance and compliance requirements
Experience with real-time analytics and processing
Understanding of distributed systems and cloud architectures
Experience with data visualization and lineage tools/frameworks and techniques.
This position is open to all candidates.
 
Job ID: 8257535

Posted: 21/07/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking an experienced Backend Tech Lead to join our recommendations group. As a Tech Lead, you will be instrumental in shaping the technical direction and architecture of our backend systems. You will leverage your deep expertise in backend technologies to design, build and scale backend services while guiding a group of engineers toward delivering robust, high-performance solutions. This role offers the opportunity to collaborate with cross-functional teams and contribute to projects that impact the entire organization. If you're a leader with a passion for backend development, we would love to have you on board!

Role

As a Backend Tech Lead, your responsibilities will include:
Mentoring and guiding backend engineers to ensure high-quality code and effective development processes.
System design and architecture: Designing scalable, efficient, and secure backend systems, including APIs and data storage solutions, using technologies like Python, MySQL, Redis, and Elasticsearch (see the sketch after this list).
Tech stack evolution: Driving improvements in backend technologies, frameworks and development practices.
Cross-functional collaboration: Working closely with product managers, frontend teams and other stakeholders to ensure backend services meet business requirements and technical specifications.
Performance and scalability: Ensuring that our backend services are highly available, scalable and optimized for performance.
Cloud infrastructure: Overseeing cloud services and ensuring backend systems are seamlessly integrated with cloud platforms.
Code quality and best practices: Writing and maintaining clean, maintainable and well-documented code, while fostering a culture of code reviews, continuous improvement and adherence to best practices.
Technical leadership: Owning the technical direction of backend systems, making key architectural decisions and promoting industry best practices within the group.
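For illustration, a minimal Flask endpoint with a Redis read-through cache, the kind of low-latency serving pattern the list above refers to; the route, key scheme, TTL, and the fetch_recommendations stub are hypothetical.

# Hypothetical sketch: serve recommendations behind a Redis read-through cache.
import json

import redis
from flask import Flask, jsonify

app = Flask(__name__)
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)


def fetch_recommendations(user_id: int) -> list[int]:
    # Placeholder for the expensive lookup (MySQL / Elasticsearch in production).
    return [user_id + 1, user_id + 2, user_id + 3]


@app.route("/recommendations/<int:user_id>")
def recommendations(user_id: int):
    key = f"recs:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return jsonify(json.loads(cached))
    recs = fetch_recommendations(user_id)
    cache.setex(key, 300, json.dumps(recs))  # cache for five minutes
    return jsonify(recs)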
Requirements:
You're a seasoned Backend Tech Lead with a deep passion for backend systems and strong leadership qualities. Here's what we're looking for:
Strong proficiency in Python: Extensive experience with Python and web frameworks in a production environment.
Leadership experience: Proven ability to lead, mentor, and inspire engineering teams.
Backend expertise: Experience with building, deploying, and optimizing large-scale backend systems using databases like MySQL, and NoSQL systems (e.g., Redis, Elasticsearch).
Cloud platforms: Expertise in cloud infrastructure and services for backend service deployment and management.
Microservices and APIs: Solid experience designing and implementing backend APIs and microservices architectures that are modular, scalable and maintainable.
Containerization and orchestration: Familiarity with Docker and Kubernetes for containerized applications and orchestration.
High-performance systems: Ability to identify performance bottlenecks and optimize backend systems to handle large-scale traffic with minimal latency.
Strong communication skills: Excellent verbal and written communication skills, with the ability to explain complex technical concepts.

Desired Skills:
Experience: At least 5 years of solid backend engineering experience, with at least 2 years in a leadership or tech lead position.
Education: A degree in Computer Science.
Collaborative mindset: A passion for working in a team-oriented environment and helping others grow technically.
This position is open to all candidates.
 
Job ID: 8268722

Posted: 13/07/2025
Location: Tel Aviv-Yafo and Netanya
Job Type: Full Time
We are looking for a Data Engineering Lead to join our Platform Group at our company. In this role, you will drive the development of scalable data pipelines and infrastructures that are crucial to our platform's success. You will collaborate across departments and innovate to ensure our data ecosystem is robust, secure, and optimized for growth.
As a Data Engineering Lead at our company you will...
Architect and develop data pipelines: Lead the design and implementation of data pipelines that support our company's platform, ensuring high data quality, security, and governance. Introduce new tools and technologies to enhance data workflows and integration.
Develop a strategic roadmap that outlines key engineering solutions to support our platform's scalability and performance, aligned with our company's overall vision and objectives.
Collaborate across teams: Work closely with internal teams, including DevOps, BI, Product, and development groups, to ensure seamless data integration and drive data-driven decision-making across the organization.
Establish data guidelines and documentation: Define best practices for data generation, consumption, and management within the platform. Create thorough documentation for all data processes to facilitate clear communication and future maintenance.
Requirements:
5+ years of proven, hands-on experience designing, building, and optimizing scalable and highly available data-intensive systems
In-depth understanding of big data engines and frameworks, such as Spark, and experience with ETL/ELT tools for robust data pipeline development
Proven ability to lead initiatives and drive technical agendas in a hands-on capacity, with potential for team building in the future
Knowledge of machine learning frameworks and strategic industry trends - a plus
Strong business skills and strategic thinking, with an innovative and growth-oriented mindset
Strong interpersonal skills to collaborate with internal teams and external partners effectively.
This position is open to all candidates.
 
Job ID: 8255790