02/04/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are the global leader in Go-to-Market Security, trusted by over 15,000 customers worldwide to protect every aspect of their marketing, sales, and data operations from bots, fake users, fraud, and cyber attacks.
Powered by award-winning cybersecurity technology, CHEQ offers the broadest suite of solutions for securing the entire funnel, from paid marketing to on-site conversion, data, and analytics.
CHEQ is a global company with offices in Tel Aviv, New York, Tokyo, and London.
We are seeking an experienced Data Engineer with strong expertise in columnar databases and database engineering to design and maintain our main big data database and pipelines. This role involves significant hands-on database engineering and DBA work within an AWS environment. The ideal candidate will have a deep understanding of columnar databases and experience handling big data environments.
Responsibilities:
Design, implement, and maintain robust, scalable data pipelines and database solutions.
Optimize and manage large-scale data systems, focusing on columnar databases, specifically ClickHouse.
Collaborate with various stakeholders across the company, such as product managers, developers, and data scientists, to deliver team tasks with high quality.
Handle database engineering tasks, including schema design, query optimization, and database administration in AWS environments.
Provide support for big data analytics by ensuring the reliability and efficiency of data storage and retrieval processes.
Requirements:
4+ years of hands-on experience as a Data Engineer, with a focus on database engineering and administration.
Proven expertise with columnar databases (ClickHouse experience is a significant advantage) - must.
Experience building and optimizing big data pipelines, architectures, and datasets.
Experience with data pipeline technologies such as Spark, Kafka, Hadoop, Amazon Kinesis, or Apache Airflow is a plus.
Working knowledge of Python (for data-related implementation tasks).
Innovative, proactive, and independent, with strong problem-solving skills.
A team player with a can-do attitude.
This position is open to all candidates.
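For illustration, the columnar-storage idea behind engines like ClickHouse can be sketched in a few lines of plain Python (the data and field names here are invented for the example; this is not ClickHouse's API):

```python
# Toy illustration of why columnar layout helps analytics: a row store must
# touch every field of every row to aggregate one column, while a column
# store scans a single contiguous array.

rows = [  # row-oriented: one record per event
    {"user": "a", "country": "IL", "revenue": 10.0},
    {"user": "b", "country": "US", "revenue": 25.5},
    {"user": "c", "country": "IL", "revenue": 7.25},
]

columns = {  # column-oriented: one array per field
    "user": ["a", "b", "c"],
    "country": ["IL", "US", "IL"],
    "revenue": [10.0, 25.5, 7.25],
}

def total_revenue_row_store(rows):
    # touches every row object, field by field
    return sum(r["revenue"] for r in rows)

def total_revenue_column_store(columns):
    # touches only the one array the query needs
    return sum(columns["revenue"])

assert total_revenue_row_store(rows) == total_revenue_column_store(columns) == 42.75
```

Real columnar engines add compression and vectorized execution on top of this layout, but the access-pattern difference is the core idea.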
 
Job ID: 8125120

06/04/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
A growing tech company in the automotive space with hubs across the US and Israel. Our mission is to constantly disrupt the industry by creating groundbreaking technologies to help dealers build stronger, more resilient businesses. Our work happens in the fast lane as we bring AI-powered, data-driven solutions to a quickly evolving industry.

Our team consists of curious and creative individuals who are always looking to achieve the impossible. We are bold, collaborative, and goal-driven, and at our core, we believe every voice has value and can impact our bottom line.
If you are a creative, solutions-oriented individual who is ready to put your career in drive, this is the place for you!

We are looking for an experienced Data Engineering Tech Lead to join our team and make a real impact! In this hands-on role, you will drive the architecture, development, and optimization of our Data infrastructure, ensuring scalable and high-performance data solutions that support analytics, AI, and business intelligence needs. You will collaborate closely with analysts, Product, DevOps, and software engineers to cultivate a robust data ecosystem.

This position will report to the CISO and can be based out of Jerusalem or Tel-Aviv.

What you will be responsible for:
Lead the design, implementation, and maintenance of our DWH & Data Lake architecture to support both analytical and operational use cases.
Develop scalable ETL/ELT pipelines for ingestion, transformation, and optimization of structured and unstructured data.
Ensure data quality, governance, and security throughout the entire data lifecycle.
Optimize performance and cost-efficiency of data storage, processing, and retrieval.
Work closely with BI and analytics teams to guarantee seamless data integration with visualization tools.
Collaborate with stakeholders (BI teams, Product, and Engineering) to align data infrastructure with business needs.
Mentor and guide analysts, fostering a culture of best practices and professionalism.
Stay updated with industry trends and evaluate new technologies for continuous improvement.
Requirements:
5+ years of experience in data engineering, including at least 2-3 years in a Tech Lead role.
At least 3 years of hands-on experience with AWS, including services like S3, Redshift, Glue, Athena, Lambda, and RDS.
Expertise in DWH & Data Lake architectures, including columnar databases, data partitioning, and lakehouse concepts.
Strong experience with cloud data solutions like Redshift, Snowflake, BigQuery, or Databricks.
Proficiency in ETL/ELT tools (e.g., dbt, Apache Airflow, Glue, Dataflow).
Deep knowledge of SQL & Python for data processing and transformation.
Experience working with BI and visualization tools such as Power BI, Tableau, Looker, or similar.
Experience with real-time data streaming (Kafka, Kinesis, Pub/Sub) and batch processing.
Understanding of data modeling (Star/Snowflake), data governance, and security best practices.
Experience with CI/CD, infrastructure-as-code (Terraform, CloudFormation), and DevOps for data.
The personal competencies you need to have:
Excellent communication skills and the ability to work as part of a team.
Strong sense of ownership, urgency, and drive.
Ability to take the initiative, come up with ideas and solutions, and execute them with a "getting things done" attitude.
Ability to work independently and manage tight deadlines.
This position is open to all candidates.
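The warehouse concepts this role calls for, star-schema modeling and SQL aggregation, can be sketched with the standard-library sqlite3 module (table and column names are invented for the example, not taken from the posting):

```python
# Minimal star-schema sketch: one fact table joined to a dimension table,
# then aggregated -- the typical query shape in a DWH.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES dim_product(product_id),
        amount REAL
    );
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales VALUES (1, 1, 12.0), (2, 1, 8.0), (3, 2, 30.0);
""")

# Join the fact table to a dimension and aggregate by a dimension attribute.
cur.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""")
print(cur.fetchall())  # -> [('books', 20.0), ('games', 30.0)]
```

A production warehouse (Redshift, Snowflake, BigQuery) uses the same schema shape at scale, with partitioning and columnar storage underneath.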
 
Job ID: 8129541

28/03/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
About the Role
Appdome is building a new Data department, and we're looking for a skilled Data Engineer to help shape our data infrastructure. If you thrive in fast-paced environments, take ownership, and enjoy working on scalable data solutions, this role is for you. You'll have the opportunity to grow, influence key decisions, and collaborate with security experts and product teams.
What You'll Do:
* Design, build, and maintain scalable data pipelines, ETL processes, and data infrastructure.
* Optimize data storage and retrieval for structured and unstructured data.
* Integrate data solutions into Appdome's products in collaboration with software engineers, security experts, and data scientists.
* Apply DevOps best practices (CI/CD, infrastructure as code, observability) for efficient data processing.
* Work with AWS (EC2, Athena, RDS) and ElasticSearch for data indexing and retrieval.
* Optimize and maintain SQL and NoSQL databases.
* Utilize Docker and Kubernetes for containerization and orchestration.
Requirements:
* B.Sc. in Computer Science, Data Engineering, or a related field.
* 3+ years of hands-on experience in large-scale data infrastructures.
* Strong Python programming, with expertise in PySpark and Pandas.
* Deep knowledge of SQL and NoSQL databases, including performance optimization.
* Experience with ElasticSearch and AWS cloud services.
* Solid understanding of DevOps practices, Big Data tools, Git, and Jenkins.
* Familiarity with microservices and event-driven design.
* Strong problem-solving skills and a proactive, independent mindset.
Advantages:
* Experience with LangChain, ClickHouse, DynamoDB, Redis, and Apache Kafka.
* Knowledge of Metabase for data visualization.
* Experience with RESTful APIs and Node.js.
Talent We Are Looking For:
* Independent & self-driven - comfortable building from the ground up.
* Growth-oriented - eager to develop professionally and take on leadership roles.
* Innovative - passionate about solving complex data challenges.
* Collaborative - a strong communicator who works well with cross-functional teams.
* Adaptable - thrives in a fast-paced, dynamic environment with a can-do attitude.
About the Company:
Appdome's mission is to protect every mobile app worldwide and its users. We provide mobile brands with the only patented, centralized, data-driven Mobile Cyber Defense Automation platform. Our platform delivers rapid no-code, no-SDK mobile app security, anti-fraud, anti-malware, anti-cheat, and anti-bot implementations, configuration-as-code ease, Threat-Events threat-aware UI/UX control, ThreatScope Mobile XDR, and Certified Secure DevSecOps certification in one integrated system. With Appdome, mobile developers, cyber, and fraud teams can accelerate delivery, guarantee compliance, and leverage automation to build, test, release, and monitor the full range of cyber, anti-fraud, and other defenses needed in mobile apps from within mobile DevOps and CI/CD pipelines. Leading financial, healthcare, m-commerce, consumer, and B2B brands use Appdome to upgrade mobile DevSecOps and protect Android & iOS apps, mobile customers, and businesses globally. Today, Appdome's customers use our platform to secure over 50,000 mobile apps, with protection for over 1 billion mobile end users projected.
Appdome is an Equal Opportunity Employer. We are committed to diversity, equity, and inclusion in our workplace. We do not discriminate based on race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, veteran status, or any other characteristic protected by law. All qualified applicants will receive consideration for employment without regard to any of these characteristics.
This position is open to all candidates.
 
Job ID: 8118270

2 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a Senior Data Engineer to join our dynamic data team. In this role, you will design, build, and maintain robust data systems and infrastructure that support data collection, processing, and analysis. Your expertise will be crucial in developing scalable data pipelines, ensuring data quality, and collaborating with cross-functional teams to deliver actionable insights.

Key Responsibilities:

Design, develop, and maintain scalable ETL processes for data transformation and integration.
Build and manage data pipelines to support analytics and operational needs.
Ensure data accuracy, integrity, and consistency across various sources and systems.
Collaborate with data scientists and analysts to support AI model deployment and data-driven decision-making.
Optimize data storage solutions, including data lakehouses and databases, to enhance performance and scalability.
Monitor and troubleshoot data workflows to maintain system reliability.
Stay updated with emerging technologies and best practices in data engineering.
Requirements:
4+ years of experience in data engineering or a related role within a production environment.
Proficiency in Python and SQL.
Experience with both relational (e.g., PostgreSQL) and NoSQL databases (e.g., MongoDB, Elasticsearch).
Familiarity with AWS big data tools and frameworks such as Glue, EMR, and Kinesis.
Experience with containerization tools like Docker and Kubernetes.
Strong understanding of data warehousing concepts and data modeling.
Excellent problem-solving skills and attention to detail.
Strong communication skills, with the ability to work collaboratively in a team environment.
This position is open to all candidates.
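The ETL work described above can be sketched minimally in plain Python (function and field names are invented for the example; a real pipeline would read from and write to actual systems):

```python
# Extract-transform-load in its simplest form: pull raw records, enforce
# data quality while casting types, and append the clean rows to a sink.

def extract():
    # in practice: read from an API, a queue, or a source database
    return [
        {"id": 1, "temp_c": "21.5"},
        {"id": 2, "temp_c": None},   # incomplete record
        {"id": 3, "temp_c": "19.0"},
    ]

def transform(records):
    clean = []
    for r in records:
        if r["temp_c"] is None:
            continue  # a real pipeline would route this to a dead-letter store
        clean.append({"id": r["id"], "temp_c": float(r["temp_c"])})
    return clean

def load(records, warehouse):
    # in practice: a bulk insert into the warehouse
    warehouse.extend(records)

warehouse = []
load(transform(extract()), warehouse)
assert warehouse == [{"id": 1, "temp_c": 21.5}, {"id": 3, "temp_c": 19.0}]
```

Orchestrators such as Airflow schedule and monitor exactly this kind of extract/transform/load step graph.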
 
Job ID: 8158855

02/04/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Big Data DBA Team Lead to join our Data Operations (Data Ops) team in Tel Aviv.
The Data Operations Team's primary responsibility is to build, support and maintain the data pipelines infrastructure and efficient data stores.
We work on improving the reliability, scalability, and efficiency of data pipelines and data stores by engineering solutions that host the large amounts of data we use, as well as monitoring, alerting on, and auto-remediating issues as they arise.
What You'll Do:
You will play a critical role in leading a team that ensures the stability, performance, and development of our organization's databases.
Your key responsibilities will include leading a DBA team that handles daily database management, performance tuning, database design and architecture, and backup and recovery.
You will work closely with R&D, BI teams and project managers on new and ongoing data-related projects.
Manage the team roadmap, team building and project execution.
Collaborate with stakeholders, including data engineers, data analysts, BI developers, and product managers.
Take full ownership of end-to-end data infrastructure, from design to production. Contribute to building and designing innovative big data architecture and pipelines.
Requirements:
At least 5 years of experience in leading and mentoring a DBA team.
Deep understanding of RDBMS administration (such as MySQL, Oracle, SQL Server, PostgreSQL).
Proven experience with Linux.
Familiarity with scripting (Shell/Python/Ruby).
Experience with NoSQL/KV store (such as Redis, Mongo, Aerospike).
An advantage:
Experience with Hadoop - Major advantage.
Experience with Ansible, Puppet or other configuration management tools.
Experience working with and maintaining Amazon Web Services environments.
Experience with DevOps tools such as Kubernetes, Airflow, Jenkins, Rundeck.
Familiarity with log shipping and monitoring technologies.
This position is open to all candidates.
 
Job ID: 8125347

18/04/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Senior Data Platform Engineer to join our Data Platform team. The ideal candidate will have extensive experience with data engineering, specifically with Kafka and Elasticsearch, to design and maintain our data platforms. You will work closely with cross-functional teams to ensure the scalability and reliability of our data solutions.

Job Summary:
As a Senior Data Platform Engineer, you'll be responsible for the design, development, maintenance, troubleshooting, and implementation of our big data architecture. Your expertise in Elastic, Kafka, and Java will be critical in ensuring the scalability and performance of our data systems.

What you'll do:

Implement data processing pipelines using Kafka for real-time data streaming.
Optimize and manage search capabilities using Elastic technologies.
Collaborate with product managers, data analysts, and other stakeholders to gather requirements and translate them into technical specifications.
Oversee code reviews, ensure best practices in coding and data handling, and maintain high-quality standards in software development.
Stay up-to-date with emerging trends and technologies in big data and recommend improvements to our architecture and processes.
Troubleshoot and resolve issues in a timely manner to minimize downtime and ensure system reliability.
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred.
8+ years of experience in the software engineering field.
5+ years of experience in big data technologies, with a focus on Elastic and Kafka.
Proficiency in Java programming and experience with related frameworks.
Strong understanding of data modeling, ETL processes, and data warehousing.
Excellent problem-solving abilities and strong analytical skills.
A solid understanding of CI/CD principles.
Experience working with both external and in-house APIs and SDKs.
Advantages:

Experience with Docker, Kubernetes
Experience with cloud platforms (e.g., AWS or Azure)
This position is open to all candidates.
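The Kafka-style stream processing this role centers on can be sketched without a broker by simulating the consumer loop in plain Python (event shapes and key names are invented for the example; a real consumer would poll Kafka partitions):

```python
# Simulated consume-and-aggregate loop: the per-key running state a real
# Kafka consumer (or Kafka Streams application) would maintain.
from collections import defaultdict

def consume(events):
    """Yield events one at a time, as a consumer poll loop would."""
    yield from events

def count_by_key(stream):
    counts = defaultdict(int)
    for event in stream:
        counts[event["key"]] += 1  # per-key aggregation state
    return dict(counts)

events = [{"key": "click"}, {"key": "view"}, {"key": "click"}]
print(count_by_key(consume(events)))  # -> {'click': 2, 'view': 1}
```

With a real broker, the loop shape is the same; what changes is that offsets, partitions, and consumer-group rebalancing are handled by the Kafka client.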
 
Job ID: 8142364

02/04/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We're looking for a Big Data Developer to join our Data Engineering team in Tel Aviv.
You will be a part of our Data Management platform, which is responsible for processing, storing, and serving data for all our core systems.
As a Big Data Developer, we will encourage you to take a proactive & autonomous approach in research & development while working collaboratively with the team. Our ideal candidate is a fast but thorough problem solver, and a fun person to work with!
What You'll Do:
End-to-end development and ownership, from design to production, with a strong emphasis on innovation and efficiency.
Take part in building & designing the next generation of company Identity solutions.
Implement high scale Big-Data solutions and contribute to our platform infrastructure and architecture.
Research core technologies and integrations with external APIs and services.
Work with various stakeholders: Product, Engineering, data providers, etc.
Requirements:
Bachelor's degree in Computer Science or Computer Engineering, or equivalent experience.
4+ years of software development experience.
Experience in dealing with performance and high scale data systems.
Experience in building and optimizing data pipelines, ETLs, and streaming systems using technologies like Hadoop, Spark, and Kafka.
4+ years of experience in Scala, Go, Python, or Java.
Experience with SQL & NoSQL DBs.
Experience with Kubernetes - advantage.
Experience with Git.
Experience with GitLab, Grafana, InfluxDB, Kibana - advantage.
A team player: egoless, with strong communication skills.
This position is open to all candidates.
 
Job ID: 8125352

02/04/2025
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We're looking for a Big Data Developer to join our Data Engineering team in Tel Aviv.
You will be a part of our Data Management platform, which is responsible for processing, storing, and serving data for all our core systems.
As a Big Data Developer, we will encourage you to take a proactive & autonomous approach in research & development while working collaboratively with the team. Our ideal candidate is a fast but thorough problem solver, and a fun person to work with!
What You'll Do:
End-to-end development and ownership, from design to production, with a strong emphasis on innovation and efficiency.
Take part in building & designing the next generation of company Identity solutions.
Implement high scale Big-Data solutions and contribute to our platform infrastructure and architecture.
Research core technologies and integrations with external APIs and services.
Work with various stakeholders: Product, Engineering, data providers, etc.
Requirements:
Bachelor's degree in Computer Science or Computer Engineering, or equivalent experience.
7+ years of software development experience.
Experience in dealing with performance, high scale data systems or distributed systems.
5+ years of experience in Java.
Experience with SQL & NoSQL DBs.
Experience in building and optimizing data pipelines, ETLs, and streaming systems using technologies like Hadoop, Spark, and Kafka - advantage.
Experience with Kubernetes - advantage.
Experience with Git.
Experience with GitLab, Grafana, InfluxDB, Kibana - advantage.
A team player: egoless, with strong communication skills.
This position is open to all candidates.
 
Job ID: 8125302

3 days ago
Location: Tel Aviv-Yafo
Job Type: Full Time
A well-funded AI startup on a mission to revolutionize passenger journeys with intelligent threat detection powered by machine learning and computer vision. We're building cutting-edge AI tools to ensure safer and more efficient travel.
We are seeking a highly motivated and experienced Data Engineer & MLOps Specialist to join our dynamic Data team. This is a pivotal role where you will design and implement scalable data solutions, develop automation for machine learning workflows, and manage critical infrastructure supporting our AI/ML initiatives.
Responsibilities:
Develop and maintain automated pipelines for data transformation, model inference, and monitoring.
Build and manage an annotation database to ensure high-quality labeled data for AI training.
Design and deploy infrastructure for automated ML inference and neural network analysis using tools like Voxel51 and ClearML.
Develop ETL pipelines to integrate production data into the data warehouse for analysis and reporting.
Work on local servers and in the AWS cloud, ensuring efficient and secure processing, storage, and deployment of data and ML models.
Requirements:
6+ years of experience in data engineering, MLOps, Software or a related field.
Strong programming skills in Python and experience with SQL and data platforms.
Familiarity with tools like Voxel51, ClearML, or similar open-source frameworks.
Hands-on experience in building scalable ETL pipelines and managing large datasets.
Solid understanding of modern data technologies, including data processing frameworks, data storage solutions, and workflow orchestration tools.
Knowledge of machine learning workflows and AI environments is a plus.
Excellent communication and collaboration skills.
Bachelor's degree in Information Systems, Computer Science, or a related field; an advanced degree is a bonus.
What We Offer:
A chance to design and implement solutions from scratch, shaping the future of data and AI infrastructure.
Work with AI technology in a high-growth startup environment.
Opportunities for career growth and continuous learning.
This position is open to all candidates.
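The automated model-inference pipeline described above can be sketched with a stub model in plain Python (the model, field names, and threshold are invented placeholders, not the company's actual system or the Voxel51/ClearML APIs):

```python
# Batch-inference step with a stub model: run predictions over a batch and
# collect a monitoring metric, the shape an automated ML pipeline stage takes.

def model_predict(frame_meta):
    # stand-in for a real neural-network call on a video frame
    return {"frame": frame_meta["frame"], "threat": frame_meta["score"] > 0.5}

def inference_pipeline(batch):
    results = [model_predict(m) for m in batch]
    # monitoring hook: a real pipeline would emit this as a metric/alert
    flagged = sum(r["threat"] for r in results)
    return results, flagged

batch = [{"frame": 1, "score": 0.9}, {"frame": 2, "score": 0.2}]
results, flagged = inference_pipeline(batch)
assert flagged == 1
```

In production, tools like ClearML wrap each such stage with experiment tracking, scheduling, and artifact storage rather than changing its basic structure.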
 
Job ID: 8157838

02/04/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're hiring an experienced Data Engineer to join our growing team of analytics experts to help lead the build-out of our data integration and pipeline processes, tools, and platform.
The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.
The right candidate must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products, and will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives.
In this role, you will be responsible for:
Create ELT/Streaming processes and SQL queries to bring data to/from the data warehouse and other data sources.
Establish scalable, efficient, automated processes for large-scale data analyses.
Support the development of performance dashboards & data sets that will generate the right insight.
Work with business owners and partners to build data sets that answer their specific business questions.
Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision-making across the organization.
Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.
Own the data lake pipelines, maintenance, improvements and schema.
Requirements:
BS or MS degree in Computer Science or a related technical field.
4+ years of Python / Java development experience.
4+ years of experience as a Data Engineer or in a similar role (BI developer).
4+ years of direct experience with SQL (NoSQL is a plus), data modeling, data warehousing, and building ELT/ETL pipelines - must.
Experience working with cloud environments (AWS preferred) and big data technologies (EMR, EC2, S3); dbt is an advantage.
Experience working with Airflow - big advantage.
Experience working with Kubernetes - advantage.
Experience working with at least one of the big data environments: Snowflake, Vertica, Hadoop (Impala/Hive), Redshift, etc. - must.
Experience working with Spark - advantage.
Exceptional troubleshooting and problem-solving abilities.
Excellent verbal/written communication & data presentation skills, including experience communicating to both business and technical teams.
This position is open to all candidates.
 
Job ID: 8125095

Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Our Data Platform Team Lead will drive innovation and build scalable data infrastructure that influences the heart health of thousands of users. In this role, you will manage a team of 4 senior data engineers & BI developers responsible for the end-to-end development of our data products. You will work in close collaboration with other cross-functional teams, including analysts, data scientists, and engineers, to deliver high-quality solutions that drive data insights and business intelligence at scale. As team lead, you will guide and mentor your team, ensuring they grow their skills and deliver impactful solutions. Your leadership will be critical in driving team productivity, technical excellence, and fostering an innovative culture.
Our technology stack: Python, Spark, Airflow, DBT, Kafka, AWS (Glue, EMR, S3, Athena, and more), Docker, Kubernetes, MongoDB, Redis, Postgres, Elasticsearch, and more.
Responsibilities:
Leadership & Team Management: Lead, mentor, and manage a team of 5 senior data engineers. Foster a culture of collaboration, continuous improvement, and technical excellence.
Data Pipeline Development: Oversee the development and optimization of data pipelines and architectures to support our data flows. Ensure data processing is efficient, scalable, and aligned with business goals.
System Integration: Collaborate with cross-functional teams to integrate and optimize systems that retrieve and analyze data to influence user outcomes.
End-to-End Ownership: Take responsibility for the end-to-end development, deployment, and maintenance of our data systems, ensuring they meet business needs and industry standards.
Innovation & Continuous Learning: Stay up-to-date with emerging technologies and best practices. Identify opportunities to innovate and improve data systems and processes.
Collaboration with Stakeholders: Work closely with researchers, architects, and engineers to ensure that data systems align with research objectives and technical requirements.
Requirements:
Bachelor's degree in Computer Science or a related field.
At least 4 years of management experience leading a team of engineers.
7+ years of experience designing and implementing server-side data solutions.
Proven experience creating and optimizing big data processes, pipelines, and architectures.
Experience with AWS ecosystem
Experience working on production grade projects
Experience leading 3rd party POCs in the Data Engineering domain
Experience with Kubernetes in Production
Experience with Python or similar
Experience with Apache Airflow or similar
Familiarity with distributed systems and technologies (Spark, Hadoop, MapReduce)
This position is open to all candidates.
 
Job ID: 8120502