Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Senior Data Engineer
Main responsibilities:
Provide the direction of our data architecture. Determine the right tools for the right jobs. We collaborate on the requirements and then you call the shots on what gets built.
Manage end-to-end execution of high-performance, large-scale data-driven projects, including design, implementation, and ongoing maintenance.
Optimize and monitor the team-related cloud costs.
Design and construct monitoring tools to ensure the efficiency and reliability of data processes.
Implement CI/CD for data workflows.
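The monitoring responsibility above is usually about catching silent pipeline failures. As a minimal sketch (function and threshold names are invented for illustration, not taken from the listing), a freshness and row-count check over a table's latest load might look like this:

```python
from datetime import datetime, timedelta, timezone

def check_table_health(row_count, last_loaded_at, min_rows=1, max_lag=timedelta(hours=24)):
    """Return a list of alert strings for a table's most recent load.

    row_count      -- rows written by the latest run
    last_loaded_at -- UTC timestamp of the latest successful load
    """
    alerts = []
    if row_count < min_rows:
        alerts.append(f"row count {row_count} below threshold {min_rows}")
    lag = datetime.now(timezone.utc) - last_loaded_at
    if lag > max_lag:
        alerts.append(f"data is stale: last load {lag} ago")
    return alerts

# A healthy table loaded an hour ago produces no alerts.
fresh = datetime.now(timezone.utc) - timedelta(hours=1)
print(check_table_health(10_000, fresh))  # []
```

In practice such checks would be scheduled by the orchestrator and routed to a tool like Datadog or Prometheus rather than printed.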
Requirements:
5+ years of experience in data engineering and big data at large scale - Must
Extensive experience with modern data stack - Must:
Snowflake, Delta Lake, Iceberg, BigQuery, Redshift
Kafka, RabbitMQ, or similar for real-time data processing.
PySpark, Databricks
Strong software development background with Python/OOP and hands-on experience in building large-scale data pipelines. - Must
Hands-on experience with Docker and Kubernetes. - Must
Expertise in ETL development, data modeling, and data warehousing best practices.
Knowledge of monitoring & observability (Datadog, Prometheus, ELK, etc)
Experience with infrastructure as code, deployment automation, and CI/CD.
Practices using tools such as Helm, ArgoCD, Terraform, GitHub Actions, and Jenkins.
Our stack: Azure, GCP, Databricks, Snowflake, Airflow, Spark, Kafka, Kubernetes, Neo4j, Aerospike, ELK, Datadog, microservices, Python, SQL
Your stack: Proven strong back-end software engineering skills, the ability to think for yourself and challenge common assumptions, commitment to high-quality execution, and an embrace of collaboration.
This position is open to all candidates.
 
Job ID: 8114405
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Data Infra Engineer
Main responsibilities:
Data Architecture Direction: Provide strategic direction for our data architecture, selecting the appropriate components for various tasks. Collaborate on requirements and make final decisions on system design and implementation.
Project Management: Manage end-to-end execution of high-performance, large-scale data-driven projects, including design, implementation, and ongoing maintenance.
Cost Optimization: Monitor and optimize cloud costs associated with data infrastructure and processes.
Efficiency and Reliability: Design and build monitoring tools to ensure the efficiency, reliability, and performance of data processes and systems.
DevOps Integration: Implement and manage DevOps practices to streamline development and operations, focusing on infrastructure automation, continuous integration/continuous deployment (CI/CD) pipelines, containerization, orchestration, and infrastructure as code. Ensure scalable, reliable, and efficient deployment processes.
Our stack: Azure, GCP, Kubernetes, ArgoCD, Jenkins, Databricks, Snowflake, Airflow, RDBMS, Spark, Kafka, microservices, Bash, Python, SQL.
Requirements:
Mandatory Qualifications:
5+ Years of Experience: Demonstrated experience as a DevOps professional with a strong focus on big data environments, or as a Data Engineer with strong DevOps skills.
Data Components Management: Experience with managing and designing data infrastructure such as Snowflake, PostgreSQL, Kafka, Aerospike, and object stores.
DevOps Expertise: Proven experience in creating, establishing, and managing big data tools, including automation tasks. Extensive knowledge of DevOps concepts and tools, including Docker, Kubernetes, Terraform, ArgoCD, Linux OS, Networking, Load Balancing, Nginx etc.
Programming Skills: Proficiency in programming languages such as Python and Object-Oriented Programming (OOP) languages with an emphasis on the big data processing (like PySpark). Experience with scripting languages like Bash and Shell for automation tasks.
Cloud Platforms: Hands-on experience with major cloud providers such as Azure, Google Cloud, or AWS.
Preferred Qualifications:
Performance Optimization: Experience in optimizing performance for big data tools and pipelines - Big Advantage.
Security Expertise: Experience in identifying and addressing security vulnerabilities within the data platform - Big Advantage.
CI/CD Pipelines: Experience in designing, implementing, and maintaining Continuous Integration/Continuous Deployment (CI/CD) pipelines - Advantage.
Data Pipelines: Experience in building big data pipelines - Advantage.
This position is open to all candidates.
 
Job ID: 8114395
06/04/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are a growing tech company in the automotive space with hubs across the US and Israel. Our mission is to constantly disrupt the industry by creating groundbreaking technologies that help dealers build stronger, more resilient businesses. Our work happens in the fast lane as we bring AI-powered, data-driven solutions to a quickly evolving industry.

Our team consists of curious and creative individuals who are always looking to achieve the impossible. We are bold, collaborative, and goal-driven, and at our core, we believe every voice has value and can impact our bottom line.
If you are a creative, solutions-oriented individual who is ready to put your career in drive, this is the place for you!

We are looking for an experienced Data Engineering Tech Lead to join our team and make a real impact! In this hands-on role, you will drive the architecture, development, and optimization of our Data infrastructure, ensuring scalable and high-performance data solutions that support analytics, AI, and business intelligence needs. You will collaborate closely with analysts, Product, DevOps, and software engineers to cultivate a robust data ecosystem.

This position will report to the CISO and can be based out of Jerusalem or Tel-Aviv.

What you will be responsible for
Lead the design, implementation, and maintenance of our DWH & Data Lake architecture to support both analytical and operational use cases.
Develop scalable ETL/ELT pipelines for ingestion, transformation, and optimization of structured and unstructured data.
Ensure data quality, governance, and security throughout the entire data lifecycle.
Optimize performance and cost-efficiency of data storage, processing, and retrieval.
Work closely with BI and analytics teams to guarantee seamless data integration with visualization tools.
Collaborate with stakeholders (BI teams, Product, and Engineering) to align data infrastructure with business needs.
Mentor and guide analysts, fostering a culture of best practices and professionalism.
Stay updated with industry trends and evaluate new technologies for continuous improvement.
Requirements:
5+ years of experience in data engineering, with at least 2-3 years of experience in a Tech Lead role.
At least 3 years of hands-on experience with AWS, including services like S3, Redshift, Glue, Athena, Lambda, and RDS.
Expertise in DWH & Data Lake architectures, including columnar databases, data partitioning, and lakehouse concepts.
Strong experience with cloud data solutions like Redshift, Snowflake, BigQuery, or Databricks.
Proficiency in ETL/ELT tools (e.g., dbt, Apache Airflow, Glue, Dataflow).
Deep knowledge of SQL & Python for data processing and transformation.
Experience working with BI and visualization tools such as Power BI, Tableau, Looker, or similar.
Experience with real-time data streaming (Kafka, Kinesis, Pub/Sub) and batch processing.
Understanding of data modeling (Star/Snowflake), data governance, and security best practices.
Experience with CI/CD, infrastructure-as-code (Terraform, CloudFormation), and DevOps for data.
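The requirements above call for expertise in data modeling, including star schemas with columnar warehouses. As an illustration only (the tables, columns, and figures are invented; a real warehouse would use Redshift or Snowflake rather than SQLite), a minimal star schema with one fact table and two dimensions looks like this:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A tiny star schema: one fact table referencing two dimension tables.
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
INSERT INTO dim_date VALUES (20250101, '2025-01-01'), (20250102, '2025-01-02');
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO fact_sales VALUES (20250101, 1, 9.5), (20250101, 2, 20.0), (20250102, 1, 5.0);
""")

# Typical analytical query: total revenue per product across all dates.
rows = cur.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 20.0), ('widget', 14.5)]
```

The same shape scales up: facts stay narrow and append-only, dimensions carry the descriptive attributes analysts group by.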
The personal competencies you need to have:
Excellent communication skills and the ability to work as a team.
Strong sense of ownership, urgency, and drive.
Ability to take the initiative, come up with ideas and solutions, and run with them with a "getting things done" attitude.
Ability to work independently and manage tight deadlines.
This position is open to all candidates.
 
Job ID: 8129541
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a talented Senior Data Engineer to join us.
As a Data Engineer, you will be a key member of the data team, at the core of a data-driven company, developing scalable, robust data platforms and data models and providing business intelligence. You will work in an evolving, challenging environment with a variety of data sources, technologies, and stakeholders to deliver the best solutions to support the business and provide operational excellence.
If you are passionate about data, a team player, and proactive, we want to hear from you.
Responsibilities:
Design, Develop & Deploy Data Pipelines and Data Models on various Data Lake / DWH layers
Ingest data from and export data to multiple third-party systems and platforms (e.g., Salesforce, Braze, SurveyMonkey).
Architect and implement data-related microservices and products
Ensure the implementation of best practices in data management, including data lineage, observability, and data contracts.
Maintain, support, and refactor legacy models and layers within the DWH.
Requirements:
5+ years of experience in software development, data engineering, or business intelligence development.
Proficiency in Python - A must.
Advanced SQL skills - A must
Strong background in data modeling, ETL development, and data warehousing - A must.
Experience with big data technologies, particularly Airflow - A must
Familiarity with tools such as Spark, Hive, Airbyte, Kafka, Clickhouse, Postgres, Great Expectations, Data Hub, or Iceberg is advantageous.
General understanding of cloud environments like AWS, GCP, or Azure - A must
Experience with Terraform, Kubernetes (K8S), or ArgoCD is advantageous.
A bachelor's degree in Computer Science, Engineering, or a related field is advantageous but not mandatory.
This position is open to all candidates.
 
Job ID: 8141433
30/03/2025
Location: Tel Aviv-Yafo and Netanya
Job Type: Full Time
We are seeking a highly skilled Senior Data Engineer to join our Data Group and help drive the development and optimization of our cutting-edge data infrastructure.
As a key member of the Platform team, you will play an instrumental role in building and evolving our feature store data pipeline, enabling machine learning teams to efficiently access and work with high-quality, real-time data at scale.
In this dynamic, fast-paced environment, you will collaborate with other data professionals to create robust, scalable data solutions. You will be responsible for architecting, designing, and implementing data pipelines that ensure reliable data ingestion, transformation, and storage, ultimately supporting the production of high-performance ML models.
We are looking for data-driven problem-solvers who thrive in ambiguous, fast-moving environments and are passionate about building data systems that empower teams to innovate and scale. We value independent thinkers with a strong sense of ownership, who can take challenges from concept to production while continuously improving our data infrastructure.
As a Data Engineer you will...
Design and implement large-scale batch & streaming data pipelines infrastructure
Build and optimize data workflows for maximum reliability and performance
Develop solutions for real-time data processing and analytics
Implement data consistency checks and quality assurance processes
Design and maintain state management systems for distributed data processing
Take a crucial role in building the group's engineering culture, tools, and methodologies
Define abstractions, methodologies, and coding standards for the entire Data Engineering pipeline.
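The batch and streaming responsibilities above would in practice be delegated to a framework like Spark or Flink; the core idea behind their windowed aggregations can be sketched in plain Python. This is a simplification under stated assumptions (event names and window size are invented, and it ignores out-of-order events, watermarks, and state checkpointing):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed-size windows and count per key.

    Mirrors a streaming engine's tumbling window, minus out-of-order
    handling, watermarks, and distributed state management.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (30, "click"), (61, "view"), (75, "click")]
print(tumbling_window_counts(events))
# {(0, 'click'): 2, (60, 'view'): 1, (60, 'click'): 1}
```

The hard parts the listing alludes to (consistency checks, state management for distributed processing) are precisely what this toy version leaves out.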
Requirements:
5+ years of experience as a Software Engineer with focus on data engineering
Expert knowledge in building and maintaining data pipelines at scale
Strong experience with stream/batch processing frameworks (e.g. Apache Spark, Flink)
Profound understanding of message brokers (e.g. Kafka, RabbitMQ)
Experience with data warehousing and lake technologies
Strong Python programming skills and experience building data engineering tools
Experience with designing and maintaining Python SDKs
Proficiency in Java for data processing applications
Understanding of data modeling and optimization techniques
Bonus Points
Experience with ML model deployment and maintenance in production
Knowledge of data governance and compliance requirements
Experience with real-time analytics and processing
Understanding of distributed systems and cloud architectures
Experience with data visualization and lineage tools/frameworks and techniques.
This position is open to all candidates.
 
Job ID: 8119235
18/04/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Data Engineering Team Lead to drive the development and optimization of our data infrastructure.

So, what will you be doing all day?

Lead & Mentor: Provide technical leadership and mentorship, setting clear goals, fostering innovation, managing stakeholders and guiding the professional growth of the team.
Data Infrastructure Development: Design and enhance scalable data architectures and pipelines to support current and future business needs.
ETL & Data Processing: Develop and maintain ETL processes, integrating data from various sources (APIs, databases, external platforms) using Python, SQL, and cloud technologies.
Cloud & Big Data Technologies: Implement solutions using PySpark, Databricks, Airflow, and cloud platforms (AWS) to process large-scale datasets efficiently.
Monitoring & Optimization: Ensure system reliability, observability, and performance using monitoring, logging, and alerting tools.
Collaboration & Stakeholder Engagement: Work closely with BI analysts, business stakeholders, and cross-functional teams to define data requirements and enable self-service BI capabilities.
Data Modeling: Design and maintain logical and physical data models to support analytics and operational needs.
Requirements:
BSc in Computer Science, Engineering, or equivalent practical experience.
5+ years of experience in data engineering, with 2+ years in a leadership role.
Technical Expertise:
Strong SQL skills and experience designing scalable and reliable data architectures.
Expertise in building data pipelines and workflows with tools like Airflow, DBT, and Databricks.
Experience with cloud-based data warehouses such as Snowflake, Redshift, BigQuery, or Databricks.
Proficiency in Python for data engineering and automation.
Familiarity with infrastructure-as-code tools like Terraform, Kubernetes (K8s), Docker, or Nomad.
Soft Skills:
Strong problem-solving abilities and a passion for learning new technologies.
Excellent communication and collaboration skills, with the ability to work independently and in a team.
This position is open to all candidates.
 
Job ID: 8142388
5 days ago
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Senior Data Engineer (Analytics)
Tel Aviv
As a Senior Data Engineer, you will play a key role in shaping and driving our analytics data pipelines and solutions to empower business insights and decisions. Collaborating with a variety of stakeholders, you will design, develop, and optimize scalable, high-performance data analytics infrastructures using modern tools and technologies. Your work will ensure data is accurate, timely, and actionable for critical decision-making.
Key Responsibilities:
Lead the design, development, and maintenance of robust data pipelines and ETL processes, handling diverse structured and unstructured data sources.
Collaborate with data analysts, data scientists, product engineers and product managers to deliver impactful data solutions.
Architect and maintain the infrastructure for ingesting, processing, and managing data in the analytics data warehouse.
Develop and optimize analytics-oriented data models to support business decision-making.
Champion data quality, consistency, and governance across the analytics layer.
What your day might look like:
Leading the design and implementation of scalable data pipelines to support analytical workloads.
Collaborating with stakeholders to gather requirements, propose solutions, and align on data strategies.
Writing and optimizing ETL processes to ensure seamless integration of new data sources.
Designing analytics-focused data modeling solutions tailored for strategic decision-making.
Troubleshooting data issues and implementing measures to improve system reliability and accuracy.
Sharing knowledge and mentoring team members to promote a culture of learning and excellence.
This is an opportunity to make a significant impact by enabling data-driven decision-making at scale while growing your career in a dynamic, collaborative environment.
Requirements:
5+ years of experience as a Data Engineer or in a similar role.
Expertise in SQL and proficiency in Python for data engineering tasks.
Proven experience designing and implementing analytics-focused data models and warehouses.
Hands-on experience with data pipelines and ETL/ELT frameworks (e.g., Airflow, Luigi, AWS Glue, dbt).
Strong experience with cloud data services (e.g., AWS, GCP, Azure).
A deep passion for data and a strong analytical mindset with attention to detail.
Bonus points:
Strong understanding of business metrics and how to translate data into actionable insights
Experience with data visualization tools (e.g., Tableau, Power BI, Looker)
Familiarity with data governance and data quality best practices
Excellent communication skills to work with cross-functional teams including data analysts, data scientists, and product managers.
This position is open to all candidates.
 
Job ID: 8153675
18/04/2025
Location: Tel Aviv-Yafo
Job Type: Full Time and Temporary
We are looking for a Data Engineer to join our team and play a key role in designing, building, and maintaining scalable, cloud-based data pipelines. You will work with AWS (Redshift, S3, Glue, Managed Airflow, Lambda) to integrate, process, and analyze large datasets, ensuring data reliability and efficiency.

Your work will directly impact business intelligence, analytics, and data-driven decision-making across the organization.

What You'll Do:
ETL & Data Processing: Develop and maintain ETL processes, integrating data from various sources (APIs, databases, external platforms) using Python, SQL, and cloud technologies.
Cloud & Big Data Technologies: Implement solutions using PySpark, Databricks, Airflow, and cloud platforms (AWS) to process large-scale datasets efficiently.
Data Modeling: Design and maintain logical and physical data models to support business needs.
Optimization & Scalability: Improve process efficiency and optimize runtime performance to handle large-scale data workloads.
Collaboration: Work closely with BI analysts and business stakeholders to define data requirements and functional specifications.
Monitoring & Troubleshooting: Ensure data integrity and reliability by proactively monitoring pipelines and resolving issues.
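The ETL responsibility above follows one pattern regardless of tooling: extract raw records, transform and validate them, then load into a warehouse table. A minimal end-to-end sketch using only the standard library (the source records and target table are made up for illustration; a real pipeline would pull from APIs or databases and load into Redshift):

```python
import sqlite3

# Extract: in a real pipeline this would read from an API, database, or S3 file.
raw_records = [
    {"user_id": "1", "amount": "19.90", "currency": "usd"},
    {"user_id": "2", "amount": "5.00",  "currency": "USD"},
    {"user_id": "",  "amount": "3.10",  "currency": "USD"},  # invalid: no user
]

# Transform: cast types, normalize casing, drop invalid rows.
clean = [
    (int(r["user_id"]), float(r["amount"]), r["currency"].upper())
    for r in raw_records
    if r["user_id"]
]

# Load: write the transformed rows into a warehouse-style table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (user_id INTEGER, amount REAL, currency TEXT)")
conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", clean)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(round(total, 2))  # 24.9
```

Frameworks like Glue or Airflow add scheduling, retries, and monitoring around exactly these three steps.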
Requirements:
BSc in Computer Science, Engineering, or equivalent practical experience.
3+ years of experience in data engineering or related roles.
Technical Expertise:
Proficiency in Python for data engineering and automation.
Experience with Big Data technologies such as Spark, Databricks, DBT, and Airflow.
Hands-on experience with AWS services (S3, Redshift, Glue, Managed Airflow, Lambda)
Knowledge of Docker, Terraform, Kubernetes, and infrastructure automation.
Strong understanding of data warehouse (DWH) methodologies and best practices.
Soft Skills:
Strong problem-solving abilities and a proactive approach to learning new technologies.
Excellent communication and collaboration skills, with the ability to work independently and in a team.
This position is open to all candidates.
 
Job ID: 8142386
17/04/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
In this role, you will design and develop scalable data solutions, optimize data workflows, and support critical business processes. You will work with a variety of databases and big data tools in a cloud environment, focusing on data modeling, governance, and analytics.

What you'll be doing:

Design, develop, and maintain end-to-end ETL pipelines, from gathering business requirements to implementation.
Work with multiple database technologies, especially BigQuery.
Optimize data models (DWH, fact & dimension tables, RI, SCDs) for performance and scalability.
Implement data governance best practices and maintain comprehensive documentation.
Utilize Big Data tools in cloud environments (GCP preferred).
Develop and support complex business workflows and data processes.
Design and implement monitoring systems to ensure data quality throughout the pipeline.
Orchestrate workflows using Apache Airflow.
Collaborate with analysts and stakeholders to ensure high-quality data for business insights.
Support and optimize Tableau infrastructure for data visualization.
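One of the modeling techniques the responsibilities above mention, the Type 2 slowly changing dimension (SCD), preserves history by closing the current row and opening a new versioned one. A minimal sketch of the idea (the record layout and dates are hypothetical, and a real warehouse would express this as a SQL MERGE):

```python
def scd2_apply(dim_rows, key, new_attrs, as_of):
    """Apply a change to a Type 2 dimension kept as a list of dicts.

    Each row carries valid_from/valid_to; the current row has valid_to=None.
    """
    for row in dim_rows:
        if row["key"] == key and row["valid_to"] is None:
            if row["attrs"] == new_attrs:
                return dim_rows  # no change, nothing to do
            row["valid_to"] = as_of  # close the old version
            break
    dim_rows.append({"key": key, "attrs": new_attrs,
                     "valid_from": as_of, "valid_to": None})
    return dim_rows

dim = [{"key": 1, "attrs": {"city": "Haifa"},
        "valid_from": "2024-01-01", "valid_to": None}]
scd2_apply(dim, 1, {"city": "Tel Aviv"}, "2025-04-01")
print(len(dim), dim[0]["valid_to"])  # 2 2025-04-01
```

Fact tables then join on the surrogate row that was valid at the fact's timestamp, which is what makes point-in-time reporting possible.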
Requirements:
4+ years of experience in Data Engineering.
Strong SQL skills and expertise in BigQuery or similar databases.
4+ years of Python experience for data processing and automation.
Proven experience in designing complex business workflows and data processes.
Deep understanding of data modeling principles and best practices.
Hands-on experience with cloud-based big data tools (GCP preferred).
Must have experience with Apache Airflow for orchestrating data workflows.
Strong analytical skills with the ability to translate business needs into technical solutions.
Experience with Tableau infrastructure management is an advantage.
Excellent communication skills and ability to work cross-functionally.
This position is open to all candidates.
 
Job ID: 8142109
07/04/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are a venture-backed, fast-growing startup with a mission to use the power of Artificial Intelligence (AI) to give everyone access to higher-quality healthcare at more affordable costs. We're looking for mission-driven individuals to join our team and help us eliminate healthcare inequalities to build a better and healthier future.

Featured most recently in Forbes and Business Insider as a leading AI startup, we are a telehealth company that harnesses the power of technology to provide the smartest digital healthcare platform to patients, hospital systems, and providers across the United States. Our AI-powered application brings together the knowledge of thousands of doctors and anonymized medical data to provide the highest quality care to our patients. We offer a free symptom checker, 24/7 access to board-certified doctors, the ability to refill prescriptions from your phone, and more. All within one application - no insurance or preauthorization required.
The company was founded in 2016 and has partnered with visionary, leading hospital systems and providers such as Cedars-Sinai, Mayo Clinic, and Elevance Health. Join us on our mission to provide better healthcare for less.

About the Role
Our data engineering team is looking for an experienced professional with expertise in SQL, Python, and strong data modeling skills. In this role, you will be at the heart of our data ecosystem, designing and maintaining high-quality data pipelines and models that drive decision-making across the organization.

You will play a key role in ensuring data quality, building scalable systems, and supporting cross-functional teams with clean, accurate, and actionable data.

What you will do:
Design, develop, and optimize scalable data pipelines, ensuring data is clean, accurate, and ready for analysis.
Build and maintain robust data models that support clinical, business intelligence and operational needs.
Work closely with data analysts, data scientists, engineers and cross-functional teams to understand data requirements and deliver high-quality solutions.
Implement and enforce data quality standards, monitoring, and best practices across systems and pipelines.
Collaborate on the design and implementation of database schemas to support analytical and transactional workloads.
Manage and optimize large-scale data storage and processing systems to ensure reliability and performance.
Requirements:
At least 5 years of experience as a data engineer / BI developer / data developer.
Python Proficiency: Proven ability to build data processing scripts, automation, and integration tools using Python.
SQL Expertise: Deep experience in crafting complex queries, optimizing performance, and working with large datasets.
Strong knowledge of data modeling principles and best practices for relational and dimensional data structures.
A passion for maintaining data quality and ensuring trust in business-critical data.
Bonus points:
Experience with cloud platforms (e.g., AWS, GCP, or Azure) for data engineering and storage solutions.
Familiarity with dbt (data build tool) or Dataform for transformation pipelines and data modeling.
Knowledge of data orchestration tools (e.g., Airflow)
Understanding of CI/CD practices and version control systems.
This position is open to all candidates.
 
Job ID: 8130988
מודים לך שלקחת חלק בשיפור התוכן שלנו :)
02/04/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Mid-Level Data Engineer with experience in data pipeline development, data modeling, and modern cloud-based tools to join our team and drive data solutions at scale.
Design, implement, and maintain robust ETL/ELT pipelines using tools like Airflow, dbt, or similar.
Develop, optimize, and maintain SQL scripts and queries for large-scale data processing.
Work extensively with cloud-based data platforms such as Snowflake and BigQuery.
Build and enhance data models to support analytics and reporting requirements.
Automate data workflows and processes using Python or other programming languages.
Monitor and improve data pipeline performance, reliability, and scalability.
Collaborate with cross-functional teams to ensure alignment on data architecture and solutions.
Requirements:
Strong understanding of data modeling concepts (e.g., star schema, snowflake schema, normalization).
Proficiency in SQL and Python, with hands-on experience in data workflows and automation.
Experience working with modern data tools like dbt, Airflow, and cloud data warehouses.
Knowledge of data governance and best practices for ensuring data quality and integrity.
Ability to troubleshoot and resolve pipeline and data processing issues.
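Orchestrators like Airflow and dbt, named in the requirements above, all rest on the same primitive: running tasks in dependency order via a topological sort of the task graph. A sketch with the standard library (the task names are invented and mimic dbt's staging/fact naming convention):

```python
from graphlib import TopologicalSorter

# Task graph: each task maps to the set of tasks it depends on.
deps = {
    "extract_orders": set(),
    "extract_users":  set(),
    "stg_orders":     {"extract_orders"},
    "stg_users":      {"extract_users"},
    "fct_revenue":    {"stg_orders", "stg_users"},
}

# static_order yields a valid execution order; independent tasks
# (the two extracts) could in practice run in parallel.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

An orchestrator adds scheduling, retries, and backfills on top, but a failed run is still debugged by walking this graph upstream.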
This position is open to all candidates.
 
Job ID: 8125899