Data Engineer Job Listings

Posted: 18 hours ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Data Engineer to join our BI team within the R&D department. This role centers on building and maintaining robust data pipelines and enabling data-driven decision-making across all departments.

Responsibilities:
Develop and maintain ETL processes and data pipelines to ensure smooth, accurate data flow.
Design, build, and manage Airflow DAGs to orchestrate complex data workflows and ensure reliable execution of ETL and analytics tasks.
Design and optimize relational databases, including conceptual and physical data modeling (ERD), schema design, and performance tuning to support scalable analytics and business applications.
Collaborate with cross-functional teams to translate business needs into technical solutions.
Build and maintain algorithm-driven automation processes that optimize workflows and reduce manual effort.
Build and maintain custom data scripts for integrations with brand and publisher partners.
Support data visualization efforts through BI tools.
Gain in-depth understanding of discrepancy processes across departments and improve data accuracy.
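The Airflow orchestration described above boils down to running tasks in dependency order. A minimal sketch, assuming nothing about the actual pipelines (the task names and logic here are invented, and Python's stdlib `graphlib` stands in for Airflow's scheduler):

```python
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """Run each task after its upstream dependencies, like a tiny DAG run.

    tasks: name -> callable taking the results accumulated so far
    deps:  name -> set of upstream task names
    """
    order = list(TopologicalSorter(deps).static_order())
    results = {}
    for name in order:
        results[name] = tasks[name](results)
    return order, results

# Invented extract/transform/load steps, standing in for real ETL tasks.
tasks = {
    "extract": lambda r: [{"id": 1, "amount": "10"}, {"id": 2, "amount": "5"}],
    "transform": lambda r: [{**row, "amount": int(row["amount"])} for row in r["extract"]],
    "load": lambda r: sum(row["amount"] for row in r["transform"]),
}
deps = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}

order, results = run_pipeline(tasks, deps)
print(order)            # ['extract', 'transform', 'load']
print(results["load"])  # 15
```

Airflow adds scheduling, retries, and backfills on top of exactly this dependency-ordering idea.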
Requirements:
3+ years of hands-on experience in data engineering or BI development.
Strong SQL skills (complex queries, stored procedures).
Practical experience with ETL processes and data pipelines.
Experience with Airflow, Dagster, or similar orchestration tools.
1+ years of hands-on experience with Python.
Knowledge of database architecture (ERD, OLTP/OLAP models).
Strong analytical mindset with the ability to translate business requirements into data solutions.
B.Sc. degree in Information Systems Engineering, Industrial Engineering, or a related field.
Experience with dbt, BigQuery, or AWS is an advantage.
This position is open to all candidates.
 
Job ID: 8476306
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an experienced BI Data Engineer to join our Data team within the Information Systems group.
In this role, you will be responsible for building and maintaining scalable, high-quality data pipelines, models, and infrastructure that support business operations across the entire company, with a primary focus on GTM domains.
You will take ownership of core data architecture components, ensuring data consistency, reliability, and accessibility across all analytical and operational use cases.
Your work will include designing data models, orchestrating transformations, developing internal data applications, and ensuring that business processes are accurately represented in the data.
This role requires a combination of deep technical expertise and strong understanding of business operations.
You will collaborate closely with analysts, domain experts, and engineering teams to translate complex business processes into robust, scalable data solutions. If you are passionate about data architecture, building end-to-end data systems, and solving complex engineering challenges that directly impact the business, we'd love to meet you!
Key Responsibilities:
Design, develop, and maintain end-to-end data pipelines, ensuring scalability, reliability, and performance.
Build, optimize, and evolve core data models and semantic layers that serve as the organization's single source of truth.
Implement robust ETL/ELT workflows using Snowflake, dbt, Rivery, and Python.
Develop internal data applications and automation tools to support advanced analytics and operational needs.
Ensure high data quality through monitoring, validation frameworks, and governance best practices.
Improve and standardize data modeling practices, naming conventions, and architectural guidelines.
Continuously evaluate and adopt new technologies, features, and tooling across the data engineering stack.
Collaborate with cross-functional stakeholders to deeply understand business processes and translate them into scalable technical solutions.
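The ELT workflow mentioned above follows a load-then-transform pattern: raw data lands in the warehouse first, and SQL models (dbt's role in this stack) derive the "single source of truth" tables. A toy sketch with sqlite3 standing in for Snowflake; the table names and data are invented:

```python
import sqlite3

con = sqlite3.connect(":memory:")

# "EL": raw data is loaded as-is, warts and all.
con.execute("CREATE TABLE raw_orders (order_id INT, customer TEXT, amount REAL, status TEXT)")
con.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?, ?)",
    [(1, "acme", 120.0, "paid"), (2, "acme", 80.0, "refunded"), (3, "globex", 200.0, "paid")],
)

# "T": a derived model, analogous to a dbt model defined in SQL,
# that downstream consumers treat as the source of truth for revenue.
con.execute("""
    CREATE VIEW fct_revenue AS
    SELECT customer, SUM(amount) AS revenue
    FROM raw_orders
    WHERE status = 'paid'
    GROUP BY customer
""")

rows = con.execute("SELECT customer, revenue FROM fct_revenue ORDER BY customer").fetchall()
print(rows)  # [('acme', 120.0), ('globex', 200.0)]
```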
Requirements:
5+ years of experience in BI data engineering, data engineering, or a similar data development role.
Bachelor's degree in Industrial Engineering, Statistics, Mathematics, Economics, Computer Science, or a related field (required).
Strong SQL expertise and extensive hands-on experience with ETL/ELT development (required).
Proficiency with Snowflake, dbt, Python, and modern data engineering workflows (essential).
Experience building and maintaining production-grade data pipelines using orchestration tools (e.g., Rivery, Airflow, Prefect) is an advantage.
Experience with cloud platforms, CI/CD, or DevOps practices for data is an advantage.
Skills and Attributes:
Strong understanding of business processes and the ability to design data solutions that accurately represent real-world workflows.
Strong analytical and problem-solving skills, with attention to engineering quality and performance.
Ability to manage and prioritize tasks in a fast-paced environment.
Excellent communication skills in Hebrew and English.
Ownership mindset, curiosity, and a passion for building high-quality data systems.
This position is open to all candidates.
 
Job ID: 8441718
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an experienced and visionary Data Platform Engineer to help design, build and scale our BI platform from the ground up.

In this role, you will be responsible for building the foundations of our data analytics platform, enabling scalable data pipelines and robust data modeling to support real-time and batch analytics, ML models, and business insights that serve both business intelligence and product needs.

You will be part of the R&D team, collaborating closely with engineers, analysts, and product managers to deliver a modern data architecture that supports internal dashboards and future-facing operational analytics.

If you enjoy architecting from scratch, turning raw data into powerful insights, and owning the full data lifecycle, this role is for you!

Responsibilities
Take full ownership of the design and implementation of a scalable and efficient BI data infrastructure, ensuring high performance, reliability and security.

Lead the design and architecture of the data platform from integration to transformation, modeling, storage, and access.

Build and maintain ETL/ELT pipelines, batch and real-time, to support analytics, reporting, and product integrations.

Establish and enforce best practices for data quality, lineage, observability, and governance to ensure accuracy and consistency.

Integrate modern tools and frameworks such as Airflow, dbt, Databricks, Power BI, and streaming platforms.

Collaborate cross-functionally with product, engineering, and analytics teams to translate business needs into data infrastructure.

Promote a data-driven culture: advocate for data-driven decision-making across the company by empowering stakeholders with reliable, self-service data access.
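Since the role covers both batch and real-time pipelines, it may help to see the two modes side by side. A deliberately tiny sketch (the event values are invented; real streaming would sit on Kafka or Spark Streaming, as the requirements note):

```python
def batch_total(events):
    """Batch mode: process the whole dataset in one pass."""
    return sum(e["value"] for e in events)

def micro_batches(events, size):
    """Streaming mode, approximated as micro-batches: emit a partial
    result every `size` events instead of waiting for all of them."""
    batch, flushed = [], []
    for e in events:
        batch.append(e)
        if len(batch) == size:
            flushed.append(sum(x["value"] for x in batch))
            batch = []
    if batch:  # flush the final partial batch
        flushed.append(sum(x["value"] for x in batch))
    return flushed

events = [{"value": v} for v in [3, 1, 4, 1, 5, 9]]
print(batch_total(events))       # 23
print(micro_batches(events, 2))  # [4, 5, 14] -- same total, delivered incrementally
```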
Requirements:
5+ years of hands-on experience in data engineering and in building data products for analytics and business intelligence.

Proven track record of designing and implementing large-scale data platforms or ETL architectures from the ground up.

Strong hands-on experience with ETL tools and data warehouse/lakehouse products (Airflow, Airbyte, dbt, Databricks).

Experience supporting both batch pipelines and real-time streaming architectures (e.g., Kafka, Spark Streaming).

Proficiency in Python, SQL, and cloud data engineering environments (AWS, Azure, or GCP).

Familiarity with data visualization tools like Power BI, Looker, or similar.

BSc in Computer Science or a related field from a leading university.
This position is open to all candidates.
 
Job ID: 8423261
Posted: 21/12/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking an experienced Senior Data Engineer to join our dynamic data team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure, ensuring the availability, reliability, and quality of our data. This role requires strong technical expertise, problem-solving skills, and the ability to collaborate across teams to deliver data-driven solutions.
Key Responsibilities:
* Design, implement, and maintain robust, scalable, and high-performance data pipelines and ETL processes.
* Develop and optimize data models, schemas, and storage solutions to support analytics and machine learning initiatives.
* Collaborate with software engineers and product managers to understand data requirements and deliver high-quality solutions.
* Ensure data quality, integrity, and governance across multiple sources and systems.
* Monitor and troubleshoot data workflows, resolving performance and reliability issues.
* Evaluate and implement new data technologies and frameworks to improve the data platform.
* Document processes, best practices, and data architecture.
* Mentor junior data engineers and contribute to team knowledge sharing.
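The data-quality responsibility above usually means codified checks rather than manual review. A minimal sketch of declarative row-level validation (the rule names and sample rows are invented):

```python
def validate(rows, rules):
    """Return (row_index, rule_name) for every rule a row violates."""
    failures = []
    for i, row in enumerate(rows):
        for name, check in rules.items():
            if not check(row):
                failures.append((i, name))
    return failures

# Invented rules of the kind a real validation framework would hold.
rules = {
    "id_present": lambda r: r.get("id") is not None,
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}
rows = [{"id": 1, "amount": 10}, {"id": None, "amount": -5}]

failures = validate(rows, rules)
print(failures)  # [(1, 'id_present'), (1, 'amount_non_negative')]
```

In production this shape typically grows into tooling such as dbt tests or Great Expectations suites.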
Requirements:
Required Qualifications:
* Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
* 5+ years of experience in data engineering, ETL development, or a similar role.
* Strong proficiency in SQL and experience with relational and NoSQL databases.
* Experience with data pipeline frameworks and tools such as Apache Spark, Airflow, and Kafka (must).
* Familiarity with cloud platforms (AWS, GCP, or Azure) and their data services.
* Solid programming skills in Python, Java, or Scala.
* Strong problem-solving, analytical, and communication skills.
* Knowledge of data governance, security, and compliance standards.
* Experience with data warehousing, big data technologies, and data modeling best practices; familiarity with engines such as ClickHouse, SingleStore, or StarRocks.
Preferred Qualifications (Advantage):
* Familiarity with Machine Learning workflows and MLOps practices.
* Experience with data lakehouse architectures and technologies such as Apache Iceberg.
* Experience working with data ecosystems in open-source/on-premise environments.
Why Join Us:
* Work with cutting-edge technologies and large-scale data systems.
* Collaborate with a talented and innovative team.
* Opportunities for professional growth and skill development.
* Make a direct impact on data-driven decision-making across the organization.
This position is open to all candidates.
 
Job ID: 8401647
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Senior Data Engineer to join our Data team - someone who's passionate about building reliable, scalable data infrastructure and thrives on solving complex technical challenges.
In this role, you'll own the design and development of end-to-end data pipelines that power analytics and data-driven decision-making.
You'll collaborate closely with both business and technical stakeholders to ensure data flows smoothly, accurately, and efficiently across the company.
What You Will Do:
Design, implement, and maintain large-scale ETL and ELT pipelines using modern data frameworks and cloud technologies.
Work with Redshift data warehouses to design efficient schemas and optimize performance.
Build and manage data ingestion processes from multiple sources - APIs, SaaS platforms, internal systems, and databases.
Collaborate with stakeholders to deliver clean, well-modeled, and high-quality data.
Build and evolve a modern, efficient, and scalable data warehouse architecture.
Ensure observability, monitoring, and testing across all data processes.
Apply best practices in CI/CD, version control (Git), and data quality validation.
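The dimensional-modeling work above can be pictured as a star schema: a fact table of events joined to descriptive dimension tables. A toy version, with sqlite3 standing in for Redshift and invented table contents:

```python
import sqlite3

# Star schema in miniature: dim_product describes products,
# fct_sales records events keyed to them. Names are illustrative.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_key INT PRIMARY KEY, category TEXT);
    CREATE TABLE fct_sales (product_key INT, quantity INT);
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fct_sales VALUES (1, 3), (1, 2), (2, 7);
""")

# The canonical star-schema query: aggregate facts by a dimension attribute.
rows = con.execute("""
    SELECT d.category, SUM(f.quantity)
    FROM fct_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # [('books', 5), ('games', 7)]
```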
Requirements:
5+ years of experience as a Data Engineer or ETL Developer, building large-scale data pipelines in a cloud environment (AWS, GCP, or Azure).
Strong SQL expertise, including query optimization and data modeling.
Hands-on experience with ETL/ELT tools such as Matillion, Rivery, SSIS, Talend, or similar.
Solid understanding of data warehouse concepts and dimensional modeling.
Excellent analytical and problem-solving skills.
A collaborative mindset and the ability to work cross-functionally with internal teams.
A self-starter and agile learner who thrives in a fast-paced, dynamic environment.
AI/data-related development capabilities: experience building or integrating AI-driven data solutions is a plus.
Nice to Have:
Experience with Redshift and Matillion - big advantage.
Experience with BI tools such as Qlik or Power BI - big advantage.
Familiarity with CI/CD pipelines.
This position is open to all candidates.
 
Job ID: 8435478
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
About the Role:
We are seeking an experienced Senior Data Engineer to join our dynamic data team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure, ensuring the availability, reliability, and quality of our data. This role requires strong technical expertise, problem-solving skills, and the ability to collaborate across teams to deliver data-driven solutions.
Key Responsibilities:

Design, implement, and maintain robust, scalable, and high-performance data pipelines and ETL processes.
Develop and optimize data models, schemas, and storage solutions to support analytics and machine learning initiatives.
Collaborate with software engineers and product managers to understand data requirements and deliver high-quality solutions.
Ensure data quality, integrity, and governance across multiple sources and systems.
Monitor and troubleshoot data workflows, resolving performance and reliability issues.
Evaluate and implement new data technologies and frameworks to improve the data platform.
Document processes, best practices, and data architecture.
Mentor junior data engineers and contribute to team knowledge sharing.
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering, ETL development, or a similar role.
Strong proficiency in SQL and experience with relational and NoSQL databases.
Experience with data pipeline frameworks and tools such as Apache Spark, Airflow, and Kafka (must).
Familiarity with cloud platforms (AWS, GCP, or Azure) and their data services.
Solid programming skills in Python, Java, or Scala.
Strong problem-solving, analytical, and communication skills.
Knowledge of data governance, security, and compliance standards.
Experience with data warehousing, big data technologies, and data modeling best practices; familiarity with engines such as ClickHouse, SingleStore, or StarRocks.
This position is open to all candidates.
 
Job ID: 8437853
Posted: 14/12/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We are seeking a Data Engineer to join our dynamic data team. In this role, you will design, build, and maintain robust data systems and infrastructure that support data collection, processing, and analysis. Your expertise will be crucial in developing scalable data pipelines, ensuring data quality, and collaborating with cross-functional teams to deliver actionable insights.

Key Responsibilities:
Design, develop, and maintain scalable ETL processes for data transformation and integration.
Build and manage data pipelines to support analytics and operational needs.
Ensure data accuracy, integrity, and consistency across various sources and systems.
Collaborate with data scientists and analysts to support AI model deployment and data-driven decision-making.
Optimize data storage solutions, including data lakehouses and databases, to enhance performance and scalability.
Monitor and troubleshoot data workflows to maintain system reliability.
Stay updated with emerging technologies and best practices in data engineering.

Please note that this role is on a hybrid model of 4 days/week in our Tel-Aviv office.
Requirements:
3+ years of experience in data engineering or a related role within a production environment.
Proficiency in Python and SQL.
Experience with both relational (e.g., PostgreSQL) and NoSQL databases (e.g., MongoDB, Elasticsearch).
Familiarity with AWS big data tools and frameworks such as Glue, EMR, and Kinesis.
Experience with containerization tools like Docker and Kubernetes.
Strong understanding of data warehousing concepts and data modeling.
Excellent problem-solving skills and attention to detail.
Strong communication skills, with the ability to work collaboratively in a team environment.

Preferred Qualifications:
Experience with machine learning model deployment and MLOps practices.
Knowledge of data visualization tools and techniques.
Practical experience democratizing the company's data to enhance decision-making.
Bachelor's degree in Computer Science, Engineering, or a related field.
This position is open to all candidates.
 
Job ID: 8456661
Posted: 07/12/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a versatile, talented, and highly motivated Data Engineer to join our growing team.

If you're passionate about solving complex problems, thrive in dynamic environments, and love working at the intersection of data engineering, machine learning infrastructure, and AI innovation, this role is for you.

As a Data Engineer, you'll play a key role in shaping how data flows through the company, from building scalable pipelines and robust infrastructure to powering data science models and enabling internal teams with intelligent GenAI-powered tools. This is a hands-on, high-impact role with plenty of room for ownership, creativity, and growth.

This is a high-impact role where your work will shape how the company leverages data and AI. If you want to build, innovate, and push boundaries in a collaborative and fast-moving environment, we'd love to meet you.

Responsibilities
Own the entire data lifecycle from understanding business needs and building reliable pipelines to ensuring data quality, observability, and performance.
Design, build, and scale modern data infrastructure including data lakes, warehouses, and complex ETL/ELT pipelines.
Integrate and consolidate diverse data sources (CRMs, APIs, databases, SaaS platforms) into a single, trusted source of truth.
Implement and manage CI/CD, observability, and infrastructure-as-code in a cloud-native environment.
Work with the data science team on their ML pipelines, giving data scientists the infrastructure and automation they need to deploy models to production with speed and confidence.
Collaborate with cross-functional teams to embed GenAI agents into business processes, creating smart workflows that boost efficiency and reduce manual work.
Develop frameworks and internal tooling that empower other teams to safely adopt AI and accelerate innovation.
Optimize data infrastructure for performance and cost-efficiency, with a focus on BigQuery optimization.
Ensure high data quality and integrity across large-scale ETL processes.
Work closely with analysts, data scientists, and product managers to support data modeling, governance, and analytical initiatives.
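On the BigQuery-optimization point: partitioning cuts cost because BigQuery bills by bytes scanned, and a filter on the partition key lets the engine skip whole partitions. A toy simulation of that pruning (the data layout is invented):

```python
from datetime import date

# A "table" stored as one partition per day, each holding rows of known size.
table = {
    date(2025, 1, 1): [{"bytes": 100}] * 3,
    date(2025, 1, 2): [{"bytes": 100}] * 5,
    date(2025, 1, 3): [{"bytes": 100}] * 2,
}

def bytes_scanned(table, day=None):
    """No filter: scan everything. Filter on the partition key: scan one partition."""
    parts = [table[day]] if day is not None else table.values()
    return sum(row["bytes"] for part in parts for row in part)

full = bytes_scanned(table)
pruned = bytes_scanned(table, date(2025, 1, 2))
print(full, pruned)  # 1000 500
```

Clustering and caching refine the same principle: arrange data so queries touch fewer bytes.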
Requirements:
5+ years of experience as a Data Engineer.
Strong programming skills in Python and SQL, with a focus on clean, maintainable, production-grade code.
Proven experience building data pipelines with Airflow.
Hands-on experience with modern analytical databases.
Experience working with cloud platforms.
Solid knowledge of data modeling, database design, and performance optimization.
Strong problem-solving abilities, analytical mindset, and attention to detail.
Experience working in production-grade environments.
Excellent communication and collaboration skills.
Familiarity with modern CI/CD, observability, and infrastructure-as-code practices.
Experience with Kubernetes, Docker, and Terraform.
This position is open to all candidates.
 
Job ID: 8446375
Posted: 17/12/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Design, implement, and maintain robust data pipelines and ETL/ELT processes on GCP (BigQuery, Dataflow, Pub/Sub, etc.).
Build, orchestrate, and monitor workflows using Apache Airflow / Cloud Composer.
Develop scalable data models to support analytics, reporting, and operational workloads.
Apply software engineering best practices to data engineering: modular design, code reuse, testing, and version control.
Manage GCP resources (BigQuery reservations, Cloud Composer/Airflow DAGs, Cloud Storage, Dataplex, IAM).
Optimize data storage, query performance, and cost through partitioning, clustering, caching, and monitoring.
Collaborate with DevOps/DataOps to ensure data infrastructure is secure, reliable, and compliant.
Partner with analysts and data scientists to understand requirements and translate them into efficient data solutions.
Mentor junior engineers, provide code reviews, and promote engineering best practices.
Act as a subject matter expert for GCP data engineering tools and services.
Define and enforce standards for metadata, cataloging, and data documentation.
Implement monitoring and alerting for pipeline health, data freshness, and data quality.
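The freshness-monitoring bullet above reduces to comparing each table's last successful load against its SLA. A minimal sketch (the table names and SLAs are invented):

```python
from datetime import datetime, timedelta

def stale_tables(last_loaded, slas, now):
    """Return tables whose latest load is older than their freshness SLA."""
    return sorted(t for t, ts in last_loaded.items() if now - ts > slas[t])

# Illustrative state a monitoring job might read from pipeline metadata.
now = datetime(2025, 1, 2, 12, 0)
last_loaded = {
    "orders": datetime(2025, 1, 2, 11, 30),  # 30 minutes old
    "events": datetime(2025, 1, 1, 9, 0),    # 27 hours old
}
slas = {"orders": timedelta(hours=1), "events": timedelta(hours=24)}

alerts = stale_tables(last_loaded, slas, now)
print(alerts)  # ['events']
```

In a real deployment the output would feed an alerting channel rather than a print.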
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
6+ years of professional experience in data engineering or similar roles, with 3+ years of hands-on work in a cloud environment, preferably on GCP.
Strong proficiency with BigQuery, Dataflow (Apache Beam), Pub/Sub, and Cloud Composer (Airflow).
Expert-level Python development skills, including object-oriented programming (OOP), testing, and code optimization.
Strong data modeling skills (dimensional modeling, star/snowflake schemas, normalized/denormalized designs).
Solid SQL expertise and experience with data warehousing concepts.
Familiarity with CI/CD, Terraform/Infrastructure as Code, and modern data observability tools.
Exposure to AI tools and methodologies (e.g., Vertex AI).
Strong problem-solving and analytical skills.
Ability to communicate complex technical concepts to non-technical stakeholders.
Experience working in agile, cross-functional teams.
This position is open to all candidates.
 
Job ID: 8462182
Posted: 07/12/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Engineer.
As a Senior Data Engineer, your mission is to build the scalable, reliable data foundation that empowers Yotpo to make data-driven decisions. You will serve as a bridge between complex business needs and technical implementation, translating raw data into high-value assets.
You will own the entire data lifecycle, from ingestion to insight, ensuring that our analytics infrastructure scales as fast as our business.
Key Responsibilities:
Strategic Data Modeling: Translate complex business requirements into efficient, scalable data models and schemas. You will design the logic that turns raw events into actionable business intelligence.
Pipeline Architecture: Design, implement, and maintain resilient data pipelines that serve multiple business domains. You will ensure data flows reliably, securely, and with low latency across our ecosystem.
End-to-End Ownership: Own the data development lifecycle completely, from architectural design and testing to deployment, maintenance, and observability.
Cross-Functional Partnership: Partner closely with Data Analysts, Data Scientists, and Software Engineers to deliver end-to-end data solutions.
Requirements:
4+ years of experience as a Data Engineer, BI Developer, or similar role.
Modern Data Stack: Strong hands-on experience with DBT, Snowflake, Databricks, and orchestration tools like Airflow.
SQL & Modeling: Strong proficiency in SQL and deep understanding of data warehousing concepts (Star schema, Snowflake schema).
Data Modeling: Proven experience in data modeling and business logic design for complex domains, building models that are efficient and maintainable.
Modern Workflow: Proven experience leveraging AI assistants to accelerate data engineering tasks.
Bachelor's degree in Computer Science, Industrial Engineering, Mathematics, or an equivalent analytical discipline.
Preferred / Bonus:
Cloud Data Warehouses: Experience with BigQuery or Redshift.
Coding Skills: Proficiency in Python for data processing and automation.
Big Data Tech: Familiarity with Spark, Kubernetes, Docker.
BI Integration: Experience serving data to BI tools such as Looker, Tableau, or Superset.
This position is open to all candidates.
 
Job ID: 8445987
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an experienced BI / Data Engineer to develop systems that make vast amounts of data more accessible and efficient throughout the organization. This is a perfect opportunity if you're passionate about data and have strong analytical and technical skills.

You will be developing innovative and creative ways to deliver data and information at scale in a fast-growing, exciting industry while working closely with our data analysts, product managers, software engineering teams, and management.

Responsibilities:

Design, manage, and optimize scalable data models and BI infrastructures for internal and external stakeholders
Work with diverse relational and non-relational databases (Presto, Athena, Firebolt, MySQL, etc.)
Build and maintain reliable ETL/ELT pipelines using Airflow, Python, and Upsolver
Design and develop data visualizations and dashboards (Tableau)
Identify and implement technological solutions for large-scale analytics
Serve as the central data authority, providing guidance, documentation, and support across the company
Requirements:
4+ years of experience as a BI/Data Engineer
High proficiency in SQL and hands-on experience with databases such as Snowflake, BigQuery, Redshift, or Firebolt
Strong skills in data modeling, ETL/ELT design, and development methodologies
Hands-on experience with Tableau, Looker, or similar BI tools
Experience with Airflow and solid programming skills in Python (or similar languages)
Ability to manage multiple tasks and drive them end-to-end to completion
This position is open to all candidates.
 
Job ID: 8448717