Posted 7 hours ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Your Mission:
As a Senior Data Engineer, your mission is to build the scalable, reliable data foundation that empowers us to make data-driven decisions. You will serve as a bridge between complex business needs and technical implementation, translating raw data into high-value assets. You will own the entire data lifecycle, from ingestion to insight, ensuring that our analytics infrastructure scales as fast as our business.

Key Responsibilities:
Strategic Data Modeling: Translate complex business requirements into efficient, scalable data models and schemas. You will design the logic that turns raw events into actionable business intelligence.
Pipeline Architecture: Design, implement, and maintain resilient data pipelines that serve multiple business domains. You will ensure data flows reliably, securely, and with low latency across our ecosystem.
End-to-End Ownership: Own the data development lifecycle completely, from architectural design and testing to deployment, maintenance, and observability.
Cross-Functional Partnership: Partner closely with Data Analysts, Data Scientists, and Software Engineers to deliver end-to-end data solutions.
Requirements:
What You Bring:
Your Mindset:
Data as a Product: You treat data pipelines and tables with the same rigor as production APIs: reliability, versioning, and uptime matter to you.
Business Acumen: You don't just move data; you understand the business questions behind the query and design solutions that provide answers.
Builder's Spirit: You work independently to balance functional needs with non-functional requirements (scale, cost, performance).
Your Experience & Qualifications:
Must Haves:
6+ years of experience as a Data Engineer, BI Developer, or similar role.
Modern Data Stack: Strong hands-on experience with DBT, Snowflake, Databricks, and orchestration tools like Airflow.
SQL & Modeling: Strong proficiency in SQL and deep understanding of data warehousing concepts (Star schema, Snowflake schema).
Data Modeling: Proven experience in data modeling and business logic design for complex domains, building models that are efficient and maintainable.
Modern Workflow: Proven experience leveraging AI assistants to accelerate data engineering tasks.
Bachelor's degree in Computer Science, Industrial Engineering, Mathematics, or an equivalent analytical discipline.
Preferred / Bonus:
Cloud Data Warehouses: Experience with BigQuery or Redshift.
Coding Skills: Proficiency in Python for data processing and automation.
Big Data Tech: Familiarity with Spark, Kubernetes, Docker.
BI Integration: Experience serving data to BI tools such as Looker, Tableau, or Superset.
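For illustration only (not part of the posting): the star-schema warehousing concepts listed in the must-haves reduce to fact tables joined to dimensions. Below is a minimal sketch with hypothetical table and column names, using SQLite as a stand-in warehouse:

```python
import sqlite3

# Minimal star-schema sketch: one fact table keyed to one dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_name TEXT,
    region TEXT
);
CREATE TABLE fact_orders (
    order_id INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    order_total REAL
);
INSERT INTO dim_customer VALUES (1, 'Acme', 'EMEA'), (2, 'Globex', 'AMER');
INSERT INTO fact_orders VALUES (10, 1, 120.0), (11, 1, 80.0), (12, 2, 50.0);
""")

# A typical BI rollup: revenue by dimension attribute.
rows = conn.execute("""
    SELECT d.region, SUM(f.order_total) AS revenue
    FROM fact_orders f
    JOIN dim_customer d USING (customer_key)
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
print(rows)  # [('AMER', 50.0), ('EMEA', 200.0)]
```

In a Snowflake/dbt stack the same shape would be expressed as dbt models rather than a script, but the join-and-aggregate pattern is identical.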
This position is open to all candidates.
 
Job ID: 8511741
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're seeking our first Data Engineer to join the Revenue Operations team. This is a high-impact role where you'll build the foundations of our data infrastructure - connecting the dots between systems, designing and maintaining our data warehouse, and creating reliable pipelines that bring together all revenue-related data. You'll work directly with the Director of Revenue Operations and partner closely with Sales, Finance, and Customer Success.
This is a chance to shape the role from the ground up and create a scalable data backbone that powers smarter decisions across the company.
Role Overview:
As the Data Engineer, you will own the design, implementation, and evolution of our data infrastructure. You'll connect core business systems (CRM, finance platforms, billing systems) into a central warehouse, ensure data quality, and make insights accessible to leadership and revenue teams. Your success will be measured by the accuracy, reliability, and usability of the data foundation you build.
Key Responsibilities:
Data Infrastructure & Warehousing:
Design, build, and maintain a scalable data warehouse for revenue-related data.
Build ETL/ELT pipelines that integrate data from HubSpot, Netsuite, billing platforms, ACP, and other business tools.
Develop a clear data schema and documentation that can scale as we grow.
Cross-Functional Collaboration:
Work closely with Sales, Finance, and Customer Success to understand their reporting and forecasting needs.
Translate business requirements into data models that support dashboards, forecasting, and customer health metrics.
Act as the go-to partner for data-related questions across revenue teams.
Scalability & Optimization:
Continuously monitor and optimize pipeline performance and warehouse scalability.
Ensure the infrastructure can handle increased data volume and complexity as the company grows.
Establish and enforce best practices for data quality, accuracy, and security.
Evaluate and implement new tools, frameworks, or architectures that improve automation, speed, and reliability.
Build reusable data models and modular pipelines to shorten development time and reduce maintenance.
Requirements:
4-6 years of experience as a Data Engineer or in a similar role (preferably in SaaS, Fintech, or fast-growing B2B companies).
Strong expertise in SQL and data modeling; comfort working with large datasets.
Hands-on experience building and maintaining ETL/ELT pipelines (using tools such as Fivetran, dbt, Airflow, or similar).
Experience designing and managing cloud-based data warehouses (Snowflake, BigQuery, Redshift, or similar).
Familiarity with CRM (HubSpot), ERP/finance systems (Netsuite), and billing platforms.
Strong understanding of revenue operations metrics (ARR, MRR, churn, LTV, CAC, etc.).
Ability to translate messy business requirements into clean, reliable data structures.
Solid communication skills - able to explain technical concepts to non-technical stakeholders.
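For reference, the revenue-operations metrics listed above have simple arithmetic definitions; a minimal sketch with hypothetical subscription data (MRR as the sum of active monthly fees, ARR as 12x MRR, logo churn as customers lost over customers at period start):

```python
# Hypothetical monthly subscription snapshots: {customer: monthly_fee}.
start_of_month = {"acme": 100.0, "globex": 200.0, "initech": 50.0}
end_of_month = {"acme": 100.0, "globex": 250.0}  # initech churned

# MRR: sum of active monthly recurring fees; ARR is simply 12x MRR.
mrr = sum(end_of_month.values())
arr = 12 * mrr

# Logo churn rate: customers lost during the month / customers at start.
churned = set(start_of_month) - set(end_of_month)
churn_rate = len(churned) / len(start_of_month)

print(mrr, arr, round(churn_rate, 2))  # 350.0 4200.0 0.33
```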
What Sets You Apart:
You've been the first data hire before and know how to build from scratch (not a must).
Strong business acumen with a focus on revenue operations.
A builder mindset: you like solving messy data problems and making systems talk.
Comfortable working across teams and translating business needs into data solutions.
This position is open to all candidates.
 
Job ID: 8481826
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Senior Data Engineer I - GenAI Foundation Models
Requisition ID: 21679
The Content Intelligence team is at the forefront of Generative AI innovation, driving solutions for travel-related chatbots, text generation and summarization applications, Q&A systems, and free-text search. Beyond this, the team is building a cutting-edge platform that processes millions of images and textual inputs daily, enriching them with ML capabilities. These enriched datasets power downstream applications, helping personalize the customer experience; for example, selecting and displaying the most relevant images and reviews as customers plan and book their next vacation.
Role Description:
As a Senior Data Engineer, you'll collaborate with top-notch engineers and data scientists to elevate our platform to the next level and deliver exceptional user experiences. Your primary focus will be on the data engineering aspects: ensuring the seamless flow of high-quality, relevant data to train and optimize content models, including GenAI foundation models, supervised fine-tuning, and more.
Youll work closely with teams across the company to ensure the availability of high-quality data from ML platforms, powering decisions across all departments. With access to petabytes of data through MySQL, Snowflake, Cassandra, S3, and other platforms, your challenge will be to ensure that this data is applied even more effectively to support business decisions, train and monitor ML models and improve our products.
Key Job Responsibilities and Duties:
Rapidly developing next-generation scalable, flexible, and high-performance data pipelines.
Dealing with massive textual sources to train GenAI foundation models.
Solving issues with data and data pipelines, prioritizing based on customer impact.
End-to-end ownership of data quality in our core datasets and data pipelines.
Experimenting with new tools and technologies to meet business requirements regarding performance, scaling, and data quality.
Providing tools that improve Data Quality company-wide, specifically for ML scientists.
Providing self-organizing tools that help the analytics community discover data, assess quality, explore usage, and find peers with relevant expertise.
Acting as an intermediary on data problems for both technical and non-technical audiences.
Promoting and driving impactful and innovative engineering solutions.
Advancing technical, behavioral, and interpersonal competence via on-the-job opportunities, experimental projects, hackathons, conferences, and active community participation.
Collaborate with multidisciplinary teams: Collaborate with product managers, data scientists, and analysts to understand business requirements and translate them into machine learning solutions. Provide technical guidance and mentorship to junior team members.
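As one concrete illustration of "dealing with massive textual sources": exact deduplication by normalized content hash is a standard early step when assembling training corpora. This is a hedged sketch, not the team's actual pipeline; the function name and normalization choices are hypothetical:

```python
import hashlib

# Hypothetical corpus-cleaning step: exact dedup via normalized content hash.
def dedupe(texts):
    seen, out = set(), []
    for t in texts:
        # Normalize (trim, lowercase) before hashing so trivial variants collapse.
        h = hashlib.sha256(t.strip().lower().encode("utf-8")).hexdigest()
        if h not in seen:
            seen.add(h)
            out.append(t)  # keep the first-seen original form
    return out

print(dedupe(["Hello World", "hello world ", "Goodbye"]))
# ['Hello World', 'Goodbye']
```

Production pipelines usually add near-duplicate detection (e.g. MinHash) on top of exact hashing, but the shape is the same.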
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, Statistics, or a related field.
Minimum of 6 years of experience as a Data Engineer or a similar role, with a consistent record of successfully delivering ML/Data solutions.
You have built production data pipelines in the cloud, setting up data-lake and serverless solutions. You have hands-on experience with schema design and data modeling, and with working alongside ML scientists and ML engineers to deliver production-level ML solutions.
You have experience designing systems end-to-end and knowledge of basic concepts (load balancing, databases, caching, NoSQL, etc.).
Strong programming skills in languages such as Python and Java.
Experience with big data processing frameworks such as PySpark, Apache Flink, Snowflake, or similar.
Demonstrable experience with MySQL, Cassandra, DynamoDB, or similar relational/NoSQL database systems.
Experience with data warehousing and ETL/ELT pipelines.
Experience in data processing for large-scale language models like GPT, BERT, or similar architectures - an advantage.
Proficiency in data manipulation, analysis, and visualization using tools like NumPy, pandas, and matplotlib - an advantage.
This position is intended for both women and men.
 
Job ID: 8498339
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an experienced and visionary Data Platform Engineer to help design, build and scale our BI platform from the ground up.
In this role, you will be responsible for building the foundations of our data analytics platform - enabling scalable data pipelines and robust data modeling to support real-time and batch analytics, ML models and business insights that serve both business intelligence and product needs.
You will be part of the R&D team, collaborating closely with engineers, analysts, and product managers to deliver a modern data architecture that supports internal dashboards and future-facing operational analytics.
If you enjoy architecting from scratch, turning raw data into powerful insights, and owning the full data lifecycle - this role is for you!
Responsibilities
Take full ownership of the design and implementation of a scalable and efficient BI data infrastructure, ensuring high performance, reliability and security.
Lead the design and architecture of the data platform - from integration to transformation, modeling, storage, and access.
Build and maintain ETL/ELT pipelines, batch and real-time, to support analytics, reporting, and product integrations.
Establish and enforce best practices for data quality, lineage, observability, and governance to ensure accuracy and consistency.
Integrate modern tools and frameworks such as Airflow, dbt, Databricks, Power BI, and streaming platforms.
Collaborate cross-functionally with product, engineering, and analytics teams to translate business needs into data infrastructure.
Promote a data-driven culture - be an advocate for data-driven decision-making across the company by empowering stakeholders with reliable and self-service data access.
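A minimal sketch of one pattern behind the batch ETL/ELT pipelines described above: a watermark-based incremental load, which processes only rows newer than the last successful run and is therefore safe to rerun. Row shapes and field names here are hypothetical:

```python
# Hypothetical incremental (watermark-based) batch load: only rows with an
# updated_at later than the stored watermark are upserted into the target.
def incremental_load(source_rows, target, watermark):
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    for r in new_rows:
        target[r["id"]] = r  # upsert keyed by primary key
    # Advance the watermark to the newest row seen (unchanged if nothing new).
    return max((r["updated_at"] for r in new_rows), default=watermark)

rows = [
    {"id": 1, "updated_at": "2024-01-01", "total": 10},
    {"id": 2, "updated_at": "2024-01-02", "total": 20},
]
target, wm = {}, "2023-12-31"
wm = incremental_load(rows, target, wm)
print(sorted(target), wm)  # [1, 2] 2024-01-02
```

Because a rerun with the advanced watermark finds no new rows, the load is idempotent, which is what makes retries and backfills safe.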
Requirements:
5+ years of hands-on experience in data engineering and in building data products for analytics and business intelligence.
Strong hands-on experience with ETL orchestration tools (Apache Airflow) and data lakehouses (e.g., Snowflake, BigQuery, Databricks).
Deep knowledge of both batch and streaming processing (e.g., Kafka, Spark Streaming).
Proficiency in Python, SQL, and cloud data engineering environments (AWS, Azure, or GCP).
Familiarity with data visualization tools (e.g., Power BI, Looker, or similar).
BSc in Computer Science or a related field from a leading university.
Nice to have
Experience working in early-stage projects, building data systems from scratch.
Background in building operational analytics pipelines, in which analytical data feeds real-time product business logic.
Hands-on experience with ML model training pipelines.
Experience in cost optimization in modern cloud environments.
Knowledge of data governance principles, compliance, and security best practices.
This position is open to all candidates.
 
Job ID: 8482840
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
As a Senior BI Data Engineer, you'll join a company where culture isn't a slogan - it's our DNA.

You'll be part of a data-driven organization where every voice matters, working at the heart of our BI team to raise the bar for analytical excellence.

In this role, you'll take end-to-end ownership of complex, high-impact BI initiatives, shaping how the company measures success, makes decisions, and scales. Your work will directly impact over 1M users worldwide, empowering B2B sales teams to unlock new revenue opportunities and drive sustainable growth.

What You'll Actually Do:

Lead the design and evolution of BI data foundations, owning key product and GTM metric definitions and data models
Build, scale, and maintain high-quality ELT pipelines and curated datasets (raw → modeled → semantic) that power dashboards and self-serve analytics
Collaborate closely with Data Scientists to operationalize machine learning use cases, including feature pipelines and analytical datasets
Take senior ownership within the BI team by mentoring peers, reviewing critical work, and promoting best practices in analytics engineering and semantic modeling
Own the performance, cost, and reliability of the data warehouse and BI layer by optimizing models, queries, and incremental processing patterns
Translate ambiguous business questions into clear, governed metrics and scalable data models, partnering with Product, RevOps/GTM, Finance, and Analytics
Establish and maintain strong data quality, observability, and monitoring practices (tests, SLAs, anomaly detection)
Drive standardization and automation across the BI development lifecycle, including version control, CI/CD, documentation, and release processes
Stay ahead of modern BI and analytics engineering trends, including AI-assisted development, and apply them pragmatically to increase trust and speed
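The data-quality tests mentioned above (not-null, unique, accepted-range, in the style popularized by dbt's generic tests) can be sketched framework-free; the function and check names here are illustrative assumptions:

```python
# Minimal sketch of row-level quality checks: unique key, not-null columns,
# and an accepted value range. Returns a list of failure descriptions.
def run_quality_checks(rows, key, not_null, value_range):
    failures = []
    keys = [r[key] for r in rows]
    if len(keys) != len(set(keys)):
        failures.append(f"duplicate {key}")
    for col in not_null:
        if any(r.get(col) is None for r in rows):
            failures.append(f"null in {col}")
    col, lo, hi = value_range
    if any(not (lo <= r[col] <= hi) for r in rows):
        failures.append(f"{col} out of range")
    return failures

rows = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": -5.0}]
print(run_quality_checks(rows, "id", ["amount"], ("amount", 0, 1000)))
# ['amount out of range']
```

In practice these assertions would run as dbt tests or an observability tool's monitors, gated into CI/CD so bad data fails the pipeline rather than reaching dashboards.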
Requirements:
5+ years of experience in Data Engineering / BI roles, with proven ownership of scalable, end-to-end data ecosystems
Expert-level experience with modern data stacks, including Snowflake or Databricks and dbt as a core transformation layer
Advanced SQL and Python skills, with hands-on responsibility for CI/CD pipelines, data quality frameworks, and observability
Deep understanding of dimensional modeling, data warehousing patterns, and semantic layer design
Hands-on experience with orchestration tools such as Airflow, managing complex and high-availability ELT workflows
Experience working with large-scale data processing, including exposure to Kafka, Spark, or streaming architectures
Strong ability to independently lead cross-functional initiatives and translate business requirements into executable technical solutions
Strong business and analytical mindset, with an AI-forward approach and curiosity for emerging tools and methodologies
This position is open to all candidates.
 
Job ID: 8481843
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Data Engineer II - GenAI
Requisition ID: 20718
The Content Intelligence team is at the forefront of Generative AI innovation, driving solutions for travel-related chatbots, text generation and summarization applications, Q&A systems, and free-text search. Beyond this, the team is building a cutting-edge platform that processes millions of images and textual inputs daily, enriching them with ML capabilities. These enriched datasets power downstream applications, helping personalize the customer experience; for example, selecting and displaying the most relevant images and reviews as customers plan and book their next vacation.
Role Description:
As a Data Engineer, you'll collaborate with top-notch engineers and data scientists to elevate our platform to the next level and deliver exceptional user experiences. Your primary focus will be on the data engineering aspects: ensuring the seamless flow of high-quality, relevant data to train and optimize content models, including GenAI foundation models, supervised fine-tuning, and more.
Youll work closely with teams across the company to ensure the availability of high-quality data from ML platforms, powering decisions across all departments. With access to petabytes of data through MySQL, Snowflake, Cassandra, S3, and other platforms, your challenge will be to ensure that this data is applied even more effectively to support business decisions, train and monitor ML models and improve our products.
Key Job Responsibilities and Duties:
Rapidly developing next-generation scalable, flexible, and high-performance data pipelines.
Dealing with massive textual sources to train GenAI foundation models.
Solving issues with data and data pipelines, prioritizing based on customer impact.
End-to-end ownership of data quality in our core datasets and data pipelines.
Experimenting with new tools and technologies to meet business requirements regarding performance, scaling, and data quality.
Providing tools that improve Data Quality company-wide, specifically for ML scientists.
Providing self-organizing tools that help the analytics community discover data, assess quality, explore usage, and find peers with relevant expertise.
Acting as an intermediary on data problems for both technical and non-technical audiences.
Promoting and driving impactful and innovative engineering solutions.
Advancing technical, behavioral, and interpersonal competence via on-the-job opportunities, experimental projects, hackathons, conferences, and active community participation.
Collaborate with multidisciplinary teams: Collaborate with product managers, data scientists, and analysts to understand business requirements and translate them into machine learning solutions. Provide technical guidance and mentorship to junior team members.
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, Statistics, or a related field.
Minimum of 3 years of experience as a Data Engineer or a similar role, with a consistent record of successfully delivering ML/Data solutions.
You have built production data pipelines in the cloud, setting up data-lake and serverless solutions. You have hands-on experience with schema design and data modeling, and with working alongside ML scientists and ML engineers to deliver production-level ML solutions.
You have experience designing systems end-to-end and knowledge of basic concepts (load balancing, databases, caching, NoSQL, etc.).
Strong programming skills in languages such as Python and Java.
Experience with big data processing frameworks such as PySpark, Apache Flink, Snowflake, or similar.
Demonstrable experience with MySQL, Cassandra, DynamoDB, or similar relational/NoSQL database systems.
Experience with data warehousing and ETL/ELT pipelines.
Experience in data processing for large-scale language models like GPT, BERT, or similar architectures - an advantage.
Proficiency in data manipulation, analysis, and visualization using tools like NumPy, pandas, and matplotlib - an advantage.
This position is intended for both women and men.
 
Job ID: 8498343
Location: Tel Aviv-Yafo
Job Type: Full Time
As a Data Engineer at Meta, you will shape the future of people-facing and business-facing products we build across our entire family of applications. Your technical skills and analytical mindset will be utilized designing and building some of the world's most extensive data sets, helping to craft experiences for billions of people and hundreds of millions of businesses worldwide.
In this role, you will collaborate with software engineering, data science, and product management teams to design/build scalable data solutions across Meta to optimize growth, strategy, and user experience for our 3 billion plus users, as well as our internal employee community.
You will be at the forefront of identifying and solving some of the most interesting data challenges at a scale few companies can match. By joining Meta, you will become part of a world-class data engineering community dedicated to skill development and career growth in data engineering and beyond.
Data Engineering: You will guide teams by building optimal data artifacts (including datasets and visualizations) to address key questions. You will refine our systems, design logging solutions, and create scalable data models. Ensuring data security and quality, and with a focus on efficiency, you will suggest architecture and development approaches and data management standards to address complex analytical problems.
Product leadership: You will use data to shape product development, identify new opportunities, and tackle upcoming challenges. You'll ensure our products add value for users and businesses by prioritizing projects and driving innovative solutions to respond to challenges or opportunities.
Communication and influence: You won't simply present data, but tell data-driven stories. You will convince and influence your partners using clear insights and recommendations. You will build credibility through structure and clarity, and be a trusted strategic partner.
Data Engineer, Product Analytics Responsibilities
Conceptualize and own the data architecture for multiple large-scale projects, while evaluating design and operational cost-benefit tradeoffs within systems
Create and contribute to frameworks that improve the efficacy of logging data, while working with data infrastructure to triage issues and resolve
Collaborate with engineers, product managers, and data scientists to understand data needs, representing key data insights in a meaningful way
Define and manage Service Level Agreements for all data sets in allocated areas of ownership
Determine and implement the security model based on privacy requirements, confirm safeguards are followed, address data quality issues, and evolve governance processes within allocated areas of ownership
Design, build, and launch collections of sophisticated data models and visualizations that support multiple use cases across different products or domains
Solve our most challenging data integration problems, utilizing optimal Extract, Transform, Load (ETL) patterns, frameworks, query techniques, sourcing from structured and unstructured data sources
Assist in owning existing processes running in production, optimizing complex code through advanced algorithmic concepts
Optimize pipelines, dashboards, frameworks, and systems to facilitate easier development of data artifacts.
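A minimal illustration of the dataset Service Level Agreements mentioned above: a freshness check comparing the latest landing time of a dataset against its agreed window. The function name and thresholds are hypothetical, not Meta's actual tooling:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical freshness SLA check: a dataset meets its SLA if its latest
# partition landed within the agreed window relative to "now".
def meets_sla(last_landed: datetime, sla: timedelta, now: datetime) -> bool:
    return now - last_landed <= sla

now = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
landed = datetime(2024, 1, 2, 9, 0, tzinfo=timezone.utc)
print(meets_sla(landed, timedelta(hours=6), now))  # True
```

Real SLA management layers alerting and ownership routing on top of checks like this one.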
Requirements:
Minimum Qualifications
Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent
4+ years of experience where the primary responsibility involves working with data. This could include roles such as data analyst, data scientist, data engineer, or similar positions
4+ years of experience (or a minimum of 2+ years with a Ph.D) with SQL, ETL, data modeling, and at least one programming language (e.g., Python, C++, C#, Scala, etc.)
Preferred Qualifications
Master's or Ph.D degree in a STEM field.
This position is open to all candidates.
 
Job ID: 8478330
Posted 7 hours ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Are you a talented and experienced Data Engineer? If so, we want you to be part of our dynamic Data Engineering team, part of R&D, contributing to our vision and making a difference in the eCommerce landscape. Join us on this journey as we seek the best and brightest minds to drive our mission forward.

Responsibilities:
Developing, implementing and supporting robust, scalable solutions to improve business analysis capabilities.
Managing data pipelines from multiple sources, including designing, implementing, and maintaining.
Translating business priorities into data models by working with business analysts and product analysts.
Collaborate across the business with various stakeholders, such as data developers, systems analysts, data scientists and software engineers.
Owning the entire data development process, including business knowledge, methodology, quality assurance, and maintenance.
Work independently while considering all functional and non-functional aspects and provide high quality and robust infrastructures to the organization.
Requirements:
What you need:
Bachelor's degree in Computer Science, Industrial Engineering, Mathematics, or an equivalent numerate/analytical discipline.
4 years of experience working as a BI Developer / Data Engineer or in a similar role.
Advanced proficiency and deep understanding of SQL.
Skills in data modeling, business logic processes, as well as experience with DWH design.
An enthusiastic, fast-learning, motivated team player who loves data.

Advantage:
Experience working with DBT (big advantage).
Knowledge in BI tools such as Looker, Tableau or Superset.
Experience with Python.
Experience working with DWH, such as BigQuery/Snowflake/Redshift.
Experience working with Spark, Kubernetes, Docker.
This position is open to all candidates.
 
Job ID: 8511686
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
This role has been designed as Hybrid with an expectation that you will work on average 2 days per week from an HPE office.
Job Description:
We are looking for a highly skilled Senior Data Engineer with strong architectural expertise to design and evolve our next-generation data platform. You will define the technical vision, build scalable and reliable data systems, and guide the long-term architecture that powers analytics, operational decision-making, and data-driven products across the organization.
This role is both strategic and hands-on. You will evaluate modern data technologies, define engineering best practices, and lead the implementation of robust, high-performance data solutions, including the design, build, and lifecycle management of data pipelines that support batch, streaming, and near-real-time workloads.
What You'll Do
Architecture & Strategy
Own the architecture of our data platform, ensuring scalability, performance, reliability, and security.
Define standards and best practices for data modeling, transformation, orchestration, governance, and lifecycle management.
Evaluate and integrate modern data technologies and frameworks that align with our long-term platform strategy.
Collaborate with engineering and product leadership to shape the technical roadmap.
Engineering & Delivery
Design, build, and manage scalable, resilient data pipelines for batch, streaming, and event-driven workloads.
Develop clean, high-quality data models and schemas to support analytics, BI, operational systems, and ML workflows.
Implement data quality, lineage, observability, and automated testing frameworks.
Build ingestion patterns for APIs, event streams, files, and third-party data sources.
Optimize compute, storage, and transformation layers for performance and cost efficiency.
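The streaming and event-driven workloads listed above commonly reduce to windowed aggregations. Below is a framework-free sketch of a tumbling-window event count; in practice this would run in Kafka Streams, Flink, or Spark, and the event tuples and window size here are hypothetical:

```python
from collections import defaultdict

# Hypothetical tumbling-window aggregation: bucket each (timestamp, payload)
# event into a fixed, non-overlapping window and count events per window.
def tumbling_counts(events, window_s):
    counts = defaultdict(int)
    for ts, _payload in events:
        window_start = ts - ts % window_s  # floor timestamp to window boundary
        counts[window_start] += 1
    return dict(counts)

events = [(3, "a"), (7, "b"), (12, "c"), (14, "d")]
print(tumbling_counts(events, 10))  # {0: 2, 10: 2}
```

Stream processors add the hard parts (late data, watermarks, state checkpointing), but the windowing arithmetic is exactly this.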
Leadership & Collaboration
Serve as a senior technical leader and mentor within the data engineering team.
Lead architecture reviews, design discussions, and cross-team engineering initiatives.
Work closely with analysts, data scientists, software engineers, and product owners to define and deliver data solutions.
Communicate architectural decisions and trade-offs to technical and non-technical stakeholders.
Requirements:
6-10+ years of experience in Data Engineering, with demonstrated architectural ownership.
Expert-level experience with Snowflake (mandatory), including performance optimization, data modeling, security, and ecosystem components.
Expert proficiency in SQL and strong Python skills for pipeline development and automation.
Experience with modern orchestration tools (Airflow, Dagster, Prefect, or equivalent).
Strong understanding of ELT/ETL patterns, distributed processing, and data lifecycle management.
Familiarity with streaming/event technologies (Kafka, Kinesis, Pub/Sub, etc.).
Experience implementing data quality, observability, and lineage solutions.
Solid understanding of cloud infrastructure (AWS, GCP, or Azure).
Strong background in DataOps practices: CI/CD, testing, version control, automation.
Proven leadership in driving architectural direction and mentoring engineering teams.
Nice to Have
Experience with data governance or metadata management tools.
Hands-on experience with DBT, including modeling, testing, documentation, and advanced features.
Exposure to machine learning pipelines, feature stores, or MLOps.
Experience with Terraform, CloudFormation, or other IaC tools.
Background designing systems for high scale, security, or regulated environments.
This position is open to all candidates.
 
8461496
סגור
שירות זה פתוח ללקוחות VIP בלבד
סגור
דיווח על תוכן לא הולם או מפלה
מה השם שלך?
תיאור
שליחה
סגור
v נשלח
תודה על שיתוף הפעולה
מודים לך שלקחת חלק בשיפור התוכן שלנו :)
Location: Tel Aviv-Yafo
Job Type: Full Time
As a Data Engineer at Meta, you will shape the future of the people-facing and business-facing products we build across our entire family of applications (Facebook, Instagram, Messenger, WhatsApp, Reality Labs, Threads). Your technical skills and analytical mindset will be put to use designing and building some of the world's most extensive data sets, helping to craft experiences for billions of people and hundreds of millions of businesses worldwide.
In this role, you will collaborate with software engineering, data science, and product management teams to design/build scalable data solutions across Meta to optimize growth, strategy, and user experience for our 3 billion plus users, as well as our internal employee community.
You will be at the forefront of identifying and solving some of the most interesting data challenges at a scale few companies can match. By joining Meta, you will become part of a world-class data engineering community dedicated to skill development and career growth in data engineering and beyond.
Data Engineering: You will guide teams by building optimal data artifacts (including datasets and visualizations) to address key questions. You will refine our systems, design logging solutions, and create scalable data models. You will ensure data security and quality and, with a focus on efficiency, suggest architecture, development approaches, and data management standards to address complex analytical problems.
Product leadership: You will use data to shape product development, identify new opportunities, and tackle upcoming challenges. You'll ensure our products add value for users and businesses by prioritizing projects and driving innovative solutions in response to challenges or opportunities.
Communication and influence: You won't simply present data, but tell data-driven stories. You will convince and influence your partners using clear insights and recommendations. You will build credibility through structure and clarity, and be a trusted strategic partner.
Data Engineer, Product Analytics Responsibilities
Conceptualize and own the data architecture for multiple large-scale projects, while evaluating design and operational cost-benefit tradeoffs within systems
Create and contribute to frameworks that improve the efficacy of logging data, while working with data infrastructure to triage and resolve issues
Collaborate with engineers, product managers, and data scientists to understand data needs, representing key data insights visually in a meaningful way
Define and manage Service Level Agreements for all data sets in allocated areas of ownership
Determine and implement the security model based on privacy requirements, confirm safeguards are followed, address data quality issues, and evolve governance processes within allocated areas of ownership
Design, build, and launch collections of sophisticated data models and visualizations that support multiple use cases across different products or domains
Solve our most challenging data integration problems, utilizing optimal Extract, Transform, Load (ETL) patterns, frameworks, and query techniques, sourcing from structured and unstructured data sources
Assist in owning existing processes running in production, optimizing complex code through advanced algorithmic concepts
Optimize pipelines, dashboards, frameworks, and systems to facilitate easier development of data artifacts.
Requirements:
Minimum Qualifications
Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent
7+ years of experience where the primary responsibility involves working with data. This could include roles such as data analyst, data scientist, data engineer, or similar positions
7+ years of experience with SQL, ETL, data modeling, and at least one programming language (e.g., Python, C++, C#, Scala, or others)
Preferred Qualifications
Master's or Ph.D. degree in a STEM field.
This position is open to all candidates.
 
24/12/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an Analytics Engineer to join our team and play a key role in shaping how data drives our product and business decisions.
This role is perfect for someone who enjoys working at the intersection of data, product, and strategy. While Product Analysts focus on turning data into insights, you'll focus on building the strong data foundations that make those insights possible. You won't just run queries; you will design the data architecture and own the "source of truth" tables that power our strategic decision-making.
You'll work closely with our Growth and Solutions teams, helping them move faster and smarter by making sure the data behind Generative AI, Data-as-a-Service (DaaS), and advanced product models is clear, reliable, and easy to use. Your work will have a direct impact on how we build, scale, and innovate our products.
What You'll Do
Define the Source of Truth: Take raw, complex data and transform it into clean, well-structured tables that Product Analysts and Business Leads can use for high-stakes decision-making.
Translate Strategy into Logic: Work with Product, Growth, and Solutions leads to turn abstract concepts (like "Activation," "Retention," or "Feature Adoption") into precise SQL definitions and automated datasets.
Enable High-Tech Initiatives: Partner with our AI and DaaS specialists to ensure they have the structured data foundations they need to build models and external data products.
Optimize for Usability: Ensure our data is not just "there," but easy to use. You will design the data logic that powers our most important product dashboards and growth funnels.
Maintain Data Integrity: Act as the guardian of our metrics. You will ensure that the numbers used across our product and business reports are consistent, reliable, and logical.
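Turning an abstract concept like "Activation" into a precise, automated definition, as described above, might look like the following minimal sketch. The schema, event names, and activation threshold are all hypothetical, and SQLite stands in for the warehouse:

```python
# Hedged sketch: an abstract product concept ("Activation") expressed as a
# precise SQL definition. Schema and threshold are hypothetical.
import sqlite3

ACTIVATION_SQL = """
    SELECT user_id
    FROM events
    GROUP BY user_id
    HAVING COUNT(*) >= 3  -- hypothetical rule: 3+ events = "activated"
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event_name TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "login"), (1, "create"), (1, "share"),  # user 1: 3 events -> activated
     (2, "login")],                              # user 2: 1 event  -> not activated
)
activated = [row[0] for row in conn.execute(ACTIVATION_SQL)]
print(activated)
```

The value of a definition like this is that every dashboard and report queries the same logic, so "activation" means one thing everywhere.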
Requirements:
Expert SQL Mastery: You are a SQL power-user. You enjoy solving complex logic puzzles using code and care deeply about query efficiency and data accuracy.
The "Bridge" Mindset: You can sit in a meeting with a Product Manager to understand a business need, and then translate that into a technical data structure that serves that need.
Logical Architecture: You have a natural talent for organizing information. You know how to build a table that is intuitive and easy for other analysts to query.
Product & Business Acumen: You understand SaaS metrics (ARR, funnels, activation, etc.) and how data logic impacts product growth and strategy.
Experience with Analytics Tools: Proficiency in BI tools (Looker, Tableau, etc.) and a strong understanding of how data flows from technical logs to the end-user interface.
Degree: B.Sc. in Industrial Engineering, Information Systems, Economics, Computer Science or a related quantitative field.
Experience: 1+ years of prior experience in a relevant analytics/technical role.
This position is open to all candidates.
 