2 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
Required Data Engineer - Team Lead
Description
We are building cutting-edge data solutions, primarily on BigQuery, Databricks, and Snowflake, for large-scale customers.
Our solutions introduce next-level automation and aim for low-code/no-code platforms that allow our customers to deliver fast while we hide all the engineering complexity behind the scenes.
If you are not afraid of any engineering task, like to build cool stuff, and enjoy seeing customers use your craft, you belong with us.
Requirements:
5+ years of hands-on experience building cloud-scalable, real-time, high-performance data solutions, in a team-leader capacity.
Strong Programming Skills with Python/Scala (Java in addition is a plus).
Solid engineering foundations (good coding practices, good architectural design skills).
Proven experience and in-depth knowledge of one or more of the following data platforms (certifications desired):
BigQuery (big advantage)
Databricks
Snowflake
Proven experience building data pipelines, preferably with Airflow, dbt, Spark, or similar (a minimal orchestration sketch follows this requirements list)
Experience with ETL tools like Rivery, Fivetran, Talend or others
Solid cloud development experience with one or more cloud providers (certifications desired)
Advanced SQL knowledge and capabilities
Experience or knowledge in AI/GenAI/ML/MLOps - advantage
Team player, positive, likes to dream big and fulfill those dreams, all at the same time.
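For illustration only, a minimal Airflow 2.x orchestration sketch of the kind of pipeline described in the requirements above: a dbt build followed by a Spark batch job. The DAG name, paths, schedule, and job script are hypothetical placeholders, not part of the posting.

    # Minimal Airflow 2.x DAG: run dbt, then a Spark batch job (names are hypothetical).
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_dbt_then_spark",      # hypothetical DAG name
        start_date=datetime(2025, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        dbt_build = BashOperator(
            task_id="dbt_build",
            bash_command="dbt build --project-dir /opt/dbt --profiles-dir /opt/dbt",
        )
        spark_job = BashOperator(
            task_id="spark_aggregate",
            bash_command="spark-submit /opt/jobs/aggregate_events.py",  # hypothetical script
        )
        dbt_build >> spark_job              # run the dbt models before the Spark aggregation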
This position is open to all candidates.
 
Job ID: 8224017

Similar jobs that may interest you

2 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
Required Data Engineer
Description
We are building cutting-edge data solutions, primarily on BigQuery, Databricks, and Snowflake, for large-scale customers.
Our solutions introduce next-level automation and aim for low-code/no-code platforms that allow our customers to deliver fast while we hide all the engineering complexity behind the scenes.
If you are not afraid of any engineering task, like to build cool stuff, and enjoy seeing customers use your craft, you belong with us.
Requirements:
3+ years of experience building cloud-scalable, real-time, high-performance data solutions.
Strong Programming Skills with Python/Scala (Java in addition is a plus).
Solid engineering foundations (good coding practices, good architectural design skills).
Proven experience and in-depth knowledge of one or more of the following data platforms (certifications desired):
BigQuery (big advantage; see the brief query sketch after this requirements list)
Databricks
Snowflake
Proven experience building data pipelines, preferably with Airflow, dbt, Spark, or similar
Experience with ETL tools like Rivery, Fivetran, Talend or others
Solid cloud development experience with one or more cloud providers (certifications desired)
Advanced SQL knowledge and capabilities
Experience or knowledge in AI/GenAI/ML/MLOps/LLM - advantage
Team player, positive, likes to dream big and fulfill those dreams, all at the same time.
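For illustration, a minimal query sketch with the google-cloud-bigquery client, since BigQuery is highlighted above. It assumes application default credentials and a configured project; the project, dataset, table, and columns are hypothetical.

    # Minimal BigQuery query sketch (project, dataset, and columns are hypothetical).
    from google.cloud import bigquery

    client = bigquery.Client()  # picks up default project and credentials

    query = """
        SELECT event_date, COUNT(*) AS events
        FROM `my_project.analytics.events`
        WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
        GROUP BY event_date
        ORDER BY event_date
    """

    for row in client.query(query).result():  # runs the job and waits for results
        print(row.event_date, row.events)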
This position is open to all candidates.
 
Job ID: 8224015

08/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
If you share our love of sports and tech and you've got the passion and will to better the sports-tech and data industries, join the team. We are looking for a Data & AI Architect.
Responsibilities:
Build the foundations of modern data architecture, supporting real-time, high-scale (Big Data) sports data pipelines and ML/AI use cases, including Generative AI.
Map the company's data needs and lead the selection and implementation of key technologies across the stack: data lakes (e.g., Iceberg), databases, ETL/ELT tools, orchestrators, data quality and observability frameworks, and statistical/ML tools.
Design and build a cloud-native, cost-efficient, and scalable data infrastructure from scratch, capable of supporting rapid growth, high concurrency, and low-latency SLAs (e.g., 1-second delivery).
Lead design reviews and provide architectural guidance for all data solutions, including data engineering, analytics, and ML/data science workflows.
Set high standards for data quality, integrity, and observability. Design and implement processes and tools to monitor and proactively address issues like missing events, data delays, or integrity failures (a minimal freshness-check sketch follows this list).
Collaborate cross-functionally with other architects, R&D, product, and innovation teams to ensure alignment between infrastructure, product goals, and real-world constraints.
Mentor engineers and promote best practices around data modeling, storage, streaming, and observability.
Stay up-to-date with industry trends, evaluate emerging data technologies, and lead POCs to assess new tools and frameworks, especially in the domains of Big Data architecture, ML infrastructure, and Generative AI platforms.
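As referenced in the observability responsibility above, a minimal, illustrative freshness check of the kind such monitoring might start from. The SLA value and the source of the latest event timestamp (e.g., a MAX(event_ts) query against the warehouse) are assumptions.

    # Flag a feed whose newest event breaches an agreed freshness SLA (values are assumptions).
    from datetime import datetime, timedelta, timezone
    from typing import Optional

    FRESHNESS_SLA = timedelta(minutes=5)  # assumed SLA for this sketch

    def is_stale(latest_event_time: datetime, now: Optional[datetime] = None) -> bool:
        """Return True when the feed's newest event is older than the freshness SLA."""
        now = now or datetime.now(timezone.utc)
        return now - latest_event_time > FRESHNESS_SLA

    # Example: an event from 20 minutes ago breaches a 5-minute SLA.
    latest = datetime.now(timezone.utc) - timedelta(minutes=20)
    if is_stale(latest):
        print("ALERT: feed is stale - investigate missing events or delays")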
Requirements:
At least 10 years of experience in a data engineering role, including 2+ years as a data & AI architect with ownership over company-wide architecture decisions.
Proven experience designing and implementing large-scale, Big Data infrastructure from scratch in a cloud-native environment (GCP preferred).
Excellent proficiency in data modeling, including conceptual, logical, and physical modeling for both analytical and real-time use cases.
Strong hands-on experience with:
Data lake and/or warehouse technologies, with Apache Iceberg experience required (e.g., Iceberg, Delta Lake, BigQuery, ClickHouse)
ETL/ELT frameworks and orchestrators (e.g., Airflow, dbt, Dagster)
Real-time streaming technologies (e.g., Kafka, Pub/Sub)
Data observability and quality monitoring solutions
Excellent proficiency in SQL, and in either Python or JavaScript.
Experience designing efficient data extraction and ingestion processes from multiple sources and handling large-scale, high-volume datasets.
Demonstrated ability to build and maintain infrastructure optimized for performance, uptime, and cost, with awareness of AI/ML infrastructure requirements.
Experience working with ML pipelines and AI-enabled data workflows, including support for Generative AI initiatives (e.g., content generation, vector search, model training pipelines) or strong motivation to learn and lead in this space.
Excellent communication skills in English, with the ability to clearly document and explain architectural decisions to technical and non-technical audiences.
Fast learner with strong multitasking abilities; capable of managing several cross-functional initiatives simultaneously.
Willingness to work on-site in Ashkelon once a week.
Advantage:
Experience leading POCs and tool selection processes.
Familiarity with Databricks, LLM pipelines, or vector databases is a strong plus.
This position is open to all candidates.
 
Job ID: 8208147

Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Data Engineer for the Insight Team to join our Data Group and a new team responsible for developing innovative features based on multiple layers of data. These features will power recommendation systems, insights, and more. This role involves close collaboration with the core teams within the Data Group, working on diverse data pipelines that tackle challenges related to scale and algorithmic optimization, all aimed at enhancing the data experience for our customers.
Where does this role fit in our vision?
Every role at our company is designed with a clear purpose: to integrate collective efforts into our shared success, functioning as pieces of a collective brain. Data is everything; it's at the heart of everything we do. The Data Group is responsible for shaping the experience of hundreds of thousands of users who rely on our data daily.
The Insight Team monitors user behavior across our products, leveraging millions of signals and time-series entities to power a personalized recommendation and ranking system. This enables users to access more unique and tailored data, optimizing their experience while maintaining a strong focus on the key KPIs that drive the success of our Data Group.
What will you be responsible for?
Develop and implement robust, scalable data pipelines and integration solutions within our Databricks-based environment.
Develop models and implement algorithms, with a strong emphasis on delivering high-quality results.
Leverage technologies like Spark, Kafka, and Airflow to tackle complex data challenges and enhance business operations (a streaming sketch follows this list).
Design innovative data solutions that support millions of data points, ensuring high performance and reliability.
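For illustration, a minimal Spark Structured Streaming sketch that reads events from Kafka, roughly the kind of pipeline described above. The broker, topic, and event schema are hypothetical, and the spark-sql-kafka package must be on the classpath (e.g., via --packages at submit time).

    # Read a hypothetical Kafka topic with Spark Structured Streaming and print parsed events.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StringType, StructField, StructType, TimestampType

    spark = SparkSession.builder.appName("user_signals").getOrCreate()

    schema = StructType([                       # assumed event shape
        StructField("user_id", StringType()),
        StructField("event_type", StringType()),
        StructField("ts", TimestampType()),
    ])

    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
        .option("subscribe", "user-signals")                 # hypothetical topic
        .load()
        .select(from_json(col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    query = events.writeStream.format("console").outputMode("append").start()
    query.awaitTermination()  # blocks; fine for a sketch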
Requirements:
3+ years of experience in data engineering, building and optimizing scalable data pipelines.
5+ years of experience as a software developer, preferably in Python.
Algorithmic experience, including developing and optimizing machine learning models and implementing advanced data algorithms.
Experience working with cloud ecosystems, preferably AWS (S3, Glue, EMR, Redshift, Athena) or comparable cloud environments (Azure/GCP).
Expertise in extracting, ingesting, and transforming large datasets efficiently.
Deep knowledge of big data platforms, such as Spark, Databricks, Elasticsearch, and Kafka for real-time data streaming.
(Nice-to-have) Hands-on experience working with Vector Databases and embedding techniques, with a focus on search, recommendations, and personalization.
AI-savvy: comfortable working with AI tools and staying ahead of emerging trends.
This position is open to all candidates.
 
Job ID: 8212720

Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Data Infrastructure Engineer
Tel Aviv
We are using technology to transform transportation around the world. From changing a single person's daily commute to reducing humanity's collective environmental footprint, we've got huge goals.
As a Data Engineer, you will join a talented team of engineers in charge of our core data infrastructure, while working closely with other engineering and data teams. You will lead the development of complex solutions, work on various cutting-edge technologies, and be a critical part of our mission to shape the future of transportation, affecting tens of thousands of riders daily worldwide.
What You'll Do:
Design, build, and maintain scalable data platforms, pipelines, and solutions for data ingestion, processing and orchestration, leveraging Kubernetes, Airflow, Kafka and more.
Develop and manage cloud-based data solutions on AWS, utilizing Infrastructure as Code (IaC) frameworks like Terraform or CloudFormation.
Manage Snowflake environments, including database architecture, resource provisioning, and access control (a brief provisioning sketch follows this list).
Ensure system reliability and observability with monitoring, logging, and alerting tools like Prometheus and CloudWatch.
Continuously optimize storage, compute, and data processing costs and performance.
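As referenced in the Snowflake responsibility above, a minimal sketch with the snowflake-connector-python client that provisions a small auto-suspending warehouse. Account, credentials, role, and object names are hypothetical; in practice this is often codified with IaC rather than ad-hoc scripts.

    # Provision a small, cost-conscious Snowflake warehouse (all identifiers are hypothetical).
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",        # hypothetical account identifier
        user="data_platform",
        password="***",              # use a secrets manager in real deployments
        role="SYSADMIN",
    )

    try:
        cur = conn.cursor()
        cur.execute(
            "CREATE WAREHOUSE IF NOT EXISTS ETL_WH "
            "WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE"
        )
    finally:
        conn.close()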
Requirements:
B.Sc. in Computer Science, Engineering, or similar.
Minimum of 4 years of professional experience in data engineering
Minimum of 4 years of professional experience in backend or software development
Independent, responsible, ready to face challenges head-on
Passionate about data engineering and constantly expanding your knowledge
Excellent communication and strong analytical and problem-solving skills
Proficient in Python
Experience with data streaming, Kafka, Airflow, cloud platforms (preferably AWS), K8s, Terraform, or equivalents.
A solid background working with data warehousing technologies, such as Snowflake, Databricks, Redshift, BigQuery, etc.
This position is open to all candidates.
 
Job ID: 8200225

Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced and passionate Staff Data Engineer to join our Data Platform group in TLV as a Tech Lead. As the group's Tech Lead, you'll shape and implement the technical vision and architecture while staying hands-on across three specialized teams: Data Engineering Infra, Machine Learning Platform, and Data Warehouse Engineering, forming the backbone of our company's data ecosystem.
The group's mission is to build a state-of-the-art Data Platform that drives our company toward becoming the most precise and efficient insurance company on the planet. By embracing Data Mesh principles, we create tools that empower teams to own their data while leveraging a robust, self-serve data infrastructure. This approach enables Data Scientists, Analysts, Backend Engineers, and other stakeholders to seamlessly access, analyze, and innovate with reliable, well-modeled, and queryable data, at scale.
In this role you'll:
Technically lead the group by shaping the architecture, guiding design decisions, and ensuring the technical excellence of the Data Platforms three teams
Design and implement data solutions that address both applicative needs and data analysis requirements, creating scalable and efficient access to actionable insights
Drive initiatives in Data Engineering Infra, including building robust ingestion layers, managing streaming ETLs, and guaranteeing data quality, compliance, and platform performance
Develop and maintain the Data Warehouse, integrating data from various sources for optimized querying, analysis, and persistence, supporting informed decision-making
Leverage data modeling and transformations to structure, cleanse, and integrate data, enabling efficient retrieval and strategic insights
Build and enhance the Machine Learning Platform, delivering infrastructure and tools that streamline the work of Data Scientists, enabling them to focus on developing models while benefiting from automation for production deployment, maintenance, and improvements. Support cutting-edge use cases like feature stores, real-time models, point-in-time (PIT) data retrieval, and telematics-based solutions (a minimal PIT-join sketch follows this list)
Collaborate closely with other Staff Engineers across our company to align on cross-organizational initiatives and technical strategies
Work seamlessly with Data Engineers, Data Scientists, Analysts, Backend Engineers, and Product Managers to deliver impactful solutions
Share knowledge, mentor team members, and champion engineering standards and technical excellence across the organization.
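As referenced in the point-in-time (PIT) item above, a minimal PIT retrieval sketch using pandas.merge_asof: for each labeled event, take the latest feature value known at or before the event time, avoiding leakage of future data. Column names and data are illustrative only.

    # Point-in-time join: latest feature at or before each event (illustrative data).
    import pandas as pd

    events = pd.DataFrame({
        "entity_id": ["a", "a", "b"],
        "event_ts": pd.to_datetime(["2025-01-01 10:00", "2025-01-02 09:00", "2025-01-01 12:00"]),
    })
    features = pd.DataFrame({
        "entity_id": ["a", "a", "b"],
        "feature_ts": pd.to_datetime(["2025-01-01 09:00", "2025-01-02 08:00", "2025-01-01 11:30"]),
        "risk_score": [0.2, 0.5, 0.7],
    })

    pit = pd.merge_asof(
        events.sort_values("event_ts"),
        features.sort_values("feature_ts"),
        left_on="event_ts",
        right_on="feature_ts",
        by="entity_id",
        direction="backward",   # only features known at or before the event time
    )
    print(pit)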
Requirements:
8+ years of experience in data-related roles such as Data Engineer, Data Infrastructure Engineer, BI Engineer, or Machine Learning Platform Engineer, with significant experience in at least two of these areas
A B.Sc. in Computer Science or a related technical field (or equivalent experience)
Extensive expertise in designing and implementing Data Lakes and Data Warehouses, including strong skills in data modeling and building scalable storage solutions
Proven experience in building large-scale data infrastructures, including both batch processing and streaming pipelines
A deep understanding of Machine Learning infrastructure, including tools and frameworks that enable Data Scientists to efficiently develop, deploy, and maintain models in production, an advantage
Proficiency in Python, Pulumi/Terraform, Apache Spark, AWS, Kubernetes (K8s), and Kafka for building scalable, reliable, and high-performing data solutions
Strong knowledge of databases, including SQL (schema design, query optimization) and NoSQL, with a solid understanding of their use cases
Ability to work in an office environment a minimum of 3 days a week
Enthusiasm about learning and adapting to the exciting world of AI; a commitment to exploring this field is a fundamental part of our culture.
This position is open to all candidates.
 
Job ID: 8206357

Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced Data Engineer Team Lead to join our growing data group. In this role, you'll not only design and implement high-scale, data-intensive platforms but also lead a team of talented engineers, driving the evolution of our data infrastructure. You will research and implement cutting-edge algorithmic solutions and collaborate with cross-functional teams on significant, high-impact projects that support our mission of providing accurate and actionable business intelligence.
Where does this role fit in our vision?
Every role at our company is designed with a clear purpose: to integrate collective efforts into our shared success, functioning as pieces of a collective brain. Data is at the heart of everything we do. Our data engineers are responsible for building the core products that drive our business. The data teams own and manage the pipelines that create our assets, with a strong focus on the company KPIs.
What will you be responsible for?
Solve Complex Business Problems with Data Integration and Engineering Solutions
Develop and implement robust, scalable data pipelines and integration solutions within our Databricks-based environment.
Leverage technologies like Spark, Kafka, and Airflow to tackle complex data challenges and enhance business operations.
Design innovative data solutions that support millions of data points, ensuring high performance and reliability.
Collaborate with Business Stakeholders and Define Priorities
Partner with Product Managers, Data Scientists, and other key stakeholders to prioritize data initiatives that directly impact our core product offerings and user experience.
Ensure data solutions align with business goals, helping drive product features that bring value to our users and streamline B2B sales processes.
Requirements:
3+ years of experience in leading people
3+ years of experience in data engineering
5+ years of experience as a software developer (Python preferred)
Proficiency in advanced SQL (analytical functions) and query performance tuning (a brief window-function sketch follows this requirements list)
Experience working within the AWS ecosystem (AWS S3, Glue, EMR, Redshift, Athena) or comparable cloud environments (Azure/GCP)
Experience extracting, ingesting, and transforming large data sets
Experience working with multiple kinds of databases (relational, document storage, key-value, time-series) and their associated query languages - specifically, understanding when to use one vs. another
Experience with big data platforms such as Spark, Databricks, and Kafka as a streaming bus
Experience utilizing enterprise data warehousing systems such as Redshift or Snowflake
AI-savvy: comfortable working with AI tools and staying ahead of emerging trends.
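As referenced in the SQL requirement above, a brief analytical-SQL sketch on Spark using window functions; the table and column names are illustrative only.

    # Rank each account's orders by recency and compute a per-account total (illustrative data).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("analytical_sql").getOrCreate()

    spark.createDataFrame(
        [("acct1", "2025-01-01", 120.0), ("acct1", "2025-01-03", 80.0), ("acct2", "2025-01-02", 50.0)],
        ["account_id", "order_date", "amount"],
    ).createOrReplaceTempView("orders")

    spark.sql("""
        SELECT account_id,
               order_date,
               amount,
               ROW_NUMBER() OVER (PARTITION BY account_id ORDER BY order_date DESC) AS recency_rank,
               SUM(amount)  OVER (PARTITION BY account_id)                          AS account_total
        FROM orders
    """).show()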
This position is open to all candidates.
 
Job ID: 8212732

28/05/2025
Location: Tel Aviv-Yafo and Netanya
Job Type: Full Time
We are looking for a Data Engineering Lead to join our Platform Group.
In this role, you will drive the development of scalable data pipelines and infrastructures that are crucial to our platforms success. You will collaborate across departments, and innovate to ensure our data ecosystem is robust, secure, and optimized for growth.
As a Data Engineering Lead you will...
Architect and develop data pipelines: Lead the design and implementation of data pipelines that support our platform, ensuring high data quality, security, and governance (a minimal quality-check sketch follows this list). Introduce new tools and technologies to enhance data workflows and integration
Develop a strategic roadmap that outlines key engineering solutions to support our platform's scalability and performance, aligned with our overall vision and objectives
Collaborate across teams: Work closely with internal teams, including DevOps, BI, Product, and development groups, to ensure seamless data integration and drive data-driven decision-making across the organization
Establish data guidelines and documentation: Define best practices for data generation, consumption, and management within the platform. Create thorough documentation for all data processes to facilitate clear communication and future maintenance.
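As referenced in the data-quality responsibility above, a minimal, dependency-free quality gate sketch: verify that required columns exist and key fields are non-null before a batch is published downstream. The record shape, required columns, and usage are assumptions.

    # Tiny data-quality gate: report missing columns and null key fields (shapes are assumptions).
    from typing import Dict, List

    REQUIRED_COLUMNS = {"order_id", "customer_id", "amount"}

    def validate_batch(rows: List[Dict]) -> List[str]:
        """Return human-readable violations; an empty list means the batch passes."""
        violations = []
        for i, row in enumerate(rows):
            missing = REQUIRED_COLUMNS - set(row)
            if missing:
                violations.append(f"row {i}: missing columns {sorted(missing)}")
            elif row["order_id"] is None or row["customer_id"] is None:
                violations.append(f"row {i}: null key field")
        return violations

    # Example usage
    problems = validate_batch([{"order_id": 1, "customer_id": None, "amount": 9.5}])
    if problems:
        print("Blocking publish:", problems)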
Requirements:
5+ years hands-on proven experience designing, building, and optimizing scalable and highly available data-intensive systems
In-depth understanding of big data engines and frameworks, such as Spark, and experience with ETL/ELT tools for robust data pipeline development
Proven ability to lead initiatives and drive technical agendas in a hands-on capacity, with potential for team building in the future
Knowledge of machine learning frameworks and strategic industry trends a plus
Strong business skills and strategic thinking, an innovative and growth mindset
Strong interpersonal skills to collaborate with internal teams and external partners effectively.
This position is open to all candidates.
 
Job ID: 8197237

6 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a highly skilled and experienced BackEnd Team Lead to join our dynamic team in Tel Aviv. In this pivotal role, you will oversee the design, development, and optimization of our infrastructure, ensuring efficient flow and quality to support data-driven decision-making across the organization. You will lead a team of backend developers and data engineers, collaborate with cross-functional teams, and drive initiatives to enhance our data collection and data pipeline capabilities.
Key Responsibilities:
Lead and oversee a team of backend and data engineers, mentoring team members and fostering skill development and professional growth
Establish and enforce data quality standards, implementing processes for data accuracy, reconciliation, and consistency.
Oversee the design and implementation of robust ETL processes to ensure accurate and timely data integration from various sources.
Collaborate across the business with various stakeholders, such as the CTO, Data Analysts and Product Managers to deliver impactful solutions
Continuously monitor and optimize system flows for performance, scalability, and cost-efficiency.
Requirements:
At least 7 years of experience in backend or data engineering, with a minimum of 3 years in a team lead role.
Experience with development, deployment, and automation on AWS - S3, SNS, Lambda, Batch, DynamoDB, RDS, etc. (a brief Lambda handler sketch follows the Advantage list below).
Expertise in advanced SQL, including writing complex, efficient queries and building optimized datasets.
Proficient in Python and Python-based data tools.
Experience working with microservices on K8s
Proven track record of delivering large-scale, high-quality products
Independent autodidact with excellent problem-solving skills and a passion for learning cutting-edge technologies
Proven ability to lead and mentor a team, manage projects, and collaborate effectively with cross-functional teams.
Outstanding interpersonal skills
Advantage:
Experience in building data stream pipelines and ETL using Big Data frameworks such as Spark and Apache Airflow
Experience with Serverless frameworks
Experience with streaming technologies such as Amazon Kinesis and Apache Kafka.
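As referenced in the AWS requirement above, a minimal AWS Lambda handler sketch for an S3 object-created event that republishes a small summary to SNS. The bucket, topic ARN, and payload shape are hypothetical.

    # Lambda handler for S3 "object created" events; republishes a summary to SNS.
    import json

    import boto3

    s3 = boto3.client("s3")
    sns = boto3.client("sns")

    TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:ingest-events"  # hypothetical topic

    def handler(event, context):
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            size = s3.head_object(Bucket=bucket, Key=key)["ContentLength"]
            sns.publish(
                TopicArn=TOPIC_ARN,
                Message=json.dumps({"bucket": bucket, "key": key, "bytes": size}),
            )
        return {"processed": len(event.get("Records", []))}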
This position is open to all candidates.
 
Job ID: 8217961

4 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Data Infra Tech Lead
A day in the life and how you'll make an impact:
We're seeking an experienced and skilled Data Infra Tech Lead to join our Data Infrastructure team and drive the company's data capabilities at scale.
As the company is growing fast, the mission of the data infrastructure team is to ensure the company can manage data at scale efficiently and seamlessly through robust and reliable data infrastructure. As a tech lead, you are required to independently lead the design, development, and optimization of our data infrastructure, collaborating closely with software engineers, data scientists, data engineers, and other key stakeholders. You are expected to own critical initiatives, influence architectural decisions, and mentor engineers to foster a high-performing team.
You will:
Lead the design and development of scalable, reliable, and secure data storage, processing, and access systems.
Define and drive best practices for CI/CD processes, ensuring seamless deployment and automation of data services.
Oversee and optimize our machine learning platform for training, releasing, serving, and monitoring models in production.
Own and develop the company-wide LLM infrastructure, enabling teams to efficiently build and deploy projects leveraging LLM capabilities.
Own the company's feature store, ensuring high-quality, reusable, and consistent features for ML and analytics use cases.
Architect and implement real-time event processing and data enrichment solutions, empowering teams with high-quality, real-time insights (a minimal enrichment sketch follows this list).
Partner with cross-functional teams to integrate data and machine learning models into products and services.
Ensure that our data systems are compliant with the data governance requirements of our customers and industry best practices.
Mentor and guide engineers, fostering a culture of innovation, knowledge sharing, and continuous improvement.
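As referenced in the enrichment responsibility above, a minimal, dependency-free enrichment sketch: incoming events are merged with reference data from an in-memory lookup that stands in for a feature store or lookup service. Names and shapes are assumptions.

    # Enrich events with account metadata from a lookup (names and shapes are assumptions).
    from typing import Dict, Iterable, Iterator

    ACCOUNT_LOOKUP: Dict[str, Dict] = {           # would be backed by a store or cache in practice
        "acct-1": {"segment": "enterprise", "country": "IL"},
    }

    def enrich(events: Iterable[Dict]) -> Iterator[Dict]:
        """Yield each event with account attributes merged in (or a default segment)."""
        for event in events:
            meta = ACCOUNT_LOOKUP.get(event.get("account_id"), {"segment": "unknown"})
            yield {**event, **meta}

    for enriched in enrich([{"account_id": "acct-1", "action": "login"}]):
        print(enriched)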
Requirements:
7+ years of experience in data infra or backend engineering.
Strong knowledge of data services architecture, and ML Ops.
Experience with cloud-based data infrastructure, such as AWS, GCP, or Azure.
Deep experience with SQL and NoSQL databases.
Experience with Data Warehouse technologies such as Snowflake and Databricks.
Proficiency in backend programming languages like Python, NodeJS, or an equivalent.
Proven leadership experience, including mentoring engineers and driving technical initiatives.
Strong communication, collaboration, and stakeholder management skills.
Bonus Points:
Experience leading teams working with serverless technologies like AWS Lambda.
Hands-on experience with TypeScript in backend environments.
Familiarity with Large Language Models (LLMs) and AI infrastructure.
Experience building infrastructure for Data Science and Machine Learning.
Experience collaborating with BI developers and analysts to drive business value.
Expertise in administering and managing Databricks clusters.
Experience with streaming technologies such as Amazon Kinesis and Apache Kafka.
This position is open to all candidates.
 
Job ID: 8220200

01/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a hands-on Senior BI Analyst to drive the full business intelligence stack, from data ingestion and modeling to dashboarding and stakeholder insights. If you've led BI projects end-to-end, enjoy building scalable systems, and want to have a strategic impact, this role is for you.

This is not your typical analyst role: we're looking for someone who can independently set up and scale our BI foundations while partnering closely with business, product, and engineering teams.
You'll be a one-person BI powerhouse.
About the role:
Lead the design and execution of cross-company BI initiatives from concept to implementation.
Partner with GTM, product, and operations teams to define KPIs, track performance, and uncover growth opportunities (a small KPI-rollup sketch follows this list).
Implement and scale BI platforms (e.g., Power BI, Tableau) and define best practices for dashboards and reporting.
Own the full data flow: from source systems to dashboards - helping ensure data quality, consistency, and accessibility.
Write optimized SQL for complex analyses, recurring reports, and data validations.
Build and maintain clean, scalable data pipelines and models in collaboration with data engineering.
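As referenced in the KPI item above, a small, illustrative KPI rollup in pandas: monthly active accounts and revenue from a flat events extract. Column names and data are assumptions for this sketch.

    # Monthly KPI rollup from a flat extract (columns and values are illustrative).
    import pandas as pd

    df = pd.DataFrame({
        "account_id": ["a", "a", "b", "c"],
        "event_date": pd.to_datetime(["2025-01-05", "2025-02-01", "2025-01-20", "2025-02-14"]),
        "revenue": [100.0, 40.0, 250.0, 0.0],
    })

    kpis = (
        df.assign(month=df["event_date"].dt.to_period("M"))
          .groupby("month")
          .agg(active_accounts=("account_id", "nunique"), revenue=("revenue", "sum"))
          .reset_index()
    )
    print(kpis)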
Requirements:
4+ years of experience as a BI Analyst, Analytics Engineer, or Data Engineer in a B2B SaaS or high-growth environment.
Proven ability to lead end-to-end BI projects, not just participate in isolated deliverables.
Demonstrated success in delivering impactful insights and solutions based on data analysis.
Strong grasp of the modern data stack, especially dbt, Airbyte/Fivetran, and orchestration tools like Airflow.
Hands-on experience with data modeling (e.g., star/snowflake schema), SQL optimization, and data architecture.
Experience with data warehouses such as Snowflake, BigQuery, or Redshift.
Advanced proficiency in one or more BI tools (Looker, Tableau, Power BI).
Strong communication skills; able to translate data into clear business insights.
Bonus Points For:
Ability to build and maintain backend data infrastructure in addition to analytical skills - a huge plus
Experience working with platforms like DataBricks or similar data lake environments.
Exposure to GTM data tooling (e.g., HubSpot, Salesforce, product analytics tools) and metrics frameworks.
This position is open to all candidates.
 
Job ID: 8200239