Posted 5 hours ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are the leader in hybrid-cloud security posture management, using the attacker's perspective to find and remediate critical attack paths across on-premises and multi-cloud networks. We are looking for a talented Senior Data Engineer to join a core team of experts responsible for developing innovative cyber-attack techniques for cloud-based environments (AWS, Azure, GCP, Kubernetes) that integrate into our fully automated attack simulation.
About the Role: We are seeking an experienced Senior Data Engineer to join our dynamic data team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure, ensuring the availability, reliability, and quality of our data. This role requires strong technical expertise, problem-solving skills, and the ability to collaborate across teams to deliver data-driven solutions.
Key Responsibilities:
* Design, implement, and maintain robust, scalable, and high-performance data pipelines and ETL processes (see the sketch after this list).
* Develop and optimize data models, schemas, and storage solutions to support analytics and Machine Learning initiatives.
* Collaborate with software engineers and product managers to understand data requirements and deliver high-quality solutions.
* Ensure data quality, integrity, and governance across multiple sources and systems.
* Monitor and troubleshoot data workflows, resolving performance and reliability issues.
* Evaluate and implement new data technologies and frameworks to improve the data platform.
* Document processes, best practices, and data architecture.
* Mentor junior data engineers and contribute to team knowledge sharing.
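As a purely illustrative sketch of the pipeline and ETL work listed above (not part of the original posting), a minimal daily ETL DAG in Apache Airflow might look like the following; the DAG id, task bodies, and schedule are hypothetical placeholders.

```python
# Minimal daily ETL DAG sketch (Airflow 2.x style).
# DAG id, schedule, and task bodies are illustrative placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw events from a source system into staging storage.
    print("extracting raw events")


def transform(**context):
    # Placeholder: clean, deduplicate, and model the staged data.
    print("transforming staged data")


def load(**context):
    # Placeholder: load curated tables into the analytics warehouse.
    print("loading curated tables")


with DAG(
    dag_id="daily_events_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```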
Requirements:
Required Qualifications:
* Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
* 5+ years of experience in data engineering, ETL development, or a similar role.
* Strong proficiency in SQL and experience with relational and NoSQL databases.
* Experience with data pipeline frameworks and tools such as Apache Spark, Airflow, and Kafka - a must.
* Familiarity with cloud platforms (AWS, GCP, or Azure) and their data services.
* Solid programming skills in Python, Java, or Scala.
* Strong problem-solving, analytical, and communication skills.
* Knowledge of data governance, security, and compliance standards.
* Experience with data warehousing, Big Data technologies (e.g., ClickHouse, SingleStore, StarRocks), and data modeling best practices.
Preferred Qualifications (Advantage):
* Familiarity with Machine Learning workflows and MLOps practices.
* Experience with data lakehouse architectures and technologies such as Apache Iceberg.
* Experience working with data ecosystems in open-source/on-premise environments.
Why Join Us:
* Work with cutting-edge technologies and large-scale data systems.
* Collaborate with a talented and innovative team.
* Opportunities for professional growth and skill development.
* Make a direct impact on data-driven decision-making across the organization.
This position is open to all candidates.
 
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced and passionate Staff Data Engineer to join our Data Platform group in TLV as a Tech Lead. As the group's Tech Lead, you'll shape and implement the technical vision and architecture while staying hands-on across three specialized teams - Data Engineering Infra, Machine Learning Platform, and Data Warehouse Engineering - which form the backbone of our data ecosystem.

The group's mission is to build a state-of-the-art Data Platform that drives us toward becoming the most precise and efficient insurance company on the planet. By embracing Data Mesh principles, we create tools that empower teams to own their data while leveraging a robust, self-serve data infrastructure. This approach enables Data Scientists, Analysts, Backend Engineers, and other stakeholders to seamlessly access, analyze, and innovate with reliable, well-modeled, and queryable data, at scale.

In this role you'll:

Technically lead the group by shaping the architecture, guiding design decisions, and ensuring the technical excellence of the Data Platforms three teams.

Design and implement data solutions that address both applicative needs and data analysis requirements, creating scalable and efficient access to actionable insights.

Drive initiatives in Data Engineering Infra, including building robust ingestion layers, managing streaming ETLs, and guaranteeing data quality, compliance, and platform performance.

Develop and maintain the Data Warehouse, integrating data from various sources for optimized querying, analysis, and persistence, supporting informed decision-making. Leverage data modeling and transformations to structure, cleanse, and integrate data, enabling efficient retrieval and strategic insights.

Build and enhance the Machine Learning Platform, delivering infrastructure and tools that streamline the work of Data Scientists, enabling them to focus on developing models while benefiting from automation for production deployment, maintenance, and improvements. Support cutting-edge use cases like feature stores, real-time models, point-in-time (PIT) data retrieval, and telematics-based solutions.
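For illustration only, and not taken from the posting, a point-in-time (PIT) retrieval of the kind mentioned above could be sketched in PySpark roughly as follows; the S3 paths and column names are assumptions.

```python
# Point-in-time join sketch in PySpark: attach to each labeled event the
# most recent feature row that does not leak future information.
# Paths and column names are illustrative placeholders.
from pyspark.sql import SparkSession, Window, functions as F

spark = SparkSession.builder.appName("pit_join_sketch").getOrCreate()

events = spark.read.parquet("s3://bucket/events/")      # entity_id, event_ts, label
features = spark.read.parquet("s3://bucket/features/")  # entity_id, feature_ts, feature_value

# Keep only feature rows observed at or before each event (no look-ahead);
# events with no prior feature are dropped in this simplified sketch.
joined = (
    events.join(features, on="entity_id", how="left")
          .where(F.col("feature_ts") <= F.col("event_ts"))
)

w = Window.partitionBy("entity_id", "event_ts").orderBy(F.col("feature_ts").desc())

pit_features = (
    joined.withColumn("rn", F.row_number().over(w))
          .where(F.col("rn") == 1)   # latest eligible feature per event
          .drop("rn")
)

pit_features.write.mode("overwrite").parquet("s3://bucket/training_set/")  # placeholder sink
```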

Collaborate closely with other Staff Engineers across the company to align on cross-organizational initiatives and technical strategies.

Work seamlessly with Data Engineers, Data Scientists, Analysts, Backend Engineers, and Product Managers to deliver impactful solutions.

Share knowledge, mentor team members, and champion engineering standards and technical excellence across the organization.
Requirements:
8+ years of experience in data-related roles such as Data Engineer, Data Infrastructure Engineer, BI Engineer, or Machine Learning Platform Engineer, with significant experience in at least two of these areas.

A B.Sc. in Computer Science or a related technical field (or equivalent experience).

Extensive expertise in designing and implementing Data Lakes and Data Warehouses, including strong skills in data modeling and building scalable storage solutions.

Proven experience in building large-scale data infrastructures, including both batch processing and streaming pipelines.

A deep understanding of Machine Learning infrastructure, including tools and frameworks that enable Data Scientists to efficiently develop, deploy, and maintain models in production - an advantage.

Proficiency in Python, Pulumi/Terraform, Apache Spark, AWS, Kubernetes (K8s), and Kafka for building scalable, reliable, and high-performing data solutions.

Strong knowledge of databases, including SQL (schema design, query optimization) and NoSQL, with a solid understanding of their use cases.

Ability to work in an office environment a minimum of 3 days a week.

Enthusiasm for learning and adapting to the exciting world of AI; a commitment to exploring this field is a fundamental part of our culture.
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Engineer to join our Platform group in the Data Infrastructure team.
You'll work hands-on to design and deliver data pipelines, distributed storage, and streaming services that keep our company's data platform performant and reliable. As a senior individual contributor you will lead complex projects within the team, raise the bar on engineering best practices, and mentor mid-level engineers while collaborating closely with product, DevOps, and analytics stakeholders.
Code & ship production-grade services, pipelines and data models that meet performance, reliability and security goals
Lead design and delivery of team-level projects from RFC through rollout and operational hand-off
Improve system observability, testing and incident response processes for the data stack
Partner with Staff Engineers and Tech Leads on architecture reviews and platform-wide standards
Mentor junior and mid-level engineers, fostering a culture of quality, ownership and continuous improvement
Stay current with evolving data-engineering tools and bring pragmatic innovations into the team.
Requirements:
5+ years of hands-on experience in backend or data engineering, including 2+ years at a senior level delivering production systems
Strong coding skills in Python, Kotlin, Java or Scala with emphasis on clean, testable, production-ready code
Proven track record designing, building and operating distributed data pipelines and storage (batch or streaming)
Deep experience with relational databases (PostgreSQL preferred) and working knowledge of at least one NoSQL or columnar/analytical store (e.g. SingleStore, ClickHouse, Redshift, BigQuery)
Solid hands-on experience with event-streaming platforms such as Apache Kafka (see the sketch after this list)
Familiarity with data-orchestration frameworks such as Airflow
Comfortable with modern CI/CD, observability and infrastructure-as-code practices in a cloud environment (AWS, GCP or Azure)
Ability to break down complex problems, communicate trade-offs clearly, and collaborate effectively with engineers and product partners
Bonus Skills
Experience building data governance or security/compliance-aware data platforms
Familiarity with Kubernetes, Docker, and infrastructure-as-code tools
Experience with data quality frameworks, lineage, or metadata tooling.
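Purely as a hedged illustration of the event-streaming experience called out above, a minimal consumer using the kafka-python client might look like this; the broker address, topic, and consumer group are placeholders.

```python
# Minimal Kafka consumer sketch using the kafka-python client.
# Broker address, topic name, and consumer group are placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders-events",                      # hypothetical topic
    bootstrap_servers="localhost:9092",   # placeholder broker
    group_id="data-platform-ingest",      # placeholder consumer group
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Placeholder processing: a real pipeline would validate the event and
    # write it to the appropriate storage layer instead of printing it.
    print(message.topic, message.partition, message.offset, event)
```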
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Data Engineer to join our growing team!
This is a great opportunity to be part of one of the fastest-growing infrastructure companies in history, an organization at the center of the hurricane being created by the revolution in artificial intelligence.
"Our company's data management vision is the future of the market." - Forbes
We are the data platform company for the AI era. We are building the enterprise software infrastructure to capture, catalog, refine, enrich, and protect massive datasets and make them available for real-time data analysis and AI training and inference. Designed from the ground up to make AI simple to deploy and manage, our company takes the cost and complexity out of deploying enterprise and AI infrastructure across data center, edge, and cloud.
Our success has been built through intense innovation, a customer-first mentality, and a fearless team who leverage their skills and experience to make real market impact. This is an opportunity to be a key contributor at a pivotal time in our company's growth and at a pivotal point in computing history.
In this role, you will be responsible for:
Designing, building, and maintaining scalable data pipeline architectures
Developing ETL processes to integrate data from multiple sources
Creating and optimizing data models for efficient storage and retrieval
Implementing data quality controls and monitoring systems (see the sketch after this list)
Collaborating with data scientists and analysts to deliver data solutions
Building and maintaining data warehouses and data lakes
Performing in-depth data analysis and providing insights to stakeholders
Taking full ownership of data quality, documentation, and governance processes
Building and maintaining comprehensive reports and dashboards
Ensuring data security and regulatory compliance.
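As a minimal, assumption-laden example of the data quality controls mentioned above (not from the posting itself), a batch check in Python with pandas could look like this; the input path and column names are hypothetical.

```python
# Simple batch data-quality checks with pandas.
# The input path and expected columns are illustrative placeholders.
import pandas as pd

df = pd.read_parquet("s3://bucket/curated/orders.parquet")  # placeholder path

checks = {
    "order_id_not_null": df["order_id"].notna().all(),
    "order_id_unique": df["order_id"].is_unique,
    "amount_non_negative": (df["amount"] >= 0).all(),
    "created_at_parses": pd.to_datetime(df["created_at"], errors="coerce").notna().all(),
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # In production this would fail the pipeline run or alert an on-call channel.
    raise ValueError(f"data quality checks failed: {failed}")
print("all data quality checks passed")
```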
Requirements:
Bachelor's degree in Computer Science, Engineering, or related field
3+ years experience in data engineering
Strong proficiency in SQL and Python
Experience with ETL tools and data warehousing solutions
Knowledge of big data technologies (Hadoop, Spark, etc.)
Experience with cloud platforms (AWS, Azure, or GCP)
Understanding of data modeling and database design principles
Familiarity with data visualization tools such as Tableau and Sisense
Strong problem-solving and analytical skills
Excellent communication and collaboration abilities
Experience with version control systems (Git).
This position is open to all candidates.
 
2 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Shape the Future of Data - Join our mission to build the foundational pipelines and tools that power measurement, insights, and decision-making across our product, analytics, and leadership teams.
Develop the Platform Infrastructure - Build the core infrastructure that powers our data ecosystem, including the Kafka events system, DDL management with Terraform, internal data APIs on top of Databricks, and custom admin tools (e.g., Django-based interfaces).
Build Real-time Analytical Applications - Develop internal web applications that provide real-time visibility into platform behavior, operational metrics, and business KPIs, integrating data engineering with user-facing insights.
Solve Meaningful Problems with the Right Tools - Tackle complex data challenges using modern technologies such as Spark, Kafka, Databricks, AWS, Airflow, and Python (see the sketch after this section). Think creatively to make the hard things simple.
Own It End-to-End - Design, build, and scale our high-quality data platform by developing reliable and efficient data pipelines. Take ownership from concept to production and long-term maintenance.
Collaborate Cross-Functionally - Partner closely with backend engineers, data analysts, and data scientists to drive initiatives from both a platform and business perspective. Help translate ideas into robust data solutions.
Optimize for Analytics and Action - Design and deliver datasets in the right shape, location, and format to maximize usability and impact - whether that's through lakehouse tables, real-time streams, or analytics-optimized storage.
You will report to the Data Engineering Team Lead and help shape a culture of technical excellence, ownership, and impact.
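To make the Spark-plus-Kafka stack mentioned above a little more concrete, here is an illustrative sketch (not the team's actual code) of ingesting an events topic with Spark Structured Streaming; the broker, topic, and sink paths are placeholders, and the Kafka source assumes the spark-sql-kafka connector is on the classpath.

```python
# Illustrative sketch: ingest a Kafka events topic with Spark Structured
# Streaming and land it as files for downstream lakehouse tables.
# Broker, topic, and sink paths are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_ingest_sketch").getOrCreate()

raw = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
         .option("subscribe", "product-events")             # hypothetical topic
         .option("startingOffsets", "latest")
         .load()
)

events = raw.select(
    F.col("key").cast("string").alias("event_key"),
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("ingested_at"),
)

query = (
    events.writeStream.format("parquet")
          .option("path", "s3://bucket/raw_events/")          # placeholder sink
          .option("checkpointLocation", "s3://bucket/_chk/")  # placeholder checkpoint
          .start()
)

query.awaitTermination()
```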
Requirements:
5+ years of hands-on experience as a Data Engineer, building and operating production-grade data systems.
3+ years of experience with Spark, SQL, Python, and orchestration tools like Airflow (or similar).
Degree in Computer Science, Engineering, or a related quantitative field.
Proven track record in designing and implementing high-scale ETL pipelines and real-time or batch data workflows.
Deep understanding of data lakehouse and warehouse architectures, dimensional modeling, and performance optimization.
Strong analytical thinking, debugging, and problem-solving skills in complex environments.
Familiarity with infrastructure as code, CI/CD pipelines, and building data-oriented microservices or APIs.
Enthusiasm for AI-driven developer tools such as Cursor.AI or GitHub Copilot.
This position is open to all candidates.
 
28/10/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a top-notch Senior Software Engineer to help us tackle the toughest challenge in cybersecurity: turning endless amounts of data into crisp, easy, and actionable insights.

Responsibilities
Collaborate with a senior Agile Scrum team to design, develop, and maintain large-scale, cloud-based data processing pipelines and backend components. Work with cutting-edge technologies including Spark, Kubernetes, AWS, and modern data lakes like Databricks and Snowflake.
Design and implement scalable, cost-effective solutions that deliver high performance and are easy to maintain. Tackle complex, high-scale problems and drive performance optimization and cost-efficiency across the data pipeline.
Partner with engineers across our company's R&D and Product teams to enhance our platform and provide capabilities for internal and external users to build data transformations and detection pipelines at scale.
Build robust monitoring and observability solutions to ensure full visibility across all stages of data processing (see the sketch after this section).
Stay current with trends in big data processing and distributed computing. Contribute to code quality through regular reviews and adherence to best practices.
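As an illustrative sketch only of the monitoring and observability work described above (not the company's actual stack), Python's prometheus_client can expose simple throughput and latency metrics for a processing stage; the metric names, labels, and port are assumptions.

```python
# Minimal pipeline-observability sketch with prometheus_client.
# Metric names, labels, and the port are illustrative placeholders.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

RECORDS_PROCESSED = Counter(
    "pipeline_records_processed_total", "Records processed per stage", ["stage"]
)
BATCH_SECONDS = Histogram(
    "pipeline_batch_duration_seconds", "Batch processing time per stage", ["stage"]
)


def process_batch(stage: str, records: list) -> None:
    # Time the batch and count its records under the given stage label.
    with BATCH_SECONDS.labels(stage=stage).time():
        time.sleep(random.random() / 10)  # stand-in for real processing work
        RECORDS_PROCESSED.labels(stage=stage).inc(len(records))


if __name__ == "__main__":
    start_http_server(8000)  # metrics scrapeable at http://localhost:8000/metrics
    while True:
        process_batch("normalize", list(range(100)))
        time.sleep(5)
```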
Requirements:
4+ years of experience as a Backend Engineer
3+ years of hands-on experience in Scala/Python/Java and cloud architecture (EMR/K8S).
Deep technical expertise in distributed systems, stream processing, and data modeling of large data sets.
Proven track record of delivering scalable and secure systems in a fast-paced working environment.
Experience with data governance practices, data security, performance and cost optimization, and containers; working with AWS services such as S3, EKS, and more.
Strong problem-solving skills and ability to work independently.
A team player with excellent communication skills.
B.Sc. in Computer Science or equivalent.
Advantages:

Experience in Big Data frameworks such as Spark
Experience with modern Data lakes/warehouses such as Snowflake and Databricks.
Production experience working with SaaS environments.
Experience in data modeling.
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
As a Senior Data Engineer, you'll be more than just a coder - you'll be the architect of our data ecosystem. We're looking for someone who can design scalable, future-proof data pipelines and connect the dots between DevOps, backend engineers, data scientists, and analysts.

You'll lead the design, build, and optimization of our data infrastructure, from real-time ingestion to supporting machine learning operations. Every choice you make will be data-driven and cost-conscious, ensuring efficiency and impact across the company.

Beyond engineering, you'll be a strategic partner and problem-solver, sometimes diving into advanced analysis or data science tasks. Your work will directly shape how we deliver innovative solutions and support our growth at scale.



Responsibilities:

Design and Build Data Pipelines: Architect, build, and maintain our end-to-end data pipeline infrastructure to ensure it is scalable, reliable, and efficient.
Optimize Data Infrastructure: Manage and improve the performance and cost-effectiveness of our data systems, with a specific focus on optimizing pipelines and usage within our Snowflake data warehouse. This includes implementing FinOps best practices to monitor, analyze, and control our data-related cloud costs (see the sketch after this list).
Enable Machine Learning Operations (MLOps): Develop the foundational infrastructure to streamline the deployment, management, and monitoring of our machine learning models.
Support Data Quality: Optimize ETL processes to handle large volumes of data while ensuring data quality and integrity across all our data sources.
Collaborate and Support: Work closely with data analysts and data scientists to support complex analysis, build robust data models, and contribute to the development of data governance policies.
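As a hedged sketch of the FinOps angle mentioned above (not from the posting), recent warehouse credit usage can be summarized through Snowflake's Python connector; the connection parameters are placeholders and the query assumes read access to the SNOWFLAKE.ACCOUNT_USAGE share.

```python
# Hedged FinOps sketch: summarize recent Snowflake warehouse credit usage
# via the Python connector. Connection details are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="finops_reader",      # placeholder
    password="***",            # placeholder
    warehouse="ANALYTICS_WH",  # placeholder
)

QUERY = """
    SELECT warehouse_name,
           DATE_TRUNC('day', start_time) AS usage_day,
           SUM(credits_used)             AS credits
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -30, CURRENT_TIMESTAMP())
    GROUP BY 1, 2
    ORDER BY usage_day DESC, credits DESC
"""

cur = conn.cursor()
try:
    cur.execute(QUERY)
    for warehouse_name, usage_day, credits in cur.fetchall():
        print(f"{usage_day:%Y-%m-%d}  {warehouse_name:<25}  {credits:8.2f} credits")
finally:
    cur.close()
    conn.close()
```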
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field.
Experience: 5+ years of hands-on experience as a Data Engineer or in a similar role.
Data Expertise: Strong understanding of data warehousing concepts, including a deep familiarity with Snowflake.
This position is open to all candidates.
 
08/10/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a highly skilled and experienced Software Engineer to join our dynamic team. The ideal candidate will have a robust background in software development and a passion for creating secure, innovative, and efficient software solutions.
Key Responsibilities:
Design, develop, and maintain software solutions across the backend (Node.js), frontend (Vue.js), and additional client (Go).
Participate in designing and building the architecture of robust, scalable, and secure software solutions across the entire stack.
Ensure seamless integration between the backend and frontend components.
Implement and manage robust security practices to protect our software and clients' data in the cloud and on-premise.
Implement or utilize AI/ML algorithms within the product as needed.
Collaborate with other team members to define, design, and ship new features.
Write clean, maintainable, and efficient code.
Conduct code reviews and provide constructive feedback to team members.
Troubleshoot, debug, and upgrade existing software.
Stay up-to-date with emerging technologies and industry trends.
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.
2+ years of professional experience in software development.
Strong understanding of security principles and experience with security technologies (e.g., encryption, hashing, authentication, security protocols).
Experience with secure cloud and on-premise data storage practices, securing data transport (e.g., HTTPS, TLS, VPN) and secure key management practices.
Understanding of secure software development lifecycle (SDLC) practices.
Experience with Git version control.
Experience working in an Agile development environment and familiarity with Agile methodologies.
Excellent problem-solving skills and attention to detail.
Strong communication skills and ability to work collaboratively in a team environment.
Preferred Qualifications:
Proficiency in JavaScript and experience with Node.js, experience with front-end frameworks, particularly Vue.js and familiarity with the Go programming language.
Experience with cloud services such as AWS.
Knowledge of containerization technologies like Docker and Kubernetes.
Familiarity with CI/CD pipelines and DevOps practices.
Experience with database management and design (SQL and NoSQL).
Experience with distributed systems and decentralized applications (dApps, Web3), proficiency in programming languages commonly used for blockchain development (e.g., Solidity, Hyperledger Fabric). Experience in developing smart contracts.
Strong understanding of blockchain technology, including consensus mechanisms, cryptography, and smart contracts. Experience working with private blockchains or other distributed ledger technologies (DLTs).
Knowledge of common security vulnerabilities (e.g., OWASP Top Ten) and mitigation strategies.
Familiarity with compliance standards and regulations (e.g., GDPR, HIPAA, ISO).
Experience with security tools such as static code analysis tools, vulnerability scanners, and penetration testing tools.
Proven experience in implementing secure coding practices and conducting security audits.
Contributions to open-source projects or participation in the tech community.
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a hands-on leader to take ownership of our data and analytics landscape.
This isn't a pure management role - we need someone who can roll up their sleeves, dive into AWS environments, and still keep the strategic view needed to guide a growing data team.
The ideal candidate combines strong technical expertise in data engineering and analytics with a solid understanding of and genuine passion for Artificial Intelligence (AI), including how AI can enhance data-driven decision-making and operational efficiency in the financial domain.
In this position you'll:
Shape the company's overall data strategy and decide how we store, move, and use data across the business.
Lead a small but growing group of engineers, analysts, and BI specialists, making sure they have the right tools and direction.
Stay close to the technology - designing and reviewing data pipelines, checking performance, and being the go-to person for AWS data solutions like Redshift, Glue, Athena, S3, and others (see the sketch after this list).
Work with business leaders (finance, risk, compliance) to turn their questions into clear, data-driven answers.
Identify opportunities to integrate AI/ML into data workflows and business processes.
Establish best practices for data governance, quality, and security in a financial services org.
Drive automation and efficiency through modern data and AI tools.
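Purely for illustration (not the company's actual setup), an ad-hoc query against the AWS data stack mentioned above could be run through Athena with boto3 roughly as follows; the region, database, table, and output bucket are placeholders.

```python
# Illustrative sketch only: run an ad-hoc Athena query over data in S3
# with boto3 and poll for the result. Names and region are placeholders.
import time

import boto3

athena = boto3.client("athena", region_name="eu-west-1")  # placeholder region

execution = athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) AS n FROM transactions GROUP BY status",
    QueryExecutionContext={"Database": "finance_lake"},                 # placeholder
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},  # placeholder
)
query_id = execution["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)[
        "QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:  # first row is the header
        print([col.get("VarCharValue") for col in row["Data"]])
```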
Requirements:
A deep background in databases, BI, and analytics (7+ years).
Several years leading or directing data teams, ideally in finance or a regulated industry.
Hands-on skills in SQL, Python, ETL, and cloud data platforms - not just theory.
Strong AWS knowledge - you should feel at home in that ecosystem.
Strong understanding and enthusiasm for AI and ML concepts, with the ability to identify and implement practical applications within data and analytics environments.
Experience in applying AI technologies such as LLMs, predictive modeling, or AI-driven automation in financial or data-intensive settings.
This position is open to all candidates.
 
23/10/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We curate and contextualize data from hundreds of security and business tools to help companies understand and address their riskiest problems. Our robust data platform and strong security modules provide immediate and secure benefits to our customers. If you're passionate about data and helping companies improve their security posture, we'd love to have you join our company as we make the world a more secure place.

We're looking for an experienced Senior Data Engineer to join our AEM team. Reporting to the Manager, Software Engineering, you'll be responsible for:

Designing, developing, and supporting new requirements from Product and from Research
Building, deploying and maintaining data infrastructure and pipelines
Building automated validation processes to ensure data integrity
Leading data investigations and escalations with other teams
Writing clear documentation
Requirements:
What We're Looking for (Minimum Qualifications)

4+ years of experience as a Backend Engineer/Data Engineer, building data pipelines and handling high-scale data loads
Experience in backend development with expertise in one or more programming languages, such as Java, Python, Node.js, C#, or similar technologies
Good knowledge and experience with SQL
Experience developing SaaS products on public cloud infrastructure such as AWS, Azure, or GCP
B.Sc./M.Sc. in Engineering, Computer Science or relevant field, or graduate of a technical unit in the IDF
What Will Make You Stand Out (Preferred Qualifications)

Experience in Spark or equivalent (Hadoop, BigQuery, Elasticsearch etc.)
Cyber security industry experience
Software system testing methodology
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
Required Analytics Engineer
Tel Aviv
Want to shape how data drives product decisions?
As an Analytics Engineer, you'll design the foundations of our data infrastructure, build robust and scalable data models, and empower teams across the organization with actionable insights that fuel our product direction.
As an Analytics Engineer, you will:
Design and implement robust data models to transform raw data into analytics-ready tables, enabling confident decision-making across product and business teams.
Own and maintain our dbt pipelines with a focus on scalability, modularity, and clear documentation (see the sketch after this list).
Continuously evolve our data models to reflect changing business logic and product needs.
Build and maintain comprehensive testing infrastructure to ensure data accuracy and trust.
Monitor the health of our data pipelines, ensuring integrity in event streams and leading resolution of data issues.
Collaborate closely with analysts, data engineers, and product managers to align data architecture with business goals.
Guide the analytics code development process using Git and engineering best practices.
Create dashboards and reports in Tableau that turn insights into action.
Drive performance and cost optimization across our data stack, proactively improving scalability and reliability.
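As an illustrative sketch only of the dbt pipeline ownership described above (not taken from the posting), a thin Python wrapper can run and test a dbt project as a CI step; the project directory and model selector are placeholders.

```python
# Hedged sketch: run and test a dbt project as part of a CI step.
# Project directory, target selector, and commands are illustrative.
import subprocess
import sys

PROJECT_DIR = "analytics/dbt"   # placeholder dbt project path
SELECTOR = "marts.product"      # placeholder model selector


def dbt(*args: str) -> None:
    """Run a dbt CLI command and fail the CI step if it exits non-zero."""
    cmd = ["dbt", *args, "--project-dir", PROJECT_DIR]
    print("running:", " ".join(cmd))
    result = subprocess.run(cmd)
    if result.returncode != 0:
        sys.exit(result.returncode)


if __name__ == "__main__":
    dbt("deps")                        # install package dependencies
    dbt("run", "--select", SELECTOR)   # build the selected models
    dbt("test", "--select", SELECTOR)  # run schema and data tests
```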
Requirements:
You should apply if you are:
A data professional with 4+ years of experience in analytics engineering, BI development, or similar data roles.
Highly skilled in SQL, with hands-on experience using Snowflake or similar cloud data warehouses.
Proficient in dbt for data transformation, modeling, and documentation.
Experienced with Tableau or similar BI tools for data visualization.
Familiar with CI/CD for data workflows, version control systems (e.g., Git), and testing frameworks.
A strong communicator who can collaborate effectively with both technical and non-technical stakeholders.
Holding a B.Sc. in Industrial Engineering, Computer Science, or a related technical field.
Passionate about translating complex data into clear, scalable insights that drive product innovation.
Bonus points if you have:
Experience with event instrumentation and user behavioral data.
Scripting ability in Python for automation and data processing.
Familiarity with modern data stack tools such as Airflow, Fivetran, Looker, or Segment.
This position is open to all candidates.
 