Jobs » Software » Data Platform Engineer

Confidential company
Location: Tel Aviv-Yafo
Job Type: More than one
We're seeking an outstanding and passionate Data Platform Engineer to join our growing R&D team.
You will work in an energetic startup environment following Agile concepts and methodologies. Joining the company at this unique and exciting stage in our growth journey creates an exceptional opportunity to take part in shaping Finaloop's data infrastructure at the forefront of Fintech and AI.
What you'll do:
Design, build, and maintain scalable data pipelines and ETL processes for our financial data platform.
Develop and optimize data infrastructure to support real-time analytics and reporting.
Implement data governance, security, and privacy controls to ensure data quality and compliance.
Create and maintain documentation for data platforms and processes.
Collaborate with data scientists and analysts to deliver actionable insights to our customers.
Troubleshoot and resolve data infrastructure issues efficiently.
Monitor system performance and implement optimizations.
Stay current with emerging technologies and implement innovative solutions.
Tech stack: AWS Serverless, Python, Airflow, Airbyte, Temporal, PostgreSQL, Snowflake, Kubernetes, Terraform, Docker.
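The pipeline work described above can be sketched in miniature as an idempotent ETL step, where replaying a batch never duplicates rows. This is a simplified illustration only; all names (functions, fields, the in-memory "warehouse") are hypothetical, not taken from the actual platform.

```python
# Minimal sketch of an idempotent ETL step: re-running the same batch
# must not duplicate rows. All names are illustrative.

def transform(raw_rows):
    """Normalize raw payment records into a flat schema."""
    return [
        {"id": r["id"],
         "amount_cents": round(r["amount"] * 100),
         "currency": r.get("currency", "USD")}
        for r in raw_rows
    ]

def upsert(target, rows, key="id"):
    """Merge rows into `target` (a dict keyed by `key`), last write wins."""
    for row in rows:
        target[row[key]] = row
    return target

warehouse = {}
batch = [{"id": 1, "amount": 19.99}, {"id": 2, "amount": 5.00, "currency": "EUR"}]
upsert(warehouse, transform(batch))
upsert(warehouse, transform(batch))  # replaying the batch is a no-op, not a duplicate
```

In a real pipeline the dict would be a `MERGE` into PostgreSQL or Snowflake, but the upsert-on-key discipline is the same.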
Requirements:
3+ years experience in data engineering or platform engineering roles
Strong programming skills in Python and SQL
Experience with orchestration platforms like Airflow/Dagster/Temporal
Experience with MPPs like Snowflake/Redshift/Databricks
Hands-on experience with cloud platforms (AWS) and their data services
Understanding of data modeling, data warehousing, and data lake concepts
Ability to optimize data infrastructure for performance and reliability
Experience working with containerization (Docker) in Kubernetes environments.
Familiarity with CI/CD concepts
Fluent in English, both written and verbal
And it would be great if you have (optional):
Experience with big data processing frameworks (Apache Spark, Hadoop)
Experience with stream processing technologies (Flink, Kafka, Kinesis)
Knowledge of infrastructure as code (Terraform)
Experience building analytics platforms
Experience building clickstream pipelines
Familiarity with machine learning workflows and MLOps
Experience working in a startup environment or fintech industry
This position is open to all candidates.
 
Listing ID: 8232260
19/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a talented and passionate Data Engineer to join our growing Data team. In this pivotal role, you will be instrumental in designing, building, and optimizing the critical data infrastructure that underpins our innovative creative intelligence platform. You will tackle complex data challenges, ensuring our systems are robust, scalable, and capable of delivering high-quality data to power our advanced AI models, customer-facing analytics, and internal business intelligence. This is an opportunity to make a significant impact on our product, contribute to a data-driven culture, and help solve fascinating problems at the intersection of data, AI, and marketing technology.
Key Responsibilities
Architect & Develop Data Pipelines: Design, implement, and maintain sophisticated, end-to-end data pipelines for ingesting, processing, validating, and transforming large-scale, diverse datasets.
Manage Data Orchestration: Implement and manage robust workflow orchestration for complex, multi-step data processes, ensuring reliability and visibility.
Advanced Data Transformation & Modeling: Develop and optimize complex data transformations using advanced SQL and other data manipulation techniques. Contribute to the design and implementation of effective data models for analytical and operational use.
Ensure Data Quality & Platform Reliability: Establish and improve processes for data quality assurance, monitoring, alerting, and performance optimization across the data platform. Proactively identify and resolve data integrity and pipeline issues.
Cross-Functional Collaboration: Partner closely with AI engineers, product managers, developers, customer success and other stakeholders to understand data needs, integrate data solutions, and deliver features that provide exceptional value.
Drive Data Platform Excellence: Contribute to the evolution of our data architecture, champion best practices in data engineering (e.g., DataOps principles), and evaluate emerging technologies to enhance platform capabilities, stability, and cost-effectiveness.
Foster a Culture of Learning & Impact: Actively share knowledge, contribute to team growth, and maintain a strong focus on how data engineering efforts translate into tangible product and business outcomes.
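The "Ensure Data Quality & Platform Reliability" responsibility often starts with a validation gate that runs before a batch is published downstream. A toy version follows; the column names and checks are illustrative assumptions, not the team's actual rules.

```python
# Toy data-quality gate: validate a batch before publishing it.
# Required columns and thresholds are illustrative assumptions.

REQUIRED = {"campaign_id", "impressions", "spend"}

def quality_issues(rows):
    """Return a list of human-readable issues found in `rows`."""
    issues = []
    for i, row in enumerate(rows):
        missing = REQUIRED - row.keys()
        if missing:
            issues.append(f"row {i}: missing columns {sorted(missing)}")
            continue  # the remaining checks assume the columns exist
        if row["impressions"] < 0:
            issues.append(f"row {i}: negative impressions")
        if row["spend"] < 0:
            issues.append(f"row {i}: negative spend")
    return issues

good = {"campaign_id": "c1", "impressions": 100, "spend": 12.5}
bad = {"campaign_id": "c2", "impressions": -3}   # no spend column
```

A pipeline would typically fail or quarantine the batch when the issue list is non-empty, and export the counts to monitoring.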
Requirements:
3+ years of experience as a Data Engineer, building and managing complex data pipelines and data-intensive applications.
Solid understanding and application of software engineering principles and best practices. Proficiency in a relevant programming language (e.g., Python, Scala, Java) is highly desirable.
Deep expertise in writing, optimizing, and troubleshooting complex SQL queries for data transformation, aggregation, and analysis in relational and analytical database environments.
Hands-on experience with distributed data processing systems, cloud-based data platforms, data warehousing concepts, and workflow management tools.
Strong ability to diagnose complex technical issues, identify root causes, and develop effective, scalable solutions.
A genuine enthusiasm for tackling new data challenges, exploring innovative technologies, and continually expanding your skillset.
A keen interest in understanding how data powers product features and drives business value, with a focus on delivering results.
Excellent ability to communicate technical ideas clearly and work effectively within a multi-disciplinary team environment.
Advantages:
Familiarity with the marketing/advertising technology domain and associated datasets.
Experience with data related to creative assets, particularly video or image analysis.
Understanding of MLOps principles or experience supporting machine learning workflows.
This position is open to all candidates.
 
Listing ID: 8223467
Location: Tel Aviv-Yafo
Job Type: Full Time
As part of the Data Infrastructure group, you'll help build our company's data platform for our growing stack of products, customers, and microservices.
We ingest data from our operational DBs, telematics devices, and more, working with several data types (both structured and unstructured). Our challenge is to provide tools and infrastructure that empower other teams, leveraging data-mesh concepts.
In this role you'll:
Help build our companys data platform, designing and implementing data solutions for all application requirements in a distributed microservices environment
Build data-platform ingestion layers using streaming ETLs and Change Data Capture
Implement pipelines and scheduling infrastructures
Ensure compliance, data-quality monitoring, and data governance on our companys data platform
Implement large-scale batch and streaming pipelines with data processing frameworks
Collaborate with other Data Engineers, Developers, BI Engineers, ML Engineers, Data Scientists, Analysts and Product managers
Share knowledge with other team members and promote engineering standards.
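The Change Data Capture ingestion mentioned above reduces, at its core, to replaying a stream of row-level change events against a target table. A stdlib-only sketch; the event shape is a simplified assumption (real CDC tools such as Debezium carry much richer metadata).

```python
# Change Data Capture in miniature: replay insert/update/delete
# events against a target table. Event shape is a simplified assumption.

def apply_cdc(table, events):
    """Apply change events to `table` (a dict keyed by primary key)."""
    for ev in events:
        op, pk = ev["op"], ev["pk"]
        if op in ("insert", "update"):
            table[pk] = ev["row"]
        elif op == "delete":
            table.pop(pk, None)
    return table

table = {}
apply_cdc(table, [
    {"op": "insert", "pk": 1, "row": {"status": "new"}},
    {"op": "update", "pk": 1, "row": {"status": "active"}},
    {"op": "insert", "pk": 2, "row": {"status": "new"}},
    {"op": "delete", "pk": 2},
])
```

The key property is that replaying the log in order reconstructs the source table's current state, which is what makes streaming ETLs from operational DBs possible.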
Requirements:
5+ years of prior experience as a data engineer or data infra engineer
B.S. in Computer Science or equivalent field of study
Knowledge of databases (SQL, NoSQL)
Proven success in building large-scale data infrastructures such as Change Data Capture, and leveraging open source solutions such as Airflow & DBT, building large-scale streaming pipelines, and building customer data platforms
Experience with Python, Pulumi/Terraform, Apache Spark, Snowflake, AWS, K8s, Kafka
Ability to work in an office environment a minimum of 3 days a week
Enthusiasm about learning and adapting to the exciting world of AI; a commitment to exploring this field is a fundamental part of our culture.
This position is open to all candidates.
 
Listing ID: 8206372
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced and passionate Staff Data Engineer to join our Data Platform group in TLV as a Tech Lead. As the Group's Tech Lead, you'll shape and implement the technical vision and architecture while staying hands-on across three specialized teams: Data Engineering Infra, Machine Learning Platform, and Data Warehouse Engineering, forming the backbone of our company's data ecosystem.
The groups mission is to build a state-of-the-art Data Platform that drives our company toward becoming the most precise and efficient insurance company on the planet. By embracing Data Mesh principles, we create tools that empower teams to own their data while leveraging a robust, self-serve data infrastructure. This approach enables Data Scientists, Analysts, Backend Engineers, and other stakeholders to seamlessly access, analyze, and innovate with reliable, well-modeled, and queryable data, at scale.
In this role you'll:
Technically lead the group by shaping the architecture, guiding design decisions, and ensuring the technical excellence of the Data Platforms three teams
Design and implement data solutions that address both applicative needs and data analysis requirements, creating scalable and efficient access to actionable insights
Drive initiatives in Data Engineering Infra, including building robust ingestion layers, managing streaming ETLs, and guaranteeing data quality, compliance, and platform performance
Develop and maintain the Data Warehouse, integrating data from various sources for optimized querying, analysis, and persistence, supporting informed decision-making
Leverage data modeling and transformations to structure, cleanse, and integrate data, enabling efficient retrieval and strategic insights
Build and enhance the Machine Learning Platform, delivering infrastructure and tools that streamline the work of Data Scientists, enabling them to focus on developing models while benefiting from automation for production deployment, maintenance, and improvements. Support cutting-edge use cases like feature stores, real-time models, point-in-time (PIT) data retrieval, and telematics-based solutions
Collaborate closely with other Staff Engineers across our company to align on cross-organizational initiatives and technical strategies
Work seamlessly with Data Engineers, Data Scientists, Analysts, Backend Engineers, and Product Managers to deliver impactful solutions
Share knowledge, mentor team members, and champion engineering standards and technical excellence across the organization.
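Of the ML Platform use cases listed above, point-in-time (PIT) retrieval is the most self-contained to illustrate: when building training data, a feature lookup must return the value that was current at the event's timestamp, never a later one, to avoid leakage. A stdlib-only sketch with made-up timestamps and a hypothetical feature name; a real feature store backs this with a database.

```python
# Point-in-time (PIT) feature retrieval: return the value that was
# current at time `ts`, never a later one. Data is illustrative.
import bisect

def value_as_of(history, ts):
    """history: list of (timestamp, value) sorted by timestamp.
    Return the value in effect at or before `ts`, or None."""
    times = [t for t, _ in history]
    i = bisect.bisect_right(times, ts)
    return history[i - 1][1] if i else None

# Hypothetical telematics-derived feature, keyed by event time.
driver_score = [(100, 0.2), (200, 0.5), (300, 0.9)]
```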
Requirements:
8+ years of experience in data-related roles such as Data Engineer, Data Infrastructure Engineer, BI Engineer, or Machine Learning Platform Engineer, with significant experience in at least two of these areas
A B.Sc. in Computer Science or a related technical field (or equivalent experience)
Extensive expertise in designing and implementing Data Lakes and Data Warehouses, including strong skills in data modeling and building scalable storage solutions
Proven experience in building large-scale data infrastructures, including both batch processing and streaming pipelines
A deep understanding of Machine Learning infrastructure, including tools and frameworks that enable Data Scientists to efficiently develop, deploy, and maintain models in production, an advantage
Proficiency in Python, Pulumi/Terraform, Apache Spark, AWS, Kubernetes (K8s), and Kafka for building scalable, reliable, and high-performing data solutions
Strong knowledge of databases, including SQL (schema design, query optimization) and NoSQL, with a solid understanding of their use cases
Ability to work in an office environment a minimum of 3 days a week
Enthusiasm about learning and adapting to the exciting world of AI; a commitment to exploring this field is a fundamental part of our culture.
This position is open to all candidates.
 
Listing ID: 8206357
04/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Senior Data Engineer
What You'll Do:
Shape the Future of Data - Join our mission to build the foundational pipelines and tools that power measurement, insights, and decision-making across our product, analytics, and leadership teams.
Develop the Platform Infrastructure - Build the core infrastructure that powers our data ecosystem including the Kafka events-system, DDL management with Terraform, internal data APIs on top of Databricks, and custom admin tools (e.g. Django-based interfaces).
Build Real-time Analytical Applications - Develop internal web applications to provide real-time visibility into platform behavior, operational metrics, and business KPIs integrating data engineering with user-facing insights.
Solve Meaningful Problems with the Right Tools - Tackle complex data challenges using modern technologies such as Spark, Kafka, Databricks, AWS, Airflow, and Python. Think creatively to make the hard things simple.
Own It End-to-End - Design, build, and scale our high-quality data platform by developing reliable and efficient data pipelines. Take ownership from concept to production and long-term maintenance.
Collaborate Cross-Functionally - Partner closely with backend engineers, data analysts, and data scientists to drive initiatives from both a platform and business perspective. Help translate ideas into robust data solutions.
Optimize for Analytics and Action - Design and deliver datasets in the right shape, location, and format to maximize usability and impact, whether that's through lakehouse tables, real-time streams, or analytics-optimized storage.
You will report to the Data Engineering Team Lead and help shape a culture of technical excellence, ownership, and impact.
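The "real-time visibility into business KPIs" bullet is, at its simplest, incremental aggregation over an event stream. A minimal in-memory version follows; the event fields (`event_type`, `amount`) are illustrative assumptions, not the actual schema.

```python
# Incremental KPI aggregation over an event stream.
# Event fields are illustrative assumptions.
from collections import defaultdict

class KpiAggregator:
    def __init__(self):
        self.counts = defaultdict(int)    # events per type
        self.revenue = defaultdict(float) # summed amounts per type

    def ingest(self, event):
        key = event["event_type"]
        self.counts[key] += 1
        self.revenue[key] += event.get("amount", 0.0)

agg = KpiAggregator()
for ev in [{"event_type": "purchase", "amount": 30.0},
           {"event_type": "purchase", "amount": 12.0},
           {"event_type": "page_view"}]:
    agg.ingest(ev)
```

In production the ingest loop would consume from Kafka and the aggregates would land in analytics-optimized storage, but the per-event update pattern is the same.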
Requirements:
5+ years of hands-on experience as a Data Engineer, building and operating production-grade data systems.
3+ years of experience with Spark, SQL, Python, and orchestration tools like Airflow (or similar).
Degree in Computer Science, Engineering, or a related quantitative field.
Proven track record in designing and implementing high-scale ETL pipelines and real-time or batch data workflows.
Deep understanding of data lakehouse and warehouse architectures, dimensional modeling, and performance optimization.
Strong analytical thinking, debugging, and problem-solving skills in complex environments.
Familiarity with infrastructure as code, CI/CD pipelines, and building data-oriented microservices or APIs.
Enthusiasm for AI-driven developer tools such as Cursor.AI or GitHub Copilot.
Bonus points:
Hands-on experience with our stack: Databricks, Delta Lake, Kafka, Docker, Airflow, Terraform, and AWS.
Experience in building self-serve data platforms and improving developer experience across the organization.
This position is open to all candidates.
 
Listing ID: 8204175
17/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Data Infra Tech Lead
A day in the life and how you'll make an impact:
We're seeking an experienced and skilled Data Infra Tech Lead to join our Data Infrastructure team and drive the company's data capabilities at scale.
As the company is fast growing, the mission of the data infrastructure team is to ensure the company can manage data at scale efficiently and seamlessly through robust and reliable data infrastructure. As a tech lead, you are required to independently lead the design, development, and optimization of our data infrastructure, collaborating closely with software engineers, data scientists, data engineers, and other key stakeholders. You are expected to own critical initiatives, influence architectural decisions, and mentor engineers to foster a high-performing team.
You will:
Lead the design and development of scalable, reliable, and secure data storage, processing, and access systems.
Define and drive best practices for CI/CD processes, ensuring seamless deployment and automation of data services.
Oversee and optimize our machine learning platform for training, releasing, serving, and monitoring models in production.
Own and develop the company-wide LLM infrastructure, enabling teams to efficiently build and deploy projects leveraging LLM capabilities.
Own the company's feature store, ensuring high-quality, reusable, and consistent features for ML and analytics use cases.
Architect and implement real-time event processing and data enrichment solutions, empowering teams with high-quality, real-time insights.
Partner with cross-functional teams to integrate data and machine learning models into products and services.
Ensure that our data systems are compliant with the data governance requirements of our customers and industry best practices.
Mentor and guide engineers, fostering a culture of innovation, knowledge sharing, and continuous improvement.
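The "real-time event processing and data enrichment" responsibility boils down to joining incoming events against reference data as they arrive. A stdlib-only sketch; the lookup table and event fields are illustrative assumptions.

```python
# Real-time enrichment in miniature: attach reference attributes
# to events as they arrive. All names and fields are illustrative.

USERS = {42: {"plan": "enterprise", "region": "EU"}}

def enrich(event, users=USERS):
    """Attach user attributes to an event; unknown users get defaults."""
    user = users.get(event["user_id"], {"plan": "unknown", "region": "unknown"})
    return {**event, **user}

e = enrich({"user_id": 42, "action": "login"})
```

At scale the lookup side lives in a low-latency store (or a stream-stream join), but the enrich-on-arrival shape carries over directly.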
Requirements:
7+ years of experience in data infra or backend engineering.
Strong knowledge of data services architecture, and ML Ops.
Experience with cloud-based data infrastructure such as AWS, GCP, or Azure.
Deep experience with SQL and NoSQL databases.
Experience with Data Warehouse technologies such as Snowflake and Databricks.
Proficiency in backend programming languages like Python, NodeJS, or an equivalent.
Proven leadership experience, including mentoring engineers and driving technical initiatives.
Strong communication, collaboration, and stakeholder management skills.
Bonus Points:
Experience leading teams working with serverless technologies like AWS Lambda.
Hands-on experience with TypeScript in backend environments.
Familiarity with Large Language Models (LLMs) and AI infrastructure.
Experience building infrastructure for Data Science and Machine Learning.
Experience collaborating with BI developers and analysts to drive business value.
Expertise in administering and managing Databricks clusters.
Experience with streaming technologies such as Amazon Kinesis and Apache Kafka.
This position is open to all candidates.
 
Listing ID: 8220200
08/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
If you share our love of sports and tech, you've got the passion and will to better the sports-tech and data industries - join the team. We are looking for a Data & AI Architect.
Responsibilities:
Build the foundations of modern data architecture, supporting real-time, high-scale (Big Data) sports data pipelines and ML/AI use cases, including Generative AI.
Map the company's data needs and lead the selection and implementation of key technologies across the stack: data lakes (e.g., Iceberg), databases, ETL/ELT tools, orchestrators, data quality and observability frameworks, and statistical/ML tools.
Design and build a cloud-native, cost-efficient, and scalable data infrastructure from scratch, capable of supporting rapid growth, high concurrency, and low-latency SLAs (e.g., 1-second delivery).
Lead design reviews and provide architectural guidance for all data solutions, including data engineering, analytics, and ML/data science workflows.
Set high standards for data quality, integrity, and observability. Design and implement processes and tools to monitor and proactively address issues like missing events, data delays, or integrity failures.
Collaborate cross-functionally with other architects, R&D, product, and innovation teams to ensure alignment between infrastructure, product goals, and real-world constraints.
Mentor engineers and promote best practices around data modeling, storage, streaming, and observability.
Stay up-to-date with industry trends, evaluate emerging data technologies, and lead POCs to assess new tools and frameworks, especially in the domains of Big Data architecture, ML infrastructure, and Generative AI platforms.
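The observability requirement above (catching missing events, data delays, or integrity failures) usually reduces to two checks: freshness against an SLA and gaps in a supposedly gapless sequence. A stdlib-only sketch; the SLA threshold and sequence-number scheme are assumptions for illustration.

```python
# Two basic pipeline-health checks: freshness vs. an SLA, and
# gap detection in a sequence. Thresholds are illustrative.

def is_stale(last_event_ts, now_ts, sla_seconds=60):
    """True if no event arrived within the SLA window."""
    return now_ts - last_event_ts > sla_seconds

def missing_ids(seen_ids):
    """Sequence numbers absent from a supposedly gapless stream."""
    if not seen_ids:
        return []
    full = set(range(min(seen_ids), max(seen_ids) + 1))
    return sorted(full - set(seen_ids))
```

With a 1-second delivery SLA as in this posting, the freshness check would run on a much tighter window and feed an alerting system rather than a return value.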
Requirements:
At least 10 years of experience in a data engineering role, including 2+ years as a data & AI architect with ownership over company-wide architecture decisions.
Proven experience designing and implementing large-scale, Big Data infrastructure from scratch in a cloud-native environment (GCP preferred).
Excellent proficiency in data modeling, including conceptual, logical, and physical modeling for both analytical and real-time use cases.
Strong hands-on experience with:
Data lake and/or warehouse technologies (e.g., Iceberg, Delta Lake, BigQuery, ClickHouse), with hands-on Apache Iceberg experience required
ETL/ELT frameworks and orchestrators (e.g., Airflow, dbt, Dagster)
Real-time streaming technologies (e.g., Kafka, Pub/Sub)
Data observability and quality monitoring solutions
Excellent proficiency in SQL, and in either Python or JavaScript.
Experience designing efficient data extraction and ingestion processes from multiple sources and handling large-scale, high-volume datasets.
Demonstrated ability to build and maintain infrastructure optimized for performance, uptime, and cost, with awareness of AI/ML infrastructure requirements.
Experience working with ML pipelines and AI-enabled data workflows, including support for Generative AI initiatives (e.g., content generation, vector search, model training pipelines) or strong motivation to learn and lead in this space.
Excellent communication skills in English, with the ability to clearly document and explain architectural decisions to technical and non-technical audiences.
Fast learner with strong multitasking abilities; capable of managing several cross-functional initiatives simultaneously.
Willingness to work on-site in Ashkelon once a week.
Advantage:
Experience leading POCs and tool selection processes.
Familiarity with Databricks, LLM pipelines, or vector databases is a strong plus.
This position is open to all candidates.
 
Listing ID: 8208147
16/06/2025
Location: Tel Aviv-Yafo
Job Type: Full Time and Temporary
We are looking for a Data Engineer to join our team and play a key role in designing, building, and maintaining scalable, cloud-based data pipelines. You will work with AWS (Redshift, S3, Glue, Managed Airflow, Lambda) to integrate, process, and analyze large datasets, ensuring data reliability and efficiency.
Your work will directly impact business intelligence, analytics, and data-driven decision-making across the company.
What You'll Do:
ETL & Data Processing: Develop and maintain ETL processes, integrating data from various sources (APIs, databases, external platforms) using Python, SQL, and cloud technologies.
Cloud & Big Data Technologies: Implement solutions using PySpark, Databricks, Airflow, and cloud platforms (AWS) to process large-scale datasets efficiently.
Data Modeling: Design and maintain logical and physical data models to support business, analytics, and operational needs.
Optimization & Scalability: Improve process efficiency and optimize runtime performance to handle large-scale data workloads.
Collaboration: Work closely with BI analysts and business stakeholders to define data requirements and functional specifications.
Monitoring & Troubleshooting: Ensure data integrity and reliability by proactively monitoring pipelines and resolving issues.
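The ETL work described above is typically incremental: each run picks up only rows newer than a stored watermark. A stdlib-only sketch; the state dict stands in for wherever the watermark is persisted (the function and field names are hypothetical).

```python
# Incremental load with a watermark: each run processes only rows
# newer than the last successful run. Names are hypothetical.

def incremental_load(source_rows, state):
    """Return rows newer than state['watermark'] and advance it."""
    wm = state.get("watermark", 0)
    fresh = [r for r in source_rows if r["updated_at"] > wm]
    if fresh:
        state["watermark"] = max(r["updated_at"] for r in fresh)
    return fresh

rows = [{"id": 1, "updated_at": 10}, {"id": 2, "updated_at": 20}]
state = {}
first = incremental_load(rows, state)    # picks up both rows
second = incremental_load(rows, state)   # nothing new to process
```

In an Airflow deployment the watermark usually lives in the warehouse itself or in task state, so a failed run can safely re-read from the last committed value.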
Requirements:
Education & Experience:
BSc in Computer Science, Engineering, or equivalent practical experience.
3+ years of experience in data engineering or related roles.
Technical Expertise:
Proficiency in Python for data engineering and automation.
Experience with Big Data technologies such as Spark, Databricks, DBT, and Airflow.
Hands-on experience with AWS services (S3, Redshift, Glue, Managed Airflow, Lambda)
Knowledge of Docker, Terraform, Kubernetes, and infrastructure automation.
Strong understanding of data warehouse (DWH) methodologies and best practices.
Soft Skills:
Strong problem-solving abilities and a proactive approach to learning new technologies.
Excellent communication and collaboration skills, with the ability to work independently and in a team.
Nice to Have (Big Advantage):
Experience with JavaScript, React, and Node.js.
Familiarity with Kubernetes (K8s).
Experience with Retool for internal tool development.
This position is open to all candidates.
 
Listing ID: 8219367
11/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking an experienced and innovative Data Engineer to join our dynamic team! As a leading FinTech startup based in Tel Aviv, we are dedicated to providing comprehensive financial and all-around solutions for small businesses. The ideal candidate will possess a strong background in data engineering, data warehousing, and data modeling. This role demands a talented professional who can design, develop, and maintain our data infrastructure, empowering stakeholders across the company to make data-driven decisions. If you are a talented, humble, and ambitious individual ready to make a significant impact in a rapidly scaling company, we invite you to join us on our journey to revolutionize services for small businesses worldwide.
About the Opportunity:
As a Data Engineer, you will play a pivotal role in establishing and enhancing data infrastructure, empowering stakeholders across the company to make informed, data-driven decisions.

What you'll be doing:
Engage with potential customers via phone, email, and online communication tools to follow up on inquiries and leads.
Build and maintain relationships with prospects, understanding their needs and offering tailored solutions.
Develop and deliver sales pitches that highlight the benefits of our services.
Guide customers through the sales process, from the initial conversation to closing the deal.
Assist in onboarding new customers, ensuring they have a smooth and positive experience with our platform.
Maintain accurate records of interactions, sales progress, and follow-up tasks using our CRM system.
Continuously refine your product knowledge to better assist prospects and customers.
Requirements:
Bachelor's degree in Computer Science, Engineering, or equivalent practical experience
3+ years of experience as a Data Engineer
Proficiency in data modeling, ETL/ELT development, and data warehouse methodologies
Strong SQL expertise and experience working with databases such as Postgres, MySQL and MongoDB
Experience with Python for data manipulation and feature preparation
Experience with data pipeline tools like Apache Airflow or similar
Experience with cloud platforms, preferably AWS

Preferred Qualifications:
Experience with machine learning and data science workflows
Experience with big data technologies like Hadoop, Spark, or similar
Experience with MLOps practices and tools
This position is open to all candidates.
 
Listing ID: 8214444
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Data Engineer with a passion for analytics to join our growing data team! This role is ideal for someone who enjoys working across the entire data pipeline.
From data ingestion and transformation, all the way to creating analytics-ready datasets.
Youll get hands-on experience with modern tools, collaborate across functions, and help deliver data-driven insights that shape key decisions.
Youll be part of a supportive team, where mentorship, impact, and learning go hand in hand.
Responsibilities
What You'll Do:
Design, develop and maintain end-to-end data pipelines: extract raw data from sources such as MongoDB, MySQL, Neo4j, and Kafka; transform and load it into our Snowflake data warehouse.
Contribute to data modeling and data quality efforts to ensure reliable, analytics-ready datasets.
Collaborate with analytics, engineering, and business teams to understand data needs and translate requirements into actionable data solutions.
Enable data-driven decisions by building dashboards and reports using tools like dbt and AWS QuickSight.
Learn and grow in both the technical and business-facing sides of data.
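Extracting from document stores such as MongoDB into an analytics-ready warehouse table usually means flattening nested structures first. A tiny recursive version follows; the separator convention and field names are illustrative, not from the actual pipeline.

```python
# Flatten nested documents (MongoDB-style) into flat rows suitable
# for a warehouse table. Separator and fields are illustrative.

def flatten(doc, parent="", sep="_"):
    """Recursively flatten nested dicts into a single-level dict."""
    out = {}
    for k, v in doc.items():
        key = f"{parent}{sep}{k}" if parent else k
        if isinstance(v, dict):
            out.update(flatten(v, key, sep))
        else:
            out[key] = v
    return out

row = flatten({"user": {"id": 7, "geo": {"country": "IL"}}, "amount": 3})
```

In a Snowflake-backed pipeline this step often happens in SQL over VARIANT columns instead, but the column-naming convention is the same idea.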
Requirements:
1-3 years of experience in a data-related role (data engineering, analytics engineering, BI), or strong projects/coursework if you're just starting out.
Strong experience with SQL and Python for building, manipulating, and analyzing data
Comfortable with modern data tooling such as Snowflake, dbt, Airflow, or similar
Enthusiastic about working collaboratively with teammates and stakeholders to deliver business value from data
Strong communicator and continuous learner, ready to tackle new challenges in a fast-paced environment
Hands-on experience with cloud platforms such as AWS, GCP, or Azure, and familiarity with services like AWS Glue, Google BigQuery, or Azure Data Factory.
Hands-on experience with ETL/ELT processes, data ingestion, data transformation, data modeling, and monitoring.
Nice to Have:
Experience with AWS or other cloud platforms.
Familiarity with streaming data (Kafka), Infrastructure as Code (Terraform), or Git-based workflows
Knowledge of SaaS analytics, especially for product or customer behavior.
Understanding of PII, data privacy, or compliance standards.
This position is open to all candidates.
 
Listing ID: 8228707
25/05/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
About the Role:
Appdome is building a new Data Department, and we're looking for a skilled Data Engineer to help shape our data infrastructure. If you thrive in fast-paced environments, take ownership, and enjoy working on scalable data solutions, this role is for you. You'll have the opportunity to grow, influence key decisions, and collaborate with security experts and product teams.
What You'll Do:
* Design, build, and maintain scalable data pipelines, ETL processes, and data infrastructure.
* Optimize data storage and retrieval for structured and unstructured data.
* Integrate data solutions into Appdome's products in collaboration with software engineers, security experts, and data scientists.
* Apply DevOps best practices (CI/CD, infrastructure as code, observability) for efficient data processing.
* Work with AWS (EC2, Athena, RDS) and ElasticSearch for data indexing and retrieval.
* Optimize and maintain SQL and NoSQL databases.
* Utilize Docker and Kubernetes for containerization and orchestration.
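The ElasticSearch indexing-and-retrieval work above rests on an inverted index: a map from token to the documents containing it. A toy, stdlib-only version (the "analysis" here is just lowercase word splitting; real analyzers do far more):

```python
# Toy inverted index: token -> set of document ids. Analysis is
# simplified to lowercase word splitting for illustration.
from collections import defaultdict

def build_index(docs):
    """docs: {doc_id: text}. Returns the inverted index."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for token in text.lower().split():
            index[token].add(doc_id)
    return index

def search(index, query):
    """Docs containing every query token (AND semantics)."""
    sets = [index.get(tok, set()) for tok in query.lower().split()]
    return set.intersection(*sets) if sets else set()

idx = build_index({1: "mobile app security", 2: "app store release"})
```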
Requirements:
* B.Sc. in Computer Science, data Engineering, or a related field.
* 3+ years of hands-on experience in large-scale data infrastructures.
* Strong Python programming, with expertise in PySpark and Pandas.
* Deep knowledge of SQL and NoSQL databases, including performance optimization.
* Experience with ElasticSearch and AWS cloud services.
* Solid understanding of DevOps practices, Big Data tools, Git, and Jenkins.
* Familiarity with microservices and event-driven design.
* Strong problem-solving skills and a proactive, independent mindset.
Advantages:
* Experience with LangChain, ClickHouse, DynamoDB, Redis, and Apache Kafka.
* Knowledge of Metabase for data visualization.
* Experience with RESTful APIs and Node.js.
Talent We Are Looking For:
* Independent & Self-Driven - Comfortable building from the ground up.
* Growth-Oriented - Eager to develop professionally and take on leadership roles.
* Innovative - Passionate about solving complex data challenges.
* Collaborative - Strong communicator who works well with cross-functional teams.
* Adaptable - Thrives in a fast-paced, dynamic environment with a can-do attitude.
About the Company:
Appdome's mission is to protect every mobile app worldwide and its users. We provide mobile brands with the only patented, centralized, data-driven Mobile Cyber Defense Automation platform. Our platform delivers rapid no-code, no-SDK mobile app security, anti-fraud, anti-malware, anti-cheat, and anti-bot implementations, configuration-as-code ease, Threat-Events threat-aware UI/UX control, ThreatScope Mobile XDR, and Certified Secure DevSecOps Certification in one integrated system. With Appdome, mobile developers, cyber, and fraud teams can accelerate delivery, guarantee compliance, and leverage automation to build, test, release, and monitor the full range of cyber, anti-fraud, and other defenses needed in mobile apps from within mobile DevOps and CI/CD pipelines. Leading financial, healthcare, m-commerce, consumer, and B2B brands use Appdome to upgrade mobile DevSecOps and protect Android & iOS apps, mobile customers, and businesses globally. Today, Appdome's customers use our platform to secure over 50,000 mobile apps, with protection for over 1 billion mobile end users projected.
Appdome is an Equal Opportunity Employer. We are committed to diversity, equity, and inclusion in our workplace. We do not discriminate based on race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, veteran status, or any other characteristic protected by law. All qualified applicants will receive consideration for employment without regard to any of these characteristics.
This position is open to all candidates.
 
Listing ID: 8118270