2 days ago
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
As a Data Engineer, you will be instrumental in building and maintaining the data infrastructure that powers our analytics and decision-making processes. Working closely with the broader data team, R&D, and various stakeholders, you will design, implement, and optimize data pipelines and storage solutions, ensuring efficient and reliable data flow across the organization.

Responsibilities:
Design, develop, and maintain scalable data pipelines using tools such as Airflow and DBT.
Manage and optimize our data warehouse in Snowflake, ensuring data integrity and performance.
Collaborate with analytics and business teams to understand data requirements and deliver appropriate solutions.
Implement and maintain data integration processes between various systems and platforms.
Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal disruption.
Stay updated with the latest industry trends and technologies to continually improve our data infrastructure.
Requirements:
3+ years of experience in data engineering or a related field.
Proficiency in SQL and experience with modern lakehouse modeling.
Hands-on experience with data pipeline orchestration tools like Apache Airflow.
Experience with DBT for data transformation and modeling.
Familiarity with data visualization tools such as Tableau.
Strong programming skills in languages such as Python or Java.
Hands-on experience with AWS data solutions (or another major cloud vendor).
Excellent problem-solving skills and attention to detail.
Strong communication skills and the ability to work collaboratively in a team environment.
Relevant academic degree in Computer Science, Engineering, or a related field (or equivalent work experience).

Preferred Qualifications:
Experience in the travel or insurance industries.
Familiarity with Mixpanel or similar analytics platforms.
Knowledge of data security and privacy best practices.
This position is open to all candidates.
 
04/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required: Senior Data Engineer
What You'll Do:
Shape the Future of Data - Join our mission to build the foundational pipelines and tools that power measurement, insights, and decision-making across our product, analytics, and leadership teams.
Develop the Platform Infrastructure - Build the core infrastructure that powers our data ecosystem including the Kafka events-system, DDL management with Terraform, internal data APIs on top of Databricks, and custom admin tools (e.g. Django-based interfaces).
Build Real-time Analytical Applications - Develop internal web applications to provide real-time visibility into platform behavior, operational metrics, and business KPIs integrating data engineering with user-facing insights.
Solve Meaningful Problems with the Right Tools - Tackle complex data challenges using modern technologies such as Spark, Kafka, Databricks, AWS, Airflow, and Python. Think creatively to make the hard things simple.
Own It End-to-End - Design, build, and scale our high-quality data platform by developing reliable and efficient data pipelines. Take ownership from concept to production and long-term maintenance.
Collaborate Cross-Functionally - Partner closely with backend engineers, data analysts, and data scientists to drive initiatives from both a platform and business perspective. Help translate ideas into robust data solutions.
Optimize for Analytics and Action - Design and deliver datasets in the right shape, location, and format to maximize usability and impact - whether that's through lakehouse tables, real-time streams, or analytics-optimized storage.
You will report to the Data Engineering Team Lead and help shape a culture of technical excellence, ownership, and impact.
Requirements:
5+ years of hands-on experience as a Data Engineer, building and operating production-grade data systems.
3+ years of experience with Spark, SQL, Python, and orchestration tools like Airflow (or similar).
Degree in Computer Science, Engineering, or a related quantitative field.
Proven track record in designing and implementing high-scale ETL pipelines and real-time or batch data workflows.
Deep understanding of data lakehouse and warehouse architectures, dimensional modeling, and performance optimization.
Strong analytical thinking, debugging, and problem-solving skills in complex environments.
Familiarity with infrastructure as code, CI/CD pipelines, and building data-oriented microservices or APIs.
Enthusiasm for AI-driven developer tools such as Cursor.AI or GitHub Copilot.
Bonus points:
Hands-on experience with our stack: Databricks, Delta Lake, Kafka, Docker, Airflow, Terraform, and AWS.
Experience in building self-serve data platforms and improving developer experience across the organization.
This position is open to all candidates.
 
16/06/2025
Location: Tel Aviv-Yafo
Job Type: Full Time and Temporary
We are looking for a Data Engineer to join our team and play a key role in designing, building, and maintaining scalable, cloud-based data pipelines. You will work with AWS (Redshift, S3, Glue, Managed Airflow, Lambda) to integrate, process, and analyze large datasets, ensuring data reliability and efficiency.
Your work will directly impact business intelligence, analytics, and data-driven decision-making across the organization.
What You'll Do:
ETL & Data Processing: Develop and maintain ETL processes, integrating data from various sources (APIs, databases, external platforms) using Python, SQL, and cloud technologies.
Cloud & Big Data Technologies: Implement solutions using PySpark, Databricks, Airflow, and cloud platforms (AWS) to process large-scale datasets efficiently.
Data Modeling: Design and maintain logical and physical data models to support business, analytics, and operational needs.
Optimization & Scalability: Improve process efficiency and optimize runtime performance to handle large-scale data workloads.
Collaboration: Work closely with BI analysts and business stakeholders to define data requirements and functional specifications.
Monitoring & Troubleshooting: Ensure data integrity and reliability by proactively monitoring pipelines and resolving issues.
Requirements:
Education & Experience:
BSc in Computer Science, Engineering, or equivalent practical experience.
3+ years of experience in data engineering or related roles.
Technical Expertise:
Proficiency in Python for data engineering and automation.
Experience with Big Data technologies such as Spark, Databricks, DBT, and Airflow.
Hands-on experience with AWS services (S3, Redshift, Glue, Managed Airflow, Lambda)
Knowledge of Docker, Terraform, Kubernetes, and infrastructure automation.
Strong understanding of data warehouse (DWH) methodologies and best practices.
Soft Skills:
Strong problem-solving abilities and a proactive approach to learning new technologies.
Excellent communication and collaboration skills, with the ability to work independently and in a team.
Nice to Have (Big Advantage):
Experience with JavaScript, React, and Node.js.
Familiarity with K8s for infrastructure as code.
Experience with Retool for internal tool development.
This position is open to all candidates.
 
19/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a talented and passionate Data Engineer to join our growing Data team. In this pivotal role, you will be instrumental in designing, building, and optimizing the critical data infrastructure that underpins our innovative creative intelligence platform. You will tackle complex data challenges, ensuring our systems are robust, scalable, and capable of delivering high-quality data to power our advanced AI models, customer-facing analytics, and internal business intelligence. This is an opportunity to make a significant impact on our product, contribute to a data-driven culture, and help solve fascinating problems at the intersection of data, AI, and marketing technology.
Key Responsibilities
Architect & Develop Data Pipelines: Design, implement, and maintain sophisticated, end-to-end data pipelines for ingesting, processing, validating, and transforming large-scale, diverse datasets.
Manage Data Orchestration: Implement and manage robust workflow orchestration for complex, multi-step data processes, ensuring reliability and visibility.
Advanced Data Transformation & Modeling: Develop and optimize complex data transformations using advanced SQL and other data manipulation techniques. Contribute to the design and implementation of effective data models for analytical and operational use.
Ensure Data Quality & Platform Reliability: Establish and improve processes for data quality assurance, monitoring, alerting, and performance optimization across the data platform. Proactively identify and resolve data integrity and pipeline issues.
Cross-Functional Collaboration: Partner closely with AI engineers, product managers, developers, customer success and other stakeholders to understand data needs, integrate data solutions, and deliver features that provide exceptional value.
Drive Data Platform Excellence: Contribute to the evolution of our data architecture, champion best practices in data engineering (e.g., DataOps principles), and evaluate emerging technologies to enhance platform capabilities, stability, and cost-effectiveness.
Foster a Culture of Learning & Impact: Actively share knowledge, contribute to team growth, and maintain a strong focus on how data engineering efforts translate into tangible product and business outcomes.
Requirements:
3+ years of experience as a Data Engineer, building and managing complex data pipelines and data-intensive applications.
Solid understanding and application of software engineering principles and best practices. Proficiency in a relevant programming language (e.g., Python, Scala, Java) is highly desirable.
Deep expertise in writing, optimizing, and troubleshooting complex SQL queries for data transformation, aggregation, and analysis in relational and analytical database environments.
Hands-on experience with distributed data processing systems, cloud-based data platforms, data warehousing concepts, and workflow management tools.
Strong ability to diagnose complex technical issues, identify root causes, and develop effective, scalable solutions.
A genuine enthusiasm for tackling new data challenges, exploring innovative technologies, and continually expanding your skillset.
A keen interest in understanding how data powers product features and drives business value, with a focus on delivering results.
Excellent ability to communicate technical ideas clearly and work effectively within a multi-disciplinary team environment.
Advantages:
Familiarity with the marketing/advertising technology domain and associated datasets.
Experience with data related to creative assets, particularly video or image analysis.
Understanding of MLOps principles or experience supporting machine learning workflows.
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Data Engineer with a passion for analytics to join our growing data team! This role is ideal for someone who enjoys working across the entire data pipeline, from data ingestion and transformation all the way to creating analytics-ready datasets.
You'll get hands-on experience with modern tools, collaborate across functions, and help deliver data-driven insights that shape key decisions.
You'll be part of a supportive team, where mentorship, impact, and learning go hand in hand.
Responsibilities
What You'll Do:
Design, develop and maintain end-to-end data pipelines: extract raw data from sources such as MongoDB, MySQL, Neo4j, and Kafka; transform and load it into our Snowflake data warehouse.
Contribute to data modeling and data quality efforts to ensure reliable, analytics-ready datasets.
Collaborate with analytics, engineering, and business teams to understand data needs and translate requirements into actionable data solutions.
Enable data-driven decisions by building dashboards and reports using tools like dbt and AWS QuickSight.
Learn and grow in both the technical and business-facing sides of data.
Requirements:
1-3 years of experience in a data-related role (data engineering, analytics engineering, BI), or strong projects/coursework if you're just starting out.
Strong experience with SQL and Python for building, manipulating, and analyzing data.
Comfortable with modern data tooling such as Snowflake, dbt, Airflow, or similar.
Enthusiastic about working collaboratively with teammates and stakeholders to deliver business value from data.
Strong communicator and continuous learner, ready to tackle new challenges in a fast-paced environment.
Hands-on experience with cloud platforms such as AWS, GCP, or Azure, and familiarity with services like AWS Glue, Google BigQuery, or Azure Data Factory.
Hands-on experience with ETL/ELT processes, data ingestion, data transformation, data modeling, and monitoring.
Nice to Have:
Experience with AWS or other cloud platforms.
Familiarity with streaming data (Kafka), Infrastructure as Code (Terraform), or Git-based workflows
Knowledge of SaaS analytics, especially for product or customer behavior.
Understanding of PII, data privacy, or compliance standards.
This position is open to all candidates.
 
22/05/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Data Engineer
We're hiring an experienced Data Engineer to join our growing team of analytics experts and help lead the build-out of our data integration and pipeline processes, tools, and platform.
The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.
The right candidate must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products, and will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives.
In this role, you will be responsible for:
Create ELT/Streaming processes and SQL queries to bring data to/from the data warehouse and other data sources.
Establish scalable, efficient, automated processes for large-scale data analyses.
Support the development of performance dashboards & data sets that will generate the right insight.
Work with business owners and partners to build data sets that answer their specific business questions.
Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision-making across the organization.
Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.
Own the data lake pipelines, maintenance, improvements and schema.
Requirements:
BS or MS degree in Computer Science or a related technical field.
3-4 years of Python / Java development experience.
3-4 years of experience as a Data Engineer or in a similar role (BI developer).
3-4 years of direct experience with SQL (NoSQL is a plus), data modeling, data warehousing, and building ELT/ETL pipelines - MUST
Experience working with cloud environments (AWS preferred) and big data technologies (EMR, EC2, S3) - DBT is an advantage.
Experience working with Airflow - big advantage
Experience working with Kubernetes - advantage
Experience working with at least one of the big data environments: Snowflake, Vertica, Hadoop (Impala/Hive), Redshift, etc. - MUST
Experience working with Spark - advantage
Exceptional troubleshooting and problem-solving abilities.
Excellent verbal/written communication & data presentation skills, including experience communicating to both business and technical teams.
This position is open to all candidates.
 
17/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
As a Data Engineer, you will collaborate with various stakeholders to design, develop, and maintain data analytics solutions that enable informed business decisions. You will use the latest tools and technologies to ensure the data is accurate, timely, and actionable.
What will your day look like, and what will you be doing?
Develop data solutions in collaboration with data analysts, data scientists, product engineers, and product managers
Design and develop the infrastructure for ingesting and processing data in the analytics data warehouse
Design data analytics modeling solutions focused on business decision-making
Write robust ETL processes, handling a variety of structured and unstructured data sources
Ensure data quality, consistency, and governance across the analytics layer.
Requirements:
3+ years of experience working as a BI Developer or Data Engineer
Excellent SQL skills
Python proficiency
Experience designing analytics-oriented data models and data warehouses
Experience working with data pipelines and ETL processes (e.g., Airflow, Luigi, AWS Glue, DBT)
Passionate about data with a strong analytical approach
Experience working with cloud environment services (e.g., AWS, GCP, Azure)
Bonus points:
Strong understanding of business metrics and how to translate data into actionable insights
Experience with data visualization tools (e.g., Tableau, Power BI, Looker)
Familiarity with data governance and data quality best practices
Excellent communication skills to work with cross-functional teams including data analysts, data scientists, and product managers.
This position is open to all candidates.
 
16/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Data Engineer to join our team and help advance our Apps solution. Our product is designed to provide detailed and accurate insights into Apps Analytics, such as traffic estimation, revenue analysis, and app characterization. The role involves constructing and maintaining scalable data pipelines, developing and integrating machine learning models, and ensuring data integrity and efficiency. You will work closely with a diverse team of scientists, engineers, analysts, and collaborate with business and product stakeholders.
Key Responsibilities:
Develop and implement complex, innovative big data ML algorithms for new features, working in collaboration with data scientists and analysts.
Optimize and maintain end-to-end data pipelines using big data technologies to ensure efficiency and performance.
Monitor data pipelines to ensure data integrity and promptly troubleshoot any issues that arise.
Requirements:
Bachelor's degree in Computer Science or equivalent practical experience.
At least 3 years of experience in data engineering or related roles.
Experience with big data Machine Learning - a must.
Proficiency in Python - a must; Scala is a plus.
Experience with Big Data technologies including Spark, EMR and Airflow.
Experience with containerization/orchestration platforms such as Docker and Kubernetes.
Familiarity with distributed computing on the cloud (such as AWS or GCP).
Strong problem-solving skills and ability to learn new technologies quickly.
Being goal-driven and efficient.
Excellent communication skills and ability to work independently and in a team.
This position is open to all candidates.
 
20/05/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a talented Data Engineer to join our analytics team in the Big Data Platform group.
You will support our product and business data initiatives, expand our data warehouse, and optimize our data pipeline architecture.
You must be self-directed and comfortable supporting the data needs of multiple systems and products.
The right candidate is excited by the prospect of building the data architecture for the next generation of products and data initiatives.

This is a unique opportunity to join a team full of outstanding people making a big impact.
We work on multiple products in many domains to deliver truly innovative solutions in the Cyber Security and Big Data realm.

This role requires the ability to collaborate closely with both R&D teams and business stakeholders, to understand their needs and translate them into robust and scalable data solutions.

Key Responsibilities
Maintain and develop enterprise-grade Data Warehouse and Data Lake environments
Create data infrastructure for various R&D groups across the organization to support product development and optimization
Work with data experts to assist with technical data-related issues and support infrastructure needs
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for scalability
Build and maintain robust ETL/ELT pipelines for data ingestion, transformation, and delivery across various systems
Requirements:
B.Sc. in Engineering or a related field
3 - 10 years of experience as a Data Engineer working on production systems
Advanced SQL knowledge and experience with relational databases
Proven experience using Python
Hands-on experience building, optimizing, and automating data pipelines, architectures, and data sets
Experience in creating and maintaining ETL/ELT processes
Strong project management and organizational skills
Strong collaboration skills with both technical (R&D) and non-technical (business) teams
Advantage: experience with Azure services such as Storage Accounts, Databricks, EventHub, and Spark
This position is open to all candidates.
 
08/06/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
If you share our love of sports and tech and have the passion and will to better the sports-tech and data industries - join the team. We are looking for a Data & AI Architect.
Responsibilities:
Build the foundations of modern data architecture, supporting real-time, high-scale (Big Data) sports data pipelines and ML/AI use cases, including Generative AI.
Map the company's data needs and lead the selection and implementation of key technologies across the stack: data lakes (e.g., Iceberg), databases, ETL/ELT tools, orchestrators, data quality and observability frameworks, and statistical/ML tools.
Design and build a cloud-native, cost-efficient, and scalable data infrastructure from scratch, capable of supporting rapid growth, high concurrency, and low-latency SLAs (e.g., 1-second delivery).
Lead design reviews and provide architectural guidance for all data solutions, including data engineering, analytics, and ML/data science workflows.
Set high standards for data quality, integrity, and observability. Design and implement processes and tools to monitor and proactively address issues like missing events, data delays, or integrity failures.
Collaborate cross-functionally with other architects, R&D, product, and innovation teams to ensure alignment between infrastructure, product goals, and real-world constraints.
Mentor engineers and promote best practices around data modeling, storage, streaming, and observability.
Stay up-to-date with industry trends, evaluate emerging data technologies, and lead POCs to assess new tools and frameworks, especially in the domains of Big Data architecture, ML infrastructure, and Generative AI platforms.
Requirements:
At least 10 years of experience in a data engineering role, including 2+ years as a data & AI architect with ownership over company-wide architecture decisions.
Proven experience designing and implementing large-scale, Big Data infrastructure from scratch in a cloud-native environment (GCP preferred).
Excellent proficiency in data modeling, including conceptual, logical, and physical modeling for both analytical and real-time use cases.
Strong hands-on experience with:
Data lake and/or warehouse technologies (e.g., Iceberg, Delta Lake, BigQuery, ClickHouse), with Apache Iceberg experience required
ETL/ELT frameworks and orchestrators (e.g., Airflow, dbt, Dagster)
Real-time streaming technologies (e.g., Kafka, Pub/Sub)
Data observability and quality monitoring solutions
Excellent proficiency in SQL, and in either Python or JavaScript.
Experience designing efficient data extraction and ingestion processes from multiple sources and handling large-scale, high-volume datasets.
Demonstrated ability to build and maintain infrastructure optimized for performance, uptime, and cost, with awareness of AI/ML infrastructure requirements.
Experience working with ML pipelines and AI-enabled data workflows, including support for Generative AI initiatives (e.g., content generation, vector search, model training pipelines) or strong motivation to learn and lead in this space.
Excellent communication skills in English, with the ability to clearly document and explain architectural decisions to technical and non-technical audiences.
Fast learner with strong multitasking abilities; capable of managing several cross-functional initiatives simultaneously.
Willingness to work on-site in Ashkelon once a week.
Advantage:
Experience leading POCs and tool selection processes.
Familiarity with Databricks, LLM pipelines, or vector databases is a strong plus.
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: More than one
We're seeking an outstanding and passionate Data Platform Engineer to join our growing R&D team.
You will work in an energetic startup environment following Agile concepts and methodologies. Joining the company at this unique and exciting stage in our growth journey creates an exceptional opportunity to take part in shaping Finaloop's data infrastructure at the forefront of Fintech and AI.
What you'll do:
Design, build, and maintain scalable data pipelines and ETL processes for our financial data platform.
Develop and optimize data infrastructure to support real-time analytics and reporting.
Implement data governance, security, and privacy controls to ensure data quality and compliance.
Create and maintain documentation for data platforms and processes.
Collaborate with data scientists and analysts to deliver actionable insights to our customers.
Troubleshoot and resolve data infrastructure issues efficiently.
Monitor system performance and implement optimizations.
Stay current with emerging technologies and implement innovative solutions.
Tech stack: AWS Serverless, Python, Airflow, Airbyte, Temporal, PostgreSQL, Snowflake, Kubernetes, Terraform, Docker.
Requirements:
3+ years experience in data engineering or platform engineering roles
Strong programming skills in Python and SQL
Experience with orchestration platforms like Airflow/Dagster/Temporal
Experience with MPPs like Snowflake/Redshift/Databricks
Hands-on experience with cloud platforms (AWS) and their data services
Understanding of data modeling, data warehousing, and data lake concepts
Ability to optimize data infrastructure for performance and reliability
Experience working with containerization (Docker) in Kubernetes environments.
Familiarity with CI/CD concepts
Fluent in English, both written and verbal
And it would be great if you have (optional):
Experience with big data processing frameworks (Apache Spark, Hadoop)
Experience with stream processing technologies (Flink, Kafka, Kinesis)
Knowledge of infrastructure as code (Terraform)
Experience building analytics platforms
Experience building clickstream pipelines
Familiarity with machine learning workflows and MLOps
Experience working in a startup environment or fintech industry
This position is open to all candidates.
 