Data Engineer

08/07/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
We are a Data and innovation team operating under the CTO group. We lead cutting-edge initiatives in data technologies and strategic innovation projects.

Our mission is to explore and implement new technologies and enrich the company's internal data assets through smart collection, integration, and automation.

We move fast, work across multiple domains, and maintain a culture that values curiosity, ownership, and impact.

This is an on-site position.



Responsibilities

Take end-to-end ownership of data pipelines: from extraction (web scraping, APIs), through transformation and orchestration, to delivering accessible and valuable datasets.
Integrate new and external data sources into the company's internal platforms.
Solve real-time issues and optimize pipeline performance through smart automation.
Collaborate with cross-functional teams to improve access to high-quality, structured data.
Work on multiple projects simultaneously in a dynamic and agile environment.
Lead and contribute to early-stage innovation projects directly impacting business strategy.
Requirements:
2+ years of hands-on Python development (ETL, scripting, automation).
Strong knowledge of SQL and ability to work independently with relational databases.
Experience building and maintaining ETL workflows and orchestrating data processes.
Familiarity with scraping tools/frameworks (e.g., requests, Selenium, BeautifulSoup).
Ability to manage multiple tasks and projects independently and efficiently.
A genuine love for technology, a curiosity to explore new tools, and an eagerness to learn.
This position is open to all candidates.
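The posting above centers on pipelines that start with extraction via web scraping and APIs. As a minimal sketch of that extraction step, here is a scraper built on Python's standard-library `html.parser` (standing in for the BeautifulSoup mentioned in the requirements); the HTML snippet, tag, and class names are hypothetical:

```python
from html.parser import HTMLParser

# Minimal extraction step of a scraping pipeline, using the stdlib
# HTMLParser in place of BeautifulSoup. The snippet and the
# "job-title" class are hypothetical examples.
class JobTitleScraper(HTMLParser):
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        # Start collecting text inside <h2 class="job-title"> elements.
        if tag == "h2" and ("class", "job-title") in attrs:
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title and data.strip():
            self.titles.append(data.strip())

html = """
<div><h2 class="job-title">Data Engineer</h2>
<h2 class="job-title">BI Developer</h2></div>
"""
scraper = JobTitleScraper()
scraper.feed(html)
print(scraper.titles)  # ['Data Engineer', 'BI Developer']
```

In a real pipeline this parsed output would feed the transformation and orchestration stages the posting describes.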
 
Job ID: 8249956
08/07/2025
Location: Hadera and Haifa
Job Type: Full Time
Job Description:

* Conduct investigations of leading defense systems

* Work is possible from the Krayot area or the Hadera area

Requirements:

* Bachelor's degree in an engineering field - required

* Master's degree - an advantage

* Familiarity with airborne systems - required

* At least 3 years of experience in system testing, or experience as a Data Analyst / Data Engineer / Data Scientist

This position is open to both women and men.
 
Job ID: 8248515
06/07/2025
Location: Ramat Gan
Job Type: Full Time and Hybrid work
You will specialize in designing and building world-class, scalable data architectures, ensuring reliable data flow and integration for groundbreaking biotechnological research. Your expertise in big data tools and pipelines will accelerate our ability to derive actionable insights from complex datasets, driving innovations that improve patient outcomes and deliver life-saving treatment solutions.

In this role, you will work closely with data scientists, analysts, and other cross-functional teams to understand their data needs and requirements. You will also be responsible for ensuring that data is easily accessible and can be used to support data-driven decision making.

Location: Ramat Gan, Hybrid Model

What will you do?
Design, build, and maintain data pipelines to extract, transform, and load data from various sources, including databases, APIs, and flat files.
Enhance our data warehouse system to dynamically support multiple analytics use cases.
Reimplement and productize scientific computational methods.
Implement data governance policies and procedures to ensure data quality, security, and privacy.
Collaborate with data scientists and other cross-functional teams to understand their data needs and requirements.
Develop and maintain documentation for data pipelines, processes, and systems.
Requirements:
We will only consider data engineers with strong coding skills with an extensive background in data orchestration, data warehousing and ETL tools.

Required qualifications:
Bachelor's or Master's degree in a related field (e.g. computer science, data science, engineering, computational biology).
At least 5 years of experience with programming languages, specifically Python.
At least 3 years of experience as a Data Engineer, ideally across multiple data ecosystems.
Proficiency in SQL and experience with database technologies (e.g. MySQL, PostgreSQL, Oracle).
Familiarity with data storage technologies (e.g. HDFS, NoSQL databases).
Experience with ETL tools (e.g. Apache Beam, Apache Spark).
Experience with orchestration tools (e.g. Apache Airflow, Dagster).
Experience with data warehousing technologies (ideally BigQuery).
Experience working with large and complex data sets.
Experience working in a cloud environment.
Strong problem-solving and communication skills.
Familiarity with biotech or healthcare data - an advantage.

Desired personal traits:
You want to make an impact on humankind.
You prioritize We over I.
You enjoy getting things done and striving for excellence.
You collaborate effectively with people of diverse backgrounds and cultures.
You constantly challenge your own assumptions, pushing for continuous improvement.
You have a growth mindset.
You make decisions that favor the company, not yourself or your team.
You are candid, authentic, and transparent.
This position is open to all candidates.
 
Job ID: 8246216
03/07/2025
Location: Herzliya
Job Type: Full Time
We are looking for a skilled data engineer to join a cutting-edge project supporting a military technology unit. This role offers a unique opportunity to work on innovative cloud-based data infrastructure within a secure and challenging environment.

Responsibilities:
Data Infrastructure Development: Design, build, and manage advanced data infrastructure in Google Cloud Platform (GCP) to support the operational and analytical needs of the unit's clients.
Data Pipeline Engineering: Develop robust and scalable data pipelines in GCP, perform integrations with diverse data sources, and enable AI/ML workflows using tools such as Dataflow, Pub/Sub, and BigQuery.
ETL & DWH Automation: Improve, optimize, and automate ETL processes and data warehouse operations in the cloud.
Cross-Organizational Tech Support: Participate in high-impact, cross-organizational technology projects in a dynamic and innovative technical environment.
Requirements:
At least 2 years of experience as a Data Engineer.
Familiarity with cloud platforms (Google Cloud Platform, AWS, or Azure) and the ability to implement solutions across diverse environments.
Experience with CI/CD tools such as Jenkins, GitLab CI/CD, Azure DevOps, Cloud Build, or GitHub Actions.
Strong knowledge of SQL, including writing complex queries and performance optimization.
Proficiency in Python, including experience with data processing libraries such as Pandas.
Hands-on experience in building ETL processes and integrating various data sources.
Familiarity with Infrastructure as Code (IaC) tools such as Terraform.
Experience in designing and implementing Data Warehouse infrastructure.
Ability to understand and implement data workflows on cloud platforms.
This position is open to all candidates.
 
Job ID: 8243031
01/07/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
The leading global financial news & data platform is on the lookout for a Senior Data Engineer to join our growing data team!
In this role, you will design and develop scalable data solutions, optimize data workflows, and support critical business processes. You will work with a variety of databases and big data tools in a cloud environment, focusing on data modeling, governance, and analytics.

What you'll be doing:
Design, develop, and maintain end-to-end ETL pipelines, from gathering business requirements to implementation.
Work with multiple database technologies, especially BigQuery.
Optimize data models (DWH, fact & dimension tables, RI, SCDs) for performance and scalability.
Implement data governance best practices and maintain comprehensive documentation.
Utilize Big Data tools in cloud environments (GCP preferred).
Develop and support complex business workflows and data processes.
Design and implement monitoring systems to ensure data quality throughout the pipeline.
Workflow orchestration using Apache Airflow.
Collaborate with analysts and stakeholders to ensure high-quality data for business insights.
Support and optimize Tableau infrastructure for data visualization.
Requirements:
4+ years of experience in Data Engineering.
Strong SQL skills and expertise in BigQuery or similar databases.
4+ years of Python experience for data processing and automation.
Proven experience in designing complex business workflows and data processes.
Deep understanding of data modeling principles and best practices.
Hands-on experience with cloud-based big data tools (GCP preferred).
Must have experience with Apache Airflow for orchestrating data workflows.
Strong analytical skills with the ability to translate business needs into technical solutions.
Experience with Tableau infrastructure management is an advantage.
Excellent communication skills and ability to work cross-functionally.
Nice to have:
Familiarity with streaming data frameworks (Kafka, Pub/Sub).
If you are passionate about data, scalability, and building efficient solutions, we'd love to hear from you! Apply now and be part of our data-driven journey.
This position is open to all candidates.
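This posting's modeling requirements name slowly changing dimensions (SCDs) alongside fact and dimension tables. As a hedged sketch of the core Type 2 SCD pattern, here is an upsert using `sqlite3`; the table, column names, and dates are hypothetical, and a warehouse like the BigQuery mentioned above would typically express this with a MERGE statement instead:

```python
import sqlite3

# Sketch of a Type 2 slowly changing dimension update: when an attribute
# changes, close the current row and insert a new versioned row.
# Table and column names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,      -- NULL means no end date yet
        is_current  INTEGER
    )
""")

def scd2_upsert(conn, customer_id, city, as_of):
    row = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,),
    ).fetchone()
    if row is not None and row[0] == city:
        return  # no change, keep the current version
    if row is not None:
        # Close out the existing current version.
        conn.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (as_of, customer_id),
        )
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
        (customer_id, city, as_of),
    )

scd2_upsert(conn, 1, "Tel Aviv", "2025-01-01")
scd2_upsert(conn, 1, "Haifa", "2025-06-01")   # city changed -> new version
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from"
).fetchall()
print(rows)  # [('Tel Aviv', 0), ('Haifa', 1)]
```

The history-preserving shape (old row closed, new row current) is what lets downstream fact tables join to the dimension as it looked at any point in time.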
 
Job ID: 8239782
01/07/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
We are making the future of Mobility come to life starting today.
At our company we support the world's largest vehicle fleet operators and transportation providers to optimize existing operations and seamlessly launch new, dynamic business models, driving efficient operations and maximizing utilization.
At the heart of our platform lies the data infrastructure, driving advanced machine learning models and optimization algorithms. As the owner of data pipelines, you'll tackle diverse challenges spanning optimization, prediction, modeling, inference, transportation, and mapping.
As a Senior Data Engineer, you will play a key role in owning and scaling the backend data infrastructure that powers our platform, supporting real-time optimization, advanced analytics, and machine learning applications.
What You'll Do
Design, implement, and maintain robust, scalable data pipelines for batch and real-time processing using Spark and other modern tools.
Own the backend data infrastructure, including ingestion, transformation, validation, and orchestration of large-scale datasets.
Leverage Google Cloud Platform (GCP) services to architect and operate scalable, secure, and cost-effective data solutions across the pipeline lifecycle.
Develop and optimize ETL/ELT workflows across multiple environments to support internal applications, analytics, and machine learning workflows.
Build and maintain data marts and data models with a focus on performance, data quality, and long-term maintainability.
Collaborate with cross-functional teams including development teams, product managers, and external stakeholders to understand and translate data requirements into scalable solutions.
Help drive architectural decisions around distributed data processing, pipeline reliability, and scalability.
Requirements:
4+ years in backend data engineering or infrastructure-focused software development.
Proficient in Python, with experience building production-grade data services.
Solid understanding of SQL.
Proven track record designing and operating scalable, low-latency data pipelines (batch and streaming).
Experience building and maintaining data platforms, including lakes, pipelines, and developer tooling.
Familiar with orchestration tools like Airflow, and modern CI/CD practices.
Comfortable working in cloud-native environments (AWS, GCP), including containerization (e.g., Docker, Kubernetes).
Bonus: Experience working with GCP
Bonus: Experience with data quality monitoring and alerting
Bonus: Strong hands-on experience with Spark for distributed data processing at scale.
Degree in Computer Science, Engineering, or related field.
This position is open to all candidates.
 
Job ID: 8238970
30/06/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
We are seeking a skilled and motivated Data Engineer with expertise in Elasticsearch, cloud technologies, and Kafka. As a data engineer, you will be responsible for designing, building and maintaining scalable and efficient data pipelines that will support our organization's data processing needs.
The role will entail:
Design and develop data platforms based on Elasticsearch, Databricks, and Kafka
Build and maintain data pipelines that are efficient, reliable and scalable
Collaborate with cross-functional teams to identify data requirements and design solutions that meet those requirements
Write efficient and optimized code that can handle large volumes of data
Implement data quality checks to ensure accuracy and completeness of the data
Troubleshoot and resolve data pipeline issues in a timely manner.
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field
3+ years of experience in data engineering
Expertise in Elasticsearch, cloud technologies (such as AWS, Azure, or GCP), Kafka and Databricks
Proficiency in programming languages such as Python, Java, or Scala
Experience with distributed systems, data warehousing and ETL processes
Experience with container environments such as AKS/EKS/OpenShift is a plus
A high security clearance is a plus.
This position is open to all candidates.
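Among the responsibilities above is implementing "data quality checks to ensure accuracy and completeness of the data". As a minimal sketch of that idea in plain Python, here is a batch validator; the field names and rules are hypothetical, and in practice this logic would run as a pipeline step or via a dedicated tool:

```python
# Sketch of simple completeness/accuracy checks applied to a batch of
# records before loading. Field names and rules are hypothetical.
REQUIRED_FIELDS = ("id", "timestamp", "amount")

def check_record(record):
    """Return a list of rule violations for one record (empty = clean)."""
    errors = []
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            errors.append(f"missing {field}")
    amount = record.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        errors.append("negative amount")
    return errors

def split_batch(records):
    """Split a batch into (valid records, rejected records with reasons)."""
    valid, rejected = [], []
    for rec in records:
        errs = check_record(rec)
        if errs:
            rejected.append((rec, errs))
        else:
            valid.append(rec)
    return valid, rejected

batch = [
    {"id": 1, "timestamp": "2025-06-30T12:00:00Z", "amount": 9.5},
    {"id": 2, "timestamp": "", "amount": -3},
]
valid, rejected = split_batch(batch)
print(len(valid), len(rejected))  # 1 1
```

Routing rejects to a quarantine table with their violation reasons, rather than dropping them, keeps the pipeline debuggable.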
 
Job ID: 8237124
29/06/2025
Location: Airport City
Job Type: Full Time
Design, implement, and maintain the complete data flow, from extraction to visualization using external tools
Work within a product team where solutions are collaboratively proposed. You will be expected to translate requirements into technical designs and implement them as part of large-scale data engineering solutions
Utilize various Machine Learning models to explore data, collaborating with cutting-edge big data tools and techniques
Collaborate with various stakeholders to plan, design, develop, test and maintain extraordinary features
Collaborate with a team of skilled developers to develop high-traffic, cloud-based applications
Requirements:
At least 5 years of experience in Big-Data technologies as a data engineer, including ETL/ELT processes and data exploration
At least 3 years of experience in building end-to-end data pipelines using Spark, Databricks or similar tools is required
At least 3 years of experience in Python programming and SQL queries
Experience as a backend developer- advantage
Team player committed to the success of the team
Strong verbal and written communication skills with the ability to clearly explain technical concepts
Excellent debugging, investigating and problem-solving abilities
Curious learner who loves sharing knowledge and best practices, and can work both independently and in a team
Ability to thrive in a fast-paced, ambiguous, and changing environment
This position is open to all candidates.
 
Job ID: 8235678
29/06/2025
Location: Petah Tikva
Job Type: Full Time
We are seeking an experienced Senior Data Engineer.

The ideal candidate is a self-motivated, multi-tasking team player with a proven track record of collaboration.

You will be responsible for designing, developing, managing, and maintaining our open-source data platform, including our Data Lakehouse (S3, Delta Lake, and ClickHouse), ETL processes, and orchestration tool (Temporal Workflow).

What You Will Do
Develop a scalable data platform that integrates multiple sources for easy access.
Design and enhance data tools, including orchestration, governance, Data Lakehouse, BI, and more.
Ensure the smooth operation of data systems for analysts, data scientists, and engineers.
Optimize data pipelines (ingestion, processing, and output) within a microservices environment.
Requirements:
5+ years of experience in a data engineering-related position
SQL expertise, including working with various databases, data warehouses, third-party data sources, and AWS cloud services
Proficient in Python (OOP + Processing packages like Polars)
Experience in building, designing, and optimizing data pipelines
Self-driven, can-do attitude
Nice to Have:

Experience with Spark
Experience with Open Table Format (Delta Lake / Iceberg)
Experience with ClickHouse
Experience with Temporal Workflow
Familiarity with ERP systems (big advantage)
Familiarity with supply chain systems (advantage)
Passion for open-source tools
This position is open to all candidates.
 
Job ID: 8235037
29/06/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
We seek a Director of Data to join us and lead our data group.
As our Director of Data, you will be a key member of our R&D leadership team. You will be responsible for developing and executing a data strategy that aligns with our business goals, overseeing data management, analytics, and validation, and ensuring data integrity at every stage of product development and production.
A day in the life and how you'll make an impact:
Define and execute a strategic data roadmap aligned with business objectives, fostering a data-driven culture and leading a high-performing team of data engineers, scientists, and analysts.
Establish robust data validation frameworks, ensuring product integrity and accuracy through all stages, from data acquisition to end-user delivery.
Build and optimize scalable data infrastructure and pipelines to support our data needs and ensure data security, compliance, and accessibility.
Collaborate with product and engineering teams to create and launch data-driven products, ensuring they are built on reliable data and designed to meet customer needs.
Guide the team in generating actionable insights to drive business decisions and product innovation in areas such as personalization, marketing, and customer success.
Implement data governance policies and maintain compliance with industry regulations and best practices.
Requirements:
10+ years of experience in data-related roles, with at least 5 years in a leadership position (ideally within a tech or AI-driven startup environment).
M.Sc. or PhD in Data Science/Computer Science/Engineering/Statistics, or a related field.
Extensive experience with cloud platforms (AWS, GCP, or Azure) and modern data warehouses (Snowflake, BigQuery, or Redshift).
Proficiency in data technologies, such as SQL, Python, R, Looker and big data tools (e.g., Hadoop, Spark).
Proven experience in leveraging data for product development, business intelligence, and operational optimization.
Strong track record of building and managing cross-functional data teams and influencing across all levels of an organization.
Excellent communication skills, with the ability to convey complex data insights in an accessible manner to non-technical stakeholders.
This position is open to all candidates.
 
Job ID: 8234801
29/06/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for an experienced Senior Data Engineer to join our Data team.
In this role, you will lead and strengthen our Data Team, drive innovation, and ensure the robustness of our data and analytics platforms.
A day in the life and how you'll make an impact:
Design and develop high-performance data pipelines and ETL processes to support diverse business needs.
Work closely with business intelligence, sales, and other teams to integrate data solutions, ensuring seamless alignment and collaboration across functions.
Continuously improve our data analytics platforms, optimizing system performance while ensuring a robust and reliable data infrastructure.
Oversee the entire data lifecycle, from infrastructure setup and data acquisition to detailed analysis and automated reporting, driving business growth through data-driven insights.
Implement robust data quality checks, monitoring mechanisms, and data governance policies to maintain data integrity and security, troubleshooting and resolving any data-related issues efficiently.
Requirements:
B.Sc. in computer science/information systems engineering
5+ years of experience in data engineering (Preferably from a startup company)
Familiarity with data engineering tech stack, including ETL tools (Airflow, Spark, Flink, Kafka, Pubsub).
Strong SQL expertise, working with various databases (relational and NoSQL) such as MySQL, FireStore, Redis, and ElasticSearch.
Experience with cloud-based data warehouse solutions like BigQuery, Snowflake, and Oracle, and proficiency in working with public clouds (AWS/GCP).
Coding experience with Python
Experience with dashboard tools.
Ability to communicate ideas and analyze results effectively, both verbally and in writing.
This position is open to all candidates.
 
Job ID: 8234728
26/06/2025
Location: Tel Aviv-Yafo
We are looking for an experienced Data Engineer to join our team. In this role, you will play a critical part in developing a Secure Data Collaboration Platform. This position extends beyond traditional data engineering, incorporating key elements of backend development, including designing and implementing APIs for seamless interactions between teams.
You will work closely with data scientists, supporting their workflows by building scalable infrastructure, optimizing data pipelines, and ensuring efficient access to diverse data sources. A strong understanding of data science principles and computational frameworks is essential to succeed in this role.
Responsibilities:
Planning and developing infrastructure for the Data Science team (Computational Framework)
Building and optimizing distributed data pipelines
Developing and maintaining connectivity to various data sources (Relational DBs and others)
Designing and setting up APIs for cross-team integrations
Maintaining and improving the Computational Framework CI process
Delivering high-quality, well-tested code with excellent automated test coverage
Requirements:
Vast experience (5+ years) in backend processes (data pipelines and workflows)
Previous experience with Python, databases, and Linux, as well as with distributed computing frameworks such as Spark or Dask, is an advantage.
Analytical and problem-solving skills
Experience with backend test automation tools (not necessarily UI based)
B.Sc. in Computer Science / Software Engineering or equivalent
Good communication skills. Fluent in English verbal and written
Desired Skills
Basic understanding of Data Science principles (statistics, ML, etc.)
Experience with Data Science tools/frameworks/platforms
Familiarity with CI/CD methodologies
This position is open to all candidates.
 
Job ID: 8233248
26/06/2025
Location: Tel Aviv-Yafo
Job Type: Full Time
We're seeking a seasoned Data Engineer to build the data infrastructure that fuels our groundbreaking intelligent agent. You'll play a crucial role in developing large-scale data-intensive systems that power Apollo's capabilities.
**What You'll Do:**
- Design and implement massive parallel processing solutions for both real-time and batch scenarios
- Develop real-time stream processing solutions using technologies like Apache Kafka or Amazon Kinesis
- Build infrastructures that bring machine learning capabilities to production
- Orchestrate containerized applications in cloud environments (AWS and GCP)
- Write production-grade Python code and work with various database systems
- Administer and design cloud-based data warehousing solutions
- Work with unstructured data, complex data sets, and perform data modeling
- Collaborate with cross-functional teams to integrate data solutions into our AI systems
Requirements:
- 3+ years of experience building massive parallel processing solutions (e.g., Spark, Presto)
- 2+ years of experience developing real-time stream processing solutions (e.g., Apache Kafka, Amazon Kinesis)
- 2+ years of experience developing ML infrastructures for production (e.g., Kubeflow, Sagemaker, Vertex)
- Experience orchestrating containerized applications in AWS and GCP using EKS and GKE
- 3+ years of experience writing production-grade Python code
- Experience working with both relational and non-relational databases
- 2+ years of experience administering and designing cloud-based data warehousing solutions (e.g., Snowflake, Amazon Redshift)
- 2+ years of experience working with unstructured data, complex data sets, and data modeling
This position is open to all candidates.
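The real-time requirements above center on stream processing with Apache Kafka or Amazon Kinesis. As a hedged, dependency-free sketch of the core pattern those systems implement, here is a tumbling-window aggregation over an in-memory event list standing in for a topic or shard; the event fields and window size are hypothetical:

```python
from collections import defaultdict

# Sketch of a tumbling-window aggregation, the basic pattern behind
# stream processors built on Kafka or Kinesis. The in-memory event list
# stands in for a real topic/shard; event fields are hypothetical.
WINDOW_SECONDS = 60

def tumbling_window_counts(events):
    """Count events per (window_start, key) over a stream of
    (epoch_seconds, key) tuples."""
    counts = defaultdict(int)
    for ts, key in events:
        # Align each event's timestamp to the start of its window.
        window_start = ts - (ts % WINDOW_SECONDS)
        counts[(window_start, key)] += 1
    return dict(counts)

events = [
    (0, "login"), (30, "login"), (59, "click"),
    (60, "login"), (125, "click"),
]
print(tumbling_window_counts(events))
# {(0, 'login'): 2, (0, 'click'): 1, (60, 'login'): 1, (120, 'click'): 1}
```

A production deployment would additionally handle out-of-order events, watermarks, and state checkpointing, which is what the managed frameworks provide.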
 
Job ID: 8233196
Location: Tel Aviv-Yafo
Job Type: More than one
We're seeking an outstanding and passionate Data Platform Engineer to join our growing R&D team.
You will work in an energetic startup environment following Agile concepts and methodologies. Joining the company at this unique and exciting stage in our growth journey creates an exceptional opportunity to take part in shaping Finaloop's data infrastructure at the forefront of Fintech and AI.
What you'll do:
Design, build, and maintain scalable data pipelines and ETL processes for our financial data platform.
Develop and optimize data infrastructure to support real-time analytics and reporting.
Implement data governance, security, and privacy controls to ensure data quality and compliance.
Create and maintain documentation for data platforms and processes
Collaborate with data scientists and analysts to deliver actionable insights to our customers.
Troubleshoot and resolve data infrastructure issues efficiently
Monitor system performance and implement optimizations
Stay current with emerging technologies and implement innovative solutions
Tech stack: AWS Serverless, Python, Airflow, Airbyte, Temporal, PostgreSQL, Snowflake, Kubernetes, Terraform, Docker.
Requirements:
3+ years experience in data engineering or platform engineering roles
Strong programming skills in Python and SQL
Experience with orchestration platforms like Airflow/Dagster/Temporal
Experience with MPPs like Snowflake/Redshift/Databricks
Hands-on experience with cloud platforms (AWS) and their data services
Understanding of data modeling, data warehousing, and data lake concepts
Ability to optimize data infrastructure for performance and reliability
Experience working with containerization (Docker) in Kubernetes environments.
Familiarity with CI/CD concepts
Fluent in English, both written and verbal
And it would be great if you have (optional):
Experience with big data processing frameworks (Apache Spark, Hadoop)
Experience with stream processing technologies (Flink, Kafka, Kinesis)
Knowledge of infrastructure as code (Terraform)
Experience building analytics platforms
Experience building clickstream pipelines
Familiarity with machine learning workflows and MLOps
Experience working in a startup environment or fintech industry
This position is open to all candidates.
 
Job ID: 8232260
26/06/2025
Location: Central Israel
Job Type: Full Time
A company is recruiting a Data Engineer for a leading financial institution in central Israel!
Requirements:
- 6+ years of Python development experience
- 4+ years of experience with Git/GitLab and Jenkins
- 3+ years of REST API development, including FastAPI
- 2+ years of experience working with containers, Kubernetes, OpenShift
- 3+ years of SQL coding and query optimization in MS-SQL/MySQL/Postgres, etc.
- 2+ years of experience working with NoSQL databases such as HBase, MongoDB, or Cassandra
- 2+ years of experience with Hadoop ecosystem components: HDFS, Hive, Spark
This position is open to both women and men.
 
Job ID: 8231574