Jobs » Software » Data Engineer

Posted 10 hours ago
Location: Netanya and Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We're hiring a Data Engineer
We're looking for a Data Engineer with solid experience building and maintaining large-scale data pipelines in the cloud.
You'll be joining a collaborative R&D team working on cutting-edge SaaS products, with full ownership over data workflows, architecture, and performance.
Requirements:
What we're looking for:
2-3 years of experience in data engineering roles
Hands-on experience with Azure cloud services, especially Data Factory, Data Lake, and Databricks
Solid knowledge of Databricks, including Spark-based processing, notebook orchestration, and ML pipeline integration
Proficient in Python and SQL for data manipulation and automation
Experience with orchestration tools like Airflow
Familiarity with Docker, CI/CD, and working in cross-functional development teams
You'll work closely with data scientists, engineers, and product teams to build data-driven features
This position is open to all candidates.
 
28/08/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
As we continue to expand and evolve, we're looking for a Data Engineer to join our growing team and help shape the future of trust and security in the decentralized world.
We have a large amount of varied, exciting, and unique data on our hands, and we're already squeezing value out of it for our customers, but there's so much more in there. Your goal will be to help construct the data in ways that enable both our business users and research group to dig deeper. As our very first dedicated data engineer, you'll have a huge impact, but you'll also need the independence and proactiveness to own it.
What You'll Do:
Design and build complex data pipelines to ingest, process, and transform data from a variety of sources, especially logs and textual inputs.
Collaborate closely with Software Engineering and Product teams to ensure data is accessible and usable.
Develop efficient ETL processes using frameworks such as DBT, Airflow, or their equivalents.
Own and optimize your data environment (e.g., Snowflake), focusing on performance tuning, governance, and reliability.
Build dashboards for various company-wide use cases.
Implement best practices for data management, quality assurance, and security within cloud infrastructures (AWS).
Enable ML and analytics teams by building pipelines that feed feature stores and model training workflows.
Requirements:
4+ years of hands-on Data Engineering experience in a cybersecurity or security-adjacent environment.
Proficiency in Python and SQL, with proven experience handling large or unstructured data.
Familiarity with data warehouse technologies (Snowflake, BigQuery).
Experience with big data infrastructure (Snowflake/Databricks), orchestration tools (Airflow), and cloud platforms (AWS).
Solid understanding of data governance, quality assurance, and pipeline observability.
Ability to deliver end-to-end solutions, from ingestion to production-ready datasets, with minimal supervision.
Nice to Have:
Experience in cybersecurity, threat intelligence, or blockchain data processing.
Experience orchestrating large-scale ETLs in Snowflake.
Experience using DBT in production.
Knowledge of OLTP databases (e.g., PostgreSQL).
This position is open to all candidates.
 
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for an experienced and passionate Staff Data Engineer to join our Data Platform group in TLV as a Tech Lead. As the group's Tech Lead, you'll shape and implement the technical vision and architecture while staying hands-on across three specialized teams: Data Engineering Infra, Machine Learning Platform, and Data Warehouse Engineering, forming the backbone of our data ecosystem.

The group's mission is to build a state-of-the-art Data Platform that drives us toward becoming the most precise and efficient insurance company on the planet. By embracing Data Mesh principles, we create tools that empower teams to own their data while leveraging a robust, self-serve data infrastructure. This approach enables Data Scientists, Analysts, Backend Engineers, and other stakeholders to seamlessly access, analyze, and innovate with reliable, well-modeled, and queryable data at scale.

In this role youll:

Technically lead the group by shaping the architecture, guiding design decisions, and ensuring the technical excellence of the Data Platform's three teams.

Design and implement data solutions that address both applicative needs and data analysis requirements, creating scalable and efficient access to actionable insights.

Drive initiatives in Data Engineering Infra, including building robust ingestion layers, managing streaming ETLs, and guaranteeing data quality, compliance, and platform performance.

Develop and maintain the Data Warehouse, integrating data from various sources for optimized querying, analysis, and persistence, supporting informed decision-making. Leverage data modeling and transformations to structure, cleanse, and integrate data, enabling efficient retrieval and strategic insights.

Build and enhance the Machine Learning Platform, delivering infrastructure and tools that streamline the work of Data Scientists, enabling them to focus on developing models while benefiting from automation for production deployment, maintenance, and improvements. Support cutting-edge use cases like feature stores, real-time models, point-in-time (PIT) data retrieval, and telematics-based solutions.

Collaborate closely with other Staff Engineers across the company to align on cross-organizational initiatives and technical strategies.

Work seamlessly with Data Engineers, Data Scientists, Analysts, Backend Engineers, and Product Managers to deliver impactful solutions.

Share knowledge, mentor team members, and champion engineering standards and technical excellence across the organization.
Requirements:
8+ years of experience in data-related roles such as Data Engineer, Data Infrastructure Engineer, BI Engineer, or Machine Learning Platform Engineer, with significant experience in at least two of these areas.

A B.Sc. in Computer Science or a related technical field (or equivalent experience).

Extensive expertise in designing and implementing Data Lakes and Data Warehouses, including strong skills in data modeling and building scalable storage solutions.

Proven experience in building large-scale data infrastructures, including both batch processing and streaming pipelines.

A deep understanding of Machine Learning infrastructure, including tools and frameworks that enable Data Scientists to efficiently develop, deploy, and maintain models in production, an advantage.

Proficiency in Python, Pulumi/Terraform, Apache Spark, AWS, Kubernetes (K8s), and Kafka for building scalable, reliable, and high-performing data solutions.

Strong knowledge of databases, including SQL (schema design, query optimization) and NoSQL, with a solid understanding of their use cases.

Ability to work in an office environment a minimum of 3 days a week.

Enthusiasm about learning and adapting to the exciting world of AI; a commitment to exploring this field is a fundamental part of our culture.
This position is open to all candidates.
 
11/09/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Engineer.
What You'll Do:

Shape the Future of Data - Join our mission to build the foundational pipelines and tools that power measurement, insights, and decision-making across our product, analytics, and leadership teams.
Develop the Platform Infrastructure - Build the core infrastructure that powers our data ecosystem including the Kafka events-system, DDL management with Terraform, internal data APIs on top of Databricks, and custom admin tools (e.g. Django-based interfaces).
Build Real-time Analytical Applications - Develop internal web applications to provide real-time visibility into platform behavior, operational metrics, and business KPIs, integrating data engineering with user-facing insights.
Solve Meaningful Problems with the Right Tools - Tackle complex data challenges using modern technologies such as Spark, Kafka, Databricks, AWS, Airflow, and Python. Think creatively to make the hard things simple.
Own It End-to-End - Design, build, and scale our high-quality data platform by developing reliable and efficient data pipelines. Take ownership from concept to production and long-term maintenance.
Collaborate Cross-Functionally - Partner closely with backend engineers, data analysts, and data scientists to drive initiatives from both a platform and business perspective. Help translate ideas into robust data solutions.
Optimize for Analytics and Action - Design and deliver datasets in the right shape, location, and format to maximize usability and impact - whether that's through lakehouse tables, real-time streams, or analytics-optimized storage.
You will report to the Data Engineering Team Lead and help shape a culture of technical excellence, ownership, and impact.
Requirements:
5+ years of hands-on experience as a Data Engineer, building and operating production-grade data systems.
3+ years of experience with Spark, SQL, Python, and orchestration tools like Airflow (or similar).
Degree in Computer Science, Engineering, or a related quantitative field.
Proven track record in designing and implementing high-scale ETL pipelines and real-time or batch data workflows.
Deep understanding of data lakehouse and warehouse architectures, dimensional modeling, and performance optimization.
Strong analytical thinking, debugging, and problem-solving skills in complex environments.
Familiarity with infrastructure as code, CI/CD pipelines, and building data-oriented microservices or APIs.
Enthusiasm for AI-driven developer tools such as Cursor.AI or GitHub Copilot.
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and English Speakers
We are growing and are looking for a Senior Data Infra Engineer
who values personal and career growth, teamwork, and winning!
What your day will look like:
Design, plan, and build all aspects of the platform's data, machine learning (ML) pipelines, and infrastructure.
Build and optimize an AWS-based Data Lake using best practices in cloud architecture, data partitioning, metadata management, and security to support enterprise-scale data operations.
Collaborate with engineers, data analysts, data scientists, and other stakeholders to understand data needs.
Solve challenging data integration problems, utilizing optimal ETL/ELT patterns, frameworks, query techniques, and sourcing from structured and unstructured data sources.
Lead end-to-end data projects from infrastructure design to production monitoring.
Requirements:
Have 5+ years of hands-on experience in designing and maintaining big data pipelines across on-premises or hybrid cloud environments, with proficiency in both SQL and NoSQL databases within a SaaS framework.
Proficient in one or more programming languages: Python, Scala, Java, or Go.
Experienced with software engineering best practices and automation, including testing, code reviews, design documentation, and CI/CD.
Experienced in building and designing ML/AI-driven production infrastructures and pipelines.
Experienced in developing data pipelines and maintaining data lakes on AWS - big advantage.
Familiar with technologies such as Kafka, Snowflake, MongoDB, Airflow, Docker, Kubernetes (K8S), and Terraform - advantage.
Bachelor's degree in Computer Science or equivalent experience.
Strong communication skills, fluent in English, both written and verbal.
A great team player with a can-do approach.
This position is open to all candidates.
 
29/08/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We're looking for a Data Engineer to join our data warehouse team and play a pivotal role in driving data solutions that empower data science, GTM, finance, analytics, and R&D teams.

If you're passionate about exploring and exposing product and business data to stakeholders across the company and beyond, we'd love to hear from you.



What you'll do:

Lead the design and development of scalable and efficient data warehouse and BI solutions that align with organizational goals and requirements.
Utilize advanced data modeling techniques to create robust data structures supporting reporting and analytics needs.
Implement ETL/ELT processes to assist in the extraction, transformation, and loading of data from various sources into a shared data warehouse.
Identify and address performance bottlenecks within our data warehouse, optimize queries and processes, and enhance data retrieval efficiency.
Collaborate with cross-functional teams (product, finance, analytics, and R&D) to deliver actionable data solutions tailored to their needs.
Requirements:
5+ years of experience in a Data Engineering or BI development role
Expertise in building scalable pipelines and ETL/ELT processes, with proven experience with data modeling, including dimensional modeling and SCD handling
Expert-level proficiency in SQL and experience with large-scale datasets
Strong experience with cloud data platforms such as Snowflake, BigQuery, AWS S3, or Redshift
Hands-on experience with ETL/ELT tools and orchestration frameworks such as Apache Airflow, dbt
Experience with Python and software development
Knowledge of data governance, data quality frameworks or semantic layer management
Strong proficiency in BI tools such as Power BI, Tableau, Looker, or Qlik for building interactive dashboards and business reports
Strong analytical and storytelling capabilities, with a proven ability to translate data into actionable insights for business users
Collaborative mindset with experience working cross-functionally with data engineers, analysts and business stakeholders
Excellent communication and documentation skills, including the ability to write clear data definitions, dashboard guides, and metric logic
This position is open to all candidates.
 
10/09/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
Required Senior Data Engineer
The opportunity
Join our dynamic Data & ML Engineering team in iAds and play a pivotal role in driving data solutions that empower data science, finance, analytics, and R&D teams. As an Experienced Data Engineer, you'll work with cutting-edge technologies to design scalable pipelines, ensure data quality, and process billions of data points into actionable insights.
Success Indicators:
In the short term, success means delivering reliable, high-performance data pipelines and ensuring data quality across the product. Long-term, you'll be instrumental in optimizing workflows, enabling self-serve analytics platforms, and supporting strategic decisions through impactful data solutions.
Impact:
Your work will directly fuel business decisions, improve data accessibility and reliability, and contribute to the team's ability to handle massive-scale data challenges. You'll help shape the future of data engineering within a global, fast-paced environment.
Benefits and Opportunities
You'll collaborate with talented, passionate teammates, work on exciting projects with cutting-edge technologies, and have opportunities for professional growth. Competitive compensation, comprehensive benefits, and an inclusive culture make this role a chance to thrive and make a global impact.
What you'll be doing
Designing and developing scalable data pipelines and ETL processes to process massive amounts of structured and unstructured data.
Collaborating with cross-functional teams (data science, finance, analytics, and R&D) to deliver actionable data solutions tailored to their needs.
Building and maintaining tools and frameworks to monitor and improve data quality across the product.
Providing tools and insights that empower product teams with real-time analytics and data-driven decision-making capabilities.
Optimizing data workflows and architectures for performance, scalability, and cost efficiency using cutting-edge technologies like Apache Spark and Flink.
Requirements:
4+ years of experience as a Data Engineer
Expertise in designing and developing scalable data pipelines, ETL processes, and data architectures.
Proficiency in Python and SQL, with hands-on experience in big data technologies like Apache Spark and Hadoop.
Advanced knowledge of cloud platforms (AWS, Azure, or GCP) and their associated data services.
Experience working with Imply and Apache Druid for real-time analytics and query optimization.
Strong analytical skills and ability to quickly learn and adapt to new technologies and tools.
This position is open to all candidates.
 
01/09/2025
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a hands-on Data Specialist to join our growing data group, working on the practical backbone of high-scale, financial-grade systems. You'll work closely with engineers, BI, product, and business stakeholders to design, build, and optimize data pipelines and integrations in a cloud-native environment.
If you thrive on solving complex data challenges, enjoy getting deep into code, and want to make an impact on fintech infrastructure, we'd love to meet you.
Your Day-to-Day:
Develop, maintain, and optimize robust data pipelines and integrations across multiple systems
Build and refine data models to support analytics and operational needs
Work hands-on with data orchestration, transformation, and cloud infrastructure (AWS/Azure)
Collaborate with engineering, BI, and business teams to translate requirements into scalable data solutions
Contribute to data governance, data quality, and monitoring initiatives
Support implementation of best practices in data management and observability
Requirements:
8+ years in data engineering, data architecture, or similar roles
Deep hands-on experience with PostgreSQL, Snowflake, Oracle, etc.
Strong experience with ETL/ELT, data integration (Kafka, Airflow)
Proven SQL and Python skills (must)
Experience with AWS or Azure cloud environments
Familiarity with BI tools (Looker, Power BI)
Knowledge of Kubernetes and distributed data systems
Experience in financial systems or fintech (advantage)
Strong ownership, problem-solving ability, and communication skills
Comfort working in a fast-paced, multi-system environment
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time
We are looking for a Senior Data Engineer.
As a Senior Data Engineer, you'll be more than just a coder - you'll be the architect of our data ecosystem. We're looking for someone who can design scalable, future-proof data pipelines and connect the dots between DevOps, backend engineers, data scientists, and analysts.
You'll lead the design, build, and optimization of our data infrastructure, from real-time ingestion to supporting machine learning operations. Every choice you make will be data-driven and cost-conscious, ensuring efficiency and impact across the company.
Beyond engineering, you'll be a strategic partner and problem-solver, sometimes diving into advanced analysis or data science tasks. Your work will directly shape how we deliver innovative solutions and support our growth at scale.
Responsibilities:
Design and Build Data Pipelines: Architect, build, and maintain our end-to-end data pipeline infrastructure to ensure it is scalable, reliable, and efficient.
Optimize Data Infrastructure: Manage and improve the performance and cost-effectiveness of our data systems, with a specific focus on optimizing pipelines and usage within our Snowflake data warehouse. This includes implementing FinOps best practices to monitor, analyze, and control our data-related cloud costs.
Enable Machine Learning Operations (MLOps): Develop the foundational infrastructure to streamline the deployment, management, and monitoring of our machine learning models.
Support Data Quality: Optimize ETL processes to handle large volumes of data while ensuring data quality and integrity across all our data sources.
Collaborate and Support: Work closely with data analysts and data scientists to support complex analysis, build robust data models, and contribute to the development of data governance policies.
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field.
Experience: 5+ years of hands-on experience as a Data Engineer or in a similar role.
Data Expertise: Strong understanding of data warehousing concepts, including a deep familiarity with Snowflake.
Technical Skills:
Proficiency in Python and SQL.
Hands-on experience with workflow orchestration tools like Airflow.
Experience with real-time data streaming technologies like Kafka.
Familiarity with container orchestration using Kubernetes (K8s) and dependency management with Poetry.
Cloud Infrastructure: Proven experience with AWS cloud services (e.g., EC2, S3, RDS).
This position is open to all candidates.
 
Confidential company
Location: Tel Aviv-Yafo
Job Type: Full Time and Hybrid work
We are looking for a Staff Data Engineer to join our Platform group as part of the Data Infrastructure team. This is a senior individual contributor role for someone who combines deep technical expertise with a passion for building scalable, high-impact data infrastructure.
In this role, you will take ownership of designing, implementing, and evolving the data systems that power data orchestration, distributed storage, and streaming capabilities - with a focus on architecture, reliability, and performance. You'll collaborate across teams, influence engineering best practices, and help scale our infrastructure to support growing data needs across the organization.
Requirements:
8+ years of hands-on experience in backend or data engineering roles, with 3+ years in senior/staff-level positions
Proven ability to design, implement, and maintain large-scale distributed systems
Strong coding skills in a modern backend language (e.g. Python, Kotlin, Java, Scala), with clean, testable design patterns and production-ready code
Expertise in relational and non-relational databases, especially PostgreSQL and columnar or analytical storage systems (e.g. SingleStore, ClickHouse, Redshift, BigQuery)
Solid experience with streaming platforms like Apache Kafka
Familiarity with data orchestration and pipeline frameworks (e.g. Airflow, Dagster)
Comfortable working in cloud environments (AWS, GCP, or Azure)
Proficiency in infrastructure and deployment workflows (CI/CD, observability, monitoring)
Ability to work across teams and influence architectural decisions at the org level
This position is open to all candidates.
 
21/09/2025
Confidential company
Location: Netanya
Job Type: Full Time
We're seeking a Data Tech Lead to drive technical excellence in data engineering and analytics. As the go-to expert, you'll set the technical direction, optimize data pipelines, and tackle programming challenges: closing knowledge gaps, solving data-related questions, and streamlining operations. You'll also design scalable architectures, manage ETL workflows, and enhance data processing efficiency.
Key Responsibilities:
Oversee the technical aspects of data projects by making architectural and design decisions.
Streamline existing operations and implement improvements with the team's collaboration.
Guide team members in technical matters and supervise system modifications.
Conduct code reviews for data analysts, BI analysts, and data engineers.
Bridge technical knowledge gaps within the data team, answering critical product-related questions.
Requirements:
5+ years of experience in data engineering & Big Data Analytics.
Data Engineering & Automation: Building robust, production-ready data pipelines using SQL, Python, and PySpark, while managing ETL workflows and orchestrating data processes with Airflow (unmanaged) and Databricks.
Big Data Analysis & Distributed Processing: Expertise in Databricks (Spark, etc.) for handling large-scale data analytics with optimized efficiency.
Cloud Infrastructure: Proficient in cloud services (preferably Azure) for data storage and processing.
Data Architecture: Expertise in data architecture to ensure best practices in scaling, cost efficiency, and performance optimization.
If you're passionate about building scalable data solutions and thrive in a fast-paced environment, we'd love to hear from you!
This position is open to all candidates.
 